US20140129329A1 - Server, analysis method and computer program product - Google Patents

Server, analysis method and computer program product

Info

Publication number
US20140129329A1
US20140129329A1 (Application No. US 14/065,670)
Authority
US
United States
Prior art keywords
information
product
piece
combination
pieces
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/065,670
Inventor
Masahiro Sekine
Masashi Nishiyama
Kaoru Sugita
Hidetaka Ohira
Goh Itoh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Corp
Original Assignee
Toshiba Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Corp filed Critical Toshiba Corp
Assigned to KABUSHIKI KAISHA TOSHIBA (assignment of assignors' interest; see document for details). Assignors: ITOH, GOH; NISHIYAMA, MASASHI; OHIRA, HIDETAKA; SEKINE, MASAHIRO; SUGITA, KAORU
Publication of US20140129329A1
Related application: US 15/332,651 (US10311497B2)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00: Commerce
    • G06Q 30/02: Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0241: Advertisements
    • G06Q 30/0251: Targeted advertisements
    • G06Q 30/0255: Targeted advertisements based on user history
    • G06Q 30/06: Buying, selling or leasing transactions
    • G06Q 30/0601: Electronic shopping [e-shopping]
    • G06Q 30/0621: Item configuration or customization
    • G06Q 30/0623: Item investigation
    • G06Q 30/0631: Item recommendations
    • G06Q 30/0641: Shopping interfaces
    • G06Q 30/0643: Graphical representation of items or shoppers

Definitions

  • FIG. 1 is a configuration diagram illustrating an example of a system 1 according to a first embodiment.
  • the system 1 includes a first terminal 10 , a second terminal 20 , and a server 30 .
  • the first terminal 10 , the second terminal 20 and the server 30 are connected via a network 2 .
  • the network 2 can be realized by the Internet or a local area network (LAN), for example.
  • the first terminal 10 is an image recognition terminal that includes a recognizing unit 11 and that acquires related information on a real object of interest to the user by being held over the real object.
  • the first terminal 10 can be realized by a portable terminal, for example.
  • In the following, acquiring related information on a real object by holding the first terminal 10 over the real object may be referred to as "focus".
  • the second terminal 20 is an image combining terminal that includes a combining unit 21 and that performs virtual fitting simulation, virtual installation simulation and the like.
  • the second terminal 20 is installed in a store selling products, for example.
  • In the following, experiencing a product of interest to the user through virtual fitting simulation, virtual installation simulation or the like may be referred to as "try".
  • It is assumed that the user holds the first terminal 10 over a real object of interest to acquire related information on the product and that, starting from the acquired related information, the user is encouraged to go to the store in which the second terminal 20 is installed and to experience the product through virtual fitting simulation or virtual installation simulation, which is linked to purchase of the product.
  • FIG. 2 is a configuration diagram illustrating an example of the first terminal 10 according to the first embodiment.
  • the first terminal 10 includes a recognizing unit 11 , an imaging unit 12 , a feedback information storage unit 13 , a display unit 14 , and an output unit 15 .
  • the recognizing unit 11 may be implemented by making a processor such as a central processing unit (CPU) execute a program, that is, by software, may be implemented by hardware such as an integrated circuit (IC), or may be implemented by combination of software and hardware, for example.
  • the imaging unit 12 can be realized by an imager such as a digital camera, for example.
  • the feedback information storage unit 13 can be realized by a storage device that can magnetically, optically or electrically store information such as a hard disk drive (HDD), a solid state drive (SSD), a memory card, an optical disk, or a random access memory (RAM), for example.
  • the display unit 14 can be realized by a display device such as a liquid crystal display or a touch panel display, for example.
  • the output unit 15 can be realized by a communication device such as a network interface card (NIC), for example.
  • the imaging unit 12 images a real object of interest to the user to generate a product image.
  • Examples of the real object of interest to the user include an advertisement of a product of interest to the user, but the real object may be a product itself of interest to the user.
  • the feedback information storage unit 13 stores feedback information. Details of the feedback information will be described later.
  • the recognizing unit 11 includes an image recognizing unit 16 and a feedback unit 17 .
  • the image recognizing unit 16 recognizes a product image, estimates a product included in the product image, and selects at least any one of a plurality of kinds of related information on the product. Specifically, the image recognizing unit 16 acquires product information on the estimated product from the server 30 and selects at least any one of a plurality of kinds of related information contained in the acquired product information.
  • the product information acquired by the image recognizing unit 16 contains a product ID (an example of product identification information) of the estimated product and a plurality of kinds of related information.
  • Examples of the kinds of related information include attribute information and accompanying information of the estimated product. Examples of the attribute information include brand, price, color, and material, and examples of the accompanying information include word of mouth, recommended coordinates and store information (address, map, etc.).
  • the image recognizing unit 16 selects related information according to the feedback information.
  • the feedback unit 17 stores feedback information based on information transmitted from the server 30 in the feedback information storage unit 13 .
  • the display unit 14 displays the related information selected by the image recognizing unit 16 .
  • the display unit 14 displays word of mouth, recommended coordinates, store information or the like of the product estimated by the image recognizing unit 16 as an image, for example.
  • the output unit 15 outputs the recognition information to the server 30 .
  • the recognition information at least contains a product ID of the product estimated by the image recognizing unit 16 .
  • the recognition information may contain product image information and related information of the product image.
  • the product image information may be the product image itself or may be an image matched with the product image in image recognition performed by the image recognizing unit 16 or an image ID of the image.
  • the recognition information may contain the date and time of recognition, the position of recognition, a user ID of the user, and the like.
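  • As a rough illustration only (the field names below are hypothetical and not part of the embodiment), one piece of recognition information could be held as a simple record such as the following Python dictionary:

        # Hypothetical sketch of one piece of recognition information.
        # Only the product ID is mandatory; the remaining fields are the
        # optional items mentioned above.
        recognition_info = {
            "product_id": "PRODUCT 20928",        # product estimated by the image recognizing unit 16
            "product_image_id": "IMAGE 10392",    # or the product image itself
            "related_info_shown": "store information",
            "recognized_at": "2012-11-05 10:30",  # date and time of recognition
            "recognized_where": "35.68,139.76",   # position of recognition
            "user_id": "USER 0001",
        }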
  • FIG. 3 is a configuration diagram illustrating an example of the second terminal 20 according to the first embodiment.
  • the second terminal 20 includes a combining unit 21 , an imaging unit 22 , a feedback information storage unit 23 , a display unit 24 , and an output unit 25 .
  • the combining unit 21 may be implemented by making a processor such as a central processing unit (CPU) execute a program, that is, by software, may be implemented by hardware such as an IC, or may be implemented by combination of software and hardware, for example.
  • the imaging unit 22 can be realized by an imager such as a digital camera, for example.
  • the feedback information storage unit 23 can be realized by a storage device that can magnetically, optically or electrically store information such as an HDD, an SSD, a memory card, an optical disk, or a RAM, for example.
  • the display unit 24 can be realized by a display device such as a liquid crystal display or a touch panel display, for example.
  • the output unit 25 can be realized by a communication device such as an NIC, for example.
  • the imaging unit 22 images an object to be combined to generate an image to be combined. Examples of the object to be combined include the user.
  • the feedback information storage unit 23 stores feedback information. Details of the feedback information will be described later.
  • the combining unit 21 includes an image combining unit 26 and a feedback unit 27 .
  • the image combining unit 26 combines the image to be combined generated by the imaging unit 22 and an image for combination of a product (such as clothes). Specifically, the image combining unit 26 acquires product information of a plurality of products from the server 30 , displays images for combination contained in the acquired product information on the display unit 24 , and combines an image for combination selected by the user with the image to be combined generated by the imaging unit 22 .
  • the product information acquired by the image combining unit 26 contains product IDs (an example of product identification information) of the products and a group of images for combination. Since the images for combination are present for each category of products, the images for combination are in a form of groups. The category may be the kind or the use of products or the state in which products are tried on.
  • the image combining unit 26 displays the images for combination on the display unit 24 in a manner that the user can preferentially select an image for combination indicated by the feedback information.
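  • One simple way to realize such preferential selection, sketched here under assumed data structures rather than the embodiment's actual implementation, is to sort the group of images for combination so that the image IDs indicated by the feedback information are listed first:

        def order_images_for_display(image_ids, feedback_image_ids):
            # image_ids: image IDs for combination belonging to one product
            # feedback_image_ids: image IDs indicated by the feedback information
            recommended = set(feedback_image_ids)
            # sorted() is stable, so the original order is preserved within each group
            return sorted(image_ids, key=lambda i: 0 if i in recommended else 1)

        # Example: the feedback recommends "IMAGE 10192", so it is shown first.
        print(order_images_for_display(
            ["IMAGE 10291", "IMAGE 10192", "IMAGE 10392"], ["IMAGE 10192"]))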
  • the feedback unit 27 stores feedback information based on information transmitted from the server 30 in the feedback information storage unit 23 .
  • the display unit 24 displays images for combination to be selected by the user and combined images obtained by combination by the image combining unit 26 .
  • the output unit 25 outputs combination information to the server 30 .
  • the combination information at least contains a product ID of the product in the image for combination combined by the image combining unit 26 .
  • the combination information may contain combined image information of an image to be combined and combination image information of an image for combination.
  • the combined image information may be the image to be combined itself or may contain depth information obtained by sensing the image to be combined, skeleton information indicating the outline of a person, and measurement information such as height, weight, chest circumference, sitting height and the like in addition to the image to be combined.
  • the combination image information may be the image for combination itself or may be an image ID of the image for combination.
  • the combination information may contain the date and time of combination, the position of combination, a user ID of the user, the category of the product and the like.
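  • Analogously, one piece of combination information could be sketched as a record like the following (again, the field names are hypothetical):

        combination_info = {
            "product_id": "PRODUCT 20290",      # product of the selected image for combination
            "combination_image_id": "IMAGE 10192",
            "category": "shoulder bag",         # category in which the product was tried
            "combined_at": "2012-11-05 15:00",  # date and time of combination
            "user_id": "USER 0001",
            # combined image information (depth, skeleton, measurements) is omitted here
        }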
  • FIG. 4 is a configuration diagram illustrating an example of the server 30 according to the first embodiment.
  • the server 30 includes a first acquiring unit 31 , a recognition information storage unit 32 , a second acquiring unit 33 , a combination information storage unit 34 , an analyzing unit 35 , an output unit 36 , and a product information storage unit 37 .
  • the first acquiring unit 31 , the second acquiring unit 33 and the analyzing unit 35 may be implemented by making a processor such as a central processing unit (CPU) execute a program, that is, by software, may be implemented by hardware such as an integrated circuit (IC), or may be implemented by combination of software and hardware, for example.
  • the recognition information storage unit 32 , the combination information storage unit 34 , and the product information storage unit 37 can be realized by a storage device that can magnetically, optically or electrically store information such as an HDD, an SSD, a memory card, an optical disk, or a RAM, for example.
  • the output unit 36 can be realized by a communication device such as an NIC, for example.
  • the first acquiring unit 31 acquires recognition information including at least the product ID of the product estimated by the recognizing unit 11 from the recognizing unit 11 (the output unit 15 ), and stores the acquired recognition information in the recognition information storage unit 32 .
  • the recognition information may further contain information as mentioned in the description of the output unit 15 .
  • the recognition information storage unit 32 stores a plurality of pieces of recognition information stored by the first acquiring unit 31 .
  • FIG. 5 is a table illustrating examples of the recognition information according to the first embodiment.
  • the recognition information is information in which a number, the date and time of recognition, the product image information, a product ID, and the related information displayed at the first terminal 10 are associated, but the recognition information is not limited thereto.
  • the second acquiring unit 33 acquires combination information including at least the product ID of the product in the image for combination combined with the image to be combined from the combining unit 21 (the output unit 25 ), and stores the acquired combination information in the combination information storage unit 34 .
  • the combination information may further contain information as mentioned in the description of the output unit 25 .
  • the combination information storage unit 34 stores a plurality of pieces of combination information stored by the second acquiring unit 33 .
  • FIG. 6 is a table illustrating examples of the combination information according to the first embodiment.
  • the combination information is information in which the number, the date and time of combination, the combined image information, the product ID and the category are associated, but the combination information is not limited thereto.
  • the analyzing unit 35 analyzes a plurality of pieces of recognition information stored in the recognition information storage unit 32 and a plurality of pieces of combination information stored in the combination information storage unit 34 , and calculates the product priority of each product. Specifically, the analyzing unit 35 analyzes the pieces of recognition information to calculate a first product priority of each product, analyzes the pieces of combination information to calculate a second product priority of each product, and calculates the product priority of each product on the basis of the first product priority and the second product priority of each product.
  • the analyzing unit 35 calculates the product priority E of a certain product by calculating the first product priority Er and the second product priority Es of the product and then calculating a weighted sum of the calculated first product priority Er and second product priority Es as expressed by Equation (1). If the product ID of the product for which the first product priority Er is calculated is not present in the pieces of combination information, the second product priority Es of the product is 0; likewise, if the product ID of the product for which the second product priority Es is calculated is not present in the pieces of recognition information, the first product priority Er of the product is 0.
  • In Equation (1), wr represents the weight of the priority Er and ws represents the weight of the priority Es.
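  • From this description, Equation (1) presumably has the form of the weighted sum

        E = wr × Er + ws × Es    (1)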
  • the analyzing unit 35 analyzes the pieces of recognition information and sets the first product priority Er of the product represented by a product ID associated with recognition date and time to be higher as the recognition date and time is closer to the current date and time. That is, the analyzing unit 35 sets the first product priority Er to be higher for a product over which a terminal was held on the date and time closer to the current date and time.
  • the analyzing unit 35 also analyzes the pieces of recognition information and sets the first product priority Er to be higher for a product represented by a product ID having a value whose number of occurrences is larger. That is, the analyzing unit 35 sets the first product priority Er to be higher for a product over which a terminal was held a larger number of times.
  • the analyzing unit 35 analyzes the pieces of combination information and sets the second product priority Es of a product represented by a product ID associated with combination date and time contained in the combination information to be higher as the combination date and time is closer to the current date and time. That is, the analyzing unit 35 sets the second product priority Es to be higher for a product that was tried on the date and time closer to the current date and time.
  • the analyzing unit 35 also analyzes the pieces of combination information and sets the second product priority Es to be higher for a product represented by a product ID having a value whose number of occurrences is larger. That is, the analyzing unit 35 sets the second product priority Es to be higher for a product that was tried a larger number of times.
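  • The embodiment does not give a concrete scoring rule for recency and frequency; the following Python sketch shows one possible way (the exponential recency decay and the parameter values are assumptions, not part of the embodiment) to compute the first product priority Er from the pieces of recognition information. The second product priority Es can be computed from the pieces of combination information in the same way.

        from collections import defaultdict
        from datetime import datetime
        import math

        def product_priorities(records, now, half_life_days=7.0):
            # records: iterable of (product_id, timestamp) pairs taken from the
            # stored pieces of recognition (or combination) information.
            # Each record contributes more when it is recent (assumed exponential
            # decay), and products with more records accumulate higher scores.
            scores = defaultdict(float)
            for product_id, ts in records:
                age_days = (now - ts).total_seconds() / 86400.0
                scores[product_id] += math.exp(-math.log(2) * age_days / half_life_days)
            return dict(scores)

        now = datetime(2012, 11, 5, 18, 0)
        recognition_records = [
            ("PRODUCT 20928", datetime(2012, 11, 5, 10, 30)),
            ("PRODUCT 20928", datetime(2012, 11, 4, 19, 0)),
            ("PRODUCT 20290", datetime(2012, 10, 20, 12, 0)),
        ]
        Er = product_priorities(recognition_records, now)  # first product priority per product
        # Es would be obtained likewise from the pieces of combination information,
        # and the two combined per product as E = wr * Er + ws * Es (Equation (1)).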
  • the analyzing unit 35 further analyzes whether or not combination information including a product ID of a product whose product priority E satisfies a first predetermined condition exists in the pieces of combination information, and generates first recommendation information recommending related information according to the analysis result among a plurality of kinds of related information.
  • the first predetermined condition may be a threshold or may be the product priorities E from the highest priority to a certain predetermined rank of priority.
  • For example, if combination information including a product ID of a product whose product priority E satisfies the first predetermined condition does not exist in the pieces of combination information, the analyzing unit 35 generates first recommendation information recommending store information among the plurality of kinds of related information. In this case, since "focus" is performed but "try" is not performed, it is possible to encourage the user to perform "try" by recommending the store information of the store in which the second terminal 20 is installed, and as a result, it may be possible to motivate the user to buy the product.
  • If, for example, combination information including a product ID of a product whose product priority E satisfies the first predetermined condition exists in the pieces of combination information, the analyzing unit 35 generates first recommendation information recommending recommended coordinates among the plurality of kinds of related information. In this case, since both "focus" and "try" are performed, it may be possible to motivate the user to buy other products recommended in the recommended coordinates by recommending the recommended coordinates.
  • the analyzing unit 35 also analyzes the number of occurrences of each of the categories in pieces of combination information including a product ID of a product whose product priority E satisfies a second predetermined condition, and generates second recommendation information recommending the category with the largest number of occurrences.
  • the second predetermined condition may be a threshold or may be the product priorities E from the highest priority to a certain predetermined rank of priority.
  • Suppose, for example, that a product whose product priority E satisfies the second predetermined condition is a bag that can be carried in three ways: as a handbag, as a shoulder bag, and as a backpack.
  • In this case, the analyzing unit 35 analyzes the number of occurrences of each of handbag, shoulder bag and backpack in the pieces of combination information. If the number of occurrences of shoulder bag is the largest, the analyzing unit 35 generates second recommendation information recommending the shoulder bag. Since performing "try" on the shoulder bag is popular among users, recommending the shoulder bag may motivate the user to buy the product. If, however, the user has already performed "try" on the shoulder bag style of the bag, second recommendation information recommending another category on which "try" has not been performed, such as handbag or backpack, may be generated.
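  • Putting the two kinds of recommendation together, a minimal Python sketch of this part of the analysis could look as follows (the threshold and the data layout are assumptions made for illustration):

        from collections import Counter

        def first_recommendation(product_id, combination_infos, priority, threshold):
            # Recommend store information when "focus" was performed but "try" was not,
            # and recommended coordinates when both were performed.
            if priority < threshold:          # first predetermined condition not satisfied
                return None
            tried = any(c["product_id"] == product_id for c in combination_infos)
            return "recommended coordinates" if tried else "store information"

        def second_recommendation(product_id, combination_infos):
            # Recommend the category in which the product was tried most often.
            counts = Counter(c["category"] for c in combination_infos
                             if c["product_id"] == product_id)
            return counts.most_common(1)[0][0] if counts else None

        combos = [{"product_id": "PRODUCT 20928", "category": "shoulder bag"},
                  {"product_id": "PRODUCT 20928", "category": "handbag"},
                  {"product_id": "PRODUCT 20928", "category": "shoulder bag"}]
        print(first_recommendation("PRODUCT 20928", combos, priority=1.8, threshold=1.0))
        print(second_recommendation("PRODUCT 20928", combos))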
  • the output unit 36 outputs information regarding a product based on the product priority calculated by the analyzing unit 35 to at least one of the recognizing unit 11 and the combining unit 21 .
  • the information based on the product priority may be the product priority itself or may be related information or an image for combination of the product with the product priority.
  • the related information and the image for combination can be obtained from the product information storage unit 37 .
  • the output unit 36 also outputs information based on the product priority calculated by the analyzing unit 35 and the first recommendation information generated by the analyzing unit 35 to the recognizing unit 11 .
  • the information based on the product priority and the first recommendation information may be information indicating the product priority and recommended related information or may be recommended related information on the product with the product priority.
  • the output unit 36 also outputs information based on the product priority calculated by the analyzing unit 35 and the second recommendation information generated by the analyzing unit 35 to the combining unit 21 .
  • the information based on the product priority and the second recommendation information may be the product priority and an image ID of a recommended image for combination or may be a recommended image for combination of the product with the product priority.
  • the information output by the output unit 36 in this manner is used as feedback information at the recognizing unit 11 and the combining unit 21 , so that information with higher probability of motivating the user to buy a product is preferentially displayed at the first terminal 10 and the second terminal 20 .
  • When it is requested by the image recognizing unit 16 to acquire product information, the output unit 36 acquires the requested product information from the product information storage unit 37 and outputs the acquired product information to the image recognizing unit 16 . Similarly, when it is requested by the image combining unit 26 to acquire product information, the output unit 36 acquires the requested product information from the product information storage unit 37 and outputs the acquired product information to the image combining unit 26 .
  • the product information storage unit 37 stores product information of products.
  • FIG. 7 is a diagram illustrating an example of a data structure of the product information according to the first embodiment.
  • the product information is information in which a product ID, attribute information (brand, price, color, material, etc.), accompanying information (word of mouth, recommended coordinates, store information (address, map, etc.), etc.) and a group of images for combination are associated, but the product information is not limited thereto.
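  • For illustration, a piece of product information with the structure of FIG. 7 could be held as a nested record such as the following (the keys are placeholders, not the embodiment's actual schema):

        product_info = {
            "product_id": "PRODUCT 20928",
            "attributes": {"brand": "...", "price": "...", "color": "...", "material": "..."},
            "accompanying": {"word_of_mouth": "...", "recommended_coordinates": "...",
                             "store_info": {"address": "...", "map": "..."}},
            "images_for_combination": {"handbag": ["IMAGE 10291"],
                                       "shoulder bag": ["IMAGE 10192"]},
        }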
  • FIG. 8 is a flowchart illustrating an example of a flow of procedures of processing performed by the server 30 according to the first embodiment.
  • the first acquiring unit 31 acquires recognition information including at least a product ID of a product estimated by the recognizing unit 11 from the recognizing unit 11 (the output unit 15 ), and stores the acquired recognition information in the recognition information storage unit 32 (step S 101 ).
  • the second acquiring unit 33 acquires combination information including at least a product ID of a product in an image for combination combined with an image to be combined from the combining unit 21 (the output unit 25 ), and stores the acquired combination information in the combination information storage unit 34 (step S 103 ).
  • the analyzing unit 35 analyzes a plurality of pieces of recognition information stored in the recognition information storage unit 32 to calculate first product priority of each product, analyzes a plurality of pieces of combination information stored in the combination information storage unit 34 to calculate second product priority of each product, and calculates product priority of each product on the basis of the first product priority and the second product priority of each product (step S 105 ).
  • the analyzing unit 35 further analyzes whether or not combination information including a product ID of a product whose product priority satisfies the first predetermined condition exists in the pieces of combination information, and generates first recommendation information recommending related information according to the analysis result among a plurality of kinds of related information (step S 107 ).
  • the output unit 36 outputs information based on the product priority calculated by the analyzing unit 35 and the first recommendation information generated by the analyzing unit 35 to the recognizing unit 11 (step S 109 ).
  • the analyzing unit 35 analyzes the number of occurrences of each of the categories in the pieces of combination information and generates second recommendation information recommending a category with the largest number of occurrences (step S 111 ).
  • the output unit 36 outputs information based on the product priority calculated by the analyzing unit 35 and the second recommendation information generated by the analyzing unit 35 to the combining unit 21 (step S 113 ).
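  • The flow of steps S101 to S113 could be summarized in Python-like pseudocode as follows (the unit objects and method names are placeholders standing in for the units described above, not an actual API):

        def process_first_embodiment(server):
            # S101: acquire and store recognition information from the first terminal
            server.recognition_store.add(server.first_acquiring_unit.acquire())
            # S103: acquire and store combination information from the second terminal
            server.combination_store.add(server.second_acquiring_unit.acquire())
            # S105: calculate the product priority of each product from both histories
            priorities = server.analyzing_unit.calculate_priorities(
                server.recognition_store.all(), server.combination_store.all())
            # S107, S109: generate first recommendation information and output it,
            # together with priority-based information, to the recognizing unit
            first_rec = server.analyzing_unit.first_recommendation(priorities)
            server.output_unit.send_to_recognizing_unit(priorities, first_rec)
            # S111, S113: generate second recommendation information and output it,
            # together with priority-based information, to the combining unit
            second_rec = server.analyzing_unit.second_recommendation(priorities)
            server.output_unit.send_to_combining_unit(priorities, second_rec)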
  • Since the product priority taking the histories of various O2O related technologies into consideration can be calculated by analyzing the history of the recognition information and the history of the combination information, products of greater interest to the user can be extracted.
  • Furthermore, since information based on the calculated product priority is output to the recognizing unit and the combining unit, the recognizing unit and the combining unit can preferentially present products of greater interest to the user by using the information, and it is thus possible to increase the probability of motivating the user to buy a product.
  • Similarly, since information based on the product priority and the first recommendation information is output to the recognizing unit, the recognizing unit can preferentially present information of greater interest to the user by using the information, and it is thus possible to increase the probability of motivating the user to buy a product.
  • In the same manner, since information based on the product priority and the second recommendation information is output to the combining unit, the combining unit can preferentially present information of greater interest to the user by using the information, and it is thus possible to increase the probability of motivating the user to buy a product.
  • Since the recognizing unit 11 (the output unit 15 ) can include product image information and related information in the recognition information, it is also possible to figure out over what real objects the user held the terminal and what products the user is interested in. For example, it is possible to figure out whether the user became interested in a product X by holding the terminal over an advertisement A or by holding the terminal over an advertisement B, which allows the history through which the user became interested in the product X to be used in the analysis.
  • Since the combining unit 21 (the output unit 25 ) can include combined image information and combination image information in the combination information, it is also possible to figure out which image for combination was combined with which image to be combined. For example, it is possible to figure out facts such as that people with a body type A often try on clothes Y or that people with a body type B often try on clothes Z, and it is thus possible to obtain a tendency of "try" of each user by data analysis.
  • FIG. 9 is a configuration diagram illustrating an example of a system 101 according to the second embodiment. As illustrated in FIG. 9 , the system 101 is different from that in the first embodiment in a server 130 and a third terminal 140 thereof.
  • the third terminal 140 is a management terminal that includes a managing unit 141 and that manages sales information related to sales of products.
  • FIG. 10 is a configuration diagram illustrating an example of the third terminal 140 according to the second embodiment.
  • the third terminal 140 includes a managing unit 141 , a sales information storage unit 142 , a display unit 143 , and an output unit 144 .
  • the managing unit 141 may be implemented by making a processor such as a CPU execute a program, that is, by software, may be implemented by hardware such as an IC, or may be implemented by combination of software and hardware, for example.
  • the sales information storage unit 142 can be realized by a storage device that can magnetically, optically or electrically store information such as an HDD, an SSD, a memory card, an optical disk, or a RAM, for example.
  • the display unit 143 can be realized by a display device such as a liquid crystal display or a touch panel display, for example.
  • the output unit 144 can be realized by a communication device such as an NIC, for example.
  • the sales information storage unit 142 stores sales information related to sales of products.
  • Examples of the sales information include purchase information indicating details of purchase of a product, sales promotion information relating to sales promotion of a product, customer information, inventory information, and training information relating to training of store staff.
  • the purchase information contains at least a product ID of a product to be purchased.
  • the purchase information may also contain the date and time of purchase.
  • the sales promotion information contains first sales promotion information relating to sales promotion using product images and second sales promotion information relating to sales promotion using images for combination. Examples of the sales promotion information include information on advertising strategy, store layout, procurement plan, product lineup, and methods for recommending products to customers.
  • the managing unit 141 manages the sales information stored in the sales information storage unit 142 .
  • the display unit 143 displays the sales information managed by the managing unit 141 .
  • the output unit 144 outputs the sales information to the server 130 .
  • the output unit 144 outputs purchase information and sales promotion information to the server 130 .
  • FIG. 11 is a configuration diagram illustrating an example of the server 130 according to the second embodiment. As illustrated in FIG. 11 , the server 130 is different from that in the first embodiment in an analyzing unit 135 , a third acquiring unit 138 , and a sales information storage unit 139 .
  • the third acquiring unit 138 may be implemented by making a processor such as a CPU execute a program, that is, by software, may be implemented by hardware such as an IC, or may be implemented by combination of software and hardware, for example.
  • the sales information storage unit 139 can be realized by a storage device that can magnetically, optically or electrically store information such as an HDD, an SSD, a memory card, an optical disk, or a RAM, for example.
  • the third acquiring unit 138 acquires purchase information and sales promotion information including at least a product ID of a product to be purchased from the managing unit 141 (the output unit 144 ), and stores the acquired purchase information and sales promotion information in the sales information storage unit 139 .
  • the purchase information and the sales promotion information may further contain information mentioned in the description of the sales information storage unit 142 .
  • the sales information storage unit 139 stores a plurality of pieces of purchase information and sales promotion information stored by the third acquiring unit 138 .
  • FIG. 12 is a table illustrating examples of the purchase information according to the second embodiment.
  • the purchase information is information in which a number, the date and time of purchase, and a product ID are associated, but the purchase information is not limited thereto.
  • the analyzing unit 135 performs at least one of a first analysis of analyzing a plurality of pieces of recognition information stored in the recognition information storage unit 32 , a plurality of pieces of combination information stored in the combination information storage unit 34 , and a plurality of pieces of purchase information stored in the sales information storage unit 139 to calculate the product priority of each product, and a second analysis of analyzing at least either a plurality of pieces of recognition information or a plurality of pieces of combination information, in addition to a plurality of pieces of purchase information, to obtain updated contents of sales information.
  • the analyzing unit 135 analyzes a plurality of pieces of recognition information to calculate a first product priority of each product, analyzes a plurality of pieces of combination information to calculate a second product priority of each product, analyzes a plurality of pieces of purchase information to calculate a third product priority of each product, and calculates the product priority of each product on the basis of the first product priority, the second product priority and the third product priority of each product.
  • the analyzing unit 135 calculates the product priority E of a certain product by calculating the first product priority Er, the second product priority Es and the third product priority Eb of the product and then calculating a weighted sum of the calculated first product priority Er, second product priority Es and third product priority Eb as expressed by Equation (2).
  • In Equation (2), wb represents the weight of the priority Eb.
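  • Analogously to Equation (1), Equation (2) presumably has the form

        E = wr × Er + ws × Es + wb × Eb    (2)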
  • the analyzing unit 135 analyzes the pieces of recognition information and sets the first product priority Er of the product represented by a product ID associated with recognition date and time to be higher as the recognition date and time is closer to the current date and time. That is, the analyzing unit 135 sets the first product priority Er to be higher for a product over which a terminal was held on the date and time closer to the current date and time.
  • the analyzing unit 135 also analyzes the pieces of recognition information and sets the first product priority Er to be higher for a product represented by a product ID having a value whose number of occurrences is larger. That is, the analyzing unit 135 sets the first product priority Er to be higher for a product over which a terminal was held a larger number of times.
  • the analyzing unit 135 analyzes the pieces of combination information and sets the second product priority Es of a product represented by a product ID associated with combination date and time contained in the combination information to be higher as the combination date and time is closer to the current date and time. That is, the analyzing unit 135 sets the second product priority Es to be higher for a product that was tried on the date and time closer to the current date and time.
  • the analyzing unit 135 also analyzes the pieces of combination information and sets the second product priority Es to be higher for a product represented by a product ID having a value whose number of occurrences is larger. That is, the analyzing unit 135 sets the second product priority Es to be higher for a product that was tried a larger number of times.
  • the analyzing unit 135 analyzes the pieces of purchase information and sets the third product priority Eb of a product represented by a product ID associated with purchase date and time contained in the purchase information to be higher as the purchase date and time is closer to the current date and time. That is, the analyzing unit 135 sets the third product priority Eb to be higher for a product that was purchased on the date and time closer to the current date and time.
  • the analyzing unit 135 also analyzes the pieces of purchase information and sets the third product priority Eb to be higher for a product represented by a product ID having a value whose number of occurrences is larger. That is, the analyzing unit 135 sets the third product priority Eb to be higher for a product that was purchased a larger number of times.
  • the analyzing unit 135 determines whether or not the behavior of “focus” and the behavior of “try” of the user led to purchase of a product by analyzing at least either a plurality of pieces of recognition information or a plurality of pieces of combination information in addition to a plurality of pieces of purchase information and obtains updated contents of sales promotion information.
  • the analyzing unit 135 analyzes a plurality of pieces of purchase information, analyzes the number of occurrences, in the pieces of recognition information, of product image information associated with a product ID having a value whose number of occurrences in the purchase information satisfies a third predetermined condition, and obtains updated contents of the first sales promotion information according to the number of occurrences of the product image information.
  • the third predetermined condition may be thresholds in multiple steps including an increase determination threshold for determining whether or not to increase a value and a decrease determination threshold for determining whether or not to decrease a value, for example.
  • FIG. 13 is a table illustrating examples of the first sales promotion information before being updated according to the second embodiment
  • FIG. 14 is a table illustrating examples of the first sales promotion information after being updated according to the second embodiment.
  • the first sales promotion information is information in which a number, an advertisement ID, an image ID (product image information) of a product image, and the number of advertisements are associated, but the first sales promotion information is not limited thereto.
  • In the examples illustrated in FIG. 13, the analyzing unit 135 obtains, as the updated contents of the first sales promotion information, updated contents in which the numbers of advertisements for the image IDs "IMAGE 10392" and "IMAGE 10192" are increased by 10 while the number of advertisements for the image ID "IMAGE 10291" is decreased by 20, for example.
  • As a result, it is possible to update the first sales promotion information illustrated in FIG. 13 with that as illustrated in FIG. 14.
  • the analyzing unit 135 also analyzes the pieces of purchase information, analyzes the number of occurrences, in the pieces of combination information, of combination image information associated with a product ID having a value whose number of occurrences in the purchase information satisfies a fourth predetermined condition, and obtains updated contents of the second sales promotion information according to the number of occurrences of the combination image information.
  • the fourth predetermined condition may be thresholds in multiple steps including an increase determination threshold for determining whether or not to increase a value and a decrease determination threshold for determining whether or not to decrease a value, for example.
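  • As a sketch only (the thresholds, step sizes and field names below are assumptions), the update of the first sales promotion information could be realized as follows; the second sales promotion information can be updated in the same way using combination image information instead of product image information:

        def updated_ad_counts(promo_rows, image_occurrences,
                              increase_threshold=50, decrease_threshold=10,
                              step_up=10, step_down=20):
            # promo_rows: rows of the first sales promotion information, each with
            #   an "image_id" (product image information) and "num_ads"
            # image_occurrences: number of occurrences of each product image ID in
            #   the pieces of recognition information for well-selling products
            updated = []
            for row in promo_rows:
                n = image_occurrences.get(row["image_id"], 0)
                num_ads = row["num_ads"]
                if n >= increase_threshold:        # increase determination threshold
                    num_ads += step_up
                elif n <= decrease_threshold:      # decrease determination threshold
                    num_ads = max(0, num_ads - step_down)
                updated.append({**row, "num_ads": num_ads})
            return updated

        rows = [{"image_id": "IMAGE 10392", "num_ads": 30},
                {"image_id": "IMAGE 10192", "num_ads": 20},
                {"image_id": "IMAGE 10291", "num_ads": 50}]
        print(updated_ad_counts(rows, {"IMAGE 10392": 80, "IMAGE 10192": 60, "IMAGE 10291": 5}))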
  • the analyzing unit 135 can also obtain updated contents of store layout by analyzing a plurality of pieces of purchase information and determining the sales rate of products sold together.
  • the sales rate of products sold together can be calculated from purchase date and time or the like in the purchase information.
  • FIG. 15 is a table illustrating examples of the store layout information before being updated according to the second embodiment
  • FIG. 16 is a table illustrating examples of the store layout information after being updated according to the second embodiment.
  • the store layout information is information in which a number, a shelf ID, and a product ID are associated, but the store layout information is not limited thereto. It is assumed that a shelf A and a shelf B are adjacent to each other while a shelf C is adjacent to neither the shelf A nor the shelf B. In the examples illustrated in FIG. 15, it is assumed that the rate at which the products with the product IDs "PRODUCT 20928" and "PRODUCT 20290" are sold together satisfies the increase determination threshold.
  • the analyzing unit 135 obtains updated contents of arranging the product with the product ID “PRODUCT 20290 ” on the shelf B and arranging the product with the product ID “PRODUCT 20660 ” on the shelf C. As a result, it is possible to update the store layout information illustrated in FIG. 15 with that as illustrated in FIG. 16 .
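  • A minimal sketch of this co-purchase analysis (the time window and the data layout are assumptions) could count, for each pair of products, how often the two are purchased within a short interval of each other; a pair whose count satisfies the increase determination threshold would then be placed on adjacent shelves as in FIG. 16:

        from collections import Counter
        from datetime import datetime, timedelta

        def co_purchase_counts(purchases, window=timedelta(minutes=5)):
            # purchases: list of (purchase_datetime, product_id), sorted by time.
            # Returns how often each pair of products is bought within the window.
            counts = Counter()
            for i, (t1, p1) in enumerate(purchases):
                for t2, p2 in purchases[i + 1:]:
                    if t2 - t1 > window:
                        break
                    if p1 != p2:
                        counts[frozenset((p1, p2))] += 1
            return counts

        purchases = [
            (datetime(2012, 11, 5, 11, 0), "PRODUCT 20928"),
            (datetime(2012, 11, 5, 11, 2), "PRODUCT 20290"),
            (datetime(2012, 11, 5, 15, 0), "PRODUCT 20660"),
        ]
        print(co_purchase_counts(purchases))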
  • the output unit 36 performs at least one of first output of outputting information based on the product priority calculated by the analyzing unit 135 to at least one of the recognizing unit 11 and the combining unit 21 and second output of outputting the updated contents obtained by the analyzing unit 135 to the managing unit 141 .
  • information output by the output unit 36 in this manner is used for update of sales promotion information at the managing unit 141 , and sales promotion information with higher probability of motivating the user to buy a product will thus be managed at the third terminal 140 .
  • FIG. 17 is a flowchart illustrating an example of a flow of procedures of processing performed by the server 130 according to the second embodiment.
  • the first acquiring unit 31 acquires recognition information including at least a product ID of a product estimated by the recognizing unit 11 from the recognizing unit 11 (the output unit 15 ), and stores the acquired recognition information in the recognition information storage unit 32 (step S 401 ).
  • the second acquiring unit 33 acquires combination information including at least a product ID of a product in an image for combination combined with an image to be combined from the combining unit 21 (the output unit 25 ), and stores the acquired combination information in the combination information storage unit 34 (step S 403 ).
  • the third acquiring unit 138 acquires purchase information and sales promotion information including at least a product ID of a product to be purchased from the managing unit 141 (the output unit 144 ), and stores the acquired purchase information and sales promotion information in the sales information storage unit 139 (step S 405 ).
  • the analyzing unit 135 analyzes a plurality of pieces of recognition information stored in the recognition information storage unit 32 to calculate the first product priority of each product, analyzes a plurality of pieces of combination information stored in the combination information storage unit 34 to calculate the second product priority of each product, analyzes a plurality of pieces of purchase information stored in the sales information storage unit 139 to calculate the third product priority of each product, and calculates the product priority of each product on the basis of the first product priority, the second product priority and the third product priority of each product (step S 407 ).
  • the analyzing unit 135 further analyzes whether or not combination information including a product ID of a product whose product priority satisfies the first predetermined condition exists in the pieces of combination information, and generates first recommendation information recommending related information according to the analysis result among a plurality of kinds of related information (step S 409 ).
  • the output unit 36 outputs information based on the product priority calculated by the analyzing unit 135 and the first recommendation information generated by the analyzing unit 135 to the recognizing unit 11 (step S 411 ).
  • the analyzing unit 135 analyzes the number of occurrences of each of the categories in the pieces of combination information and generates second recommendation information recommending a category with the largest number of occurrences (step S 413 ).
  • the output unit 36 outputs information based on the product priority calculated by the analyzing unit 135 and the second recommendation information generated by the analyzing unit 135 to the combining unit 21 (step S 415 ).
  • the analyzing unit 135 analyzes at least either of a plurality of pieces of recognition information or a plurality of pieces of combination information in addition to a plurality of pieces of purchase information to obtain updated contents of the sales promotion information (step S 417 ).
  • the output unit 36 outputs the updated contents of the sales promotion information obtained by the analyzing unit 135 to the managing unit 141 (step S 419 ).
  • products of greater interest to the user can be extracted more effectively by further analyzing the purchase information to calculate the product priority.
  • the recognizing unit and the combining unit can more preferentially present products of greater interest to the user by using the information and it is thus possible to further increase the probability of motivating the user to buy a product.
  • more effective sales management can be realized by analyzing at least one of the history of the recognition information and the history of the combination information in addition to the history of the purchase information, which can lead to analysis and improvement of advertising effectiveness, improvement in product lineup, efficiency in product recommendation to customers (improvement in methods for training store staff), improvement in procurement plan, and improvement in store layouts.
  • While examples in which the histories of the image recognition terminal that implements "focus", the image combining terminal that implements "try" and the management terminal that manages sales information are used have been described in the embodiments above, the embodiments are not limited thereto, and histories of terminals using various other O2O related technologies can also be used, such as the history of a terminal implementing "search", that is, searching for related product information according to attributes of a product over which "focus" is performed.
  • the analyzing unit 35 need not necessarily analyze all the histories. That is, the analyzing unit 35 may set any of the weights to 0.
  • the recognizing unit 11 and the combining unit 21 may be included in one terminal 250 as in a system 201 illustrated in FIG. 18 .
  • FIG. 19 is a diagram illustrating an example of a hardware configuration of the server according to the embodiments and modifications.
  • the server according to the embodiments and modifications described above includes a control device 901 such as a CPU, a storage device 902 such as a ROM and a RAM, an external storage device 903 such as a HDD, a display device 904 such as a display, an input device 905 such as a keyboard and a mouse, and a communication device 906 such as a communication interface (I/F), which is a hardware configuration utilizing a common computer system.
  • Programs to be executed by the server according to the embodiments and modifications described above are recorded on a computer readable recording medium such as a CD-ROM, a CD-R, a memory card, a digital versatile disk (DVD) and a flexible disk (FD) in a form of a file that can be installed or executed, and provided therefrom as a computer program product.
  • the programs to be executed by the server according to the embodiments and modifications may be stored on a computer system connected to a network such as the Internet, and provided by being downloaded via the network. Still alternatively, the programs to be executed by the server according to the embodiments and modifications may be provided or distributed through a network such as the Internet. Still alternatively, the programs to be executed by the server according to the embodiments and modifications may be embedded in a ROM or the like in advance and provided therefrom.
  • the programs to be executed by the server have modular structures for implementing the units described above on a computer system.
  • the CPU reads programs from the HDD and executes the programs on the RAM, whereby the respective units described above are implemented on a computer system.
  • The order in which the steps in the flowcharts in the embodiments described above are performed may be changed, a plurality of steps may be performed at the same time, or the order in which the steps are performed may be changed each time they are performed, to the extent that the changes are not inconsistent with the nature thereof.

Abstract

According to an embodiment, a server includes a first acquiring unit, a second acquiring unit, an analyzing unit, and an output unit. The first acquiring unit is configured to acquire recognition information including product identification information for identifying a product included in a product image. The second acquiring unit is configured to acquire combination information including the product identification information of the product to be combined with an object image including an object. The analyzing unit is configured to calculate product priorities for respective products by analyzing the recognition information and the combination information. The output unit is configured to output information based on the product priorities.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2012-243762, filed on Nov. 5, 2012; the entire contents of which are incorporated herein by reference.
  • FIELD
  • Embodiments described herein relate generally to a server, an analysis method, and a computer program product.
  • BACKGROUND
  • In the retail industry for general consumers, there have recently been increasing attempts to differentiate shopping styles by creating new user experiences, and O2O (Online to Offline), for example, is attracting attention. O2O refers to the interaction between online and offline buying behaviors and to the influence of online information on buying behavior at brick-and-mortar shops and the like; services such as finding stores using the location-based services of portable terminals and coupons available online and usable at brick-and-mortar shops have been expanding.
  • In the meantime, various technologies relating to O2O such as technology for virtual fitting using product images are being developed.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a diagram illustrating an example of a system according to a first embodiment;
  • FIG. 2 is a diagram illustrating an example of a first terminal according to the first embodiment;
  • FIG. 3 is a diagram illustrating an example of a second terminal according to the first embodiment;
  • FIG. 4 is a diagram illustrating an example of a server according to the first embodiment;
  • FIG. 5 is a table illustrating examples of recognition information according to the first embodiment;
  • FIG. 6 is a table illustrating examples of combination information according to the first embodiment;
  • FIG. 7 is a diagram illustrating an example of a data structure of product information according to the first embodiment;
  • FIG. 8 is a flowchart illustrating an example of processing according to the first embodiment;
  • FIG. 9 is a diagram illustrating an example of a system according to a second embodiment;
  • FIG. 10 is a diagram illustrating an example of a third terminal according to the second embodiment;
  • FIG. 11 is a diagram illustrating an example of a server according to the second embodiment;
  • FIG. 12 is a table illustrating examples of purchase information according to the second embodiment;
  • FIG. 13 is a table illustrating examples of first sales promotion information before being updated according to the second embodiment;
  • FIG. 14 is a table illustrating examples of first sales promotion information after being updated according to the second embodiment;
  • FIG. 15 is a table illustrating examples of store layout information before being updated according to the second embodiment;
  • FIG. 16 is a table illustrating examples of store layout information after being updated according to the second embodiment;
  • FIG. 17 is a flowchart illustrating an example of processing according to the second embodiment;
  • FIG. 18 is a diagram illustrating an example of a system according to a modification; and
  • FIG. 19 is a diagram illustrating an example of a hardware configuration of the server according to the embodiments and modifications.
  • DETAILED DESCRIPTION
  • According to an embodiment, a server includes a first acquiring unit, a recognition information storage unit, a second acquiring unit, a combination information storage unit, an analyzing unit, and an output unit. The first acquiring unit is configured to acquire a piece of recognition information including a piece of product identification information for identifying a product included in a product image. The recognition information storage unit is configured to store the piece of recognition information. The second acquiring unit is configured to acquire a piece of combination information including the piece of product identification information of the product to be combined with an object image including an object. The combination information storage unit is configured to store the piece of combination information. The analyzing unit is configured to calculate product priorities for respective products by analyzing a plurality of pieces of recognition information stored in the recognition information storage unit and a plurality of pieces of combination information stored in the combination information storage unit. The output unit is configured to output information based on the product priorities.
  • Embodiments will be described in detail below with reference to the accompanying drawings.
  • First Embodiment
  • FIG. 1 is a configuration diagram illustrating an example of a system 1 according to a first embodiment. As illustrated in FIG. 1, the system 1 includes a first terminal 10, a second terminal 20, and a server 30. The first terminal 10, the second terminal 20 and the server 30 are connected via a network 2. The network 2 can be realized by the Internet or a local area network (LAN), for example.
  • In the first embodiment, an example will be described in which the first terminal 10 is an image recognition terminal that includes a recognizing unit 11 and that acquires related information on a real object of interest to the user when held over the real object. The first terminal 10 can be realized by a portable terminal, for example. In the following, acquiring related information on a real object by holding the first terminal 10 over the real object may be referred to as "focus".
  • Similarly, in the first embodiment, an example will be described in which the second terminal 20 is an image combining terminal that includes a combining unit 21 and that performs virtual fitting simulation, virtual installation simulation, and the like. The second terminal 20 is installed in a store selling products, for example. In the following, experiencing a product of interest to the user through virtual fitting simulation, virtual installation simulation, or the like may be referred to as "try".
  • With the system 1, it is assumed that the user holds the first terminal 10 over a real object of interest to acquire related information on the product and that, starting from the acquired related information, the user is encouraged to go to the store in which the second terminal 20 is installed and to experience the product through virtual fitting simulation or virtual installation simulation, which is linked to purchase of the product.
  • FIG. 2 is a configuration diagram illustrating an example of the first terminal 10 according to the first embodiment. As illustrated in FIG. 2, the first terminal 10 includes a recognizing unit 11, an imaging unit 12, a feedback information storage unit 13, a display unit 14, and an output unit 15.
  • The recognizing unit 11 may be implemented by making a processor such as a central processing unit (CPU) execute a program, that is, by software, may be implemented by hardware such as an integrated circuit (IC), or may be implemented by combination of software and hardware, for example. The imaging unit 12 can be realized by an imager such as a digital camera, for example. The feedback information storage unit 13 can be realized by a storage device that can magnetically, optically or electrically store information such as a hard disk drive (HDD), a solid state drive (SSD), a memory card, an optical disk, or a random access memory (RAM), for example. The display unit 14 can be realized by a display device such as a liquid crystal display or a touch panel display, for example. The output unit 15 can be realized by a communication device such as a network interface card (NIC), for example.
  • The imaging unit 12 images a real object of interest to the user to generate a product image. Examples of the real object of interest to the user include an advertisement of a product of interest to the user, but the real object may be a product itself of interest to the user.
  • The feedback information storage unit 13 stores feedback information. Details of the feedback information will be described later.
  • The recognizing unit 11 includes an image recognizing unit 16 and a feedback unit 17.
  • The image recognizing unit 16 recognizes a product image, estimates a product included in the product image, and selects at least any one of a plurality of kinds of related information on the product. Specifically, the image recognizing unit 16 acquires product information on the estimated product from the server 30 and selects at least any one of a plurality of kinds of related information contained in the acquired product information. The product information acquired by the image recognizing unit 16 contains a product ID (an example of product identification information) of the estimated product and a plurality of kinds of related information. Examples of the kinds of related information include attribute information and accompanying information of the estimated product. Examples of the attribute information include brand, price, color, and material, and examples of the accompanying information include word of mouth, recommended coordinates and store information (address, map, etc.).
  • When feedback information of the estimated product is stored in the feedback information storage unit 13, the image recognizing unit 16 selects related information according to the feedback information.
  • The feedback unit 17 stores feedback information based on information transmitted from the server 30 in the feedback information storage unit 13.
  • The display unit 14 displays the related information selected by the image recognizing unit 16. The display unit 14 displays word of mouth, recommended coordinates, store information or the like of the product estimated by the image recognizing unit 16 as an image, for example.
  • The output unit 15 outputs the recognition information to the server 30. The recognition information at least contains a product ID of the product estimated by the image recognizing unit 16. The recognition information may contain product image information and related information of the product image. The product image information may be the product image itself or may be an image matched with the product image in image recognition performed by the image recognizing unit 16 or an image ID of the image. The recognition information may contain the date and time of recognition, the position of recognition, a user ID of the user, and the like.
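  • As an illustration only (the embodiment does not prescribe any particular data format), a piece of recognition information could be represented as in the following sketch; the field names are assumptions, and only the product ID is mandatory as described above.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical container for a piece of recognition information sent by the
# output unit 15; only product_id is required, the other fields are optional.
@dataclass
class RecognitionInfo:
    product_id: str                            # ID of the product estimated by image recognition
    product_image_id: Optional[str] = None     # image matched during recognition, or its image ID
    related_info: Optional[str] = None         # related information displayed at the first terminal
    recognized_at: Optional[datetime] = None   # date and time of recognition
    position: Optional[str] = None             # position of recognition
    user_id: Optional[str] = None              # ID of the user holding the terminal

# Example: a "focus" event on product P-001 recognized from advertisement image IMAGE 10392.
event = RecognitionInfo(product_id="P-001", product_image_id="IMAGE 10392",
                        recognized_at=datetime(2012, 11, 5, 10, 30))
```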
  • FIG. 3 is a configuration diagram illustrating an example of the second terminal 20 according to the first embodiment. As illustrated in FIG. 3, the second terminal 20 includes a combining unit 21, an imaging unit 22, a feedback information storage unit 23, a display unit 24, and an output unit 25.
  • The combining unit 21 may be implemented by making a processor such as a central processing unit (CPU) execute a program, that is, by software, may be implemented by hardware such as an IC, or may be implemented by combination of software and hardware, for example. The imaging unit 22 can be realized by an imager such as a digital camera, for example. The feedback information storage unit 23 can be realized by a storage device that can magnetically, optically or electrically store information such as an HDD, an SSD, a memory card, an optical disk, or a RAM, for example. The display unit 24 can be realized by a display device such as a liquid crystal display or a touch panel display, for example. The output unit 25 can be realized by a communication device such as an NIC, for example.
  • The imaging unit 22 images an object to be combined to generate an image to be combined. Examples of the object to be combined include the user.
  • The feedback information storage unit 23 stores feedback information. Details of the feedback information will be described later.
  • The combining unit 21 includes an image combining unit 26 and a feedback unit 27.
  • The image combining unit 26 combines the image to be combined generated by the imaging unit 22 and an image for combination of a product (such as clothes). Specifically, the image combining unit 26 acquires product information of a plurality of products from the server 30, displays images for combination contained in the acquired product information on the display unit 24, and combines an image for combination selected by the user with the image to be combined generated by the imaging unit 22. The product information acquired by the image combining unit 26 contains product IDs (an example of product identification information) of the products and a group of images for combination. Since the images for combination are present for each category of products, the images for combination are in a form of groups. The category may be the kind or the use of products or the state in which products are tried on.
  • When feedback information is stored in the feedback information storage unit 23, the image combining unit 26 displays the images for combination on the display unit 24 in a manner that the user can preferentially select an image for combination indicated by the feedback information.
  • The feedback unit 27 stores feedback information based on information transmitted from the server 30 in the feedback information storage unit 23.
  • The display unit 24 displays images for combination to be selected by the user and combined images obtained by combination by the image combining unit 26.
  • The output unit 25 outputs combination information to the server 30. The combination information at least contains a product ID of the product estimated by the image combining unit 26. The combination information may contain combined image information of an image to be combined and combination image information of an image for combination. The combined image information may be the image to be combined itself or may contain depth information obtained by sensing the image to be combined, skeleton information indicating the outline of a person, and measurement information such as height, weight, chest circumference, sitting height and the like in addition to the image to be combined. The combination image information may be the image for combination itself or may be an image ID of the image for combination. The combination information may contain the date and time of combination, the position of combination, a user ID of the user, the category of the product and the like.
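  • Similarly, a piece of combination information could be represented as in the following sketch; again, the field names are illustrative assumptions and only the product ID is required.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical container for a piece of combination information sent by the
# output unit 25; only product_id is required, the rest mirrors the optional
# contents listed above.
@dataclass
class CombinationInfo:
    product_id: str                             # ID of the product in the image for combination
    combined_image_id: Optional[str] = None     # image to be combined (e.g. the user), or its ID
    combination_image_id: Optional[str] = None  # image for combination (e.g. clothes), or its ID
    height_cm: Optional[float] = None           # example of measurement information
    combined_at: Optional[datetime] = None      # date and time of combination
    category: Optional[str] = None              # category of the product (kind, use, try-on state)
    user_id: Optional[str] = None

event = CombinationInfo(product_id="P-001", category="shoulder bag",
                        combined_at=datetime(2012, 11, 6, 15, 0))
```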
  • FIG. 4 is a configuration diagram illustrating an example of the server 30 according to the first embodiment. As illustrated in FIG. 4, the server 30 includes a first acquiring unit 31, a recognition information storage unit 32, a second acquiring unit 33, a combination information storage unit 34, an analyzing unit 35, an output unit 36, and a product information storage unit 37.
  • The first acquiring unit 31, the second acquiring unit 33 and the analyzing unit 35 may be implemented by making a processor such as a central processing unit (CPU) execute a program, that is, by software, may be implemented by hardware such as an integrated circuit (IC), or may be implemented by combination of software and hardware, for example. The recognition information storage unit 32, the combination information storage unit 34, and the product information storage unit 37 can be realized by a storage device that can magnetically, optically or electrically store information such as an HDD, an SSD, a memory card, an optical disk, or a RAM, for example. The output unit 36 can be realized by a communication device such as an NIC, for example.
  • The first acquiring unit 31 acquires recognition information including at least the product ID of the product estimated by the recognizing unit 11 from the recognizing unit 11 (the output unit 15), and stores the acquired recognition information in the recognition information storage unit 32. The recognition information may further contain information as mentioned in the description of the output unit 15.
  • The recognition information storage unit 32 stores a plurality of pieces of recognition information stored by the first acquiring unit 31. FIG. 5 is a table illustrating examples of the recognition information according to the first embodiment. In the examples illustrated in FIG. 5, the recognition information is information in which a number, the date and time of recognition, the product image information, a product ID, and the related information displayed at the first terminal 10 are associated, but the recognition information is not limited thereto.
  • The second acquiring unit 33 acquires combination information including at least the product ID of the product in the image for combination combined with the image to be combined from the combining unit 21 (the output unit 25), and stores the acquired combination information in the combination information storage unit 34. The combination information may further contain information as mentioned in the description of the output unit 25.
  • The combination information storage unit 34 stores a plurality of pieces of combination information stored by the second acquiring unit 33. FIG. 6 is a table illustrating examples of the combination information according to the first embodiment. In the examples illustrated in FIG. 6, the combination information is information in which the number, the date and time of combination, the combined image information, the product ID and the category are associated, but the combination information is not limited thereto.
  • The analyzing unit 35 analyzes a plurality of pieces of recognition information stored in the recognition information storage unit 32 and a plurality of pieces of combination information stored in the combination information storage unit 34, and calculates the product priority of each product. Specifically, the analyzing unit 35 analyzes the pieces of recognition information to calculate a first product priority of each product, analyzes the pieces of combination information to calculate a second product priority of each product, and calculates the product priority of each product on the basis of the first product priority and the second product priority of each product.
  • For example, the analyzing unit 35 calculates the product priority E of a certain product by calculating the first product priority Er and the second product priority Es of the product and calculating a weighted addition of the calculated first product priority Er and second product priority Es as expressed by Equation (1). If the product ID of the product for which the first product priority Er is calculated is not present in the pieces of combination information, the second product priority Es of the product is obviously 0, and if the product ID of the product for which the second product priority Es is calculated is not present in the pieces of recognition information, the first product priority Er of the product is obviously 0.

  • E=wr×Er+ws×Es  (1)
  • In Equation (1), wr represents the weight of the priority Er and ws represents the weight of the priority Es.
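  • A minimal sketch of Equation (1) follows; the weight values wr and ws used here are arbitrary examples, not values specified by the embodiment.

```python
# Minimal sketch of Equation (1): product priority E as a weighted sum of the
# first product priority Er (from recognition history) and the second product
# priority Es (from combination history). The default weights are arbitrary examples.
def product_priority(er: float, es: float, wr: float = 0.5, ws: float = 0.5) -> float:
    return wr * er + ws * es

# A product that was "focused" but never "tried" simply gets Es = 0.
print(product_priority(er=0.8, es=0.0))  # 0.4
```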
  • Note that the analyzing unit 35 analyzes the pieces of recognition information and sets the first product priority Er of the product represented by a product ID associated with recognition date and time to be higher as the recognition date and time is closer to the current date and time. That is, the analyzing unit 35 sets the first product priority Er to be higher for a product over which a terminal was held on the date and time closer to the current date and time.
  • The analyzing unit 35 also analyzes the pieces of recognition information and sets the first product priority Er to be higher for a product represented by a product ID having a value whose number of occurrences is larger. That is, the analyzing unit 35 sets the first product priority Er to be higher for a product over which a terminal was held a larger number of times.
  • Similarly, the analyzing unit 35 analyzes the pieces of combination information and sets the second product priority Es of a product represented by a product ID associated with combination date and time contained in the combination information to be higher as the combination date and time is closer to the current date and time. That is, the analyzing unit 35 sets the second product priority Es to be higher for a product that was tried on the date and time closer to the current date and time.
  • The analyzing unit 35 also analyzes the pieces of combination information and sets the second product priority Es to be higher for a product represented by a product ID having a value whose number of occurrences is larger. That is, the analyzing unit 35 sets the second product priority Es to be higher for a product that was tried a larger number of times.
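  • The embodiment does not fix a concrete scoring formula; the following sketch is merely one way, consistent with the description above, of making a priority higher for product IDs that occur more often and more recently. The exponential decay and its half-life are assumptions made for illustration.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Sketch: score each product from its recognition (or combination) history so that
# more occurrences and more recent occurrences give a higher priority. The decay
# formula is an assumption; the embodiment only requires "closer to the current
# date and time" and "larger number of occurrences" to score higher.
def history_priorities(events, now, half_life_days=7.0):
    scores = defaultdict(float)
    for product_id, when in events:
        age_days = (now - when).total_seconds() / 86400.0
        scores[product_id] += 0.5 ** (age_days / half_life_days)  # recency-weighted count
    return dict(scores)

now = datetime(2012, 11, 7)
recognition_events = [("P-001", now - timedelta(days=1)),
                      ("P-001", now - timedelta(days=3)),
                      ("P-002", now - timedelta(days=10))]
print(history_priorities(recognition_events, now))  # P-001 scores higher than P-002
```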
  • The analyzing unit 35 further analyzes whether or not combination information including a product ID of a product whose product priority E satisfies a first predetermined condition exists in the pieces of combination information, and generates first recommendation information recommending related information according to the analysis result among a plurality of kinds of related information. The first predetermined condition may be a threshold or may be the product priorities E from the highest priority to a certain predetermined rank of priority.
  • For example, if combination information including a product ID of a product whose product priority E satisfies the first predetermined condition does not exist in the pieces of combination information, the analyzing unit 35 generates first recommendation information recommending store information among a plurality of kinds of related information. In this case, since “focus” is performed but “try” is not performed, it is possible to encourage the user to perform “try” by recommending the store information of the store in which the second terminal 20 is installed, and as a result, it may be possible to motivate the user to buy the product.
  • If, for example, combination information including a product ID of a product whose product priority E satisfies the first predetermined condition exists in the pieces of combination information, the analyzing unit 35 generates first recommendation information recommending recommended coordinates among a plurality of kinds of related information. In this case, since both “focus” and “try” are performed, it may be possible to motivate the user to buy other products recommended in the recommended coordinates by recommending the recommended coordinates.
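  • The following sketch illustrates the generation of the first recommendation information under the assumption that the first predetermined condition is a simple threshold; the threshold value and the function name are illustrative, not part of the embodiment.

```python
# Sketch of the first recommendation information. For a product whose product
# priority E satisfies the first predetermined condition (modelled here as a
# threshold, which is an assumption), store information is recommended when the
# product ID never appears in the pieces of combination information ("focus"
# without "try"), and the recommended coordinates are recommended otherwise.
def first_recommendation(product_id: str, priority: float,
                         tried_product_ids: set, threshold: float = 1.0):
    if priority < threshold:
        return None  # condition not satisfied: no recommendation generated
    if product_id in tried_product_ids:
        return "recommended coordinates"   # "focus" and "try" both performed
    return "store information"             # "focus" performed but not "try"

print(first_recommendation("P-001", 1.4, tried_product_ids={"P-002"}))  # store information
```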
  • Furthermore, if there exists a plurality of categories of a product whose product priority satisfies a second predetermined condition, the analyzing unit 35 analyzes the number of occurrences of each of the categories in a plurality of pieces of combination information and generates second recommendation information recommending a category with the largest number of occurrences. The second predetermined condition may be a threshold or may be the product priorities E from the highest priority to a certain predetermined rank of priority.
  • For example, it is assumed that a product with the product priority E satisfying the second predetermined condition is a bag that can be carried in three ways: as a handbag, as a shoulder bag, and as a backpack. In this case, since the categories of the bag are handbag, shoulder bag, and backpack, the analyzing unit 35 analyzes the number of occurrences of each of handbag, shoulder bag, and backpack in the pieces of combination information. Then, if the number of occurrences of shoulder bag is the largest, the analyzing unit 35 generates second recommendation information recommending the shoulder bag. In this case, since it is popular among users to perform "try" on the shoulder bag, it may be possible to motivate the user to buy the product by recommending the shoulder bag. If, however, "try" on the shoulder bag style of the bag has already been performed, second recommendation information recommending another category such as handbag or backpack on which "try" has not been performed may be generated.
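  • A minimal sketch of the second recommendation information follows, counting category occurrences in the pieces of combination information; the optional exclusion of categories on which "try" has already been performed corresponds to the variation described above.

```python
from collections import Counter

# Sketch of the second recommendation information: among the categories of a
# product whose priority satisfies the second predetermined condition, recommend
# the category that occurs most often in the pieces of combination information.
def second_recommendation(categories_in_combination_info, exclude=()):
    counts = Counter(c for c in categories_in_combination_info if c not in exclude)
    return counts.most_common(1)[0][0] if counts else None

observed = ["shoulder bag", "handbag", "shoulder bag", "backpack", "shoulder bag"]
print(second_recommendation(observed))                            # shoulder bag
print(second_recommendation(observed, exclude={"shoulder bag"}))  # handbag
```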
  • The output unit 36 outputs information regarding a product based on the product priority calculated by the analyzing unit 35 to at least one of the recognizing unit 11 and the combining unit 21. The information based on the product priority may be the product priority itself or may be related information or an image for combination of the product with the product priority. The related information and the image for combination can be obtained from the product information storage unit 37.
  • The output unit 36 also outputs information based on the product priority calculated by the analyzing unit 35 and the first recommendation information generated by the analyzing unit 35 to the recognizing unit 11. The information based on the product priority and the first recommendation information may be information indicating the product priority and recommended related information or may be recommended related information on the product with the product priority.
  • The output unit 36 also outputs information based on the product priority calculated by the analyzing unit 35 and the second recommendation information generated by the analyzing unit 35 to the combining unit 21. The information based on the product priority and the second recommendation information may be the product priority and an image ID of a recommended image for combination or may be a recommended image for combination of the product with the product priority.
  • The information output by the output unit 36 in this manner is used as feedback information at the recognizing unit 11 and the combining unit 21, so that information with higher probability of motivating the user to buy a product is preferentially displayed at the first terminal 10 and the second terminal 20.
  • When it is requested by the image recognizing unit 16 to acquire product information, the output unit 36 acquires the requested product information from the product information storage unit 37 and outputs the acquired product information to the image recognizing unit 16. Similarly, when it is requested by the image combining unit 26 to acquire product information, the output unit 36 acquires the requested product information from the product information storage unit 37 and outputs the acquired product information to the image combining unit 26.
  • The product information storage unit 37 stores product information of products. FIG. 7 is a diagram illustrating an example of a data structure of the product information according to the first embodiment. In the example illustrated in FIG. 7, the product information is information in which a product ID, attribute information (brand, price, color, material, etc.), accompanying information (word of mouth, recommended coordinates, store information (address, map, etc.), etc.) and a group of images for combination are associated, but the product information is not limited thereto.
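  • For illustration only, one product information record of the kind shown in FIG. 7 might be laid out as follows; the keys and values are examples and do not reproduce actual data.

```python
# Illustrative shape of one product information record as stored in the product
# information storage unit 37 (keys and values are hypothetical examples).
product_info = {
    "product_id": "P-001",
    "attribute_information": {"brand": "Brand A", "price": 9800,
                              "color": "navy", "material": "cotton"},
    "accompanying_information": {"word_of_mouth": ["..."],
                                 "recommended_coordinates": ["P-014", "P-025"],
                                 "store_information": {"address": "...", "map": "..."}},
    "images_for_combination": {"handbag": ["IMG-1"], "shoulder bag": ["IMG-2"]},
}
```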
  • FIG. 8 is a flowchart illustrating an example of a flow of procedures of processing performed by the server 30 according to the first embodiment.
  • First, the first acquiring unit 31 acquires recognition information including at least a product ID of a product estimated by the recognizing unit 11 from the recognizing unit 11 (the output unit 15), and stores the acquired recognition information in the recognition information storage unit 32 (step S101).
  • Subsequently, the second acquiring unit 33 acquires combination information including at least a product ID of a product in an image for combination combined with an image to be combined from the combining unit 21 (the output unit 25), and stores the acquired combination information in the combination information storage unit 34 (step S103).
  • Subsequently, the analyzing unit 35 analyzes a plurality of pieces of recognition information stored in the recognition information storage unit 32 to calculate first product priority of each product, analyzes a plurality of pieces of combination information stored in the combination information storage unit 34 to calculate second product priority of each product, and calculates product priority of each product on the basis of the first product priority and the second product priority of each product (step S105).
  • Subsequently, the analyzing unit 35 further analyzes whether or not combination information including a product ID of a product whose product priority satisfies the first predetermined condition exists in the pieces of combination information, and generates first recommendation information recommending related information according to the analysis result among a plurality of kinds of related information (step S107).
  • Subsequently, the output unit 36 outputs information based on the product priority calculated by the analyzing unit 35 and the first recommendation information generated by the analyzing unit 35 to the recognizing unit 11 (step S109).
  • Subsequently, if there exists a plurality of categories of a product whose product priority satisfies the second predetermined condition, the analyzing unit 35 analyzes the number of occurrences of each of the categories in the pieces of combination information and generates second recommendation information recommending a category with the largest number of occurrences (step S111).
  • Subsequently, the output unit 36 outputs information based on the product priority calculated by the analyzing unit 35 and the second recommendation information generated by the analyzing unit 35 to the combining unit 21 (step S113).
  • As described above, according to the first embodiment, since the product priority taking history of various O2O related technologies into consideration can be calculated by analyzing the history of the recognition information and the history of the combination information to calculate the product priority, products of greater interest to the user can be extracted. In addition, according to the first embodiment, since information based on the calculated product priority is output to the recognizing unit and the combining unit, the recognizing unit and the combining unit can preferentially present products of greater interest to the user by using the information and it is thus possible to increase the probability of motivating the user to buy a product.
  • In particular, according to the first embodiment, since not only information on a product of higher interest to the user but also information with high probability of motivating the user to buy a product can be extracted from a plurality of kinds of related information of the product, the recognizing unit can preferentially present information of greater interest to the user by using the information and it is thus possible to increase the probability of motivating the user to buy a product.
  • Similarly, according to the first embodiment, since not only information on a product of higher interest to the user but also information with high probability of motivating the user to buy a product can be extracted from the categories of the product, the combining unit can preferentially present information of greater interest to the user by using the information and it is thus possible to increase the probability of motivating the user to buy a product.
  • According to the first embodiment, since the recognizing unit 11 (the output unit 15) can include product image information and related information in the recognition information, it is also possible to figure out over what real objects the user held the terminal and what products the user is interested in. For example, it is possible to figure out whether the user became interested in a product X by holding the terminal over an advertisement A or by holding the terminal over an advertisement B, which allows the history through which the user became interested in the product X to be used in the analysis.
  • Similarly, according to the first embodiment, since the combining unit 21 (the output unit 25) can contain combined image information and combination image information in the combination information, it is also possible to figure out what image for combination is combined with what image to be combined. For example, it is possible to figure out such a fact that people with a body type A often try on clothes Y or such a fact that people with a body type B often try on clothes Z, and it is thus possible to obtain a tendency of “try” of each user by data analysis.
  • Second Embodiment
  • In the second embodiment, an example in which a third terminal including a managing unit that manages sales information on sales of products is further provided will be described. In the following, the difference from the first embodiment will be mainly described and components having similar functions as in the first embodiment will be designated by the same names and reference numerals as in the first embodiment, and the description thereof will not be repeated.
  • FIG. 9 is a configuration diagram illustrating an example of a system 101 according to the second embodiment. As illustrated in FIG. 9, the system 101 is different from that in the first embodiment in a server 130 and a third terminal 140 thereof.
  • In the second embodiment, an example in which the third terminal 140 is a management terminal that includes a managing unit 141 and that manages sales information related to sales of products will be described.
  • FIG. 10 is a configuration diagram illustrating an example of the third terminal 140 according to the second embodiment. As illustrated in FIG. 10, the third terminal 140 includes a managing unit 141, a sales information storage unit 142, a display unit 143, and an output unit 144.
  • The managing unit 141 may be implemented by making a processor such as a CPU execute a program, that is, by software, may be implemented by hardware such as an IC, or may be implemented by combination of software and hardware, for example. The sales information storage unit 142 can be realized by a storage device that can magnetically, optically or electrically store information such as an HDD, an SSD, a memory card, an optical disk, or a RAM, for example. The display unit 143 can be realized by a display device such as a liquid crystal display or a touch panel display, for example. The output unit 144 can be realized by a communication device such as an NIC, for example.
  • The sales information storage unit 142 stores sales information related to sales of products. Examples of the sales information include purchase information indicating details of purchase of a product, sales promotion information relating to sales promotion of a product, customer information, inventory information, and training information relating to training of store staff. The purchase information contains at least a product ID of a product to be purchased. The purchase information may also contain the date and time of purchase. The sales promotion information contains first sales promotion information relating to sales promotion using product images and second sales promotion information relating to sales promotion using images for combination. Examples of the sales promotion information include information on advertising strategy, store layout, procurement plan, product lineup, and methods for recommending products to customers.
  • The managing unit 141 manages the sales information stored in the sales information storage unit 142.
  • The display unit 143 displays the sales information managed by the managing unit 141.
  • The output unit 144 outputs the sales information to the server 130. For example, the output unit 144 outputs purchase information and sales promotion information to the server 130.
  • FIG. 11 is a configuration diagram illustrating an example of the server 130 according to the second embodiment. As illustrated in FIG. 11, the server 130 is different from that in the first embodiment in an analyzing unit 135, a third acquiring unit 138, and a sales information storage unit 139.
  • The third acquiring unit 138 may be implemented by making a processor such as a CPU execute a program, that is, by software, may be implemented by hardware such as an IC, or may be implemented by combination of software and hardware, for example. The sales information storage unit 139 can be realized by a storage device that can magnetically, optically or electrically store information such as an HDD, an SSD, a memory card, an optical disk, or a RAM, for example.
  • The third acquiring unit 138 acquires purchase information and sales promotion information including at least a product ID of a product to be purchased from the managing unit 141 (the output unit 144), and stores the acquired purchase information and sales promotion information in the sales information storage unit 139. Note that the purchase information and the sales promotion information may further contain information mentioned in the description of the sales information storage unit 142.
  • The sales information storage unit 139 stores a plurality of pieces of purchase information and sales promotion information stored by the third acquiring unit 138. FIG. 12 is a table illustrating examples of the purchase information according to the second embodiment. In the examples illustrated in FIG. 12, the purchase information is information in which a number, the date and time of purchase, and a product ID are associated, but the purchase information is not limited thereto.
  • The analyzing unit 135 performs at least one of a first analysis of analyzing a plurality of pieces of recognition information stored in the recognition information storage unit 32, a plurality of pieces of combination information stored in the combination information storage unit 34, and a plurality of pieces of purchase information stored in the sales information storage unit 139 to calculate the product priority of each product, and a second analysis of analyzing at least either a plurality of pieces of recognition information or a plurality of pieces of combination information in addition to a plurality of pieces of purchase information to obtain updated contents of the sales information.
  • First, the first analysis will be described.
  • The analyzing unit 135 analyzes a plurality of pieces of recognition information to calculate the first product priority of each product, analyzes a plurality of pieces of combination information to calculate the second product priority of each product, analyzes a plurality of pieces of purchase information to calculate a third product priority of each product, and calculates the product priority of each product on the basis of the first product priority, the second product priority, and the third product priority of each product.
  • For example, the analyzing unit 135 calculates the product priority E of a certain product by calculating the first product priority Er, the second product priority Es and the third product priority Eb of the product and calculating weighted addition of the calculated first product priority Er, second product priority Es and third product priority Eb as expressed by Equation (2).

  • E=wr×Er+ws×Es+wb×Eb  (2)
  • In Equation (2), wb represents the weight of the priority Eb.
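  • A minimal sketch of Equation (2) follows, extending the sketch of Equation (1) with the third term wb×Eb; the weight values are arbitrary examples.

```python
# Sketch of Equation (2): the purchase history adds a third term wb*Eb, where Eb
# is the third product priority, on top of Equation (1). Weights are example values.
def product_priority_with_purchases(er, es, eb, wr=0.4, ws=0.3, wb=0.3):
    return wr * er + ws * es + wb * eb

print(product_priority_with_purchases(er=0.8, es=0.5, eb=1.0))  # approximately 0.77
```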
  • Note that the analyzing unit 135 analyzes the pieces of recognition information and sets the first product priority Er of the product represented by a product ID associated with recognition date and time to be higher as the recognition date and time is closer to the current date and time. That is, the analyzing unit 135 sets the first product priority Er to be higher for a product over which a terminal was held on the date and time closer to the current date and time.
  • The analyzing unit 135 also analyzes the pieces of recognition information and sets the first product priority Er to be higher for a product represented by a product ID having a value whose number of occurrences is larger. That is, the analyzing unit 135 sets the first product priority Er to be higher for a product over which a terminal was held a larger number of times.
  • Similarly, the analyzing unit 135 analyzes the pieces of combination information and sets the second product priority Es of a product represented by a product ID associated with combination date and time contained in the combination information to be higher as the combination date and time is closer to the current date and time. That is, the analyzing unit 135 sets the second product priority Es to be higher for a product that was tried on the date and time closer to the current date and time.
  • The analyzing unit 135 also analyzes the pieces of combination information and sets the second product priority Es to be higher for a product represented by a product ID having a value whose number of occurrences is larger. That is, the analyzing unit 135 sets the second product priority Es to be higher for a product that was tried a larger number of times.
  • Similarly, the analyzing unit 135 analyzes the pieces of purchase information and sets the third product priority Eb of a product represented by a product ID associated with purchase date and time contained in the purchase information to be higher as the purchase date and time is closer to the current date and time. That is, the analyzing unit 135 sets the third product priority Eb to be higher for a product that was purchased on the date and time closer to the current date and time.
  • The analyzing unit 135 also analyzes the pieces of purchase information and sets the third product priority Eb to be higher for a product represented by a product ID having a value whose number of occurrences is larger. That is, the analyzing unit 135 sets the third product priority Eb to be higher for a product that was purchased a larger number of times.
  • Since the generation of the first recommendation information and the second recommendation information is the same as that in the first embodiment, the description thereof will not be repeated.
  • Next, the second analysis will be described.
  • The analyzing unit 135 determines whether or not the behavior of “focus” and the behavior of “try” of the user led to purchase of a product by analyzing at least either a plurality of pieces of recognition information or a plurality of pieces of combination information in addition to a plurality of pieces of purchase information and obtains updated contents of sales promotion information.
  • Specifically, the analyzing unit 135 analyzes a plurality of pieces of purchase information, analyzes the number of occurrences, in the pieces of recognition information, of product image information associated with a product ID having a value whose number of occurrences in the purchase information satisfies a third predetermined condition, and obtains updated contents of the first sales promotion information according to the number of occurrences of the product image information. The third predetermined condition may be thresholds in multiple steps including an increase determination threshold for determining whether or not to increase a value and a decrease determination threshold for determining whether or not to decrease a value, for example.
  • FIG. 13 is a table illustrating examples of the first sales promotion information before being updated according to the second embodiment, and FIG. 14 is a table illustrating examples of the first sales promotion information after being updated according to the second embodiment. In the examples illustrated in FIGS. 13 and 14, the first sales promotion information is information in which a number, an advertisement ID, an image ID (product image information) of a product image, and the number of advertisements are associated, but the first sales promotion information is not limited thereto. In the examples illustrated in FIG. 13, it is assumed that the number of occurrences, in the pieces of purchase information, of the value of the product ID associated with each of the image IDs "IMAGE 10392" and "IMAGE 10192" satisfies the increase determination threshold, while the number of occurrences, in the pieces of purchase information, of the value of the product ID associated with the image ID "IMAGE 10291" satisfies the decrease determination threshold. It is also assumed that the numbers of occurrences, in the pieces of recognition information, of the image IDs "IMAGE 10392" and "IMAGE 10192" are larger than an average, while the number of occurrences of the image ID "IMAGE 10291" is much smaller than the average.
  • That is, for the products corresponding to the image IDs "IMAGE 10392" and "IMAGE 10192", it is found that "focus" over an advertisement A and an advertisement C led to purchase of the products, that the effect of the advertisement A and the advertisement C is high, and that sales promotion using the advertisement A and the advertisement C is therefore to be enhanced. On the other hand, for the product corresponding to the image ID "IMAGE 10291", it is found that "focus" over an advertisement B did not lead to purchase of the product, that the effect of the advertisement B is low, and that sales promotion using the advertisement B is to be reduced.
  • In this case, the analyzing unit 135 obtains, as the updated contents of the first sales promotion information, updated contents in which the numbers of advertisements for the image IDs "IMAGE 10392" and "IMAGE 10192" are increased by 10 while the number of advertisements for the image ID "IMAGE 10291" is decreased by 20, for example. As a result, it is possible to update the first sales promotion information illustrated in FIG. 13 to that illustrated in FIG. 14.
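  • The following sketch illustrates one possible update of the first sales promotion information; the thresholds, the assumed recognition average, and the step sizes (+10 and -20) merely mirror the example above and are not values fixed by the embodiment.

```python
# Sketch of updating the first sales promotion information (FIGS. 13 and 14):
# increase the number of advertisements for product images whose products sell
# well and are often "focused", and decrease it when sales are low. All numeric
# parameters below are illustrative assumptions.
def update_ad_counts(promotions, purchase_counts, recognition_counts,
                     increase_threshold=50, decrease_threshold=5,
                     recognition_average=30):
    updated = []
    for entry in promotions:  # entry: {"image_id": ..., "product_id": ..., "num_ads": ...}
        bought = purchase_counts.get(entry["product_id"], 0)
        focused = recognition_counts.get(entry["image_id"], 0)
        num_ads = entry["num_ads"]
        if bought >= increase_threshold and focused > recognition_average:
            num_ads += 10                    # advertisement is effective: enhance promotion
        elif bought <= decrease_threshold:
            num_ads = max(0, num_ads - 20)   # advertisement effect is low: reduce promotion
        updated.append({**entry, "num_ads": num_ads})
    return updated
```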
  • The analyzing unit 135 also analyzes the pieces of purchase information, analyzes the number of occurrences, in the pieces of combination information, of combination image information associated with a product ID having a value whose number of occurrences in the purchase information satisfies a fourth predetermined condition, and obtains updated contents of the second sales promotion information according to the number of occurrences of the combination image information. The fourth predetermined condition may be thresholds in multiple steps including an increase determination threshold for determining whether or not to increase a value and a decrease determination threshold for determining whether or not to decrease a value, for example.
  • The analyzing unit 135 can also obtain updated contents of store layout by analyzing a plurality of pieces of purchase information and determining the sales rate of products sold together. The sales rate of products sold together can be calculated from purchase date and time or the like in the purchase information.
  • FIG. 15 is a table illustrating examples of the store layout information before being updated according to the second embodiment, and FIG. 16 is a table illustrating examples of the store layout information after being updated according to the second embodiment. In the examples illustrated in FIGS. 15 and 16, the store layout information is information in which a number, a shelf ID, and a product ID are associated, but the store layout information is not limited thereto. It is assumed that a shelf A and a shelf B are adjacent to each other while a shelf C is adjacent to neither the shelf A nor the shelf B. In the examples illustrated in FIG. 15, it is assumed that the rate at which the products with the product IDs "PRODUCT 20928" and "PRODUCT 20290" are sold together satisfies the increase determination threshold.
  • That is, since the rate at which the products with the product IDs "PRODUCT 20928" and "PRODUCT 20290" are sold together is high, it is found that sales promotion is to be enhanced by placing these products on adjacent shelves. In this case, the analyzing unit 135 obtains updated contents in which the product with the product ID "PRODUCT 20290" is arranged on the shelf B and the product with the product ID "PRODUCT 20660" is arranged on the shelf C. As a result, it is possible to update the store layout information illustrated in FIG. 15 to that illustrated in FIG. 16.
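  • As a sketch of the underlying analysis, the rate at which products are sold together could be estimated by grouping purchase records into baskets and counting co-occurring product IDs; grouping by purchase date and time is an assumption made for illustration.

```python
from collections import Counter
from itertools import combinations

# Sketch: estimate how often two products are sold together by grouping purchase
# records into baskets (naively, one basket per purchase date and time) and counting
# co-occurring product pairs. Pairs whose count or rate satisfies the increase
# determination threshold are candidates for placement on adjacent shelves.
def co_purchase_counts(purchases):
    baskets = {}
    for purchased_at, product_id in purchases:   # (date and time of purchase, product ID)
        baskets.setdefault(purchased_at, set()).add(product_id)
    pair_counts = Counter()
    for items in baskets.values():
        for a, b in combinations(sorted(items), 2):
            pair_counts[(a, b)] += 1
    return pair_counts

purchases = [("2012-11-05 10:00", "PRODUCT 20928"), ("2012-11-05 10:00", "PRODUCT 20290"),
             ("2012-11-06 12:30", "PRODUCT 20928"), ("2012-11-06 12:30", "PRODUCT 20290")]
print(co_purchase_counts(purchases))  # the pair ('PRODUCT 20290', 'PRODUCT 20928') occurs in 2 baskets
```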
  • The output unit 36 performs at least one of first output of outputting information based on the product priority calculated by the analyzing unit 135 to at least one of the recognizing unit 11 and the combining unit 21 and second output of outputting the updated contents obtained by the analyzing unit 135 to the managing unit 141.
  • Since the first output is the same as in the first embodiment, the description thereof will not be repeated.
  • As for the second output, information output by the output unit 36 in this manner is used for update of sales promotion information at the managing unit 141, and sales promotion information with higher probability of motivating the user to buy a product will thus be managed at the third terminal 140.
  • FIG. 17 is a flowchart illustrating an example of a flow of procedures of processing performed by the server 130 according to the second embodiment.
  • First, the first acquiring unit 31 acquires recognition information including at least a product ID of a product estimated by the recognizing unit 11 from the recognizing unit 11 (the output unit 15), and stores the acquired recognition information in the recognition information storage unit 32 (step S401).
  • Subsequently, the second acquiring unit 33 acquires combination information including at least a product ID of a product in an image for combination combined with an image to be combined from the combining unit 21 (the output unit 25), and stores the acquired combination information in the combination information storage unit 34 (step S403).
  • Subsequently, the third acquiring unit 138 acquires purchase information and sales promotion information including at least a product ID of a product to be purchased from the managing unit 141 (the output unit 144), and stores the acquired purchase information and sales promotion information in the sales information storage unit 139 (step S405).
  • Subsequently, the analyzing unit 135 analyzes a plurality of pieces of recognition information stored in the recognition information storage unit 32 to calculate the first product priority of each product, analyzes a plurality of pieces of combination information stored in the combination information storage unit 34 to calculate the second product priority of each product, analyzes a plurality of pieces of purchase information stored in the sales information storage unit 139 to calculate the third product priority of each product, and calculates the product priority of each product on the basis of the first product priority, the second product priority and the third product priority of each product (step S407).
  • Subsequently, the analyzing unit 135 further analyzes whether or not combination information including a product ID of a product whose product priority satisfies the first predetermined condition exists in the pieces of combination information, and generates first recommendation information recommending related information according to the analysis result among a plurality of kinds of related information (step S409).
  • Subsequently, the output unit 36 outputs information based on the product priority calculated by the analyzing unit 135 and the first recommendation information generated by the analyzing unit 135 to the recognizing unit 11 (step S411).
  • Subsequently, if there exists a plurality of categories of a product whose product priority satisfies the second predetermined condition, the analyzing unit 135 analyzes the number of occurrences of each of the categories in the pieces of combination information and generates second recommendation information recommending a category with the largest number of occurrences (step S413).
  • Subsequently, the output unit 36 outputs information based on the product priority calculated by the analyzing unit 135 and the second recommendation information generated by the analyzing unit 135 to the combining unit 21 (step S415).
  • Subsequently, the analyzing unit 135 analyzes at least either a plurality of pieces of recognition information or a plurality of pieces of combination information in addition to a plurality of pieces of purchase information to obtain updated contents of the sales promotion information (step S417).
  • Subsequently, the output unit 36 outputs the updated contents of the sales promotion information obtained by the analyzing unit 135 to the managing unit 141 (step S419).
  • As described above, according to the second embodiment, products of greater interest to the user can be extracted more effectively by further analyzing the purchase information to calculate the product priority. In addition, according to the second embodiment, since information based on the calculated product priority is output to the recognizing unit and the combining unit, the recognizing unit and the combining unit can more preferentially present products of greater interest to the user by using the information and it is thus possible to further increase the probability of motivating the user to buy a product.
  • Furthermore, according to the second embodiment, more effective sales management can be realized by analyzing at least one of the history of the recognition information and the history of the combination information in addition to the history of the purchase information, which can lead to analysis and improvement of advertising effectiveness, improvement in product lineup, efficiency in product recommendation to customers (improvement in methods for training store staff), improvement in procurement plan, and improvement in store layouts.
  • Modifications
  • While examples in which histories of the image recognition terminal that implements "focus", the image combining terminal that implements "try", and the management terminal that manages sales information are used have been described in the embodiments described above, the embodiments are not limited thereto, and histories of terminals using various O2O related technologies can be used, such as the history of a terminal implementing "search", that is, searching for related product information according to attributes of a product over which "focus" is performed.
  • Furthermore, in the embodiments described above, the analyzing unit 35 need not necessarily analyze all the histories. That is, the analyzing unit 35 may set any of the weights to 0.
  • Furthermore, while examples in which the first terminal 10 including the recognizing unit 11 and the second terminal 20 including the combining unit 21 are different terminals have been described in the embodiments described above, the recognizing unit 11 and the combining unit 21 may be included in one terminal 250 as in a system 201 illustrated in FIG. 18.
  • Hardware Configuration
  • FIG. 19 is a diagram illustrating an example of a hardware configuration of the server according to the embodiments and modifications. The server according to the embodiments and modifications described above includes a control device 901 such as a CPU, a storage device 902 such as a ROM and a RAM, an external storage device 903 such as an HDD, a display device 904 such as a display, an input device 905 such as a keyboard and a mouse, and a communication device 906 such as a communication interface (I/F), that is, a hardware configuration utilizing a common computer system.
  • Programs to be executed by the server according to the embodiments and modifications described above are recorded on a computer readable recording medium such as a CD-ROM, a CD-R, a memory card, a digital versatile disk (DVD), or a flexible disk (FD) in the form of a file that can be installed or executed, and are provided therefrom as a computer program product.
  • Alternatively, the programs to be executed by the server according to the embodiments and modifications may be stored on a computer system connected to a network such as the Internet, and provided by being downloaded via the network. Still alternatively, the programs to be executed by the server according to the embodiments and modifications may be provided or distributed through a network such as the Internet. Still alternatively, the programs to be executed by the server according to the embodiments and modifications may be embedded in a ROM or the like in advance and provided therefrom.
  • The programs to be executed by the server according to the embodiments and modifications have modular structures for implementing the units described above on a computer system. In an actual hardware configuration, the CPU reads programs from the HDD and executes the programs on the RAM, whereby the respective units described above are implemented on a computer system.
  • For example, the order in which the steps in the flowcharts in the embodiments described above are performed may be changed, a plurality of steps may be performed at the same time, or the order in which the steps are performed may be changed each time the steps are performed, to the extent that such changes are not inconsistent with the nature thereof.
  • As described above, according to the embodiments and modifications described above, information with high probability of motivating the user to buy a product can be extracted.
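  • As a purely illustrative sketch of such extraction (not the claimed implementation), product priorities could, under the simplifying assumption that they are derived from occurrence counts alone, be computed as follows; the names analyze_histories, recognition_history, and combination_history are hypothetical.

        from collections import Counter

        # Hypothetical sketch: count how often each product identifier occurs in the
        # recognition ("focus") and combination ("try") histories, use the summed
        # counts as product priorities, and output the highest-priority products.
        def analyze_histories(recognition_history, combination_history, top_n=3):
            recognition_counts = Counter(r["product_id"] for r in recognition_history)
            combination_counts = Counter(c["product_id"] for c in combination_history)
            products = set(recognition_counts) | set(combination_counts)
            priorities = {p: recognition_counts[p] + combination_counts[p] for p in products}
            return sorted(priorities.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

        # Example usage with toy histories.
        print(analyze_histories(
            [{"product_id": "P001"}, {"product_id": "P002"}, {"product_id": "P001"}],
            [{"product_id": "P001"}, {"product_id": "P003"}],
        ))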
  • While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.

Claims (16)

What is claimed is:
1. A server comprising:
a first acquiring unit configured to acquire a piece of recognition information including a piece of product identification information for identifying a product included in a product image;
a recognition information storage unit configured to store the piece of recognition information;
a second acquiring unit configured to acquire a piece of combination information including the piece of product identification information of the product to be combined with an object image including an object;
a combination information storage unit configured to store the piece of combination information;
an analyzing unit configured to calculate product priorities for respective products by analyzing a plurality of pieces of recognition information stored in the recognition information storage unit and a plurality of pieces of combination information stored in the combination information storage unit; and
an output unit configured to output information based on the product priorities.
2. The server according to claim 1, wherein
the analyzing unit calculates the product priorities on the basis of first product priorities for the respective products and second product priorities for the respective products,
the analyzing unit calculates the first product priorities by analyzing the pieces of recognition information, and
the analyzing unit calculates the second product priorities by analyzing the pieces of combination information.
3. The server according to claim 2, wherein
each piece of recognition information includes date and time of recognition,
each piece of combination information includes date and time of combination,
the analyzing unit sets, to a higher level, the first product priority of the product represented by the piece of product identification information associated with the date and time of recognition closer to current date and time by analyzing the pieces of recognition information, and
the analyzing unit sets, to a higher level, the second product priority of the product represented by the piece of product identification information associated with the date and time of combination closer to the current date and time by analyzing the pieces of combination information.
4. The server according to claim 2, wherein
the analyzing unit sets, to a higher level, the first product priority of the product represented by the piece of product identification information having a value whose number of occurrences is larger by analyzing the pieces of recognition information, and
the analyzing unit sets, to a higher level, the second product priority of the product represented by the piece of product identification information having a value whose number of occurrences is larger by analyzing the pieces of combination information.
5. The server according to claim 1, wherein
each piece of recognition information includes a plurality of kinds of related information relating to the product,
the analyzing unit further analyzes whether there is a piece of combination information including the piece of product identification information of the product whose product priority satisfies a first predetermined condition in the pieces of combination information, and generates first recommendation information for recommending related information on the basis of a result of the analysis among the kinds of related information, and
the output unit outputs information based on the product priority and the first recommendation information to the recognizing unit.
6. The server according to claim 1, wherein
each piece of combination information includes an image for combination,
the image for combination is provided for each category of product,
each of the pieces of combination information includes a category of the corresponding product,
if there is a plurality of categories of a product whose product priority satisfies a second predetermined condition, the analyzing unit analyzes the number of occurrences of each of the categories in the pieces of combination information and generates second recommendation information for recommending the category with the largest number of occurrences, and
the output unit outputs information based on the product priority and the second recommendation information to the combining unit.
7. A server comprising:
a first acquiring unit configured to acquire a piece of recognition information including a piece of product identification information for identifying a product included in a product image;
a recognition information storage unit configured to store the piece of recognition information;
a second acquiring unit configured to acquire a piece of combination information including the piece of product identification information of the product to be combined with an object image including an object;
a combination information storage unit configured to store the piece of combination information;
a third acquiring unit configured to acquire a piece of sales promotion information relating to sales promotion of the product and a piece of purchase information including at least the piece of product identification information of the product;
a sales information storage unit configured to store the piece of sales promotion information and the piece of purchase information;
an analyzing unit configured to perform at least one of first analysis and second analysis, the first analysis calculating product priorities for respective products by analyzing a plurality of pieces of recognition information stored in the recognition information storage unit, a plurality of pieces of combination information stored in the combination information storage unit, and a plurality of pieces of purchase information stored in the sales information storage unit, the second analysis obtaining updated contents of the pieces of sales promotion information by analyzing at least either the pieces of recognition information or the pieces of combination information in addition to the pieces of purchase information; and
an output unit configured to output at least one of information based on the product priorities and the updated contents.
8. The server according to claim 7, wherein
the analyzing unit calculates the product priorities on the basis of first product priorities for the respective products, second product priorities for the respective products, and third product priorities for the respective products,
the analyzing unit calculates the first product priorities by analyzing the pieces of recognition information,
the analyzing unit calculates the second product priorities by analyzing the pieces of combination information, and
the analyzing unit calculates the third product priorities by analyzing the pieces of purchase information.
9. The server according to claim 8, wherein
the analyzing unit sets, to a higher level, the first product priority of the product represented by the piece of product identification information having a value whose number of occurrences is larger by analyzing the pieces of recognition information,
the analyzing unit sets, to a higher level, the second product priority of the product represented by the piece of product identification information having a value whose number of occurrences is larger by analyzing the pieces of combination information, and
the analyzing unit sets, to a higher level, the third product priority of the product represented by the piece of product identification information having a value whose number of occurrences is larger by analyzing the pieces of purchase information.
10. The server according to claim 8, wherein
each piece of recognition information includes date and time of recognition,
each piece of combination information includes date and time of combination,
each piece of purchase information includes date and time of purchase,
the analyzing unit sets, to a higher level, the first product priority of the product represented by the piece of product identification information associated with the date and time of recognition closer to current date and time by analyzing the pieces of recognition information,
the analyzing unit sets, to a higher level, the second product priority of the product represented by the piece of product identification information associated with the date and time of combination closer to the current date and time by analyzing the pieces of combination information, and
the analyzing unit sets, to a higher level, the third product priority of the product represented by the piece of product identification information associated with the date and time of purchase closer to the current date and time by analyzing the pieces of purchase information.
11. The server according to claim 7, wherein
each piece of recognition information includes a plurality of kinds of related information relating to the product,
the analyzing unit further analyzes whether there is a piece of combination information including the piece of product identification information of the product whose product priority satisfies a first predetermined condition in the pieces of combination information, and generates first recommendation information for recommending related information on the basis of a result of the analysis among the kinds of related information, and
the output unit outputs information based on the product priority and the first recommendation information to the recognizing unit.
12. The server according to claim 7, wherein
each piece of combination information includes an image for combination,
each of the pieces of combination information includes a category of the corresponding product,
if there is a plurality of categories of a product whose product priority satisfies a second predetermined condition, the analyzing unit analyzes the number of occurrences of each of the categories in the pieces of combination information and generates second recommendation information for recommending the category with the largest number of occurrences, and
the output unit outputs information based on the product priority and the second recommendation information to the combining unit.
13. The server according to claim 7, wherein
each piece of recognition information includes a piece of product image information of the product image,
each piece of sales promotion information includes a piece of first sales promotion information relating to sales promotion using the product image,
the analyzing unit analyzes the pieces of purchase information and analyzes the number of occurrences, in the pieces of recognition information, of the piece of product image information associated with the piece of product identification information whose number of occurrences in the pieces of purchase information satisfies a third predetermined condition, and
the analyzing unit obtains, as the updated contents, updated contents of the pieces of first sales promotion information on the basis of the number of occurrences of the piece of product image information.
14. The server according to claim 7, wherein
each piece of combination information includes an image for combination and a piece of combination image information of the image for combination,
each piece of sales promotion information includes a piece of second sales promotion information relating to sales promotion using the image for combination, and
the analyzing unit analyzes the pieces of purchase information and analyzes the number of occurrences, in the pieces of combination information, of the piece of combination image information associated with the piece of product identification information having a value whose number of occurrences in the pieces of purchase information satisfies a fourth predetermined condition, and
the analyzing unit obtains, as the updated contents, updated contents of the pieces of second sales promotion information on the basis of the number of occurrences of the piece of combination image information.
15. An analysis method comprising:
acquiring a piece of recognition information including a piece of product identification information for identifying a product included in a product image;
storing the piece of recognition information in a recognition information storage unit;
acquiring a piece of combination information including the piece of product identification information of the product to be combined with an object image including an object;
storing the piece of combination information in a combination information storage unit;
calculating product priorities for respective products by analyzing a plurality of pieces of recognition information stored in the recognition information storage unit and a plurality of pieces of combination information stored in the combination information storage unit; and
outputting information based on the product priorities.
16. A computer program product comprising a computer-readable medium containing a program executed by a computer, the program causing the computer to execute:
acquiring a piece of recognition information including a piece of product identification information for identifying a product included in a product image;
storing the piece of recognition information in a recognition information storage unit;
acquiring a piece of combination information including the piece of product identification information of the product to be combined with an object image including an object;
storing the piece of combination information in a combination information storage unit;
calculating product priorities for respective products by analyzing a plurality of pieces of recognition information stored in the recognition information storage unit and a plurality of pieces of combination information stored in the combination information storage unit; and
outputting information based on the product priorities.
US14/065,670 2012-11-05 2013-10-29 Server, analysis method and computer program product Abandoned US20140129329A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/332,651 US10311497B2 (en) 2012-11-05 2016-10-24 Server, analysis method and computer program product for analyzing recognition information and combination information

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-243762 2012-11-05
JP2012243762A JP6005482B2 (en) 2012-11-05 2012-11-05 Server apparatus, analysis method, and program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/332,651 Division US10311497B2 (en) 2012-11-05 2016-10-24 Server, analysis method and computer program product for analyzing recognition information and combination information

Publications (1)

Publication Number Publication Date
US20140129329A1 true US20140129329A1 (en) 2014-05-08

Family

ID=50623243

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/065,670 Abandoned US20140129329A1 (en) 2012-11-05 2013-10-29 Server, analysis method and computer program product
US15/332,651 Active 2034-06-01 US10311497B2 (en) 2012-11-05 2016-10-24 Server, analysis method and computer program product for analyzing recognition information and combination information

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/332,651 Active 2034-06-01 US10311497B2 (en) 2012-11-05 2016-10-24 Server, analysis method and computer program product for analyzing recognition information and combination information

Country Status (2)

Country Link
US (2) US20140129329A1 (en)
JP (1) JP6005482B2 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4413633B2 (en) * 2004-01-29 2010-02-10 株式会社ゼータ・ブリッジ Information search system, information search method, information search device, information search program, image recognition device, image recognition method and image recognition program, and sales system
JP2007011543A (en) * 2005-06-29 2007-01-18 Dainippon Printing Co Ltd Article-wearing simulation system, article-wearing simulation method, and the like
JP5112087B2 (en) 2008-01-18 2013-01-09 株式会社エヌ・ティ・ティ・ドコモ Information distribution server, information distribution system, and information distribution method
JP2012108805A (en) * 2010-11-18 2012-06-07 Toshiba Tec Corp Try-on system
JP5583087B2 (en) 2011-08-04 2014-09-03 株式会社東芝 Image processing apparatus, method, and program
US9262780B2 (en) * 2012-01-09 2016-02-16 Google Inc. Method and apparatus for enabling real-time product and vendor identification

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110282758A1 (en) * 1998-09-18 2011-11-17 Jacobi Jennifer A Item recommendation service
US6313745B1 (en) * 2000-01-06 2001-11-06 Fujitsu Limited System and method for fitting room merchandise item recognition using wireless tag
US20020045959A1 (en) * 2000-08-23 2002-04-18 Van Overveld Cornelius Wilhelmus Antonius Marie Method and system for generating a recommendation for a selection of a piece of clothing
US20100191619A1 (en) * 2002-10-07 2010-07-29 Dicker Russell A User interface and methods for recommending items to users
US20050018216A1 (en) * 2003-07-22 2005-01-27 International Business Machines Corporation Apparatus and method to advertise to the consumer based off a digital image
US20050131776A1 (en) * 2003-12-15 2005-06-16 Eastman Kodak Company Virtual shopper device
US20060020482A1 (en) * 2004-07-23 2006-01-26 Coulter Lori A Methods and systems for selling apparel
US20090089186A1 (en) * 2005-12-01 2009-04-02 International Business Machines Corporation Consumer representation rendering with selected merchandise
US8285595B2 (en) * 2006-03-29 2012-10-09 Napo Enterprises, Llc System and method for refining media recommendations
US20090076881A1 (en) * 2006-03-29 2009-03-19 Concert Technology Corporation System and method for refining media recommendations
US20080163344A1 (en) * 2006-12-29 2008-07-03 Cheng-Hsien Yang Terminal try-on simulation system and operating and applying method thereof
US20120154633A1 (en) * 2009-12-04 2012-06-21 Rodriguez Tony F Linked Data Methods and Systems
US20110298897A1 (en) * 2010-06-08 2011-12-08 Iva Sareen System and method for 3d virtual try-on of apparel on an avatar
US20120066208A1 (en) * 2010-09-09 2012-03-15 Ebay Inc. Sizing content recommendation system
US20120095589A1 (en) * 2010-10-15 2012-04-19 Arkady Vapnik System and method for 3d shape measurements and for virtual fitting room internet service
US20120158502A1 (en) * 2010-12-17 2012-06-21 Microsoft Corporation Prioritizing advertisements based on user engagement
US20140176565A1 (en) * 2011-02-17 2014-06-26 Metail Limited Computer implemented methods and systems for generating virtual body models for garment fit visualisation
US20130046772A1 (en) * 2011-08-16 2013-02-21 Alibaba Group Holding Limited Recommending content information based on user behavior
US20140122231A1 (en) * 2011-08-19 2014-05-01 Qualcomm Incorporated System and method for interactive promotion of products and services
US20130173377A1 (en) * 2011-12-30 2013-07-04 Ebay Inc. Systems and methods for delivering dynamic offers to incent user behavior
US20140040041A1 (en) * 2012-08-03 2014-02-06 Isabelle Ohnemus Garment fitting system and method

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160070959A1 (en) * 2014-09-10 2016-03-10 Casio Computer Co., Ltd. Display System With Imaging Unit, Display Apparatus And Display Method
US20160253746A1 (en) * 2015-02-27 2016-09-01 3D Product Imaging Inc. Augmented reality e-commerce
US10497053B2 (en) * 2015-02-27 2019-12-03 3D Product Imaging Inc. Augmented reality E-commerce
US20170193438A1 (en) * 2016-01-06 2017-07-06 Wal-Mart Stores, Inc. Systems and methods for monitoring featured product inventory
US10510041B2 (en) * 2016-01-06 2019-12-17 Walmart Apollo, Llc Systems and methods for monitoring featured product inventory

Also Published As

Publication number Publication date
US10311497B2 (en) 2019-06-04
JP6005482B2 (en) 2016-10-12
US20170039620A1 (en) 2017-02-09
JP2014092983A (en) 2014-05-19

Similar Documents

Publication Publication Date Title
CN108876526B (en) Commodity recommendation method and device and computer-readable storage medium
Lu et al. A video-based automated recommender (VAR) system for garments
CN109284413A (en) Method of Commodity Recommendation, device, equipment and storage medium based on recognition of face
US20140156449A1 (en) Method and apparatus for item recommendation
US20100241525A1 (en) Immersive virtual commerce
JP5297501B2 (en) Information creation device, information creation method, recommendation device, recommendation method, and program
JP2018519613A (en) Systems and techniques for presenting and evaluating items in an online marketplace
KR20210054849A (en) Method and apparatus for matching product and influencer using artificial intelligence
JP7162417B2 (en) Estimation device, estimation method, and estimation program
JP7130991B2 (en) ADVERTISING DISPLAY SYSTEM, DISPLAY DEVICE, ADVERTISING OUTPUT DEVICE, PROGRAM AND ADVERTISING DISPLAY METHOD
JP6976207B2 (en) Information processing equipment, information processing methods, and programs
US10311497B2 (en) Server, analysis method and computer program product for analyzing recognition information and combination information
CN110807691B (en) Cross-commodity-class commodity recommendation method and device
JP7140588B2 (en) Decision device, decision method and decision program
Nurmi et al. Promotionrank: Ranking and recommending grocery product promotions using personal shopping lists
Zhao et al. Can users embed their user experience in user-generated images? Evidence from JD.com
US20240013287A1 (en) Real time visual feedback for augmented reality map routing and item selection
TW201942836A (en) Store system, article matching method and apparatus, and electronic device
CN111967924A (en) Commodity recommendation method, commodity recommendation device, computer device, and medium
CN115563357A (en) Automatic purchase of digital wish list content based on user set thresholds
CN111127128B (en) Commodity recommendation method, commodity recommendation device and storage medium
CN114596138A (en) Information recommendation method and device, computer equipment and storage medium
KR20210052237A (en) Product catalog automatic classification system based on artificial intelligence
JP2016081156A (en) Face photograph display device
JP6809148B2 (en) Program and combination extraction system

Legal Events

Date Code Title Description
AS Assignment

Owner name: KABUSHIKI KAISHA TOSHIBA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SEKINE, MASAHIRO;NISHIYAMA, MASASHI;SUGITA, KAORU;AND OTHERS;REEL/FRAME:031500/0450

Effective date: 20131010

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION