WO2005109875A1 - Playback Device (再生装置) - Google Patents
Playback Device (再生装置)
- Publication number
- WO2005109875A1 (PCT application PCT/JP2005/008531)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- information
- graphics
- time
- video
- Prior art date
Classifications
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/005—Reproducing at a different information rate from the information rate of recording
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/102—Programmed access in sequence to addressed parts of tracks of operating record carriers
- G11B27/105—Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B2220/00—Record carriers by type
- G11B2220/20—Disc-shaped record carriers
- G11B2220/25—Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
- G11B2220/2537—Optical discs
- G11B2220/2541—Blu-ray discs; Blue laser DVR discs
Definitions
- The present invention relates to an apparatus for reproducing AV (Audio Video) data recorded on a recording medium, and more particularly to a technique for controlling graphics drawing so that, when a program for drawing graphics is stored on the recording medium, the drawing cooperates with the video or the like of the AV data being reproduced.
- In order to temporally synchronize a plurality of media such as video and audio, a technology for efficiently managing the time of each medium has been proposed (see Patent Document 1).
- With this technology, a plurality of media can be efficiently synchronized.
- Patent Document 1 JP 2001-157175 A
- An object of the present invention is to provide a playback device that, when AV data such as video and an application are recorded on a recording medium, draws the application's graphics so as to cooperate with the video or the like of the AV data being played back.
- Means for Solving the Problem
- Specifically, a playback device of the present invention is a playback device that plays back a video stream recorded on a recording medium together with a program to be executed during playback of the video stream, the recording medium also carrying control information that specifies positions on a time axis related to the playback timing of the videos.
- The program includes a predetermined code specifying a plurality of images and the time at which each image is to be drawn.
- The playback device comprises: playback means for sequentially playing back each video constituting the video stream at the playback timing related to the control information; program execution means for sequentially interpreting and executing each code constituting the program and, on interpreting the predetermined code, storing in storage means each image specified by that code together with the time at which the image is to be drawn; image selection means for collating the position on the time axis specified based on the control information in relation to the playback timing of the video being played back by the playback means against the drawing time stored for each image in the storage means, and selecting an image to be drawn, if any, based on the collation result; and drawing means for drawing the selected image, if there is one, during playback of the video.
- Here, playback of the video stream means that each video constituting the video stream is played back, with the playback means performing control for displaying the video; if the video data is a compressed moving image, the decoded (expanded) video signal is output to the display device.
- the display device may exist outside the playback device or may be included inside.
- Drawing an image means performing control for displaying the image, such as storing image data in an image memory or outputting a signal representing the image to a display device.
- When an image is drawn during video playback, the video and the image are displayed superimposed or combined in a similar way.
- The time axis related to the playback timing is the time axis indicating the playback time of each video in the video stream when the video stream data on the recording medium is played back.
- With this configuration, the program execution means executes the predetermined code, which specifies the images to be drawn during playback of the video stream, and stores the images and their drawing times.
- The image selection means can therefore select, from the images stored in the storage means, the image to be drawn at the playback timing of each video played back by the playback means.
- The selected image can then be drawn together with the video played back at that timing.
- The playback device may further include a processor and an image memory for storing the images constituting the screen to be displayed.
- The program execution means performs the interpretation and execution by having the processor execute machine instructions corresponding to each code constituting the program, the image selection means performs the collation by having the processor execute a predetermined collation machine-instruction sequence, and the drawing means draws the image, if any, selected by the image selection means by transferring it to the image memory.
- The image memory is a kind of memory that outputs a signal for screen display based on the set of images stored in it, so an image transferred to the image memory is displayed.
- With this configuration, the program execution means converts each code of the program into native code that the processor can interpret and execute, and the collation and drawing are performed using the converted native code, so they can be performed at high speed.
- The predetermined code may specify the time at which each image is to be drawn by a drawing start time and a drawing end time.
- The program execution means then stores in the storage means, in association with each image data item representing an image, drawing time data indicating the drawing start time and drawing end time of that image, and the image selection means may select the image indicated by the image data whose drawing time data spans, from drawing start time to drawing end time, the position on the time axis specified based on the control information.
- With this configuration, the image selection means can select an image whose drawing start-to-end range includes the playback timing of the video, giving some latitude in the selection, and the selected image can be displayed between the drawing start time and the drawing end time.
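As a rough illustration of this start/end-time selection, the following sketch selects every stored image whose drawing window contains the current position on the playback time axis. The class and field names, and the use of milliseconds, are assumptions for the sketch and are not taken from the patent.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: an image is selected when the current position on
// the playback time axis falls within its [drawStart, drawEnd] window.
public class TimeRangeSelector {
    public static class ImageEntry {
        public final String objectId;
        public final long drawStart; // drawing start time (ms, assumed unit)
        public final long drawEnd;   // drawing end time (ms, assumed unit)
        public ImageEntry(String objectId, long drawStart, long drawEnd) {
            this.objectId = objectId;
            this.drawStart = drawStart;
            this.drawEnd = drawEnd;
        }
    }

    // Return every stored image whose drawing window contains 'position'.
    public static List<ImageEntry> select(List<ImageEntry> stored, long position) {
        List<ImageEntry> result = new ArrayList<>();
        for (ImageEntry e : stored) {
            if (e.drawStart <= position && position <= e.drawEnd) {
                result.add(e);
            }
        }
        return result;
    }
}
```

Note that both endpoints are treated as inclusive here; the patent text only says the position must lie "within the range from the drawing start time to the drawing end time".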
- The control information may further include condition information indicating a criterion for determining a predetermined condition, in association with one or more positions on the time axis.
- In that case, the image selection means selects an image only when the position on the time axis specified based on the control information in relation to the video being played back and the drawing time stored for one or more images in the storage means satisfy a predetermined relationship and, if condition information is associated with that position on the time axis, only when the predetermined condition is determined to be satisfied based on that condition information.
- The control information includes, for example, condition information such as an identifier and coordinates indicating the image to be drawn at the video playback timing, so that the image selection means can restrict the selection to that playback timing.
- The predetermined code may further specify, in association with each image, the coordinates at which the image is to be drawn.
- The program execution means then, on interpreting the predetermined code, also stores the image drawing coordinates in the storage means in association with each image, and the condition information includes coordinates.
- The image selection means selects an image when the position on the time axis specified based on the control information in relation to the video being played back and the drawing time stored for one or more images satisfy a predetermined relationship, condition information is associated with that position, and the coordinates included in the condition information and the image drawing coordinates stored in the storage means for the image satisfying the relationship match or lie within a predetermined distance of each other.
- With this configuration, the image selection means can select, from among the stored images corresponding to the video playback timing, an image whose drawing coordinates match the coordinates of the condition information or fall within a predetermined distance of them.
- The condition information may include information specifying a playback speed, and the playback means selects one of a plurality of playback speeds and performs playback at the selected speed.
- The predetermined condition determined by the image selection means may then be that the playback speed selected for the video being played back matches the playback speed specified by the condition information associated with the position on the time axis specified based on the control information.
- With this configuration, the image selection means can select an image whose condition information includes the playback mode corresponding to the playback speed of the video stream being played back.
- The predetermined code may further specify an image identifier in association with each image, and the program execution means, on interpreting the predetermined code, also stores each image identifier in the storage means.
- The condition information then includes the image identifier of the image to be drawn, and the image selection means selects an image when the position on the time axis specified based on the control information in relation to the video being played back and the drawing time stored for one or more images satisfy a predetermined relationship and the stored image identifier matches the identifier in the condition information.
- With this configuration, the image selection means can select, from among the stored images corresponding to the video playback timing, an image whose image identifier matches.
- FIG. 1 is a data hierarchy diagram of a BD-ROM according to an embodiment of the present invention.
- FIG. 2 is a configuration diagram of a logical space on a BD-ROM according to the embodiment of the present invention.
- FIG. 3 is a functional configuration diagram of a playback device according to the present invention.
- FIG. 4 is a hardware configuration diagram of a playback device according to the present invention.
- FIG. 5 is a configuration diagram of an MPEG stream according to the embodiment of the present invention.
- FIG. 6 is a configuration diagram of a section for transmitting a stream event according to the embodiment of the present invention.
- FIG. 7 is a functional configuration diagram of an application and a platform according to Embodiment 2 of the present invention.
- FIG. 8 is a diagram showing an example of graphics information according to the present invention.
- FIG. 9 is a diagram showing an example of stream event information according to the present invention.
- FIG. 10 is a flowchart showing a process for controlling graphics drawing when a stream event occurs in Embodiment 2.
- FIG. 11 is a flowchart showing graphics information determination processing based on stream event information.
- FIG. 12 is a flowchart showing processing for controlling graphics drawing when a user event occurs in Embodiment 2.
- FIG. 13 is a flowchart showing graphics information determination processing based on user event information.
- FIG. 14 is a functional configuration diagram of an application and a platform according to the first embodiment.
- FIG. 15 is a flowchart showing a process of transmitting and receiving graphics information between an application and a platform according to the first embodiment.
- FIG. 16 is a flowchart showing a process for controlling graphics drawing when a stream event occurs in the first embodiment.
- FIG. 17 is a flowchart showing a process for controlling graphics drawing when a user event occurs in the first embodiment.
- FIG. 18 is a functional configuration diagram of an application and a platform according to Embodiments 3 and 4.
- FIG. 19 is a diagram showing an example of stream event information according to the third embodiment.
- FIG. 21 is a flowchart showing a process for controlling graphics drawing when a stream event occurs in Embodiments 3 and 4.
- FIG. 22 is a flowchart showing a process of extracting stream event information in Embodiment 3.
- FIG. 23 is a flowchart showing a process of extracting stream event information in a fourth embodiment.
- FIG. 1 shows the structure of a BD-ROM (hereinafter "BD") to be played back by the playback apparatus according to the present invention, in particular the BD 104 as a disc medium and the data 101, 102, and 103 recorded on it.
- The data recorded on the BD disc 104 are the AV data 103, BD management information 102 such as AV data management information and an AV playback sequence, and a BD playback program 101.
- In the present embodiment, the description focuses on an AV application for playing back AV content such as a movie, but a BD disc can of course also be used as a computer recording medium, like a CD-ROM or DVD-ROM.
- FIG. 2 shows logical data recorded on the aforementioned BD disc.
- The BD disc 104 has a recording area spiralling toward the outer circumference, like other optical discs such as DVD and CD, with a logical address space between the lead-in at the inner circumference and the lead-out at the outer circumference. Inside the lead-in there is also a special area called the BCA (Burst Cutting Area) that can be read only by the drive. Since this area cannot be read by applications, it is sometimes used for copyright protection technology, for example.
- the file system is UDF, ISO9660, etc., and logical data can be read according to the directory and file structure, just like a normal PC.
- the directory and file structure on the BD disc is such that the BDVIDEO directory is located immediately below the root directory (ROOT).
- This directory is a directory in which data 101, 102 and 103 such as AV contents and management information handled by the BD-ROM are recorded.
- the BD playback device reads this file first.
- BD.PROG (fixed file name)
- the correspondence with the playlist is identified by the file body name (XXX matches).
- the correspondence with the VOB is identified by the file body name (YYY matches).
- PNG data: an image format standardized by the W3C
- FIG. 3 is a block diagram showing a functional configuration of the playback device 1000.
- Data on the BD disc 201 is read out through the optical pickup 202, and the read out data is stored in a dedicated memory according to the type of data.
- the storage destinations of the data 101 to 103 on the BD disk 104 in FIG. 2 are as follows.
- The BD playback program 101 (the BD.PROG or XXX.PROG file) is recorded in the program recording memory 203, the BD management information 102 (BD.INFO, XXX.PL, or YYY.VOBI) in the management information recording memory 204, and the AV data 103 (YYY.VOB or ZZZ.PNG) in the AV recording memory 205.
- the program processing unit 206 receives, from the management information processing unit 207, information on a playlist for reproducing AV data and event information such as execution of a BD reproduction program, and processes the BD reproduction program.
- the program processing unit 206 receives an event (hereinafter, referred to as a “user event”) via a remote controller, a joystick, or the like, and executes the BD playback program if there is a corresponding BD playback program.
- When the presentation processing unit 208 reproduces AV data and a stream event described later exists in the AV data, the program processing unit 206 receives the stream event from the presentation processing unit 208.
- the BD playback program includes an application that plays back other AV data during playback of AV data, an application that draws graphics, and the like.
- In the present embodiment, the description assumes an application that plays back AV data and, as necessary, simultaneously draws graphics superimposed on it; naturally, the invention can also be applied to only one of these functions.
- In the present embodiment, a Java (registered trademark) application is used as the general-purpose program, but the same applies to a program written in the C language or another programming language.
- The management information processing unit 207 receives an instruction from the program processing unit 206, analyzes the corresponding playlist and the management information of the VOB referred to by the playlist, and instructs the presentation processing unit 208 to reproduce the AV data based on the analysis result. It also receives time information from the presentation processing unit 208 and, based on that time information, instructs the presentation processing unit 208 to stop reproduction. Further, the management information processing unit 207 instructs the program processing unit 206 to execute a program.
- The presentation processing unit 208 has a decoder for each of video, audio, and PNG data, and decodes and outputs AV data based on the time information and instructions from the management information processing unit 207.
- the time information is control information for reproducing AV data according to a predetermined time axis.
- After decoding, the video data is rendered on the video plane 210 and the PNG data on the image plane 209; the two are then synthesized by the synthesis processing unit 211 and output to a display device such as a TV.
- When a stream event exists in the AV data being reproduced, the presentation processing unit 208 transmits the stream event to the program processing unit 206.
- FIG. 4 shows an example in which the functional configuration of the playback device (FIG. 3) described above is realized by hardware.
- the AV recording memory 205 corresponds to the image memory 308 and the track buffer 309,
- the program processing unit 206 corresponds to the program processor 302 and the UOP manager 303,
- the management information processing unit 207 corresponds to the scenario processor 305 and the presentation controller 306, and
- the presentation processing unit 208 corresponds to the clock 307, the demultiplexer 310, the image processor 311, the video processor 312, and the sound processor 313.
- processing of data read from the BD disc 201 will be outlined with reference to FIG.
- MPEG stream data is recorded in a track buffer 309, and PNG data is recorded in an image memory 308.
- the demultiplexer 310 extracts the MPEG stream data recorded in the track buffer 309 based on the time of the clock 307, sends out the video data to the video processor 312, and sends out the audio data to the sound processor 313.
- The video processor 312 and the sound processor 313 each have a decoder buffer and a decoder, as defined in the MPEG system standard. That is, the video and audio data sent from the demultiplexer 310 are temporarily recorded in the respective decoder buffers and decoded by the respective decoders based on the clock 307.
- the presentation controller 306 instructs the decoding timing.
- The scenario processor 305 receives the time information from the clock 307 and, so that captions are displayed properly, instructs the presentation controller 306 to show or hide each caption at its display start and end times.
- the image processor 311 reads the PNG data specified by the presentation controller 306 from the image memory 308, decodes the data, and draws the decoded data on the image plane 314.
- When the BD playback program (a program for drawing a menu image or the like) draws PNG data, the program processor 302 instructs the image processor 311 on the decoding timing. The timing at which the program processor 302 instructs the image processor 311 to decode the PNG data depends on the BD playback program that the program processor 302 is processing.
- the PNG data and the video data are recorded on the image plane 314 and the video plane 315, respectively, after being decoded as described with reference to Fig. 3, and are synthesized and output by the synthesis processing unit 316.
- the BD management information (scenario, AV management information) read from the BD disc 201 is recorded in the management information recording memory 304.
- the scenario information (BD.INFO and XXX.PL) is read and processed by the scenario processor 305, and the AV management information (YYY.VOBI) is read and processed by the presentation controller 306.
- The scenario processor 305 analyzes the playlist information and indicates to the presentation controller 306 the MPEG stream data referred to by the playlist and its playback position; the presentation controller 306 analyzes the data management information (YYY.VOBI) of the corresponding MPEG stream data and instructs the drive controller 317 to read the MPEG stream data.
- the drive controller 317 reads the corresponding MPEG stream data by moving the optical pickup based on the instruction of the presentation controller 306.
- the scenario processor 305 monitors the time of the clock 307 and sends an event to the program processor 302 based on the setting of the management information.
- the BD playback program (BD. PROG or XXX. PROG) recorded in the program recording memory 301 is executed by the program processor 302.
- the program processor 302 processes the BD playback program when an event is transmitted from the scenario processor 305 or when an event is transmitted from the UOP manager 303.
- When a request made by the user's remote control operation is transmitted, the UOP manager 303 generates event information indicating the request and sends it to the program processor 302.
- FIG. 5 is a configuration diagram of an MPEG stream according to the present embodiment.
- the MPEG stream is composed of a plurality of transport packets (Transport Packet, hereinafter, referred to as “TS packets”).
- the size of one TS packet is 188 bytes.
- a video (Video) stream, an audio (Audio) stream, and the like are separated and multiplexed by a plurality of TS packets and transmitted.
- the video stream and the audio stream are composed of a plurality of PES packets, and the information transmitted by the TS packet includes a stream composed of PES packets and a section (Section).
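The fixed 188-byte TS packet framing described above can be illustrated with a minimal sketch. The sync byte (0x47) and the 13-bit PID by which the multiplexed streams are separated follow the MPEG-2 systems packet layout; the class name is illustrative and this is not a complete demultiplexer.

```java
// Minimal sketch of the TS packet header fields mentioned in the text:
// 188-byte packets, distinguished by the 13-bit PID in bytes 1-2.
public class TsPacketHeader {
    public static final int TS_PACKET_SIZE = 188;
    public static final int SYNC_BYTE = 0x47;

    // Extract the 13-bit PID from a single TS packet.
    public static int pid(byte[] packet) {
        if (packet.length != TS_PACKET_SIZE || (packet[0] & 0xFF) != SYNC_BYTE) {
            throw new IllegalArgumentException("not a valid TS packet");
        }
        // Low 5 bits of byte 1 are the PID's high bits; byte 2 is the low byte.
        return ((packet[1] & 0x1F) << 8) | (packet[2] & 0xFF);
    }
}
```

A demultiplexer such as the demultiplexer 310 would route packets to the video or audio decoder according to this PID.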
- Sections carry information such as PSI (Program Specific Information) and DSM-CC stream events.
- FIG. 6 is a configuration diagram of a section for transmitting a stream event (DSM-CC Stream Event).
- a section for transmitting a stream event is composed of table-id, event-msg-group-id, and payload.
- table-id indicates the type of data stored in the payload, which is the data storage area; when a stream event is stored, table-id has the value 0x3D.
- event-msg-group-id is a name identifying the group of stream events stored in the payload.
- The stream event itself is stored in the payload and transmitted. Further details of the structure are described in the MPEG systems standard (ISO/IEC 13818-1) and the DSM-CC standard (ISO/IEC 13818-6).
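As a hedged sketch of the section fields just described, the following checks the table-id (0x3D for a stream event section) and reads the generic 12-bit section length from the MPEG-2 private-section header. The class name is an assumption and the layout is simplified; the full syntax is in the standards cited above.

```java
// Illustrative inspection of a DSM-CC section carrying a stream event.
public class StreamEventSection {
    public static final int TABLE_ID_STREAM_EVENT = 0x3D;

    // True when the section's table_id marks a DSM-CC stream event section.
    public static boolean isStreamEventSection(byte[] section) {
        return section.length >= 3 && (section[0] & 0xFF) == TABLE_ID_STREAM_EVENT;
    }

    // 12-bit section_length from bytes 1-2 (generic MPEG-2 section header):
    // number of bytes following the length field.
    public static int sectionLength(byte[] section) {
        return ((section[1] & 0x0F) << 8) | (section[2] & 0xFF);
    }
}
```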
- FIG. 14 is a functional block diagram of the program (BD playback program) and the program processing unit 206 according to the present embodiment.
- an application 1400 corresponds to a program
- a platform 1410 corresponds to the program processing unit 206 which is an execution environment of the program.
- the application 1400 in the present embodiment is a Java (registered trademark) application
- The platform 1410 includes a processor and a virtual machine that sequentially interprets the Java (registered trademark) application program and translates it into native code executable by the processor, and the processor executes the native code.
- The application 1400 is an application that overlays graphics on part or the whole of the image of the AV data being played back, and draws graphics based on information relating to the graphics, such as drawing time and drawing coordinates (hereinafter "graphics information").
- An example of such an application is an application such as a shooting game using an AV data image as a background.
- The platform 1410 determines, based on the graphics information and the information indicated by stream events embedded in advance in the MPEG stream (hereinafter "stream event information"), the graphics information indicating the graphics to be drawn, and instructs the image plane 209 to draw graphics based on the determined graphics information.
- The processing of the platform 1410 is realized by the processor executing native code that determines the graphics to be drawn and instructs the drawing of the determined graphics.
- the application 1400 includes a user event receiving unit 1401, a user event information transmitting unit 1402, and a graphics information transmitting unit 1403. Each unit shows the operation of the application 1400.
- Each of the above units has, within the program code constituting the application, an interface (hereinafter "API function") for receiving input from the platform side.
- the application 1400 can output user event information and graphics information to the platform 1410 by appropriately calling API functions.
- The part of the application code that executes the instruction calling the API function for transmitting user event information corresponds to the user event information transmission unit 1402, and the part that executes the instruction calling the API function for transmitting graphics information corresponds to the graphics information transmitting unit 1403.
- the user event receiving unit 1401 receives a user event from the program processing unit 206, and extracts user event information such as an event type and coordinates.
- the user event information transmitting unit 1402 transmits the extracted user event information to the platform 1410.
- The graphics information transmitting unit 1403 transmits the graphics information determined in advance to the platform 1410.
- Here, "transmitting graphics information" means, for example, calling an API function with a reference to the graphics information in the application 1400 as an argument.
- The API function call itself is realized by the platform 1410 interpreting the function.
- The platform 1410 includes a user event information receiving unit 1411, a graphics information receiving unit 1412, a graphics information storage unit 1413, a graphics information control unit 1414, a graphics drawing unit 1415, and a stream event receiving unit 1416.
- the user event information receiving unit 1411 receives user event information from the application 1400.
- the graphics information receiving unit 1412 receives predetermined graphics information from the application 1400.
- Receiving graphics information from the application 1400 specifically means that the platform 1410 interprets the API function whose argument is a reference to the graphics information in the application 1400, and the processor executes native code that stores the graphics information in the graphics information storage unit 1413. As a result, the graphics information is stored in the graphics information storage unit 1413.
- the graphics information storage unit 1413 is a memory in a logical space in the platform 1410, and stores graphics information.
- the graphics information control unit 1414 determines graphics information to be drawn based on the graphics information stored in the graphics information storage unit 1413 and the stream event information received from the stream event receiving unit 1416.
- the graphics drawing unit 1415 is a unit that instructs the image plane 209 to draw graphics based on the graphics information.
- the stream event receiving unit 1416 receives the stream event embedded in the AV data from the presentation processing unit 208, and extracts stream event information.
- FIG. 8 shows an example of graphics information according to the present embodiment.
- the graphics information is composed of an object ID, a file name, coordinates, and a drawing time.
- the object ID is a name for identifying each graphics object, and is unique to the application.
- the file name indicates a file that stores PNG data corresponding to the graphics object.
- The coordinates are the drawing position used as an index when determining the graphics object to be drawn. For example, when coordinates are specified by the user and the coordinates of an item of graphics information fall within a fixed distance of the specified coordinates, the graphics object associated with those coordinates is drawn at the specified coordinates.
- The drawing time is the time at which drawing of the graphics object should start and the time at which it should end; these times are specified as positions on the playback time axis of the AV data played back together with execution of the application.
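One row of the graphics information in FIG. 8 could be modeled as follows. The field names and the millisecond time unit are assumptions for the sketch, not values from the patent.

```java
// Illustrative record of one graphics information entry (FIG. 8):
// object ID, PNG file name, drawing coordinates, and a drawing window
// given as start/end positions on the AV playback time axis.
public class GraphicsInfo {
    public final String objectId;   // unique within the application
    public final String fileName;   // PNG file holding the graphics object
    public final int x, y;          // drawing coordinates
    public final long drawStart;    // drawing start time (assumed ms)
    public final long drawEnd;      // drawing end time (assumed ms)

    public GraphicsInfo(String objectId, String fileName,
                        int x, int y, long drawStart, long drawEnd) {
        this.objectId = objectId;
        this.fileName = fileName;
        this.x = x;
        this.y = y;
        this.drawStart = drawStart;
        this.drawEnd = drawEnd;
    }

    // An entry is drawable at 'time' when the time lies within its window.
    public boolean drawableAt(long time) {
        return drawStart <= time && time <= drawEnd;
    }
}
```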
- FIG. 9 shows an example of stream event information in the present embodiment.
- the stream event information includes a time, an object ID, coordinates, and an allowable error.
- the time is the position on the playback time axis of the AV data in which the stream event is embedded.
- the object ID is a name that identifies the corresponding graphics object.
- the coordinates are coordinates at which the corresponding graphics object should be drawn.
- the permissible error is used to determine the graphics to be drawn in response to the stream event, and indicates the range of coordinates of the graphics to be drawn. Specifically, a graphics object whose coordinates in the graphics information fall within the permissible error range around the coordinates to be drawn is determined as the graphics to be drawn.
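The allowable-error test described above can be sketched as follows. The per-axis range check and the function name are assumptions, since the text does not fix the exact distance metric:

```python
# Illustrative sketch: a graphics object qualifies when its coordinates
# fall within the allowable error range around the coordinates to be drawn.
# A per-axis (box) check is assumed; the source does not specify the metric.
def within_allowable_error(gx, gy, ex, ey, allowable_error):
    return abs(gx - ex) <= allowable_error and abs(gy - ey) <= allowable_error

print(within_allowable_error(105, 198, 100, 200, 10))  # → True
print(within_allowable_error(150, 198, 100, 200, 10))  # → False
```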
- FIG. 15 shows a flowchart when graphics information is transmitted to the platform 1410 when the application 1400 is started.
- the graphics information transmitting unit 1403 of the application 1400 transmits the graphics information to the platform 1410 (S1510).
- the graphics information receiving unit 1412 in the platform 1410 receives the graphics information transmitted in step S1510 (S1500).
- the graphics information received in step S1500 is sent to the graphics information storage unit 1413 (S1501), and stored in the graphics information storage unit 1413 (S1502).
- FIG. 16 shows a flowchart of the process that controls graphics drawing in accordance with the stream event information embedded in the AV data.
- it is assumed that the graphics information is already stored in the graphics information storage unit 1413 of the platform 1410 when the application starts, and that the AV data to be reproduced includes a stream event.
- the stream event receiving unit 1416 receives a stream event from the presentation processing unit 208 (S1600), extracts stream event information from the received stream event (S1601), and sends the stream event information to the graphics information control unit 1414 (S1602).
- the graphics information control unit 1414 reads out graphics information from the graphics information storage unit 1413 (S1603).
- the graphics information control unit 1414 determines the graphics information corresponding to the stream event information extracted in step S1601, from among all the graphics information read in step S1603 (S1604).
- the graphics information is sent to the graphics drawing unit 1415 (S1605).
- the graphics drawing unit 1415 instructs the image plane 209 to draw graphics based on the graphics information (S1606).
- in step S1100, the graphics information control unit 1414 scans all graphics information in the graphics information storage unit 1413 and performs the following determination process.
- the graphics information control unit 1414 determines whether the time included in the stream event information extracted from the stream event received by the stream event receiving unit 1416 (hereinafter referred to as the "event time") falls within the drawing time in the graphics information (S1101).
- if it does, it is determined whether the object ID of the graphics object corresponding to that graphics information is included in the stream event information (S1102).
- when the graphics information control unit 1414 determines in step S1102 that the object ID is included in the stream event information (S1102: Y), it selects the relevant graphics information and replaces the coordinates of the graphics information with the coordinates of the stream event information (S1103).
- this graphics information is sent to the graphics drawing unit 1415, and the graphics drawing unit 1415 draws the graphics indicated by the graphics information on the image plane 209.
- the graphics on the image plane 209 and the video data on the video plane 210 are then superimposed and output by the synthesis processing unit 211.
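Steps S1100 to S1103 amount to a scan-and-select pass over the stored graphics information. A minimal sketch, with dictionaries and field names assumed purely for illustration:

```python
def select_graphics(graphics_list, event):
    """Sketch of S1100-S1103: scan all graphics information, keep entries
    whose drawing time contains the event time and whose object ID is named
    by the stream event, then overwrite their coordinates with the event's."""
    selected = []
    for g in graphics_list:
        time_ok = g["start"] <= event["time"] <= g["end"]   # S1101
        id_ok = g["object_id"] == event["object_id"]        # S1102
        if time_ok and id_ok:
            g = dict(g, x=event["x"], y=event["y"])         # S1103
            selected.append(g)
    return selected

gfx = [{"object_id": "btn1", "start": 0, "end": 100, "x": 1, "y": 2},
       {"object_id": "btn2", "start": 0, "end": 10, "x": 3, "y": 4}]
ev = {"time": 50, "object_id": "btn1", "x": 9, "y": 9}
print(select_graphics(gfx, ev))
# → [{'object_id': 'btn1', 'start': 0, 'end': 100, 'x': 9, 'y': 9}]
```

The selected entries are what would then be handed to the graphics drawing unit 1415.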
- as described above, because the pre-defined graphics information is transmitted to the platform 1410 in advance, the graphics object to be drawn can be controlled without the platform 1410 sequentially transmitting the stream event information to the application 1400, and the platform 1410 can perform the graphics object selection process by executing its machine instruction set.
- the PNG data on the BD indicated by the graphics information is read into the AV recording memory 205, and the presentation processing unit 208 decodes the PNG data read into the AV recording memory 205.
- in the above description, graphics information whose drawing time includes the event time and whose object ID matches that in the stream event information is selected as the graphics to be drawn.
- alternatively, graphics information having coordinates within the allowable error range of the coordinates in the stream event information may be selected, or graphics information whose drawing time includes the event time may be selected.
- FIG. 17 is a flowchart showing a process for controlling graphics drawing when the application 1400 receives a user event from the program processing unit 206.
- a graphics object having coordinates within an allowable range defined by the system, centered on the coordinates specified by the user event, is drawn.
- in step S1710, the program processing unit 206 receives the user event and sends it to the user event receiving unit 1401 of the application 1400, and the user event receiving unit 1401 receives the user event.
- the user event receiving unit 1401 extracts user event information such as coordinates based on the received user event (S1711), and sends the user event information to the user event information transmitting unit 1402 (S1712).
- the user event information transmitting unit 1402 transmits the user event information to the platform 1410 (S1713).
- the user event information receiving unit 1411 of the platform 1410 receives the user event information (S1700) and sends it to the graphics information control unit 1414 (S1701).
- the graphics information control unit 1414 reads out the graphics information stored in the graphics information storage unit 1413 (S1702) and, based on the graphics information and the user event information, determines the graphics information of the graphics to be drawn (S1703).
- the graphics information control unit 1414 sends the graphics information determined in step S1703 to the graphics drawing unit 1415 (S1704).
- the graphics drawing unit 1415 instructs the image plane 209 to draw graphics based on the received graphics information (S1704).
- in step S1300, all graphics information in the graphics information storage unit 1413 is scanned as determination targets.
- the graphics information control unit 1414 determines whether there is graphics information having coordinates within the allowable error range around the coordinates included in the user event information (hereinafter referred to as the "event coordinates") (S1301).
- when the graphics information control unit 1414 determines in step S1301 that such graphics information exists, it selects that graphics information as the graphics to be drawn.
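Steps S1300 and S1301 can be sketched as a filter over the graphics-information storage. The allowable range is a parameter here, standing in for the system-defined value, and all names are hypothetical:

```python
def graphics_near_user_event(storage, event_x, event_y, allowed):
    # Keep every entry whose coordinates lie within the allowable range
    # around the user-event coordinates (per-axis check assumed).
    return [g for g in storage
            if abs(g["x"] - event_x) <= allowed and abs(g["y"] - event_y) <= allowed]

storage = [{"object_id": "a", "x": 10, "y": 10},
           {"object_id": "b", "x": 90, "y": 90}]
print(graphics_near_user_event(storage, 12, 8, 5))
# → [{'object_id': 'a', 'x': 10, 'y': 10}]
```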
- in the first embodiment, the determination of the graphics to be drawn is performed on the platform side.
- the present embodiment uses the same playback apparatus 1000 described using Fig. 3.
- in the present embodiment, however, the application performs the graphics determination process.
- FIG. 7 shows a functional configuration of the application 700 and the platform 710 according to the present embodiment.
- an application 700 includes a user event receiving unit 701, a graphics information storage unit 702, a graphics information control unit 703, a graphics information transmitting unit 704, and a stream event information receiving unit 705.
- the stream event information receiving unit 705 receives stream event information from the platform 710.
- the graphics information transmitting unit 704 transmits the graphics information determined by the graphics information control unit 703 to the platform 710.
- the platform 710 includes a graphics information receiving unit 711, a stream event information transmitting unit 712, a graphics drawing unit 713, and a stream event receiving unit 714.
- the stream event information transmitting unit 712 transmits the stream event information to the application 700.
- the graphics information receiving unit 711 sequentially receives the graphics information in response to the API function calls by which the application 700 transmits the graphics information.
- FIG. 10 is a flowchart showing the process of controlling graphics drawing in accordance with the stream event information embedded in the AV data.
- steps S1000 and S1001 are the same as in the first embodiment, so step S1002 and subsequent steps will be described.
- step S1002 the stream event receiving unit 714 sends the stream event information to the stream event information transmitting unit 712.
- the stream event information transmitting unit 712 transmits the stream event information to the application 700 (S1003).
- the stream event information receiving unit 705 of the application 700 receives the stream event information (S1010) and sends it to the graphics information control unit 703 (S1011).
- the graphics information control unit 703 reads out the graphics information stored in the graphics information storage unit 702 (S1012), and determines the graphics information based on the graphics information and the stream event information (S1013).
- the graphics information control unit 703 sends the determined graphics information to the graphics information transmitting unit 704 (S1014), and the graphics information transmitting unit 704 transmits the graphics information to the platform 710 (S1015).
- the graphics information receiving unit 711 of the platform 710 receives the graphics information from the application 700 (S1004) and sends it to the graphics drawing unit 713 (S1005).
- the determination process of the graphics information is the same as that of the first embodiment, and thus the description thereof will be omitted.
- the operation performed when a user event occurs also differs from that of the first embodiment, and will be described below.
- FIG. 12 is a flowchart showing a process for controlling the drawing of the graphics object when a user event occurs in the present embodiment.
- Steps S1210 to S1211 in the figure are the same as in the first embodiment, and a description thereof will be omitted.
- in step S1212, the user event receiving unit 701 sends the user event information to the graphics information control unit 703 of the application 700, and the graphics information control unit 703 receives the user event information (S1213).
- the graphics information control unit 703 determines the graphics information based on the graphics information and the user event information (S1214), and sends the determined graphics information to the graphics information transmitting unit 704 (S1215).
- the graphics information transmitting unit 704 transmits graphics information to the platform 710 (S1216), and the graphics information receiving unit 711 of the platform 710 receives the graphics information (S1200).
- this embodiment controls the drawing of graphics when the user switches the playback speed, building on the second embodiment described above.
- FIG. 18 is a functional block diagram of an application and a platform in the present embodiment.
- the application 1800 receives a mode switching event that occurs when the mode for reproducing AV data is switched, and transmits the mode switching event to the platform 1810.
- the mode indicates the playback speed of the AV data, for example 2x speed; when switching the playback speed, the user switches the mode by operating a remote controller or the like.
- the mode switching event indicates occurrence of a mode switching operation by the user.
- the mode switching operation by the user operation is received by the program processing unit 206, and a mode switching event indicating the operation is transmitted to the application 1800.
- the application 1800 is configured by adding a mode switching event receiving unit 1806, a filtering information setting unit 1807, and a filtering information transmitting unit 1808 to the configuration of the application according to the second embodiment.
- the mode switching event receiving unit 1806 receives a mode switching event that occurs when the mode for reproducing AV data is switched, and extracts information related to the mode switching event (hereinafter referred to as "mode switching event information").
- the mode switching event information includes mode information before and after the mode switching operation and the like.
- the filtering information setting unit 1807 sets filtering information for extracting a stream event based on the mode switching event information.
- the filtering information is, for example, information indicating "2x speed" when the user switches the mode from “1x speed” to "2x speed".
- Filtering information transmitting section 1808 transmits the set filtering information to the platform.
- the platform 1810 is configured by adding a filtering information receiving unit 1815 and a filtering information storage unit 1816 to the configuration of the platform of the second embodiment.
- the filtering information receiving unit 1815 receives the filtering information from the application 1800.
- the filtering information storage unit 1816 stores the received filtering information.
- FIG. 19 shows stream event information according to the present embodiment.
- the stream event information is obtained by adding mode information to the stream event information of the above-described embodiment.
- the mode information may include a plurality of reproduction modes indicating the reproduction mode of the AV data being reproduced.
- FIG. 20 is a flowchart illustrating a process of transmitting filtering information from application 1800 to platform 1810 when the AV data playback mode is switched by a user operation.
- the mode switching event receiving unit 1806 of the application 1800 receives a mode switching event from the program processing unit 206 (S2010).
- the mode switching event receiving unit 1806 extracts the mode switching event information from the received mode switching event (S2011), and sends the mode switching event information to the filtering information setting unit 1807 (S2012).
- Filtering information setting section 1807 sets filtering information based on the mode switching event information (S2013), and sends it to filtering information transmitting section 1808 (S2014).
- the filtering information transmitting unit 1808 transmits the filtering information to the platform 1810 (S2015).
- the filtering information receiving unit 1815 of the platform 1810 receives the filtering information (S2000), and sends out the filtering information to the filtering information storage unit 1816 (S2001).
- the filtering information storage unit 1816 stores the filtering information (S2002).
- when receiving a stream event from the presentation processing unit 208, the stream event receiving unit 1814 extracts the stream event information based on the filtering information stored in the filtering information storage unit 1816 in step S2002 described above.
- FIG. 21 is a flowchart showing processing for controlling the drawing of a graphics object in the present embodiment.
- in step S2100, the stream event receiving unit 1814 of the platform 1810 receives a stream event from the presentation processing unit 208 (S2100).
- the stream event receiving unit 1814 extracts stream event information (S2101), and reads out the filtering information from the filtering information storage unit 1816 (S2102).
- the stream event receiving unit 1814 extracts the stream event information based on the filtering information (S2103), and sends the extracted stream event information to the stream event information transmitting unit 1812 (S2104).
- the stream event information transmitting unit 1812 transmits the stream event information to the application 1800 (S2105).
- the stream event information receiving unit 1805 of the application 1800 receives the stream event information (S2110).
- the processing in step S2110 and subsequent steps is the same as in the above-described second embodiment (FIG. 12), and a description thereof will be omitted.
- in step S2200, the stream event receiving unit 1814 scans all received stream event information.
- the stream event receiving unit 1814 reads the filtering information from the filtering information storage unit 1816, and determines whether the mode information in the stream event information matches the filtering information (S2201).
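The extraction in steps S2200 to S2202 reduces to a match between each event's mode information and the stored filtering information. A sketch under assumed names and data shapes:

```python
def filter_by_mode(stream_events, filtering_info):
    """Sketch of S2200-S2202: keep only stream event information whose
    mode information matches the stored filtering information
    (e.g. '2x' after the user switches to double speed)."""
    return [ev for ev in stream_events if ev["mode"] == filtering_info]

events = [{"time": 10, "mode": "1x"}, {"time": 20, "mode": "2x"}]
print(filter_by_mode(events, "2x"))  # → [{'time': 20, 'mode': '2x'}]
```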
- when the stream event receiving unit 1814 determines in step S2201 that they match, it extracts the corresponding stream event information (S2202).
<Embodiment 4>
- as described above, the stream event information includes mode information. For example, when a mode switching event for switching the mode to 2x speed occurs, the stream event receiving unit 1814 extracts only the stream event information whose mode information is "2x speed", and transmits the extracted stream event information to the application.
- in the present embodiment, the same stream event information as in the first and second embodiments is used, and when a mode switching event occurs, the stream event receiving unit 1814 extracts appropriate stream event information according to the mode switching event.
- the configuration of the unit that transmits the stream event information to the application is the same as that of the first embodiment.
- the functional configuration of the application is the same as that of the third embodiment, but the stream event receiving unit 1814 of the platform includes a stream event counter, which differs from the functional configuration of the platform in the third embodiment.
- the stream event counter counts stream event information in chronological order and holds the counted number.
- the filtering information is stored in the filtering information storage unit 1816 of the platform 1810.
- FIG. 23 is a flowchart showing a process of extracting stream event information.
- in step S2300, the stream event receiving unit 1814 initializes the stream event counter.
- the stream event receiving unit 1814 scans all the received stream event information (S2301), and determines, by referring to the value held by the stream event counter, whether each item is stream event information corresponding to the filtering information (S2302).
- for example, when the filtering information is "2x speed", it is determined whether the stream event counter value is a multiple of 2.
- when the stream event receiving unit 1814 determines in step S2302 that the stream event information corresponds to the filtering information (S2302: Y), it extracts the stream event information (S2304).
- when it determines in step S2302 that the stream event information does not correspond to the filtering information (S2302: N), it updates the stream event counter (S2303).
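The counter-based extraction can be sketched as a simple decimation: at N-times speed, only events whose counter value is a multiple of N are kept. The exact counter-update rule in the flowchart is ambiguous in the translation, so a straightforward modulo scheme is assumed here:

```python
def filter_by_counter(stream_events, speed_factor):
    """Sketch of S2300-S2304: count stream events in arrival order and,
    at N-times speed, keep only events whose counter is a multiple of N.
    The update rule is an assumption; the source only states that events
    whose counter matches the filtering information are extracted."""
    counter = 0
    extracted = []
    for ev in stream_events:
        if counter % speed_factor == 0:   # S2302: counter a multiple of N?
            extracted.append(ev)          # S2304: extract
        counter += 1                      # S2303: advance the counter
    return extracted

print(filter_by_counter([1, 2, 3, 4, 5], 2))  # → [1, 3, 5]
```

At 1x speed every event is kept; at 2x speed every second event is dropped, halving the drawing load.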
- the playback device according to the present invention has been described based on Embodiments 1 to 4, but the following modifications can also be made; the present invention is, of course, not limited to the playback device described in the above embodiments.
- the program according to the first embodiment operates in cooperation with one AV stream.
- the present invention can also be applied to a case where a plurality of types of AV streams are stored in a BD, and the program operates in cooperation with these AV streams.
- in this case, when one AV data item is selected for playback, the platform 1410 receives the graphics information corresponding to the selected AV data. In this way, whenever playback of an AV data item is instructed, the platform receives the graphics information corresponding to that AV data, and graphics can be drawn in cooperation with the AV stream being played without preparing a separate program for each AV stream.
- although the above description assumes a single application operating on the platform, the present invention can also be applied to a case where a plurality of applications are operating.
- in this case, the platform receives graphics information from each application when the applications are started.
- the graphics information control unit 1414 selects the graphics information corresponding to each application, and causes the graphics drawing unit 1415 to draw the graphics corresponding to each application.
- in the above embodiments, the platform draws a graphics object based on graphics information predetermined in the application; for example, the graphics data may be recorded on the same BD as the AV data being reproduced, and the AV data to be reproduced may be switched.
- in the above embodiments, when the user switches the reproduction mode, the user switches to one of the predetermined modes. However, if the user specifies a playback speed using, for example, a joystick, the filtering information may be set as if the mode had been switched to the predetermined mode corresponding to that playback speed.
- when the graphics drawing unit instructs the image plane to draw graphics, if new graphics cannot be drawn in time, the graphics to be drawn by the platform may be appropriately thinned out.
- instead of the application setting the filtering information, the platform may set the filtering information based on the mode switching event information.
- in this case, the mode switching event information is transmitted to the platform, and the platform receives the mode switching event information and determines the graphics information to be drawn based on it.
- the stream event information may include information similar to the counter used when extracting the stream event information (hereinafter referred to as "counter information").
- the application may set more detailed filtering information, such as extracting only stream event information in which a predetermined bit of the counter information has a certain value.
- in the above embodiments, graphics are drawn in accordance with stream events embedded in the AV data being reproduced; alternatively, graphics may be drawn based on time information, described in the graphics information in advance, that indicates the reproduction timing of each video frame of the AV data.
- the playback device according to the present invention can execute a program that operates in cooperation with the AV data stream being played back, and can draw graphics in cooperation with that stream. It is therefore useful in the movie industry and in the consumer electronics industry that manufactures equipment for processing such content; for example, it can be applied to BD-ROM discs, BD-ROM players, and the like.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006513031A JP4048223B2 (ja) | 2004-05-11 | 2005-05-10 | 再生装置 |
EP05739195.5A EP1746827B1 (en) | 2004-05-11 | 2005-05-10 | Reproduction device |
US10/596,107 US8238714B2 (en) | 2004-05-11 | 2005-05-10 | Reproduction device |
MXPA06012895A MXPA06012895A (es) | 2004-05-11 | 2005-05-10 | Dispositivo de reproduccion. |
BRPI0511014-9A BRPI0511014A (pt) | 2004-05-11 | 2005-05-10 | dispositivo de reprodução |
US12/609,095 US8724965B2 (en) | 2004-05-11 | 2009-10-30 | Reproduction device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-141558 | 2004-05-11 | ||
JP2004141558 | 2004-05-11 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/596,107 A-371-Of-International US20080241827A1 (en) | 2004-05-10 | 2005-05-10 | Methods For Detecting A Mutant Nucleic Acid |
US12/609,095 Continuation US8724965B2 (en) | 2004-05-11 | 2009-10-30 | Reproduction device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005109875A1 true WO2005109875A1 (ja) | 2005-11-17 |
Family
ID=35320592
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/008531 WO2005109875A1 (ja) | 2004-05-11 | 2005-05-10 | 再生装置 |
Country Status (9)
Country | Link |
---|---|
US (2) | US8238714B2 (ja) |
EP (2) | EP2528325A1 (ja) |
JP (3) | JP4048223B2 (ja) |
CN (4) | CN101494077B (ja) |
BR (1) | BRPI0511014A (ja) |
MX (1) | MXPA06012895A (ja) |
MY (1) | MY157654A (ja) |
TW (3) | TWI364032B (ja) |
WO (1) | WO2005109875A1 (ja) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TW200839560A (en) * | 2004-05-11 | 2008-10-01 | Matsushita Electric Ind Co Ltd | Reproducer, program, and reproducing method |
CN101814307A (zh) * | 2004-07-22 | 2010-08-25 | 松下电器产业株式会社 | 重放装置和重放方法 |
CA2609967A1 (en) * | 2005-05-31 | 2006-12-07 | Matsushita Electric Industrial Co. Ltd. | Recording and reproduction apparatus and recording and reproduction method |
WO2006129819A1 (en) | 2005-05-31 | 2006-12-07 | Matsushita Electric Industrial Co., Ltd. | Broadcast receiving terminal and program execution method |
JP4763589B2 (ja) * | 2006-12-18 | 2011-08-31 | 株式会社日立製作所 | 再生装置、および、その再生方法 |
US8559789B2 (en) * | 2007-06-06 | 2013-10-15 | Panasonic Corporation | Reproducing apparatus that uses continuous memory area |
US8380042B2 (en) * | 2008-04-16 | 2013-02-19 | Panasonic Corporation | Reproduction device, reproduction method, and program |
JP4962674B1 (ja) * | 2009-04-03 | 2012-06-27 | ソニー株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
JP4915456B2 (ja) * | 2009-04-03 | 2012-04-11 | ソニー株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
JP2011210052A (ja) * | 2010-03-30 | 2011-10-20 | Sharp Corp | ネットワークシステム、通信方法、および通信端末 |
KR102056893B1 (ko) * | 2012-08-24 | 2019-12-17 | 에스케이하이닉스 주식회사 | 반도체 장치 |
JP6855348B2 (ja) * | 2017-07-31 | 2021-04-07 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置およびダウンロード処理方法 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08256322A (ja) * | 1995-03-17 | 1996-10-01 | Nec Corp | 双方向通信による動画再生システム |
JP2001157175A (ja) * | 1999-09-17 | 2001-06-08 | Toshiba Corp | メディア時間管理情報記述方法、マルチメディア情報処理装置、マルチメディア情報処理方法および記録媒体 |
JP2005092971A (ja) * | 2003-09-17 | 2005-04-07 | Hitachi Ltd | プログラム及び記録媒体、再生装置 |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2748562B2 (ja) * | 1988-07-13 | 1998-05-06 | セイコーエプソン株式会社 | 画像処理装置 |
CA2168327C (en) * | 1995-01-30 | 2000-04-11 | Shinichi Kikuchi | A recording medium on which a data containing navigation data is recorded, a method and apparatus for reproducing a data according to navigationdata, a method and apparatus for recording a data containing navigation data on a recording medium. |
CN1099806C (zh) * | 1996-04-05 | 2003-01-22 | 松下电器产业株式会社 | 多媒体光盘及其再生装置、再生方法 |
US5845075A (en) * | 1996-07-01 | 1998-12-01 | Sun Microsystems, Inc. | Method and apparatus for dynamically adding functionality to a set of instructions for processing a Web document based on information contained in the Web document |
JPH10211358A (ja) * | 1997-01-28 | 1998-08-11 | Sega Enterp Ltd | ゲーム装置 |
JPH1127641A (ja) * | 1997-07-07 | 1999-01-29 | Toshiba Corp | テレビジョン受信機 |
CN1231049C (zh) * | 1998-09-08 | 2005-12-07 | 夏普公司 | 动态图像编辑方法及动态图像编辑装置及动态图像回放装置 |
JP2000100073A (ja) * | 1998-09-28 | 2000-04-07 | Sony Corp | 記録装置および方法、再生装置および方法、記録媒体、並びに提供媒体 |
WO2000028737A1 (fr) * | 1998-11-05 | 2000-05-18 | Tokyo Broadcasting System, Inc. | Terminal de reception, procede de commande de ce dernier et support d'enregistrement de programmes |
US6269373B1 (en) * | 1999-02-26 | 2001-07-31 | International Business Machines Corporation | Method and system for persisting beans as container-managed fields |
EP1104128A4 (en) * | 1999-05-28 | 2005-12-28 | Matsushita Electric Ind Co Ltd | BROADCASTING SYSTEM |
EP1089199A3 (en) | 1999-09-17 | 2003-11-05 | Kabushiki Kaisha Toshiba | Media time management information describing method, multimedia information processing apparatus, and multimedia information processing method |
JP2001092971A (ja) | 1999-09-24 | 2001-04-06 | Sony Corp | 画像処理装置および方法、並びに記録媒体 |
EP1156486B1 (en) | 2000-04-20 | 2016-04-06 | Hitachi Maxell, Ltd. | Digital signal recording/reproducing apparatus, receiving apparatus and transmitting method |
CN1239021C (zh) * | 2000-04-21 | 2006-01-25 | 索尼公司 | 信息处理设备及方法、程序和记录介质 |
KR100564893B1 (ko) * | 2000-06-30 | 2006-03-30 | 가부시끼가이샤 도시바 | 프레임정보 기술방법 및 그 장치, 프레임정보 생성방법 및 그 장치, 특수재생방법 및 그 장치, 특수재생 제어정보 생성장치 및 컴퓨터 독출가능한 기록매체 |
US6964025B2 (en) * | 2001-03-20 | 2005-11-08 | Microsoft Corporation | Auto thumbnail gallery |
WO2003036644A1 (en) * | 2001-10-23 | 2003-05-01 | Samsung Electronics Co., Ltd. | Information storage medium containing event occurrence information, and method and apparatus therefor |
KR100609392B1 (ko) * | 2002-09-05 | 2006-08-08 | 엘지전자 주식회사 | 정지 영상의 재생을 관리하기 위한 재생리스트 마크의데이터 구조를 갖는 기록 매체, 그에 따른 기록 및 재생방법 및 장치 |
CN1695197B (zh) * | 2002-09-12 | 2012-03-14 | 松下电器产业株式会社 | 播放设备、播放方法、以及记录介质的记录方法 |
EP1547080B1 (en) * | 2002-10-04 | 2012-01-25 | LG Electronics, Inc. | Recording medium having a data structure for managing reproduction of graphic data and recording and reproducing methods and apparatuses |
ATE447229T1 (de) * | 2003-01-20 | 2009-11-15 | Lg Electronics Inc | Aufzeichnungsmedium mit einer datenstruktur zur verwaltung der wiedergabe von darauf aufgezeichneten standbildern und aufzeichnungs- und wiedergabeverfahren und vorrichtungen |
ES2343065T3 (es) * | 2003-01-31 | 2010-07-22 | Panasonic Corporation | Medio de registro, dispositivo de reproduccion, procedimiento de registro, programa y procedimiento de reproduccion para un flujo de datos graficos especificando botones interactivos. |
KR20040080736A (ko) * | 2003-03-13 | 2004-09-20 | 삼성전자주식회사 | 인터랙티브 컨텐츠 동기화 장치 및 방법 |
GB0311141D0 (en) * | 2003-05-15 | 2003-06-18 | Koninkl Philips Electronics Nv | DVD player enhancement |
JP2004343532A (ja) * | 2003-05-16 | 2004-12-02 | Sony Corp | 符号化装置および方法、復号装置および方法、記録装置および方法、並びに再生装置および方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1746827A4 * |
Also Published As
Publication number | Publication date |
---|---|
EP1746827B1 (en) | 2018-10-24 |
CN101521034B (zh) | 2012-05-02 |
CN1954603A (zh) | 2007-04-25 |
JP2008252920A (ja) | 2008-10-16 |
CN101494077B (zh) | 2012-09-05 |
JP4654265B2 (ja) | 2011-03-16 |
TWI364032B (en) | 2012-05-11 |
EP2528325A1 (en) | 2012-11-28 |
JP2008206193A (ja) | 2008-09-04 |
CN101521034A (zh) | 2009-09-02 |
CN101494077A (zh) | 2009-07-29 |
TW200620262A (en) | 2006-06-16 |
CN101494076B (zh) | 2012-09-05 |
BRPI0511014A (pt) | 2007-11-20 |
EP1746827A1 (en) | 2007-01-24 |
CN101494076A (zh) | 2009-07-29 |
TW200842855A (en) | 2008-11-01 |
MY157654A (en) | 2016-07-15 |
JPWO2005109875A1 (ja) | 2008-03-21 |
US8724965B2 (en) | 2014-05-13 |
JP4231541B2 (ja) | 2009-03-04 |
TWI361430B (en) | 2012-04-01 |
JP4048223B2 (ja) | 2008-02-20 |
TWI371750B (en) | 2012-09-01 |
US20100046920A1 (en) | 2010-02-25 |
EP1746827A4 (en) | 2011-10-12 |
MXPA06012895A (es) | 2007-01-26 |
US20080285947A1 (en) | 2008-11-20 |
TW200844991A (en) | 2008-11-16 |
CN100484226C (zh) | 2009-04-29 |
US8238714B2 (en) | 2012-08-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4048223B2 (ja) | | Playback device |
JP4126066B2 (ja) | | Recording medium playback system, playback device, recording method, playback method, and program |
KR20050048612A (ko) | | Information recording medium, information recording apparatus and method, information reproducing apparatus and method, information recording/reproducing apparatus and method, computer program for recording or reproduction control, and data structure including a control signal |
WO2009130862A1 (ja) | | Information recording apparatus and information recording method |
KR100884395B1 (ko) | | Recording medium playback device capable of automatically setting the playback mode, and control method therefor |
JP4231535B2 (ja) | | Playback device |
JP5166036B2 (ja) | | Playback device, playback method, and playback program |
JP2007133938A (ja) | | Information recording medium having a flag indicating whether audio mixing output is permitted, and playback device and playback method therefor |
WO2005004156A1 (ja) | | Information recording medium, information recording apparatus and method, information reproducing apparatus and method, information recording/reproducing apparatus and method, computer program for recording or reproduction control, and data structure including a control signal |
WO2004082273A1 (ja) | | Information recording apparatus and method, information recording medium, and computer program for recording control |
CN106104687B (zh) | | Recording medium, reproduction device, and method thereof |
JP2003304498A (ja) | | Information recording medium, information recording apparatus and method, information reproducing apparatus and method, information recording/reproducing apparatus and method, computer program for recording or reproduction control, and data structure including a control signal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AK | Designated states | Kind code of ref document: A1. Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
| AL | Designated countries for regional patents | Kind code of ref document: A1. Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| WWE | Wipo information: entry into national phase | Ref document number: 10596107. Country of ref document: US |
| WWE | Wipo information: entry into national phase | Ref document number: PA/a/2006/012895. Country of ref document: MX |
| WWE | Wipo information: entry into national phase | Ref document number: 2006513031. Country of ref document: JP |
| WWE | Wipo information: entry into national phase | Ref document number: 200580015061.4. Country of ref document: CN |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWW | Wipo information: withdrawn in national office | Country of ref document: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 2005739195. Country of ref document: EP |
| WWP | Wipo information: published in national office | Ref document number: 2005739195. Country of ref document: EP |
| ENP | Entry into the national phase | Ref document number: PI0511014. Country of ref document: BR |