WO2003105474A1 - Image detection device, image detection method, and image detection program - Google Patents
- Publication number: WO2003105474A1 (PCT/JP2003/007409)
- Authority: WIPO (PCT)
Classifications
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
- G11B27/034—Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/102—Programmed access in sequence to addressed parts of tracks of operating record carriers
- G11B27/105—Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information signals recorded by the same method as the main recording
- G11B27/32—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information signals recorded on separate auxiliary tracks of the same or an auxiliary record carrier
- G11B27/327—Table of contents
- G11B27/329—Table of contents on a disc [VTOC]
- G11B27/34—Indicating arrangements
- G11B27/36—Monitoring, i.e. supervising the progress of recording or reproducing
- G11B5/00—Recording by magnetisation or demagnetisation of a record carrier; Reproducing by magnetic means; Record carriers therefor
- G11B5/74—Record carriers characterised by the form, e.g. sheet shaped to wrap around a drum
- G11B5/76—Drum carriers
- G11B2220/00—Record carriers by type
- G11B2220/20—Disc-shaped record carriers
- G11B2220/25—Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
- G11B2220/2508—Magnetic discs
- G11B2220/2516—Hard disks
- G11B2220/2525—Magneto-optical [MO] discs
- G11B2220/60—Solid state media
- G11B2220/61—Solid state media wherein solid state memory is used for storing A/V content
Definitions
- Image detection device, image detection method, and image detection program
- The present invention relates to an image detecting apparatus for detecting a target image from image information recorded on a recording medium mounted in, for example, a digital recorder, and to an image detecting method and an image detecting program applied to such an image detecting apparatus.
- Some existing home digital recorders have an editing function.
- A home digital recorder equipped with an editing function performs editing operations, such as deleting unnecessary scenes from a recorded TV program or extracting only the necessary scenes, after the recorder has been switched to an edit mode.
- To edit, the user first switches the home digital recorder to the edit mode and then plays back the recorded television program to be edited. While checking the playback image, the user searches for the edit point of the target scene by operating keys such as the "play key", "fast-forward key", "rewind key", and "stop key" provided on the remote controller (remote control device) of the recorder.
- Once an edit point is found, the desired editing operation is performed by operating the editing operation keys. When the edit point must be specified as accurately as possible, the "slow playback key" and "frame advance key" are used to search for the frame image that is most appropriate as the edit point.
- Editing operation keys such as the "chapter division key", "start point specification key", "end point specification key", "delete key", and "extraction key" are enabled only in the edit mode, which prevents erroneous operation of keys used only at the time of editing.
- As described above, editing of recorded data becomes possible only after switching to the edit mode. Depending on the user, however, it may take some time before the switch to the edit mode itself becomes known, and hence before editing of recorded data is actually performed.
- Moreover, the recording time of a moving image that can be recorded on the recording medium keeps increasing. When determining edit points in a program recorded over a long period, playing back the program from the beginning to search for edit points would take an enormous amount of time. Even if the program is played back from some midway position, the desired edit point may lie in a skipped portion and thus fail to be detected. Measures are therefore desired that allow a target scene to be detected quickly and accurately from image information whose playback time spans several hours.
- The present invention addresses the above problems and provides an image detecting apparatus capable of quickly and accurately detecting target image information from recorded image information, together with an image detecting method and an image detecting program used in the apparatus.

Disclosure of the invention
- The image detecting device of this invention comprises: extracting means for extracting one or more frame images from a plurality of frame images constituting a moving image; display control means for controlling the display of an extracted frame image together with a predetermined number of frame images temporally before and after it; and instruction means for selecting and indicating a desired frame image from among the frame images whose display is controlled by the display control means.
- With this image detecting device, the extracting means extracts, for example, a frame image designated by the user from the plurality of frame images constituting the moving image, and the display control means controls the display so that the extracted frame image and the predetermined number of frame images before and after it are displayed simultaneously. The target frame image is then indicated through the instruction means.
- In this way, an edit point such as the start point or end point of a target scene can be detected quickly, accurately, and easily from the plurality of frame images constituting a moving image.
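The claimed flow can be sketched in a few lines of Python (a purely hypothetical illustration: the function names and the `choose` callback stand in for the display control means and instruction means, which the patent realizes as an on-screen UI):

```python
# Hypothetical sketch of the claimed edit-point selection flow:
# extract one frame, display it with n temporal neighbours, and let
# the user indicate the frame that best marks the scene boundary.

def neighbours(frames, index, n=2):
    """Return the chosen frame plus up to n frames before and after it."""
    lo = max(0, index - n)
    hi = min(len(frames), index + n + 1)
    return frames[lo:hi]

def pick_edit_point(frames, designated_index, choose):
    """choose() stands in for the user's instruction means: it receives
    the simultaneously displayed candidates and returns one of them."""
    candidates = neighbours(frames, designated_index)
    return choose(candidates)

# Example: 10 frames, user designates frame 5, then indicates the
# earliest displayed candidate as the start point of the scene.
frames = list(range(10))
start = pick_edit_point(frames, 5, choose=min)
print(start)  # → 3
```

The point of the neighbourhood display is that the user never has to scrub the whole recording: one designated frame narrows the search to a handful of simultaneously visible candidates.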
- the image detecting device of the invention according to claim 5 is:
- first extraction means for extracting a predetermined number of frame images from a plurality of frame images constituting a moving image;
- first display control means for controlling the display of a predetermined number of moving images, using the frame images extracted by the first extraction means as their initial images;
- first instruction means for selecting and indicating a desired moving image from among the predetermined number of moving images whose display is controlled by the first display control means;
- second extraction means for extracting an arbitrary frame image from the moving image indicated through the first instruction means;
- second display control means for controlling the display of the frame image extracted by the second extraction means together with a predetermined number of frame images temporally before and after it; and
- second instruction means for selecting and indicating a desired frame image from among the frame images whose display is controlled by the second display control means.
- With this image detecting device, the first extraction means extracts, for example, frame images designated by the user from the plurality of frame images constituting the moving image, and the first display control means controls the display of a predetermined number of moving images that use the extracted frame images as their initial images.
- A target moving image is then designated through the first instruction means from among the moving images displayed under the control of the first display control means, and the second extraction means extracts an arbitrary frame image from the indicated moving image.
- The frame image extracted by the second extraction means and the frame images before and after it are displayed under the control of the second display control means, and the target frame image can be indicated from among the displayed images through the second instruction means.
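The two-stage narrowing described above might be sketched as follows (again purely illustrative; the equal-length segment split and the two `pick_*` callbacks are stand-ins for the first and second extraction and instruction means):

```python
# Hypothetical two-stage narrowing: stage 1 splits the recording into
# equal segments and shows one moving image per segment; stage 2
# drills into the chosen segment frame by frame.

def split_segments(frames, count):
    """Stage 1: divide the frame sequence into `count` segments, each
    of which would be played back as one small moving image."""
    size = max(1, len(frames) // count)
    return [frames[i:i + size] for i in range(0, len(frames), size)][:count]

def drill_down(frames, count, pick_segment, pick_frame):
    """pick_segment / pick_frame stand in for the first and second
    instruction means (user selections on the displayed candidates)."""
    segment = pick_segment(split_segments(frames, count))
    return pick_frame(segment)

frames = list(range(100))
# Example: the user picks the third of five segments, then its first frame.
frame = drill_down(frames, 5,
                   pick_segment=lambda s: s[2],
                   pick_frame=lambda seg: seg[0])
print(frame)  # → 40
```

The coarse-then-fine structure is what makes a multi-hour recording searchable: the first stage reduces hours of video to a few candidate segments, and only the chosen segment is examined frame by frame.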
- FIG. 1 is a diagram for explaining an image processing system formed by using a recording / reproducing device to which the image detecting device according to the present invention is applied.
- FIG. 2 is a block diagram for explaining a recording / reproducing device to which the image detecting device according to the present invention is applied.
- FIG. 3 is a diagram for explaining an example of a display image displayed on the display screen 201 immediately after an operation is performed so that a chapter mark is added.
- FIG. 4 is a diagram for explaining a display example in a case where reduced still images (thumbnails) are displayed.
- FIGS. 5A to 5C are diagrams for explaining selection of a reduced still image, scroll display, and menu display for the reduced still image.
- FIG. 6 is a diagram for explaining the use of the timeline and the triangle mark.
- FIG. 7 is a diagram for explaining examples of display items displayed on the display screen.
- FIG. 8 is a flowchart for explaining the operation at the time of reproduction.
- FIG. 9 is a flowchart following FIG.
- FIG. 10 is a flowchart following FIG.
- FIG. 11 is a diagram for explaining another example of the display image.
- FIG. 12 is a diagram for explaining another example of the display image.
- FIG. 13 is a diagram for explaining a second embodiment of the recording / reproducing device to which the image detecting device according to the present invention is applied.
- FIG. 14 is a diagram for explaining an example of a display image formed by the recording / reproducing apparatus shown in FIG.
- FIG. 15 is a diagram for explaining another example of a display image formed by the recording / reproducing apparatus shown in FIG.
- FIG. 16 is a view for explaining another example of a display image formed in the recording / reproducing apparatus shown in FIG.
- FIG. 17 is a diagram for explaining the operation of the user and the recording / reproducing device 600 when selecting an image to be used as an editing candidate point from images of a target scene.
- FIG. 18 is a diagram for explaining the operation of the user and the recording / reproducing device 600 when selecting an image to be used as an editing candidate point from images of a target scene.
- FIG. 19 is a flowchart for explaining the operation of the recording / reproducing apparatus of the second embodiment shown in FIG.
- FIG. 20 is a flowchart following FIG.
- FIG. 21 is a diagram for explaining processing in a case where a subsequent playback point catches up with a previously set editing candidate point in the recording / playback apparatus of the second example of the second embodiment.
- FIG. 22 is a diagram for explaining a case where the cursor CS is not positioned at the sub-image display area SG2.
- FIG. 23 is a diagram for explaining the operation of the recording / reproducing apparatus after setting the edit point based on the reproduction point that has caught up to the edit candidate point.
- FIG. 24 is a diagram for explaining an example in which the playback speed is varied according to the position of the playback point.
- FIG. 25 is a diagram for explaining an example in which the reproduction speed is varied according to the position of the reproduction point.
- FIG. 26 is a diagram for explaining an example in which the reproduction speed is varied according to the position of the reproduction point.
- FIGS. 27A and 27B are diagrams for explaining a case where a succeeding playback point overtakes a preceding playback point.
- FIGS. 28A and 28B are diagrams for explaining a case where a succeeding playback point overtakes a preceding playback point.
- FIGS. 29A to 29C are diagrams for explaining a case where a succeeding reproduction point overtakes a preceding reproduction point.
- FIG. 30 is a diagram for describing automatic setting of a reproduction point when a reproduction speed is set by a user.
- FIGS. 31A to 31C are diagrams for explaining automatic setting of a reproduction point when a reproduction speed is set by a user.
- FIG. 32 is a diagram for explaining automatic setting of the playback point when the playback speed is set by the user.
- FIG. 33 is a diagram for explaining an image in which editing candidate point information is displayed using past editing point information.
- FIG. 34 is a diagram for explaining an image in which editing candidate point information is displayed using past editing point information.
- FIG. 1 is a diagram for explaining a recording / reproducing device 100 to which an image detection device, an image detection method, and an image detection program according to the present invention are applied.
- To the recording / reproducing device 100, various digital and analog devices, including a monitor receiver that is the supply destination of the reproduced image signal formed by the recording / reproducing device 100, can be connected as sources and destinations of information including image information, thereby configuring an image processing system.
- FIG. 1 is a diagram for explaining an image processing system configured using the recording / reproducing device 100 of this embodiment.
- As shown in FIG. 1, the recording / reproducing apparatus 100 is connected to a BS / CS tuner 300 as a digital device, a terrestrial television tuner 400 as an analog device, and a video camera 500 as a digital device.
- The devices that can be connected to the recording / reproducing device 100 are not limited to those shown in FIG. 1.
- For example, various playback and recording / playback devices, such as a DVD (Digital Versatile Disk) player, a VTR (Video Tape Recorder), and a recording / playback device using a hard disk as a recording medium, can also be connected.
- The recording / reproducing device 100 can record digital information including image information, such as the broadcast signal from the BS / CS tuner 300, the broadcast signal from the terrestrial TV tuner 400, and the recorded information from the video camera 500, as a signal on the recording medium (hard disk) built into the device itself.
- The recording / reproducing apparatus 100 can also read image information recorded on its own recording medium, form a reproduced image signal for displaying an image on the display screen of the monitor receiver 200 serving as a display element, and supply this to the monitor receiver 200. As a result, an image corresponding to the image information recorded on the hard disk of the recording / reproducing apparatus 100 is displayed on the display screen of the monitor receiver 200 so that the user can view it.
- broadcast signals that provide television broadcast programs and recorded information from video cameras include audio information in addition to image information.
- Therefore, the recording / reproducing device 100 records audio information in addition to image information on the recording medium, and can reproduce the recorded information. The reproduced audio information is supplied to, for example, a speaker provided in the monitor receiver, although this is not shown.
- The recording / reproducing apparatus 100 can be given various instructions by operating the operation key group of the operation panel 45 provided on the front panel surface of its main body, or by operating the operation key group of the remote commander (remote control device) 150.
- An infrared remote control signal corresponding to the operation of the operation keys of the remote controller 150 is output from the remote control signal output section 151 of the remote controller 150. When this signal is received by the light receiving section provided on the front panel surface of the recording / reproducing apparatus 100 main body, information corresponding to the operation of the operation key group of the remote controller 150 is conveyed to the recording / reproducing apparatus 100.
- Through these keys it is possible to input to the recording / reproducing apparatus 100 editing instructions such as adding a mark called a chapter mark near a target scene of the reproduced image information, deleting an image section to which a chapter mark has been added, or moving such an image section.
- FIG. 2 is a block diagram for explaining the configuration of the recording / reproducing apparatus 100 of the present embodiment.
- The recording / reproducing apparatus 100 includes a CPU (Central Processing Unit) 40. The CPU 40 accesses a ROM (Read Only Memory) 41, a RAM (Random Access Memory) 42, and an EEPROM (Electrically Erasable and Programmable ROM) 43 via a host bus as needed, and performs overall control of the recording / reproducing device (image detection device) 100.
- a light receiving section 44 for infrared remote control signals is connected to the host bus.
- the light receiving section 44 receives an infrared remote control signal from the remote controller 150, converts it into an electric signal, and supplies the electric signal to the CPU 40.
- The ROM 41 stores various programs executed in the recording / reproducing apparatus 100 of this embodiment and various data necessary for processing, while the RAM 42 is mainly used as a work area, for example for temporarily recording intermediate results of processing.
- The EEPROM 43 is a so-called non-volatile memory for storing data that must be retained even when the power is turned off, for example various setting data.
- As shown in FIG. 2, the recording / reproducing apparatus 100 of this embodiment has, as input terminals, a digital input terminal 1, a digital input / output terminal 2, an analog audio input terminal 4, and analog image input terminals 8 and 10.
- the analog image input terminal 8 is for a composite image signal (Cps)
- the analog image input terminal 10 is for a separate image signal (S).
- the recording / reproducing apparatus 100 is provided with a digital input / output terminal 2, an analog audio output terminal 22, and analog image output terminals 28, 29 as output terminals.
- the analog image output terminal 28 is for a composite image signal (Cps)
- the analog image output terminal 29 is for a separate image signal (S).
- The recording / reproducing apparatus 100 of this embodiment also includes a communication connection terminal 31 and, through a communication interface (hereinafter abbreviated as communication I / F) 30, can send and receive various types of data over a communication network such as the Internet.
- With this configuration, the recording / reproducing apparatus 100 can record the image signal and the audio signal received through the above-described input terminals and input / output terminal on the recording medium 18, output them through the above-described output terminals and input / output terminal, and read out, reproduce, and output through each output terminal the image signal and the audio signal recorded on the recording medium 18.
- Data received through the communication connection terminal 31 can likewise be recorded on the recording medium 18 or output digitally; if the received data is image data or audio data, it can be converted into an analog signal and output through an analog output terminal.
- As shown in FIG. 1, in this embodiment the BS / CS tuner 300, the terrestrial TV tuner 400, and the video camera 500 are connected to the recording / reproducing apparatus 100.
- the BS / CS tuner 300 is connected through the digital input terminal 1
- the terrestrial TV tuner 400 is connected through the analog audio input terminal 4 and the analog image input terminal 8 or the analog image input terminal 10.
- the video camera 500 is connected through the digital input / output terminal 2.
- The digital BS / CS tuner 300 is connected to a parabolic antenna for receiving a digital broadcast signal from a satellite.
- The digital BS / CS tuner 300 receives and tunes to the desired digital broadcast signal on the basis of a tuning control signal responding to a tuning instruction from the user, and supplies the received and tuned digital broadcast signal to the recording / reproducing apparatus 100 through the digital input terminal 1. The digital broadcast signal supplied through the digital input terminal 1 is supplied to the multiplexing / demultiplexing circuit 16.
- A digital broadcast signal is formed by packetizing and multiplexing various control data, such as channel selection information called PSI (Program Specific Information) and EPG (Electronic Program Guide) data for forming an electronic program guide table, together with the image data (video data), audio data, and various other data that make up the broadcast programs transmitted on each channel.
- An identifier is added to each packet; this identifier makes it possible to extract the PSI data and the EPG data, and to extract the image packets and audio packets that make up the same program.
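The packet-identifier mechanism can be illustrated with a small sketch (a simplification only: packets are modelled as tuples rather than real 188-byte MPEG-2 transport stream packets, and the PID values are made up for the example):

```python
# Simplified demultiplexing by packet identifier (PID), the mechanism
# described above: collect the video and audio packets of one program
# and discard everything else (other programs, PSI, EPG, null packets).

def demultiplex(packets, video_pid, audio_pid):
    """Split a stream of (pid, payload) tuples into the video and
    audio elementary-stream payloads of a single program."""
    video, audio = [], []
    for pid, payload in packets:
        if pid == video_pid:
            video.append(payload)
        elif pid == audio_pid:
            audio.append(payload)
    return video, audio

stream = [(0x100, "v0"), (0x101, "a0"), (0x1FFF, "null"),
          (0x100, "v1"), (0x101, "a1")]
video_es, audio_es = demultiplex(stream, video_pid=0x100, audio_pid=0x101)
print(video_es, audio_es)  # → ['v0', 'v1'] ['a0', 'a1']
```

This is essentially what the multiplexing / demultiplexing circuit 16 does in hardware: filter one program's packets out of the transport stream and reassemble them into elementary streams for the decoders.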
- The multiplexing / demultiplexing circuit 16 extracts the PSI and EPG data from the TS signal supplied by the digital BS / CS tuner 300 and supplies them to the CPU 40, thereby enabling program selection. An electronic program guide table is also formed and output so that it can be displayed in response to an instruction from the user, enabling program selection and recording reservation through the electronic program guide table.
- When recording of a selected program is instructed, the multiplexing / demultiplexing circuit 16 extracts the image packets and audio packets of the target program selected by the user from the TS signal supplied by the digital BS / CS tuner 300, forms a new TS signal composed of these and the necessary control data, and records it on the recording medium 18 through the buffer control circuit 17.
- At the same time, the multiplexing / demultiplexing circuit 16 forms an image ES (Elementary Stream) from the image packets of the target program extracted from the TS signal supplied by the digital BS / CS tuner 300 and supplies it to the MPEG (Moving Picture Experts Group) video decoder 23, and forms an audio ES (Elementary Stream) from the audio packets and supplies it to the MPEG audio decoder 19.
- the MPEG audio decoder 19 decodes the audio ES supplied thereto, obtains baseband audio data, and supplies this to the post audio signal processing circuit 20.
- The MPEG video decoder 23 decodes the image ES supplied to it, obtains baseband image data, and supplies the data to the post video signal processing circuit 24.
- The post video signal processing circuit 24 switches between the image data from the MPEG video decoder 23 and the image data from the pre video signal processing circuit 14 described later, performs screen synthesis and filter processing, and supplies the resulting image data to the synthesizing circuit 26.
- The synthesizing circuit 26 receives the image data from the post video signal processing circuit 24, the image data of reduced still images called thumbnails from the still image generating circuit 25, and the graphics data for screen display, character data, and the like supplied from the CPU 40, performs processing such as synthesizing them and partially superimposing displays, and supplies the processed image data to the NTSC encoder 27.
- The NTSC encoder 27 converts the input image data (component digital signal) into a YC signal, then performs D / A conversion to generate an analog composite image signal (Cps) and a separate image signal (S), and outputs them through the analog image output terminals 28 and 29.
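The Y/C split that the NTSC encoder and the YC separation circuit work with is, in its standard ITU-R BT.601 form, the following weighting (shown purely as an illustration of what the luminance and colour-difference components are; it is the standard definition, not the patent's circuit):

```python
# Illustrative BT.601 luma / colour-difference computation: the same
# Y (luminance) and C (colour) split that the NTSC encoder produces
# and the YC separation circuit recovers. Standard definition only,
# not the patent's implementation.

def rgb_to_ycc(r, g, b):
    """Map normalised RGB (0..1) to luminance Y and colour differences."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = (b - y) * 0.564  # scaled B-Y colour difference
    cr = (r - y) * 0.713  # scaled R-Y colour difference
    return y, cb, cr

# Pure white carries full luminance and zero colour difference.
y, cb, cr = rgb_to_ycc(1.0, 1.0, 1.0)
print(round(y, 3), round(cb, 3), round(cr, 3))  # → 1.0 0.0 0.0
```

Keeping Y separate from C is what makes both the composite signal (Y and modulated C summed) and the separate S signal (Y and C on distinct lines) possible from the same encoder.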
- the post-audio signal processing circuit 20 performs switching between audio data from the MPEG audio decoder 19 and audio data from the pre-audio signal processing circuit 9, filtering, fading, and speech speed conversion. Then, the processed audio data is supplied to the audio D / A converter 21.
- the audio D / A converter 21 converts the audio data supplied thereto into an analog audio signal and outputs this through an analog audio output terminal 22.
- The monitor receiver 200 is connected downstream of the analog audio output terminal 22 and the analog image output terminals 28 and 29. Sound corresponding to the analog audio signal output through the analog audio output terminal 22 is emitted from the speaker provided in the monitor receiver 200, and an image corresponding to the analog image signal output through the analog image output terminals 28 and 29 is displayed on the display screen of the monitor receiver.
- In this way, the recording / reproducing apparatus 100 of this embodiment can extract the image data and audio data of a target program from the digital broadcast signal supplied by the BS / CS tuner 300, record them on the recording medium 18, and at the same time form and output an analog image signal and an analog audio signal. In other words, the user can view a program provided as a digital broadcast signal while recording it on the recording medium 18.
- The TS signal newly formed in the multiplexing / demultiplexing circuit 16 can also be output through a digital interface circuit (hereinafter abbreviated as digital I / F circuit) 3 and the digital input / output terminal 2 to external devices such as another recording device or a personal computer.
- In this case, the digital I / F circuit 3 converts the supplied digital signal into a digital signal of a format compatible with the external device and outputs it.
- The recording / reproducing apparatus 100 is also configured so that a digital signal supplied from the digital video camera 500 or the like via a digital interface such as IEEE 1394 can be received through the digital input / output terminal 2 and recorded on the recording medium 18, or used to form and output an analog image signal and an analog audio signal.
- the digital signal supplied through the digital input/output terminal 2 is supplied to the digital I/F circuit 3.
- the digital I/F circuit 3 performs processing such as format conversion on the supplied digital signal so that it conforms to the method used by the recording/reproducing apparatus 100 of this embodiment, and the resulting signal is supplied to the multiplexing/demultiplexing circuit 16.
- the multiplexing/demultiplexing circuit 16 analyzes the supplied signal, generates control signals and the like, forms a TS signal in the format to be recorded on the recording medium 18, and, as described above, can record the TS signal on the recording medium 18 through the buffer control circuit 17.
- the multiplexing/demultiplexing circuit 16 also forms an image ES and an audio ES from the TS signal supplied from the digital I/F circuit 3 and supplies them to the MPEG video decoder 23 and the MPEG audio decoder 19, thereby forming an analog image signal and an analog audio signal, as described above, so that they can be output.
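As a rough illustration of the demultiplexing performed by the multiplexing/demultiplexing circuit 16, the following sketch splits an MPEG-2 transport stream into per-PID payloads. This is a simplified, hypothetical sketch (it ignores adaptation fields and PES framing); the function name is illustrative, not from the patent.

```python
# Minimal TS demultiplexing sketch: 188-byte packets, sync byte 0x47,
# PID in the low 13 bits of header bytes 1-2. Adaptation fields and
# PES parsing are intentionally omitted for brevity.

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def demultiplex(ts: bytes) -> dict[int, bytes]:
    """Return a map from PID to the concatenated packet payloads."""
    streams: dict[int, bytes] = {}
    for off in range(0, len(ts) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        pkt = ts[off:off + TS_PACKET_SIZE]
        if pkt[0] != SYNC_BYTE:
            continue  # skip packets that lost sync
        # PID is the low 13 bits of bytes 1-2 of the 4-byte header.
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
        streams[pid] = streams.get(pid, b"") + pkt[4:]
    return streams
```

Separating the image ES and audio ES then amounts to picking the PIDs that the program tables assign to video and audio.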
- next, the operation when a television broadcast signal is supplied from the terrestrial television tuner 400 connected through the analog audio input terminal 4 and the analog image input terminal 8, and is recorded on the recording medium 18 or output in analog form, will be described.
- the terrestrial TV tuner 400 receives analog terrestrial broadcast signals and performs tuning and demodulation to obtain an analog composite image signal (Cps) and an analog audio signal, which are supplied to the recording/reproducing apparatus 100.
- the analog audio signal from the terrestrial TV tuner 400 is supplied to the A/D converter 5 through the analog audio input terminal 4, and the analog composite image signal (Cps) is supplied to the YC separation circuit 9 through the analog image input terminal 8.
- an analog separated image signal (S) is supplied to the selector 11 through the analog image input terminal 10.
- the YC separation circuit 9 separates the supplied analog composite image signal (Cps) into a luminance signal Y and a color difference signal C (so-called YC separation), and supplies these to the selector 11.
- the selector 11 is also supplied with the analog separated image signal (S) supplied through the analog image input terminal 10.
- the selector 11 selects one of the image signal supplied from the YC separation circuit 9 and the image signal supplied as a separated image signal through the analog image input terminal 10, and supplies the selected signal to the NTSC (National Television System Committee) decoder 12.
- the NTSC decoder 12 performs A/D conversion, chroma decoding, and other processing on the input analog image signal, converts it into digital component video data (video data), and supplies this to the pre-video signal processing circuit 13. The NTSC decoder 12 also supplies the clock generated based on the horizontal synchronization signal of the input image signal, together with the horizontal synchronization signal, vertical synchronization signal, and field discrimination signal obtained by synchronization separation, to the synchronization control circuit 15.
- the synchronization control circuit 15 uses the signals supplied thereto as a reference to generate a clock signal and synchronization signals that provide the necessary timing in each circuit block, and supplies these to each circuit block.
- the pre-video signal processing circuit 13 performs various video signal processing, such as pre-filtering, on the input image data and supplies it to the MPEG video encoder 14 and the post video signal processing circuit 24.
- the MPEG video encoder 14 performs encoding processing, such as block DCT (Discrete Cosine Transform), on the image data from the pre-video signal processing circuit 13 to generate an image ES, and supplies it to the multiplexing/demultiplexing circuit 16.
- the audio signal supplied to the A/D converter 5 through the audio input terminal 4 is converted into digital audio data in the A/D converter 5 and then supplied to the pre-audio signal processing circuit 6.
- the pre-audio signal processing circuit 6 performs a filtering process on the supplied audio data, and supplies this to the MPEG audio encoder 7.
- the MPEG audio encoder 7 compresses the supplied audio data in accordance with the MPEG format, generates an audio ES, and supplies it to the multiplexing/demultiplexing circuit 16, as in the case of the image data.
- the multiplexing/demultiplexing circuit 16 multiplexes the image ES from the MPEG video encoder 14, the audio ES from the MPEG audio encoder 7, and various control signals.
- that is, at the time of recording, the multiplexing/demultiplexing circuit 16 combines the MPEG image ES and the MPEG audio ES input thereto with various control signals and performs multiplexing processing to generate, for example, an MPEG TS signal.
- the TS signal generated here is recorded on the recording medium 18 through the buffer control circuit 17.
- the audio data from the pre-audio signal processing circuit 6 is supplied to the MPEG audio encoder 7 and also to the post audio signal processing circuit 20, and the image data from the pre-video signal processing circuit 13 is supplied to the MPEG video encoder 14 and also to the post video signal processing circuit 24.
- An analog audio signal is formed by the functions of the post audio signal processing circuit 20 and the D / A converter 21 and output through the audio output terminal 22.
- similarly, an analog image signal is formed by the functions of the post video signal processing circuit 24, the synthesizing circuit 26, and the NTSC encoder 27, and can be output through the analog image output terminals 28 and 29.
- it is also possible to reproduce and output the image data and audio data recorded on the recording medium 18.
- a TS signal to be reproduced is read from the recording medium 18 and supplied to the multiplexing / demultiplexing circuit 16 through the buffer control circuit 17.
- the multiplexing / separating circuit 16 separates the image ES and the sound ES from the TS signal read from the recording medium 18 and supplies the separated sound ES to the MPEG audio decoder 19. Then, the image ES is supplied to the MPEG video decoder 23.
- the processing in each circuit section downstream of the MPEG audio decoder 19 and the MPEG video decoder 23 is as described above for the use of digital input. That is, an analog audio signal is formed and output from the audio ES supplied to the MPEG audio decoder 19, and an analog image signal is formed and output from the image ES supplied to the MPEG video decoder 23.
- the recording/reproducing device 100 of this embodiment includes the communication I/F 30 and the communication connection terminal 31.
- through these, the apparatus can connect to a network such as the Internet via a telephone line or the like, obtain various data from the network, and send various data to the network.
- the various types of data that can be transmitted and received include not only image data and audio data but also various programs and text data.
- the data can be recorded on the recording medium 18 through the multiplexing/demultiplexing circuit 16.
- for example, audio data obtained through the network can be output through the multiplexing/demultiplexing circuit 16, the MPEG audio decoder 19, the post audio signal processing circuit 20, the D/A converter 21, and the audio output terminal 22.
- programs and control data used in the recording/reproducing apparatus 100 of this embodiment can also be provided via the network, recorded and stored in the EEPROM 43 or the like, and used as necessary.
- in this way, the functions of the recording/reproducing apparatus according to the present embodiment can be improved through the communication network, and, for example, EPG data for BS digital broadcasting or CS digital broadcasting can be obtained in advance so that an electronic program guide can be prepared beforehand.
- in the above description, the image data and the audio data are compressed according to the MPEG method, but other compression methods can be used, or the data can be processed without compression.
- the recording/reproducing apparatus 100 of this embodiment has an editing function: while reproducing a broadcast program or the like recorded on the recording medium 18 and checking the reproduced image, the user can, with a simple operation, add a chapter mark as a landmark near the image of a desired scene.
- then, based on the image to which the chapter mark has been added, reduced still images are displayed in a scrollable manner, and the start and end of the target scene can be specified accurately frame by frame, that is, image by image, for editing.
- the start and end of the target signal section are strictly specified so that the desired editing can be performed on the specified image section.
- the portion that forms the image signal for displaying the reduced still image is the still image generation circuit 25 shown in FIG.
- the still image generation circuit 25 includes, for example, a buffer memory of about 4 Mbytes and, under the control of the CPU 40, obtains the target image data from the image data recorded on the recording medium 18 via the MPEG video decoder 23 and forms an image signal for displaying the reduced images.
- unlike the editing function of a conventional recording/reproducing apparatus, this editing function does not require the recording/reproducing apparatus to be switched to a dedicated editing mode before it can operate.
- the recording/reproducing apparatus 100 of this embodiment makes the various editing functions available in the reproducing mode, so that the operation can move from reproduction to editing seamlessly.
- the recording medium 18 of the recording/reproducing apparatus 100 is a hard disk, as described above, and can record a large amount of data. For example, if a group of information such as one TV broadcast program is called a title, the recording medium 18 can store and hold groups of information, including image information, for a plurality of titles.
- a title is a set of image information, audio information, and the like that is to be treated as one unit of information, such as one broadcast program or one movie. As described above, information for multiple titles can be recorded.
- when a title to be reproduced is selected from the titles recorded and held on the recording medium 18 and reproduction is instructed, a group of information (TS signal) including the image information of the title is formed.
- the read TS signal is reproduced by the functions of the multiplexing/demultiplexing circuit 16, the MPEG video decoder 23, the post video signal processing circuit 24, the synthesizing circuit 26, and the NTSC encoder 27.
- the reproduced image information is supplied to the monitor receiver 200, and the reproduced image is displayed on the display screen 201 of the monitor receiver 200.
- FIG. 3 is a diagram for explaining an example of the display image 201G on the display screen 201 immediately after the operation to add a chapter mark.
- the CPU 40 of the recording/reproducing device 100 knows which image is currently being reproduced from the time code of the image or its frame number. Therefore, when the chapter mark key 154 is pressed, the time code information or frame number is obtained as information for specifying the image being displayed on the display screen 201 of the monitor receiver 200, and this is recorded, as chapter mark information, in a file provided on the recording medium 18 together with, for example, the identification information of the title being reproduced.
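The chapter-mark bookkeeping described above, a frame identifier recorded per title in a file on the recording medium, might be modeled as follows; `ChapterMarkFile` and its method names are assumptions for illustration only, not from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class ChapterMarkFile:
    """Hypothetical in-memory model of the chapter-mark file:
    title identification information mapped to marked frame numbers."""
    marks: dict[str, list[int]] = field(default_factory=dict)

    def add_mark(self, title_id: str, frame_number: int) -> None:
        """Record a chapter mark for `title_id` at `frame_number`."""
        self.marks.setdefault(title_id, [])
        if frame_number not in self.marks[title_id]:
            self.marks[title_id].append(frame_number)
            self.marks[title_id].sort()  # keep marks in playback order

    def marks_for(self, title_id: str) -> list[int]:
        """Return the marked frame numbers for a title, in order."""
        return self.marks.get(title_id, [])
```

A time code could be stored instead of (or alongside) the frame number; the structure is the same.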
- when the CPU 40 detects that the chapter mark key 154 has been pressed, as shown in FIG. 3, it displays a timeline corresponding to the total playback time (total data amount) of the title being played, together with the triangular marks M1, M2, M3, ... indicating the positions, within the total playback time, of the images to which chapter marks have been added.
- the chapter mark is added to the image data in units of one frame.
- the triangular mark is a display mark for indicating the position of the image to which the chapter mark has been added.
- however, since the chapter mark is added during reproduction with a simple operation, it is not always added exactly to the first image or the last image of the target scene.
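The triangular marks are placed along the timeline in proportion to each marked image's position within the title's total playback time; a minimal sketch of that mapping (the function name is an assumption):

```python
def mark_position(frame: int, total_frames: int, timeline_px: int) -> int:
    """Horizontal pixel offset of a triangular mark on the timeline,
    proportional to the marked image's position within the title's
    total playback time (expressed here in frames)."""
    return round(frame / total_frames * timeline_px)
```

For example, a mark at the midpoint of the title lands at the midpoint of the timeline.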
- the user can operate a predetermined operation key of the remote controller 150 or display a predetermined menu, and for example, select “Edit Mark” from the menu.
- the recording/reproducing apparatus 100 is thereby caused to transition to a mode for accepting operations on chapter marks.
- then, the display image on the display screen 201 is paused; that is, a still image is obtained.
- in this case, the CPU 40 of the recording/reproducing apparatus 100 forms image information for displaying the timeline 201T and the triangular marks indicating the positions of the images with chapter marks, and displays it on the monitor receiver 200 so that selection of an image with a chapter mark is accepted.
- at this time, among the plurality of images to which chapter marks have already been added, the triangle mark corresponding to the position of a predetermined image, such as the image most recently marked or the image marked first, is put into the selected state. Thereafter, the target triangle mark on the timeline 201T can be selected.
- the CPU 40 of the recording/reproducing device 100 identifies, from the position of the selected triangular mark, the corresponding chapter mark information, and specifies the image at the indicated position with reference to the chapter mark information recorded and held on the recording medium 18.
- the CPU 40 obtains, from the recording medium 18, the image information forming the specified image (the image at the position indicated by the selected triangle mark) and the image information forming its neighboring images.
- the acquired image information is supplied to the still image generation circuit 25 via the MPEG video decoder 23, and the still image generation circuit 25 is controlled to form an image signal for displaying the reduced still images, called thumbnails, S1, S2, S3, S4, and S5.
- the image signals for displaying a plurality of (in this embodiment, at least five) reduced still images generated by the still image generation circuit 25 are supplied to the synthesizing circuit 26, where they are synthesized with the image signal from the post video signal processing circuit 24, output via the NTSC encoder 27, and supplied to the monitor receiver 200 so that, as shown in FIG. 4, the images are displayed on the display screen 201.
- the image at the position indicated by the triangle mark is used as a reference (S3), and reduced still images of five frames are displayed: the two frames before the image S3 (S1, S2), the image S3 itself, and the two frames after it (S4, S5).
- the outer frame of the reduced still image S3, which currently has the chapter mark, is shown thick; that is, the so-called cursor is positioned there, indicating that the reduced still image S3 is the reduced still image to be operated on.
- the cursor can be moved by operating the left arrow key 152L and the right arrow key 152R.
- the reduced still image to be operated can be sequentially changed.
- FIGS. 5A to 5C show examples of changing the reduced still image to be operated on while the reduced still images S1, S2, S3, S4, and S5 are displayed.
- by pressing the left arrow key 152L, the cursor can be moved to the left one position at a time. Therefore, when the left arrow key 152L is pressed twice while the cursor is positioned on the reduced still image S3, the cursor can be positioned on the reduced still image S1 at the left end, as shown in FIG. 5A. The reduced still image with the thick outer frame (the reduced still image on which the cursor is positioned) becomes the operation target.
- the image with the chapter mark is the image displayed as the reduced still image S3, as indicated by the arrow in FIG. 5A.
- when the left arrow key 152L is pressed again, the reduced still images are scrolled rightward on the display screen 201 as shown in FIG. 5B, and the image (S1-1) one frame before the image displayed as the reduced still image S1 in the state of FIG. 5A is now displayed as the reduced still image S1.
- the image with the chapter mark is the image displayed as the reduced still image S4, as indicated by the arrow in FIG. 5B.
- in this way, the reduced still images can be scrolled one frame at a time in the direction going back in time, and the desired image can be selected.
- similarly, by operating the right arrow key 152R, the cursor can be moved toward the right of the display screen 201; when the cursor is on the reduced still image at the rightmost end, the reduced still images can be scrolled one frame at a time in the direction of advancing time, and the target image can be selected.
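The five-thumbnail window and its one-frame scrolling described above can be sketched as follows, with frame numbers standing in for the actual images (function names are illustrative, not from the patent):

```python
def thumbnail_window(marked_frame: int) -> list[int]:
    """Frame numbers S1..S5: the two frames before the marked image,
    the marked image itself (S3), and the two frames after it."""
    return [marked_frame + d for d in range(-2, 3)]

def scroll(window: list[int], step: int) -> list[int]:
    """Shift the five-frame window one frame into the past (step=-1,
    left arrow at the left edge) or the future (step=+1, right arrow
    at the right edge)."""
    return [f + step for f in window]
```

For a mark at frame 100, the window is frames 98 through 102; one backward scroll shifts it to 97 through 101.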
- then, a menu MN1 for selecting an operation on the reduced still image to be operated on is displayed.
- the user can select whether to reset the chapter mark.
- resetting the chapter mark means adding a new chapter mark to the original image of the selected reduced still image and deleting the chapter mark attached to the image used as the reference image this time.
- for example, when the chapter mark is reset in the state shown in FIGS. 5B and 5C, a chapter mark is added to the original image of the reduced still image S1 in FIGS. 5B and 5C, and the chapter mark added to the original image of the reduced still image S4 is deleted.
- in practice, the information specifying the original image of the reduced still image S4 is changed to the information specifying the original image of the reduced still image S1.
- at the same time, the triangular mark indicating the position of the image whose chapter mark has been deleted is removed, and a triangular mark indicating the position of the image with the new chapter mark is newly displayed.
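A minimal sketch of the mark-reset bookkeeping described above, assuming the marks of a title are kept as a sorted list of frame numbers (the function name is an assumption):

```python
def reset_mark(marks: list[int], old_frame: int, new_frame: int) -> list[int]:
    """Replace the chapter mark at old_frame with one at new_frame:
    the mark on the reference image is deleted and a mark is added
    to the newly selected image."""
    updated = [f for f in marks if f != old_frame]
    updated.append(new_frame)
    return sorted(updated)
```

The timeline display then drops the old triangular mark and shows one at the new position.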
- therefore, even if a chapter mark is added only roughly near the target scene during reproduction, the roughly added chapter mark can be checked at an appropriate time and the target image can be re-marked accurately.
- when the above operation is completed, the display returns to one consisting of the still display image 201G, the timeline 201T, and the triangular marks M1, M2, M3, .... Then, playback and various editing can be performed based on the images with chapter marks.
- FIG. 6 is a diagram for explaining a case where reproduction and various kinds of editing are performed based on an image with a chapter mark.
- after returning to the display consisting of the still display image 201G, the timeline 201T, and the triangular marks M1, M2, M3, ..., when a triangle mark is selected and a predetermined operation is performed, a selection menu MN2 of the processing that can be performed based on the image with the selected triangle mark is displayed.
- in this way, the playback mode and the editing mode are not clearly separate modes; the various editing functions are provided as functions that can be executed in the playback mode. Therefore, the normal operations for reproduction and viewing and the operations for editing can be performed seamlessly, realizing an easy-to-use recording/reproducing apparatus.
- the operation of the recording/reproducing apparatus 100 of this embodiment at the time of reproduction is one to which the image detection method according to the present invention is applied, and is performed by the CPU 40 executing the image reproduction program.
- FIG. 7 is a diagram for explaining an example of display items to be displayed on the display screen 201 of the monitor receiver 200 by the recording / reproducing device 100 of this embodiment.
- the recording/reproducing apparatus 100 of the present embodiment reads out the image information recorded on the recording medium 18 and performs reproduction processing, thereby forming and outputting an analog image signal to be supplied to the monitor receiver 200, so that a display image 201G can be displayed on the display screen 201 of the monitor receiver 200.
- at this time, the recording/reproducing apparatus 100 can form image signals for displaying the status display G1, the remote control operable key display G2, the timeline 201T, the triangular marks M1, M2, M3, ..., the start point display ST, the end point display ED, the reduced still images (thumbnails) S, and the menus MN1, MN2, and MNS, and display them on the display screen 201 of the monitor receiver 200.
- the status display G1 notifies the operation status of the recording/reproducing device 100, such as the recording mode or the reproducing mode. The remote control operable key display G2 notifies the user of the remote control keys that can currently be operated.
- the timeline 201T indicates the total playback time of the currently playing title (the total data amount of the title), and the triangular marks M1, M2, M3, ... displayed near the timeline 201T indicate the positions of the images to which chapter marks have been added.
- the start point display ST indicates the start point of the section for which a range is specified, and the end point display ED indicates the end point of that section.
- the start point ST and the end point ED are specified corresponding to triangle marks, and the section between them can be edited, for example deleted.
- the reduced still images S are the five reduced still images S1, S2, S3, S4, and S5 shown in FIGS. 3 and 4, consisting of the reduced still image of the image selected by the user from among the images with chapter marks, together with those of its neighboring images.
- the menu MN1 is for selecting a process for the selected reduced still image
- the menu MN2 is for selecting a process for a triangle mark displayed corresponding to a chapter mark.
- the menu MNS is for selecting a process executable in the reproduction mode.
- in the reproducing operation described below with reference to FIGS. 8 to 10, the recording/reproducing apparatus 100 of this embodiment performs the reproduction processing and the various processing that can be performed in the playback mode while appropriately displaying the display items shown in FIG. 7.
- the CPU 40 of the recording/reproducing apparatus 100 receives a title selection input from the user through the remote controller 150 or an operation panel provided on the front panel of the recording/reproducing apparatus 100 (step S102), and receives an operation instruction input from the user through the remote controller 150 or the operation panel (step S103).
- in step S104, the CPU 40 determines whether or not an end instruction has been input by the user as the operation instruction input.
- when it is determined that the end instruction has been input, the CPU 40 ends the processing shown in FIGS. 8 to 10 and waits to accept a so-called initial input, such as a recording mode selection instruction or a title list display instruction.
- if it is determined in step S104 that the end instruction has not been input, it is determined whether a target title selection input and a reproduction instruction input for the selected title have been made (step S105). When it is determined in step S105 that no title has been selected or that no reproduction instruction has been input, the CPU 40 repeats the processing from step S102 and accepts title selection inputs and operation instruction inputs.
- if it is determined in step S105 that a reproduction instruction for the selected title has been input, the CPU 40 controls the buffer control circuit 17, the multiplexing/demultiplexing circuit 16, the MPEG audio decoder 19, the post audio signal processing circuit 20, the D/A converter 21, the MPEG video decoder 23, the post video signal processing circuit 24, the synthesizing circuit 26, and the NTSC encoder 27, and starts the reproduction of the title selected in step S102 (step S106).
- in step S106, the status display G1 and the remote control operable key display G2 described with reference to FIG. 7 are displayed for a fixed period of time, for example about several seconds, to notify that the reproduction mode has been entered and, at the same time, the available operation keys of the remote controller 150 and the operation panel.
- next, the CPU 40 receives an instruction input from the user through the remote controller 150 or the operation panel (step S107).
- the instruction inputs that can be accepted in step S107 include an instruction input for adding a chapter mark by pressing the chapter mark key 154, a display instruction input for the submenu, and other instruction inputs such as pause, fast forward, fast rewind, and playback stop.
- in step S108, it is determined whether or not the instruction input received in step S107 is an instruction input for adding a chapter mark.
- upon receiving an instruction input for adding a chapter mark, the CPU 40 acquires the time code and frame number for specifying the image being displayed on the display screen 201 of the monitor receiver 200 (step S109).
- after obtaining the time code and the like, the CPU 40 associates the title information indicating the title currently being played back with the time code and the like of the image information to which the chapter mark is added, and records them on the recording medium 18 (step S110). Then, the processing from step S107 is repeated so that chapter marks can be added to a plurality of images (a plurality of locations) in the same title. If it is determined in step S108 that the input is not an instruction to add a chapter mark, the CPU 40 determines whether it is an instruction to display the submenu MNS (step S111).
- if the CPU 40 determines in step S111 that the input does not instruct the display of the submenu, it performs the other processing instructed by the user, for example pause, fast forward, fast rewind, or playback stop (step S112).
- if it is determined in step S111 that the instruction is a display instruction for the submenu MNS, the process proceeds to the processing shown in FIG. 9 and the submenu MNS is displayed (step S113).
- from the submenu MNS, the user can select "Mark edit" for editing the chapter marks added to images, "Title protection" for preventing an already recorded title from being accidentally deleted, or "Delete title" for deleting an already recorded title.
- then, the CPU 40 accepts a selection input from the user through the remote controller 150 or the operation panel provided on the front panel of the recording/reproducing device 100 (step S114), and determines whether or not the selection input received in step S114 is an instruction to execute "Mark edit" (step S115).
- when it is determined that the received selection input is not "Mark edit", the selected process, such as "Title protection" or "Delete title", is performed (step S116).
- if it is determined in step S115 that "Mark edit" has been selected, the CPU 40 controls the multiplexing/demultiplexing circuit 16, the MPEG video decoder 23, the post video signal processing circuit 24, the synthesizing circuit 26, and the NTSC encoder 27 to temporarily pause the display image 201G (step S117).
- the CPU 40 then forms an image signal for displaying the timeline 201T and the triangular marks M1, M2, M3, M4, ..., supplies it to the synthesizing circuit 26 so that the timeline 201T and the triangular marks M1, M2, M3, ... are displayed on the display screen 201 of the monitor receiver 200, and puts the nearest triangle mark into the selected state (step S118).
- the CPU 40 accepts a key input from the user through the remote controller 150 or the operation panel (step S119).
- the input keys that can be operated in step S119 are the up arrow key 152U, the enter key 153, and the left and right arrow keys 152L and 152R.
- in step S120, the CPU 40 determines what input key was operated in step S119. When it is determined in step S120 that the operated input key is the up arrow key 152U, the CPU 40 determines that reduced still image display (thumbnail display) has been instructed, extracts the chapter mark information identifying the image corresponding to the selected triangle mark, and, based on this chapter mark information, reads the image data of the image corresponding to the currently selected triangle mark and of its neighboring images from the recording medium 18 to form reduced still images (thumbnails) S, which are supplied to the monitor receiver to be displayed on the display screen (step S121).
- next, the CPU 40 accepts a key input for the reduced still images displayed as thumbnails (step S122).
- here, the input keys that can be operated are the down arrow key 152D, the enter key 153, and the left and right arrow keys 152L and 152R.
- the CPU 40 determines what input key was accepted in step S122 (step S123); when it is determined that the down arrow key has been operated, the CPU 40 treats this as an instruction to hide the displayed thumbnails S, hides them (step S124), and repeats the processing from step S119.
- if the input key received in step S123 is determined to be the enter key, an image signal for displaying the menu MN1 is formed, supplied to the monitor receiver, and displayed (step S125).
- when it is determined in step S123 that the received input keys are the left and right keys 152L and 152R, the process for selecting the displayed reduced still image is performed as described above with reference to FIGS. 5A to 5C (step S126), and thereafter the processing from step S122 is repeated.
- if it is determined in step S120 that the key input received in step S119 is the enter key 153, the CPU 40 forms an image signal for displaying the menu MN2 and supplies it to the monitor receiver to be displayed (step S127). After the processing of step S125 or step S127, the processing shifts to the processing shown in FIG. 10, described later.
- when it is determined in step S120 that the key input received in step S119 is the left or right key 152L or 152R, the CPU 40 selects, on the already displayed timeline 201T with its triangular marks M1, M2, M3, ..., the triangle mark corresponding to the key input (step S128), and thereafter repeats the processing from step S119.
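The key handling of steps S119 to S128 amounts to a small dispatch over the received key; a hedged sketch, with key names, state fields, and return values chosen for illustration only:

```python
def dispatch_timeline_key(key: str, state: dict) -> str:
    """Illustrative dispatch for the timeline screen: the up arrow
    opens the thumbnail display, the enter key opens menu MN2, and
    the left/right arrows move the selection among the triangle
    marks. Returns the name of the resulting screen."""
    if key == "up":
        state["thumbnails_visible"] = True  # step S121: show thumbnails
        return "thumbnail"
    if key == "enter":
        state["menu"] = "MN2"               # step S127: show menu MN2
        return "menu"
    if key in ("left", "right"):
        step = -1 if key == "left" else 1   # step S128: move selection
        n = len(state["marks"])
        state["selected"] = (state["selected"] + step) % n
        return "timeline"
    return "timeline"                       # ignore other keys
```

The thumbnail screen (steps S122 to S126) would use an analogous dispatch over the down arrow, enter key, and left/right keys.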
- after the processing of step S125 or step S127, the CPU 40 proceeds to the processing shown in FIG. 10 and accepts a function selection from the menu MN1 displayed in step S125 or the menu MN2 displayed in step S127 (step S129).
- in step S129, if the displayed menu is the menu MN1, it is a menu for a reduced still image, and, as shown in FIG. 5C, "Return" and "Reset mark" can be selected.
- if the displayed menu is the menu MN2, it is a menu for the timeline and the triangle marks; therefore, as shown in FIG. 6, "Return", "Delete mark", "A-B delete", and "Play from here" can be selected.
- when the CPU 40 determines that "Return" has been selected in step S129 (step S130), if the menu MN1 was displayed, the processing from step S122 shown in FIG. 9 is repeated so that a key input for the reduced still images is accepted again; if the menu MN2 was displayed, the processing from step S119 shown in FIG. 9 is repeated so that a key input for the triangle marks corresponding to the chapter marks is accepted again.
- If the CPU 40 determines in step S129 that "Reset mark" has been selected from menu MN1, or that "Delete mark" has been selected from menu MN2 (step S131), the CPU 40 first deletes, from the recording medium 18, the chapter mark information attached to the image corresponding to the selected triangle mark.
- Next, the CPU 40 determines whether or not a mark reset has been instructed (step S133). When a mark reset has been instructed, the instruction was input to the menu MN1 for the reduced still images as shown in FIG. 5; therefore a time code, for example, is acquired as information for specifying the image corresponding to the selected reduced still image, and this is newly registered (recorded) as the chapter mark information (step S134).
- After the processing in step S134, or when it is determined in step S133 that a mark reset was not instructed (that is, "Delete mark" was selected from menu MN2 for the triangle marks), the reproduction of the display image 201G stopped in step S117 is restarted (step S135), and the processing from step S107 shown in FIG. 8 is repeated.
- If the CPU 40 determines in step S129 that "A-B delete" has been selected from menu MN2 (step S136), the CPU 40 accepts a selection input of the start point, using the displayed triangle marks as processing units, and displays its position, as shown by the start point ST in FIG. 7 (step S137). Similarly, the CPU 40 accepts a selection of the end point, using the displayed triangle marks as processing units, and displays its position, as shown by the end point ED in FIG. 7 (step S138).
- The CPU 40 then deletes the images from the start point selected in step S137 to the end point selected in step S138 (step S139); thereafter, the reproduction of the display image 201G stopped in step S117 is restarted (step S135), and the processing from step S107 shown in FIG. 8 is repeated.
- If the CPU 40 determines in step S129 that "Play from here" has been selected from menu MN2 (step S140), the CPU 40 restarts the reproduction of the title being processed from the image corresponding to the currently selected triangle mark (step S135), and the processing from step S107 shown in FIG. 8 is repeated.
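- The menu handling described in steps S129 through S140 amounts to a small dispatch over the MN1/MN2 menu items. The following Python sketch is purely illustrative and is not part of the patent disclosure; all function and field names are hypothetical, and the step labels in the comments only echo the flowchart description above.

```python
# Hedged sketch of the MN1/MN2 menu dispatch (steps S129-S140).
# All identifiers are illustrative, not from the patent.

def handle_menu_selection(menu, choice, state):
    """Return the next action for a menu choice.

    menu  : "MN1" (reduced still images) or "MN2" (timeline / triangle marks)
    choice: the selected menu item
    state : dict holding chapter marks and the current selection
    """
    if choice == "return":
        # Step S130: go back to key-input handling for the menu's origin.
        return "accept_keys_S122" if menu == "MN1" else "accept_keys_S119"
    if menu == "MN1" and choice == "reset_mark":
        # Steps S131-S134: delete the old chapter mark, then register a new
        # one at the time code of the selected reduced still image.
        state["chapter_marks"].remove(state["selected_mark"])
        state["chapter_marks"].append(state["selected_timecode"])
        return "resume_playback_S135"
    if menu == "MN2" and choice == "delete_mark":
        # Steps S131-S133: delete the chapter mark only.
        state["chapter_marks"].remove(state["selected_mark"])
        return "resume_playback_S135"
    if menu == "MN2" and choice == "ab_delete":
        # Steps S136-S139: go on to select a start and end point to delete.
        return "select_ab_range_S137"
    if menu == "MN2" and choice == "play_from_here":
        # Step S140: restart playback from the selected triangle mark.
        return "resume_playback_S135"
    raise ValueError("unknown menu choice")
```

The sketch only shows the branch structure; in the apparatus itself these branches drive the buffer control circuit and the display image forming circuit rather than a Python dictionary.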
- As described above, there is no need to use two separate modes, a reproducing mode and an editing mode; editing can be performed as needed during reproduction. Therefore, inconveniences such as not knowing the operation for shifting to an edit mode, or the shift to an edit mode taking time, do not arise.
- In addition, editing-related operations can be performed using only a limited number of operation keys on the remote controller, such as the up arrow key 152U, the down arrow key 152D, the left arrow key 152L, the right arrow key 152R, the determination key 153, and the chapter mark key 154, so the editing operation is extremely simple. Therefore, inconveniences such as erroneous operations occurring frequently, or a target scene being overlooked because of a careless operation, do not arise.
- Furthermore, an image of the target scene is given a chapter mark, and the chapter-marked image and its neighboring images can be checked simultaneously as reduced images that are scroll-displayed. This makes it possible to specify the image of the target scene accurately on the basis of the chapter-marked image, and to make edits without using many conventional functions such as fast forward, fast rewind, and frame advance.
- Moreover, the chapter mark can be attached to the image of the target scene without stopping the reproduced image.
- In the first embodiment described above, various edits, including replacement (re-setting) of the chapter mark, can be performed on the basis of the chapter-marked image. However, the invention is not limited to this.
- For example, when the chapter mark key 154 is pressed while the target title is being played, the display image 201G displayed on the display screen of the monitor receiver 200 may be made a still image of the chapter-marked image, a triangle mark may be displayed in the vicinity of the timeline 201T at the image position where the chapter mark was added, and deletion of the chapter mark and designation of a range may be accepted there.
- Alternatively, both a mode for re-setting the chapter mark and a mode for deleting the chapter mark may be provided, and the user may switch between these modes as needed.
- In the embodiment described above, the reduced still images are displayed so as to overlap the display of the moving image, but the moving image display area 201M and the reduced image display area 201S may instead be divided so as not to overlap. In this case, the playback image of the title recorded on the recording medium 18 is displayed in the moving image display area 201M, and reduced still images called thumbnails are displayed in the still image display area 201S. When the reduced still images are not in use, the entire reproduced image of the title can be viewed without being hidden by them.
- Alternatively, the reduced image display area 201S may be made as large as possible so that more reduced still images can be displayed.
- The moving image display area may be provided below the display screen 201 and the still image display area above it; the moving image display area may be provided on the right side of the display screen 201 and the still image display area on the left side; or the moving image display area may be provided on the left side of the display screen 201 and the still image display area on the right side. In this way, the display screen 201 can be used appropriately by separating it into a moving image display area and a still image display area.
- In any case, the editing section can be accurately specified on the basis of the attached chapter mark, and the desired editing can be performed. Therefore, the editing operation can be started seamlessly during reproduction without the user being particularly aware of a reproduction mode and an edit mode.
- Incidentally, when reproduction is performed from the beginning of the image information recorded on the recording medium and the reproduced image is watched until the target scene appears on the display screen of the monitor receiver, it takes time to find the target scene. It is also conceivable to perform so-called fast-forward playback to find the desired scene, but in such a case there is a high possibility that the target scene is overlooked, so certainty may be lacking.
- Therefore, in the second embodiment, a plurality of small areas (sub-image display areas, or child screens) for displaying moving images are provided on the display screen, and a plurality of moving images starting from different points in time of the title recorded on the recording medium are played back simultaneously, so that the target scene can be found quickly.
- The image processing system of the second embodiment has the same configuration as the image processing system of the first embodiment, and the recording/reproducing apparatus 600 constituting the image processing system in this case is configured almost similarly to the recording/reproducing apparatus 100 of the first embodiment. Therefore, the same reference numerals are given to the components that are the same as those of the recording/reproducing apparatus 100, and their detailed description is omitted.
- However, in the recording/reproducing apparatus 600 of the second embodiment, the speed of reading the image data of the target title from the recording medium 18 is sufficiently high, the processing speed of the MPEG video decoder 23 is sufficiently high, and the display image forming circuit 50 can simultaneously process not only still images but also a plurality of moving images. A title selected for reproduction can therefore be played back simultaneously from a plurality of different points in time.
- FIG. 14 shows a first example in which the recording/reproducing apparatus 600 quickly detects a target scene using moving image display.
- In this example, the CPU 40 of the recording/reproducing apparatus 600 provides a first moving image display area 201M1, a second moving image display area 201M2, and a third moving image display area 201M3 on the display screen. The CPU 40 then controls the buffer control circuit 17 so that reading of the image data of the target title (a group of information signals such as a broadcast program) is started from three points, namely the start point, the point 10 seconds after the start, and the point 20 seconds after the start, and each set of read image data is supplied to the MPEG video decoder 23 through the multiplexing/separation circuit 16.
- The MPEG video decoder 23 decodes each of the three sets of image data, read out starting from the three points in time and supplied through the multiplexing/separation circuit 16, and sequentially supplies the decoded image data to the display image forming circuit 50. The display image forming circuit 50 forms image data for displaying moving images from the three sets of image data supplied with different readout start points, and supplies the result to the synthesizing circuit 26.
- The synthesizing circuit 26 synthesizes the image data so that the playback moving image corresponding to the image data from the beginning of the target title is displayed in the first moving image display area 201M1, the playback moving image corresponding to the image data from the point 10 seconds after the start is displayed in the second moving image display area 201M2, and the playback moving image corresponding to the image data from the point 20 seconds after the start is displayed in the third moving image display area 201M3. The synthesized image data is supplied from the synthesizing circuit 26 to the NTSC encoder 27, which generates an analog composite image signal (Cps) and a separate image signal (S); these are output through the analog image output terminals 28 and 29.
- In this way, the target scene can be detected without being missed. That is, even if the target scene is missed in the second moving image display area 201M2 or the third moving image display area 201M3, which display images ahead of the first moving image display area 201M1, the image of the target scene can still be reliably detected in the first moving image display area 201M1 and specified as an editing candidate point.
- If the interval between the image reading start points is made relatively long, for example several minutes or several tens of minutes, a plurality of target scenes can be detected simultaneously and in parallel.
- The playback speed (display speed) may also be made different among the moving image displayed in the first moving image display area, the moving image displayed in the second moving image display area, and the moving image displayed in the third moving image display area. For example, by making the playback speed of the moving images displayed in the second and third moving image display areas higher than that of the moving image displayed in the first moving image display area, for example twice or three times as high, quick detection of the target scene can be promoted.
- Note that the playback speed of a moving image here is defined not by the speed of reading data from the recording medium or the speed of processing the data, but by the speed of displaying the moving image. That is, in this specification, the reproduction speed of a moving image is synonymous with the display speed of a moving image.
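- The staggered read start points (the start, 10 seconds after, 20 seconds after) and the optional per-area speed-up described above reduce to simple arithmetic over the timeline. The following Python sketch is only an illustration of that arithmetic (all names are hypothetical; the apparatus performs this with the buffer control circuit and decoder, not with code like this):

```python
# Illustrative sketch of the staggered multi-area playback of FIG. 14.

def read_start_points(head_s, offset_s, n_areas):
    """Start times (seconds) for n_areas staggered moving image display
    areas, e.g. head, head+10 s, head+20 s as in the first example."""
    return [head_s + i * offset_s for i in range(n_areas)]

def displayed_position(start_s, elapsed_s, speed=1.0):
    """Title position (seconds) shown after elapsed_s of playback.
    'speed' is the display speed; per the specification, playback
    speed means display speed, not read or decode speed."""
    return start_s + elapsed_s * speed
```

With unequal offsets or per-area speeds of 2x or 3x, the same two helpers describe the variations discussed above.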
- The time differences provided between the plurality of moving images need not necessarily be equal. Also, the number of moving image display areas provided is not limited to three; two or more moving image display areas may be provided.
- For example, the reference moving image may be displayed in the first moving image display area 201M1, the moving image 10 seconds ahead of it in the second moving image display area 201M2, and the moving image a further 10 seconds ahead in the third moving image display area 201M3. In this way, with the moving image displayed in the first moving image display area 201M1 as a reference, the images displayed in the second moving image display area 201M2 and the third moving image display area 201M3 (images in the vicinity of the target scene) can be easily checked.
- The reduced image display area 201S can be used, for example, as in the first embodiment: the image displayed in the first moving image display area 201M1 is made a still image, and the frame images (still images) before and after it are displayed on the basis of this image, so that detailed edit points can be selected and set as in the case of the first embodiment.
- That is, the use of the reduced image display area 201S is the same as in the recording/reproducing apparatus 100 of the first embodiment described above. Likewise, the timeline and the triangle marks indicating the positions, within the title, of the chapter-marked images (editing candidate points) can be displayed on the display screen as in the first embodiment.
- The still images displayed in the reduced image display area 201S are also formed in the display image forming circuit 50. That is, the display image forming circuit 50 can generate the moving image data to be displayed in the first, second, and third moving image display areas 201M1, 201M2, and 201M3 and, at the same time, generate the plurality of still images to be displayed in the reduced image display area 201S.
- In this example, the image data (moving image data) displayed in the first, second, and third moving image display areas 201M1, 201M2, and 201M3 are all described as being formed in the display image forming circuit 50, but the present invention is not limited to this. For example, the moving image that is displayed in the first moving image display area 201M1 and used as the reference image may be formed in the post video signal processing circuit 24.
- Each of the buffer control circuit 17, the multiplexing/separation circuit 16, the MPEG video decoder 23, the display image forming circuit 50, the synthesizing circuit 26, the NTSC encoder 27, and so on is controlled by the CPU 40, so that a plurality of moving images can be displayed on one screen as shown in FIG. 14.
- As for the audio data accompanying the image data to be reproduced, for example, the audio data corresponding to the image displayed in the first moving image display area 201M1 can be read from the recording medium 18 through the buffer control circuit 17 and reproduced through the MPEG audio decoder 19, the post audio signal processing circuit 20, and the D/A converter 21. That is, the audio can be reproduced in accordance with the moving image displayed in the first moving image display area 201M1, or the audio corresponding to the moving image displayed in the second moving image display area 201M2 or the third moving image display area 201M3 can be reproduced instead.
- FIG. 15 is a view for explaining a display example of an image displayed on the display screen 201 of the monitor receiver by the recording / reproducing device 600 in the second example of the second embodiment.
- As shown in FIG. 15, when a title to be played is selected from the titles (groups of information signals such as broadcast programs) recorded on the recording medium 18, the CPU 40 of the recording/reproducing apparatus 600 first provides the main image display area 201M in the display screen 201 and displays the timeline 201T corresponding to the total image data amount of the title selected for reproduction.
- The CPU 40 forms the main image display area 201M, the timeline 201T, and the reproduction position designation mark that moves on the timeline 201T using data stored in the ROM 41, and supplies them to the synthesizing circuit 26 through, for example, the multiplexing/separation circuit 16, so that they are displayed on the display screen of the monitor receiver 200 connected to the recording/reproducing apparatus 600.
- The user of the recording/reproducing apparatus 600 can, for example, operate the arrow keys of the remote controller 150, using the timeline 201T as a guide, to move the reproduction position designation mark on the timeline 201T and thereby select and input one or more moving image playback points. The CPU 40 of the recording/reproducing apparatus 600 accepts the selection input of moving image playback points from the user and displays playback points MK1, MK2, MK3, ... indicating the selected playback positions. Note that an input item for setting a playback time may be provided, and one or more playback points may be set by a playback time, that is, by a numerical value.
- Thereafter, the CPU 40 of the recording/reproducing apparatus 600 forms sub-image display areas, which are display areas for the reproduced images, according to the number of accepted moving image playback points, reads from the recording medium 18 one frame of image data at the position corresponding to each accepted playback point, and displays a frame image (still image) based on the read image data in the corresponding sub-image display area.
- The playback points MK1, MK2, MK3, ... are formed by the CPU 40 in the same manner as the timeline 201T and are supplied to the synthesizing circuit 26 so that they are displayed in the image.
- That is, the image data forming the image of the corresponding frame is read from the recording medium 18 under the control of the CPU 40, supplied to the MPEG video decoder 23 through the multiplexing/separation circuit 16, MPEG-decoded there, and supplied to the display image forming circuit 50. In the display image forming circuit 50, an image for displaying the frame image as a still image is formed for each sub-image display area; this is supplied to the synthesizing circuit 26 and synthesized, whereby a frame image at the position corresponding to each playback point is displayed in each sub-image display area provided on one screen.
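- Reading one frame at the position corresponding to each playback point reduces to converting a point on the timeline 201T into a frame number in the title. The following Python sketch illustrates that conversion under the simplifying assumption of a constant frame rate; it is not part of the patent disclosure, and all names are hypothetical:

```python
# Illustrative mapping from timeline playback points to frame indices.

def playback_point_to_frame(point_fraction, total_frames):
    """Map a playback point, given as a fraction (0.0-1.0) of the
    timeline 201T, to a frame number in the title, clipped to the
    title boundaries."""
    frame = int(point_fraction * (total_frames - 1))
    return max(0, min(frame, total_frames - 1))

def still_frames_for_points(fractions, total_frames):
    """One frame index per sub-image display area (SG1, SG2, ...)."""
    return [playback_point_to_frame(f, total_frames) for f in fractions]
```

For MPEG-coded titles the apparatus would additionally have to seek to a decodable access point near the computed frame; that detail is omitted here.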
- The sub-image display areas can also be selected. The CPU 40 of the recording/reproducing apparatus 600 causes the same image as that of the sub-image display area in which the cursor is positioned to be displayed in the main image display area 201M. The cursor for the sub-image display areas is displayed by the CPU 40, which also controls its display position.
- In the example shown in FIG. 15, three points, namely the beginning, the middle, and the end of the title selected for reproduction, are selected as moving image playback points; the playback points MK1, MK2, and MK3 are displayed on the timeline 201T, and three sub-image display areas SG1, SG2, and SG3 are provided.
- Each of the frame images (still images) at the positions corresponding to the playback points MK1, MK2, and MK3 is, as described above, processed by the multiplexing/separation circuit 16, the MPEG video decoder 23, the display image forming circuit 50, the synthesizing circuit 26, and the NTSC encoder 27, and is displayed in the corresponding sub-image display area SG1, SG2, or SG3 formed on the display screen.
- In FIG. 15, the cursor CS for the sub-image display areas is positioned in the sub-image display area SG1, so the same image as that of the sub-image display area SG1 is also displayed in the main image display area 201M.
- Note that, in this example, the image displayed in the main image display area 201M is displayed using the image data formed in the post video signal processing circuit 24.
- At this stage, the images displayed in the main image display area 201M and the sub-image display areas SG1, SG2, and SG3 are the frame images (still images) at the positions corresponding to the playback points designated by the user.
- In this state, the position of a playback point can be changed by operating the remote controller 150, and the cursor CS for the sub-image display areas can be moved to another sub-image display area. Note that the cursor CS for the sub-image display areas can be moved even after the reproduction of the selected title has actually started, as described below.
- When an instruction to start reproduction is given, the CPU 40 of the recording/reproducing apparatus 600 starts playing back moving images in the main image display area 201M and the sub-image display areas SG1, SG2, and SG3.
- FIG. 16 shows an image displayed on the display screen 201 of the monitor receiver 200 at the time of reproducing a moving image by the recording / reproducing apparatus 600 of the second example of the second embodiment.
- That is, when instructed to start reproduction, the CPU 40 sequentially reads the image data at the positions corresponding to the playback points MK1, MK2, and MK3 from the recording medium 18, and causes each of the main image display area 201M and the sub-image display areas SG1, SG2, and SG3 to display the moving image reproduced from the corresponding playback point.
- At this time, the CPU 40 of the recording/reproducing apparatus moves each of the playback points MK1, MK2, and MK3 in the reproduction direction in synchronization with the reproduction of the moving images, as indicated by the arrows in FIG. 16, so that the user can visually recognize the position, within the title being played, of the image displayed in each of the sub-image display areas SG1, SG2, and SG3.
- The marks ST1, ST2, and ST3 indicated by dotted lines indicate the respective start points of the playback points MK1, MK2, and MK3.
- Since the cursor CS for the sub-image display areas is positioned in the sub-image display area SG1, the same moving image as that displayed in the sub-image display area SG1 is displayed in the main image display area 201M.
- The user of the recording/reproducing apparatus 600 of the second example can detect an image of a target scene to be used as an editing candidate point through the plurality of moving images with different playback points displayed in the three sub-image display areas SG1, SG2, and SG3.
- When the target scene, or an image in its vicinity, is displayed in any of the sub-image display areas SG1, SG2, and SG3, the user positions the cursor CS on that sub-image display area. As a result, the same moving image as that of the sub-image display area in which the cursor CS is positioned is also displayed in the main image display area 201M, and the image of the target scene to be used as an editing candidate point can be detected through the moving image displayed in the main image display area 201M, which has a larger display area than the sub-image display areas.
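- The relationship between the cursor CS and the main image display area 201M described above is a simple mirroring rule: the main area always shows whatever the sub-area under the cursor shows. A minimal Python sketch of that rule follows; it is illustrative only, with hypothetical names (the patent describes circuits controlled by the CPU 40, not a class like this):

```python
# Illustrative sketch of cursor CS / main-area mirroring.

class MultiViewState:
    """Tracks which sub-image display area the cursor CS is on and
    what the main image display area 201M must therefore show."""

    def __init__(self, sub_areas):
        # sub_areas: mapping from area name (e.g. "SG1") to its stream.
        self.sub_areas = sub_areas
        self.cursor = next(iter(sub_areas))  # cursor starts on first area

    def move_cursor(self, area):
        """Move the cursor CS to another sub-image display area."""
        if area not in self.sub_areas:
            raise KeyError(area)
        self.cursor = area

    @property
    def main_image(self):
        # The main image display area 201M always mirrors the
        # sub-image display area in which the cursor CS is positioned.
        return self.sub_areas[self.cursor]
```

Moving the cursor never stops any stream; only the mirroring target changes, matching the behavior described for FIGS. 17 and 18.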
- FIGS. 17 and 18 are diagrams for explaining the operations of the user and the recording/reproducing apparatus 600 when an image to be used as an editing candidate point is selected from among the images of target scenes.
- Assume that playback of the moving images from the plurality of playback points MK1, MK2, and MK3 has started simultaneously with the cursor CS positioned in the sub-image display area SG1, and that an image that appears to be near the target scene is displayed in the sub-image display area SG3. In this case, the user operates the arrow keys of the remote controller 150 so that the cursor positioned in the sub-image display area SG1 is moved to the sub-image display area SG3.
- In response to this instruction from the user through the remote controller 150, the CPU 40 of the recording/reproducing apparatus 600 positions the cursor CS in the sub-image display area SG3 and also displays the moving image displayed in the sub-image display area SG3 in the main image display area 201M, as shown in FIG. 17.
- When the CPU 40 of the recording/reproducing apparatus receives, from the remote controller 150, a signal indicating that the chapter mark key 154 has been pressed, the CPU 40 acquires, as information capable of specifying the frame image displayed in the main image display area 201M at that time, for example a time code or a frame number, and records this on the recording medium 18 as information indicating an editing candidate point.
- At the same time, the CPU 40 displays, on the timeline 201T, an editing candidate point mark CN(1) indicating the position, within the title, of the frame image at the time when the chapter mark key was pressed. The editing candidate point mark is displayed in a different color and shape from the playback point marks so that it can be immediately distinguished from them.
- In this case, the display of the moving images in the main image display area 201M and the sub-image display areas SG1, SG2, and SG3 is not stopped; the display of the moving image in each display area is continued.
- Thereafter, in response to an instruction from the user through the remote controller 150, the CPU 40 of the recording/reproducing apparatus 600 positions the cursor CS on the sub-image display area SG2 and also displays the moving image displayed in the sub-image display area SG2 in the main image display area 201M, as shown in FIG. 18.
- Then, when the image of the target scene is displayed, the user of the recording/reproducing apparatus 600 presses the chapter mark key 154 provided on the remote controller 150.
- When the CPU 40 of the recording/reproducing apparatus 600 receives, from the remote controller 150, a signal indicating that the chapter mark key 154 has been pressed, the CPU 40 obtains a time code or a frame number capable of specifying the frame image displayed in the main image display area 201M at that time, and records this on the recording medium 18 as information indicating an editing candidate point, together with, for example, title identification information.
- At the same time, the CPU 40 displays, on the timeline 201T, an editing candidate point mark CN(2) indicating the position, within the title, of the frame image at the time when the chapter mark key was pressed.
- In this way, the user can quickly detect images of target scenes as editing candidate points and register them on the recording medium 18 as editing candidate points.
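- Registering an editing candidate point, as described above, pairs a time code (or frame number) with title identification information. The sketch below illustrates this bookkeeping in Python; it is not part of the patent disclosure, the names are hypothetical, and the frame-number-to-time-code conversion assumes a constant 30 frames per second purely for illustration (NTSC material actually uses 29.97 fps with drop-frame time code):

```python
# Illustrative registration of an editing candidate point (chapter mark
# key 154 pressed): store a time code together with title identification.

def register_edit_candidate(candidates, title_id, frame_number, fps=30):
    """Append one candidate record and return its HH:MM:SS:FF time code.
    'candidates' stands in for the records kept on the recording medium."""
    seconds, frames = divmod(frame_number, fps)
    minutes, seconds = divmod(seconds, 60)
    hours, minutes = divmod(minutes, 60)
    timecode = f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"
    candidates.append({"title": title_id, "timecode": timecode,
                       "frame": frame_number})
    return timecode
```

Because playback is not stopped when the key is pressed, several such records can be appended in one viewing pass, matching the CN(1), CN(2), ... marks on the timeline 201T.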
- the operation of the recording / reproducing apparatus 600 of the second example of the second embodiment will be described with reference to the flowcharts shown in FIGS.
- When the power of the recording/reproducing apparatus 600 of the second embodiment is turned on and an instruction to display a list of the titles, such as broadcast programs, recorded on the recording medium 18 is given, the CPU 40 refers to, for example, a directory on the recording medium 18 through the buffer control circuit 17, forms an image signal for displaying a list of the titles recorded on the recording medium 18, and supplies it to the synthesizing circuit 26 through the multiplexing/separation circuit 16, so that the title list is synthesized and displayed on the display image 201G of the monitor receiver 200 (step S201).
- The CPU 40 of the recording/reproducing apparatus 600 accepts the user's title selection input through the remote controller 150 or an operation panel provided on the front of the recording/reproducing apparatus 600 (step S202), forms the main image display area 201M and the timeline 201T as shown in FIG. 15, and displays them (step S203).
- Then, the CPU 40 of the recording/reproducing apparatus 600 accepts a playback point selection input or an operation instruction input from the user through the remote controller 150 or the like (step S204), and determines whether or not the input from the user is an instruction to end reproduction (step S205).
- If it is determined in step S205 that the input received from the user in step S204 is an instruction to end reproduction, the processing shown in FIG. 19 is terminated and, for example, the apparatus enters a state of waiting for input from the user.
- If the input is not an instruction to end reproduction, the CPU 40 determines whether or not the input from the user is a playback start input (step S206). If it is determined in step S206 that the input from the user is not a playback start input, in this example the CPU 40 repeats the processing from step S202, and a title selection input, a playback point selection input, and an operation instruction input are accepted.
- When it is determined in step S206 that the input from the user accepted in step S204 is an instruction to start reproduction, the CPU 40, as described above, provides sub-image display areas according to the number of playback points selected and input by the user, and starts playback of the moving images from those playback points (step S207).
- Next, the CPU 40 accepts an instruction input from the user, such as an input for setting an editing candidate point, an input for moving the cursor, or an input for selecting an editing candidate point (step S208), and determines whether the instruction input is a setting input of an editing candidate point (step S209). If it is determined in step S209 that the received instruction input from the user is an input for setting an editing candidate point, the CPU 40 acquires information for specifying the image displayed in the main image display area, for example a time code or a frame number (step S210). Then, the information for specifying the image acquired in step S210 and the information for specifying the title being reproduced (title information) are associated with each other and recorded on the recording medium 18 (step S211), and the processing from step S208 is repeated.
- If it is determined in step S209 that the received instruction input from the user is not a setting input of an editing candidate point, the CPU 40 determines whether or not the received instruction input is an instruction to move the cursor CS for the sub-image display areas (step S212). If it is determined in step S212 that the input received from the user in step S208 is an instruction to move the cursor CS for the sub-image display areas, the CPU 40 moves the cursor CS for the sub-image display areas according to the instruction input from the user (step S213).
- In step S214, the CPU 40 changes the display image of the main image display area 201M so that it becomes the same image as that of the sub-image display area in which the cursor CS is positioned, and then repeats the processing from step S208.
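- The branch structure of steps S208 through S215 is an input-classification loop: each user input is routed to the step that handles it, and anything else falls through to waiting again at step S208. The following Python sketch only mirrors that routing (illustrative, not part of the disclosure; the step labels echo the flowchart description):

```python
# Illustrative dispatch for the step S208 input loop (steps S208-S215).

def dispatch_user_input(kind):
    """Classify a user input accepted in step S208 and return the
    step that handles it next."""
    if kind == "set_candidate":
        # Step S209 true branch: acquire a time code or frame number
        # (S210) and record it with title information (S211).
        return "S210_acquire_timecode"
    if kind == "move_cursor":
        # Step S212 true branch: move the cursor CS (S213) and update
        # the main image display area 201M (S214).
        return "S213_move_cursor"
    if kind == "select_candidate":
        # Step S215 true branch: stop playback and show the candidate
        # frame and its neighbors as still images (S216).
        return "S216_show_thumbnails"
    # Any other input: keep accepting input at step S208.
    return "S208_wait"
```

Each returned label corresponds to a branch that, in the apparatus, is carried out by the CPU 40 controlling the reading, decoding, and display circuits.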
- If it is determined in step S212 that the input received from the user in step S208 is not an instruction to move the cursor CS for the sub-image display areas, the process proceeds to the processing shown in FIG. 20, and the CPU 40 determines whether or not the input from the user accepted in step S208 is a selection input of one of the registered editing candidate points (step S215).
- If it is determined in step S215 that the input from the user accepted in step S208 is a selection input of an editing candidate point, the CPU 40 stops reproduction of the moving image.
- Then, the information specifying the frame image of the selected editing candidate point stored in the recording medium 18 is read out, and the image data forming the frame image of the editing candidate point and the image data forming a plurality of neighboring frame images are read from the recording medium 18 and displayed as still images (step S216).
- The display in step S216 is performed through the respective sections of the multiplexing/separation circuit 16, the MPEG video decoder 23, the display image forming circuit 50, the synthesizing circuit 26, and the NTSC encoder 27. The display image forming circuit 50 can form image data for displaying moving images in a plurality of sub-image display areas, and can also form image data for displaying still images (thumbnails) as shown in FIGS. 4 and 5 of the first embodiment.
- The recording/reproducing apparatus 600 according to the second embodiment, like the recording/reproducing apparatus 100 according to the first embodiment, can also display still images in a scrollable manner, for example as shown in FIG. 4. The CPU 40 then receives a selection input of a frame image to be finally used as an edit point from among the plurality of still images displayed based on the selected editing candidate point (step S217).
- The selection input in step S217 is performed by moving the still image selection cursor, in accordance with the instruction from the user, so that it is positioned at one of the plurality of still images displayed based on the selected editing candidate point; the image of the final edit point can thus be selected.
- Then, the CPU 40 determines the input key accepted in step S217 (step S218). If it is determined in the determination processing of step S218 that the accepted input key is the down key (down arrow key), it is determined that no edit point is to be selected, the display of the moving images in the main image display area and the sub-image display areas, which was stopped in step S216, is restarted (step S221), and the processing from step S208 is repeated.
- If it is determined in step S218 that the input key received in step S217 is the determination key, the frame image selected this time is determined as the edit point, and information for specifying the frame image is obtained (step S219).
- Then, the title information and the information specifying the frame image acquired in step S219 are recorded as edit point information on the recording medium 18 (step S220). The display of the moving images in the main image display area and the sub-image display areas, which was stopped in step S216, is restarted (step S221), and the processing from step S208 is repeated.
- If it is determined in step S218 that the input key accepted in step S217 is a left or right key (left arrow key or right arrow key), it is determined that the instruction is to change the selection of the still image, that is, to move the cursor positioned at a still image, and the displayed still image (thumbnail) is selected by moving the cursor (step S222). After that, the processing from step S217 is repeated.
- If it is determined in step S215 that the input received from the user in step S208 is not a selection input of an editing candidate point, the CPU 40 executes a process corresponding to the instruction input received from the user (step S223). In step S223, of various kinds of processing such as pause, fast forward, fast reverse, and stop of reproduction, the processing requested by the user is performed.
- As described above, a plurality of different reproduction points are designated for one title, and by simultaneously starting playback of the moving image from the plurality of different reproduction points, the image of the target scene can be quickly found (detected).
- The found image is specified as an image of an editing candidate point, and the image of the editing candidate point and its neighboring images are displayed as still images, allowing an accurate edit point to be selected. Then, information for specifying the image at the selected edit point is obtained and recorded on the recording medium 18 as edit point information. In addition, using the selected edit points, it is possible to perform various edits such as deleting, moving, and extracting the section sandwiched between edit points.
- FIG. 21 is a diagram for explaining processing in the recording/playback apparatus of the second example of the second embodiment in a case where a subsequent playback point catches up with a previously set editing candidate point.
- It is assumed that three playback points MK1, MK2, and MK3 are designated at the playback start points ST1, ST2, and ST3, and that reproduction has started.
- Whether or not a subsequent playback point has caught up with a previously set editing candidate point is determined by the CPU 40 of the recording/reproducing device 600, for example, by monitoring the time codes or frame numbers of the editing candidate point and of the frame image at the playback point; if the two time codes or frame numbers match, it can be determined that the subsequent playback point has caught up to the editing candidate point.
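The catch-up check described above can be sketched as follows. This is an illustrative sketch, not the patent's actual implementation; the function and variable names are hypothetical, and playback positions are represented as plain frame numbers.

```python
# Hypothetical sketch of the catch-up check: each playback point's current
# frame number is compared against the set of registered editing-candidate
# frame numbers; a match means that point has caught up to a candidate.

def caught_up_points(playback_frames, candidate_frames):
    """Return (point_index, frame) pairs where a playback point has
    caught up to a previously set editing candidate point."""
    caught = []
    for idx, frame in enumerate(playback_frames):
        if frame in candidate_frames:  # frame numbers match -> caught up
            caught.append((idx, frame))
    return caught

# Playback points MK1..MK3 currently at these frame numbers; a candidate
# was registered at frame 4500, so MK2 (index 1) has just reached it.
print(caught_up_points([120, 4500, 9000], {4500}))  # [(1, 4500)]
```

In a real device the comparison would be against time codes as well, but the structure of the check is the same.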
- In this case, the CPU 40 displays, in the sub-image display area SG2, the editing candidate point CN(1) and its neighboring images as still images. In the sub-image display areas SG1 and SG3, other than the sub-image display area SG2 displaying the image corresponding to the reproduction point MK2, the reproduction of the moving images is continued.
- When the cursor CS is positioned on the sub-image display area displaying the moving image corresponding to the playback point that has caught up to the editing candidate point, the still image displayed in that sub-image display area is also displayed in the main image display area 201M. In the example shown in FIG. 21, the still image displayed in the sub-image display area SG2 is also displayed in the main image display area 201M.
- If the editing candidate point is CN(1), the image of the editing candidate point CN(1) and the images of the two frames before and two frames after it are read out, and, as shown in the main image display area 201M of FIG. 21, the image CN(1) of the editing candidate point and its neighboring images CN(1)−2, CN(1)−1, CN(1)+1, and CN(1)+2 are displayed as thumbnails.
- the cursor MCS is positioned in the display area of the image CN (1) of the candidate editing point displayed in the main image display area 201M.
- the image to which the cursor MCS is positioned is also enlarged and displayed on the entire main image display area 201M.
- This cursor MCS can be moved by operating the left arrow key and the right arrow key of the remote controller 150, for example, as in the case of the first embodiment described above.
- The frame image at which the cursor MCS is positioned can be selected and registered as an edit point.
- FIG. 22 illustrates a case where the cursor CS is not positioned at the sub-image display area SG2 displaying the moving image corresponding to the playback point MK2 that has caught up to the editing candidate point CN (1).
- In this case, the image at the position corresponding to the editing candidate point CN(1), that is, the image of the editing candidate point CN(1) and its neighboring images, are displayed as thumbnails only in the sub-image display area SG2.
- The edit point can be determined by selecting a still image only through the main image display area 201M, so as long as the cursor CS is not positioned in the sub-image display area SG2, the edit point cannot be selected; therefore, for example, the reproduction of the moving image is restarted in the sub-image display area SG2 after a predetermined time has elapsed.
- As described above, the selection and registration of edit points can be performed only through the main image display area 201M. Therefore, in the display example shown in FIG. 22, it is not strictly necessary to display the editing candidate point and its neighboring images as so-called thumbnails in the sub-image display area.
- FIG. 23 is a diagram for explaining the operation of the recording / reproducing apparatus after the editing point has been registered based on the reproduction point that has caught up to the editing candidate point.
- The target still image among the five still images displayed as thumbnails in the main image display area 201M is chosen by operating the left arrow key and the right arrow key of the remote control 150 to position the cursor MCS, and the edit point is selected and registered by pressing the enter key.
- As a result, the edit point DC(1) is registered on the recording medium 18 in place of the latest editing candidate point CN(1). Then, as shown in FIG. 23, the determined edit point DC(1) is shown on the timeline 201T, and thereafter the reproduction of the moving image from the reproduction point MK2 is restarted.
- In this example, the edit point is registered in place of the editing candidate point; however, by displaying the mark indicating the editing candidate point CN(1) and the mark indicating the edit point DC(1) in different colors, it is also possible to display both the editing candidate point and the edit point in a distinguishable manner.
- The CPU 40 manages the time interval between the playback point and the editing candidate point and displays, for example, a message such as "the editing candidate point will be reached in about ⁇ seconds." By doing so, the user can position the cursor CS at the corresponding sub-image display area in advance.
- For the marks indicating the playback points, editing candidate points, and edit points displayed on the timeline 201T, the colors may be changed, for example green for the mark indicating a playback point, blue for the mark indicating an editing candidate point, and red for the mark indicating an edit point; alternatively, the shapes may be changed, for example a triangle for a playback point, a square for an editing candidate point, and a circle for an edit point, or the display position of each mark may be shifted. In this way, the position of each mark can be clearly conveyed to the user.
- the reproduction speeds of the moving images from the respective reproduction points are all equal.
- the playback speed may be changed for each playback point.
- For example, if the front part of the title is played back at double speed while the middle and latter parts are played back at normal speed, the image of the target scene can be found at normal playback speed in the middle and latter parts of the title, while in the front part of the title, which is played back at 2× speed, the image of the target scene can be detected more quickly.
- If the playback speed is changed in accordance with a plurality of playback points selected in this way, for example by performing high-speed playback in a portion where the target scene is unlikely to exist and constant-speed or low-speed playback in a portion where the target scene is likely to exist, the playback speed can be set for each selected playback point according to the user's purpose.
- the playback speed for each playback point can be set via the remote controller 150.
- The CPU 40 can change the reproduction speed of the moving image at each reproduction point by changing the speed at which data is read from the recording medium 18 for each selected playback point, or by adjusting the processing speed in the MPEG video decoder 23 or the display image forming circuit 50.
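Per-point playback speeds can be modeled as each playback point advancing by its own multiplier per display tick. This is only an illustrative sketch under assumed names; the real device varies the medium read-out rate or decoder throughput as described above.

```python
# Hypothetical model: advance each playback point by its own speed
# multiplier, as when the read-out speed from the medium is changed
# independently for each point.

def step_points(points, speeds, frames_per_tick=1):
    """Advance every playback point (frame position) by its own speed."""
    return {name: pos + speeds[name] * frames_per_tick
            for name, pos in points.items()}

pts = {"MK1": 0, "MK2": 4000, "MK3": 8000}
spd = {"MK1": 2, "MK2": 1, "MK3": 1}  # MK1 plays at double speed
for _ in range(100):                  # 100 display ticks
    pts = step_points(pts, spd)
print(pts)  # {'MK1': 200, 'MK2': 4100, 'MK3': 8100}
```

After 100 ticks, MK1 has covered twice as many frames as MK2 and MK3, mirroring the 2× front-part playback in the example above.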
- In the second modification, as in the case of the second example of the second embodiment, when a plurality of editing candidate points are set, one of the plurality of editing candidate points is selected, the image (still image) at the position corresponding to the editing candidate point and the images (still images) near that position are displayed in a scrollable manner, and finally an edit point to be used at the time of editing can be set.
- FIG. 25 and FIG. 26 are diagrams for explaining processing in a case where selection of an edit point is accepted based on edit candidate points in sections having different playback speeds.
- a cursor CST that can be moved in units of editing candidate points on the timeline 201 T is displayed.
- The cursor CST can be moved by operating the remote controller 150, and the editing candidate point at which it is positioned can be selected by pressing the decision key 154.
- When the selected editing candidate point is in a section where playback at normal speed is performed, as shown in FIG. 25, the image of the selected editing candidate point (in the case of FIG. 25, the editing candidate point CN(1)) and the images of the one frame before and one frame after it are displayed as thumbnails in the main image display area 201M so that an edit point can be selected.
- When the selected editing candidate point is in a section where 2× speed playback is performed, as shown in FIG. 26, the image of the selected editing candidate point (in the case of FIG. 26, the editing candidate point CN(2)) and the images of the two frames before and two frames after it are displayed as thumbnails in the main image display area 201M so that an edit point can be selected.
- In a section where the reproduction speed is high, more thumbnails for selecting the edit point are displayed than in a section where the reproduction speed is low.
- The number of so-called thumbnail images for selecting an edit point is changed according to the playback speed because, when a user marks an editing candidate point on a screen being played back at high speed, there is a high possibility that the marked point is away from the desired edit point. In other words, the higher the playback speed, the greater the possible distance between the actually marked editing candidate point and the target edit point; therefore, the number of still images displayed adjacent to the editing candidate point is increased in proportion to the playback speed. In this way, the edit point can be selected more quickly.
- Here, CN(n)+1 denotes the image one frame after the editing candidate point CN(n), CN(n)−1 the image one frame before it, CN(n)+2 the image two frames after it, and CN(n)−2 the image two frames before it.
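The speed-dependent thumbnail range can be sketched as follows. This is an illustration only: the proportionality rule (one neighboring frame per unit of playback speed, matching the 1×/2× examples above) is an assumption, and the function name is hypothetical.

```python
# Illustrative sketch: the number of neighboring frames shown as
# thumbnails around editing candidate CN(n) grows with the playback
# speed of the section in which the candidate was marked.

def neighbor_frames(candidate_frame, speed):
    """Frame numbers to display as thumbnails around the candidate."""
    k = max(1, round(speed))  # assumed rule: 1x -> ±1 frame, 2x -> ±2
    return [candidate_frame + d for d in range(-k, k + 1)]

print(neighbor_frames(1000, 1))  # [999, 1000, 1001]
print(neighbor_frames(1000, 2))  # [998, 999, 1000, 1001, 1002]
```

At normal speed this yields three thumbnails (as in FIG. 25); at 2× speed it yields five (as in FIG. 26).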
- FIG. 27A to FIG. 29C are diagrams for explaining a case where a subsequent playback point overtakes a preceding playback point.
- three playback points MK1, MK2, and MK3 are selected.
- For the playback point MK1, double-speed playback is performed, and for the playback points MK2 and MK3, normal-speed playback is assumed to be performed.
- Playback is started from the state shown in FIG. 27A, and after a certain period of time, the playback point MK1 passes the playback point MK2 as shown in FIG. 27B.
- When this occurs, the CPU 40 of the recording/reproducing device 600 changes the moving images displayed in the sub-image display areas SG1, SG2, and SG3 in accordance with the positions of the reproduction points MK1, MK2, and MK3 on the title.
- By making the order in which the playback points MK1, MK2, and MK3 move on the timeline 201T match the order in which the sub-image display areas SG1, SG2, and SG3 are arranged, the correspondence between the playback points and the sub-image display areas is not confused, and inconveniences such as the display information being difficult to see or incorrect processing being performed can be prevented.
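The reassignment of display areas after an overtake can be sketched as a simple sort of the playback points by their current position. This is a hypothetical illustration of the described behavior, not the device's actual logic.

```python
# Illustrative sketch: assign sub-image display areas SG1..SGn so that
# the left-to-right area order always matches the order of the playback
# points on the timeline, even after one point overtakes another.

def assign_areas(points):
    """points: playback-point name -> current frame position.
    Returns a mapping from playback point to display area label."""
    ordered = sorted(points, key=points.get)  # earliest position first
    return {name: f"SG{i + 1}" for i, name in enumerate(ordered)}

# MK1 (2x speed) has passed MK2, so their display areas are swapped:
print(assign_areas({"MK1": 5200, "MK2": 4100, "MK3": 9000}))
# {'MK2': 'SG1', 'MK1': 'SG2', 'MK3': 'SG3'}
```

After the overtake in FIG. 27B, MK2 (now the earliest point) is shown in SG1 and MK1 in SG2, matching the area switch described below.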
- the case where the subsequent playback point overtakes the preceding playback point is not limited to the case where the playback speeds are different.
- For example, suppose that the playback point MK2 catches up with an editing candidate point registered during playback of the playback point MK3 and, as described above, the playback point MK2 stops so that the edit point can be determined.
- the playback point MK1 may pass the playback point MK2 while the playback point MK2 is stopped.
- In this case, the CPU 40 switches the display areas so that the image at the playback point MK2 is displayed in the sub-image display area SG1 and the image at the playback point MK1 is displayed in the sub-image display area SG2.
- As a result, the order of the playback points MK1, MK2, and MK3 matches the order of the sub-image display areas SG1, SG2, and SG3, so the relationship between the playback points and the sub-image display areas is not confused, and inconveniences such as the display information being difficult to see or incorrect processing being performed can be prevented.
- playback points MK1 and MK2 will play the same moving image thereafter.
- the playback points MK1 and MK2 are merged, and the sub-image display area is also grouped into the sub-image display area SGF and the sub-image display area SGE.
- the sub-image display area SGF displays the image at the reproduction point where the reproduction points MK1 and MK2 are fused, and the sub-image display area SGE displays the image at the reproduction point MK3.
- the same moving image can be prevented from being displayed in a plurality of sub-image display areas, and inconveniences such as difficulty in detecting an image of a target scene can be prevented.
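The fusion of coinciding playback points can be sketched as grouping points that have reached the same frame. This is only an illustrative model of the described behavior; names and data shapes are assumptions.

```python
# Illustrative sketch: collapse playback points that have reached the
# same frame position, so the same moving image is never displayed in
# two sub-image display areas at once.

def merge_points(points):
    """points: playback-point name -> current frame position.
    Returns merged points, joining names that share a position."""
    by_frame = {}
    for name, frame in points.items():
        by_frame.setdefault(frame, []).append(name)
    return {"+".join(sorted(names)): frame
            for frame, names in by_frame.items()}

# MK1 has caught up to MK2, so the two points fuse into one:
print(merge_points({"MK1": 4100, "MK2": 4100, "MK3": 9000}))
# {'MK1+MK2': 4100, 'MK3': 9000}
```

The two remaining entries then correspond to the fused area SGF and the area SGE in the text above.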
- FIGS. 29A to 29C are diagrams for explaining the processing when the preceding reproduction point has been reproduced to the end of the title.
- Fig. 29A shows the case where, when the playback point MK3, which is played from the latter part of the title, finishes playing to the end of the title, the playback point MK3 becomes a playback point from the beginning of the title; reproduction is continued, and the images displayed in the sub-image display areas SG1, SG2, and SG3 are replaced accordingly.
- Fig. 29B shows the case where, when the playback point MK3, which is played from the latter part of the title, has been played to the end of the title, the movement of the playback point MK3 is stopped at the end of the title and that state is maintained. Therefore, in the case of FIG. 29B, when both the playback points MK1 and MK2 have also reached the end of the title, the same image will be displayed in the sub-image display areas SG1, SG2, and SG3.
- Fig. 29C shows the case where, when the playback point MK3, which is played back from the latter part of the title, has been played back to the end of the title, the playback point MK3 and the sub-image display area SG3 in which the corresponding playback image was displayed are deleted, leaving only the playback points MK1 and MK2, which have not yet been played to the end, and the sub-image display areas SG1 and SG2 that display the corresponding images.
- In the third example, the user does not set the positions and the number of the reproduction points; instead, the recording/reproducing device sets them in accordance with the reproduction speed of the image, so that the image of the target scene can be quickly and accurately detected from a series of titles.
- the recording / reproducing apparatus of the third example is also the same as the recording / reproducing apparatus 600 shown in FIG.
- the playback speed of the title can be set freely.
- the setting of the reproduction speed can be performed through the remote controller 150.
- a reproduced image is displayed on the entire display screen 201 as shown in FIG.
- a sub-image display area SG1 is provided in a lower portion of the display screen 201, and a prefetch image is displayed here.
- The playback point, within the title, of the image displayed on the entire display screen 201 is indicated by the current playback position mark NL on the timeline 201T, and the playback point of the image displayed in the sub-image display area SG1 is indicated by the look-ahead point PT1 on the timeline 201T.
- reproduction is performed at double speed (in FIG. 30, described as “X 2”).
- The state shown in FIG. 30 is the same as the state shown in FIG. 31A. If the playback speed is, for example, about 2× speed, the playback speed is not excessively fast, and there are few inconveniences such as missing the image of the target scene.
- When the playback speed becomes relatively high, for example 5× speed, the possibility of missing the image of the target scene increases. Therefore, as shown in FIG. 31B, when the reproduction speed is, for example, 5× speed, two sub-image display areas SG1 and SG2 are provided, separated from each other by a predetermined interval.
- In these areas, the playback images from the positions corresponding to the look-ahead points PT1 and PT2 are displayed.
- When the playback speed is even higher, for example 10× speed, three sub-image display areas are provided, and the playback images from the positions corresponding to the look-ahead points PT1, PT2, and PT3 are displayed.
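The relation between playback speed and the number of look-ahead display areas can be sketched as a simple threshold mapping. The 2×/5×/10× thresholds follow the examples in the text; the behavior at intermediate speeds is an assumption for illustration.

```python
# Illustrative mapping from playback speed to the number of look-ahead
# sub-image display areas (SG1, SG2, SG3) and prefetch points (PT1..PT3).

def lookahead_areas(speed):
    """Number of sub-image display areas for a given playback speed."""
    if speed >= 10:
        return 3  # SG1..SG3 at look-ahead points PT1..PT3
    if speed >= 5:
        return 2  # SG1, SG2 at look-ahead points PT1, PT2
    if speed >= 2:
        return 1  # SG1 at look-ahead point PT1
    return 0      # normal-speed playback: no prefetch display assumed

print([lookahead_areas(s) for s in (1, 2, 5, 10)])  # [0, 1, 2, 3]
```

As the text notes below, the number of areas and the intervals between the playback points may in practice be set arbitrarily.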
- The image displayed on the entire display screen 201 is formed by the multiplexing/separation circuit 16, the MPEG video decoder 23, and the post video signal processing circuit 24.
- The images displayed in the sub-image display areas SG1, SG2, SG3, ... are formed by the multiplexing/separation circuit 16, the MPEG video decoder 23, and the display image forming circuit 50.
- The image from the post video signal processing circuit 24 and the image to be displayed in one or more sub-image display areas from the display image forming circuit 50 are synthesized in the synthesizing circuit 26 and output through the NTSC encoder 27.
- In this case, images based on the image data from a plurality of playback points including the current playback position are played back simultaneously; the buffer control circuit 17 controlled by the CPU 40 collects the image data read out sequentially from the different reproduction points on the recording medium 18 and processes it as continuous image data from each reproduction point.
- When the decision key is pressed, the image at the playback point where the cursor CST is positioned (in the example of FIG. 32, the image at the playback point PT2) is used as a reference, and frame images for setting an edit point are displayed on the display screen. In the example of FIG. 32, based on the image at the playback point PT2 where the cursor CST is positioned, the two frames before it and the two frames after it are also used, for a total of five frame images, that is, images PT2−2, PT2−1, PT2, PT2+1, and PT2+2. These five frame images are displayed in a scrollable manner. Then, by moving the cursor CS, which can move over the displayed five frame images, the final edit point can be selected and registered in the recording medium 18.
- In the example described above, one sub-image display area is used when the playback speed is 2×, two sub-image display areas are used when the playback speed is 5×, and three are used when the playback speed is 10×; however, the present invention is not limited to this.
- the number of sub-image display areas used according to the reproduction speed may be arbitrarily set, and the interval between each reproduction point may be arbitrarily set.
- For example, the interval between the current playback position and the first playback point may be made different from the interval between the first playback point and the second playback point; in this way, the intervals between the playback points, including the current playback position, may be made different from one another.
- the intervals between the playback points including the current playback position can all be made equal.
- the editing points of the title to be edited this time can be quickly detected and registered by reusing the editing points registered in the past.
- When a program is recorded on the recording medium 18 by the recording/reproducing device according to the second embodiment and editing points are registered and editing is performed, the information indicating the editing points remains registered in the recording medium 18 and can be reused.
- the title and one or more editing points are stored in association with each other.
- the title identification information is also added to a main information signal called a title of a broadcast program or the like recorded on the recording medium 18.
- The CPU 40 checks whether or not there is edit point information having the same title identification information, and if there is, forms information indicating editing candidate points by using that edit point information, synthesizes this with the reproduced image, and displays it.
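The reuse of past edit points can be sketched as a lookup keyed on the title identification information. This is a hypothetical illustration; the field names and data layout are assumptions, not the patent's recording format.

```python
# Illustrative sketch: find past edit points whose title identification
# information matches the newly recorded title, to be displayed as
# editing candidate points PP1, PP2, ...

def reuse_edit_points(edit_point_library, new_title_id):
    """Return past edit-point frame numbers for a matching title id."""
    for entry in edit_point_library:
        if entry["title_id"] == new_title_id:
            return entry["edit_points"]
    return []  # no matching past recording: no candidates to show

# Hypothetical library built from previous weekly recordings:
library = [{"title_id": "drama-ch4-mon21", "edit_points": [300, 18000, 35000]}]
print(reuse_edit_points(library, "drama-ch4-mon21"))  # [300, 18000, 35000]
```

Each returned frame number would then be shown as a candidate mark (PP1, PP2, PP3) on the timeline, with the corresponding image in a sub-image display area.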
- FIG. 33 and FIG. 34 are diagrams for explaining an image in which editing candidate point information is displayed using past editing point information in the fourth example. As shown in FIG. 33, a reproduced image of the title to be reproduced is displayed on the entire display screen 201.
- Based on the edit point information, the editing candidate points PP1, PP2, and PP3 are displayed on the timeline 201T, and the title image corresponding to the position indicated by each editing candidate point is displayed in the sub-image display areas SG1, SG2, and SG3.
- Even for a serial program broadcast at a fixed time every week, it is hard to imagine that the editing points for the previous broadcast and the editing points for the current broadcast are exactly the same; even for the same program, errors of a few seconds can occur.
- FIG. 34 shows a state in which a frame image adjacent to the editing candidate point is displayed.
- The cursor CST is positioned at the first editing candidate point PP1 among the three displayed editing candidate points PP1, PP2, and PP3. Then, based on the image of the editing candidate point PP1, the two frame images before and after it are displayed in a scrollable manner as images for selecting an edit point.
- The edit point is determined by positioning the cursor CS on the target frame image and performing a determination operation such as pressing the determination key, and the determined image position can be registered in the edit point information of the recording medium.
- When the cursor CS is positioned at the leftmost frame image of the five frame images displayed in the frame image display area 201S and the left arrow key is pressed further, the frame images are scrolled to the right in frame units.
- The image displayed on the entire surface of the display screen 201 is formed by the multiplexing/separation circuit 16, the MPEG video decoder 23, and the post video signal processing circuit 24.
- The images displayed in the sub-image display areas SG1, SG2, SG3, ... and the frame images displayed in the frame image display area 201S are formed by the multiplexing/separation circuit 16, the MPEG video decoder 23, and the display image forming circuit 50.
- The image from the post video signal processing circuit 24 and the image from the display image forming circuit 50 are synthesized in the synthesizing circuit 26 and output through the NTSC encoder 27.
- Also in this case, images based on the image data from a plurality of playback points are played back simultaneously; the buffer control circuit 17 controlled by the CPU 40 collects the image data read sequentially from the different playback points of the recording medium 18 and processes it as continuous image data from each playback point.
- As described above, edit points can be quickly and accurately determined and registered for broadcast programs, such as serial dramas, that are recorded regularly.
- the correspondence between the already registered editing point information and the newly recorded title is determined by the title identification information for identifying each title.
- The title identification information may include, for example, information such as the recording date and time, the day of the week, and the broadcast channel, in addition to a title name input by the user or the like.
- The information used for managing editing candidate points and edit points may be position information and time information relative to the entire moving image, such as a time code (time stamp) or a frame number, or may be recording time information and position information relative to the entire recording medium.
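Time codes and frame numbers are interchangeable position references once a frame rate is fixed. The sketch below assumes a 30 fps rate and ignores NTSC drop-frame time code for simplicity; it illustrates the equivalence rather than the device's actual encoding.

```python
# Illustrative conversion between a frame number and an HH:MM:SS:FF
# time code, assuming a fixed 30 fps rate (drop-frame handling omitted).

FPS = 30

def frame_to_timecode(frame):
    s, f = divmod(frame, FPS)
    m, s = divmod(s, 60)
    h, m = divmod(m, 60)
    return f"{h:02d}:{m:02d}:{s:02d}:{f:02d}"

def timecode_to_frame(tc):
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

print(frame_to_timecode(54020))             # 00:30:00:20
print(timecode_to_frame("00:30:00:20"))     # 54020
```

Either representation can therefore serve as the recorded edit-point position, as the text above states.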
- In the embodiments described above, the information on the chapter marks, the information on the editing candidate points, and the information on the edit points have been described as being recorded in a separate file provided on the recording medium 18; however, the present invention is not limited to this. These pieces of information may instead be recorded in the memory of the recording/reproducing apparatus 100.
- In this case, when the recording medium 18 is removable, by recording the information for specifying the recording medium and the information for specifying the title in association with the information on the chapter marks, the editing candidate points, or the edit points, it is possible to remove the recording medium 18 and replace it with another recording medium without causing inconsistencies.
- the recording medium 18 of the recording / reproducing apparatus 100 is not limited to a hard disk, and various recording media such as an optical disk, a magneto-optical disk, and a semiconductor memory can be used.
- the recording medium 18 is not limited to one provided inside the recording / reproducing device 100, and image information or the like recorded on an externally connected hard disk device or the like may be used as information for reproduction. It is possible.
- the present invention is not limited to this.
- For example, the present invention can be applied to a video camera; in other words, it can be applied to various types of recording/reproducing devices that capture image information, record it on a recording medium, and have a function of reproducing the recorded image information.
- the present invention can be applied to a reproducing apparatus having a function of reproducing image information recorded on a recording medium.
- In this case, the information about the chapter marks and the edit point information need only be writable to the memory of the device itself and readable when necessary.
- The number of sub-image display areas and the number of still images (thumbnails) displayed for selecting an edit point are not limited to those in the above-described embodiments; they can be increased or decreased according to the size of the display screen.

Industrial Applicability
- According to the present invention, unlike the conventional recording/reproducing apparatus, it is not necessary to use the two separate modes of reproducing mode and editing mode; editing can basically be performed in the reproducing mode according to the user's operation. Therefore, there is no inconvenience such as not knowing the operation for shifting to the edit mode or taking time to shift to the edit mode.
- editing operations can be performed using only a limited number of operation keys of an operation device such as a remote controller, so that operations for editing are extremely simple. As a result, there is no inconvenience such as erroneous operation being frequently generated or a target scene being overlooked due to a careless operation.
- a chapter mark is added to the image of the target scene, and the image to which the chapter mark is added and its neighboring images can be checked simultaneously as reduced (thumbnail) images and scrolled through. Therefore, there is no need to rely on conventional functions such as fast forward, fast reverse, or frame advance; based on the image with the chapter mark attached, the image of the target scene can be accurately specified and edited.
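The chapter-mark based edit-point selection described above can be sketched as follows. This is a hypothetical illustration of the idea, not the patented implementation; the function names, window size, and frame spacing are all invented for the example.

```python
# Hypothetical sketch: show a chapter-marked frame together with its
# neighboring frames as a scrollable strip of thumbnail indices.

def neighboring_thumbnails(marked_frame, total_frames, window=3, step=30):
    """Return frame indices to show as reduced (thumbnail) images:
    the chapter-marked frame plus up to `window` neighbors on each
    side, spaced `step` frames apart, clipped to the valid range."""
    indices = [marked_frame + i * step for i in range(-window, window + 1)]
    return [i for i in indices if 0 <= i < total_frames]

def scroll(indices, direction, step=30, total_frames=0):
    """Shift the thumbnail strip one step forward (+1) or back (-1),
    dropping indices that fall outside the recording."""
    moved = [i + direction * step for i in indices]
    return [i for i in moved if 0 <= i < total_frames]

thumbs = neighboring_thumbnails(marked_frame=900, total_frames=18000)
# The user scrolls until the exact target frame is visible, then selects
# it as the edit point -- no fast forward / fast reverse is needed.
```

The point of the sketch is that the chapter mark gives a coarse anchor, and scrolling the thumbnail strip refines it to the exact frame, which is the workflow the passage above contrasts with conventional fast-forward/rewind searching.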
Abstract
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/486,609 US7496211B2 (en) | 2002-06-11 | 2003-06-11 | Image processing apparatus, image processing method, and image processing program |
KR10-2004-7001831A KR20050009270A (ko) | 2002-06-11 | 2003-06-11 | 화상 검출 장치, 화상 검출 방법 및 화상 검출 프로그램 |
EP03733361A EP1515552A4 (en) | 2002-06-11 | 2003-06-11 | IMAGE DETECTION APPARATUS, IMAGE DETECTION METHOD, AND IMAGE DETECTION PROGRAM |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2002-169565 | 2002-06-11 | ||
JP2002169565 | 2002-06-11 | ||
JP2002203479A JP3738843B2 (ja) | 2002-06-11 | 2002-07-12 | 画像検出装置、画像検出方法および画像検出プログラム |
JP2002-203479 | 2002-07-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2003105474A1 true WO2003105474A1 (ja) | 2003-12-18 |
Family
ID=29738355
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2003/007409 WO2003105474A1 (ja) | 2002-06-11 | 2003-06-11 | 画像検出装置、画像検出方法および画像検出プログラム |
Country Status (7)
Country | Link |
---|---|
US (1) | US7496211B2 (ja) |
EP (1) | EP1515552A4 (ja) |
JP (1) | JP3738843B2 (ja) |
KR (1) | KR20050009270A (ja) |
CN (1) | CN1294750C (ja) |
TW (1) | TWI233302B (ja) |
WO (1) | WO2003105474A1 (ja) |
Families Citing this family (220)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050183017A1 (en) * | 2001-01-31 | 2005-08-18 | Microsoft Corporation | Seekbar in taskbar player visualization mode |
US20070084897A1 (en) | 2003-05-20 | 2007-04-19 | Shelton Frederick E Iv | Articulating surgical stapling instrument incorporating a two-piece e-beam firing mechanism |
US9060770B2 (en) | 2003-05-20 | 2015-06-23 | Ethicon Endo-Surgery, Inc. | Robotically-driven surgical instrument with E-beam driver |
US7757182B2 (en) * | 2003-06-25 | 2010-07-13 | Microsoft Corporation | Taskbar media player |
US7512884B2 (en) | 2003-06-25 | 2009-03-31 | Microsoft Corporation | System and method for switching of media presentation |
JP2005303863A (ja) * | 2004-04-15 | 2005-10-27 | Mitsubishi Electric Corp | テレビ受像機用の選局装置及び選局方法 |
US11896225B2 (en) | 2004-07-28 | 2024-02-13 | Cilag Gmbh International | Staple cartridge comprising a pan |
JP4727342B2 (ja) | 2004-09-15 | 2011-07-20 | ソニー株式会社 | 画像処理装置、画像処理方法、画像処理プログラム及びプログラム格納媒体 |
US8117544B2 (en) * | 2004-10-26 | 2012-02-14 | Fuji Xerox Co., Ltd. | System and method for detecting user actions in a video stream |
JP2006134383A (ja) * | 2004-11-02 | 2006-05-25 | Canon Inc | 再生装置、再生方法及びそのプログラム |
JP4471882B2 (ja) * | 2005-03-31 | 2010-06-02 | 三洋電機株式会社 | 表示装置および表示方法 |
US20060236264A1 (en) * | 2005-04-18 | 2006-10-19 | Microsoft Corporation | Automatic window resize behavior and optimizations |
JP2006309867A (ja) * | 2005-04-28 | 2006-11-09 | Hitachi Ltd | 映像記録再生装置 |
KR100724984B1 (ko) * | 2005-06-16 | 2007-06-04 | 삼성전자주식회사 | 디지털 멀티미디어 방송을 다양한 방식으로 재생하는 방법및 그에 따른 디지털 멀티미디어 방송 수신장치 |
JP4537277B2 (ja) * | 2005-07-08 | 2010-09-01 | 株式会社日立ハイテクノロジーズ | 半導体検査装置 |
KR100731378B1 (ko) * | 2005-07-18 | 2007-06-21 | 엘지전자 주식회사 | 녹화정보 제공기능을 갖는 영상표시기기 및 그 제어방법 |
US11246590B2 (en) | 2005-08-31 | 2022-02-15 | Cilag Gmbh International | Staple cartridge including staple drivers having different unfired heights |
US7669746B2 (en) | 2005-08-31 | 2010-03-02 | Ethicon Endo-Surgery, Inc. | Staple cartridges for forming staples having differing formed staple heights |
US10159482B2 (en) | 2005-08-31 | 2018-12-25 | Ethicon Llc | Fastener cartridge assembly comprising a fixed anvil and different staple heights |
US8340500B2 (en) * | 2005-09-07 | 2012-12-25 | Canon Kabushiki Kaisha | Video signal recording apparatus |
JP4701958B2 (ja) * | 2005-09-26 | 2011-06-15 | ソニー株式会社 | 画像処理装置、画像処理方法及びそのプログラム |
KR100691234B1 (ko) * | 2005-10-20 | 2007-03-12 | 삼성전자주식회사 | 영상 기록 및 재생 장치 및 그 제어 방법 |
FR2894692B1 (fr) * | 2005-12-08 | 2008-06-13 | Thomson Licensing Sas | Procede d'identification d'un document enregistre par affichage et selection d'images clefs, et recepteur associe. |
JP2007157300A (ja) * | 2005-12-08 | 2007-06-21 | Toshiba Corp | 情報処理装置および制御方法 |
US8186555B2 (en) | 2006-01-31 | 2012-05-29 | Ethicon Endo-Surgery, Inc. | Motor-driven surgical cutting and fastening instrument with mechanical closure system |
US8708213B2 (en) | 2006-01-31 | 2014-04-29 | Ethicon Endo-Surgery, Inc. | Surgical instrument having a feedback system |
US11793518B2 (en) | 2006-01-31 | 2023-10-24 | Cilag Gmbh International | Powered surgical instruments with firing system lockout arrangements |
US7845537B2 (en) | 2006-01-31 | 2010-12-07 | Ethicon Endo-Surgery, Inc. | Surgical instrument having recording capabilities |
KR100793752B1 (ko) * | 2006-05-02 | 2008-01-10 | 엘지전자 주식회사 | 녹화물의 부분 편집 기능을 구비한 영상기기 및 그제어방법 |
JP2007329833A (ja) | 2006-06-09 | 2007-12-20 | Sony Corp | 情報処理システム、記録再生装置、再生端末、情報処理方法、およびプログラム |
JP2008065964A (ja) * | 2006-09-11 | 2008-03-21 | Sony Corp | 情報処理装置、情報処理方法、およびプログラム |
US10568652B2 (en) | 2006-09-29 | 2020-02-25 | Ethicon Llc | Surgical staples having attached drivers of different heights and stapling instruments for deploying the same |
KR100911145B1 (ko) * | 2006-10-02 | 2009-08-06 | 삼성전자주식회사 | 단말기 및 이를 위한 디스플레이 방법 |
KR101265626B1 (ko) | 2006-10-10 | 2013-05-22 | 엘지전자 주식회사 | 화면 분할 탐색 기능을 구비한 영상기기 및 그 제어방법 |
KR101282802B1 (ko) * | 2006-11-17 | 2013-07-05 | 삼성전자주식회사 | 통합재생시 현재재생지점 안내방법 및 이를 적용한영상기기 |
JP4759503B2 (ja) * | 2006-12-20 | 2011-08-31 | キヤノン株式会社 | 画像処理装置、画像処理装置の制御方法、プログラム |
US8684253B2 (en) | 2007-01-10 | 2014-04-01 | Ethicon Endo-Surgery, Inc. | Surgical instrument with wireless communication between a control unit of a robotic system and remote sensor |
US8540128B2 (en) | 2007-01-11 | 2013-09-24 | Ethicon Endo-Surgery, Inc. | Surgical stapling device with a curved end effector |
US8931682B2 (en) | 2007-06-04 | 2015-01-13 | Ethicon Endo-Surgery, Inc. | Robotically-controlled shaft based rotary drive systems for surgical instruments |
US11564682B2 (en) | 2007-06-04 | 2023-01-31 | Cilag Gmbh International | Surgical stapler device |
US11849941B2 (en) | 2007-06-29 | 2023-12-26 | Cilag Gmbh International | Staple cartridge having staple cavities extending at a transverse angle relative to a longitudinal cartridge axis |
KR101396998B1 (ko) | 2007-08-29 | 2014-05-20 | 엘지전자 주식회사 | 영상기기 및 이 영상기기에서 녹화물을 디스플레이하는방법 |
US20090063981A1 (en) * | 2007-09-03 | 2009-03-05 | Canon Kabushiki Kaisha | Display control apparatus and control method thereof, program, and recording medium |
JP5188124B2 (ja) * | 2007-09-03 | 2013-04-24 | キヤノン株式会社 | 表示制御装置及びその制御方法、プログラム、記録媒体 |
JP4933386B2 (ja) * | 2007-09-03 | 2012-05-16 | キヤノン株式会社 | 表示制御装置及びその制御方法、プログラム |
JP4663698B2 (ja) * | 2007-09-13 | 2011-04-06 | オリンパス株式会社 | 画像表示装置、画像表示方法および画像表示プログラム |
JP4590444B2 (ja) * | 2007-09-13 | 2010-12-01 | オリンパス株式会社 | 画像表示装置、画像表示方法および画像表示プログラム |
JP4959498B2 (ja) * | 2007-09-27 | 2012-06-20 | 富士フイルム株式会社 | 画像表示装置、画像表示方法、プログラム及び撮影装置 |
KR101407636B1 (ko) | 2007-11-05 | 2014-06-16 | 삼성전자주식회사 | 영상 표시 장치 및 그 제어 방법 |
USD610160S1 (en) * | 2008-01-09 | 2010-02-16 | Apple Inc. | Graphical user interface for a display screen or portion thereof |
RU2493788C2 (ru) | 2008-02-14 | 2013-09-27 | Этикон Эндо-Серджери, Инк. | Хирургический режущий и крепежный инструмент, имеющий радиочастотные электроды |
JP5037425B2 (ja) * | 2008-05-14 | 2012-09-26 | 東芝Itコントロールシステム株式会社 | 高速度撮影装置 |
CN103402070B (zh) | 2008-05-19 | 2017-07-07 | 日立麦克赛尔株式会社 | 记录再现装置及方法 |
JP4934105B2 (ja) | 2008-06-09 | 2012-05-16 | ソニー株式会社 | 信号処理装置、マーク付与方法、プログラム |
JP4788739B2 (ja) | 2008-06-09 | 2011-10-05 | ソニー株式会社 | 端末装置、情報送信方法 |
US8210411B2 (en) | 2008-09-23 | 2012-07-03 | Ethicon Endo-Surgery, Inc. | Motor-driven surgical cutting instrument |
US9005230B2 (en) | 2008-09-23 | 2015-04-14 | Ethicon Endo-Surgery, Inc. | Motorized surgical instrument |
US9386983B2 (en) | 2008-09-23 | 2016-07-12 | Ethicon Endo-Surgery, Llc | Robotically-controlled motorized surgical instrument |
US11648005B2 (en) | 2008-09-23 | 2023-05-16 | Cilag Gmbh International | Robotically-controlled motorized surgical instrument with an end effector |
US8608045B2 (en) | 2008-10-10 | 2013-12-17 | Ethicon Endo-Sugery, Inc. | Powered surgical cutting and stapling apparatus with manually retractable firing system |
US20110113315A1 (en) * | 2008-12-31 | 2011-05-12 | Microsoft Corporation | Computer-assisted rich interactive narrative (rin) generation |
US9092437B2 (en) * | 2008-12-31 | 2015-07-28 | Microsoft Technology Licensing, Llc | Experience streams for rich interactive narratives |
US20110113316A1 (en) * | 2008-12-31 | 2011-05-12 | Microsoft Corporation | Authoring tools for rich interactive narratives |
US8046691B2 (en) * | 2008-12-31 | 2011-10-25 | Microsoft Corporation | Generalized interactive narratives |
US20110119587A1 (en) * | 2008-12-31 | 2011-05-19 | Microsoft Corporation | Data model and player platform for rich interactive narratives |
US8909683B1 (en) | 2009-07-17 | 2014-12-09 | Open Invention Network, Llc | Method and system for communicating with internet resources to identify and supply content for webpage construction |
US20110157322A1 (en) | 2009-12-31 | 2011-06-30 | Broadcom Corporation | Controlling a pixel array to support an adaptable light manipulator |
US9247286B2 (en) | 2009-12-31 | 2016-01-26 | Broadcom Corporation | Frame formatting supporting mixed two and three dimensional video data communication |
JP5524653B2 (ja) * | 2010-02-26 | 2014-06-18 | キヤノン株式会社 | 表示制御装置及びその制御方法 |
JP2011205217A (ja) * | 2010-03-24 | 2011-10-13 | Sony Corp | 情報処理装置、情報処理方法、プログラム |
US9645996B1 (en) * | 2010-03-25 | 2017-05-09 | Open Invention Network Llc | Method and device for automatically generating a tag from a conversation in a social networking website |
JP2011211609A (ja) * | 2010-03-30 | 2011-10-20 | Nec Personal Products Co Ltd | 表示装置、方法及びプログラム |
JP4983961B2 (ja) | 2010-05-25 | 2012-07-25 | 株式会社ニコン | 撮像装置 |
USD667020S1 (en) * | 2010-09-24 | 2012-09-11 | Research In Motion Limited | Display screen with graphical user interface |
US9861361B2 (en) | 2010-09-30 | 2018-01-09 | Ethicon Llc | Releasable tissue thickness compensator and fastener cartridge having the same |
US10945731B2 (en) | 2010-09-30 | 2021-03-16 | Ethicon Llc | Tissue thickness compensator comprising controlled release and expansion |
US11812965B2 (en) | 2010-09-30 | 2023-11-14 | Cilag Gmbh International | Layer of material for a surgical end effector |
US9320523B2 (en) | 2012-03-28 | 2016-04-26 | Ethicon Endo-Surgery, Llc | Tissue thickness compensator comprising tissue ingrowth features |
US11925354B2 (en) | 2010-09-30 | 2024-03-12 | Cilag Gmbh International | Staple cartridge comprising staples positioned within a compressible portion thereof |
US9629814B2 (en) | 2010-09-30 | 2017-04-25 | Ethicon Endo-Surgery, Llc | Tissue thickness compensator configured to redistribute compressive forces |
JP5678576B2 (ja) * | 2010-10-27 | 2015-03-04 | ソニー株式会社 | 情報処理装置、情報処理方法、プログラム、および監視システム |
JP5754119B2 (ja) * | 2010-12-07 | 2015-07-29 | ソニー株式会社 | 情報処理装置、情報処理方法及びプログラム |
KR101758273B1 (ko) * | 2011-01-17 | 2017-07-14 | 엘지전자 주식회사 | 이동단말기 및 그 제어방법 |
USD669495S1 (en) * | 2011-02-03 | 2012-10-23 | Microsoft Corporation | Display screen with graphical user interface |
USD669489S1 (en) * | 2011-02-03 | 2012-10-23 | Microsoft Corporation | Display screen with graphical user interface |
USD669494S1 (en) * | 2011-02-03 | 2012-10-23 | Microsoft Corporation | Display screen with graphical user interface |
USD687841S1 (en) | 2011-02-03 | 2013-08-13 | Microsoft Corporation | Display screen with transitional graphical user interface |
USD693361S1 (en) | 2011-02-03 | 2013-11-12 | Microsoft Corporation | Display screen with transitional graphical user interface |
USD692913S1 (en) | 2011-02-03 | 2013-11-05 | Microsoft Corporation | Display screen with graphical user interface |
USD669488S1 (en) * | 2011-02-03 | 2012-10-23 | Microsoft Corporation | Display screen with graphical user interface |
USD669490S1 (en) * | 2011-02-03 | 2012-10-23 | Microsoft Corporation | Display screen with graphical user interface |
USD669493S1 (en) * | 2011-02-03 | 2012-10-23 | Microsoft Corporation | Display screen with graphical user interface |
USD673169S1 (en) * | 2011-02-03 | 2012-12-25 | Microsoft Corporation | Display screen with transitional graphical user interface |
USD669492S1 (en) * | 2011-02-03 | 2012-10-23 | Microsoft Corporation | Display screen with graphical user interface |
USD669491S1 (en) * | 2011-02-03 | 2012-10-23 | Microsoft Corporation | Display screen with graphical user interface |
RU2606493C2 (ru) | 2011-04-29 | 2017-01-10 | Этикон Эндо-Серджери, Инк. | Кассета со скобками, содержащая скобки, расположенные внутри ее сжимаемой части |
KR101797041B1 (ko) | 2012-01-17 | 2017-12-13 | 삼성전자주식회사 | 디지털 영상 처리장치 및 그 제어방법 |
JP5786736B2 (ja) * | 2012-01-31 | 2015-09-30 | Nkワークス株式会社 | 画像再生プログラムおよび画像再生装置 |
USD716825S1 (en) * | 2012-03-06 | 2014-11-04 | Apple Inc. | Display screen or portion thereof with graphical user interface |
CN104321024B (zh) | 2012-03-28 | 2017-05-24 | 伊西康内外科公司 | 包括多个层的组织厚度补偿件 |
CN104334098B (zh) | 2012-03-28 | 2017-03-22 | 伊西康内外科公司 | 包括限定低压强环境的胶囊剂的组织厚度补偿件 |
US9101358B2 (en) | 2012-06-15 | 2015-08-11 | Ethicon Endo-Surgery, Inc. | Articulatable surgical instrument comprising a firing drive |
US20140001231A1 (en) | 2012-06-28 | 2014-01-02 | Ethicon Endo-Surgery, Inc. | Firing system lockout arrangements for surgical instruments |
US9289256B2 (en) | 2012-06-28 | 2016-03-22 | Ethicon Endo-Surgery, Llc | Surgical end effectors having angled tissue-contacting surfaces |
JP2014107641A (ja) * | 2012-11-26 | 2014-06-09 | Sony Corp | 情報処理装置および方法、並びにプログラム |
USD845978S1 (en) * | 2013-01-23 | 2019-04-16 | Yandex Europe Ag | Display screen with graphical user interface |
RU2672520C2 (ru) | 2013-03-01 | 2018-11-15 | Этикон Эндо-Серджери, Инк. | Шарнирно поворачиваемые хирургические инструменты с проводящими путями для передачи сигналов |
USD745542S1 (en) * | 2013-03-13 | 2015-12-15 | Samsung Electronics Co., Ltd. | Display screen or a portion thereof with a graphical user interface |
BR112015026109B1 (pt) | 2013-04-16 | 2022-02-22 | Ethicon Endo-Surgery, Inc | Instrumento cirúrgico |
CN104215336A (zh) * | 2013-05-29 | 2014-12-17 | 杭州美盛红外光电技术有限公司 | 热像装置、分析装置及热像拍摄方法和分析方法 |
USD757739S1 (en) * | 2013-06-05 | 2016-05-31 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphic user interface |
AU352884S (en) * | 2013-06-05 | 2013-12-11 | Samsung Electronics Co Ltd | Display screen with graphical user interface |
USD760254S1 (en) * | 2013-06-05 | 2016-06-28 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphic user interface |
AU353110S (en) * | 2013-06-05 | 2013-12-24 | Samsung Electronics Co Ltd | Display screen with graphical user interface |
USD741353S1 (en) | 2013-06-10 | 2015-10-20 | Apple Inc. | Display screen or portion thereof with animated graphical user interface |
USD766914S1 (en) * | 2013-08-16 | 2016-09-20 | Yandex Europe Ag | Display screen with graphical user interface having an image search engine results page |
US9510828B2 (en) | 2013-08-23 | 2016-12-06 | Ethicon Endo-Surgery, Llc | Conductor arrangements for electrically powered surgical instruments with rotatable end effectors |
BR112016023807B1 (pt) | 2014-04-16 | 2022-07-12 | Ethicon Endo-Surgery, Llc | Conjunto de cartucho de prendedores para uso com um instrumento cirúrgico |
CN106456176B (zh) | 2014-04-16 | 2019-06-28 | 伊西康内外科有限责任公司 | 包括具有不同构型的延伸部的紧固件仓 |
US20150297223A1 (en) | 2014-04-16 | 2015-10-22 | Ethicon Endo-Surgery, Inc. | Fastener cartridges including extensions having different configurations |
US9788836B2 (en) | 2014-09-05 | 2017-10-17 | Ethicon Llc | Multiple motor control for powered medical device |
BR112017004361B1 (pt) | 2014-09-05 | 2023-04-11 | Ethicon Llc | Sistema eletrônico para um instrumento cirúrgico |
USD810763S1 (en) * | 2014-09-23 | 2018-02-20 | Beijing Eryiju Technology Co., Ltd | Display screen with graphical user interface |
USD771099S1 (en) * | 2015-03-20 | 2016-11-08 | Beijing Eryiju Technology Co., Ltd | Display screen with graphical user interface |
USD799510S1 (en) * | 2015-03-20 | 2017-10-10 | Beijing Eryiju Technology Co., Ltd | Display screen with graphical user interface |
US9924944B2 (en) | 2014-10-16 | 2018-03-27 | Ethicon Llc | Staple cartridge comprising an adjunct material |
US10517594B2 (en) | 2014-10-29 | 2019-12-31 | Ethicon Llc | Cartridge assemblies for surgical staplers |
BR112017012996B1 (pt) | 2014-12-18 | 2022-11-08 | Ethicon Llc | Instrumento cirúrgico com uma bigorna que é seletivamente móvel sobre um eixo geométrico imóvel distinto em relação a um cartucho de grampos |
US10085748B2 (en) | 2014-12-18 | 2018-10-02 | Ethicon Llc | Locking arrangements for detachable shaft assemblies with articulatable surgical end effectors |
US11154301B2 (en) | 2015-02-27 | 2021-10-26 | Cilag Gmbh International | Modular stapling assembly |
US10441279B2 (en) | 2015-03-06 | 2019-10-15 | Ethicon Llc | Multiple level thresholds to modify operation of powered surgical instruments |
USD760265S1 (en) * | 2015-03-20 | 2016-06-28 | Beijing Eryiju Technology Co., Ltd | Display screen with graphical user interface |
US10213201B2 (en) | 2015-03-31 | 2019-02-26 | Ethicon Llc | Stapling end effector configured to compensate for an uneven gap between a first jaw and a second jaw |
US10105139B2 (en) | 2015-09-23 | 2018-10-23 | Ethicon Llc | Surgical stapler having downstream current-based motor control |
US20170086829A1 (en) | 2015-09-30 | 2017-03-30 | Ethicon Endo-Surgery, Llc | Compressible adjunct with intermediate supporting structures |
US11890015B2 (en) | 2015-09-30 | 2024-02-06 | Cilag Gmbh International | Compressible adjunct with crossing spacer fibers |
US10292704B2 (en) | 2015-12-30 | 2019-05-21 | Ethicon Llc | Mechanisms for compensating for battery pack failure in powered surgical instruments |
US11213293B2 (en) | 2016-02-09 | 2022-01-04 | Cilag Gmbh International | Articulatable surgical instruments with single articulation link arrangements |
US10448948B2 (en) | 2016-02-12 | 2019-10-22 | Ethicon Llc | Mechanisms for compensating for drivetrain failure in powered surgical instruments |
US10357247B2 (en) | 2016-04-15 | 2019-07-23 | Ethicon Llc | Surgical instrument with multiple program responses during a firing motion |
USD791159S1 (en) | 2016-04-18 | 2017-07-04 | Apple Inc. | Display screen or portion thereof with graphical user interface |
US20170296173A1 (en) | 2016-04-18 | 2017-10-19 | Ethicon Endo-Surgery, Llc | Method for operating a surgical instrument |
USD816117S1 (en) | 2016-06-13 | 2018-04-24 | Apple Inc. | Display screen or portion thereof with icon |
US10537325B2 (en) | 2016-12-21 | 2020-01-21 | Ethicon Llc | Staple forming pocket arrangement to accommodate different types of staples |
JP7010956B2 (ja) | 2016-12-21 | 2022-01-26 | エシコン エルエルシー | 組織をステープル留めする方法 |
US10758230B2 (en) | 2016-12-21 | 2020-09-01 | Ethicon Llc | Surgical instrument with primary and safety processors |
JP7034729B2 (ja) * | 2017-02-10 | 2022-03-14 | キヤノン株式会社 | 表示制御装置、その制御方法、および制御プログラム |
US10881399B2 (en) | 2017-06-20 | 2021-01-05 | Ethicon Llc | Techniques for adaptive control of motor velocity of a surgical stapling and cutting instrument |
US10307170B2 (en) | 2017-06-20 | 2019-06-04 | Ethicon Llc | Method for closed loop control of motor velocity of a surgical stapling and cutting instrument |
US10779820B2 (en) | 2017-06-20 | 2020-09-22 | Ethicon Llc | Systems and methods for controlling motor speed according to user input for a surgical instrument |
USD906355S1 (en) * | 2017-06-28 | 2020-12-29 | Ethicon Llc | Display screen or portion thereof with a graphical user interface for a surgical instrument |
US11389161B2 (en) | 2017-06-28 | 2022-07-19 | Cilag Gmbh International | Surgical instrument comprising selectively actuatable rotatable couplers |
US10932772B2 (en) | 2017-06-29 | 2021-03-02 | Ethicon Llc | Methods for closed loop velocity control for robotic surgical instrument |
US11944300B2 (en) | 2017-08-03 | 2024-04-02 | Cilag Gmbh International | Method for operating a surgical system bailout |
JP6958226B2 (ja) * | 2017-10-23 | 2021-11-02 | 富士フイルムビジネスイノベーション株式会社 | 情報処理装置及びプログラム |
JP6956380B2 (ja) * | 2017-11-09 | 2021-11-02 | パナソニックIpマネジメント株式会社 | 情報処理装置 |
USD905094S1 (en) * | 2017-11-15 | 2020-12-15 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with transitional graphical user interface |
USD905095S1 (en) * | 2017-11-15 | 2020-12-15 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with transitional graphical user interface |
USD918953S1 (en) * | 2017-11-15 | 2021-05-11 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with transitional graphical user interface |
US10779826B2 (en) | 2017-12-15 | 2020-09-22 | Ethicon Llc | Methods of operating surgical end effectors |
US20190192147A1 (en) | 2017-12-21 | 2019-06-27 | Ethicon Llc | Surgical instrument comprising an articulatable distal head |
USD906364S1 (en) | 2018-02-13 | 2020-12-29 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with transitional graphical user interface |
USD886836S1 (en) * | 2018-05-01 | 2020-06-09 | Magic Leap, Inc. | Display panel or portion thereof with graphical user interface |
USD877174S1 (en) | 2018-06-03 | 2020-03-03 | Apple Inc. | Electronic device with graphical user interface |
US11207065B2 (en) | 2018-08-20 | 2021-12-28 | Cilag Gmbh International | Method for fabricating surgical stapler anvils |
US11696761B2 (en) | 2019-03-25 | 2023-07-11 | Cilag Gmbh International | Firing drive arrangements for surgical systems |
US11903581B2 (en) | 2019-04-30 | 2024-02-20 | Cilag Gmbh International | Methods for stapling tissue using a surgical instrument |
US11241235B2 (en) | 2019-06-28 | 2022-02-08 | Cilag Gmbh International | Method of using multiple RFID chips with a surgical assembly |
US11684434B2 (en) | 2019-06-28 | 2023-06-27 | Cilag Gmbh International | Surgical RFID assemblies for instrument operational setting control |
US11771419B2 (en) | 2019-06-28 | 2023-10-03 | Cilag Gmbh International | Packaging for a replaceable component of a surgical stapling system |
US11660163B2 (en) | 2019-06-28 | 2023-05-30 | Cilag Gmbh International | Surgical system with RFID tags for updating motor assembly parameters |
US11701111B2 (en) | 2019-12-19 | 2023-07-18 | Cilag Gmbh International | Method for operating a surgical stapling instrument |
US11844520B2 (en) | 2019-12-19 | 2023-12-19 | Cilag Gmbh International | Staple cartridge comprising driver retention members |
USD997953S1 (en) | 2020-04-17 | 2023-09-05 | Magic Leap, Inc. | Display panel with a graphical user interface |
US11737748B2 (en) | 2020-07-28 | 2023-08-29 | Cilag Gmbh International | Surgical instruments with double spherical articulation joints with pivotable links |
US11896217B2 (en) | 2020-10-29 | 2024-02-13 | Cilag Gmbh International | Surgical instrument comprising an articulation lock |
US11844518B2 (en) | 2020-10-29 | 2023-12-19 | Cilag Gmbh International | Method for operating a surgical instrument |
US11931025B2 (en) | 2020-10-29 | 2024-03-19 | Cilag Gmbh International | Surgical instrument comprising a releasable closure drive lock |
USD1013170S1 (en) | 2020-10-29 | 2024-01-30 | Cilag Gmbh International | Surgical instrument assembly |
US11779330B2 (en) | 2020-10-29 | 2023-10-10 | Cilag Gmbh International | Surgical instrument comprising a jaw alignment system |
US11944296B2 (en) | 2020-12-02 | 2024-04-02 | Cilag Gmbh International | Powered surgical instruments with external connectors |
US11744581B2 (en) | 2020-12-02 | 2023-09-05 | Cilag Gmbh International | Powered surgical instruments with multi-phase tissue treatment |
US11890010B2 (en) | 2020-12-02 | 2024-02-06 | Cllag GmbH International | Dual-sided reinforced reload for surgical instruments |
US11653920B2 (en) | 2020-12-02 | 2023-05-23 | Cilag Gmbh International | Powered surgical instruments with communication interfaces through sterile barrier |
US11849943B2 (en) | 2020-12-02 | 2023-12-26 | Cilag Gmbh International | Surgical instrument with cartridge release mechanisms |
US11653915B2 (en) | 2020-12-02 | 2023-05-23 | Cilag Gmbh International | Surgical instruments with sled location detection and adjustment features |
US11737751B2 (en) | 2020-12-02 | 2023-08-29 | Cilag Gmbh International | Devices and methods of managing energy dissipated within sterile barriers of surgical instrument housings |
US11744583B2 (en) | 2021-02-26 | 2023-09-05 | Cilag Gmbh International | Distal communication array to tune frequency of RF systems |
US11749877B2 (en) | 2021-02-26 | 2023-09-05 | Cilag Gmbh International | Stapling instrument comprising a signal antenna |
US11730473B2 (en) | 2021-02-26 | 2023-08-22 | Cilag Gmbh International | Monitoring of manufacturing life-cycle |
US11701113B2 (en) | 2021-02-26 | 2023-07-18 | Cilag Gmbh International | Stapling instrument comprising a separate power antenna and a data transfer antenna |
US11723657B2 (en) | 2021-02-26 | 2023-08-15 | Cilag Gmbh International | Adjustable communication based on available bandwidth and power capacity |
US11812964B2 (en) | 2021-02-26 | 2023-11-14 | Cilag Gmbh International | Staple cartridge comprising a power management circuit |
US11925349B2 (en) | 2021-02-26 | 2024-03-12 | Cilag Gmbh International | Adjustment to transfer parameters to improve available power |
US11793514B2 (en) | 2021-02-26 | 2023-10-24 | Cilag Gmbh International | Staple cartridge comprising sensor array which may be embedded in cartridge body |
US11950777B2 (en) | 2021-02-26 | 2024-04-09 | Cilag Gmbh International | Staple cartridge comprising an information access control system |
US11751869B2 (en) | 2021-02-26 | 2023-09-12 | Cilag Gmbh International | Monitoring of multiple sensors over time to detect moving characteristics of tissue |
US11696757B2 (en) | 2021-02-26 | 2023-07-11 | Cilag Gmbh International | Monitoring of internal systems to detect and track cartridge motion status |
US11759202B2 (en) | 2021-03-22 | 2023-09-19 | Cilag Gmbh International | Staple cartridge comprising an implantable layer |
US11737749B2 (en) | 2021-03-22 | 2023-08-29 | Cilag Gmbh International | Surgical stapling instrument comprising a retraction system |
US11826042B2 (en) | 2021-03-22 | 2023-11-28 | Cilag Gmbh International | Surgical instrument comprising a firing drive including a selectable leverage mechanism |
US11723658B2 (en) | 2021-03-22 | 2023-08-15 | Cilag Gmbh International | Staple cartridge comprising a firing lockout |
US11826012B2 (en) | 2021-03-22 | 2023-11-28 | Cilag Gmbh International | Stapling instrument comprising a pulsed motor-driven firing rack |
US11806011B2 (en) | 2021-03-22 | 2023-11-07 | Cilag Gmbh International | Stapling instrument comprising tissue compression systems |
US11717291B2 (en) | 2021-03-22 | 2023-08-08 | Cilag Gmbh International | Staple cartridge comprising staples configured to apply different tissue compression |
US11849945B2 (en) | 2021-03-24 | 2023-12-26 | Cilag Gmbh International | Rotary-driven surgical stapling assembly comprising eccentrically driven firing member |
US11832816B2 (en) | 2021-03-24 | 2023-12-05 | Cilag Gmbh International | Surgical stapling assembly comprising nonplanar staples and planar staples |
US11793516B2 (en) | 2021-03-24 | 2023-10-24 | Cilag Gmbh International | Surgical staple cartridge comprising longitudinal support beam |
US11896218B2 (en) | 2021-03-24 | 2024-02-13 | Cilag Gmbh International | Method of using a powered stapling device |
US11744603B2 (en) | 2021-03-24 | 2023-09-05 | Cilag Gmbh International | Multi-axis pivot joints for surgical instruments and methods for manufacturing same |
US11786243B2 (en) | 2021-03-24 | 2023-10-17 | Cilag Gmbh International | Firing members having flexible portions for adapting to a load during a surgical firing stroke |
US11903582B2 (en) | 2021-03-24 | 2024-02-20 | Cilag Gmbh International | Leveraging surfaces for cartridge installation |
US11857183B2 (en) | 2021-03-24 | 2024-01-02 | Cilag Gmbh International | Stapling assembly components having metal substrates and plastic bodies |
US11849944B2 (en) | 2021-03-24 | 2023-12-26 | Cilag Gmbh International | Drivers for fastener cartridge assemblies having rotary drive screws |
US11896219B2 (en) | 2021-03-24 | 2024-02-13 | Cilag Gmbh International | Mating features between drivers and underside of a cartridge deck |
US11786239B2 (en) | 2021-03-24 | 2023-10-17 | Cilag Gmbh International | Surgical instrument articulation joint arrangements comprising multiple moving linkage features |
CN113221792B (zh) * | 2021-05-21 | 2022-09-27 | 北京声智科技有限公司 | 一种章节检测模型构建方法、编目方法及其相关设备 |
US11826047B2 (en) | 2021-05-28 | 2023-11-28 | Cilag Gmbh International | Stapling instrument comprising jaw mounts |
US11937816B2 (en) | 2021-10-28 | 2024-03-26 | Cilag Gmbh International | Electrical lead arrangements for surgical instruments |
CN114697762B (zh) * | 2022-04-07 | 2023-11-28 | 脸萌有限公司 | 一种处理方法、装置、终端设备及介质 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06302161A (ja) * | 1993-04-14 | 1994-10-28 | Sony Corp | 編集装置 |
JPH1051734A (ja) * | 1996-04-12 | 1998-02-20 | Hitachi Denshi Ltd | 動画像編集装置および動画像編集方法 |
JPH10285523A (ja) * | 1997-04-06 | 1998-10-23 | Sony Corp | 画像表示装置及び方法 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1984002606A1 (en) | 1982-12-22 | 1984-07-05 | Video Composition Corp | Video composition method and apparatus |
JPH04330881A (ja) | 1991-05-02 | 1992-11-18 | Fuji Xerox Co Ltd | 画像検索方法 |
JP3420257B2 (ja) | 1992-03-12 | 2003-06-23 | キヤノン株式会社 | 再生装置 |
DE69424896T2 (de) * | 1993-04-13 | 2000-12-14 | Sony Corp | Editiergerät |
DE69518610T2 (de) * | 1994-06-24 | 2001-01-11 | Microsoft Corp | Methode und System zum Durchblättern von Daten |
JP3454396B2 (ja) * | 1995-10-11 | 2003-10-06 | 株式会社日立製作所 | 動画像の変化点検出制御方法とそれに基づく再生停止制御方法およびそれらを用いた動画像の編集システム |
JP3198980B2 (ja) * | 1996-10-22 | 2001-08-13 | 松下電器産業株式会社 | 画像表示装置及び動画像検索システム |
EP0843311A3 (en) | 1996-11-15 | 2000-01-26 | Hitachi Denshi Kabushiki Kaisha | Method for editing image information with aid of computer and editing system |
JP4207099B2 (ja) | 1998-09-29 | 2009-01-14 | ソニー株式会社 | 画像編集装置及びその方法 |
US6670966B1 (en) * | 1998-11-10 | 2003-12-30 | Sony Corporation | Edit data creating device and edit data creating method |
JP3569800B2 (ja) * | 1998-12-24 | 2004-09-29 | カシオ計算機株式会社 | 画像処理装置及び画像処理方法 |
JP4227241B2 (ja) * | 1999-04-13 | 2009-02-18 | キヤノン株式会社 | 画像処理装置及び方法 |
2002
- 2002-07-12 JP JP2002203479A patent/JP3738843B2/ja not_active Expired - Fee Related
2003
- 2003-06-09 TW TW092115577A patent/TWI233302B/zh not_active IP Right Cessation
- 2003-06-11 CN CNB038009226A patent/CN1294750C/zh not_active Expired - Fee Related
- 2003-06-11 WO PCT/JP2003/007409 patent/WO2003105474A1/ja active Application Filing
- 2003-06-11 US US10/486,609 patent/US7496211B2/en not_active Expired - Fee Related
- 2003-06-11 EP EP03733361A patent/EP1515552A4/en not_active Withdrawn
- 2003-06-11 KR KR10-2004-7001831A patent/KR20050009270A/ko not_active Application Discontinuation
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06302161A (ja) * | 1993-04-14 | 1994-10-28 | Sony Corp | Editing apparatus
JPH1051734A (ja) * | 1996-04-12 | 1998-02-20 | Hitachi Denshi Ltd | Moving image editing apparatus and moving image editing method
JPH10285523A (ja) * | 1997-04-06 | 1998-10-23 | Sony Corp | Image display apparatus and method
Non-Patent Citations (1)
Title |
---|
See also references of EP1515552A4 * |
Also Published As
Publication number | Publication date |
---|---|
TW200402993A (en) | 2004-02-16 |
TWI233302B (en) | 2005-05-21 |
KR20050009270A (ko) | 2005-01-24 |
US20050044489A1 (en) | 2005-02-24 |
CN1294750C (zh) | 2007-01-10 |
EP1515552A4 (en) | 2006-05-10 |
JP2004072132A (ja) | 2004-03-04 |
CN1547848A (zh) | 2004-11-17 |
JP3738843B2 (ja) | 2006-01-25 |
US7496211B2 (en) | 2009-02-24 |
EP1515552A1 (en) | 2005-03-16 |
Similar Documents
Publication | Title |
---|---|
JP3738843B2 (ja) | Image detection device, image detection method, and image detection program |
TWI472228B (zh) | Information processing device, imaging device, image display control method, and computer program |
JP5092469B2 (ja) | Imaging apparatus, image processing apparatus, image display control method, and computer program |
US20090249208A1 (en) | Method and device for reproducing images | |
US7715692B2 (en) | Still picture information recording medium and method and apparatus for reproducing still picture information therefrom | |
KR20100091990A (ko) | Electronic apparatus, reproducing method, and program |
JPWO2006121049A1 (ja) | Data processing device |
US20080092048A1 (en) | Data Processor | |
EP2008450B1 (en) | Apparatus and method for displaying recordings | |
JP4191082B2 (ja) | Image detection device, image detection method, and image detection program |
US20060245722A1 (en) | Recording/reproducing apparatus | |
JP2007129368A (ja) | Information recording apparatus and method |
JP4288514B2 (ja) | Image processing device, image processing method, and image processing program |
JP2003037796A (ja) | Information recording/reproducing apparatus |
KR101396964B1 (ko) | Method and apparatus for reproducing recordings |
JP2005235249A (ja) | Content recording apparatus |
KR20050073011A (ko) | Digital broadcast receiver and method of searching thumbnails in the receiver |
JP2007068042A (ja) | Receiving apparatus and method |
JP2007043401A (ja) | Information recording/reproducing apparatus |
JP2003309794A (ja) | Signal recording/reproducing apparatus and information display method |
JP2005303616A (ja) | Digital broadcast recording/reproducing apparatus |
EP2432220A2 (en) | Stream file management apparatus and a method of the same | |
JP2005065039A (ja) | Recording/reproducing apparatus |
JP2006345249A (ja) | Video recording/reproducing apparatus |
KR20100028289A (ko) | Apparatus and method for displaying image information on an image display device |
Legal Events
Code | Title | Description |
---|---|---|
AK | Designated states | Kind code of ref document: A1; Designated state(s): CN KR US |
AL | Designated countries for regional patents | Kind code of ref document: A1; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR |
WWE | Wipo information: entry into national phase | Ref document number: 2003733361; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 1020047001831; Country of ref document: KR |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
WWE | Wipo information: entry into national phase | Ref document number: 20038009226; Country of ref document: CN |
WWE | Wipo information: entry into national phase | Ref document number: 10486609; Country of ref document: US |
WWP | Wipo information: published in national office | Ref document number: 2003733361; Country of ref document: EP |