WO2002058400A2 - Client-assisted motion estimation for client-server video communication system - Google Patents
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/25—Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
- H04N21/258—Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
- H04N21/25808—Management of client data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/503—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
- H04N19/51—Motion estimation or motion compensation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/63—Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
- H04N21/637—Control signals issued by the client directed to the server or network components
- H04N21/6377—Control signals issued by the client directed to the server or network components directed to server
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/65—Transmission of management data between client and server
- H04N21/658—Transmission by the client directed to the server
Definitions
- the present invention relates generally to systems and methods for client-server communication, and more particularly to client-assisted motion estimation for client-server video communication.
- a typical network involves a software application running on a client computer which retrieves information and accesses resources managed by a server computer. Resources available at the client and server vary depending upon the network and its
- the client may be a laptop, handheld PC (e.g. WinCE®
- Standard protocols involve client requests for data from the server, which the server then retrieves and/or generates, and which the client then processes and displays.
- the server may perform some pre-processing which assists the client, such as streamlining web pages, while the client performs final processing and rendering of the data (e.g. the client takes streamlined HTTP data and creates a web page). While processing and bandwidth requirements vary, high-bandwidth information transfers over computer networks are becoming more and more common, especially with the growth of e-commerce.
- FIG. 1 is a dataflow diagram of a conventional video compression system 100.
- a client 102, displaying a current frame, sends a request for a new frame over line 104 to a server 106.
- a retrieve material module 108 within the server 106 retrieves the new frame and sends it to a motion estimation (ME) module 110 to determine if blocks in the new frame may be accurately predicted using blocks contained within the current frame (such predicted blocks are a.k.a. "P-blocks").
- blocks in the new frame which cannot be accurately predicted are coded as intra-frame blocks (a.k.a. "I-blocks") without prediction.
- the ME module 110 generates one or more motion vectors (MVs) for predicting motion of blocks in the new frame with reference to previous positions of the blocks in the current frame.
- the ME module 110 conventionally computes these MVs using a brute-force method which compares each block in the new frame with a corresponding block in the current frame.
- a prediction error (PE) is then computed for each MV.
- the ME module 110 contains, by far, the most complex and computationally intensive algorithms used in the system 100, and thus consumes a substantial amount of the server's 106 computational time.
- the encode module 112 within the server 106 receives the MVs and PEs from the ME module 110.
- the encode module 112 determines whether each block should be coded as a P or I block before coding the blocks into a compressed bit-stream.
- the compressed bit-stream is then transmitted to the client 102 on line 114.
- a decoder 116 within the client 102 conventionally decodes the bit-stream into the new frame to be displayed by the client computer 102.
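The brute-force comparison described above can be sketched as follows. This is a hypothetical illustration only: the block size, search range, and the sum-of-absolute-differences (SAD) metric are assumptions, since the text does not fix them.

```python
# Hypothetical sketch of the brute-force block matching performed by a
# conventional ME module: for each block of the new frame, search a small
# window in the current frame for the best-matching block (minimum SAD).

def sad(block_a, block_b):
    """Sum of absolute differences between two equal-sized pixel blocks."""
    return sum(abs(a - b) for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def brute_force_mv(new_frame, cur_frame, bx, by, bsize=4, search=2):
    """Return (dx, dy, error): the displacement from the block at (bx, by)
    in the new frame to its best match in the current frame."""
    def block(frame, x, y):
        return [row[x:x + bsize] for row in frame[y:y + bsize]]

    target = block(new_frame, bx, by)
    h, w = len(cur_frame), len(cur_frame[0])
    best = (0, 0, float("inf"))
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            x, y = bx + dx, by + dy
            if 0 <= x <= w - bsize and 0 <= y <= h - bsize:
                err = sad(target, block(cur_frame, x, y))
                if err < best[2]:
                    best = (dx, dy, err)
    return best
```

Even for this toy search window, every block requires dozens of block comparisons, which is why the patent characterizes the ME module as the most computationally intensive part of the system.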
- the present invention is a system and method for client-assisted motion estimation for client-server communication.
- the system and method of the present invention includes the steps of receiving a request to update a current set of pixels within a video display on a client computer, and generating a motion vector for moving the current set of pixels within the video display using video information obtained only from the request and the client computer.
- the present invention operates as well on objects, blocks and frames of video information.
- the receiving step may also include the step of receiving a motion request to translate subsets of the current set of pixels by predetermined amounts.
- the generating step may also include the step of generating a corresponding set of motion vectors for moving the subsets within the video display by the predetermined amounts.
- the receiving step may also include the step of receiving a motion request to keep static a second subset of the current set of pixels.
- the generating step may also include the step of instructing the client computer to leave the second subset undisturbed within the video display.
- the present invention may further include the steps of identifying gaps in the video display after moving the subsets by the predetermined amounts, and retrieving new video information from a server computer containing new video pixels corresponding to the gaps in the video display, after which the gaps in the video display are filled with the new video pixels.
- the present invention provides for inactivating resources for computing motion estimation using non-client video information obtained from other than the request and the client computer.
- the present invention provides for performing quality evaluation on the motion vectors created from the request and client video information.
- the present invention permits an exchange of side information between the client and server so that responsibility for executing the receiving and generating steps of the present invention may be delegated between the client and server as resources permit.
- the system and method of the present invention are particularly advantageous over the prior art because new material requested by a client computer over a network from a server is often related to old material already at the client, and the present invention enables the server to use the client request itself to direct various compression, decompression, and communication operations.
- the present invention uses these client requests to selectively retrieve and generate motion vectors at the server which take advantage of visual information already on the video display. Updates to client video displays can thus be transmitted over the network using significantly less bandwidth than current prior art communications systems.
- because motion vector calculations using conventional methods can make up to 90% of server computations, the present invention, by estimating the motion vectors from client requests, essentially bypasses these complex conventional motion vector calculations.
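The contrast with the brute-force search above can be made concrete. In this hypothetical sketch (the request format and field names are assumptions, since the patent leaves the format to each client application), a request that already states the displacement yields the motion vector directly, with no search at all:

```python
# Hypothetical: derive a motion vector straight from a client request.
# A global-motion request states how the frame moves, so the MV is the
# stated displacement itself -- no block search is needed.

def mv_from_request(request):
    """Return (dx, dy) for a request that states its own motion, else None."""
    if request["type"] == "global_motion":
        return (request["dx"], request["dy"])
    return None  # other request types are handled separately
```

For the patent's example of a frame that moves globally by (5,-7) pixels, the request itself carries the vector, replacing the per-block search with a constant-time lookup.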
- Figure 1 is a dataflow diagram of a conventional client-server communication system
- Figure 2 is a dataflow diagram of one embodiment of a system for client-assisted motion estimation for client-server communication
- Figure 3 is a flowchart of one embodiment of a method for responding to a static motion request within the system
- Figure 4 is a flowchart of one embodiment of a method for responding to a global motion request routine within the system
- Figure 5 is a flowchart of one embodiment of a method for responding to a partial motion request routine within the system
- Figure 6 is a flowchart of one embodiment of a method for performing motion vector quality checking within the system
- Figure 7 is a dataflow diagram of one embodiment of a second system for client-assisted motion estimation for client-server communication
- Figure 8 is a dataflow diagram of one embodiment of a third system for client-assisted motion estimation for client-server communication.
- FIG. 2 is a dataflow diagram of one embodiment of a system 200 for client-assisted motion estimation for client-server communication.
- a client 202 video display images a current frame, block, or set of pixels. Sets of pixels are also known as "objects." Objects have an arbitrary shape.
- frames, blocks, and objects describe a single image or portion thereof on a computer display. Frames are typically divided into a matrix of blocks, and each block or object typically contains a matrix of pixels.
- In response to a user command, the client 202 generates a client request.
- user commands include mouse movements, text selections, scrolling, panning, paging up/down, as well as many other computer commands known to those skilled in the art.
- the client 202 formats these user commands into a client request containing video information on how to map one or more portions of the current frame into a new frame based on the user commands.
- the client 202 sends the client request over line 204 to a server 206.
- the server 206 includes a motion estimation (ME) manager 208, which receives the client request.
- the ME manager 208 analyzes the request.
- client requests are in a format defined by each particular software application being executed by the client 202.
- the ME manager 208 then divides the client requests into several types, including: static/no-motion requests, global motion requests, partial motion requests, and new data requests. Depending upon the type of request, blocks are either standby-coded, intra-coded (a.k.a. "I" coded - without prediction), or inter-coded (a.k.a. "P" or "B" coded - with prediction) before being sent back to the client 202.
- standby-coded blocks are generated in response to static motion requests; I-coded blocks are generated in response to new block requests; and motion vectors (MVs) and P or B-coded blocks are generated in response to global/partial motion requests.
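The request-type dispatch just described might be sketched as follows. The request field and the returned action labels are hypothetical; the patent specifies only the four request categories and their coding outcomes.

```python
# Hypothetical dispatch table for the ME manager 208: map each of the
# four request types named in the text to its coding strategy.

def classify_request(request):
    """Map a client request to the coding action described in the text."""
    kind = request.get("type")
    if kind == "static":
        return "standby-code"           # reuse last decoded frame
    if kind == "global_motion":
        return "mv+inter-code"          # single MV, P/B blocks
    if kind == "partial_motion":
        return "standby+mv+inter-code"  # mixed static and moving sets
    return "intra-code"                 # new data: I-blocks, no prediction
```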
- a retrieve material module 210 retrieves new blocks of frame video information as required by the ME manager 208 during I-block generation. This new frame video information is not available at the client and is defined herein as non-client video information.
- a ME module 212 performs conventional block-by-block motion estimation between current blocks and new blocks as required by the ME manager 208 during P/B-block generation.
- an encode module 214 within the server 206 receives various standby-codes, I-blocks, and MVs from the ME manager 208.
- the encode module 214 then converts the MVs into prediction errors and assembles a new frame packet into a compressed bit-stream for transmission to the client 202.
- the encode module 214 is preferably standard-compliant, although it need not be.
- a decode module 218 in the client 202 receives and converts the new frame packet into the new frame which then replaces the current frame on the client 202 video display.
- the decoder is also preferably standard-compliant, although it need not be. Operation of the ME manager 208 and each of the modules 210-214, and 218 is discussed in more detail below.
- FIG. 3 is a flowchart of one embodiment of a method 300 for responding to a static motion request within the system 200.
- the ME manager 208 within the server 206 receives a static motion request.
- the ME manager 208 assigns a standby-code to the current frame.
- the standby-code is a simple "use last decoded frame" command. In such a completely static situation, video quality at the client 202 is already very high or nearly perfect.
- the ME manager 208 sends the standby-code to the encode module 214.
- the encode module 214 receives the standby-codes and assembles a new frame packet containing the standby-codes.
- the encode module 214 next transmits the packet over line 216 to the client 202.
- the ME manager 208 also pauses and/or turns off the retrieve material module 210, the ME module 212, and any other server 206 resources which normally perform conventional motion vector computations and compression operations.
- the present invention recognizes when video information is static at the client 202 and in response, conserves limited server computing resources and limited network bandwidth by not compressing and transmitting what amounts to only a static image.
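The static-request handling of steps 304-308 can be sketched as below. The packet shape and resource names are hypothetical illustrations of the "standby-code plus pause" behavior the text describes.

```python
# Hypothetical sketch of static-request handling: emit a standby-code
# ("use last decoded frame") packet and pause the server resources that
# would otherwise perform motion estimation and compression.

def handle_static_request(server_resources):
    """Return a standby packet and mark ME-related resources as paused."""
    packet = {"standby": True}  # client reuses its last decoded frame
    for name in ("retrieve_material", "motion_estimation"):
        server_resources[name] = "paused"
    return packet
```

Because nothing but the tiny standby packet crosses the network, both server computation and bandwidth are conserved, as the text notes.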
- FIG. 4 is a flowchart of one embodiment of a method 400 for responding to a global motion request within the system 200.
- the ME manager 208 receives a request that the current frame being displayed on the client 202 be globally translated/moved.
- the ME manager 208 aligns an origin of a coordinate system of the current frame with a coordinate system of the client video display. The origin usually corresponds to an upper left corner of the client 202 video display, where positive horizontal is right and positive vertical is down; however, if the origin is different, then an appropriate offset and/or sign switch must be effected.
- the ME manager 208 determines which blocks within the new frame correspond to translated blocks of the current frame (translated-blocks).
- the ME manager 208 converts the client motion request into pixel units, if not already in pixel units.
- If the video information in the client request contains a specific number of pixels by which a set of pixels is to be translated, no conversion is required.
- a client request to pan through a displayed picture contains both a panning command as well as an explicit number of pixels by which the displayed picture is to be panned.
- If video information within the client request specifies translations as a fraction or percentage of the current frame, then the ME manager 208 must convert the fraction/percentage into a specific number of pixels.
- For example, a client request to translate the current frame downward by 1/10 of the current frame height is converted into a 48 pixel downward movement, if the current frame is 640 x 480 pixels in size.
- Video information available from the client in addition to video information within the request, such as the size of the current frame on the client computer, is herein defined as client video information.
- If the video information specifies motion using a more complex mathematical expression, the ME manager 208 must evaluate the expression to determine a corresponding number of pixels to translate the current frame.
- the ME manager 208 can compute pixel motion from the complex mathematical expression in a number of ways, including: (1) using a motion at a center of a block, (2) using motion at various points in a block (e.g.
- the ME manager 208 in step 410, then generates a single motion vector (MV) corresponding to the global request. Motion vectors indicate a direction and a magnitude for translating blocks.
- the ME manager 208 assigns the single MV to each of the translated-blocks. For example, if the client requests that the new frame move globally by (5,-7) pixels with respect to the current frame, then the ME manager 208 generates a single motion vector of (5,-7) which is assigned to each of the translated-blocks.
- the ME manager 208 sends the single MV to the encode module 214.
- the ME manager 208, in step 414, identifies all blocks within the new frame which contain new video information and cannot be predicted from the current frame (new-blocks).
- the ME manager 208 retrieves the new-blocks from the retrieve material module 210.
- the ME manager 208 instructs the encode module 214 to intra-code the new-blocks, creating I-blocks.
- the ME manager 208, in step 418, also sends the new-blocks to the ME module 212 for conventional motion estimation.
- the ME manager 208 then performs a quality check to determine if the computed MVs provide good predictions. If the MVs provide good predictions for the new-blocks, the ME manager 208 instructs the encode module 214 to inter-code the new-blocks, creating P-blocks.
- the encode module 214, in step 420, converts all received MVs into corresponding prediction errors. Note, "prediction errors" in video compression, and techniques for generating them from motion vectors, are well known.
- the encode module 214 then assembles a new frame packet containing the prediction errors and I-blocks and transmits the packet over line 216 to the client 202.
- the decode module 218 receives the new frame packet and in response generates the new frame for display at the client 202.
- While responding to the global request, the ME manager 208 preferably inactivates any unused resources, such as the ME module 212, by pausing them, placing them into a standby mode, or turning them off. By inactivating the ME module 212, substantial server 206 resources are conserved.
- FIG. 5 is a flowchart of one embodiment of a method 500 for responding to a partial motion request within the system 200.
- Partial motion requests are client requests that one or more portions of the current frame be translated within the new frame. Partial motion requests are a blend of static motion requests and global motion requests in that some sets of blocks remain static while others translate.
- the ME manager 208 receives a request that the current frame being displayed on the client 202 be partially translated.
- In step 504, the ME manager 208 aligns the origin of the coordinate system of the video display with the coordinate system of the current frame according to step 404 of Figure 4.
- the ME manager 208 in step 506, groups together all sets of blocks which remain static in the new frame with respect to the current frame.
- the ME manager 208 then assigns a standby-code to each set of static blocks and sends them to the encode module 214 for transmission to the client 202 in accordance with step 306 of Figure 3.
- In step 508, the ME manager 208 groups together all sets of blocks which translate together in the new frame with respect to the current frame.
- the ME manager 208 then converts the client request into pixel units according to step 408 of Figure 4.
- the ME manager 208 then generates a single MV corresponding to each set of translated blocks. If there are sets of blocks in the current frame which overlap and/or undergo multiple movements, the ME manager 208 executes an "occupancy/overlap" algorithm which ranks and assigns an MV to each set of translated-blocks. Occupancy algorithms are discussed with reference to U.S. Patent Application Serial No. 09/561,315, entitled "Transcoding Method And Transcoder For Transcoding A Predictively-Coded Object-Based Picture Signal To A Predictively-Coded Block-Based Picture Signal," filed on April 28, 2000, by John Apostolopoulos et al., which is herein incorporated by reference.
- Where no such assignment is possible, the ME manager 208 abandons using the client request to compute a new MV and instead retrieves new-blocks from the retrieve material module 210 and sends them to the ME module 212 for conventional MV computation. After MV computation, the ME manager 208 sends the MVs to the encode module 214 for processing in accordance with step 420 of Figure 4. In step 512, the ME manager 208 identifies and processes into I-blocks all blocks which contain new video information.
- the ME manager 208 then sends the I-blocks to the encode module 214 for processing in accordance with step 420 of Figure 4.
- While responding to the partial motion request, the ME manager 208, in step 514, preferably pauses and/or turns off any unused resources, such as the ME module 212, except in the overlapping block context discussed above.
- In response to a new data request, the ME manager 208 processes the request and generates I-blocks for each block in the new frame in accordance with steps 416 and 418 of Figure 4.
- the ME manager 208 then sends the I- blocks to the encode module 214 for processing in accordance with step 420 of Figure 4.
- While responding to the new data request, the ME manager 208 preferably pauses and/or turns off any unused resources, such as the ME module 212.
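The partial-motion grouping of steps 506-508 might look like the following sketch. The mapping of blocks to requested displacements is a hypothetical input format; the point is that blocks sharing a displacement share one MV, and a (0, 0) group is the static set that gets standby-coded.

```python
# Hypothetical sketch of partial-motion grouping: collect blocks by their
# requested displacement so each group can be assigned a single MV.
# A group with motion (0, 0) is the static set (standby-coded).

def group_by_motion(block_motions):
    """block_motions: {(bx, by): (dx, dy)}. Return {motion: [blocks]}."""
    groups = {}
    for block, motion in sorted(block_motions.items()):
        groups.setdefault(motion, []).append(block)
    return groups
```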
- FIG. 6 is a flowchart of one embodiment of a method 600 for performing MV quality checking within the system 200. Quality checks can be performed at any time, such as at a beginning of a new client-server communication session.
- the ME manager 208 in step 602 receives a request from either the client 202 or the server 206 to perform a MV quality check.
- the ME manager 208 computes a first MV for a block in accordance with either the global or partial motion request routines described with respect to Figures 4 and 5 above.
- the ME manager 208 then commands the ME module 212, in step 606, to compute a second MV for the block, using conventional MV computation algorithms.
- the ME manager 208 evaluates the quality of the first MV by comparing a first prediction error generated from the first MV either to a predetermined threshold or to a second prediction error generated from the second MV, and/or by evaluating the first MV using some other technique.
- In step 610, if the quality of the first prediction error meets a predetermined threshold, the ME manager 208 sends the first MV to the encode module 214 for processing as discussed above. If, however, the quality is below the predetermined threshold, the ME manager 208, in step 612, selects from a number of possible alternatives, such as refining the first MV (i.e.
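The threshold test of steps 608-612 can be sketched as below. The mean-absolute prediction error and the threshold value are assumptions standing in for whichever error metric the encoder actually uses, which the text does not fix.

```python
# Hypothetical sketch of the MV quality check: accept the request-derived
# MV if its prediction error is small enough, otherwise fall back to the
# conventionally computed MV.

def prediction_error(pred_block, actual_block):
    """Mean absolute difference between predicted and actual pixels."""
    n = sum(len(row) for row in actual_block)
    return sum(abs(p - a) for pr, ar in zip(pred_block, actual_block)
               for p, a in zip(pr, ar)) / n

def choose_mv(first_mv, fallback_mv, pred_block, actual_block, threshold=8.0):
    """Keep the client-request MV when its error meets the threshold."""
    if prediction_error(pred_block, actual_block) <= threshold:
        return first_mv
    return fallback_mv
```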
- FIG. 7 is a dataflow diagram of one embodiment of a second system 700 for client-assisted motion estimation for client-server video communication.
- the client 202 includes a client ME module 702 which receives a standard client request on line 704 and transmits a shortened client request and side information to the ME manager 208 over line 204.
- the client ME module 702 also passes information over line 706 to a bit-stream sequencing module 708.
- the bit-stream sequencing module 708 also receives encoded data and side information from the server 206 over line 216. Operation of these modules 702 and 708 is discussed in detail below.
- Figure 8 is a flowchart of one embodiment of a method 800 for performing client-assisted motion estimation within the second system 700.
- the client ME module 702 receives the standard client request on line 704 in response to a user command on the client 202.
- the client ME module 702 then assumes responsibility for several of the ME manager's 208 functions. Specifically, in step 804, the client ME module 702 compares the client requested new frame with the current frame and identifies sets of static blocks, translated-blocks, and new blocks.
- the client ME module 702 groups together the static blocks and standby-codes them according to the steps discussed with respect to Figure 3.
- the client ME module 702, in step 808, groups together the sets of translated-blocks and computes an MV for each set, according to the steps discussed with respect to Figures 4 and 5.
- the client ME module 702 in step 810, groups together and sets aside the required new blocks.
- the client ME module 702 generates a shortened client request from the standard client request.
- the shortened client request includes a request for the specific set of new blocks and preferably does not include any request for static or translated blocks.
- the client ME module 702 then transmits the shortened request over line 204 to the ME manager 208.
- the ME manager 208 in step 814, retrieves the new blocks from the retrieve material module 210 as specified in the shortened client request, and instructs the encode module 214 to I-code and transmit the new blocks to the sequencing module 708 in the client 202.
- the sequencing module 708 receives the standby-codes and MVs from the client ME module 702 and the I-coded new blocks from the server 206 and sequences them into a standard-compliant bit-stream for the decode module 218 to process into the new frame for display on the client 202 computer.
- the client ME module 702 significantly reduces computational demands on the server 206 and enables a significantly more compact non-standard compliant bit-stream to be sent over line 216 to the client 202. This also permits the decoder 218 to be a standard-compliant decoder.
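The client-side split of steps 804-812 might be sketched as follows. The block-to-motion mapping is the same hypothetical input format as above; `None` marks a brand-new block the client cannot predict locally.

```python
# Hypothetical sketch of the client ME module 702: keep locally computed
# standby-codes and MVs as side information, and send the server only a
# shortened request listing the blocks that need new data (I-coding).

def shorten_request(block_motions):
    """Split a standard request into (shortened request, local side info).
    block_motions maps block -> (dx, dy), or None for brand-new blocks."""
    local = {b: m for b, m in block_motions.items() if m is not None}
    needed = sorted(b for b, m in block_motions.items() if m is None)
    return {"new_blocks": needed}, local
```

Only `new_blocks` crosses the network to the server; the `local` motion information goes straight to the sequencing module 708, which is how the scheme reduces both server computation and line-216 bandwidth.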
- the client 202 and server 206 can exchange "side-information" which can enable the server 206 to perform dynamic load-balancing within the system 700.
- “Side-information” is herein generically defined to include any information which describes what information either the client 202 and/or server 206 is processing and how that information is being processed.
- network resources can be managed in any number of ways. For instance, depending on the number of different client computers to which the server 206 is connected, the server 206, through the use of side information, can query the different clients for their current processing capacity and shift some motion estimation management responsibilities to those clients, in order to keep the server's 206 limited computational capacity or the network's limited bandwidth from slowing down the network.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2002558754A JP2004537079A (en) | 2001-01-16 | 2002-01-03 | Client-assisted motion estimation for client-server video communication |
EP02712599A EP1352527A2 (en) | 2001-01-16 | 2002-01-03 | Client-assisted motion estimation for client-server video communication system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/760,693 | 2001-01-16 | ||
US09/760,693 US6678329B2 (en) | 2001-01-16 | 2001-01-16 | Client-assisted motion estimation for client-server video communication |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2002058400A2 true WO2002058400A2 (en) | 2002-07-25 |
WO2002058400A3 WO2002058400A3 (en) | 2003-02-27 |
Family
ID=25059888
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2002/000029 WO2002058400A2 (en) | 2001-01-16 | 2002-01-03 | Client-assisted motion estimation for client-server video communication system |
Country Status (4)
Country | Link |
---|---|
US (1) | US6678329B2 (en) |
EP (1) | EP1352527A2 (en) |
JP (1) | JP2004537079A (en) |
WO (1) | WO2002058400A2 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009102011A1 (en) * | 2008-02-14 | 2009-08-20 | Nec Corporation | Update region detection device |
JP2009252153A (en) * | 2008-04-10 | 2009-10-29 | Sony Corp | Information processing device and information processing method, and computer program |
US10142651B1 (en) * | 2014-12-11 | 2018-11-27 | Pixelworks, Inc. | Frame rate conversion with partial motion vector |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5991443A (en) * | 1995-09-29 | 1999-11-23 | U.S.Philips Corporation | Graphics image manipulation |
WO1999065243A1 (en) * | 1998-06-09 | 1999-12-16 | Worldgate Service, Inc. | Mpeg encoding technique for encoding web pages |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5321750A (en) * | 1989-02-07 | 1994-06-14 | Market Data Corporation | Restricted information distribution system apparatus and methods |
US5903892A (en) * | 1996-05-24 | 1999-05-11 | Magnifi, Inc. | Indexing of media content on a network |
US6349297B1 (en) * | 1997-01-10 | 2002-02-19 | Venson M. Shaw | Information processing system for directing information request from a particular user/application, and searching/forwarding/retrieving information from unknown and large number of information resources |
US6292512B1 (en) * | 1998-07-06 | 2001-09-18 | U.S. Philips Corporation | Scalable video coding system |
US6275531B1 (en) * | 1998-07-23 | 2001-08-14 | Optivision, Inc. | Scalable video coding method and apparatus |
US6476873B1 (en) * | 1998-10-23 | 2002-11-05 | Vtel Corporation | Enhancement of a selectable region of video |
- 2001-01-16 US US09/760,693 patent/US6678329B2/en not_active Expired - Lifetime
- 2002-01-03 EP EP02712599A patent/EP1352527A2/en not_active Withdrawn
- 2002-01-03 WO PCT/US2002/000029 patent/WO2002058400A2/en not_active Application Discontinuation
- 2002-01-03 JP JP2002558754A patent/JP2004537079A/en active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5991443A (en) * | 1995-09-29 | 1999-11-23 | U.S.Philips Corporation | Graphics image manipulation |
WO1999065243A1 (en) * | 1998-06-09 | 1999-12-16 | Worldgate Service, Inc. | Mpeg encoding technique for encoding web pages |
Also Published As
Publication number | Publication date |
---|---|
WO2002058400A3 (en) | 2003-02-27 |
JP2004537079A (en) | 2004-12-09 |
EP1352527A2 (en) | 2003-10-15 |
US20020094029A1 (en) | 2002-07-18 |
US6678329B2 (en) | 2004-01-13 |
Similar Documents
Publication | Title |
---|---|
JP4900976B2 (en) | Method for switching compression level in an image streaming system, and system, server, and computer program |
US7336841B2 (en) | Fingerprinting digital video for rights management in networks |
US6067322A (en) | Half pixel motion estimation in motion video signal encoding |
US8254685B2 (en) | Detecting content change in a streaming image system |
US7830961B2 (en) | Motion estimation and inter-mode prediction |
CN101189882B (en) | Method and apparatus for encoder assisted-frame rate up conversion (EA-FRUC) for video compression |
CN1901676B (en) | Streaming image system and method |
KR20050007607A (en) | Spatial prediction based intra coding |
CN110248192B (en) | Encoder switching method, decoder switching method, screen sharing method and screen sharing system |
KR20080110454A (en) | Bi-prediction coding method and apparatus, bi-prediction decoding method and apparatus, and recording midium |
US8542735B2 (en) | Method and device for coding a scalable video stream, a data stream, and an associated decoding method and device |
CN109922340B (en) | Image coding and decoding method, device, system and storage medium |
JP2005520417A (en) | Method and apparatus for performing smooth transitions between FGS coding configurations |
US6678329B2 (en) | Client-assisted motion estimation for client-server video communication |
KR100978465B1 (en) | Bi-prediction coding method and apparatus, bi-prediction decoding method and apparatus, and recording midium |
KR100488421B1 (en) | Lossy coding method of binary image |
CN117676266A (en) | Video stream processing method and device, storage medium and electronic equipment |
CN114640844A (en) | Reference block searching method and device in live video coding and computer equipment |
KR100397133B1 (en) | Method and System for compressing/transmiting of a picture data |
KR100207388B1 (en) | Image encoder using adaptive vector quantization |
JPH08116511A (en) | Video signal recording and reproducing device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AK | Designated states | Kind code of ref document: A2; Designated state(s): JP |
| AL | Designated countries for regional patents | Kind code of ref document: A2; Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | |
| AK | Designated states | Kind code of ref document: A3; Designated state(s): JP |
| AL | Designated countries for regional patents | Kind code of ref document: A3; Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR |
| WWE | Wipo information: entry into national phase | Ref document number: 2002712599; Country of ref document: EP |
| WWE | Wipo information: entry into national phase | Ref document number: 2002558754; Country of ref document: JP |
| WWP | Wipo information: published in national office | Ref document number: 2002712599; Country of ref document: EP |
| WWW | Wipo information: withdrawn in national office | Ref document number: 2002712599; Country of ref document: EP |