US6462785B1 - Motion display technique - Google Patents


Info

Publication number
US6462785B1
US6462785B1 (application US08/869,056; US86905697A)
Authority
US
United States
Prior art keywords
video signal
frame rate
surrogate
video
motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US08/869,056
Inventor
Gianpaolo U. Carraro
John T. Edmark
James Robert Ensor
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia of America Corp
Original Assignee
Lucent Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lucent Technologies Inc filed Critical Lucent Technologies Inc
Priority to US08/869,056 priority Critical patent/US6462785B1/en
Assigned to LUCENT TECHNOLOGIES INC. reassignment LUCENT TECHNOLOGIES INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CARRARO, GIANPAOLO U., EDMARK, JOHN T., ENSOR, JAMES ROBERT
Application granted granted Critical
Publication of US6462785B1 publication Critical patent/US6462785B1/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/80 Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation
    • H04N19/85 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
    • H04N19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/136 Incoming video signal characteristics or properties
    • H04N19/137 Motion inside a coding unit, e.g. average field, frame or block difference


Abstract

When there is an inability to convey motion that is occurring in the frames of a video source by a full-frame-rate transmission, a surrogate effect, other than another form of motion, may be used as a compensation technique to better convey motion to a viewer. The surrogate effect employed may be a) fading, b) wiping, c) dissolving, d) blurring, e) enhancing the contrast, f) enhancing one or more colors, g) enhancing the brightness, h) scaling the image, and i) the like. How the surrogate effect is applied to any frame may be a function of one or more video frames. Optionally, more than one effect may be used in combination. Advantageously, a more continuous sense of motion is perceived by a viewer. In one embodiment of the invention, full-frame-rate video is initially available. A controller monitors the ability to transmit or display full-frame-rate video. In the event that it is determined that, ultimately, full-frame-rate video cannot be displayed to a user, the images of the frames that will be displayed are adjusted to incorporate a surrogate effect that helps to better convey the intended motion to the viewer. The surrogate effect applied may be selected as a joint function of the available frame rate and knowledge of the content, e.g., the specific type of motion, intended to be shown to the viewer.

Description

TECHNICAL FIELD
This invention relates to the displaying of video with motion where the frame rate of the displayed video is inadequate to fluidly convey that motion.
BACKGROUND OF THE INVENTION
A well known problem in the art of video communication is the inability to fluidly display motion in a video when the frame rate at which the video is displayed must be either less than or greater than the frame rate of the source video. Conventional solutions to this problem are either to display the video at a frame rate that is mismatched to the source frame rate or to attempt to rationalize the frame rates. Such rationalization may be done either a) by dropping frames, e.g., during transmission or prior to display, when the display rate is less than the source rate, or b) by padding frames, with or without interpolation, when the display rate is greater than the source rate. Such solutions are inadequate because of the resulting poor quality, e.g., jumpiness and lack of smooth motion.
Another problem in the art is an inability to transmit frames from a video source to a video display without loss of frames during transmission. Conventional solutions to this problem are to either 1) buffer frames so that there is a stored reserve of frames that can be shown during times when no frames are received or 2) display the last received frame until the next one is received. Again, such solutions are inadequate because of the resulting poor quality, e.g., jumpiness and lack of smooth motion, as well as the fact that the cost of the memory to implement the buffer is not negligible. Also, the buffer length introduces a transmission delay, which can be disconcerting in interactive video applications such as video conferences or gaming.
SUMMARY OF THE INVENTION
We have recognized that, in accordance with the principles of the invention, when there is an inability to convey motion that is occurring in the frames of a video source by a full-frame-rate transmission, a surrogate effect, other than another form of motion, may be used as a compensation technique to better convey motion to a viewer. The surrogate effect employed may be a) fading, b) wiping, c) dissolving, d) blurring, e) enhancing the contrast, f) enhancing one or more colors, g) enhancing the brightness, h) scaling the image, and i) the like. How the surrogate effect is applied to any frame may be a function of one or more video frames. Optionally, more than one effect may be used in combination. Advantageously, a more continuous sense of motion is perceived by a viewer.
In one embodiment of the invention, full-frame-rate video is initially available. A controller monitors the ability to transmit or display full-frame-rate video. In the event that it is determined that, ultimately, full-frame-rate video cannot be displayed to a user, the images of the frames that will be displayed are adjusted to incorporate a surrogate effect that helps to better convey the intended motion to the viewer. In accordance with an aspect of the invention, the surrogate effect applied may be selected as a joint function of the available frame rate and knowledge of the content, e.g., the specific type of motion, intended to be shown to the viewer.
In addition, the same surrogate effect techniques may be employed even when full-frame-rate capability is available, if it is desired to give motion occurring at a first speed within the video the appearance to a viewer of occurring at a second speed. This may be necessary when the desired frame rate is greater than the full frame rate, which may be required for interactive applications, such as simulations or gaming. The controller determines the first speed of the motion and the desired second speed and applies the surrogate effect as appropriate.
BRIEF DESCRIPTION OF THE DRAWING
In the drawing:
FIG. 1 shows an exemplary video communication system in accordance with the invention;
FIG. 2 shows a flow chart for a process by which a video client processes frames of video received in packet format from a network, in accordance with the principles of the invention; and
FIG. 3 shows a flow chart for a more specific example of the invention.
DETAILED DESCRIPTION
FIG. 1 shows an exemplary video communication system 100 arranged in accordance with the invention. Video communication system 100 includes video server 101, network 103 and video client 105. Video server 101 transmits selected video at a fixed source frame rate to video client 105 via network 103. Video server 101 may obtain the video information from video camera 107, magnetic disk 109, video tape 111, or optical disk 113. Typically, video server 101 formats the video information of the frames into packets and transmits the packets to network 103.
Network 103 attempts to transmit all of the packets to video client 105. However, network 103 is not always entirely successful. This lack of success is typically due to congestion which results in packets being dropped within network 103. The packets that successfully traverse network 103 are supplied to video client 105.
Video client 105 receives the packets from network 103 and reconstructs frames of the original video signal therefrom. For purposes of this invention, it is assumed that any errors within a frame, e.g., as a result of packets dropped or damaged during transmission, are either compensated for by other techniques beyond the scope of this disclosure or they cause the entire frame containing such errors to be dropped. The reconstructed frames are converted into a signal suitable for display on video monitor 115. Video monitor 115 may be a computer monitor, a television, or the like.
FIG. 2 shows a flow chart for a process by which video client 105 processes frames of video received in packet format from network 103, in accordance with the principles of the invention. The process is entered in step 201 when video information is initially received from network 103. In step 203, the source frame rate is determined. The source frame rate is the frame rate at which the maker of the video intended to have the video shown and achieve full-motion. Typically, this is the rate at which the video was filmed, most often 30 frames per second in the United States. In one embodiment of the invention, header information in at least one of the packets of the first frame specifies what the source frame rate is.
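The header-based rate signaling of step 203 could be sketched as follows. The patent does not specify a packet format, so the 4-byte layout, field names, and `parse_source_frame_rate` function below are illustrative assumptions only.

```python
import struct

# Hypothetical header layout (an assumption; the patent only says that
# "header information in at least one of the packets of the first frame
# specifies what the source frame rate is"): frame number (uint16),
# source frame rate in fps (uint8), flags (uint8), big-endian.
HEADER_FMT = ">HBB"

def parse_source_frame_rate(packet: bytes) -> int:
    """Extract the source frame rate (fps) from a packet header."""
    _frame_no, source_fps, _flags = struct.unpack_from(HEADER_FMT, packet)
    return source_fps

# A packet announcing frame 0 of a 30 fps source:
pkt = struct.pack(HEADER_FMT, 0, 30, 0) + b"payload"
print(parse_source_frame_rate(pkt))  # 30
```

Any real system would, of course, define its own header; the point is only that the source rate is carried in-band with the first frame's packets.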
In step 205 the desired display frame rate is determined. The desired frame rate may be fixed, e.g., for static viewing of the video. Alternatively, the desired frame rate may change dynamically, e.g., in interactive applications: where a user is simulating a bicycle ride, the video is the terrain the bicyclist sees, and the frames must be shown at a rate that is a function of the speed at which the bicycle is moving.
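For the dynamic case, the desired rate can be modeled as the source rate scaled by the ratio of the viewer's simulated speed to the speed at which the camera moved during filming. The function below is a minimal sketch of that relationship; the function name and parameters are assumptions for illustration.

```python
def desired_display_rate(rider_speed: float, camera_speed: float,
                         source_fps: float = 30.0) -> float:
    """Scale the display rate by the ratio of the viewer's simulated
    speed to the speed the camera moved while the footage was shot.

    If the bicyclist rides twice as fast as the camera did, frames
    must be shown twice as fast for the terrain to move naturally.
    """
    return source_fps * (rider_speed / camera_speed)

# Camera filmed at 15 km/h, 30 fps; rider pedals at 20 km/h, so the
# desired rate (40 fps) exceeds the source's full frame rate:
print(desired_display_rate(20.0, 15.0))  # 40.0
```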
Conditional branch point 207 tests to determine if the source frame rate is equal to the desired display frame rate. The source frame rate may be less than the desired display frame rate for interactive applications, such as the foregoing bicycle simulation, when the bicyclist's speed is greater than the speed at which the camera used to shoot the video footage of the terrain was moving when the video footage was shot. Alternatively, there may simply be a mismatch between the capabilities of the video source and video client 105. For example, video camera 107 may be a low-end, less-than-full-motion camera operating at 10 frames per second (fps) while video client 105 is a high-end, full-motion user workstation capable of displaying 30 fps. Similarly, the source frame rate may be greater than the desired display frame rate for interactive applications, such as the foregoing bicycle simulation, when the bicyclist's speed is less than the speed at which the camera used to shoot the video footage of the terrain was moving when the video footage was shot. Likewise, video camera 107 may be a high-end, full-motion camera operating at 30 fps while video client 105 is a low-end, less-than-full-motion user workstation that is only capable of 10 fps.
If the test result in step 207 is YES, control passes to conditional branch point 209, which tests to determine if the received frame rate equals the desired display frame rate. This test is implemented by checking if a new frame of video has been received at the desired display frame rate and is ready for display. If the test result in step 209 is YES, this means that the source frame rate, the received frame rate, and the desired frame rate all match. In other words, all the frames that need to be displayed are being received and at the rate at which they should be displayed. Therefore, the video can be directly displayed without further processing. Accordingly, control passes to step 211 and the frame is displayed. Control then passes back to step 205 and the process continues as described above.
If the test result in either step 207 or 209 is NO, this indicates that frames are not being received at the rate at which they should be displayed. Therefore, processing of the video is necessary to better convey motion contained within the video to a viewer. Accordingly, control passes to optional step 213, in which a surrogate effect, other than another form of motion, is selected as a compensation technique to better convey motion to a viewer. The surrogate effect employed may be selected from a) fading, b) wiping, c) dissolving, d) blurring, e) enhancing the contrast, f) enhancing one or more colors in the frame, g) enhancing the brightness, h) scaling the image, and i) the like. Optionally, more than one surrogate effect may be applied in combination.
Step 213 is optional because the processing technique to be employed may be preset for the entire video. Otherwise, the selection of a surrogate effect may be made in step 213 for a portion of the video, and the portion may be as small as a single frame. The particular technique may be selected as a function of the content of the frame. This may be implemented, for example, by analysis of the content of the frame, such as by video client 105. Alternatively, the surrogate effect to be employed may be specified as part of the video source and transmitted along with the frames in one or more of the packet headers.
In step 215 the selected surrogate effect is applied to the video. In applying the surrogate effect to any frame, information from one or more frames may be employed. For example, such frames may be stored in memory. Control then passes to step 211 and the process continues as described above.
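One pass through the FIG. 2 decision logic (steps 207 through 215) might be sketched as below. The function and parameter names are assumptions, and `apply_surrogate` stands in for whatever effect step 213 selected.

```python
def show_next_frame(source_fps, desired_fps, new_frame, last_frame,
                    apply_surrogate):
    """One display slot of the FIG. 2 loop, sketched.

    new_frame is the frame that arrived in time for this slot, or
    None if none did; last_frame is the frame most recently shown.
    Returns the frame to display in step 211.
    """
    rates_match = source_fps == desired_fps        # step 207
    frame_ready = new_frame is not None            # step 209
    if rates_match and frame_ready:
        return new_frame                           # display unmodified
    # Steps 213/215: rather than repeating the last frame unchanged,
    # compensate with a surrogate effect such as a fade or blur.
    return apply_surrogate(new_frame if frame_ready else last_frame)

dim = lambda frame: [p * 0.85 for p in frame]      # a fading surrogate
print(show_next_frame(30, 30, [100.0], None, dim))  # [100.0]  no effect
print(show_next_frame(30, 30, None, [100.0], dim))  # [85.0]   faded repeat
```

Frames here are lists of brightness values standing in for real frame buffers; a production client would operate on decoded pixel data.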
The process is completed when no further video is indicated to be received, e.g., by receiving an end-of-file indication or by the elapsing of a predetermined period of time without receipt of a frame.
It will be appreciated that video server 101 could actually be transmitting the source video at varying frame rates, e.g., in response to detecting various conditions at network 103 or video client 105. Additionally, video server 101 and network 103 could cooperate with video client 105, either individually or in combination, to implement the invention.
FIG. 3 shows a flow chart for a more specific example of the invention. The example of FIG. 3 addresses the processing that must be performed when it is time to display a frame of video, and it uses fading as the surrogate effect when one needs to be applied. In particular, each time a frame of video must be displayed but cannot be displayed at the source video frame rate, so that a surrogate effect is needed, the previously displayed frame is reduced in brightness by 15%.
The process shown in FIG. 3 is entered, in step 301, when it is determined that a new frame of video must be prepared for display, e.g., just before a new frame must begin being displayed. In step 303, the most recently deposited frame of video is retrieved from storage. Next, conditional branch point 305 tests to determine if the source frame rate is equal to the desired display frame rate. If the test result in step 305 is YES, control passes to conditional branch point 307, which tests to determine if the received frame rate equals the desired display frame rate. If the test result in step 307 is YES, this means that the source frame rate, the received frame rate, and the desired frame rate all match. In other words, all the frames that need to be displayed are being received and at the rate at which they should be displayed. Thus the frame which was retrieved is the most recently received frame, and it can be directly displayed without further processing. Accordingly, control passes to step 321 and the frame is displayed. The process then exits in step 323.
If the test result in either step 305 or 307 is NO, this indicates that processing of the video is necessary to better convey motion contained within the video to a viewer. Accordingly, control passes to step 309, in which the process of applying the fading surrogate effect is begun. More specifically, conditional branch point 309 tests to determine if the frame retrieved in step 303 is a new frame, i.e., a frame that has not yet had any surrogate effect processing performed on it. If the test result in step 309 is YES, control passes to step 311, in which a counter variable, COUNT, is set to zero. If the test result in step 309 is NO, or after execution of step 311, control passes to conditional branch point 313, which tests to determine if COUNT is equal to five. This test is used to limit the fading of the frame so that it does not fade to the extent that it becomes practically invisible. If the test result in step 313 is YES, indicating that the frame has been faded by the maximum allowable amount, control passes to step 321 to display the frame. The process then exits in step 323.
If the test result in step 313 is NO, indicating that the frame has not been faded by the maximum allowable amount, control passes to step 315, and the brightness of the frame is reduced by 15%. The brightness-reduced frame is then stored back in the frame buffer in step 317, and the value of COUNT is incremented in step 319. Thereafter, the frame stored in the frame buffer is displayed in step 321 and the process is exited in step 323.
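The fade-control loop of steps 305 through 323 can be sketched as follows. This is an illustrative model only: the function name `prepare_frame`, the constants `MAX_FADE_STEPS` and `FADE_FACTOR`, and the reduction of a stored frame to a single scalar brightness value are assumptions made for clarity; the patent itself operates on full video frames held in a frame buffer, but does specify the five-step cap and the 15% per-step brightness reduction.

```python
# Illustrative sketch of the fading surrogate effect of FIG. 3.
# A frame is modeled as a single brightness value in [0.0, 1.0];
# MAX_FADE_STEPS and FADE_FACTOR are names chosen here, not in the patent.

MAX_FADE_STEPS = 5   # step 313: stop fading after five reductions
FADE_FACTOR = 0.85   # step 315: reduce brightness by 15% per step

def prepare_frame(brightness, rates_match, is_new_frame, count):
    """Return (brightness, count) for the frame to be displayed.

    brightness:   current brightness of the stored frame (steps 303/317).
    rates_match:  True when source, received, and display rates all agree
                  (YES results in both steps 305 and 307).
    is_new_frame: True if no surrogate effect has yet been applied (step 309).
    count:        fade steps already applied to this frame.
    """
    if rates_match:
        return brightness, count      # steps 305/307: display frame as-is
    if is_new_frame:
        count = 0                     # step 311: reset COUNT for a new frame
    if count == MAX_FADE_STEPS:
        return brightness, count      # step 313: maximally faded, display
    brightness *= FADE_FACTOR         # step 315: reduce brightness by 15%
    count += 1                        # step 319: increment COUNT
    return brightness, count          # steps 317/321: store and display
```

Calling `prepare_frame` repeatedly on a stalled frame fades it from 1.0 toward 0.85^5 (about 0.44) and then holds, so the frame dims noticeably while remaining visible, which is the behavior the five-step cap is designed to ensure.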
The foregoing merely illustrates the principles of the invention. It will thus be appreciated that those skilled in the art will be able to devise various arrangements which, although not explicitly described or shown herein, embody the principles of the invention and are included within its spirit and scope.

Claims (29)

What is claimed is:
1. A method for use in processing frames of a video signal in which motion occurs, said video signal having a source frame rate, comprising the steps of:
determining that the video signal is to be displayed at a second frame rate different from the source frame rate; and
applying a surrogate effect, other than another form of motion, to a portion of the video signal when the video signal is displayed at the second frame rate whereby the motion is better conveyed to a viewer.
2. The invention as defined in claim 1 wherein said portion of the video signal is a frame of the video signal.
3. The invention as defined in claim 1 wherein said portion of the video signal is a plurality of frames of the video signal.
4. The invention as defined in claim 1 wherein said surrogate effect is one of a group consisting of: fading, wiping, dissolving, blurring, contrast enhancing, enhancing one or more colors, brightness enhancing, scaling.
5. The invention as defined in claim 1 wherein said surrogate effect is a combination of at least two effects.
6. The invention as defined in claim 1 wherein said surrogate effect is a combination of at least two effects from the group consisting of: fading, wiping, dissolving, blurring, contrast enhancing, enhancing one or more colors, brightness enhancing, scaling.
7. The invention as defined in claim 1 wherein, in said applying step, said surrogate effect is applied to a frame of said video signal as a function of information in one frame of said video signal.
8. The invention as defined in claim 1 wherein, in said applying step, said surrogate effect is applied to a frame of said video signal as a function of information in at least one frame of said video signal.
9. The invention as defined in claim 1 further including the step of determining said surrogate effect for application in said applying step.
10. The invention as defined in claim 1 further including the step of determining said surrogate effect for application in said applying step as a function of a plurality of surrogate effects, using as a basis for determining said surrogate effect information from at least one frame of said video signal.
11. The invention as defined in claim 1 wherein said second frame rate is greater than said source frame rate.
12. The invention as defined in claim 1 wherein said second frame rate is less than said source frame rate.
13. Apparatus for use in video signal processing, comprising:
means for receiving a video signal having a source frame rate; and
a processor for determining that the video signal is to be displayed at a second frame rate different from the source frame rate and for applying a surrogate effect, other than another form of motion, to a portion of the video signal when the video signal is displayed at the second frame rate whereby the motion is better conveyed to a viewer.
14. The invention as defined in claim 13 wherein said video signal is received from a network.
15. The invention as defined in claim 13 wherein said second frame rate is greater than said source frame rate.
16. The invention as defined in claim 13 wherein said second frame rate is less than said source frame rate.
17. The invention as defined in claim 13 wherein said second frame rate is less than said source frame rate because frames of said source video are not available at a time when they need to be displayed.
18. The invention as defined in claim 13 wherein said surrogate effect is selected from the group consisting of: fading, wiping, dissolving, blurring, contrast enhancing, enhancing one or more colors, brightness enhancing, scaling.
19. Apparatus for use with a video transmission system that includes a video server and a network, comprising:
a video signal receiver adapted to receive video signals transported by said network; and
a processor for determining if said video signal, as received by said video signal receiver, is to be displayed at a second frame rate different from the source frame rate as supplied by said video server and for applying a surrogate effect, other than another form of motion, to a portion of the video signal, as received by said video signal receiver, when the video signal is displayed at the second frame rate.
20. The invention as defined in claim 19 wherein said surrogate effect selected is selected as a function of a type of motion being conveyed.
21. The invention as defined in claim 20 wherein said surrogate effect is selected from the group consisting of: fading, wiping, dissolving, blurring, contrast enhancing, enhancing one or more colors, brightness enhancing, scaling.
22. The invention as defined in claim 19 wherein said video signal is arranged into packets.
23. The invention as defined in claim 22 wherein said surrogate effect selected is selected as a function of information encoded in at least one of said packets.
24. The invention as defined in claim 22 wherein said surrogate effect selected is selected as a function of information encoded in a packet that is transmitted along with said packets of said video signal.
25. A method for use in processing frames of a video signal in which motion occurs, said video signal having a source frame rate, comprising the steps of:
determining that the video signal is to be displayed at a second frame rate different from the source frame rate; and
applying a surrogate effect, other than another form of motion, to a portion of the video signal when the video signal is displayed at the second frame rate whereby the motion is better conveyed to a viewer;
wherein said surrogate effect is one of a group consisting of: fading, wiping, dissolving, blurring, scaling.
26. The invention as defined in claim 25 further including the step of determining said surrogate effect for application in said applying step.
27. The invention as defined in claim 25 further including the step of determining said surrogate effect for application in said applying step as a function of a plurality of surrogate effects, using as a basis for determining said surrogate effect information from at least one frame of said video signal.
28. The invention as defined in claim 25 further including the step of selecting said surrogate effect from an available plurality of surrogate effects.
29. The invention as defined in claim 25 wherein said surrogate effect selected is selected as a function of a type of motion being conveyed.
US08/869,056 1997-06-04 1997-06-04 Motion display technique Expired - Lifetime US6462785B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US08/869,056 US6462785B1 (en) 1997-06-04 1997-06-04 Motion display technique

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US08/869,056 US6462785B1 (en) 1997-06-04 1997-06-04 Motion display technique

Publications (1)

Publication Number Publication Date
US6462785B1 true US6462785B1 (en) 2002-10-08

Family

ID=25352849

Family Applications (1)

Application Number Title Priority Date Filing Date
US08/869,056 Expired - Lifetime US6462785B1 (en) 1997-06-04 1997-06-04 Motion display technique

Country Status (1)

Country Link
US (1) US6462785B1 (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070070221A1 (en) * 2004-08-11 2007-03-29 Sony Corporation Image processing apparatus and method, recording medium, and program
US20080219493A1 (en) * 2004-03-30 2008-09-11 Yoav Tadmor Image Processing System
US7453519B2 (en) * 2004-03-30 2008-11-18 Olympus Corporation Method and apparatus for converting images at a given frame or field rate to second and third frame or field rates while maintaining system synchronism
US20090060373A1 (en) * 2007-08-24 2009-03-05 General Electric Company Methods and computer readable medium for displaying a restored image
US20090094518A1 (en) * 2007-10-03 2009-04-09 Eastman Kodak Company Method for image animation using image value rules
US20100033626A1 (en) * 2008-08-05 2010-02-11 Samsung Electronics Co., Ltd. Image processing apparatus and control method thereof
WO2011041903A1 (en) * 2009-10-07 2011-04-14 Telewatch Inc. Video analytics with pre-processing at the source end
US20110109742A1 (en) * 2009-10-07 2011-05-12 Robert Laganiere Broker mediated video analytics method and system
US20120044359A1 (en) * 2010-08-17 2012-02-23 Voltz Christopher D Frame rate measurement
CN101472131B (en) * 2007-12-28 2012-07-04 希姆通信息技术(上海)有限公司 Visual telephone with movement perceptive function and method for enhancing image quality
US8780162B2 (en) 2010-08-04 2014-07-15 Iwatchlife Inc. Method and system for locating an individual
US8860771B2 (en) 2010-08-04 2014-10-14 Iwatchlife, Inc. Method and system for making video calls
US8885007B2 (en) 2010-08-04 2014-11-11 Iwatchlife, Inc. Method and system for initiating communication via a communication network
US9143739B2 (en) 2010-05-07 2015-09-22 Iwatchlife, Inc. Video analytics with burst-like transmission of video data
US9420250B2 (en) 2009-10-07 2016-08-16 Robert Laganiere Video analytics method and system
US9667919B2 (en) 2012-08-02 2017-05-30 Iwatchlife Inc. Method and system for anonymous video analytics processing

Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4591913A (en) * 1983-05-21 1986-05-27 Robert Bosch Gmbh Method and circuit apparatus for special-effect television picture transitions in which a picture comes on by growing in size or fades away by diminishing in size
US4698682A (en) * 1986-03-05 1987-10-06 Rca Corporation Video apparatus and method for producing the illusion of motion from a sequence of still images
US4780763A (en) * 1987-03-27 1988-10-25 The Grass Valley Group, Inc. Video special effects apparatus
US4951144A (en) * 1989-04-12 1990-08-21 The Grass Valley Group, Inc. Recursive video blur effect
US5008755A (en) * 1989-04-21 1991-04-16 Abekas Video Systems Ltd. Digital video special effect system
US5105313A (en) * 1987-09-26 1992-04-14 Quantel Limited Method and apparatus for producing slow motion television pictures
US5125041A (en) * 1985-08-05 1992-06-23 Canon Kabushiki Kaisha Still image processing method for blurring an image background and producing a visual flowing effect
US5191416A (en) * 1991-01-04 1993-03-02 The Post Group Inc. Video signal processing system
US5245432A (en) * 1989-07-31 1993-09-14 Imageware Research And Development Inc. Apparatus and method for transforming a digitized signal of an image to incorporate an airbrush effect
US5367343A (en) * 1992-07-30 1994-11-22 Global Telecommunications Industries, Inc. Motion enhanced compressed video system
US5428399A (en) * 1991-04-15 1995-06-27 Vistek Electronics Limited Method and apparatus for image translation with improved motion compensation
US5440336A (en) * 1993-07-23 1995-08-08 Electronic Data Systems Corporation System and method for storing and forwarding audio and/or visual information on demand
US5502503A (en) * 1990-06-27 1996-03-26 Koz; Mark C. Digital color TV for personal computers
US5547382A (en) * 1990-06-28 1996-08-20 Honda Giken Kogyo Kabushiki Kaisha Riding simulation system for motorcycles
US5550982A (en) * 1993-06-24 1996-08-27 Starlight Networks Video application server
US5646697A (en) * 1994-01-19 1997-07-08 Sony Corporation Special effects video processor
US5701163A (en) * 1995-01-18 1997-12-23 Sony Corporation Video processing method and apparatus
US5767921A (en) * 1996-05-06 1998-06-16 Winbond Electronics Corp. Method and apparatus for gradually adjusting color components of digital video data
US5777689A (en) * 1996-04-10 1998-07-07 Tektronix, Inc. Method and apparatus for video signal sharpening
US5793436A (en) * 1996-06-17 1998-08-11 Samsung Electronics Co., Ltd. Buffer occupancy control method for use in video buffering verifier
US5821986A (en) * 1994-11-03 1998-10-13 Picturetel Corporation Method and apparatus for visual communications in a scalable network environment
US5844618A (en) * 1995-02-15 1998-12-01 Matsushita Electric Industrial Co., Ltd. Method and apparatus for telecine image conversion
US6005638A (en) * 1996-03-04 1999-12-21 Axcess, Inc. Frame averaging for use in processing video data


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Halsall, "Internetworking," Data Communications, Computer Networks and Open Systems, pp. 499-501, 1996. *

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080219493A1 (en) * 2004-03-30 2008-09-11 Yoav Tadmor Image Processing System
US7453519B2 (en) * 2004-03-30 2008-11-18 Olympus Corporation Method and apparatus for converting images at a given frame or field rate to second and third frame or field rates while maintaining system synchronism
US9171374B2 (en) 2004-03-30 2015-10-27 University Of Newcastle Upon Tyne Method and apparatus to highlight information in complex visual environments
WO2005096128A3 (en) * 2004-03-30 2009-05-07 Univ Newcastle Image processing system
US8649551B2 (en) 2004-03-30 2014-02-11 University Of Newcastle Upon Tyne Method and apparatus to highlight information in complex visual environments
US20070070221A1 (en) * 2004-08-11 2007-03-29 Sony Corporation Image processing apparatus and method, recording medium, and program
US7602440B2 (en) * 2004-08-11 2009-10-13 Sony Corporation Image processing apparatus and method, recording medium, and program
US20090060373A1 (en) * 2007-08-24 2009-03-05 General Electric Company Methods and computer readable medium for displaying a restored image
US8122356B2 (en) 2007-10-03 2012-02-21 Eastman Kodak Company Method for image animation using image value rules
US20090094518A1 (en) * 2007-10-03 2009-04-09 Eastman Kodak Company Method for image animation using image value rules
CN101472131B (en) * 2007-12-28 2012-07-04 希姆通信息技术(上海)有限公司 Visual telephone with movement perceptive function and method for enhancing image quality
US20100033626A1 (en) * 2008-08-05 2010-02-11 Samsung Electronics Co., Ltd. Image processing apparatus and control method thereof
US20110109742A1 (en) * 2009-10-07 2011-05-12 Robert Laganiere Broker mediated video analytics method and system
WO2011041903A1 (en) * 2009-10-07 2011-04-14 Telewatch Inc. Video analytics with pre-processing at the source end
US9788017B2 (en) 2009-10-07 2017-10-10 Robert Laganiere Video analytics with pre-processing at the source end
US9420250B2 (en) 2009-10-07 2016-08-16 Robert Laganiere Video analytics method and system
US9143739B2 (en) 2010-05-07 2015-09-22 Iwatchlife, Inc. Video analytics with burst-like transmission of video data
US8885007B2 (en) 2010-08-04 2014-11-11 Iwatchlife, Inc. Method and system for initiating communication via a communication network
US8860771B2 (en) 2010-08-04 2014-10-14 Iwatchlife, Inc. Method and system for making video calls
US8780162B2 (en) 2010-08-04 2014-07-15 Iwatchlife Inc. Method and system for locating an individual
US8358347B2 (en) * 2010-08-17 2013-01-22 Hewlett-Packard Development Company, L.P. Frame rate measurement
US20120044359A1 (en) * 2010-08-17 2012-02-23 Voltz Christopher D Frame rate measurement
US9667919B2 (en) 2012-08-02 2017-05-30 Iwatchlife Inc. Method and system for anonymous video analytics processing

Similar Documents

Publication Publication Date Title
US6462785B1 (en) Motion display technique
US9986306B2 (en) Method and apparatus for selection of content from a stream of data
US6637031B1 (en) Multimedia presentation latency minimization
US7242850B2 (en) Frame-interpolated variable-rate motion imaging system
US6141053A (en) Method of optimizing bandwidth for transmitting compressed video data streams
US6166730A (en) System for interactively distributing information services
EP0702492B1 (en) Interactive video system
US20030088646A1 (en) Random access video playback system on a network
US20060116164A1 (en) System and method for divided display on multiple mobile terminals
US20190184284A1 (en) Method of transmitting video frames from a video stream to a display and corresponding apparatus
US20070113261A1 (en) System for transmitting video streams using a plurality of levels of quality over a data network to a remote receiver
US20020174438A1 (en) System and method for time shifting the delivery of video information
US20050034005A1 (en) Method and system for synchronizing data
CN108810636A (en) Video broadcasting method, equipment and system
US5404446A (en) Dual buffer video display system for the display of asynchronous irregular frame rate video data
CA2364733A1 (en) System and method for interactive distribution of selectable presentations
US7069575B1 (en) System for interactively distributing information services
WO1994018776A2 (en) Multimedia distribution system
CA2275058A1 (en) Method of discrete interactive multimedia list broadcasting
CA2196571A1 (en) System and method for telecommunication
CN106331530B (en) A kind of simultaneously and rapidly switching display methods of video wall, decoding device
US7937735B2 (en) Apparatus for the decoding of video data in first and second formats
JPH11313301A (en) Program distribution system, program distributor, program quality converter and program receiver
US20130227062A1 (en) Apparatus and method of displaying a contents using for key frame in a terminal
EP0817486A2 (en) Method for altering a broadcast transmission as a function of its recipient on a communications network

Legal Events

Date Code Title Description
AS Assignment

Owner name: LUCENT TECHNOLOGIES INC., NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CARRARO, GIANPAOLO U.;EDMARK, JOHN T.;ENSOR, JAMES ROBERT;REEL/FRAME:008606/0306

Effective date: 19970604

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12