US20090135200A1 - Selective Edge Blending Based on Displayed Content - Google Patents

Selective Edge Blending Based on Displayed Content

Info

Publication number
US20090135200A1
Authority
US
United States
Prior art keywords
blending
edges
images
pair
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/922,540
Inventor
Mark Alan Schultz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Assigned to THOMSON LICENSING. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SCHULTZ, MARK ALAN
Publication of US20090135200A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/14Display of multiple viewports
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • H04N9/3147Multi-projection systems
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2300/00Aspects of the constitution of display devices
    • G09G2300/02Composition of display devices
    • G09G2300/026Video wall, i.e. juxtaposition of a plurality of screens to create a display screen of bigger dimensions
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2310/00Command of the display device
    • G09G2310/02Addressing, scanning or driving the display screen or processing steps related thereto
    • G09G2310/0232Special driving of display border areas
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/10Mixing of images, i.e. displayed pixel being the result of an operation, e.g. adding, on the corresponding input pixels


Abstract

A method and an image processing system for blending edges of images for collective display. The method includes the step of evaluating at least a pair of images whose edges border each other when displayed to determine whether the collective display of images will benefit from blending of the edges (113). If so, at least portions of the edges are blended.

Description

    FIELD OF THE INVENTION
  • The present invention generally relates to image processing and, more particularly, to processing segmented images for display.
  • BACKGROUND OF THE INVENTION
  • A segmented display simultaneously presents multiple images. A segmented display can comprise a single display that presents multiple images simultaneously in different regions of the display, an array of display panels in which the display panels cooperate to present images, a projection system using a plurality of projectors to project multiple images, or any other display system which can display a plurality of images. Sometimes each of the images remains distinct from the other displayed images. Other times the adjacent images together form a larger image.
  • When adjacent images form a larger image, the images typically overlap to ensure blank regions do not appear between the individual images. With adjacent images forming a larger image, edge blending often occurs to blend the seams of the adjacent images by evening out the brightness in the seamed area. When multiple projectors project images onto a flexible screen, however, movement of the screen can cause edges of a blended seam to become misaligned, which is undesirable. Moreover, evening out the brightness reduces contrast. When multiple images are not being used to form a single large image, but instead provide multiple independent images, the reduction in contrast can become undesirable.
  • SUMMARY OF THE INVENTION
  • The present invention relates to a method and an image processing system for blending edges of images for collective display. The method includes the step of evaluating at least a pair of images whose edges border each other when displayed to determine whether the collective display of images will benefit from blending of the edges. If so, at least portions of the edges undergo blending.
  • Another embodiment of the present invention can include a machine-readable storage being programmed to cause a machine to perform the various steps described herein.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Preferred embodiments of the present invention will be described below in more detail, with reference to the accompanying drawings, in which:
  • FIG. 1 depicts a flowchart, which is useful for understanding the present invention.
  • FIG. 2 depicts a segmented display having presented thereon a group of images.
  • FIG. 3 depicts the segmented display having presented thereon another group of images.
  • FIG. 4 a depicts the segmented display having presented thereon yet another group of images.
  • FIG. 4 b depicts an exploded view of individual images presented on the segmented display of FIG. 4 a.
  • FIG. 5 depicts a block diagram of an image processing system, which is useful for understanding the present invention.
  • DETAILED DESCRIPTION
  • FIG. 5 depicts a block diagram of an image processing system 500 which is useful for understanding the present invention. The image processing system 500 can include frame buffers 502, 504, a seaming controller 506 and a Look-up Table (LUT)/algorithm controller 508, each of which receives image data 510. The seaming controller 506 serves to evaluate images for display in accordance with the methods described herein to selectively control edge blending processors 512, which are used to selectively apply edge blending. The LUT/algorithm controller 508 evaluates images to be displayed and modifies the look-up tables (LUTs) and/or selects the algorithms 514 that are used by the edge blending processors 512, each executing at least one edge blending process, to compute pixel values that implement edge blending. Moreover, if the seaming controller 506 instructs the edge blending processors 512 to blend a portion of a particular seam while another portion of the seam should remain unblended, the LUT/algorithm controller 508 can modify the look-up tables and/or algorithms used by the edge blending processors 512 so that selective blending is applied as required. Such look-up tables and algorithms are known to the skilled artisan.
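As a minimal sketch of what an edge blending processor driven by a LUT might compute, the snippet below weights the overlapping columns of two adjacent images with a linear fall-off. The function names, the NumPy representation, and the linear LUT shape are illustrative assumptions, not the patent's implementation.

```python
# Illustrative sketch only: a linear blend LUT applied across a seam's
# overlap region. The patent does not prescribe this particular LUT shape.
import numpy as np

def build_blend_lut(overlap_width: int) -> np.ndarray:
    """Blend weights for the left image, falling from 1.0 to 0.0 across the overlap."""
    return np.linspace(1.0, 0.0, overlap_width)

def blend_overlap(left_strip: np.ndarray, right_strip: np.ndarray,
                  lut: np.ndarray) -> np.ndarray:
    """Weight the two overlapping strips so their combined brightness stays even."""
    return left_strip * lut + right_strip * (1.0 - lut)

# Example: a 2-row, 4-pixel-wide overlap between two uniform grey strips.
lut = build_blend_lut(4)
left = np.full((2, 4), 200.0)   # right edge of the left image
right = np.full((2, 4), 180.0)  # left edge of the right image
print(blend_overlap(left, right, lut))
```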
  • A plurality of frame buffers 502, 504 serve to assemble incoming image data 510 before it is processed by the seaming controller 506, the LUT/algorithm controller 508 and the edge blending processors 512. Each frame buffer 502, 504 can include a plurality of sections 502-1, 502-2, 502-3, 502-4 and 504-1, 504-2, 504-3, 504-4, respectively, of frame memory. For example, a frame memory in each frame buffer 502, 504 can be allocated to a respective display system 516. The frame buffer 502 can be used to store data of a first frame, and then frame buffer 504 serves to store data of a next frame. Accordingly, while data undergoes storage in the frame buffer 504, the frame buffer 502 can be read into the blending processors 512 and forwarded to the display systems 516. In a similar manner, while data is being stored to frame buffer 502, frame buffer 504 can be read into the blending processors 512 and forwarded to the display systems 516. In one arrangement, the architecture can duplicate the seamed pixels at the input to the frame buffers 502, 504. In another arrangement, seamed pixels can be read from the frame buffers 502, 504 twice to build the edge blended seams. Nonetheless, other arrangements can be implemented and the invention is not limited in this regard.
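The ping-pong arrangement described above can be roughly illustrated as follows: while one buffer receives the incoming frame, the other is read out toward the blending stage. The class and its methods are assumptions made for clarity, not the architecture of the patent.

```python
# Illustrative ping-pong frame buffering: while one buffer receives the
# incoming frame, the other is read out toward the blending processors.
import numpy as np

class PingPongFrameBuffer:
    def __init__(self, height: int, width: int):
        self._buffers = [np.zeros((height, width)), np.zeros((height, width))]
        self._write_index = 0  # buffer currently receiving incoming image data

    def store(self, frame: np.ndarray) -> None:
        self._buffers[self._write_index][:] = frame

    def read_for_blending(self) -> np.ndarray:
        # The buffer that is not being written is the one handed to the
        # edge blending processors and then forwarded to the display systems.
        return self._buffers[1 - self._write_index]

    def swap(self) -> None:
        self._write_index = 1 - self._write_index
```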
  • After selectively applying edge blending, where required, the edge blending processors 512 will forward processed images to a respective portion of a display system 516 for presentation. The display system 516 can comprise a segmented display having a single display in which multiple images are simultaneously presented in different regions of the display, an array of display panels in which the display panels cooperate to present images, a projection system using a plurality of projectors to project multiple images, or any other display system which can display a plurality of images.
  • The image processing system of FIG. 5 can be realized in hardware, software, or a combination of hardware and software. The image processing system can be realized in a centralized fashion in one processing system or in a distributed fashion where different elements are spread across several interconnected processing systems. Any kind of processing system or other apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware and software can be a computer system with a computer program that, when being loaded and executed, controls the computer system such that it carries out the methods described herein.
  • The present invention also can be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein, and which when loaded in a processing system is able to carry out these methods. Computer program, software, or software application, in the present context, means any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.
  • FIG. 1 depicts a flowchart, which is useful for understanding a method 100 capable of being practiced by the apparatus of FIG. 5 for implementing the present invention. The method commences at step 105 with the receipt of image data for images to be presented by the segmented display system of FIG. 5. During step 110 of FIG. 1, a first seam, formed by a pair of adjacent images, is selected. Proceeding to step 115, the adjacent images undergo evaluation to determine whether the images will benefit from edge blending of the selected seam. For instance, data representing the positioning of the images in a presentation, and whether the images cooperate to form a larger image, undergo processing by the image processing system of FIG. 5 as discussed previously. In addition, the type of display that is used to present the images can be considered as part of the evaluation process. The display type can be received as a user selectable input entered into the image processing system.
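A simplified sketch of the per-seam decision made at steps 110-115 might look as follows. The two criteria shown (whether the pair of images cooperates to form a larger picture and whether the screen is flexible) paraphrase the text; the Seam dataclass and everything else are illustrative assumptions.

```python
# Illustrative per-seam evaluation: blend only when the adjacent images
# cooperate to form a larger image and the display is not a flexible screen.
from dataclasses import dataclass

@dataclass
class Seam:
    left_image: str
    right_image: str
    images_cooperate: bool  # True if the pair forms part of a larger image

def seam_benefits_from_blending(seam: Seam, display_is_flexible: bool) -> bool:
    return seam.images_cooperate and not display_is_flexible

seams = [Seam("202", "204", True), Seam("302", "304", False)]
for seam in seams:
    decision = seam_benefits_from_blending(seam, display_is_flexible=False)
    print(seam.left_image, seam.right_image,
          "blend" if decision else "leave unblended")
```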
  • FIG. 2 depicts a segmented display 200 useful for understanding the present invention. The display 200 of FIG. 2 includes a first group of images 202, 204, 206, 208 for presentation. In this example, the images 202, 204, 206, 208 cooperate to form a larger image 210. Seams 212, 214, 216, 218 form at the boundaries of adjacent ones of the images 202, 204, 206, 208, respectively. To maximize the image quality of the larger image 210, adjacent ones of the images 202, 204, 206, 208 should blend smoothly together. Accordingly, the seams 212, 214, 216, 218 can benefit from edge blending, for example if the display 200 does not undergo significant movement. Nonetheless, if the display 200 comprises a flexible display, such as a projection screen, the images likely will not benefit from edge blending since movement of the screen can cause misalignment of the images.
  • Referring to FIG. 3, the display 200 presents a second group of images 302, 304, 306, 308. In contrast to the first group of images 202, 204, 206, 208 of FIG. 2, the second group of images 302, 304, 306, 308 of FIG. 3 do not cooperate to form a single larger image, but instead each presents a self-contained image. In this instance smooth blending of the images 302, 304, 306, 308 generally will not prove desirable. Accordingly, the seams 312, 314, 316, 318 will not benefit from edge blending.
  • Referring to FIG. 4 a, the display 200 presents a third group of images 402, 404, 406, 408, 410. In this example, images 402, 404, 406, 408 cooperate to form a single larger image, while a self-contained image 410 overlays images 402, 404, 406, 408. Priority overlays of this kind are known in the art. In this instance smoothly blending the images 402, 404, 406, 408 will prove desirable, while image 410 will not undergo blending with the other images 402, 404, 406, 408. Accordingly, seams 412, 414, 416 will benefit from edge blending, while seams 420, 422, 424, 426, 428, 430 will not benefit from edge blending.
  • Referring to decision box 120 of FIG. 1, if the images will not benefit from edge blending of the selected seam, data values which do not implement edge blending of the selected seam are selected, and/or an image-processing algorithm that does not implement edge blending of the selected seam can be selected, as shown in step 125.
  • Proceeding to decision box 128 of FIG. 1, a decision occurs whether to apply a black border at the selected seam. For example, if the adjacent images are significantly different or contrast starkly, a black border generally will prove desirable. At step 130, the black border can be applied at the selected seam to separate the adjacent images forming the seam. The black border can be generated by elevation of black levels. Such black levels are known to the skilled artisan. When a flexible screen serves to display the images, the placement of black borders around the images can minimize the perception of distortion caused by movement of the images relative to one another when the screen moves. If a decision is made not to apply the black border, step 130 can be skipped.
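One way a black border could be written over the pixels adjacent to an unblended seam is sketched below; the border width, the column-wise seam orientation, and the zero black level are assumptions made for illustration, not details taken from the patent.

```python
# Illustrative black border at an unblended vertical seam: a narrow band of
# pixels centred on the seam column is forced to the black level.
import numpy as np

def apply_black_border(image: np.ndarray, seam_column: int,
                       border_width: int = 4, black_level: float = 0.0) -> np.ndarray:
    bordered = image.copy()
    half = border_width // 2
    bordered[:, max(seam_column - half, 0): seam_column + half] = black_level
    return bordered

frame = np.full((4, 16), 128.0)
print(apply_black_border(frame, seam_column=8)[0])  # zeros appear around column 8
```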
  • At step 135, if the adjacent images will benefit from edge blending of the selected seam, data values which implement edge blending of the selected seam can be selected, and/or an image-processing algorithm that implements edge blending of the selected seam can be selected. The seam then can be blended in accordance with the data values and/or image-processing algorithm, as shown in step 140. At step 145, a next seam formed by a pair of adjacent images can be selected and the process can repeat until all seams to be displayed are evaluated.
  • Briefly referring again to FIG. 4 b, an exploded view of images 402, 404 appears. The images 402, 404 each include a region 432, 434, respectively, which overlap at seam 412. Figuratively speaking, portions 436, 438 of the respective regions 432, 434 lie beneath image 410, which constitutes an overlay image. Accordingly, seaming and blending need not occur in portions 436, 438 since they will not be visible. Notably, edge blending of a seam can occur on a pixel-by-pixel basis so that certain portions 440, 442 of the respective regions 432, 434 undergo edge blending while portions 436, 438 do not.
  • Further, in an arrangement in which a first projector projects image 402 and a second projector projects image 404, pixels in portion 436 of image 402 can be set to zero so that the first projector projects minimum light for portion 436. Accordingly, a portion of image 410 that lies over the seam 412 will undergo projection exclusively by a single projector, namely the second projector. This arrangement can be implemented to maximize the quality of image 410.
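The per-pixel handling just described might be sketched as follows, assuming a boolean mask marking where the overlay image 410 covers the seam; the mask representation, the two-projector split, and the function's return convention are illustrative assumptions rather than the patent's stated data structures.

```python
# Illustrative per-pixel seam handling around an overlay: pixels covered by
# the overlay get no contribution from the first projector, so the overlay is
# shown by the second projector alone; uncovered pixels blend normally.
import numpy as np

def blend_overlap_with_overlay(left_strip: np.ndarray,
                               right_strip: np.ndarray,
                               overlay_mask: np.ndarray):
    weights = np.linspace(1.0, 0.0, left_strip.shape[1])
    left_out = left_strip * weights             # first projector's contribution
    right_out = right_strip * (1.0 - weights)   # second projector's contribution
    left_out[overlay_mask] = 0.0                # overlay-covered pixels: first projector dark
    right_out[overlay_mask] = right_strip[overlay_mask]  # overlay shown at full weight
    return left_out, right_out
```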
  • The present invention relates to a method and a system for selectively implementing edge blending of adjacent images in a segmented display system. More particularly, the present invention implements edge blending on adjacent images exclusively when such edge blending will improve the appearance of images being displayed, while not blending adjacent images when such images will not benefit from edge blending. For example, edge blending can be turned off when smaller images being displayed do not cooperate to form a larger image, but instead present separate distinct images on a display. Edge blending also can be turned off when multiple projectors are used to project adjacent images onto a flexible screen that is subject to movement. When edge blending is not implemented, black borders can be placed around the images. Advantageously, placing black borders around the images can minimize perception of the movement of images relative to one another when movement of the screen occurs.
  • While the foregoing is directed to the preferred embodiment of the present invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. Further, ordinal references in the specification are provided to describe distinct features of the invention, but such ordinal references do not limit the scope of the present invention. Accordingly, the scope of the present invention is determined by the claims that follow.

Claims (17)

1. A method for blending edges of images for collective display, comprising the steps of:
evaluating at least a pair of images whose edges border each other when displayed to determine whether the collective display of the at least pair of images will benefit from blending of the edges; and, if so,
blending at least first portions of the edges of the at least pair of images.
2. The method according to claim 1, wherein said blending step further comprises the step of changing data values in a look-up-table.
3. The method according to claim 1, wherein said blending step further comprises the step of selecting at least one blending algorithm optimal for blending the edges, and the blending of the edges is performed in accordance with the selected at least one blending algorithm.
4. The method according to claim 1, wherein the first portions of the edges are blended, and at least second portions of the edges are not blended.
5. The method according to claim 1, wherein the edges are not blended if the collective display of the at least pair of images will not benefit from blending.
6. The method according to claim 5, further comprising the step of changing data values in a look-up-table to prevent blending of the edges.
7. The method according to claim 5, further comprising the step of selecting at least one display algorithm optimal for presenting the edges as unblended, wherein the edges are presented in accordance with the selected at least one display algorithm.
8. A machine readable storage, having stored thereon a computer program having a plurality of code sections executable by a machine for causing the machine to selectively implement edge blending by performing the steps of:
evaluating at least a pair of images whose edges border each other when displayed to determine whether the collective display of the at least pair of images will benefit from blending of the edges; and, if so,
blending at least first portions of the edges of the at least pair of images.
9. The machine readable storage of claim 8, wherein said blending step comprises the step of changing data values in a look-up-table.
10. The machine readable storage of claim 8, wherein said blending step comprises the step of selecting at least one blending algorithm optimal for blending the edges, and the blending of the edges is performed in accordance with the selected at least one blending algorithm.
11. The machine readable storage of claim 8, wherein the first portions of the edges are blended, and at least second portions of the edges are not blended.
12. The machine readable storage of claim 8, wherein the edges are not blended if the collective display of the at least pair of images will not benefit from blending.
13. The machine readable storage of claim 12, further causing the machine to perform the step of changing data values in a look-up-table to prevent blending of the edges.
14. The machine readable storage of claim 12, further causing the machine to perform the step of selecting at least one display algorithm optimal for presenting the edges as unblended, wherein the edges are presented in accordance with the selected at least one display algorithm.
15. Apparatus for displaying images comprising:
means for receiving images for display;
means for evaluating at least a pair of images whose edges border each other when displayed to determine whether the collective display of the at least pair of images will benefit from blending of the edges; and
means for blending at least first portions of the edges of the at least pair of images when the at least pair of images will benefit from blending of the edges.
16. The apparatus according to claim 15 wherein the evaluating means further comprises a look-up table and algorithm controller.
17. The apparatus according to claim 15 wherein the blending means further comprises at least one edge blending processor which executes at least one edge blending process in response to data from the evaluating means to carry out edge blending.
US11/922,540 2005-06-28 2005-06-28 Selective Edge Blending Based on Displayed Content Abandoned US20090135200A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2005/022674 WO2007001298A1 (en) 2005-06-28 2005-06-28 Selective edge blending based on displayed content

Publications (1)

Publication Number Publication Date
US20090135200A1 (en) 2009-05-28

Family

ID=35695991

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/922,540 Abandoned US20090135200A1 (en) 2005-06-28 2005-06-28 Selective Edge Blending Based on Displayed Content

Country Status (2)

Country Link
US (1) US20090135200A1 (en)
WO (1) WO2007001298A1 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014215604A (en) * 2013-04-30 2014-11-17 ソニー株式会社 Image processing apparatus and image processing method
WO2017015991A1 (en) * 2015-07-27 2017-02-02 南京巨鲨显示科技有限公司 Image combination processing system arranged in display

Citations (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5437946A (en) * 1994-03-03 1995-08-01 Nikon Precision Inc. Multiple reticle stitching for scanning exposure system
US5703604A (en) * 1995-05-22 1997-12-30 Dodeca Llc Immersive dodecaherdral video viewing system
US5706025A (en) * 1989-05-22 1998-01-06 Tektronix, Inc. Smooth vertical motion via color palette manipulation
US5963247A (en) * 1994-05-31 1999-10-05 Banitt; Shmuel Visual display systems and a system for producing recordings for visualization thereon and methods therefor
US6229550B1 (en) * 1998-09-04 2001-05-08 Sportvision, Inc. Blending a graphic
US6292713B1 (en) * 1999-05-20 2001-09-18 Compaq Computer Corporation Robotic telepresence system
US20020008675A1 (en) * 2000-06-14 2002-01-24 Theodore Mayer Method and apparatus for seamless integration of images using a transmissive/reflective mirror
US20020063726A1 (en) * 1999-05-20 2002-05-30 Jouppi Norman P. System and method for displaying images using anamorphic video
US6456339B1 (en) * 1998-07-31 2002-09-24 Massachusetts Institute Of Technology Super-resolution display
US20020180727A1 (en) * 2000-11-22 2002-12-05 Guckenberger Ronald James Shadow buffer control module method and software construct for adjusting per pixel raster images attributes to screen space and projector features for digital warp, intensity transforms, color matching, soft-edge blending, and filtering for multiple projectors and laser projectors
US6545685B1 (en) * 1999-01-14 2003-04-08 Silicon Graphics, Inc. Method and system for efficient edge blending in high fidelity multichannel computer graphics displays
US6568816B2 (en) * 2000-10-04 2003-05-27 Panoram Technologies, Inc. Projection system and method for using a single light source to generate multiple images to be edge blended for arrayed or tiled display
US6570623B1 (en) * 1999-05-21 2003-05-27 Princeton University Optical blending for multi-projector display wall systems
US20030206179A1 (en) * 2000-03-17 2003-11-06 Deering Michael F. Compensating for the chromatic distortion of displayed images
US20030208345A1 (en) * 2002-05-02 2003-11-06 O'neill Julia Catherine Color matching and simulation of multicolor surfaces
US20040159636A1 (en) * 1999-09-09 2004-08-19 Torbjorn Sandstrom Data path for high performance pattern generator
US6804406B1 (en) * 2000-08-30 2004-10-12 Honeywell International Inc. Electronic calibration for seamless tiled display using optical function generator
US20040251574A1 (en) * 2003-06-13 2004-12-16 Collins David C. Methods to produce an object through solid freeform frabrication
US20060007239A1 (en) * 2004-07-06 2006-01-12 Harrison Charles F Color correction system
US7015954B1 (en) * 1999-08-09 2006-03-21 Fuji Xerox Co., Ltd. Automatic video system using multiple cameras
US7079287B1 (en) * 2000-08-01 2006-07-18 Eastman Kodak Company Edge enhancement of gray level images
US20060238723A1 (en) * 2005-04-22 2006-10-26 El-Ghoroury Hussein S Low profile, large screen display using a rear projection array system
US20060250415A1 (en) * 2005-05-09 2006-11-09 Microsoft Corporation Anti-aliasing content using opacity blending
US20070081205A1 (en) * 2000-08-01 2007-04-12 Hwai-Tzuu Tai Image recording apparatus and method providing personalized color enhancement
US20070206008A1 (en) * 2000-02-25 2007-09-06 The Research Foundation Of The State University Of New York Apparatus and Method for Real-Time Volume Processing and Universal Three-Dimensional Rendering
US20070291189A1 (en) * 2006-06-16 2007-12-20 Michael Harville Blend maps for rendering an image frame


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110216284A1 (en) * 2010-03-02 2011-09-08 Canon Kabushiki Kaisha Automatic mode switching between single and multiple projectors
US8439504B2 (en) 2010-03-02 2013-05-14 Canon Kabushiki Kaisha Automatic mode switching between single and multiple projectors
US20150269913A1 (en) * 2014-03-19 2015-09-24 Canon Kabushiki Kaisha Display apparatus
US9830842B2 (en) * 2014-03-19 2017-11-28 Canon Kabushiki Kaisha Display apparatus for performing control to change a position on an image where an additional image is superimposed
JP2022058677A (en) * 2014-11-28 2022-04-12 株式会社半導体エネルギー研究所 Display device and image processing method for display device
US20160343116A1 (en) * 2015-05-22 2016-11-24 Samsung Electronics Co., Ltd. Electronic device and screen display method thereof
JP2017083710A (en) * 2015-10-29 2017-05-18 キヤノン株式会社 Image processing device, image processing method, and program

Also Published As

Publication number Publication date
WO2007001298A1 (en) 2007-01-04

Similar Documents

Publication Publication Date Title
US20090135200A1 (en) Selective Edge Blending Based on Displayed Content
US7936361B2 (en) System and method for masking and overlaying images in multiple projector system
US8325198B2 (en) Color gamut mapping and brightness enhancement for mobile displays
US7907792B2 (en) Blend maps for rendering an image frame
US7854518B2 (en) Mesh for rendering an image frame
US9098922B2 (en) Adaptive image blending operations
US20070291184A1 (en) System and method for displaying images
US20070291047A1 (en) System and method for generating scale maps
US20070291185A1 (en) System and method for projecting multiple image streams
US8625154B2 (en) Apparatus and method for reproducing optimized preference color using candidate images and natural languages
JP2006108873A (en) Dynamic image processor and method
JPH0296485A (en) Picture generator
US20070103483A1 (en) Adaptive alpha blending
JP5089783B2 (en) Image processing apparatus and control method thereof
US7474438B2 (en) Wide gamut mapping method and apparatus
US20080246883A1 (en) Image processing program, image processing method, and image processor
JP4930781B2 (en) Image correction circuit, image correction method, and image display apparatus
JP2006033672A (en) Curved surface multi-screen projection method, and its device
US8077187B2 (en) Image display using a computer system, including, but not limited to, display of a reference image for comparison with a current image in image editing
JP6837860B2 (en) Image display control device, image display control method, and image display control program
JP2004147143A (en) Multi-image projection method using a plurality of projectors, projector device therefor, program, and recording medium
US6647151B1 (en) Coalescence of device independent bitmaps for artifact avoidance
JP5839808B2 (en) Information processing apparatus, information processing method, and program
JP3558073B2 (en) Image processing apparatus, image processing method, medium recording image processing control program
JPH05249951A (en) Image information presenting device

Legal Events

Date Code Title Description
AS Assignment

Owner name: THOMSON LICENSING, FRANCE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SCHULTZ, MARK ALAN;REEL/FRAME:020315/0890

Effective date: 20051007

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION