US20020094132A1 - Method, apparatus and computer program product for generating perspective corrected data from warped information - Google Patents

Method, apparatus and computer program product for generating perspective corrected data from warped information

Info

Publication number
US20020094132A1
Authority
US
United States
Prior art keywords
region
data space
points
computer
mapped
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/056,476
Inventor
Robert Hoffman
John Furlan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chartoleaux KG LLC
Original Assignee
Be Here Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Be Here Corp filed Critical Be Here Corp
Priority to US10/056,476 priority Critical patent/US20020094132A1/en
Publication of US20020094132A1 publication Critical patent/US20020094132A1/en
Assigned to BE HERE CORPORATION. Conveyance: release by secured party (see document for details). Assignor: WASSERSTEIN ADELSON VENTURES, L.P.
Assigned to B.H. IMAGE CO. LLC. Conveyance: assignment of assignors' interest (see document for details). Assignor: BE HERE CORPORATION.


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • G06T15/20Perspective computation

Definitions

  • This invention relates to the field of computer technology for mapping one data space to another data space.
  • Computers are often used to map data that exists in a source data space to a destination data space. This type of mapping is often used in “virtual reality” and “telepresence” applications.
  • the data in the source data space can represent a warped image that is obtained by a distorting lens such as a fisheye lens or a catadioptric lens.
  • the data in the destination data space can be presented by a presentation device such as a video screen, computer monitor or printer. The problem is how to rapidly generate the data for the destination data space from the source data space.
  • One approach is to backward map the coordinates of a point in the destination data space to coordinates in the source data space and to obtain the value for the point in the destination data space from the mapped point in the source data space.
  • Precisely mapping each point is expensive in either memory or computation, or both.
  • Another approach is to precisely map a grid of points from the destination data space to the source data space. These grid points bound regions (patches) that contain pixels that have similar mapping as the grid points that bound the region. Thus, the precisely mapped grid points are used to determine coefficients for a mapping that can be applied to each point in the region.
  • Each of the grid points in the destination data space has a corresponding grid point in the source data space. Thus, the destination grid point and the corresponding source grid point are referred to as a “point pair.”
  • (x_s, y_s) is the resulting coordinate in the source data space
  • (x_d, y_d) are the coordinates of the pixel in the destination data space
  • a, b, c, d, e, f, t_x, and t_y are the perspective transform coefficients.
  • This calculation includes at least six multiply operations, two division operations, and six add operations. The multiply and division operations are computationally expensive. Although this equation does not generate mapping artifacts between the regions, the additional computational overhead is often prohibitive.
  • Another approach is to generate a look-up table that provides the x_s and y_s coordinates when given the x_d and y_d coordinates.
  • these look-up tables become unwieldy even with modern computers.
  • Another approach is to approximately map the region using a less computationally expensive formula.
  • an affine transformation can be used to perform this mapping.
  • three of the four sets of destination and source point pairs are used to compute the affine coefficients for the transformation.
  • coordinates that specify a pixel in the destination data space can be used to find the corresponding pixel in the source data space.
  • (x_s, y_s) is the resulting coordinate in the source data space
  • (x_d, y_d) are the coordinates of the pixel in the destination data space
  • a, b, c, d, e, and f are the affine coefficients for the grid region bounded by the precisely mapped grid points.
  • This calculation includes four multiply operations and four add operations for each pixel in the patch and so is still computationally expensive.
  • An additional problem with this approach is that an affine transformation often generates a very poor approximation to the perspective transformation.
  • the affine transformation only uses three of the four point pairs that bound the region.
  • the affine transformation can generate mapping artifacts (such as discontinuities) along edges of the quadrilateral defining the region in the source space.
  • FIG. 1A illustrates a quadrilateral patch in source space, indicated by general reference character 100, used to show mapping artifacts generated by the affine transformation.
  • the quadrilateral patch in source space 100 is bounded by grid points (such as a point A 101, a point B 103, a point C 105, and a point D 107).
  • Applying an affine transformation to this region would generate the patch bounded by the point A 101 , the point B 103 , the point C 105 and a point D′ 109 instead of approximating the original patch.
  • FIG. 1B illustrates a presentation of a correctly mapped image, indicated by general reference character 150 , that represents a magnified image presented on a presentation device.
  • This image does not have any mapping artifacts. It can be generated by precisely mapping each point in the destination data space to the source data space. It can also be generated by precisely mapping grid points, and using the perspective transformation previously discussed to map the points in each region defined by the grid. Compare FIG. 1B with FIG. 1C.
  • FIG. 1C illustrates a presentation of an incorrectly mapped image, indicated by general reference character 160 , that shows mapping artifacts 161 that can result from the use of the affine transformation.
  • mapping artifacts include discontinuities in lines.
  • Other mapping artifacts include (without limitation) texture discontinuities and color discontinuities.
  • the invention provides a fast and accurate means of mapping one data space into another by precisely mapping grid points between the data spaces and then by performing a bilateral-bilinear interpolation to map the points bounded by the precisely mapped grid points.
  • One aspect of the invention is a computer-controlled method that includes the step of determining a region in a destination data space.
  • the region is bounded by a plurality of grid points. It defines a first plurality of data points in the destination data space.
  • the method precisely maps the plurality of grid points in the destination data space to a plurality of mapped grid points in a source data space.
  • the source data space contains, or is associated with, a second plurality of data points.
  • the plurality of mapped grid points define a plurality of boundary lines that represent the boundary of the region as mapped into the source data space.
  • the method also applies a bilateral-bilinear interpolation algorithm to approximately map the first plurality of data points to the second plurality of data points.
  • Another aspect of the invention is an apparatus that includes a central processing unit (CPU) and a memory coupled to the CPU.
  • the apparatus also includes a region determination mechanism that is configured to determine a region in a destination data space.
  • the region is bounded by a plurality of grid points.
  • the region defines a first plurality of data points within the destination data space.
  • the apparatus also includes a precise mapping mechanism that is configured to precisely map the plurality of grid points determined by the region determination mechanism to a plurality of mapped grid points in a source data space.
  • the source data space contains (or associates) a second plurality of data points.
  • the plurality of mapped grid points define a plurality of boundary lines that represent the boundary of the region as mapped into the source data space.
  • the apparatus also includes a bilateral-bilinear interpolation mechanism that is configured to approximately map the first plurality of data points in the region to the second plurality of data points using the plurality of mapped grid points.
  • Yet another aspect of the invention is a computer program product that includes a computer usable storage medium having computer readable code embodied therein for causing a computer to map a destination data space to a source data space.
  • When executed on a computer, the computer readable code causes the computer to effect a precise mapping mechanism, a region determination mechanism, and a bilateral-bilinear interpolation mechanism.
  • Each of these mechanisms has the same functions as the corresponding mechanism of the previously described apparatus.
  • Still another aspect of the invention is a computer program product embodied in a carrier wave transmitting computer readable code therein for causing a computer to map a destination data space to a source data space.
  • When executed on a computer, the computer readable code causes the computer to effect a precise mapping mechanism, a region determination mechanism, and a bilateral-bilinear interpolation mechanism.
  • Each of these mechanisms has the same functions as the corresponding mechanism of the previously described apparatus.
  • FIG. 1A illustrates a mapping artifact resulting from an affine transformation
  • FIG. 1B illustrates a presentation of an image without mapping artifacts
  • FIG. 1C illustrates a presentation of an image with mapping artifacts
  • FIG. 2 illustrates a computer system capable of using the invention in accordance with a preferred embodiment
  • FIG. 3A illustrates a gridded destination data space in two-dimensions in accordance with a preferred embodiment
  • FIG. 3B illustrates a gridded source data space with a mapped destination data space in two-dimensions in accordance with a preferred embodiment
  • FIG. 3C illustrates a gridded destination data space with a mapped destination data space in three-dimensions in accordance with a preferred embodiment
  • FIG. 3D illustrates a gridded source data space with a mapped destination data space in three-dimensions in accordance with a preferred embodiment
  • FIG. 4A illustrates a gridded patch in two-dimensions in accordance with a preferred embodiment
  • FIG. 4B illustrates the gridded patch of FIG. 4A as mapped into the source data space in accordance with a preferred embodiment
  • FIG. 5 illustrates an overview of the process used to backward map pixels in a destination data space to a source data space in accordance with a preferred embodiment
  • FIG. 6 illustrates a bilateral-bilinear interpolation algorithm that backward maps pixels in a destination region to a source data space in accordance with a preferred embodiment.
  • Procedure—A procedure is a self-consistent sequence of computerized steps that lead to a desired result. These steps are defined by one or more computer instructions. These steps are performed by a computer executing the instructions that define the steps.
  • the term “procedure” can refer to a sequence of instructions, a sequence of instructions organized within a programmed-procedure or programmed-function, or a sequence of instructions organized within programmed-processes executing in one or more computers.
  • FIG. 2 illustrates a computer, indicated by general reference character 200, that incorporates the invention.
  • the computer 200 includes a processor 201 having a central processing unit (CPU) 203, a memory section 205, and an input/output (I/O) section 207.
  • the I/O section 207 is connected to a presentation device 211, a disk storage unit 213 and a CD-ROM drive unit 215.
  • the CD-ROM drive unit 215 can read a CD-ROM medium 217 that typically contains a program and data 219.
  • the CD-ROM drive unit 215 (along with the CD-ROM medium 217) and the disk storage unit 213 comprise a file storage mechanism (a filesystem).
  • Some embodiments of the invention include a network interface 221 that connects the computer 200 to a network 223.
  • An application program 225 executes from the memory section 205.
  • the application program 225 can be loaded into the memory section 205 over the network 223 or from the filesystem.
  • the application program 225 includes computer code that causes the computer to perform the inventive steps.
  • the CD-ROM drive unit 215 (along with the CD-ROM medium 217) is illustrative of mechanisms that can be used to read computer code from removable media.
  • FIG. 3A through FIG. 3D illustrate some of the possible data spaces that can be mapped by this aspect of the invention.
  • FIG. 3A illustrates a gridded destination data space 300 showing a first grid point 301, a second grid point 303, a third grid point 305, and a fourth grid point 307. These points are bounding points for the destination data space.
  • Each of the intersections in the destination data space (for example, a fifth grid point 309) is precisely mapped to a source data space.
  • the grid points bound regions that contain data points that will be approximately mapped to the source data space.
  • the third grid point 305 and the fifth grid point 309 are two of the four grid points that bound a region 311 that contains points having mappings that will be interpolated.
  • a bilateral-bilinear interpolation algorithm performs this approximate mapping.
  • the bilateral-bilinear interpolation algorithm is subsequently described with respect to FIG. 5 and FIG. 6 as applied to patches in a two-dimensional data space.
  • FIG. 3B illustrates a gridded source data space 350 indicating how the destination data space is mapped to the source data space.
  • the resolution of the gridded destination data space 300 and the gridded source data space 350 need not be the same.
  • the gridded source data space 350 can contain (or reference) warped image data that represents a true image that has been warped by a lens.
  • a physical lens need not be used to generate the warped image as ray-tracing techniques through a virtual lens can also be used to generate the warped image.
  • a virtual lens can be used to generate images in a virtual-space. Once the image is generated, the invention can be used to present the image.
  • One aspect of the invention backward maps the destination data space to the source data space using a mapping that generates a perspective corrected image in the destination data space.
  • One step in this mapping process precisely maps the first grid point 301, the second grid point 303, the third grid point 305, the fourth grid point 307, the fifth grid point 309, and other grid points to the gridded source data space 350.
  • These grid points map to a mapped first grid point 301′, a mapped second grid point 303′, a mapped third grid point 305′, a mapped fourth grid point 307′, a mapped fifth grid point 309′ and other grid points respectively in the source data space.
  • the region 311 is mapped to a mapped region 311′.
  • FIG. 3C illustrates a 3-D gridded destination data space, indicated by general reference character 360, that has a first plane 361 (bounded by the first grid point 301, the second grid point 303, the third grid point 305 and the fourth grid point 307) and a second plane 363 (sp) (bounded by a sp-first grid point 365, a sp-second grid point 367, a sp-third grid point 369 and another point that cannot be seen in FIG. 3C).
  • FIG. 3D illustrates a 3-D gridded source data space, indicated by general reference character 370, that indicates how the 3-D gridded destination data space 360 is mapped to the 3-D gridded source data space 370.
  • a mapped first plane 361′ is bounded by the mapped first grid point 301′, the mapped second grid point 303′, the mapped third grid point 305′, and the mapped fourth grid point 307′.
  • a mapped second plane 363′ (msp) is bounded by a msp-second grid point 367′, and a msp-third grid point 369′, and two other points that cannot be seen in FIG. 3D.
  • FIG. 3C and FIG. 3D show how grid points can be imposed on three-dimensional spaces. Once the grid points are precisely mapped, the points contained in the region (the volume) between and including the first plane 361 and the second plane 363 can be interpolated by extending the subsequently described techniques. Similar techniques can be applied to n-dimensional spaces.
  • the bilateral-bilinear interpolation algorithm is applicable to n-dimensional spaces
  • subsequent discussion of the algorithm is directed to two-dimensional spaces containing image data.
  • Each region is a two-dimensional patch containing points that represent pixels.
  • One skilled in the art will understand how to modify the described algorithm to be applicable to higher dimensional spaces, for non-image data, and to a source data space that references the data.
  • the invention can be used (without limitation) to map a viewport onto spherical, cylindrical, and panoramic spaces.
  • FIG. 4A illustrates a patch in destination data space, indicated by general reference character 400, bounded by a first grid point 401, a second grid point 403, a third grid point 405 and a fourth grid point 407.
  • the destination patch 400 contains a number of pixels (in the illustration, 36 pixels) of which a pixel 409 is but one.
  • the bilateral-bilinear interpolation algorithm efficiently generates data values for the pixels contained in the destination patch 400.
  • the 36 pixels are arranged in six scan lines. Each scan line is six pixels long.
  • the destination patch 400 need not be square and may include more or fewer pixels than the 36 used in the illustration.
  • the grid points are mapped to the source data space as is shown with respect to FIG. 4B.
  • FIG. 4B illustrates a mapped patch in source data space, indicated by general reference character 420, indicating some of the parameters used by the bilateral-bilinear interpolation algorithm.
  • the mapped patch 420 is bounded by the mapped first grid point 401′, the mapped second grid point 403′, the mapped third grid point 405′, and the mapped fourth grid point 407′, each of which has been precisely mapped to the source data space from the corresponding point in the destination data space.
  • the data that is used to generate the value for the pixel 409 in the destination data space is located at a mapped pixel contribution area 409′.
  • the mapped pixel contribution area 409′ contains pixels of a warped image at a resolution possibly different from the pixel resolution in the destination data space. Techniques known in the art are used to determine the value of the pixel 409 based on the information within the mapped pixel contribution area 409′.
  • the mapped grid points define lines that bound the mapped patch 420.
  • the mapped first grid point 401′ and the mapped third grid point 405′ define a second boundary line 421;
  • the mapped second grid point 403′ and the mapped fourth grid point 407′ define a third boundary line 423;
  • the mapped first grid point 401′ and the mapped second grid point 403′ define a first boundary line 425; and the mapped third grid point 405′ and the mapped fourth grid point 407′ define a final boundary line 427.
  • a different geometry can be used other than the one described.
  • a first slope 428 represents the slope of the first boundary line 425.
  • a second slope 429 represents the slope of the second boundary line 421 and in the two-dimensional case includes delta-x and delta-y components.
  • a third slope 431 represents the slope of the third boundary line 423.
  • a final slope 435 represents the slope of the final boundary line 427.
  • the bilateral-bilinear interpolation algorithm operates by determining the second slope 429 and the third slope 431 for the boundary lines.
  • the second slope 429 and the third slope 431 need not be the same.
  • Each of these slopes is used to determine a respective delta-x and delta-y value that is dependent on the number of scan lines in the destination patch 400 (N_yd).
  • each pixel in the first scan line in the destination patch 400 is iterated.
  • a delta-x and delta-y representing the first slope 428 is determined responsive to the number of pixels in the scan line contained by the destination patch 400 (N_xd), and the coordinates of the starting pixel and the ending pixel.
  • the mapped pixel contribution area 409′ in the source data space is evaluated to determine a value for the destination pixel.
  • the corresponding position in the source data space is advanced by the delta-x and delta-y corresponding to the first slope 428.
  • subsequent scan lines in the destination patch 400 are processed.
  • the starting coordinates for a subsequent scan line in the mapped patch 420 are advanced by the delta-x and delta-y corresponding to the second slope 429, and the ending position of the subsequent scan line in the mapped patch 420 is advanced by the delta-x and delta-y corresponding to the third slope 431.
  • a subsequent slope 437 can be (and usually is) different from the first slope 428, the final slope 435, and any other subsequent slope.
  • Each subsequent scan line in the destination patch 400 is iterated (such that the last subsequent scan line is the final boundary line 427).
  • the bilateral-bilinear interpolation algorithm assures that adjacent patches correctly align.
  • the bilateral-bilinear interpolation algorithm does not generate mapping artifacts as shown in FIG. 1A and FIG. 1C.
  • FIG. 5 illustrates a mapping process, indicated by general reference character 500, used to backward map data points in a destination data space to data points in a source data space.
  • the mapping process 500 initiates at a ‘start’ terminal 501 and continues to an ‘initialization’ procedure 503.
  • the ‘initialization’ procedure 503 performs initialization steps for the mapping process 500. These steps can include steps for allocating memory for the source data space, allocating memory for the destination data space, determining the resolution of the presentation device (if any) used to present the destination data, and determining the spacing of grid points in the destination data space.
  • the mapping process 500 continues to a ‘load source data’ procedure 505 that inputs the source data into the source data space.
  • the source data can be read (without limitation) from a file, a scanner device, a video device, from a network, a medical diagnostic tool or other similar device.
  • the source data can represent a portion of a video data stream (the video data stream can be compressed; in addition the video stream can be live video, stored video or computer generated video).
  • the ‘load source data’ procedure 505 need not complete before the mapping process 500 continues to a ‘determine grid points’ procedure 507.
  • the ‘determine grid points’ procedure 507 uses the resolution and the size of the destination data space and possibly other parameters to determine the size of the region.
  • the region can be n-dimensional.
  • the region defines the data points that will be interpolated instead of being precisely mapped.
  • the bilateral-bilinear interpolation algorithm can be applied to n-dimensional spaces. When the region is two-dimensional, the region is referred to as a patch.
  • a ‘precisely map grid points’ procedure 508 precisely maps the grid points that bound the selected region in the destination data space to points in the source data space.
  • the ‘precisely map grid points’ procedure 508 uses well known transformations that can include floating point multiplication and division operations to precisely locate points in the source data space that correspond to the grid points in the destination data space.
  • the mapping process 500 continues to an ‘iterate region’ procedure 509 that iterates each region in the destination data space that is to be interpolated.
  • a ‘get grid point coordinates in source data space’ procedure 511 obtains the grid points that bound the iterated region.
  • a ‘map points in region’ procedure 513 applies a bilateral-bilinear interpolation algorithm to approximately map the points in the region to a portion of the data in the source data space. The bilateral-bilinear interpolation algorithm is subsequently described with respect to FIG. 6.
  • the mapping process 500 repeats to the ‘iterate region’ procedure 509 until all the regions in the destination data space are iterated.
  • the resulting data in the destination data space can then be presented by a ‘present destination space data’ procedure 514.
  • This presentation can be accomplished (without limitation) by visually presenting the information by using a presentation device such as a printer or monitor, by providing a printout of the data, or by subsequent processing of the data using other mechanisms.
  • the mapping process 500 completes through an ‘end’ terminal 515.
  • precisely mapped grid points define lines in the source data space that can serve as boundary lines for the mapped region in the source data space.
  • FIG. 6 illustrates a bilateral-bilinear interpolation algorithm process, indicated by general reference character 600, that is invoked from the ‘map points in region’ procedure 513 of FIG. 5.
  • a preferred embodiment is directed towards mapping data between two two-dimensional data spaces.
  • This embodiment can be used to generate a perspective corrected image from a warped image that was generated from a true image projected through a lens (such as a fisheye lens or a catadioptric lens).
  • a physical lens need not be used to generate the warped image as ray-tracing techniques through a virtual lens can also be used to generate the warped image.
  • the bilateral-bilinear interpolation algorithm process 600 initiates at a ‘start’ terminal 601 and continues to an ‘initialize’ procedure 603.
  • the ‘initialize’ procedure 603 determines the slopes for the boundary lines in the source data space that define the limits of the scan lines in the patch. The slope is defined by a delta-x and delta-y that depend on the number of scan lines in the patch.
  • the ‘initialize’ procedure 603 also defines the starting and ending coordinates in the source data space for the first scan line that is to be interpolated.
  • the ‘initialize’ procedure 603 can include steps similar to:
        startx = x0;            // set starting coordinates
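        // The rest of this listing did not survive extraction; the lines below
        // are a reconstruction -- a sketch consistent with the surrounding
        // prose (nyd holds N_yd, the number of scan lines in the patch).
        starty = y0;
        endx   = x1;            // set ending coordinates
        endy   = y1;
        dxl = (x2 - x0) / nyd;  // delta-x and delta-y of boundary line P0 -> P2
        dyl = (y2 - y0) / nyd;  //   (the second slope), per scan line
        dxr = (x3 - x1) / nyd;  // delta-x and delta-y of boundary line P1 -> P3
        dyr = (y3 - y1) / nyd;  //   (the third slope), per scan line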
  • the bilateral-bilinear interpolation algorithm process 600 continues to an ‘iterate scan line in patch’ procedure 605 that iterates each scan line in the patch in the destination data space.
  • the bilateral-bilinear interpolation algorithm process 600 completes through an ‘end’ terminal 607.
  • the number of iterations to iterate each scan line in the patch is the value of nyd.
  • An ‘initialize working variables’ procedure 609 initializes the variables used for the iteration of the pixels in the iterated scan line. These initializations include determining the slope for the iterated scan line based on the coordinates of the start point and the end point of the scan line in the source data space. The start point of the scan line substantially lies on the boundary line defined by P_0 and P_2. The end point of the scan line substantially lies on the line defined by P_1 and P_3. Thus, these lines bound each scan line. The slope of the scan line is determined using the start point, the end point, and the number of pixels in the scan line in the patch.
  • the ‘initialize working variables’ procedure 609 can include steps similar to:
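  • A sketch of such steps, reconstructed from the prose above (sx and sy track the current position in the source data space; dx and dy are the per-pixel deltas named later in the text; nxd holds N_xd):

        sx = startx;                 // source position of the scan line's first pixel
        sy = starty;
        dx = (endx - startx) / nxd;  // per-pixel slope of this scan line
        dy = (endy - starty) / nxd;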
  • An ‘iterate each pixel in scan line’ procedure 611 iterates each pixel in the destination scan line. To iterate each pixel in the scan line requires N_xd iterations.
  • the bilateral-bilinear interpolation algorithm process 600 continues to an ‘advance to next scan line in patch’ procedure 613.
  • the ‘advance to next scan line in patch’ procedure 613 advances the startx, starty, endx and endy values by dxl, dyl, dxr, and dyr respectively.
  • Thus, the ‘advance to next scan line in patch’ procedure 613 determines a subsequent starting point and a subsequent ending point that bound a subsequent line that has a subsequent line slope.
  • the subsequent line depends on the slope of the boundary lines.
  • the ‘advance to next scan line in patch’ procedure 613 can include steps similar to:
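  • A sketch of such steps, following the prose above:

        startx += dxl;  // advance the scan line's start point along boundary P0 -> P2
        starty += dyl;
        endx   += dxr;  // advance the scan line's end point along boundary P1 -> P3
        endy   += dyr;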
  • The slope of each scan line thus depends on the slopes of the patch's bounding lines.
  • the end position of each scan line in one region substantially matches the start position of each scan line for an adjacent patch. The result is that there are no interpolation artifacts between adjacent patches (or regions).
  • the bilateral-bilinear interpolation algorithm process 600 continues to the ‘iterate scan line in patch’ procedure 605 to iterate the next scan line in the patch or to exit if all scan lines have been iterated.
  • a ‘set pixel’ procedure 615 obtains the data value for the pixel in the destination data space from the data area specified by sx and sy in the source data space.
  • One skilled in the art will understand how to obtain the value of a destination pixel from the source data space given coordinates in the source data space. Such a one will also understand how to combine data values in the source data space to generate the data value in the destination data space.
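  • As one illustration only, the simplest such technique is a nearest-neighbor fetch, sketched below in C; the buffer names, widths, and destination coordinates are assumptions, and averaging the contribution area or bilinear filtering are common refinements.

        /* Nearest-neighbor 'set pixel': round the mapped source coordinates
         * (sx, sy) to the closest source pixel and copy its value. */
        void set_pixel_nearest(const unsigned *src, int srcWidth,
                               unsigned *dst, int dstWidth,
                               int dstx, int dsty, double sx, double sy)
        {
            dst[dsty * dstWidth + dstx] =
                src[(int)(sy + 0.5) * srcWidth + (int)(sx + 0.5)];
        }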
  • the bilateral-bilinear interpolation algorithm process 600 continues to an ‘advance to next pixel in scan line’ procedure 617.
  • the ‘advance to next pixel in scan line’ procedure 617 advances sx and sy by dx and dy respectively.
  • the bilateral-bilinear interpolation algorithm process 600 continues to the ‘iterate each pixel in scan line’ procedure 611 until each pixel in the scan line has been iterated.
  • the invention provides a high performance mapping capability between data spaces.
  • the invention provides a superior real-time presentation of a visual image when the source data space contains warped image data.

Abstract

Apparatus, methods, and computer program products that provide fast and accurate means of mapping one data space into another by precisely mapping grid points between the data spaces and then by performing a bilateral-bilinear interpolation to map the points bounded by the precisely mapped grid points. The precisely mapped grid points define boundary lines that bound a data region in a source space. Each scan line mapped to the source space is dependent on the slopes of the bounding lines of the data region.

Description

    Background of the Invention
  • 1. Field of the Invention [0001]
  • This invention relates to the field of computer technology for mapping one data space to another data space. [0002]
  • 2. Background [0003]
  • Computers are often used to map data that exists in a source data space to a destination data space. This type of mapping is often used in “virtual reality” and “telepresence” applications. The data in the source data space can represent a warped image that is obtained by a distorting lens such as a fisheye lens or a catadioptric lens. The data in the destination data space can be presented by a presentation device such as a video screen, computer monitor or printer. The problem is how to rapidly generate the data for the destination data space from the source data space. [0004]
  • One approach is to backward map the coordinates of a point in the destination data space to coordinates in the source data space and to obtain the value for the point in the destination data space from the mapped point in the source data space. Precisely mapping each point (for example, by using floating point calculations to perform the mapping) is expensive in either memory or computation, or both. [0005]
  • Another approach is to precisely map a grid of points from the destination data space to the source data space. These grid points bound regions (patches) that contain pixels that have similar mapping as the grid points that bound the region. Thus, the precisely mapped grid points are used to determine coefficients for a mapping that can be applied to each point in the region. Each of the grid points in the destination data space has a corresponding grid point in the source data space. Thus, the destination grid point and the corresponding source grid point are referred to as a “point pair.”[0006]
  • By using all four of the destination and source point pairs that bound a region, a perspective transformation can be computed and used to find the corresponding pixel in the source data space. Thus, [0007]
  • x_s = (a·x_d + b·y_d + t_x) / (e·x_d + f·y_d + 1)
  • y_s = (c·x_d + d·y_d + t_y) / (e·x_d + f·y_d + 1)
  • can be used to perform the mapping, where (x_s, y_s) is the resulting coordinate in the source data space, (x_d, y_d) are the coordinates of the pixel in the destination data space, and a, b, c, d, e, f, t_x, and t_y are the perspective transform coefficients. This calculation includes at least six multiply operations, two division operations, and six add operations. The multiply and division operations are computationally expensive. Although this equation does not generate mapping artifacts between the regions, the additional computational overhead is often prohibitive. [0008]
  • Another approach is to generate a look-up table that provides the x_s and y_s coordinates when given the x_d and y_d coordinates. With high-resolution images and high-resolution presentation devices, these look-up tables become unwieldy even with modern computers. [0009]
  • Another approach is to approximately map the region using a less computationally expensive formula. For example, an affine transformation can be used to perform this mapping. In this circumstance, three of the four sets of destination and source point pairs are used to compute the affine coefficients for the transformation. Then coordinates that specify a pixel in the destination data space can be used to find the corresponding pixel in the source data space. Thus, [0010]
  • x_s = a·x_d + b·y_d + c
  • y_s = d·x_d + e·y_d + f
  • Where (x_s, y_s) is the resulting coordinate in the source data space, (x_d, y_d) are the coordinates of the pixel in the destination data space, and a, b, c, d, e, and f are the affine coefficients for the grid region bounded by the precisely mapped grid points. This calculation includes four multiply operations and four add operations for each pixel in the patch and so is still computationally expensive. An additional problem with this approach is that an affine transformation often generates a very poor approximation to the perspective transformation. The affine transformation only uses three of the four point pairs that bound the region. Thus, the affine transformation can generate mapping artifacts (such as discontinuities) along edges of the quadrilateral defining the region in the source space. [0011]
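  • The computational trade-off between the two prior-art transforms can be made concrete in code. The following C sketch implements both per-pixel backward mappings exactly as the equations above give them; the function names and the packing of the coefficients into arrays are illustrative assumptions, and fitting the coefficients from the point pairs is outside the sketch.

        /* Perspective backward map: six multiplies, two divides, and six adds
         * per pixel, but no mapping artifacts between regions. */
        void perspective_map(double xd, double yd,
                             const double p[8],  /* a, b, c, d, e, f, tx, ty */
                             double *xs, double *ys)
        {
            double w = p[4] * xd + p[5] * yd + 1.0;  /* e*xd + f*yd + 1 */
            *xs = (p[0] * xd + p[1] * yd + p[6]) / w;
            *ys = (p[2] * xd + p[3] * yd + p[7]) / w;
        }

        /* Affine backward map: four multiplies and four adds per pixel.
         * Cheaper, but it uses only three of the four point pairs, so it can
         * misplace the fourth corner (FIG. 1A) and seam adjacent patches
         * (FIG. 1C). */
        void affine_map(double xd, double yd,
                        const double q[6],  /* a, b, c, d, e, f */
                        double *xs, double *ys)
        {
            *xs = q[0] * xd + q[1] * yd + q[2];
            *ys = q[3] * xd + q[4] * yd + q[5];
        }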
  • FIG. 1A illustrates a quadrilateral patch in source space, indicated by general reference character 100, used to show mapping artifacts generated by the affine transformation. The quadrilateral patch in source space 100 is bounded by grid points (such as a point A 101, a point B 103, a point C 105, and a point D 107). Applying an affine transformation to this region (using the points 101, 103, and 105 and the corresponding points in the destination space) would generate the patch bounded by the point A 101, the point B 103, the point C 105 and a point D′ 109 instead of approximating the original patch. [0012]
  • FIG. 1B illustrates a presentation of a correctly mapped image, indicated by general reference character 150, that represents a magnified image presented on a presentation device. This image does not have any mapping artifacts. It can be generated by precisely mapping each point in the destination data space to the source data space. It can also be generated by precisely mapping grid points, and using the perspective transformation previously discussed to map the points in each region defined by the grid. Compare FIG. 1B with FIG. 1C. [0013]
  • FIG. 1C illustrates a presentation of an incorrectly mapped image, indicated by general reference character 160, that shows mapping artifacts 161 that can result from the use of the affine transformation. As can be seen, these mapping artifacts include discontinuities in lines. Other mapping artifacts include (without limitation) texture discontinuities and color discontinuities. [0014]
  • It would be advantageous to use a fast mapping algorithm that also provides a good approximation of a precise perspective-correct transformation and maintains continuity across patch boundaries without the computational or memory overheads associated with the prior art. Devices and computers that use these methods will operate more efficiently than those that use prior art methods. [0015]
  • SUMMARY OF THE INVENTION
  • The invention provides a fast and accurate means of mapping one data space into another by precisely mapping grid points between the data spaces and then by performing a bilateral-bilinear interpolation to map the points bounded by the precisely mapped grid points. [0016]
  • One aspect of the invention is a computer-controlled method that includes the step of determining a region in a destination data space. The region is bounded by a plurality of grid points. It defines a first plurality of data points in the destination data space. The method precisely maps the plurality of grid points in the destination data space to a plurality of mapped grid points in a source data space. The source data space contains, or is associated with, a second plurality of data points. The plurality of mapped grid points define a plurality of boundary lines that represent the boundary of the region as mapped into the source data space. The method also applies a bilateral-bilinear interpolation algorithm to approximately map the first plurality of data points to the second plurality of data points. [0017]
  • Another aspect of the invention is an apparatus that includes a central processing unit (CPU) and a memory coupled to the CPU. The apparatus also includes a region determination mechanism that is configured to determine a region in a destination data space. The region is bounded by a plurality of grid points. The region defines a first plurality of data points within the destination data space. The apparatus also includes a precise mapping mechanism that is configured to precisely map the plurality of grid points determined by the region determination mechanism to a plurality of mapped grid points in a source data space. The source data space contains (or associates) a second plurality of data points. The plurality of mapped grid points define a plurality of boundary lines that represent the boundary of the region as mapped into the source data space. The apparatus also includes a bilateral-bilinear interpolation mechanism that is configured to approximately map the first plurality of data points in the region to the second plurality of data points using the plurality of mapped grid points. [0018]
  • Yet another aspect of the invention is a computer program product that includes a computer usable storage medium having computer readable code embodied therein for causing a computer to map a destination data space to a source data space. When executed on a computer, the computer readable code causes the computer to effect a precise mapping mechanism, a region determination mechanism, and a bilateral-bilinear interpolation mechanism. Each of these mechanisms has the same functions as the corresponding mechanism of the previously described apparatus. [0019]
  • Still another aspect of the invention is a computer program product embodied in a carrier wave transmitting computer readable code therein for causing a computer to map a destination data space to a source data space. When executed on a computer, the computer readable code causes the computer to effect a precise mapping mechanism, a region determination mechanism, and a bilateral-bilinear interpolation mechanism. Each of these mechanisms has the same functions as the corresponding mechanism of the previously described apparatus. [0020]
  • DESCRIPTION OF THE DRAWINGS
  • FIG. 1A illustrates a mapping artifact resulting from an affine transformation; [0021]
  • FIG. 1B illustrates a presentation of an image without mapping artifacts; [0022]
  • FIG. 1C illustrates a presentation of an image with mapping artifacts; [0023]
  • FIG. 2 illustrates a computer system capable of using the invention in accordance with a preferred embodiment; [0024]
  • FIG. 3A illustrates a gridded destination data space in two-dimensions in accordance with a preferred embodiment; [0025]
  • FIG. 3B illustrates a gridded source data space with a mapped destination data space in two-dimensions in accordance with a preferred embodiment; [0026]
  • FIG. 3C illustrates a gridded destination data space with a mapped destination data space in three-dimensions in accordance with a preferred embodiment; [0027]
  • FIG. 3D illustrates a gridded source data space with a mapped destination data space in three-dimensions in accordance with a preferred embodiment; [0028]
  • FIG. 4A illustrates a gridded patch in two-dimensions in accordance with a preferred embodiment; [0029]
  • FIG. 4B illustrates the gridded patch of FIG. 4A as mapped into the source data space in accordance with a preferred embodiment; [0030]
  • FIG. 5 illustrates an overview of the process used to backward map pixels in a destination data space to a source data space in accordance with a preferred embodiment; and [0031]
  • FIG. 6 illustrates a bilateral-bilinear interpolation algorithm that backward maps pixels in a destination region to a source data space in accordance with a preferred embodiment.[0032]
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Notations and Nomenclature [0033]
  • The following ‘notations and nomenclature’ are provided to assist in the understanding of the present invention and the preferred embodiments thereof. [0034]
  • Procedure—A procedure is a self-consistent sequence of computerized steps that lead to a desired result. These steps are defined by one or more computer instructions. These steps are performed by a computer executing the instructions that define the steps. Thus, the term “procedure” can refer to a sequence of instructions, a sequence of instructions organized within a programmed-procedure or programmed-function, or a sequence of instructions organized within programmed-processes executing in one or more computers. [0035]
  • Operating Environment [0036]
  • FIG. 2 illustrates a computer, indicated by general reference character 200, that incorporates the invention. The computer 200 includes a processor 201 having a central processing unit (CPU) 203, a memory section 205, and an input/output (I/O) section 207. The I/O section 207 is connected to a presentation device 211, a disk storage unit 213 and a CD-ROM drive unit 215. The CD-ROM drive unit 215 can read a CD-ROM medium 217 that typically contains a program and data 219. The CD-ROM drive unit 215 (along with the CD-ROM medium 217) and the disk storage unit 213 comprise a file storage mechanism (a filesystem). Some embodiments of the invention include a network interface 221 that connects the computer 200 to a network 223. An application program 225 executes from the memory section 205. The application program 225 can be loaded into the memory section 205 over the network 223 or from the filesystem. In one embodiment of the invention, the application program 225 includes computer code that causes the computer to perform the inventive steps. The CD-ROM drive unit 215 (along with the CD-ROM medium 217) is illustrative of mechanisms that can be used to read computer code from removable media. One skilled in the art will understand that not all of the displayed features of the computer 200 need to be present for the invention. [0037]
  • Data Space [0038]
  • One aspect of the invention maps points between two data spaces. FIG. 3A through FIG. 3D illustrate some of the possible data spaces that can be mapped by this aspect of the invention. [0039]
  • FIG. 3A illustrates a gridded destination data space 300 showing a first grid point 301, a second grid point 303, a third grid point 305, and a fourth grid point 307. These points are bounding points for the destination data space. Each of the intersections in the destination data space (for example, a fifth grid point 309) is precisely mapped to a source data space. The grid points bound regions that contain data points that will be approximately mapped to the source data space. For example, the third grid point 305 and the fifth grid point 309 are two of the four grid points that bound a region 311 that contains points having mappings that will be interpolated. A bilateral-bilinear interpolation algorithm performs this approximate mapping. The bilateral-bilinear interpolation algorithm is subsequently described with respect to FIG. 5 and FIG. 6 as applied to patches in a two-dimensional data space. [0040]
  • FIG. 3B illustrates a gridded source data space 350 indicating how the destination data space is mapped to the source data space. The resolution of the gridded destination data space 300 and the gridded source data space 350 need not be the same. The gridded source data space 350 can contain (or reference) warped image data that represents a true image that has been warped by a lens. One skilled in the art will understand that a physical lens need not be used to generate the warped image as ray-tracing techniques through a virtual lens can also be used to generate the warped image. A virtual lens can be used to generate images in a virtual-space. Once the image is generated, the invention can be used to present the image. [0041]
  • One aspect of the invention backward maps the destination data space to the source data space using a mapping that generates a perspective corrected image in the destination data space. One step in this mapping process precisely maps the first grid point 301, the second grid point 303, the third grid point 305, the fourth grid point 307, the fifth grid point 309, and other grid points to the gridded source data space 350. These grid points map to a mapped first grid point 301′, a mapped second grid point 303′, a mapped third grid point 305′, a mapped fourth grid point 307′, a mapped fifth grid point 309′ and other grid points respectively in the source data space. Thus, the region 311 is mapped to a mapped region 311′. [0042]
  • Notice that the gridded destination data space 300, when mapped into the gridded source data space 350, need not result in a parallelogram; the slopes of each of the lines defined by the mapped grid points can be different. [0043]
  • FIG. 3C illustrates a 3-D gridded destination data space, indicated by general reference character 360, that has a first plane 361 (bounded by the first grid point 301, the second grid point 303, the third grid point 305 and the fourth grid point 307) and a second plane 363 (sp) (bounded by a sp-first grid point 365, a sp-second grid point 367, a sp-third grid point 369 and another point that cannot be seen in FIG. 3C). [0044]
  • FIG. 3D illustrates a 3-D gridded source data space, indicated by general reference character 370, that indicates how the 3-D gridded destination data space 360 is mapped to the 3-D gridded source data space 370. A mapped first plane 361′ is bounded by the mapped first grid point 301′, the mapped second grid point 303′, the mapped third grid point 305′, and the mapped fourth grid point 307′. A mapped second plane 363′ (msp) is bounded by a msp-second grid point 367′, and a msp-third grid point 369′, and two other points that cannot be seen in FIG. 3D. [0045]
  • FIG. 3C and FIG. 3D show how grid points can be imposed on three-dimensional spaces. Once the grid points are precisely mapped, the points contained in the region (the volume) between and including the first plane 361 and the second plane 363 can be interpolated by extending the subsequently described techniques. Similar techniques can be applied to n-dimensional spaces. [0046]
  • Although the bilateral-bilinear interpolation algorithm is applicable to n-dimensional spaces, subsequent discussion of the algorithm is directed to two-dimensional spaces containing image data. Each region is a two-dimensional patch containing points that represent pixels. One skilled in the art will understand how to modify the described algorithm to be applicable to higher dimensional spaces, for non-image data, and to a source data space that references the data. Such a one will also understand that the invention can be used (without limitation) to map a viewport onto spherical, cylindrical, and panoramic spaces. [0047]
  • FIG. 4A illustrates a patch in destination data space, indicated by general reference character 400, bounded by a first grid point 401, a second grid point 403, a third grid point 405 and a fourth grid point 407. The destination patch 400 contains a number of pixels (in the illustration, 36 pixels) of which a pixel 409 is but one. The bilateral-bilinear interpolation algorithm efficiently generates data values for the pixels contained in the destination patch 400. In this particular illustration, the 36 pixels are arranged in six scan lines. Each scan line is six pixels long. One skilled in the art will understand that the destination patch 400 need not be square and may include more or fewer pixels than the 36 used in the illustration. The grid points are mapped to the source data space as is shown with respect to FIG. 4B. [0048]
  • FIG. 4B illustrates a mapped patch in source data space, indicated by general reference character 420, indicating some of the parameters used by the bilateral-bilinear interpolation algorithm. The mapped patch 420 is bounded by the mapped first grid point 401′, the mapped second grid point 403′, the mapped third grid point 405′, and the mapped fourth grid point 407′, each of which has been precisely mapped to the source data space from the corresponding points in the destination data space. The data that is used to generate the value for the pixel 409 in the destination data space is located at a mapped pixel contribution area 409′. The mapped pixel contribution area 409′ contains pixels of a warped image at a resolution possibly different from the pixel resolution in the destination data space. Techniques known in the art are used to determine the value of the pixel 409 based on the information within the mapped pixel contribution area 409′. [0049]
  • The mapped grid points define lines that bound the mapped patch 420. Thus, the mapped first grid point 401′ and the mapped third grid point 405′ define a second boundary line 421; the mapped second grid point 403′ and the mapped fourth grid point 407′ define a third boundary line 423; the mapped first grid point 401′ and the mapped second grid point 403′ define a first boundary line 425; and the mapped third grid point 405′ and the mapped fourth grid point 407′ define a final boundary line 427. One skilled in the art will understand that a geometry different from the one described can be used. [0050]
  • A first slope 428 represents the slope of the first boundary line 425. A second slope 429 represents the slope of the second boundary line 421 and in the two-dimensional case includes delta-x and delta-y components. A third slope 431 represents the slope of the third boundary line 423. A final slope 435 represents the slope of the final boundary line 427. [0051]
  • The bilateral-bilinear interpolation algorithm operates by determining the second slope 429 and the third slope 431 for the boundary lines. The second slope 429 and the third slope 431 need not be the same. Each of these slopes is used to determine a respective delta-x and delta-y value that is dependent on the number of scan lines in the destination patch 400 (N_yd). Next, each pixel in the first scan line in the destination patch 400 is iterated. To do this, a delta-x and delta-y representing the first slope 428 is determined responsive to the number of pixels in the scan line contained by the destination patch 400 (N_xd), and the coordinates of the starting pixel and the ending pixel. As each destination pixel is iterated, the mapped pixel contribution area 409′ in the source data space is evaluated to determine a value for the destination pixel. As the destination pixel is advanced, the corresponding position in the source data space is advanced by the delta-x and delta-y corresponding to the first slope 428. Once the first scan line has been processed, subsequent scan lines in the destination patch 400 are processed. The starting coordinates for a subsequent scan line in the mapped patch 420 are advanced by the delta-x and delta-y corresponding to the second slope 429, and the ending position of the subsequent scan line in the mapped patch 420 is advanced by the delta-x and delta-y corresponding to the third slope 431. Thus, the subsequent slope for each mapped scan line changes dependent on the slopes of the second boundary line 421 and the third boundary line 423. Thus, for example, a subsequent slope 437 can be (and usually is) different from the first slope 428, the final slope 435 and any other subsequent slope. [0052]
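  • This per-patch procedure reduces to two nested loops. The C sketch below is one way the bilateral-bilinear interpolation might be implemented for a single patch; it is an illustration under stated assumptions, not the patent's own listing. sample_source() stands in for evaluating the mapped pixel contribution area (left to known techniques in the text), set_dest_pixel() for the destination write, and (dstx0, dsty0) for the patch's position in the destination data space; all three are assumed names.

        extern unsigned sample_source(double sx, double sy);         /* assumed helper */
        extern void set_dest_pixel(int xd, int yd, unsigned value);  /* assumed helper */

        /* Backward map one destination patch whose corners were precisely
         * mapped to source points P0 (x0,y0), P1 (x1,y1), P2 (x2,y2), and
         * P3 (x3,y3).  The patch holds nyd scan lines of nxd pixels. */
        void map_patch(double x0, double y0, double x1, double y1,
                       double x2, double y2, double x3, double y3,
                       int dstx0, int dsty0, int nxd, int nyd)
        {
            double startx = x0, starty = y0;  /* start of the current scan line */
            double endx   = x1, endy   = y1;  /* end of the current scan line   */
            double dxl = (x2 - x0) / nyd, dyl = (y2 - y0) / nyd;  /* second slope 429 */
            double dxr = (x3 - x1) / nyd, dyr = (y3 - y1) / nyd;  /* third slope 431  */

            for (int line = 0; line < nyd; line++) {
                double sx = startx, sy = starty;
                double dx = (endx - startx) / nxd;  /* slope of this scan line */
                double dy = (endy - starty) / nxd;  /* (e.g. first slope 428)  */
                for (int pix = 0; pix < nxd; pix++) {
                    set_dest_pixel(dstx0 + pix, dsty0 + line, sample_source(sx, sy));
                    sx += dx;
                    sy += dy;                       /* only additions in the inner loop */
                }
                startx += dxl;  starty += dyl;      /* advance the scan line endpoints */
                endx   += dxr;  endy   += dyr;      /* along the boundary lines        */
            }
        }

  • Whether the divisor is nyd or nyd - 1 is a boundary convention (the latter makes the last scan line land exactly on the final boundary line 427); either way, adjacent patches share the same precisely mapped grid points, so their scan lines meet without seams.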
  • Each subsequent scan line in the destination patch 400 is iterated (such that the last subsequent scan line is the final boundary line 427). [0053]
  • One skilled in the art will understand that the bilateral-bilinear interpolation algorithm, previously summarized and subsequently described in detail, assures that adjacent patches correctly align. Thus, the bilateral-bilinear interpolation algorithm does not generate mapping artifacts as shown in FIG. 1A and FIG. 1C. [0054]
  • Data Space Mapping [0055]
  • FIG. 5 illustrates a mapping process, indicated by general reference character 500, used to backward map data points in a destination data space to data points in a source data space. The mapping process 500 initiates at a ‘start’ terminal 501 and continues to an ‘initialization’ procedure 503. The ‘initialization’ procedure 503 performs initialization steps for the mapping process 500. These steps can include steps for allocating memory for the source data space, allocating memory for the destination data space, determining the resolution of the presentation device (if any) used to present the destination data, and determining the spacing of grid points in the destination data space. Next, the mapping process 500 continues to a ‘load source data’ procedure 505 that inputs the source data into the source data space. The source data can be read (without limitation) from a file, a scanner device, a video device, from a network, a medical diagnostic tool or other similar device. In addition, the source data can represent a portion of a video data stream (the video data stream can be compressed; in addition the video stream can be live video, stored video or computer generated video). The ‘load source data’ procedure 505 need not complete before the mapping process 500 continues to a ‘determine grid points’ procedure 507. The ‘determine grid points’ procedure 507 uses the resolution and the size of the destination data space and possibly other parameters to determine the size of the region. Depending on the configuration of the source data space and the destination data space, the region can be n-dimensional. The region defines the data points that will be interpolated instead of being precisely mapped. The bilateral-bilinear interpolation algorithm can be applied to n-dimensional spaces. When the region is two-dimensional, the region is referred to as a patch. [0056]
[0057] A ‘precisely map grid points’ procedure 508 precisely maps the grid points that bound the selected region in the destination data space to points in the source data space. The ‘precisely map grid points’ procedure 508 uses well-known transformations, which can include floating-point multiplication and division operations, to precisely locate the points in the source data space that correspond to the grid points in the destination data space.
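As a hedged illustration, one such precise transformation, assuming an equidistant (r = f·θ) fisheye model that is not itself part of this specification, might resemble:

    #include <math.h>

    /* Hypothetical precise backward map of one destination grid point (xd, yd),
     * taken relative to the view center with focal length f, onto a source
     * image warped by an equidistant fisheye lens centered at (cx, cy).
     * The lens model and all names here are illustrative assumptions only. */
    void precisely_map_grid_point(double xd, double yd, double f,
                                  double cx, double cy,
                                  double *xs, double *ys)
    {
        double theta = atan2(sqrt(xd * xd + yd * yd), f);  /* angle off axis  */
        double r     = f * theta;                          /* equidistant law */
        double phi   = atan2(yd, xd);                      /* azimuth is kept */
        *xs = cx + r * cos(phi);
        *ys = cy + r * sin(phi);
    }

Note the trigonometric calls and divisions involved; this cost is why only the grid points, rather than every destination pixel, are mapped precisely.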
[0058] Once the grid points that bound the region have been precisely mapped, the mapping process 500 continues to an ‘iterate region’ procedure 509 that iterates each region in the destination data space that is to be interpolated. A ‘get grid point coordinates in source data space’ procedure 511 obtains the grid points that bound the iterated region. Then a ‘map points in region’ procedure 513 applies a bilateral-bilinear interpolation algorithm to approximately map the points in the region to a portion of the data in the source data space. The bilateral-bilinear interpolation algorithm is subsequently described with respect to FIG. 6.
[0059] The mapping process 500 returns to the ‘iterate region’ procedure 509 until all the regions in the destination data space have been iterated. The resulting data in the destination data space can then be presented by a ‘present destination space data’ procedure 514. This presentation can be accomplished (without limitation) by visually presenting the information on a presentation device such as a monitor, by providing a printout of the data, or by subsequent processing of the data using other mechanisms. The mapping process 500 completes through an ‘end’ terminal 515.
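The overall control flow of the mapping process 500 can be sketched as follows; every helper named here is a hypothetical placeholder for the correspondingly numbered procedure, not an API defined by this specification.

    /* Hypothetical prototypes standing in for the procedures of FIG. 5. */
    typedef struct { double x[4], y[4]; } GridPoints;  /* four mapped corners */
    void initialization_503(void);
    void load_source_data_505(void);
    void determine_grid_points_507(void);
    void precisely_map_grid_points_508(void);
    GridPoints get_grid_point_coords_511(int region);
    void map_points_in_region_513(GridPoints g);
    void present_destination_space_data_514(void);

    void mapping_process_500(int num_regions)
    {
        initialization_503();             /* allocate spaces, pick grid spacing */
        load_source_data_505();           /* need not complete before next step */
        determine_grid_points_507();      /* choose the region (patch) size     */
        precisely_map_grid_points_508();  /* exact mapping of grid points only  */
        for (int r = 0; r < num_regions; r++) {          /* procedure 509 */
            GridPoints g = get_grid_point_coords_511(r); /* procedure 511 */
            map_points_in_region_513(g);                 /* procedure 513 */
        }
        present_destination_space_data_514();            /* procedure 514 */
    }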
[0060] One skilled in the art will understand that the precisely mapped grid points define lines in the source data space that can serve as boundary lines for the mapped region in the source data space.
[0061] FIG. 6 illustrates a bilateral-bilinear interpolation algorithm process, indicated by general reference character 600, that is invoked from the ‘map points in region’ procedure 513 of FIG. 5. A preferred embodiment is directed towards mapping data between two two-dimensional data spaces. This embodiment can be used to generate a perspective corrected image from a warped image that was generated from a true image projected through a lens (such as a fisheye lens or a catadioptric lens). One skilled in the art will understand that a physical lens need not be used to generate the warped image, as ray-tracing techniques through a virtual lens can also be used to generate the warped image.
[0062] The bilateral-bilinear interpolation algorithm process 600 initiates at a ‘start’ terminal 601 and continues to an ‘initialize’ procedure 603. The ‘initialize’ procedure 603 determines the slopes of the boundary lines in the source data space that define the limits of the scan lines in the patch. Each slope is expressed as a delta-x and delta-y that depend on the number of scan lines in the patch. The ‘initialize’ procedure 603 also defines the starting and ending coordinates in the source data space for the first scan line that is to be interpolated. For a patch bounded by points P0(x0,y0), P1(x1,y1), P2(x2,y2), and P3(x3,y3) (these points corresponding to the mapped first grid point 401′, the mapped second grid point 403′, the mapped third grid point 405′, and the mapped fourth grid point 407′ of FIG. 4B) in the source data space, and where the patch in the destination data space includes Nyd scan lines each containing Nxd pixels, the ‘initialize’ procedure 603 can include steps similar to:
    dxl = (x2 - x0) / nyd;   // determine slope of left boundary line
    dyl = (y2 - y0) / nyd;
    dxr = (x3 - x1) / nyd;   // determine slope of right boundary line
    dyr = (y3 - y1) / nyd;
    startx = x0;             // set starting coordinates of first scan line
    starty = y0;
    endx = x1;               // set ending coordinates of first scan line
    endy = y1;
[0071] Next, the bilateral-bilinear interpolation algorithm process 600 continues to an ‘iterate scan line in patch’ procedure 605 that iterates each scan line in the patch in the destination data space; iterating every scan line in the patch requires Nyd iterations. When all the scan lines in the patch have been iterated, the bilateral-bilinear interpolation algorithm process 600 completes through an ‘end’ terminal 607.
[0072] An ‘initialize working variables’ procedure 609 initializes the variables used for the iteration of the pixels in the iterated scan line. These initializations include determining the slope of the iterated scan line from the coordinates of its start point and end point in the source data space. The start point of the scan line substantially lies on the boundary line defined by P0 and P2; the end point substantially lies on the boundary line defined by P1 and P3. Thus, these two lines bound each scan line. The slope of the scan line is determined using the start point, the end point, and the number of pixels in the scan line in the patch. The ‘initialize working variables’ procedure 609 can include steps similar to:
    dx = (endx - startx) / nxd;   // determine scan line slope
    dy = (endy - starty) / nxd;
    sx = startx;                  // current position in the source data space
    sy = starty;
[0077] An ‘iterate each pixel in scan line’ procedure 611 iterates each pixel in the destination scan line; iterating the pixels in a scan line requires Nxd iterations. When all the pixels in the line have been iterated, the bilateral-bilinear interpolation algorithm process 600 continues to an ‘advance to next scan line in patch’ procedure 613. The ‘advance to next scan line in patch’ procedure 613 advances the startx, starty, endx, and endy values by dxl, dyl, dxr, and dyr respectively. Thus, the ‘advance to next scan line in patch’ procedure 613 determines a subsequent starting point and a subsequent ending point that bound a subsequent scan line having its own slope. In this way, each subsequent scan line depends on the slopes of the boundary lines. The ‘advance to next scan line in patch’ procedure 613 can include steps similar to:
    startx += dxl;   // determine new scan line start coordinates
    starty += dyl;
    endx += dxr;     // determine new scan line end coordinates
    endy += dyr;
[0082] One skilled in the art will understand that the interpolation of each scan line thus depends on the slopes of the patch's bounding lines. Consequently, the end position of each scan line in one patch substantially matches the start position of the corresponding scan line in the adjacent patch, so there are no interpolation artifacts between adjacent patches (or regions).
[0083] Once the ‘advance to next scan line in patch’ procedure 613 completes, the bilateral-bilinear interpolation algorithm process 600 continues to the ‘iterate scan line in patch’ procedure 605 to iterate the next scan line in the patch, or to exit if all scan lines have been iterated.
[0084] A ‘set pixel’ procedure 615 obtains the data value for the pixel in the destination data space from the data area specified by sx and sy in the source data space. One skilled in the art will understand how to obtain the value of a destination pixel from the source data space given coordinates in the source data space, and how to combine data values in the source data space to generate the data value in the destination data space.
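One common way to realize the ‘set pixel’ procedure 615, offered here only as an assumption since the specification leaves the combination method open, is to bilinearly weight the four source pixels surrounding the fractional coordinate (sx, sy):

    /* Illustrative source fetch for the 'set pixel' step: bilinear weighting
     * of the four source pixels around (sx, sy).  src is a width-by-height,
     * single-channel, 8-bit image; sx and sy are assumed non-negative and
     * inside the image.  Bilinear weighting is an editorial assumption. */
    unsigned char sample_source(const unsigned char *src, int width, int height,
                                double sx, double sy)
    {
        int x0 = (int)sx, y0 = (int)sy;
        int x1 = (x0 + 1 < width)  ? x0 + 1 : x0;   /* clamp at right edge  */
        int y1 = (y0 + 1 < height) ? y0 + 1 : y0;   /* clamp at bottom edge */
        double fx = sx - x0, fy = sy - y0;          /* fractional parts     */
        double top = src[y0 * width + x0] * (1.0 - fx) + src[y0 * width + x1] * fx;
        double bot = src[y1 * width + x0] * (1.0 - fx) + src[y1 * width + x1] * fx;
        return (unsigned char)(top * (1.0 - fy) + bot * fy + 0.5);  /* round */
    }

A nearest-neighbor fetch (rounding sx and sy to the nearest integers) is an equally valid, cheaper choice under the same procedure.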
[0085] Once the ‘set pixel’ procedure 615 completes, the bilateral-bilinear interpolation algorithm process 600 continues to an ‘advance to next pixel in scan line’ procedure 617. The ‘advance to next pixel in scan line’ procedure 617 advances sx and sy by dx and dy, respectively. Next, the bilateral-bilinear interpolation algorithm process 600 continues to the ‘iterate each pixel in scan line’ procedure 611 until each pixel in the scan line has been iterated.
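Assembling the code fragments of the ‘initialize’, ‘initialize working variables’, ‘advance to next scan line in patch’, and ‘advance to next pixel in scan line’ procedures above yields the following consolidated sketch of process 600 for one patch. The image layouts and the sample_source helper from the previous sketch are editorial assumptions.

    /* Consolidated sketch of the bilateral-bilinear interpolation process 600
     * for one patch of nyd scan lines by nxd pixels.  (x0,y0)..(x3,y3) are the
     * precisely mapped grid points P0..P3; (px,py) is the patch origin in the
     * destination image dst.  Layout conventions are assumptions. */
    void map_points_in_patch(unsigned char *dst, int dst_stride, int px, int py,
                             int nxd, int nyd,
                             const unsigned char *src, int sw, int sh,
                             double x0, double y0, double x1, double y1,
                             double x2, double y2, double x3, double y3)
    {
        double dxl = (x2 - x0) / nyd, dyl = (y2 - y0) / nyd;  /* left line  */
        double dxr = (x3 - x1) / nyd, dyr = (y3 - y1) / nyd;  /* right line */
        double startx = x0, starty = y0, endx = x1, endy = y1;
        for (int j = 0; j < nyd; j++) {                    /* procedure 605 */
            double dx = (endx - startx) / nxd;             /* procedure 609 */
            double dy = (endy - starty) / nxd;
            double sx = startx, sy = starty;
            for (int i = 0; i < nxd; i++) {                /* procedure 611 */
                dst[(py + j) * dst_stride + (px + i)] =
                    sample_source(src, sw, sh, sx, sy);    /* procedure 615 */
                sx += dx;                                  /* procedure 617 */
                sy += dy;
            }
            startx += dxl; starty += dyl;                  /* procedure 613 */
            endx += dxr;   endy += dyr;
        }
    }

Because adjacent patches share grid points, the start coordinates of one patch's scan lines coincide (up to rounding) with the end coordinates of its neighbor's, which is the alignment property noted in paragraph [0082].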
[0086] One skilled in the art will understand that the invention improves the mapping between two data spaces while still maintaining high performance.

[0087] From the foregoing, it will be appreciated that the invention has (without limitation) the following advantages:

[0088] 1) The invention dramatically reduces mapping artifacts when mapping from one data space to another.

[0089] 2) The invention provides a high-performance mapping capability between data spaces.

[0090] 3) The invention provides a superior real-time presentation of a visual image when the source data space contains warped image data.

[0091] Although the present invention has been described in terms of the presently preferred embodiments, one skilled in the art will understand that various modifications and alterations may be made without departing from the scope of the invention. Accordingly, the scope of the invention is not to be limited to the particular embodiments discussed herein.

Claims (16)

What is claimed is:
1. A computer controlled method including steps of:
determining a region in a destination data space, said region bounded by a plurality of grid points and said region defining a first plurality of data points in said destination data space;
precisely mapping said plurality of grid points to a plurality of mapped grid points in a source data space associating a second plurality of data points, wherein said plurality of mapped grid points define a plurality of boundary lines that represent the boundary of said region as mapped into said source data space; and
applying a bilateral-bilinear interpolation algorithm to map said first plurality of data points to said second plurality of data points.
2. The computer controlled method of claim 1 further including presenting said first plurality of data points using a presentation device.
3. The computer controlled method of claim 1 wherein said second plurality of data points represents a warped image and said first plurality of data points represents a perspective corrected image.
4. The computer controlled method of claim 3 wherein said warped image represents a true image warped by a lens, said perspective corrected image substantially representing said true image.
5. The computer controlled method of claim 4 wherein said lens is a catadioptric lens.
6. An apparatus having a central processing unit (CPU) and a memory coupled to said CPU, said apparatus including:
a region determination mechanism configured to determine a region in a destination data space, said region bounded by a plurality of grid points and said region defining a first plurality of data points in said destination data space;
a precise mapping mechanism configured to precisely map said plurality of grid points determined by the region determination mechanism to a plurality of mapped grid points in a source data space associating a second plurality of data points, wherein said plurality of mapped grid points define a plurality of boundary lines that represent the boundary of said region as mapped into said source data space; and
a bilateral-bilinear interpolation mechanism configured to map said first plurality of data points in said region to said second plurality of data points using said plurality of mapped grid points.
7. The apparatus of claim 6 further including a presentation device configured to present said first plurality of data points.
8. The apparatus of claim 6 wherein said second plurality of data points represents a warped image and said first plurality of data points represents a perspective corrected image.
9. The apparatus of claim 8 wherein said warped image represents a true image warped by a lens, said perspective corrected image substantially representing said true image.
10. The apparatus of claim 9 wherein said lens is a catadioptric lens.
11. A computer program product including:
a computer usable storage medium having computer readable code embodied therein for causing a computer to map a destination data space to a source data space, said computer readable code including:
computer readable program code configured to cause said computer to effect a region determination mechanism configured to determine a region in said destination data space, said region bounded by a plurality of grid points and said region defining a first plurality of data points in said destination data space;
computer readable program code configured to cause said computer to effect a precise mapping mechanism configured to precisely map said plurality of grid points determined by the region determination mechanism to a plurality of mapped grid points in said source data space associating a second plurality of data points, wherein said plurality of mapped grid points define a plurality of boundary lines that represent the boundary of said region as mapped into said source data space; and
computer readable program code configured to cause said computer to effect a bilateral-bilinear interpolation mechanism configured to map said first plurality of data points in said region to said second plurality of data points using said plurality of mapped grid points.
12. The computer program product of claim 11 further including computer readable program code configured to cause said computer to drive a presentation device configured to present said first plurality of data points.
13. The computer program product of claim 11 wherein said second plurality of data points represents a warped image and said first plurality of data points represents a perspective corrected image.
14. The computer program product of claim 13 wherein said warped image represents a true image warped by a lens, said perspective corrected image substantially representing said true image.
15. The computer program product of claim 14 wherein said lens is a catadioptric lens.
16. A computer program product including:
a computer data signal embodied in a carrier wave having computer readable code embodied therein for causing a computer to map a destination data space to a source data space, said computer readable code including:
computer readable program code configured to cause said computer to effect a region determination mechanism configured to determine a region in said destination data space, said region bounded by a plurality of grid points and said region defining a first plurality of data points in said destination data space;
computer readable program code configured to cause said computer to effect a precise mapping mechanism configured to precisely map said plurality of grid points determined by the region determination mechanism to a plurality of mapped grid points in said source data space associating a second plurality of data points, wherein said plurality of mapped grid points define a plurality of boundary lines that represent the boundary of said region as mapped into said source data space; and
computer readable program code configured to cause said computer to effect a bilateral-bilinear interpolation mechanism configured to map said first plurality of data points in said region to said second plurality of data points using said plurality of mapped grid points.
US10/056,476 1998-11-25 2002-01-24 Method, apparatus and computer program product for generating perspective corrected data from warped information Abandoned US20020094132A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/056,476 US20020094132A1 (en) 1998-11-25 2002-01-24 Method, apparatus and computer program product for generating perspective corrected data from warped information

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/200,172 US6369818B1 (en) 1998-11-25 1998-11-25 Method, apparatus and computer program product for generating perspective corrected data from warped information
US10/056,476 US20020094132A1 (en) 1998-11-25 2002-01-24 Method, apparatus and computer program product for generating perspective corrected data from warped information

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/200,172 Continuation US6369818B1 (en) 1998-11-25 1998-11-25 Method, apparatus and computer program product for generating perspective corrected data from warped information

Publications (1)

Publication Number Publication Date
US20020094132A1 true US20020094132A1 (en) 2002-07-18

Family

ID=22740633

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/200,172 Expired - Lifetime US6369818B1 (en) 1998-11-25 1998-11-25 Method, apparatus and computer program product for generating perspective corrected data from warped information
US10/056,476 Abandoned US20020094132A1 (en) 1998-11-25 2002-01-24 Method, apparatus and computer program product for generating perspective corrected data from warped information

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/200,172 Expired - Lifetime US6369818B1 (en) 1998-11-25 1998-11-25 Method, apparatus and computer program product for generating perspective corrected data from warped information

Country Status (5)

Country Link
US (2) US6369818B1 (en)
EP (1) EP1004988B1 (en)
JP (1) JP2000182038A (en)
AT (1) ATE350726T1 (en)
DE (1) DE69934661D1 (en)

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020196327A1 (en) * 2001-06-14 2002-12-26 Yong Rui Automated video production system and method using expert video production rules for online publishing of lectures
US20030234866A1 (en) * 2002-06-21 2003-12-25 Ross Cutler System and method for camera color calibration and image stitching
US20040001137A1 (en) * 2002-06-27 2004-01-01 Ross Cutler Integrated design for omni-directional camera and microphone array
US20040267521A1 (en) * 2003-06-25 2004-12-30 Ross Cutler System and method for audio/video speaker detection
US20040263611A1 (en) * 2003-06-26 2004-12-30 Ross Cutler Omni-directional camera design for video conferencing
US20040263636A1 (en) * 2003-06-26 2004-12-30 Microsoft Corporation System and method for distributed meetings
US20040263646A1 (en) * 2003-06-24 2004-12-30 Microsoft Corporation Whiteboard view camera
US20050046703A1 (en) * 2002-06-21 2005-03-03 Cutler Ross G. Color calibration in photographic devices
US20050117015A1 (en) * 2003-06-26 2005-06-02 Microsoft Corp. Foveated panoramic camera system
US20050117034A1 (en) * 2002-06-21 2005-06-02 Microsoft Corp. Temperature compensation in multi-camera photographic devices
US20050151837A1 (en) * 2002-06-21 2005-07-14 Microsoft Corp. Minimizing dead zones in panoramic images
US20050206659A1 (en) * 2002-06-28 2005-09-22 Microsoft Corporation User interface for a system and method for head size equalization in 360 degree panoramic images
US20050243166A1 (en) * 2004-04-30 2005-11-03 Microsoft Corporation System and process for adding high frame-rate current speaker data to a low frame-rate video
US20050243168A1 (en) * 2004-04-30 2005-11-03 Microsoft Corporation System and process for adding high frame-rate current speaker data to a low frame-rate video using audio watermarking techniques
US20050243167A1 (en) * 2004-04-30 2005-11-03 Microsoft Corporation System and process for adding high frame-rate current speaker data to a low frame-rate video using delta frames
US20050280700A1 (en) * 2001-06-14 2005-12-22 Microsoft Corporation Automated online broadcasting system and method using an omni-directional camera system for viewing meetings over a computer network
US20050285943A1 (en) * 2002-06-21 2005-12-29 Cutler Ross G Automatic face extraction for use in recorded meetings timelines
US20060023106A1 (en) * 2004-07-28 2006-02-02 Microsoft Corporation Multi-view integrated camera system
US20060023074A1 (en) * 2004-07-28 2006-02-02 Microsoft Corporation Omni-directional camera with calibration and up look angle improvements
US7184609B2 (en) 2002-06-28 2007-02-27 Microsoft Corp. System and method for head size equalization in 360 degree panoramic images
US20070058879A1 (en) * 2005-09-15 2007-03-15 Microsoft Corporation Automatic detection of panoramic camera position and orientation table parameters
US7260257B2 (en) 2002-06-19 2007-08-21 Microsoft Corp. System and method for whiteboard and audio capture
US20070300165A1 (en) * 2006-06-26 2007-12-27 Microsoft Corporation, Corporation In The State Of Washington User interface for sub-conferencing
US20070299710A1 (en) * 2006-06-26 2007-12-27 Microsoft Corporation Full collaboration breakout rooms for conferencing
US20070299912A1 (en) * 2006-06-26 2007-12-27 Microsoft Corporation, Corporation In The State Of Washington Panoramic video in a live meeting client
US20080008458A1 (en) * 2006-06-26 2008-01-10 Microsoft Corporation Interactive Recording and Playback for Network Conferencing
US7525928B2 (en) 2003-06-16 2009-04-28 Microsoft Corporation System and process for discovery of network-connected devices at remote sites using audio-based discovery techniques
CN103164861A (en) * 2013-03-21 2013-06-19 北京大学 Image structuring expression method based on subdivision codes
US20160124287A1 (en) * 2014-11-05 2016-05-05 Morpho Method for calibrating a sighting system
CN106227789A (en) * 2016-07-18 2016-12-14 华南理工大学 The area grid division methods of geography information attribute in studying overhead transmission line region
US10951859B2 (en) 2018-05-30 2021-03-16 Microsoft Technology Licensing, Llc Videoconferencing device and method

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002096096A1 (en) * 2001-05-16 2002-11-28 Zaxel Systems, Inc. 3d instant replay system and method
US7139440B2 (en) * 2001-08-25 2006-11-21 Eyesee360, Inc. Method and apparatus for encoding photographic images
US7123777B2 (en) * 2001-09-27 2006-10-17 Eyesee360, Inc. System and method for panoramic imaging
US7058239B2 (en) * 2001-10-29 2006-06-06 Eyesee360, Inc. System and method for panoramic imaging
US8599266B2 (en) * 2002-07-01 2013-12-03 The Regents Of The University Of California Digital processing of video images
US8307273B2 (en) 2002-12-30 2012-11-06 The Board Of Trustees Of The Leland Stanford Junior University Methods and apparatus for interactive network sharing of digital video content
US7082572B2 (en) 2002-12-30 2006-07-25 The Board Of Trustees Of The Leland Stanford Junior University Methods and apparatus for interactive map-based analysis of digital video content
US7823058B2 (en) 2002-12-30 2010-10-26 The Board Of Trustees Of The Leland Stanford Junior University Methods and apparatus for interactive point-of-view authoring of digital video content
US20040254424A1 (en) * 2003-04-15 2004-12-16 Interscience, Inc. Integrated panoramic and forward view endoscope
US7443807B2 (en) * 2003-06-16 2008-10-28 Microsoft Corporation System and process for discovery of network-connected devices
US7359575B2 (en) * 2004-12-07 2008-04-15 Silicon Optix Inc. Dynamic warp map generation system and method
US7880737B2 (en) * 2005-03-22 2011-02-01 Vijayvardhan Elchuri Graphical method and system for making drawings directly in three-dimensions on a computer monitor or other display device
TWI303782B (en) * 2006-03-10 2008-12-01 Sony Taiwan Ltd An optimized video stitching method for ASIC implementation
JP4947351B2 (en) * 2006-07-28 2012-06-06 富士ゼロックス株式会社 Image processing apparatus and program
US8224122B2 (en) * 2006-12-15 2012-07-17 Microsoft Corporation Dynamic viewing of wide angle images
US8543788B2 (en) * 2007-06-06 2013-09-24 Aptina Imaging Corporation Conformal rolling buffer apparatus, systems, and methods
WO2010084460A1 (en) * 2009-01-20 2010-07-29 Nxp B.V. Image processing using a bilateral grid
JP5376313B2 (en) * 2009-09-03 2013-12-25 株式会社リコー Image processing apparatus and image pickup apparatus
US9488469B1 (en) 2013-04-22 2016-11-08 Cognex Corporation System and method for high-accuracy measurement of object surface displacement using a laser displacement sensor
CN108510549B (en) * 2018-03-27 2022-01-04 京东方科技集团股份有限公司 Distortion parameter measuring method, device and system of virtual reality equipment
US11734789B2 (en) * 2020-06-02 2023-08-22 Immersive Tech, Inc. Systems and methods for image distortion correction

Family Cites Families (149)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2304434A (en) 1928-09-03 1942-12-08 Ibm Projecting device
US2146662A (en) 1936-09-05 1939-02-07 Lieuwe E W Van Albada Sighting instrument
US2244235A (en) 1938-09-03 1941-06-03 Ibm Cycloramic optical system
US2628529A (en) 1948-09-25 1953-02-17 Lawrence E Braymer Reflecting telescope with auxiliary optical system
US2654286A (en) 1950-07-14 1953-10-06 Jorge M Cesar Optical viewing device for night driving
FR1234341A (en) 1958-07-02 1960-10-17 Additional lens for taking and projecting photographic views of moving and still subjects
US3205777A (en) 1961-11-08 1965-09-14 Brenner Arthur Telescopic mounting for convex mirrors
US3229576A (en) 1962-11-21 1966-01-18 Donald W Rees Hyperbolic ellipsoidal real time display panoramic viewing installation for vehicles
US3203328A (en) 1963-02-21 1965-08-31 Marquardt Corp Full circumferential viewing system
US3692934A (en) 1971-02-11 1972-09-19 Us Navy Roll and pitch simulator utilizing 360{20 {0 display
US3723805A (en) 1971-05-12 1973-03-27 Us Navy Distortion correction system
US3785715A (en) 1972-05-17 1974-01-15 Singer Co Panoramic infinity image display
US3832046A (en) 1972-11-13 1974-08-27 Singer Co Panoramic projector and camera
US3846809A (en) 1973-10-18 1974-11-05 G Troje Reflectors and mounts for panoramic optical systems
CH589309A5 (en) 1974-03-11 1977-06-30 Infra Vision Ag
US3872238A (en) 1974-03-11 1975-03-18 Us Navy 360 Degree panoramic television system
US3998532A (en) 1974-04-08 1976-12-21 The United States Of America As Represented By The Secretary Of The Navy Wide angle single channel projection apparatus
US4012126A (en) 1974-04-08 1977-03-15 The United States Of America As Represented By The Secretary Of The Navy Optical system for 360° annular image transfer
NL7406227A (en) 1974-05-09 1975-11-11 Stichting Internationaal Insti DEVICE IN A VESSEL FOR RECORDING DATA OF AN OBJECT LOCATED OUTSIDE.
US3934259A (en) 1974-12-09 1976-01-20 The United States Of America As Represented By The Secretary Of The Navy All-sky camera apparatus for time-resolved lightning photography
US4058831A (en) 1976-09-08 1977-11-15 Lectrolarm Custom Systems, Inc. Panoramic camera scanning system
US4078860A (en) 1976-10-27 1978-03-14 Globus Ronald P Cycloramic image projection system
GB1553525A (en) 1976-10-30 1979-09-26 Luknar A Security system
US4157218A (en) 1977-04-14 1979-06-05 The Perkin-Elmer Corporation Wide angle scan camera
US4241985A (en) 1978-11-27 1980-12-30 Globus Richard D Panoramic camera
USD263716S (en) 1979-02-06 1982-04-06 Globuscope, Inc. Panoramic camera
US4326775A (en) 1979-02-07 1982-04-27 King Don G Method for operating a panoramic optical system
GB2315944B (en) 1979-05-16 1998-06-24 British Aerospace Improvements relating to surveillance apparatus
GB2084833A (en) 1980-04-11 1982-04-15 Ampex System for spatially transforming images
US4395093A (en) 1981-05-21 1983-07-26 The United States Of America As Represented By The Secretary Of The Navy Lens system for panoramic imagery
US4429957A (en) 1981-07-30 1984-02-07 King-Bell Optics, Inc. Panoramic zoom lens assembly
US4463380A (en) 1981-09-25 1984-07-31 Vought Corporation Image processing system
US4835532A (en) 1982-07-30 1989-05-30 Honeywell Inc. Nonaliasing real-time spatial transform image processing system
US4484801A (en) 1982-09-20 1984-11-27 The United States Of America As Represented By The Secretary Of The Navy Panoramic lens
JPS59115677A (en) 1982-12-22 1984-07-04 Hitachi Ltd Picture processor
US4602857A (en) 1982-12-23 1986-07-29 James H. Carmel Panoramic motion picture camera and method
US4761641A (en) 1983-01-21 1988-08-02 Vidcom Rentservice B.V. Information display system
HU192125B (en) 1983-02-08 1987-05-28 Budapesti Mueszaki Egyetem Block of forming image for centre theory projection adn reproduction of spaces
US4518898A (en) 1983-02-22 1985-05-21 Image Graphics, Incorporated Method and apparatus for correcting image distortions
US4656506A (en) 1983-02-25 1987-04-07 Ritchey Kurtis J Spherical projection system
IT1195600B (en) 1983-10-26 1988-10-19 Ivo Rosset DEVICE FOR MAKING PANORAMIC PHOTOGRAPHS WITH NORMAL USE CAMERA
JPS60186967A (en) 1984-03-05 1985-09-24 Fanuc Ltd Image display method
US4578682A (en) 1984-03-20 1986-03-25 Raydx Satellite Systems, Ltd. Antenna dish
US4736436A (en) 1984-04-13 1988-04-05 Fujitsu Limited Information extraction by mapping
US4561733A (en) 1984-04-17 1985-12-31 Recon/Optical, Inc. Panoramic unity vision system
DE3422752A1 (en) 1984-06-19 1985-12-19 Krauss-Maffei AG, 8000 München ELEVATIBLE OBSERVATION AND TARGET SYSTEM FOR COMBAT VEHICLES
US4670648A (en) 1985-03-06 1987-06-02 University Of Cincinnati Omnidirectional vision system for controlling mobile machines
JPH0681275B2 (en) 1985-04-03 1994-10-12 ソニー株式会社 Image converter
GB2177278A (en) 1985-07-05 1987-01-14 Hunger Ibak H Gmbh & Co Kg Variable sight line television camera
GB2177871B (en) 1985-07-09 1989-02-08 Sony Corp Methods of and circuits for video signal processing
GB2185360B (en) 1986-01-11 1989-10-25 Pilkington Perkin Elmer Ltd Display system
GB2188205B (en) 1986-03-20 1990-01-04 Rank Xerox Ltd Imaging apparatus
US5038225A (en) 1986-04-04 1991-08-06 Canon Kabushiki Kaisha Image reading apparatus with black-level and/or white level correction
JP2515101B2 (en) 1986-06-27 1996-07-10 ヤマハ株式会社 Video and audio space recording / playback method
GB2194656B (en) 1986-09-03 1991-10-09 Ibm Method and system for solid modelling
US4807158A (en) 1986-09-30 1989-02-21 Daleco/Ivex Partners, Ltd. Method and apparatus for sampling images to simulate movement within a multidimensional space
US4728839A (en) 1987-02-24 1988-03-01 Remote Technology Corporation Motorized pan/tilt head for remote control
US4797942A (en) 1987-03-02 1989-01-10 General Electric Pyramid processor for building large-area, high-resolution image by parts
DE3712453A1 (en) 1987-04-11 1988-10-20 Wolf Gmbh Richard WIDE-ANGLE LENS FOR ENDOSCOPES
USD312263S (en) 1987-08-03 1990-11-20 Charles Jeffrey R Wide angle reflector attachment for a camera or similar article
JPS6446875A (en) 1987-08-17 1989-02-21 Toshiba Corp Object discriminating device
JPS6437174U (en) 1987-08-28 1989-03-06
FR2620544B1 (en) 1987-09-16 1994-02-11 Commissariat A Energie Atomique INTERPOLATION PROCESS
JPH01101061A (en) 1987-10-14 1989-04-19 Canon Inc Picture reader
US4945367A (en) 1988-03-02 1990-07-31 Blackshear David M Surveillance camera system
US4918473A (en) 1988-03-02 1990-04-17 Diamond Electronics, Inc. Surveillance camera system
EP0342419B1 (en) 1988-05-19 1992-10-28 Siemens Aktiengesellschaft Method for the observation of a scene and apparatus therefor
JP3138264B2 (en) 1988-06-21 2001-02-26 ソニー株式会社 Image processing method and apparatus
US5083389A (en) 1988-07-15 1992-01-28 Arthur Alperin Panoramic display device and method of making the same
US4864335A (en) 1988-09-12 1989-09-05 Corrales Richard C Panoramic camera
JPH0286266A (en) 1988-09-21 1990-03-27 Fuji Xerox Co Ltd Picture reader
US5157491A (en) 1988-10-17 1992-10-20 Kassatly L Samuel A Method and apparatus for video broadcasting and teleconferencing
US4899293A (en) 1988-10-24 1990-02-06 Honeywell Inc. Method of storage and retrieval of digital map data based upon a tessellated geoid system
GB8829135D0 (en) 1988-12-14 1989-01-25 Smith Graham T Panoramic interactive system
US5153716A (en) 1988-12-14 1992-10-06 Horizonscan Inc. Panoramic interactive system
US5040055A (en) 1988-12-14 1991-08-13 Horizonscan Inc. Panoramic interactive system
US4943821A (en) 1989-01-23 1990-07-24 Janet Louise Gelphman Topological panorama camera
US4991020A (en) 1989-02-17 1991-02-05 Hughes Aircraft Company Imaging system for providing separate simultaneous real time images from a single image sensor
US4943851A (en) 1989-03-07 1990-07-24 Gold Stake 360 degree viewing system having a liquid crystal display screen encircling a rotatable projection screen
US4901140A (en) 1989-03-07 1990-02-13 Gold Stake Solid state 360 degree viewing system having a liquid crystal display (LCD) screen that encircles the rotating real image in space and functions as a multi-color filter system
US5067019A (en) 1989-03-31 1991-11-19 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Programmable remapper for image processing
NL8900867A (en) 1989-04-07 1990-11-01 Theo Jogchum Poelstra A SYSTEM OF "IMAGETRY" FOR THE OBTAINMENT OF DIGITAL, 3D TOPOGRAPHIC INFORMATION.
JPH0378373A (en) 1989-08-22 1991-04-03 Fuji Photo Optical Co Ltd Television camera operating device
US5175808A (en) 1989-09-12 1992-12-29 Pixar Method and apparatus for non-affine image warping
US5023725A (en) 1989-10-23 1991-06-11 Mccutchen David Method and apparatus for dodecahedral imaging system
US5115266A (en) 1989-11-08 1992-05-19 Troje Gerald J Optical system for recording or projecting a panoramic image
FR2655503B1 (en) 1989-12-01 1992-02-21 Thomson Csf OPTOELECTRONIC SYSTEM FOR AIDING ATTACK AND NAVIGATION MISSIONS.
JPH0771290B2 (en) 1989-12-27 1995-07-31 富士写真光機株式会社 Signal processing circuit
US5224208A (en) * 1990-03-16 1993-06-29 Hewlett-Packard Company Gradient calculation for texture mapping
US5130794A (en) 1990-03-29 1992-07-14 Ritchey Kurtis J Panoramic display system
NL9000766A (en) 1990-04-02 1991-11-01 Koninkl Philips Electronics Nv DEVICE FOR GEOMETRIC CORRECTION OF A DISTRIBUTED IMAGE.
FR2662831B1 (en) 1990-05-29 1992-08-07 Cit Alcatel METHOD FOR MANAGING A DATABASE NETWORK.
JP3021556B2 (en) 1990-06-20 2000-03-15 ソニー株式会社 Video information processing apparatus and method
US5259584A (en) 1990-07-05 1993-11-09 Wainwright Andrew G Camera mount for taking panoramic pictures having an electronic protractor
FR2665600A1 (en) 1990-08-03 1992-02-07 Thomson Csf METHOD OF DETECTION FOR PANORAMIC CAMERA, CAMERA FOR ITS IMPLEMENTATION, AND SLEEPING SYSTEM EQUIPPED WITH SUCH A CAMERA
US5021813A (en) 1990-08-29 1991-06-04 Corrales Richard C Manually operated handle for panoramic camera
US5315331A (en) 1990-11-09 1994-05-24 Nikon Corporation Optical apparatus capable of performing a panoramic photographing
US5097325A (en) 1990-12-17 1992-03-17 Eol3 Company, Inc. Circular scanning system for an integrated camera and panoramic catadioptric display
US5187571A (en) 1991-02-01 1993-02-16 Bell Communications Research, Inc. Television system for displaying multiple views of a remote location
US5200818A (en) 1991-03-22 1993-04-06 Inbal Neta Video imaging system with interactive windowing capability
US5173948A (en) 1991-03-29 1992-12-22 The Grass Valley Group, Inc. Video image mapping system
JP3047927B2 (en) 1991-04-09 2000-06-05 三菱電機株式会社 Video signal clamp circuit
US5185667A (en) 1991-05-13 1993-02-09 Telerobotics International, Inc. Omniview motionless camera orientation system
US5313306A (en) 1991-05-13 1994-05-17 Telerobotics International, Inc. Omniview motionless camera endoscopy system
US5384588A (en) 1991-05-13 1995-01-24 Telerobotics International, Inc. System for omnidirectional image viewing at a remote location without the transmission of control signals to select viewing parameters
US5903319A (en) 1991-05-13 1999-05-11 Interactive Pictures Corporation Method for eliminating temporal and spacial distortion from interlaced video signals
US5990941A (en) 1991-05-13 1999-11-23 Interactive Pictures Corporation Method and apparatus for the interactive display of any portion of a spherical image
US5764276A (en) 1991-05-13 1998-06-09 Interactive Pictures Corporation Method and apparatus for providing perceived video viewing experiences using still images
US6002430A (en) 1994-01-31 1999-12-14 Interactive Pictures Corporation Method and apparatus for simultaneous capture of a spherical image
US5359363A (en) 1991-05-13 1994-10-25 Telerobotics International, Inc. Omniview motionless camera surveillance system
JP2719056B2 (en) 1991-08-20 1998-02-25 富士通株式会社 3D object drawing device
JP3085481B2 (en) 1991-09-28 2000-09-11 株式会社ニコン Catadioptric reduction projection optical system, and exposure apparatus having the optical system
US5311572A (en) 1991-10-03 1994-05-10 At&T Bell Laboratories Cooperative databases call processing system
US5280540A (en) 1991-10-09 1994-01-18 Bell Communications Research, Inc. Video teleconferencing system employing aspect ratio transformation
JP3302715B2 (en) 1992-04-20 2002-07-15 キヤノン株式会社 Video camera equipment
AU3930793A (en) 1992-05-08 1993-12-13 Apple Computer, Inc. Textured sphere and spherical environment map rendering using texture map double indirection
DE4226286A1 (en) 1992-08-08 1994-02-10 Kamerawerke Noble Gmbh Panorama camera with a lens drum
US5490239A (en) 1992-10-01 1996-02-06 University Corporation For Atmospheric Research Virtual reality imaging system
US5396583A (en) 1992-10-13 1995-03-07 Apple Computer, Inc. Cylindrical to planar image mapping using scanline coherence
US5530650A (en) 1992-10-28 1996-06-25 Mcdonnell Douglas Corp. Computer imaging system and method for remote in-flight aircraft refueling
EP0623268A1 (en) 1992-11-24 1994-11-09 Geeris Holding Nederland B.V. A method and device for producing panoramic images, and a method and device for consulting panoramic images
US5854713A (en) 1992-11-30 1998-12-29 Mitsubishi Denki Kabushiki Kaisha Reflection type angle of view transforming optical apparatus
US5444476A (en) 1992-12-11 1995-08-22 The Regents Of The University Of Michigan System and method for teleinteraction
US5495576A (en) 1993-01-11 1996-02-27 Ritchey; Kurtis J. Panoramic image based virtual reality/telepresence audio-visual system and method
US5473474A (en) 1993-07-16 1995-12-05 National Research Council Of Canada Panoramic lens
US5432871A (en) 1993-08-04 1995-07-11 Universal Systems & Technology, Inc. Systems and methods for interactive image data acquisition and compression
US5550646A (en) 1993-09-13 1996-08-27 Lucent Technologies Inc. Image communication system and method
CA2129942C (en) 1993-09-30 1998-08-25 Steven Todd Kaish Telecommunication network with integrated network-wide automatic call distribution
US5796426A (en) 1994-05-27 1998-08-18 Warp, Ltd. Wide-angle image dewarping method and apparatus
US5508734A (en) 1994-07-27 1996-04-16 International Business Machines Corporation Method and apparatus for hemispheric imaging which emphasizes peripheral content
US5610391A (en) 1994-08-25 1997-03-11 Owens-Brockway Glass Container Inc. Optical inspection of container finish dimensional parameters
US5649032A (en) 1994-11-14 1997-07-15 David Sarnoff Research Center, Inc. System for automatically aligning images to form a mosaic image
US5612533A (en) 1994-12-27 1997-03-18 Siemens Corporate Research, Inc. Low-profile horizon-sampling light sensor
US5920337A (en) 1994-12-27 1999-07-06 Siemens Corporate Research, Inc. Omnidirectional visual image detector and processor
US5714997A (en) 1995-01-06 1998-02-03 Anderson; David P. Virtual reality television system
US5606365A (en) 1995-03-28 1997-02-25 Eastman Kodak Company Interactive camera for network processing of captured images
US5850352A (en) 1995-03-31 1998-12-15 The Regents Of The University Of California Immersive video, including video hypermosaicing to generate from multiple video views of a scene a three-dimensional video mosaic from which diverse virtual video scene images are synthesized, including panoramic, scene interactive and stereoscopic images
US5729471A (en) 1995-03-31 1998-03-17 The Regents Of The University Of California Machine dynamic selection of one video camera/image of a scene from multiple video cameras/images of the scene in accordance with a particular perspective on the scene, an object in the scene, or an event in the scene
CA2146406A1 (en) 1995-04-05 1996-10-06 Ian Powell Panoramic fish-eye imaging system
US5682511A (en) 1995-05-05 1997-10-28 Microsoft Corporation Graphical viewer interface for an interactive network system
US5627675A (en) 1995-05-13 1997-05-06 Boeing North American Inc. Optics assembly for observing a panoramic scene
US5539483A (en) 1995-06-30 1996-07-23 At&T Corp. Panoramic projection apparatus
US5841589A (en) 1995-09-26 1998-11-24 Boeing North American, Inc. Panoramic optics assembly having an initial flat reflective element
US5633810A (en) 1995-12-14 1997-05-27 Sun Microsystems, Inc. Method and apparatus for distributing network bandwidth on a media server
US5601353A (en) 1995-12-20 1997-02-11 Interval Research Corporation Panoramic display with stationary display device and rotating support structure
US5748194A (en) 1996-05-08 1998-05-05 Live Picture, Inc. Rendering perspective views of a scene using a scanline-coherent look-up table
US5760826A (en) 1996-05-10 1998-06-02 The Trustees Of Columbia University Omnidirectional imaging apparatus
US6043837A (en) 1997-05-08 2000-03-28 Be Here Corporation Method and apparatus for electronically distributing images from a panoptic camera system
US6034716A (en) 1997-12-18 2000-03-07 Whiting; Joshua B. Panoramic digital camera system

Cited By (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7515172B2 (en) 2001-06-14 2009-04-07 Microsoft Corporation Automated online broadcasting system and method using an omni-directional camera system for viewing meetings over a computer network
US7349005B2 (en) 2001-06-14 2008-03-25 Microsoft Corporation Automated video production system and method using expert video production rules for online publishing of lectures
US7580054B2 (en) 2001-06-14 2009-08-25 Microsoft Corporation Automated online broadcasting system and method using an omni-directional camera system for viewing meetings over a computer network
US20020196327A1 (en) * 2001-06-14 2002-12-26 Yong Rui Automated video production system and method using expert video production rules for online publishing of lectures
US20050285933A1 (en) * 2001-06-14 2005-12-29 Microsoft Corporation Automated online broadcasting system and method using an omni-directional camera system for viewing meetings over a computer network
US20050280700A1 (en) * 2001-06-14 2005-12-22 Microsoft Corporation Automated online broadcasting system and method using an omni-directional camera system for viewing meetings over a computer network
US7260257B2 (en) 2002-06-19 2007-08-21 Microsoft Corp. System and method for whiteboard and audio capture
US20050117034A1 (en) * 2002-06-21 2005-06-02 Microsoft Corp. Temperature compensation in multi-camera photographic devices
US20050285943A1 (en) * 2002-06-21 2005-12-29 Cutler Ross G Automatic face extraction for use in recorded meetings timelines
US20050046703A1 (en) * 2002-06-21 2005-03-03 Cutler Ross G. Color calibration in photographic devices
US20050151837A1 (en) * 2002-06-21 2005-07-14 Microsoft Corp. Minimizing dead zones in panoramic images
US7782357B2 (en) 2002-06-21 2010-08-24 Microsoft Corporation Minimizing dead zones in panoramic images
US20030234866A1 (en) * 2002-06-21 2003-12-25 Ross Cutler System and method for camera color calibration and image stitching
US7936374B2 (en) 2002-06-21 2011-05-03 Microsoft Corporation System and method for camera calibration and images stitching
US7598975B2 (en) 2002-06-21 2009-10-06 Microsoft Corporation Automatic face extraction for use in recorded meetings timelines
US7259784B2 (en) 2002-06-21 2007-08-21 Microsoft Corporation System and method for camera color calibration and image stitching
US7602412B2 (en) 2002-06-21 2009-10-13 Microsoft Corporation Temperature compensation in multi-camera photographic devices
US20040001137A1 (en) * 2002-06-27 2004-01-01 Ross Cutler Integrated design for omni-directional camera and microphone array
US7852369B2 (en) 2002-06-27 2010-12-14 Microsoft Corp. Integrated design for omni-directional camera and microphone array
US7184609B2 (en) 2002-06-28 2007-02-27 Microsoft Corp. System and method for head size equalization in 360 degree panoramic images
US20050206659A1 (en) * 2002-06-28 2005-09-22 Microsoft Corporation User interface for a system and method for head size equalization in 360 degree panoramic images
US7149367B2 (en) 2002-06-28 2006-12-12 Microsoft Corp. User interface for a system and method for head size equalization in 360 degree panoramic images
US7525928B2 (en) 2003-06-16 2009-04-28 Microsoft Corporation System and process for discovery of network-connected devices at remote sites using audio-based discovery techniques
US7397504B2 (en) 2003-06-24 2008-07-08 Microsoft Corp. Whiteboard view camera
US20040263646A1 (en) * 2003-06-24 2004-12-30 Microsoft Corporation Whiteboard view camera
US20040267521A1 (en) * 2003-06-25 2004-12-30 Ross Cutler System and method for audio/video speaker detection
US7343289B2 (en) 2003-06-25 2008-03-11 Microsoft Corp. System and method for audio/video speaker detection
US20040263611A1 (en) * 2003-06-26 2004-12-30 Ross Cutler Omni-directional camera design for video conferencing
US20040263636A1 (en) * 2003-06-26 2004-12-30 Microsoft Corporation System and method for distributed meetings
US7428000B2 (en) 2003-06-26 2008-09-23 Microsoft Corp. System and method for distributed meetings
US20050117015A1 (en) * 2003-06-26 2005-06-02 Microsoft Corp. Foveated panoramic camera system
US20050243167A1 (en) * 2004-04-30 2005-11-03 Microsoft Corporation System and process for adding high frame-rate current speaker data to a low frame-rate video using delta frames
US7355622B2 (en) 2004-04-30 2008-04-08 Microsoft Corporation System and process for adding high frame-rate current speaker data to a low frame-rate video using delta frames
US7355623B2 (en) 2004-04-30 2008-04-08 Microsoft Corporation System and process for adding high frame-rate current speaker data to a low frame-rate video using audio watermarking techniques
US7362350B2 (en) 2004-04-30 2008-04-22 Microsoft Corporation System and process for adding high frame-rate current speaker data to a low frame-rate video
US20050243166A1 (en) * 2004-04-30 2005-11-03 Microsoft Corporation System and process for adding high frame-rate current speaker data to a low frame-rate video
US20050243168A1 (en) * 2004-04-30 2005-11-03 Microsoft Corporation System and process for adding high frame-rate current speaker data to a low frame-rate video using audio watermarking techniques
US7593042B2 (en) 2004-07-28 2009-09-22 Microsoft Corporation Maintenance of panoramic camera orientation
US20060023106A1 (en) * 2004-07-28 2006-02-02 Microsoft Corporation Multi-view integrated camera system
US20060023074A1 (en) * 2004-07-28 2006-02-02 Microsoft Corporation Omni-directional camera with calibration and up look angle improvements
US20060023075A1 (en) * 2004-07-28 2006-02-02 Microsoft Corp. Maintenance of panoramic camera orientation
US7593057B2 (en) 2004-07-28 2009-09-22 Microsoft Corp. Multi-view integrated camera system with housing
US7495694B2 (en) 2004-07-28 2009-02-24 Microsoft Corp. Omni-directional camera with calibration and up look angle improvements
US7630571B2 (en) 2005-09-15 2009-12-08 Microsoft Corporation Automatic detection of panoramic camera position and orientation table parameters
US20070058879A1 (en) * 2005-09-15 2007-03-15 Microsoft Corporation Automatic detection of panoramic camera position and orientation table parameters
US7653705B2 (en) 2006-06-26 2010-01-26 Microsoft Corp. Interactive recording and playback for network conferencing
US20070300165A1 (en) * 2006-06-26 2007-12-27 Microsoft Corporation, Corporation In The State Of Washington User interface for sub-conferencing
US20070299710A1 (en) * 2006-06-26 2007-12-27 Microsoft Corporation Full collaboration breakout rooms for conferencing
US20070299912A1 (en) * 2006-06-26 2007-12-27 Microsoft Corporation, Corporation In The State Of Washington Panoramic video in a live meeting client
US20080008458A1 (en) * 2006-06-26 2008-01-10 Microsoft Corporation Interactive Recording and Playback for Network Conferencing
US8572183B2 (en) 2006-06-26 2013-10-29 Microsoft Corp. Panoramic video in a live meeting client
CN103164861A (en) * 2013-03-21 2013-06-19 北京大学 Image structuring expression method based on subdivision codes
US20160124287A1 (en) * 2014-11-05 2016-05-05 Morpho Method for calibrating a sighting system
US10018893B2 (en) * 2014-11-05 2018-07-10 Morpho Method for calibrating a sighting system
CN106227789A (en) * 2016-07-18 2016-12-14 华南理工大学 The area grid division methods of geography information attribute in studying overhead transmission line region
US10951859B2 (en) 2018-05-30 2021-03-16 Microsoft Technology Licensing, Llc Videoconferencing device and method

Also Published As

Publication number Publication date
JP2000182038A (en) 2000-06-30
DE69934661D1 (en) 2007-02-15
ATE350726T1 (en) 2007-01-15
EP1004988A2 (en) 2000-05-31
EP1004988B1 (en) 2007-01-03
US6369818B1 (en) 2002-04-09
EP1004988A3 (en) 2002-03-20

Similar Documents

Publication Publication Date Title
US6369818B1 (en) Method, apparatus and computer program product for generating perspective corrected data from warped information
US6215915B1 (en) Image processing methods and apparatus for separable, general affine transformation of an image
US4827413A (en) Modified back-to-front three dimensional reconstruction algorithm
JP4845147B2 (en) Perspective editing tool for 2D images
US6268846B1 (en) 3D graphics based on images and morphing
US6782130B2 (en) Rendering of photorealistic computer graphics images
EP0360155B1 (en) Image transformation method and device
US6292192B1 (en) System and method for the direct rendering of curve bounded objects
JP3466661B2 (en) Image processing apparatus and method
JP3675488B2 (en) Circuit for determining non-homogeneous secondary perspective texture mapping coordinates using linear interpolation
JP2793466B2 (en) Variable image enlargement method and apparatus
US20050237336A1 (en) Method and system for multi-object volumetric data visualization
US20040150638A1 (en) Image processing apparatus, image processing method, and image processing program
US7825928B2 (en) Image processing device and image processing method for rendering three-dimensional objects
US5491769A (en) Method and apparatus for variable minification of an image
EP0574245A2 (en) Method and apparatus for variable expansion and variable shrinkage of an image
US5821942A (en) Ray tracing through an ordered array
JP6443574B1 (en) Ray casting program, search control data, search control data generation method, and ray casting apparatus
JP7131080B2 (en) volume rendering device
US6380936B1 (en) System and method for inferring projective mappings
JP2878614B2 (en) Image synthesis method and apparatus
JP2519779B2 (en) 3D image display device
US6510442B1 (en) System and method for improved digital differential analyzer
US6400369B1 (en) Information processing method and apparatus for generating texture-data for computer graphics
KR0140283B1 (en) Image Rotation Method of Image Editing Device

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: BE HERE CORPORATION, CALIFORNIA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:WASSERSTEIN ADELSON VENTURES, L.P.;REEL/FRAME:020125/0676

Effective date: 20071116

AS Assignment

Owner name: B.H. IMAGE CO. LLC, DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BE HERE CORPORATION;REEL/FRAME:021965/0030

Effective date: 20071117