WO2005076628A2 - Method and system for determining the displacement of a pixel, and recording medium for implementing the method - Google Patents
- Publication number
- WO2005076628A2 (PCT/FR2004/003417)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- scene
- point
- pixel
- color
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
Definitions
- The invention relates to a method and a system for determining the displacement of a pixel between two images, as well as to an image synthesis method and an information recording medium for implementing the method. More specifically, the invention relates to a method for determining the displacement of a pixel between a first and a second image, the first image being synthesized from a first scene containing a given object and a point of view, and the second image being synthesized from a second scene containing the same object, this second scene being obtained from the first scene by moving the object in the first scene and/or by moving the point of view of the first scene.
- This information is used to determine a displacement vector of this point between the first and the second images, that is to say the amplitude and the direction of the displacement, or translation, of this point between these two images.
- The set of displacement vectors of all the points of an object or of an image is called a displacement vector field.
- These displacement vector fields are used in particular in methods for constructing intermediate images by temporal interpolation.
- A synthesis module outputs only images, that is to say planar images of a scene defined in three dimensions.
- A calculation module dedicated to this task must therefore be used in addition to the synthesis module.
- This calculation module works directly on the data defining the three-dimensional scene, that is to say on the same data as that processed by the synthesis module. For example, the calculation module establishes the final position of a pixel from information on the movement of objects in the scene between the times at which a first and a second image are taken, as well as from information on the displacement of the point of view between these same moments. These calculations are long and complicated, so the calculation module is itself complex and slow.
- The invention aims to remedy this drawback by proposing a method for determining the displacement of a pixel between a first and a second image that makes it possible to simplify the calculation module.
- The subject of the invention is therefore a method as described above, characterized in that it comprises, before the synthesis of the second image, a step of defining the color of at least one point of the object in the second scene as a function of the position of this point in the first scene, so that the color of the pixel of the second image corresponding to this point indicates the position of this point in the first image. This definition step comprises: a projection operation, onto the object of the second scene, of a third image consisting of a set of pixels in which the color of each pixel indicates its position in this third image, this third image being projected from a projector whose position relative to the object it illuminates in the second scene is chosen to be the same as the position of the point of view relative to this object in the first scene; an operation to modify the surface appearance of the illuminated object so that its surface diffuses the lighting from the projector toward the point of view; and an operation to remove the stray light sources likely to modify the color diffused by one or more points of the illuminated object.
- The determination of the initial position and of the final position of a point of an object is simplified because it suffices to note the position and the color of a pixel in the second image to know the position of the corresponding object point in the second image and in the first image, respectively.
- The calculation module therefore no longer has to perform complicated calculations to determine this information. It will also be noted that the calculation module no longer works on the input data of the synthesis module, that is to say the data defining the three-dimensional scene, but only on an image, that is to say on two-dimensional data.
- The subject of the invention is also a method of synthesizing images, each image being formed of a set of pixels, this method comprising: a first step of synthesizing a first image from a first scene, the first image representing an object in the first scene seen from a given point of view; a second step of synthesizing a second image from a second scene, this second scene being obtained from the first scene by moving the object and/or the point of view of the first scene; a step of determining the displacement of at least one pixel between the first and second images, carried out by implementing a method for determining the displacement of a pixel according to the invention; and a step of constructing, by temporal interpolation, at least one intermediate image between the first and second synthesized images, using the previously determined displacement information for each pixel.
- The invention also relates to an information recording medium, characterized in that it includes instructions for implementing the determination or synthesis method according to the invention when these instructions are executed by an electronic computer.
- The invention also relates to a system for determining the displacement of a pixel between a first and a second synthetic image, said images being made up of a set of pixels, the first image being synthesized from a first scene.
- This system comprises: an image synthesis module capable of generating images from a three-dimensional scene, and a control module capable of activating the synthesis module a first time to generate the first image from the first scene, and a second time to generate the second image from the second scene. The system is characterized in that the control module is capable of automatically defining the color of at least one point of the object in the second scene as a function of the position of this point in the first scene, so that the color of the pixel of the second image corresponding to this point indicates the position of this point in the first image, the control module being able to perform for this purpose a projection operation, onto the object of the second scene, of a third image consisting of a set of pixels in which the color of each pixel indicates its position in this third image.
- FIG. 1 represents an electronic system for synthesizing images designated by the general reference 2.
- This system 2 comprises an electronic computer 4, such as a central unit of a conventional computer, associated with a display screen 6 and a memory 8.
- the system 2 also includes a man / machine interface 10 allowing a user to enter commands for moving objects.
- This interface 10 is here formed of an alphanumeric keyboard and a control lever 12.
- the computer 4 includes a module 20 for synthesizing digital images and a module 22 for controlling this module 20.
- The module 22 is capable of defining a digital model of a three-dimensional scene, such as the scene 28 shown in perspective in FIG. 2 at a time T.
- the three-dimensional scene used to illustrate the present description comprises only two objects, that is to say here a cone 30 and a cylinder 32.
- the geometric shape of each object is defined by an assembly of several facets contiguous to each other.
- these facets are represented by triangles 34.
- Each facet of each object has a surface appearance, which is adjustable.
- a scene also includes one or more projectors intended to illuminate the objects of the scene.
- projectors 36 and 38 are shown in FIG. 2.
- The directivity of each projector is also adjustable, that is to say, for example, which facets of each object will be illuminated by this projector.
- a scene comprises at least one point of view represented here by a camera 40. The position, the orientation as well as the field of vision of this camera 40 are adjustable.
- FIG. 3 represents a scene 42 corresponding to scene 28 at an instant T + 1 after the cone 30 has been moved along the axis Z while the cylinder 32 has remained stationary.
- This command to move the cone 30 has, for example, been generated by the user using the control lever 12.
- This displacement is denoted D and represented by an arrow D in FIG. 3.
- the synthesis module 20 is capable of generating a two-dimensional image of the scene defined by the control module 22.
- the module 20 is able to generate the image of the three-dimensional scene that the camera 40 films.
- To this end, the module 20 carries out a known process using the OpenGL technology. Information on this technology can be obtained online at the following address: http://developer.apple.com/documentation/GraphicsImaging/Conceptual/OpenGL/chap2/chapter_2_section_3.html
- Figures 4 and 5 represent images 44 and 46 synthesized by module 20 respectively for scenes 28 and 42. In images 44 and 46 the axes Y and Z correspond to those of the orthonormal reference frame of scenes 28 and 42.
- the images synthesized by the module 20 are suitable for being displayed on the screen 6. For example, here these images are images of 256 pixels by 256 pixels.
- In FIGS. 4 and 5 is shown an orthonormal reference frame i, j graduated in numbers of pixels, whose origin corresponds to the pixel in the lower left corner of the image.
- This orthonormal reference frame i, j makes it possible to define the position of each pixel of the image by a pair of coordinates (i, j), where i and j correspond to the coordinates of the pixel on the two axes respectively.
- The module 20 also includes a sub-module 48 for smoothing colors. Indeed, when the surface of the point of an object corresponding to a pixel has two different colors separated by a border, a rule is needed that allows the module 20 to assign a single color to this pixel.
- The sub-module 48 smooths this difference in colors by choosing an intermediate color between the colors present on either side of the border. More precisely, the sub-module 48 determines this intermediate color to be assigned to the pixel by linear interpolation.
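The linear interpolation performed by the sub-module 48 can be sketched as follows; the function name and the coverage weight are illustrative assumptions, not the patented implementation:

```python
def smooth_color(color_a, color_b, coverage_a):
    """Assign a pixel straddling a color border an intermediate color,
    linearly interpolated between the colors on either side of the
    border.  'coverage_a' (0..1) is the assumed fraction of the pixel
    covered by color_a."""
    return tuple(ca * coverage_a + cb * (1.0 - coverage_a)
                 for ca, cb in zip(color_a, color_b))

# halfway across the border, the pixel gets the mean of the two colors
print(smooth_color((600, 384), (597, 384), 0.5))  # -> (598.5, 384.0)
```

Because the interpolated color need not coincide with any projected color, this smoothing is what later makes sub-pixel displacement measurement possible.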
- the computer 4 also includes a module 50 for calculating a field of displacement vectors as well as a module 52 for constructing intermediate images.
- the module 50 is intended to calculate the vector of displacement of the pixels between two successive synthesized images. For example, this module 50 is able to calculate the displacement vector of all the pixels of the image 44 between the instants T and T + 1.
- the module 52 is capable of calculating intermediate images representing the scene filmed by the camera 40 at an intermediate instant between the instants T and T + 1.
- the module 52 implements a known temporal interpolation process, such as that described in patent EP 0 294 282 B1.
- The computer 4 is able to control the display, on the screen 6, of an ordered temporal sequence of images, or video sequence, formed by the images synthesized by the module 20, between which are inserted intermediate images constructed by the module 52.
- the computer 4 is a conventional programmable electronic computer and the various modules 20, 22, 50 and 52 are, for example, software modules.
- the instructions corresponding to these software modules are, for example, recorded in memory 8. These instructions are adapted to the execution of the method of FIG. 6.
- the operation of the system 2 will now be described with reference to FIG. 6 and in the particular case of scenes 28 and 42.
- The module 20 synthesizes, during a step 70, the image 44 from the scene 28 defined by the control module 22. Then, the module 20 synthesizes, during a step 72, the image 46 from the scene 42 defined by the control module 22.
- The system 2 then proceeds to a step 74 of generating a third image which is, for example, identical to the second image 46 except that each pixel of an object has a color that is a function of the place occupied at time T by the point of the object corresponding to this pixel.
- the module 22 automatically defines, during an operation 76, a third scene 80 (FIG. 7). This scene 80 is geometrically identical to scene 42 with the exception of the position of the projectors.
- This scene 80 is constructed from scene 42 so as to preserve identically the positions of the objects 30 and 32 and of the camera 40.
- The module 22 removes, during a sub-operation 86, all the projectors or light sources of the scene 42. Then, during a sub-operation 88, the module 22 assigns to all the objects of the scene 42 the same surface appearance.
- The surface appearance is here chosen so that the surface of all objects is perfectly diffusing, that is to say that the surface does not change the color of the light it returns and distributes it throughout space with the same intensity. Thus, when a point of an object is illuminated with a red beam, this point diffuses throughout space a light of exactly the same red.
- the module 22 creates, during a sub-operation 90, a projector for each part or static object in the scene and a projector for each movable object in the scene.
- Static objects or parts are objects or parts which have not moved between time T and time T + 1, for example here the cylinder 32.
- Movable objects are, on the contrary, objects which have moved between time T and time T + 1.
- a single projector will be assigned to a group of several mobile objects if the relative position of these objects with respect to each other remains unchanged between time T and time T + 1.
- a projector 82 is created to illuminate the cylinder 32 while a projector 84 is created to illuminate the cone 30.
- The module 22 adjusts, during sub-operation 90, the directivity of the projectors 82 and 84 so that these only illuminate the facets of the object with which they are associated that are visible in the image 44.
- The module 22 then determines the position of each projector with respect to the object it illuminates. To this end, the position of each projector with respect to the object it illuminates is chosen to be identical to that occupied by the camera 40 with respect to this same object in scene 28. Consequently, since here neither the position of the cylinder 32 nor that of the camera 40 has been modified between the instants T and T + 1, the projector 82 which illuminates the cylinder 32 is placed at the same position as the camera 40.
- the position of the projector 84 is offset by a distance D along the Z axis from the position of the camera 40.
- This distance D is identical to the amplitude of the displacement D of the cone 30.
- the projector 84 remains, in a way, stationary relative to the cone 30 despite the displacement of the latter between instants T and T + 1.
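The placement rule for the projectors 82 and 84 can be sketched as follows. This is an illustrative sketch assuming 3-D position vectors and a pure translation of the object; the function name is not from the patent:

```python
def projector_position(camera_pos_T, object_pos_T, object_pos_T1):
    """Place the projector so that its position relative to the object
    in the second scene equals the camera's position relative to that
    object in the first scene: offset the camera's position at time T
    by the object's displacement D between T and T + 1."""
    D = tuple(b - a for a, b in zip(object_pos_T, object_pos_T1))
    return tuple(c + d for c, d in zip(camera_pos_T, D))

# a static object (D = 0) gets its projector at the camera position;
# an object moved by D along Z gets a projector offset by the same D
print(projector_position((0, 0, 10), (5, 0, 0), (5, 0, 2)))  # -> (0, 0, 12)
```

For the cylinder 32 the displacement is zero, so its projector 82 coincides with the camera 40; for the cone 30 the projector 84 is offset by the cone's displacement D, as described above.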
- Placing each projector, with respect to the object it illuminates, in the same position as that occupied by the camera 40 at the previous instant T makes it possible to maximize the illuminated surface of the object that is usable for determining, for example, displacement vector fields.
- The field of vision of each projector is adjusted to correspond to that of the camera 40 at time T; then, during a sub-operation 96, each projector is also oriented like the camera 40 at time T.
- the module 22 defines the image projected by each projector on the object it illuminates.
- Each pixel's color is linked by a one-to-one function to the position of this pixel in this image, so that the color of a pixel indicates its position or, in other words, identifies the row and the column of the pixel in the image.
- The number of possible colors is greater than the number of pixels of the synthesized images, and preferably at least 2, 3 or 4 times greater.
- Each color is represented by a pair of values (k, l), where k represents, for example, the quantity of red and l the quantity of green. In the example described here, it is assumed that there are three times more possible colors than pixels.
- Since the synthesized images are images of 256 × 256 pixels, the maximum quantity of red or green is represented by the value 768 while the minimum quantity is represented by the value 1, the values k and l being able to take all the integer values between 1 and 768.
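One consistent way to realize this coding is sketched below, assuming 1-based pixel coordinates and a uniform step of 3 per pixel, so that coordinate 256 maps to the maximum value 768. (The description also lists (1, 1) as the color of the origin pixel, which a strictly uniform step cannot reproduce at the same time as (768, 768) at the opposite corner; the sketch favors the corner value used by the worked example later in the text.)

```python
def encode_position(i, j):
    """Map 1-based pixel coordinates (i, j) of a 256 x 256 image to a
    color pair (k, l): three times more color values (up to 768) than
    pixel rows/columns, advancing by 3 per pixel along each axis."""
    assert 1 <= i <= 256 and 1 <= j <= 256
    return (3 * i, 3 * j)

print(encode_position(256, 256))  # -> (768, 768)
```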
- all the projectors of scene 80 project the same image, that is to say an image 104 (FIG. 8).
- FIG. 8 includes a reference frame i, j identical to that of FIGS. 4 and 5 so as to identify the position of the pixels of the image 104.
- Only a few pixels of this image 104 are represented, by squares 106.
- The colors associated with each of these pixels are represented by a pair of values (k, l) indicated in parentheses in this figure.
- The colors associated with the pixels of image 104 are organized so as to create a color gradient along the axis i and along the axis j.
- The pixel located at the origin of the axes i and j is associated with the color (1, 1).
- The values of the parameter k assigned to the pixels of the image 104 form an arithmetic progression with common difference 3, going from left to right along the axis i.
- The values of the parameter l of each pixel likewise form an arithmetic progression with common difference 3 along the axis j, moving away from the pixel having the color (1, 1).
- The pixel located at the upper right corner, that is to say the one placed at the intersection of the 256th row and the 256th column of pixels, is associated with the color (768, 768). Consequently, in image 104, knowing the color of a pixel, it is possible to find its coordinates (i, j) in the reference frame by a simple rule of three. It is therefore understood that in scene 80, because of the choice of the position of the projectors 82 and 84 with respect to the objects 32 and 30 and of the color coding in the projected image 104, each point of the lit object is assigned a color that depends on the position it occupied in image 44.
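The rule of three that recovers coordinates from a color can be sketched as follows (the function name is an assumption; division by 3 follows from the step-3 gradient described above):

```python
def decode_position(k, l):
    """Recover the image-44 coordinates encoded by a color (k, l):
    with 3 color values per pixel, the coordinate is the color value
    divided by 3.  Smoothed colors yield fractional (sub-pixel)
    coordinates."""
    return (k / 3.0, l / 3.0)

print(decode_position(768, 768))  # -> (256.0, 256.0)
```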
- The module 20 then proceeds to a step 116 of synthesizing an image 118 (FIG. 9) filmed by the camera 40 of the scene 80.
- The image 118 is identical to the image 46 except that the color of the different pixels of an object is a function of the position of the corresponding point of the object in image 44. In particular, the positions of objects 30 and 32 are identical to the positions of these same objects in image 46.
- The sub-module 48 performs a color smoothing operation 119; thus, the pixels of image 118 can be assigned a color intermediate between those of the pixels 106 of image 104. This is made possible by the fact that the number of possible colors is greater than the number of pixels in the image. From then on, it is possible to measure displacements with sub-pixel accuracy, as will be understood on reading the following description.
- The module 50 calculates, during a step 120, the field of displacement vectors. For this purpose, it notes, during an operation 122, the position and the color of the pixels of the image 118. For example, it notes that the position of a pixel 120 (FIG. 9) is (225; 128) and that its color is (599; 384). Then, during an operation 124, it deduces the position of this pixel in the image 44 from the noted color. For example, for pixel 120 it deduces that the color (599; 384) corresponds to the position (199.6666; 128) in the coordinate system i, j. Finally, during an operation 126, it determines the displacement vector of each pixel by subtracting the initial position deduced during operation 124 from the noted position.
- the displacement vector is equal to (25.3333; 0).
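Operations 122 to 126 for one pixel can be reproduced with the numerical example above (the function name is an illustrative assumption):

```python
def displacement_vector(noted_pos, noted_color):
    """Sketch of operations 124-126: decode the initial position in
    image 44 from the color (value / 3 per the rule of three), then
    subtract it from the position noted in image 118."""
    (i, j), (k, l) = noted_pos, noted_color
    return (i - k / 3.0, j - l / 3.0)

# the pixel noted at (225; 128) with color (599; 384) in image 118
d = displacement_vector((225, 128), (599, 384))
print(d)  # -> approximately (25.3333, 0.0)
```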
- the operations 122 to 126 are repeated for all the pixels of each object which are illuminated by one of the projectors of the scene 80.
- the module 52 constructs, during a step 130, intermediate images representing the objects 30 and 32 at intermediate times between the times T and T + 1.
- the module 52 implements a temporal interpolation process and uses the displacement vector field calculated during step 120.
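A minimal per-pixel interpolation can be sketched as below. This is an illustrative linear scheme, not the method of EP 0 294 282 B1:

```python
def intermediate_position(pos_T1, disp, t):
    """Position of a pixel at time T + t (0 <= t <= 1): move it back
    along its displacement vector from its position at T + 1, so that
    t = 0 gives its position at T and t = 1 its position at T + 1."""
    return tuple(p - (1.0 - t) * d for p, d in zip(pos_T1, disp))

print(intermediate_position((225, 128), (25.0, 0.0), 0.5))  # -> (212.5, 128.0)
```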
- The module 52 builds at least one intermediate image, and preferably more than two, five or ten intermediate images, between times T and T + 1.
- Step 130 of constructing an intermediate image by temporal interpolation is faster than the synthesis of an image by the module 20. Consequently, the above method has the advantage of being faster than known image synthesis methods.
- Known image synthesis methods synthesize, using the module 20, each of the images of a temporal sequence of images. Consequently, for a temporal sequence of seven images, the module 20 would be executed seven times.
- In the method of FIG. 6, on the contrary, the module 20 is activated only three times, to generate images 44, 46 and 118 respectively, while the five intermediate images of the temporal sequence are constructed by temporal interpolation. It was thus measured that the total time required to build this temporal sequence of seven images using the method of FIG. 6 is much less than the time required to construct a temporal sequence of seven images using known image synthesis methods.
- The method has been described in the particular case where only one image 118 is constructed to determine the field of displacement vectors.
- The method of FIG. 6 is modified to construct one image in which the colors of the pixels indicate the position of the points of the objects at the instant T and another image in which the colors of the pixels indicate the position of the points of the objects at time T + 1.
- Step 74 is then performed once by taking as the starting image the image taken at an instant T and as the arriving image the image taken at an instant T + 1, and a second time by taking as the starting image the image taken at time T + 1 and as the arriving image the image taken at time T.
- step 120 is executed for these two images so as to obtain two displacement vector fields and step 130 is preceded by a step of calculating the weighted average of these two displacement vector fields.
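The combination of the two fields can be sketched as follows; equal weights are an assumption, as the description only specifies a weighted average:

```python
def average_fields(forward, backward, w=0.5):
    """Per-pixel weighted average of the forward (T -> T+1) field and
    the negated backward (T+1 -> T) field, both given as dicts mapping
    a pixel position to a displacement vector."""
    return {p: tuple(w * f + (1.0 - w) * (-b)
                     for f, b in zip(forward[p], backward[p]))
            for p in forward}

# a forward vector (2, 0) and a backward vector (-2, 0) agree exactly
print(average_fields({(0, 0): (2.0, 0.0)}, {(0, 0): (-2.0, 0.0)}))
# -> {(0, 0): (2.0, 0.0)}
```

Negating the backward field expresses it in the same direction as the forward field before averaging; where the two estimates agree, the average leaves them unchanged.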
- Computing the displacement vector field once in one direction, that is to say from the instant T toward the instant T + 1, and once in the other direction, that is to say from time T + 1 toward time T, improves the accuracy of the calculation of the displacement vector field.
Abstract
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/584,645 US7379565B2 (en) | 2004-01-06 | 2004-12-30 | Method and system for determining displacement of a pixel, and recording medium therefor |
DE602004009219T DE602004009219T2 (de) | 2004-01-06 | 2004-12-30 | Verfahren und system zur bestimmung der verschiebung eines pixels und aufzeichnungsmedium dafür |
EP04816493A EP1702472B1 (fr) | 2004-01-06 | 2004-12-30 | Procede et systeme de determination du deplacement d un pixe l, et support d enregistrement pour la mise en oeuvre du pro cede |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR0400063A FR2864878B1 (fr) | 2004-01-06 | 2004-01-06 | Procede et systeme de determination du deplacement d'un pixel, et support d'enregistrement pour la mise en oeuvre du procede |
FR0400063 | 2004-01-06 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2005076628A2 true WO2005076628A2 (fr) | 2005-08-18 |
WO2005076628A3 WO2005076628A3 (fr) | 2005-11-17 |
Family
ID=34673849
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/FR2004/003417 WO2005076628A2 (fr) | 2004-01-06 | 2004-12-30 | Procede et systeme de determination du deplacement d'un pixel, et support d'enregistrement pour la mise en oeuvre du procede |
Country Status (5)
Country | Link |
---|---|
US (1) | US7379565B2 (fr) |
EP (1) | EP1702472B1 (fr) |
DE (1) | DE602004009219T2 (fr) |
FR (1) | FR2864878B1 (fr) |
WO (1) | WO2005076628A2 (fr) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8068665B2 (en) * | 2005-05-10 | 2011-11-29 | Kabushiki Kaisha Toshiba | 3D-image processing apparatus, 3D-image processing method, storage medium, and program |
US20070058846A1 (en) * | 2005-05-10 | 2007-03-15 | Satoru Ohishi | Three-dimensional image processing apparatus, three-dimensional image processing method and control program used in three-dimensional image processing apparatus |
JP5075757B2 (ja) * | 2008-08-05 | 2012-11-21 | オリンパス株式会社 | 画像処理装置、画像処理プログラム、画像処理方法、および電子機器 |
KR101627185B1 (ko) * | 2009-04-24 | 2016-06-03 | 삼성전자 주식회사 | 영상촬영장치의 제어방법 |
JP6143469B2 (ja) * | 2013-01-17 | 2017-06-07 | キヤノン株式会社 | 情報処理装置、情報処理方法及びプログラム |
FR3032104B1 (fr) * | 2015-02-02 | 2017-02-10 | Institut De Rech Sur Les Cancers De Lappareil Digestif Ircad | Procede de determination du decalage entre axes median et optique d'un endoscope |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6141041A (en) * | 1998-06-22 | 2000-10-31 | Lucent Technologies Inc. | Method and apparatus for determination and visualization of player field coverage in a sporting event |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5214751A (en) * | 1987-06-04 | 1993-05-25 | Thomson Grand Public | Method for the temporal interpolation of images and device for implementing this method |
FR2645383B1 (fr) * | 1989-03-31 | 1997-06-27 | Thomson Consumer Electronics | Procede et dispositif d'interpolation temporelle d'images, a compensation de mouvement corrigee |
US5613048A (en) * | 1993-08-03 | 1997-03-18 | Apple Computer, Inc. | Three-dimensional image synthesis using view interpolation |
FR2724033B1 (fr) * | 1994-08-30 | 1997-01-03 | Thomson Broadband Systems | Procede de generation d'image de synthese |
JP3907891B2 (ja) * | 1999-11-11 | 2007-04-18 | 富士フイルム株式会社 | 画像撮像装置及び画像処理装置 |
-
2004
- 2004-01-06 FR FR0400063A patent/FR2864878B1/fr not_active Expired - Fee Related
- 2004-12-30 US US10/584,645 patent/US7379565B2/en active Active
- 2004-12-30 EP EP04816493A patent/EP1702472B1/fr not_active Expired - Fee Related
- 2004-12-30 WO PCT/FR2004/003417 patent/WO2005076628A2/fr active IP Right Grant
- 2004-12-30 DE DE602004009219T patent/DE602004009219T2/de active Active
Also Published As
Publication number | Publication date |
---|---|
EP1702472B1 (fr) | 2007-09-26 |
FR2864878B1 (fr) | 2006-04-14 |
EP1702472A2 (fr) | 2006-09-20 |
FR2864878A1 (fr) | 2005-07-08 |
DE602004009219D1 (de) | 2007-11-08 |
US20070076919A1 (en) | 2007-04-05 |
US7379565B2 (en) | 2008-05-27 |
WO2005076628A3 (fr) | 2005-11-17 |
DE602004009219T2 (de) | 2008-06-26 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2004816493 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2007076919 Country of ref document: US Ref document number: 10584645 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Country of ref document: DE |
|
WWP | Wipo information: published in national office |
Ref document number: 2004816493 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 10584645 Country of ref document: US |
|
WWG | Wipo information: grant in national office |
Ref document number: 2004816493 Country of ref document: EP |