WO2013184340A1 - Coordinate measurement machines with removable accessories - Google Patents
- Publication number: WO2013184340A1 (PCT/US2013/041826)
- Authority: WIPO (PCT)
- Prior art keywords: pattern, structured light, AACMM, dimensional, coded
Classifications
- G01B11/005 — Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates; coordinate measuring machines
- G01B11/007 — Coordinate measuring machines; feeler heads therefor
- G01B11/2509 — Measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object; color coding
- G01B11/2513 — Measuring contours or curvatures by projecting a pattern with several lines being projected in more than one direction, e.g. grids, patterns
- G01B5/008 — Measuring arrangements characterised by the use of mechanical techniques for measuring coordinates of points using coordinate measuring machines
Definitions
- The present disclosure relates to a coordinate measuring machine, and more particularly to a portable articulated arm coordinate measuring machine (AACMM) having a connector on a probe end that allows accessory devices which use structured light for non-contact three-dimensional measurement to be removably connected to the coordinate measuring machine.
- Portable articulated arm coordinate measuring machines have found widespread use in the manufacturing or production of parts where there is a need to rapidly and accurately verify the dimensions of the part during various stages of manufacturing or production (e.g., machining).
- Portable AACMMs represent a vast improvement over known stationary, fixed, cost-intensive, and relatively difficult-to-use measurement installations, particularly in the amount of time it takes to perform dimensional measurements of relatively complex parts.
- A user of a portable AACMM simply guides a probe along the surface of the part or object to be measured. The measurement data are then recorded and provided to the user.
- The data are provided to the user in visual form, for example in three-dimensional (3-D) form on a computer screen.
- An example of a prior art portable articulated arm CMM is disclosed in commonly assigned U.S. Patent No. 5,402,582 (the '582 patent).
- The '582 patent discloses a 3-D measuring system comprising a manually operated articulated arm CMM having a support base on one end and a measurement probe at the other end.
- Commonly assigned U.S. Patent No. 5,611,147 (the '147 patent) discloses a similar articulated arm CMM.
- The articulated arm CMM of the '147 patent includes a number of features, including an additional rotational axis at the probe end, thereby providing for an arm with either a two-two-two or a two-two-three axis configuration (the latter being a seven-axis arm).
- Three-dimensional surfaces may be measured using non-contact techniques as well.
- One type of non-contact device, sometimes referred to as a laser line probe or laser line scanner, emits laser light either as a spot or along a line.
- An imaging device, such as a charge-coupled device (CCD) for example, is positioned adjacent to the laser.
- The laser is arranged to emit a line of light which is reflected off the surface.
- The surface of the object being measured causes a diffuse reflection which is captured by the imaging device.
- The image of the reflected line on the sensor changes as the distance between the sensor and the surface changes.
- Triangulation methods may then be used to measure three-dimensional coordinates of points on the surface.
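The triangulation step can be sketched numerically. In the simplest geometry, the laser and camera are separated by a known baseline and the camera has a known focal length, so depth follows from similar triangles: the imaged spot shifts laterally on the sensor as the surface moves. The baseline, focal length, and offsets below are illustrative values, not parameters from the patent:

```python
def triangulate_depth(baseline_mm, focal_len_px, offset_px):
    """Similar-triangles depth recovery for a laser/camera pair.

    baseline_mm  -- distance between the laser emitter and the camera center
    focal_len_px -- camera focal length expressed in pixels
    offset_px    -- lateral position of the imaged laser spot on the sensor
    """
    if offset_px == 0:
        raise ValueError("spot at vanishing point; depth is unbounded")
    return baseline_mm * focal_len_px / offset_px

# A surface twice as far away images the laser spot at half the offset:
print(triangulate_depth(50.0, 1000.0, 20.0))  # 2500.0
print(triangulate_depth(50.0, 1000.0, 10.0))  # 5000.0
```

This inverse relationship between sensor offset and depth is what lets the image of the reflected line encode the surface profile.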
- One issue that arises with laser line probes is that the density of measured points may vary with the speed at which the probe is moved across the surface of the object: the faster the probe is moved, the greater the distance between points and the lower the point density.
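The speed/density trade-off above is a single division: the probe advances by (sweep speed ÷ line rate) between successive captures. The speeds and line rate below are hypothetical figures for illustration:

```python
def line_spacing_mm(scan_speed_mm_per_s, line_rate_hz):
    """Spacing between successive scan lines along the sweep direction:
    the probe travels scan_speed / line_rate between captures."""
    return scan_speed_mm_per_s / line_rate_hz

# Doubling the sweep speed doubles the gap between measured lines:
print(line_spacing_mm(50.0, 100.0))   # 0.5
print(line_spacing_mm(100.0, 100.0))  # 1.0
```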
- With structured light, in contrast, the point spacing is typically uniform in each of the two dimensions, thereby generally providing uniform measurement of workpiece surface points.
- In accordance with an aspect of the invention, a portable articulated arm coordinate measuring machine (AACMM) is provided for measuring three-dimensional coordinates of an object in space.
- The AACMM includes a base.
- A manually positionable arm portion is provided having opposed first and second ends, the arm portion being rotationally coupled to the base and including a plurality of connected arm segments, each arm segment including at least one position transducer for producing a position signal.
- An electronic circuit is provided that receives the position signal from the at least one position transducer in each arm segment.
- A probe end is coupled to the first end.
- A noncontact three-dimensional measuring device is coupled to the probe end, the device having a projector and an image sensor. The projector has a source plane and is configured to emit structured light onto the object, the structured light located on the source plane and including at least three non-collinear pattern elements; the image sensor is arranged to receive the structured light reflected from the object.
- A processor is electrically coupled to the electronic circuit, the processor configured to determine the three-dimensional coordinates of a point on the object in response to receiving the position signals from the position transducers and in response to the structured light received by the image sensor.
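Combining the two data streams amounts to a frame transformation: the transducer signals yield a pose for the measuring device, and each point triangulated in the scanner's own frame is mapped through that pose into the machine's base frame. A minimal sketch with NumPy; the 4x4 pose matrix here is a stand-in for whatever the arm's kinematics actually produce, and the numbers are illustrative:

```python
import numpy as np

def scanner_point_to_base(pose_base_scanner, point_scanner):
    """Map a point measured in the scanner frame into the AACMM base frame.

    pose_base_scanner -- 4x4 homogeneous transform (rotation + translation)
    point_scanner     -- (x, y, z) in the scanner's coordinate system
    """
    p = np.append(np.asarray(point_scanner, dtype=float), 1.0)
    return (np.asarray(pose_base_scanner, dtype=float) @ p)[:3]

# Scanner held 500 mm above the base with aligned axes:
pose = np.eye(4)
pose[2, 3] = 500.0
base_point = scanner_point_to_base(pose, (10.0, 0.0, 25.0))
# base_point is (10, 0, 525) in the base frame
```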
- In accordance with another aspect of the invention, a method of measuring an object is provided. The method includes providing a manually positionable arm portion having opposed first and second ends, the arm portion including a plurality of connected arm segments, each arm segment including at least one position transducer for producing a position signal.
- A probe end is provided for measuring the object, the probe end being coupled to the first end.
- An electronic circuit receives the position signals from the transducers.
- A three-dimensional noncontact measurement device is provided having a controller, a sensor, and a projector, the projector configured to emit structured light onto the object, the projector having a source plane, the structured light located on the source plane and including at least three non-collinear pattern elements.
- A structured light is projected from the three-dimensional measurement device onto the object.
- FIG. 1, including FIGs. 1A and 1B, are perspective views of a portable articulated arm coordinate measuring machine (AACMM) having embodiments of various aspects of the present invention therewithin;
- FIG. 2 is a block diagram of electronics utilized as part of the AACMM of FIG. 1 in accordance with an embodiment
- FIG. 3 is a block diagram describing detailed features of the electronic data processing system of FIG. 2 in accordance with an embodiment
- FIG. 4 is an isometric view of the probe end of the AACMM of FIG. 1;
- FIG. 5 is a side view of the probe end of FIG. 4 with the handle being coupled thereto;
- FIG. 6 is a side view of the probe end of FIG. 4 with the handle attached;
- FIG. 7 is an enlarged partial side view of the interface portion of the probe end of FIG. 6;
- FIG. 8 is another enlarged partial side view of the interface portion of the probe end of FIG. 5;
- FIG. 9 is an isometric view partially in section of the handle of FIG. 4;
- FIG. 10 is an isometric view of the probe end of the AACMM of FIG. 1 with a structured light device having a single camera attached;
- FIG. 11 is an isometric view partially in section of the device of FIG. 10;
- FIG. 12 is an isometric view of the probe end of the AACMM of FIG. 1 with another structured light device having dual cameras attached;
- FIG. 13A and FIG. 13B are schematic views illustrating the operation of the device of FIG. 10 when attached to the probe end of the AACMM of FIG. 1;
- FIGS. 14-17 are sequential projections having an uncoded binary pattern that may be emitted by the structured light device of FIG. 10 or FIG. 12, in accordance with an embodiment of the present invention;
- FIGS. 18-19 are spatially varying color coded patterns that may be emitted by the structured light device of FIG. 10 or FIG. 12, in accordance with an embodiment of the invention.
- FIGS. 20-23 are strip index coded patterns that may be emitted by the structured light device of FIG. 10 or FIG. 12, in accordance with an embodiment of the invention.
- FIGS. 24-31 are two-dimensional grid patterns that may be emitted by the structured light device of FIG. 10 or FIG. 12, in accordance with an embodiment of the invention.
- FIG. 32 is a schematic illustration of a photometric technique for acquiring patterns of structured light under a plurality of lighting conditions; and
- FIG. 33 is an illustration of a structured light scanner device independently operable from an AACMM in accordance with another embodiment of the invention.
- Embodiments of the present invention provide advantages in allowing an operator to easily and quickly couple accessory devices that use structured light to a probe end of the AACMM, providing for the non-contact measuring of a three-dimensional object.
- Embodiments of the present invention provide further advantages in providing for communicating data representing a point cloud measured by the structured light device within the AACMM.
- Embodiments of the present invention provide advantages in greater uniformity in the distribution of measured points that may provide enhanced accuracy.
- Embodiments of the present invention provide still further advantages in providing power and data communications to a removable accessory without having external connections or wiring.
- As used herein, the term structured light refers to a two-dimensional pattern of light projected onto a continuous and enclosed area of an object that conveys information which may be used to determine coordinates of points on the object.
- A structured light pattern will contain at least three non-collinear pattern elements disposed within the contiguous and enclosed area. Each of the three non-collinear pattern elements conveys information which may be used to determine the point coordinates.
- A coded light pattern is one in which the three-dimensional coordinates of an illuminated surface of the object may be ascertained by the acquisition of a single image.
- With a coded light pattern, the projecting device may be moving relative to the object.
- With a coded light pattern, there will be no significant temporal relationship between the projected pattern and the acquired image.
- A coded light pattern will contain a set of elements (e.g., geometric shapes) arranged so that at least three of the elements are non-collinear.
- The set of elements may be arranged into collections of lines.
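One standard way to build such a coded pattern of lines, offered here as an illustration rather than as the patent's prescribed coding, is a De Bruijn color sequence: stripes are colored so that every run of n adjacent stripes is unique, which lets a single image identify each stripe. The stripe colors and window length below are assumptions:

```python
def de_bruijn(k, n):
    """De Bruijn sequence B(k, n): every length-n window over a k-symbol
    alphabet appears exactly once (cyclically). Classic Lyndon-word method."""
    a = [0] * (k * n)
    seq = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return seq

COLORS = ["red", "green", "blue"]
stripes = [COLORS[s] for s in de_bruijn(3, 3)]  # 27 colored stripes

# Any 3 adjacent stripe colors identify the stripe index uniquely, so one
# image suffices to match projected and observed stripes:
index_of = {tuple(stripes[(i + j) % 27] for j in range(3)): i for i in range(27)}
```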
- An uncoded structured light pattern, as used herein, is a pattern that does not allow measurement through a single pattern when the projector is moving relative to the object.
- An example of an uncoded light pattern is one which requires a series of sequential patterns and thus the acquisition of a series of sequential images. Due to the temporal nature of the projection pattern and acquisition of the images, there should be no relative movement between the projector and the object.
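A typical sequential scheme of this kind, shown here as an illustrative sketch rather than the patent's specific method, is a set of Gray-code bit planes: each projected frame turns stripe columns on or off, and a pixel's bright/dark history across the frames identifies which projector column illuminated it. The column count is a hypothetical value:

```python
def gray_code_patterns(num_columns):
    """One boolean stripe pattern per bit plane, MSB first; column c is
    bright in frame b iff bit b of the Gray code of c is set."""
    bits = max(1, (num_columns - 1).bit_length())
    gray = [c ^ (c >> 1) for c in range(num_columns)]
    return [[(g >> (bits - 1 - b)) & 1 == 1 for g in gray] for b in range(bits)]

def decode_column(on_off_history):
    """Recover the projector column from a pixel's bright/dark sequence."""
    g = 0
    for bright in on_off_history:
        g = (g << 1) | int(bright)
    b = 0
    while g:          # Gray -> binary: XOR of all right shifts
        b ^= g
        g >>= 1
    return b

patterns = gray_code_patterns(8)            # 3 sequential frames for 8 columns
history = [frame[5] for frame in patterns]  # what a pixel on column 5 observes
print(decode_column(history))               # 5
```

Because decoding needs all frames of the sequence, the projector and object must not move relative to each other during acquisition, which is exactly the limitation the passage above describes.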
- Structured light is different from the light projected by a laser line probe or laser line scanner type device, which generates a line of light.
- Although laser line probes used with articulated arms today may generate lines having irregularities or other aspects that may be regarded as features, these features are disposed in a collinear arrangement. Consequently, such features within a single generated line are not considered to make the projected light into structured light.
- FIGs. 1A and 1B illustrate, in perspective, an AACMM 100 according to various embodiments of the present invention, an articulated arm being one type of coordinate measuring machine.
- The exemplary AACMM 100 may comprise a six or seven axis articulated measurement device having a probe end 401 that includes a measurement probe housing 102 coupled to an arm portion 104 of the AACMM 100 at one end.
- The arm portion 104 comprises a first arm segment 106 coupled to a second arm segment 108 by a first grouping of bearing cartridges 110 (e.g., two bearing cartridges).
- A second grouping of bearing cartridges 112 couples the second arm segment 108 to the measurement probe housing 102.
- A third grouping of bearing cartridges 114 couples the first arm segment 106 to a base 116 located at the other end of the arm portion 104 of the AACMM 100.
- Each grouping of bearing cartridges 110, 112, 114 provides for multiple axes of articulated movement.
- The probe end 401 may include a measurement probe housing 102 that comprises the shaft of the seventh axis portion of the AACMM 100 (e.g., a cartridge containing an encoder system that determines movement of the measurement device, for example a probe 118, in the seventh axis of the AACMM 100). In this embodiment, the probe end 401 may rotate about an axis extending through the center of the measurement probe housing 102.
- The base 116 is typically affixed to a work surface.
- Each bearing cartridge within each bearing cartridge grouping 110, 112, 114 typically contains an encoder system (e.g., an optical angular encoder system).
- The encoder system (i.e., transducer) produces a position signal for the corresponding axis of rotation.
- The arm segments 106, 108 may be made from a suitably rigid material, such as but not limited to a carbon composite material.
- A portable AACMM 100 with six or seven axes of articulated movement provides advantages in allowing the operator to position the probe 118 in a desired location within a 360° area about the base 116 while providing an arm portion 104 that may be easily handled by the operator.
- An arm portion 104 having two arm segments 106, 108 is for exemplary purposes, and the claimed invention should not be so limited.
- An AACMM 100 may have any number of arm segments coupled together by bearing cartridges (and, thus, more or fewer than six or seven axes of articulated movement or degrees of freedom).
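The probe position follows from the encoder readings by standard forward kinematics: compose one rotation per encoder axis with one translation per segment. The joint conventions (swivel about z, hinge about x, segment along local z) and segment lengths below are simplified assumptions for illustration, not the machine's actual geometry:

```python
import numpy as np

def joint_transform(angle_rad, axis, segment_mm):
    """4x4 transform for one axis: translate along z by the segment length,
    then rotate the child frame about `axis` ('z' swivel, 'x' hinge)."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    T = np.eye(4)
    if axis == "z":
        T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    else:  # 'x' hinge
        T[:3, :3] = [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]
    T[2, 3] = segment_mm
    return T

def probe_pose(joints):
    """Compose base -> probe; `joints` is a list of (angle, axis, length),
    one entry per encoder axis."""
    T = np.eye(4)
    for angle, axis, length in joints:
        T = T @ joint_transform(angle, axis, length)
    return T

# With every encoder at zero the arm is straight: the probe sits at the
# summed segment length along the base z axis, i.e., at (0, 0, 450) mm here.
T = probe_pose([(0.0, "z", 100.0), (0.0, "x", 200.0), (0.0, "x", 150.0)])
```

Bending a hinge redirects every segment beyond it, which is why each axis needs its own transducer: the probe position depends on all joint angles jointly, not on any one of them alone.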
- The probe 118 is detachably mounted to the measurement probe housing 102, which is connected to bearing cartridge grouping 112.
- A handle 126 is removable with respect to the measurement probe housing 102 by way of, for example, a quick-connect interface.
- The handle 126 may be replaced with another device configured to emit structured light to provide non-contact measurement of three-dimensional objects, thereby providing advantages in allowing the operator to make both contact and non-contact measurements with the same AACMM 100.
- The probe housing 102 houses a removable probe 118, which is a contacting measurement device and may have different tips 118 that physically contact the object to be measured, including, but not limited to: ball, touch-sensitive, curved and extension type probes.
- The measurement may also be performed by a non-contacting device such as a coded structured light scanner device.
- The handle 126 is replaced with the coded structured light scanner device using the quick-connect interface.
- Other types of measurement devices may replace the removable handle 126 to provide additional functionality. Examples of such measurement devices include, but are not limited to, one or more illumination lights, a temperature sensor, a thermal scanner, a bar code scanner, a projector, a paint sprayer, a camera, or the like.
- The AACMM 100 includes the removable handle 126, which provides advantages in allowing accessories or functionality to be changed without removing the measurement probe housing 102 from the bearing cartridge grouping 112.
- The removable handle 126 may also include an electrical connector that allows electrical power and data to be exchanged with the handle 126 and the corresponding electronics located in the probe end 401.
- Each grouping of bearing cartridges 110, 112, 114 allows the arm portion 104 of the AACMM 100 to move about multiple axes of rotation.
- Each bearing cartridge grouping 110, 112, 114 includes corresponding encoder systems, such as optical angular encoders for example, that are each arranged coaxially with the corresponding axis of rotation of, e.g., the arm segments 106, 108.
- The optical encoder system detects rotational (swivel) or transverse (hinge) movement of, e.g., each one of the arm segments 106, 108 about the corresponding axis and transmits a signal to an electronic data processing system within the AACMM 100, as described in more detail herein below.
- Each individual raw encoder count is sent separately to the electronic data processing system as a signal, where it is further processed into measurement data.
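Turning a raw encoder count into an angle is a single scaling step by the encoder's resolution. The counts-per-revolution figure below is a hypothetical 16-bit resolution, not a value taken from the patent:

```python
import math

def counts_to_radians(raw_count, counts_per_rev=65536):
    """Convert a raw angular-encoder count to an angle in radians.

    counts_per_rev -- encoder resolution (hypothetical 16-bit value here);
    the modulo handles wraparound past a full revolution.
    """
    return (raw_count % counts_per_rev) * 2.0 * math.pi / counts_per_rev

# Half a revolution of a 16-bit encoder:
print(counts_to_radians(32768))  # 3.141592653589793
```

The downstream processing then feeds one such angle per axis into the positional calculation.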
- No position calculator separate from the AACMM 100 itself (e.g., a serial box such as that disclosed in U.S. Patent No. 5,402,582 ('582)) is required.
- The base 116 may include an attachment device or mounting device 120.
- The mounting device 120 allows the AACMM 100 to be removably mounted to a desired location, such as an inspection table, a machining center, a wall or the floor, for example.
- The base 116 includes a handle portion 122 that provides a convenient location for the operator to hold the base 116 as the AACMM 100 is being moved.
- The base 116 further includes a movable cover portion 124 that folds down to reveal a user interface, such as a display screen.
- The base 116 of the portable AACMM 100 contains or houses an electronic circuit having an electronic data processing system that includes two primary components: a base processing system that processes the data from the various encoder systems within the AACMM 100, as well as data representing other arm parameters, to support three-dimensional (3-D) positional calculations; and a user interface processing system that includes an on-board operating system, a touch screen display, and resident application software that allows for relatively complete metrology functions to be implemented within the AACMM 100 without the need for connection to an external computer.
- The electronic data processing system in the base 116 may communicate with the encoder systems, sensors, and other peripheral hardware located away from the base 116 (e.g., a structured light device that can be mounted to the removable handle 126 on the AACMM 100).
- The electronics that support these peripheral hardware devices or features may be located in each of the bearing cartridge groupings 110, 112, 114 located within the portable AACMM 100.
- FIG. 2 is a block diagram of electronics utilized in an AACMM 100 in accordance with an embodiment.
- The embodiment shown in FIG. 2A includes an electronic data processing system 210 including a base processor board 204 for implementing the base processing system, a user interface board 202, a base power board 206 for providing power, a Bluetooth module 232, and a base tilt board 208.
- The user interface board 202 includes a computer processor for executing application software to perform user interface, display, and other functions described herein.
- Each encoder system generates encoder data and includes: an encoder arm bus interface 214, an encoder digital signal processor (DSP) 216, an encoder read head interface 234, and a temperature sensor 212.
- Other devices, such as strain sensors, may be attached to the arm bus 218.
- Probe end electronics 230 are in communication with the arm bus 218.
- The probe end electronics 230 include a probe end DSP 228, a temperature sensor 212, a handle/device interface bus 240 that connects with the handle 126 or the coded structured light scanner device 242 via the quick-connect interface in an embodiment, and a probe interface 226.
- The quick-connect interface allows access by the handle 126 to the data bus, control lines, and power bus used by the coded structured light scanner device 242 and other accessories.
- The probe end electronics 230 are located in the measurement probe housing 102 on the AACMM 100.
- The handle 126 may be removed from the quick-connect interface and measurement may be performed by the structured light device 242 communicating with the probe end electronics 230 of the AACMM 100 via the interface bus 240.
- The electronic data processing system 210 is located in the base 116 of the AACMM 100, the probe end electronics 230 are located in the measurement probe housing 102 of the AACMM 100, and the encoder systems are located in the bearing cartridge groupings 110, 112, 114.
- The probe interface 226 may connect with the probe end DSP 228 by any suitable communications protocol, including commercially available products from Maxim Integrated Products, Inc. that embody the 1-Wire® communications protocol 236.
- FIG. 3 is a block diagram describing detailed features of the electronic data processing system 210 of the AACMM 100 in accordance with an embodiment.
- The electronic data processing system 210 is located in the base 116 of the AACMM 100 and includes the base processor board 204, the user interface board 202, a base power board 206, a Bluetooth module 232, and a base tilt module 208.
- The base processor board 204 includes the various functional blocks illustrated therein.
- A base processor function 302 is utilized to support the collection of measurement data from the AACMM 100 and receives raw arm data (e.g., encoder system data) via the arm bus 218 and a bus control module function 308.
- The memory function 304 stores programs and static arm configuration data.
- The base processor board 204 also includes an external hardware option port function 310 for communicating with any external hardware devices or accessories, such as a coded structured light scanner device 242.
- A real time clock (RTC) and log 306, a battery pack interface (IF) 316, and a diagnostic port 318 are also included in the functionality of the embodiment of the base processor board 204 depicted in FIG. 3A.
- The base processor board 204 also manages all the wired and wireless data communication with external (host computer) and internal (display processor 202) devices.
- The base processor board 204 has the capability of communicating with an Ethernet network via an Ethernet function 320 (e.g., using a clock synchronization standard such as Institute of Electrical and Electronics Engineers (IEEE) 1588), with a wireless local area network (WLAN) via a LAN function 322, and with the Bluetooth module 232 via a parallel to serial communications (PSC) function 314.
- The base processor board 204 also includes a connection to a universal serial bus (USB) device 312.
- The base processor board 204 transmits and collects raw measurement data (e.g., encoder system counts, temperature readings) for processing into measurement data without the need for any preprocessing, such as the preprocessing performed by the serial box of the aforementioned '582 patent.
- The base processor 204 sends the processed data to the display processor 328 on the user interface board 202 via an RS485 interface (IF) 326. In an embodiment, the base processor 204 also sends the raw measurement data to an external computer.
- The angle and positional data received by the base processor is utilized by applications executing on the display processor 328 to provide an autonomous metrology system within the AACMM 100.
- Applications may be executed on the display processor 328 to support functions such as, but not limited to: measurement of features, guidance and training graphics, remote diagnostics, temperature corrections, control of various operational features, connection to various networks, and display of measured objects.
- The user interface board 202 includes several interface options, including a secure digital (SD) card interface 330, a memory 332, a USB Host interface 334, a diagnostic port 336, a camera port 340, an audio/video interface 342, a dial-up/cell modem 344 and a global positioning system (GPS) port 346.
- the electronic data processing system 210 shown in FIG. 3A also includes a base power board 206 with an environmental recorder 362 for recording environmental data.
- the base power board 206 also provides power to the electronic data processing system 210 using an AC/DC converter 358 and a battery charger control 360.
- the base power board 206 communicates with the base processor board 204 using inter-integrated circuit (I2C) serial single ended bus 354 as well as via a DMA serial peripheral interface (DSPI) 357.
- the base power board 206 is connected to a tilt sensor and radio frequency identification (RFID) module 208 via an input/output (I/O) expansion function 364 implemented in the base power board 206.
- all or a subset of the components may be physically located in different locations and/or functions combined in different manners than that shown in FIG. 3A and FIG. 3B.
- the base processor board 204 and the user interface board 202 are combined into one physical board.
- the device 400 includes an enclosure 402 that includes a handle portion 404 that is sized and shaped to be held in an operator's hand, such as in a pistol grip for example.
- the enclosure 402 is a thin wall structure having a cavity 406 (FIG. 9).
- the cavity 406 is sized and configured to receive a controller 408.
- the controller 408 may be a digital circuit, having a microprocessor for example, or an analog circuit.
- the controller 408 is in asynchronous bidirectional communication with the electronic data processing system 210 (FIGs. 2 and 3).
- the communication connection between the controller 408 and the electronic data processing system 210 may be wired (e.g. via controller 420) or may be a direct or indirect wireless connection (e.g. Bluetooth or IEEE 802.11) or a combination of wired and wireless connections.
- the enclosure 402 is formed in two halves 410, 412, such as from an injection molded plastic material for example.
- the halves 410, 412 may be secured together by fasteners, such as screws 414 for example.
- the enclosure halves 410, 412 may be secured together by adhesives or ultrasonic welding for example.
- the handle portion 404 also includes buttons or actuators 416, 418 that may be manually activated by the operator.
- the actuators 416, 418 are coupled to the controller 408 that transmits a signal to a controller 420 within the probe housing 102.
- the actuators 416, 418 perform the functions of actuators 422, 424 located on the probe housing 102 opposite the device 400.
- the device 400 may have additional switches, buttons or other actuators that may also be used to control the device 400, the AACMM 100 or vice versa.
- the device 400 may include indicators, such as light emitting diodes (LEDs), sound generators, meters, displays or gauges for example.
- the device 400 may include a digital voice recorder that allows for synchronization of verbal comments with a measured point.
- the device 400 includes a microphone that allows the operator to transmit voice activated commands to the electronic data processing system 210.
- the handle portion 404 may be configured to be used with either operator hand or for a particular hand (e.g. left handed or right handed).
- the handle portion 404 may also be configured to facilitate operators with disabilities (e.g. operators with missing fingers or operators with prosthetic arms).
- the handle portion 404 may be removed and the probe housing 102 used by itself when clearance space is limited.
- the probe end 401 may also comprise the shaft of the seventh axis of AACMM 100. In this embodiment the device 400 may be arranged to rotate about the AACMM seventh axis.
- the probe end 401 includes a mechanical and electrical interface 426 having a first connector 429 (FIG. 8) on the device 400 that cooperates with a second connector 428 on the probe housing 102.
- the connectors 428, 429 may include electrical and mechanical features that allow for coupling of the device 400 to the probe housing 102.
- the interface 426 includes a first surface 430 having a mechanical coupler 432 and an electrical connector 434 thereon.
- the enclosure 402 also includes a second surface 436 positioned adjacent to and offset from the first surface 430.
- the second surface 436 is a planar surface offset a distance of approximately 0.5 inches from the first surface 430. This offset provides a clearance for the operator's fingers when tightening or loosening a fastener such as collar 438.
- the interface 426 provides for a relatively quick and secure electronic connection between the device 400 and the probe housing 102 without the need to align connector pins, and without the need for separate cables or connectors.
- the electrical connector 434 extends from the first surface 430 and includes one or more connector pins 440 that are electrically coupled in asynchronous bidirectional communication with the electronic data processing system 210 (FIGs. 2 and 3), such as via one or more arm buses 218 for example.
- the bidirectional communication connection may be wired (e.g. via arm bus 218), wireless (e.g. Bluetooth or IEEE 802.11), or a combination of wired and wireless connections.
- the electrical connector 434 is electrically coupled to the controller 420.
- the controller 420 may be in asynchronous bidirectional communication with the electronic data processing system 210 such as via one or more arm buses 218 for example.
- the electrical connector 434 is positioned to provide a relatively quick and secure electronic connection with electrical connector 442 on probe housing 102.
- the electrical connectors 434, 442 connect with each other when the device 400 is attached to the probe housing 102.
- the electrical connectors 434, 442 may each comprise a metal encased connector housing that provides shielding from electromagnetic interference as well as protecting the connector pins and assisting with pin alignment during the process of attaching the device 400 to the probe housing 102.
- the mechanical coupler 432 provides relatively rigid mechanical coupling between the device 400 and the probe housing 102 to support relatively precise applications in which the location of the device 400 on the end of the arm portion 104 of the AACMM 100 preferably does not shift or move. Any such movement may typically cause an undesirable degradation in the accuracy of the measurement result.
- the mechanical coupler 432 includes a first projection 444 positioned on one end 448 (the leading edge or "front" of the device 400).
- the first projection 444 may include a keyed, notched or ramped interface that forms a lip 446 that extends from the first projection 444.
- the lip 446 is sized to be received in a slot 450 defined by a projection 452 extending from the probe housing 102 (FIG. 8).
- the first projection 444 and the slot 450 along with the collar 438 form a coupler arrangement such that when the lip 446 is positioned within the slot 450, the slot 450 may be used to restrict both the longitudinal and lateral movement of the device 400 when attached to the probe housing 102.
- the rotation of the collar 438 may be used to secure the lip 446 within the slot 450.
- the mechanical coupler 432 may include a second projection 454.
- the second projection 454 may have a keyed, notched-lip or ramped interface surface 456 (FIG. 5).
- the second projection 454 is positioned to engage a fastener associated with the probe housing 102, such as collar 438 for example.
- the mechanical coupler 432 includes a raised surface projecting from surface 430 that is adjacent to or disposed about the electrical connector 434 and provides a pivot point for the interface 426 (FIGs. 7 and 8). This serves as the third of three points of mechanical contact between the device 400 and the probe housing 102 when the device 400 is attached thereto.
- the probe housing 102 includes a collar 438 arranged co-axially on one end.
- the collar 438 includes a threaded portion that is movable between a first position (FIG. 5) and a second position (FIG. 7).
- the collar 438 may be used to secure or remove the device 400 without the need for external tools.
- Rotation of the collar 438 moves the collar 438 along a relatively coarse, square-threaded cylinder 474.
- the use of such relatively large size, square-thread and contoured surfaces allows for significant clamping force with minimal rotational torque.
- the coarse pitch of the threads of the cylinder 474 further allows the collar 438 to be tightened or loosened with minimal rotation.
- the lip 446 is inserted into the slot 450 and the device is pivoted to rotate the second projection 454 toward surface 458 as indicated by arrow 464 (FIG. 5).
- the collar 438 is rotated causing the collar 438 to move or translate in the direction indicated by arrow 462 into engagement with surface 456.
- the movement of the collar 438 against the angled surface 456 drives the mechanical coupler 432 against the raised surface 460. This assists in overcoming potential issues with distortion of the interface or foreign objects on the surface of the interface that could interfere with the rigid seating of the device 400 to the probe housing 102.
- FIG. 5 includes arrows 462, 464 to show the direction of movement of the device 400 and the collar 438.
- the offset distance of the surface 436 of device 400 provides a gap 472 between the collar 438 and the surface 436 (FIG. 6).
- the gap 472 allows the operator to obtain a firmer grip on the collar 438 while reducing the risk of pinching fingers as the collar 438 is rotated.
- the probe housing 102 is of sufficient stiffness to reduce or prevent the distortion when the collar 438 is tightened.
- Embodiments of the interface 426 allow for the proper alignment of the mechanical coupler 432 and electrical connector 434 and also protects the electronics interface from applied stresses that may otherwise arise due to the clamping action of the collar 438, the lip 446 and the surface 456. This provides advantages in reducing or eliminating stress damage to circuit board 476 mounted electrical connectors 434, 442 that may have soldered terminals. Also, embodiments provide advantages over known approaches in that no tools are required for a user to connect or disconnect the device 400 from the probe housing 102. This allows the operator to manually connect and disconnect the device 400 from the probe housing 102 with relative ease.
- a relatively large number of functions may be shared between the AACMM 100 and the device 400.
- switches, buttons or other actuators located on the AACMM 100 may be used to control the device 400 or vice versa.
- commands and data may be transmitted from electronic data processing system 210 to the device 400.
- the device 400 is a video camera that transmits data of a recorded image to be stored in memory on the base processor 204 or displayed on the display 328.
- the device 400 is an image projector that receives data from the electronic data processing system 210.
- temperature sensors located in either the AACMM 100 or the device 400 may be shared by the other.
- embodiments of the present invention provide advantages in providing a flexible interface that allows a wide variety of accessory devices 400 to be quickly, easily and reliably coupled to the AACMM 100. Further, the capability of sharing functions between the AACMM 100 and the device 400 may allow a reduction in size, power consumption and complexity of the AACMM 100 by eliminating duplication.
- the controller 408 may alter the operation or functionality of the probe end 401 of the AACMM 100.
- the controller 408 may alter indicator lights on the probe housing 102 to either emit a different color light, a different intensity of light, or turn on/off at different times when the device 400 is attached versus when the probe housing 102 is used by itself.
- the device 400 includes a range finding sensor (not shown) that measures the distance to an object. In this
- the controller 408 may change indicator lights on the probe housing 102 in order to provide an indication to the operator of how far away the object is from the probe tip 118.
- the controller 408 may change the color of the indicator lights based on the quality of the image acquired by the coded structured light scanner device. This provides advantages in simplifying the requirements of controller 420 and allows for upgraded or increased functionality through the addition of accessory devices.
- embodiments of the present invention provide advantages in the projector, camera, signal processing, control and indicator interfaces for a non-contact three-dimensional measurement device 500.
- the device 500 includes a pair of optical devices, such as a light projector 508 and a camera 510, for example, that project a structured light pattern and receive a two-dimensional pattern that was reflected from an object 501.
- the device 500 uses triangulation-based methods based on the known emitted pattern and the acquired image to determine a point cloud representing the X, Y, Z coordinate data for the object 501 for each pixel of the received image.
- the structured light pattern is coded so that a single image is sufficient to determine the three-dimensional coordinates of object points. Such a coded structured light pattern may also be said to measure three-dimensional coordinates in a single shot.
- the projector 508 uses a visible light source that illuminates a pattern generator.
- the visible light source may be a laser, a
- the pattern generator is a chrome-on-glass slide having a structured light pattern etched thereon.
- the slide may have a single pattern or multiple patterns that move in and out of position as needed.
- the slide may be manually or automatically installed in the operating position.
- the source pattern may be light reflected off or transmitted by a digital micro-mirror device (DMD) such as a digital light projector (DLP) manufactured by Texas Instruments Corporation, a liquid crystal device (LCD), a liquid crystal on silicon (LCOS) device, or a similar device used in transmission mode rather than reflection mode.
- the projector 508 may further include a lens system 515 that alters the outgoing light to have the desired focal characteristics.
- the device 500 further includes an enclosure 502 with a handle portion 504.
- the device 500 may further include an interface 426 on one end that mechanically and electrically couples the device 500 to the probe housing 102 as described herein above.
- the device 500 may be integrated into the probe housing 102.
- the interface 426 provides advantages in allowing the device 500 to be coupled and removed from the AACMM 100 quickly and easily without requiring additional tools.
- the camera 510 includes a photosensitive sensor which generates a digital image/representation of the area within the sensor's field of view.
- the sensor may be charged-coupled device (CCD) type sensor or a complementary metal-oxide-semiconductor (CMOS) type sensor for example having an array of pixels.
- the camera 510 may further include other components, such as but not limited to lens 503 and other optical devices for example.
- the projector 508 and the camera 510 are arranged at an angle such that the sensor may receive light reflected from the surface of the object 501.
- the projector 508 and camera 510 are positioned such that the device 500 may be operated with the probe tip 118 in place.
- the device 500 is substantially fixed relative to the probe tip 118 and forces on the handle portion 504 may not influence the alignment of the device 500 relative to the probe tip 118.
- the device 500 may have an additional actuator (not shown) that allows the operator to switch between acquiring data from the device 500 and the probe tip 118.
- the projector 508 and camera 510 are electrically coupled to a controller 512 disposed within the enclosure 502.
- the controller 512 may include one or more
- the controller 512 may be arranged within the handle portion 504.
- the controller 512 is electrically coupled to the arm buses 218 via electrical connector 434.
- the device 500 may further include actuators 514, 516 which may be manually activated by the operator to initiate operation and data capture by the device 500.
- the image processing to determine the X, Y, Z coordinate data of the point cloud representing object 501 is performed by the controller 512 and the coordinate data is transmitted to the electronic data processing system 210 via bus 240.
- images are transmitted to the electronic data processing system 210 and the calculation of the coordinates is performed by the electronic data processing system 210.
- the controller 512 is configured to communicate with the electronic data processing system 210 to receive structured light pattern images from the electronic data processing system 210.
- the pattern emitted onto the object may be changed by the electronic data processing system 210 either automatically or in response to an input from the operator. This may provide advantages in obtaining higher accuracy measurements with less processing time by allowing the use of patterns that are simpler to decode when the conditions warrant, and use the more complex patterns where it is desired to achieve the desired level of accuracy or resolution.
- the device 520 (FIG. 12) includes a pair of cameras 510.
- the cameras 510 are arranged on an angle relative to the projector 508 to receive reflected light from the object 501.
- the use of multiple cameras 510 may provide advantages in some applications by providing redundant images to increase the accuracy of the measurement.
- the redundant images may allow for sequential patterns to be quickly acquired by the device 500 by increasing the acquisition speed of images by alternately operating the cameras 510.
- the device 500 first emits a structured light pattern 522 with projector 508 onto surface 524 of the object 501.
- the structured light pattern 522 may include the patterns disclosed in the journal article "DLP-Based Structured Light 3D Imaging Technologies and Applications" by Jason Geng, published in Proceedings of SPIE, Vol. 7932.
- the structured light pattern 522 may further include, but is not limited to, one of the patterns shown in FIGS. 14 - 32.
- the light 509 from projector 508 is reflected from the surface 524 and the reflected light 511 is received by the camera 510.
- variations in the surface 524 create distortions in the structured pattern when the image of the pattern is captured by the camera 510.
- the pattern is formed by structured light
- the controller 512 or the electronic data processing system 210 determines a one-to-one correspondence between the pixels in the emitted pattern, such as pixel 513 for example, and the pixels in the imaged pattern, such as pixel 515 for example. This enables triangulation principles to be used to determine the coordinates of each pixel in the imaged pattern.
- the collection of three-dimensional coordinates of the surface 524 is sometimes referred to as a point cloud. By moving the device 500 over the surface 524, a point cloud may be created of the entire object 501.
- the coupling of the device 500 to the probe end provides advantages in that the position and orientation of the device 500 is known by the electronic data processing system 210, so that the location of the object 501 relative to the AACMM 100 may also be ascertained.
- the angle of each projected ray of light 509 intersecting the object 501 at a point 527 is known to correspond to a projection angle phi (φ), so that φ information is encoded into the emitted pattern.
- the system is configured to enable the φ value corresponding to each pixel in the imaged pattern to be ascertained.
- an angle omega (ω) for each pixel in the camera is known, as is the baseline distance "D" between the projector 508 and the camera. Therefore, the distance "Z" from the camera 510 to the location imaged by the pixel may be determined using Eq. (1).
- three-dimensional coordinates may be calculated for each pixel in the acquired image.
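The exact form of Eq. (1) is not reproduced in this excerpt. A minimal sketch of the underlying triangulation, assuming the angles φ and ω are measured from the normal to the baseline at the projector and camera respectively (the function name and conventions here are illustrative, not taken from the source):

```python
import math

def triangulate_depth(baseline, phi, omega):
    """Perpendicular distance Z from the baseline to the object point.

    With the projector at x = 0 and the camera at x = baseline, a point at
    (x, Z) satisfies tan(phi) = x / Z and tan(omega) = (baseline - x) / Z,
    so tan(phi) + tan(omega) = baseline / Z.  Angles are in radians.
    """
    return baseline / (math.tan(phi) + math.tan(omega))
```

Given Z, the lateral coordinate follows as `x = Z * math.tan(phi)`, so each decoded pixel yields a full three-dimensional point.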
- there are two categories of structured light, namely coded and uncoded structured light.
- a common form of uncoded structured light such as that shown in FIGS. 14-17 and 28-30, relies on a striped pattern varying in a periodic manner along one dimension. These types of patterns are usually applied in a sequence to provide an approximate distance to the object.
- Some uncoded pattern embodiments, such as the sinusoidal patterns for example, may provide relatively highly accurate measurements.
- a coded pattern such as that shown in FIGS. 18-27 may be preferable.
- a coded pattern allows the image to be analyzed using a single acquired image.
- Some coded patterns may be placed in a particular orientation on the projector pattern (for example, perpendicular to epipolar lines on the projector plane), thereby simplifying analysis of the three-dimensional surface coordinates based on a single image.
- Epipolar lines are mathematical lines formed by the intersection of epipolar planes and the source plane 517 or the image plane 521 (the plane of the camera sensor) in FIG. 13B.
- An epipolar plane may be any plane that passes through the projector perspective center 519 and the camera perspective center.
- the epipolar lines on the source plane 517 and the image plane 521 may be parallel in some cases, but in general are not parallel.
- An aspect of epipolar lines is that a given epipolar line on the projector plane 517 has a corresponding epipolar line on the image plane 521. Therefore, any particular pattern known on an epipolar line in the projector plane 517 may be immediately observed and evaluated in the image plane 521. For example, if a coded pattern is placed along an epipolar line in the projector plane 517, the spacing between the coded elements in the image plane 521 may be determined using the values read out of the pixels of the camera sensor 510. This
- coded patterns may be used to determine the three-dimensional coordinates of a point 527 on the object 501. It is further possible to tilt coded patterns at a known angle with respect to an epipolar line and efficiently extract object surface coordinates. Examples of coded patterns are shown in FIGS. 20 - 29.
- the sinusoidal period represents a plurality of pattern elements. Since there is a multiplicity of periodic patterns in two-dimensions, the pattern elements are non-collinear. In some cases, a striped pattern having stripes of varying width may represent a coded pattern.
- gray code as used in the field of three-dimensional metrology based on structured light is somewhat different than the term as used in the field of electrical engineering, where the term Gray code commonly means the sequential changing of a single bit at a time.
- the present application follows the use of the term gray code as is customary for the field of three-dimensional metrology where the gray code typically represents a sequence of binary black and white values.
- FIG. 14A shows an example of a binary pattern that includes a plurality of sequential images 530, 532, 534, each having a different striped pattern thereon.
- the stripes alternate between bright (illuminated) and dark (non-illuminated) striped regions.
- white and black are used to mean illuminated and non-illuminated, respectively.
- for each point on the object 501 (represented by a camera pixel in the image), the composite pattern 536 has a unique binary value obtained through the sequential projection of patterns 530, 532, 534, 535, 537, which corresponds to a relatively small range of possible projection angles φ.
- Eq. (1) may be used to find the distance Z from the camera to the object point.
- a two-dimensional angle is known for each camera pixel. The two-dimensional angle corresponds generally to the one-dimensional angle omega (ω), which is used in the calculation of the distance Z according to Eq. (1).
- a line drawn from each camera pixel through the camera perspective center and intersecting the object in a point defines a two-dimensional angle in space.
- the two pixel angles provide three-dimensional coordinates corresponding to a point on the object surface.
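The sequential binary decoding described above can be sketched as follows: each camera pixel observes a bright/dark value in each projected pattern, and the resulting bit sequence is read as a binary code word indexing the projector stripe (and hence a narrow range of φ). The function name and bit ordering are illustrative assumptions:

```python
def decode_binary_sequence(bits):
    """Stripe index at one camera pixel from its observed bright/dark
    sequence (1 = bright, 0 = dark), most significant pattern first.

    Five sequential patterns, as in FIG. 14A, distinguish 2**5 = 32 stripes.
    """
    index = 0
    for b in bits:
        index = (index << 1) | b
    return index
```

For example, observing bright, dark, bright, bright, dark over the five patterns identifies stripe 22 of 32.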
- the term grey-scale usually refers to an amount of irradiance at a point on the object from white (maximum light), to various levels of gray (less light), to black (minimum light). This same nomenclature is used even if the light being projected has a color such as red, and the grayscale values correspond to levels of red illumination.
- the pattern (FIG. 15) has a plurality of images 538, 540, 542 with stripes having varying light power levels, such as black, grey and white for example, used to produce an emitted pattern on the object 501.
- the grey scale values may be used to determine the possible projection angles φ to within a relatively small range of possible values. As discussed hereinabove, Eq. (1) may then be used to determine the distance Z.
- the distance Z to an object point may be found by measuring a phase shift observed in a plurality of images.
- the gray-scale intensities 546, 548, 550 of a projector pattern 552 vary in a sinusoidal manner, but with the phase shifted between projected patterns.
- the sinusoid gray-scale intensity 546 (representing optical power per unit area) may have a phase of zero degrees at a particular point.
- the sinusoid intensity 548 has a phase of 120 degrees at the same point.
- the sinusoid intensity 550 may have a phase of 240 degrees at the same point.
- the phase shift method is used to determine a phase of the projected light at each camera pixel, which eliminates the need to consider information from adjacent pixels as in the coded-pattern single shot case.
- Many methods may be used to determine the phase of a camera pixel.
- One method involves performing a multiply and accumulate procedure and then taking an arctangent of a quotient. This method is well known to those of ordinary skill in the art and is not discussed further.
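For the three patterns of FIG. 16, shifted by 0, 120 and 240 degrees, the multiply-and-accumulate/arctangent procedure mentioned above reduces to a well-known closed form. This sketch assumes that three-step convention (function and variable names are illustrative):

```python
import math

def three_step_phase(i1, i2, i3):
    """Wrapped phase (radians, in (-pi, pi]) at one camera pixel.

    i1, i2, i3 are the intensities observed with projected phase shifts of
    0, 120 and 240 degrees.  With i_k = A + B*cos(phase + shift_k), the
    background A and modulation B cancel out of the quotient.
    """
    numerator = math.sqrt(3.0) * (i3 - i2)
    denominator = 2.0 * i1 - i2 - i3
    return math.atan2(numerator, denominator)
```

Because the ambient term A cancels, the result is insensitive to background light, consistent with the remark above that the background cancels out in the calculation of phase.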
- the background light cancels out in the calculation of phase.
- the value Z calculated for a given pixel is usually more accurate than the value Z calculated using a coded-pattern single shot method.
- all of the calculated phases vary from 0 to 360 degrees.
- these calculated phases may be adequate if the "thickness" of the object under test does not vary by too much, because the angle for each projected stripe is known ahead of time.
- an ambiguity may arise in the phase calculated for a particular pixel, since that pixel may have been obtained from a first projected ray of light striking the object at a first position or from a second projected ray of light striking the object at a second position.
- the phases may not be properly decoded and the desired one-to-one correspondence not achieved.
- FIG. 17A shows a sequence 1 - 4 of projected gray-code intensities 554 according to a method by which the ambiguity may be eliminated in the distance Z based on a calculated phase.
- a collection of gray code patterns are projected sequentially onto the object.
- the sequential pattern 1 has dark (black) on the left half of the pattern (elements 0 - 15) and bright (white) on the right half of the pattern (elements 16 - 31).
- the sequential pattern 2 has a dark band toward the center (elements 8 - 23) and bright bands toward the edges (elements 2 - 7, 24 - 31).
- the sequential pattern 3 has two separated bright bands near the center (elements 4 - 11, 20 - 27) and three separated dark bands (elements 0 - 3, 12 - 19, 28 - 31).
- the sequential pattern 4 has four separated dark bands (elements 2 - 5, 10 - 13, 18 - 21, 26 - 29) and five separated bright bands (elements 0 - 1, 6 - 9, 14 - 17, 22 - 25, 30 - 31). For any given pixel in the camera, this sequence of patterns enables the ambiguity range (the "object thickness region") to be narrowed by a factor of 16 compared to an initial region corresponding to all of the elements 0 to 31.
- a phase shift method similar to the method of FIG. 16, is performed.
- in a pattern 556A, four sinusoidal periods are projected onto an object.
- One way to reduce or eliminate the ambiguity is to project one or more additional sinusoidal patterns 556B, 556C, each pattern having a different fringe period (pitch). So, for example, in FIG. 17B, a second sinusoidal pattern 555 having three fringe periods rather than four fringe periods is projected onto an object.
- the difference in the phases for the two patterns 555, 556 may be used to help eliminate an ambiguity in the distance Z to the target.
- Another method for eliminating ambiguity is to use a different type of method, such as the gray code method of FIG. 17A for example, to eliminate the ambiguity in the distances Z calculated using the sinusoidal phase shift method.
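The two-period approach of FIG. 17B can be sketched as a beat-phase computation: with 4 and 3 fringe periods across the field, the wrapped-phase difference completes exactly one period over the whole field and therefore fixes the absolute position without ambiguity. This is a simplified one-dimensional sketch under that assumption (a practical system would also use the fine phase to refine the result):

```python
import math

TWO_PI = 2.0 * math.pi

def absolute_position(phase4, phase3):
    """Normalized position x in [0, 1) across the projected field.

    phase4 and phase3 are the wrapped phases (radians) measured at one pixel
    for patterns with 4 and 3 fringe periods.  Their difference modulo 2*pi
    is the beat phase, which has a single period over the field.
    """
    beat = (phase4 - phase3) % TWO_PI
    return beat / TWO_PI
```

Since `phase4 = 2*pi*4*x` and `phase3 = 2*pi*3*x` (each wrapped), the beat phase is `2*pi*x`, so the ambiguity of the individual four- and three-period patterns cancels in the difference.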
- patterns 558, 566 have a distribution of colors that may in some cases enable measurement of the object to be based on a single (coded) image.
- the pattern 558 uses lines having a continuously spatially varying wavelength of light to create a pattern where the color changes continuously from blue to green to yellow to red to fuchsia for example. Thus for each particular spectral wavelength, a one-to-one correspondence may be made between the emitted image and the imaged pattern.
- the three-dimensional coordinates of the object 501 may be determined from a single imaged pattern.
- the stripes of the pattern 558 are oriented perpendicular to the epipolar lines on the projector plane. Since the epipolar lines on the projector plane map into epipolar lines on the camera image plane, it is possible to obtain an association between projector points and camera points by moving along the direction of epipolar lines in the camera image plane and noting the color of the line in each case. It should be appreciated that each pixel in the camera image plane corresponds to a two- dimensional angle. The color enables determination of the one-to-one correspondence between particular projection angles and particular camera angles.
- another embodiment using color patterns is shown in FIG. 19.
- a plurality of colored patterns having varying intensities 560, 562, 564 are combined to create a color pattern 566.
- the plurality of colored pattern intensities 560, 562, 564 are primary colors, such that pattern 560 varies the intensity of the color red, pattern 562 varies the intensity of the color green and pattern 564 varies the intensity of the color blue.
- the resulting emitted image has a known relationship that may be decoded in the imaged pattern.
- the three-dimensional coordinates of the object 501 may be determined.
- the pattern of FIG. 19 projects three complete cycles of nearly identical colors.
- triangulation may be used to determine the three-dimensional object coordinates at each pixel position using only a single camera image.
- the method of FIG. 18 may be considered to be a coded, single-shot method.
- FIG. 19 there is a chance of ambiguity in the distance Z to an object point. For example, if the camera sees a color purple, the projector may have projected any of three different angles. Based on the triangulation geometry, three different distances Z are possible. If the thickness of the object is known ahead of time to be within a relatively small range of values, then it may be possible to eliminate two of the values, thereby obtaining three-dimensional coordinates in a single shot.
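The thickness-range argument above amounts to a simple filter over the candidate distances obtained by triangulating each projection angle that could have produced the observed color. A hypothetical helper (not from the source) sketching that selection:

```python
def resolve_color_ambiguity(candidate_depths, z_min, z_max):
    """Select the single triangulated depth consistent with prior knowledge.

    candidate_depths: depths Z computed for each projection angle that could
    have produced the observed color (three candidates for the three-cycle
    pattern of FIG. 19).  z_min, z_max bound the known object thickness.
    Returns the unique surviving depth, or None if the ambiguity remains.
    """
    survivors = [z for z in candidate_depths if z_min <= z <= z_max]
    return survivors[0] if len(survivors) == 1 else None
```

When the prior range is too wide and two or more candidates survive, the method falls back to a sequential measurement, as described next.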
- the spatial period of the colored pattern may be changed, and then used to illuminate the object a second time.
- this method of projected structured light is considered to be a sequential method rather than a coded, single-shot method.
- coded structured light patterns for a single image acquisition are shown based on a stripe indexing technique.
- patterns having color stripes 568, 570 are emitted by the projector 508.
- This technique utilizes a characteristic of image sensors wherein the sensor has three independent color channels, such as red, green, blue or cyan, yellow, magenta for example. The combinations of the values generated by these sensor channels may produce a large number of colored patterns. As with the embodiment of FIG. 19, the ratio of the color distribution is known, therefore the relationship between the emitted pattern and the imaged pattern may be determined and the three-dimensional coordinates calculated. Still other types of colored patterns may be used, such as a pattern based on the De Bruijn sequence. The stripe indexing techniques and the De Bruijn sequence are well known to those of ordinary skill in the art and so are not discussed further.
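As a sketch of the De Bruijn idea mentioned above, the standard recursive construction below generates a stripe-color sequence in which every window of n consecutive stripes is unique; this is the textbook algorithm, not necessarily the construction used in the described embodiment:

```python
def de_bruijn(k, n):
    """Generate a De Bruijn sequence over k symbols with window length n:
    every length-n sub-sequence occurs exactly once (cyclically)."""
    a = [0] * k * n
    seq = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return seq

# With 3 stripe colors and windows of 3 stripes, any 3 consecutive
# stripes identify a unique position in the projected pattern.
colors = de_bruijn(3, 3)  # 27 symbols, each in {0, 1, 2}
```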
- the pattern 572 provides groups of stripes having multiple intensity (gray-scale) levels and different widths. As a result, a particular group of stripes within the overall image has a unique gray-scale pattern. Due to the uniqueness of the groups, a one-to-one correspondence may be determined between the emitted pattern and the imaged pattern to calculate the coordinates of the object 501.
- the pattern 574 provides a series of stripes having a segmented pattern. Since each line has unique segment design, the correspondence may be determined between the emitted pattern and the imaged pattern to calculate the coordinates of the object 501.
- additional advantages may be gained by orienting the projected lines 572, 574 perpendicular to the epipolar lines in the camera plane, since this simplifies the determination of a second dimension in finding the one-to-one correspondence between camera and projector patterns.
- coded structured light patterns are shown that use a two-dimensional spatial grid pattern technique. These types of patterns are arranged such that a sub window, such as window 576 on pattern 578 for example, is unique relative to other sub windows within the pattern.
- a pseudo random binary array pattern 578 is used.
- the pattern 578 uses a grid with elements, such as circles 579 for example, that form the coded pattern. It should be appreciated that elements having other geometric shapes may also be used, such as but not limited to squares, rectangles, and triangles for example.
- a pattern 580 is shown of a multivalued pseudo random array wherein each of the numerical values has an assigned shape 582.
- These shapes 582 form a unique sub-window 584 that allows for correspondence between the emitted pattern and the imaged pattern to calculate the coordinates of the object 501.
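The defining property of such patterns, that each sub-window is unique within the whole pattern, can be checked with a short sketch like the following; the grid contents and window sizes are illustrative only:

```python
import numpy as np

def subwindows_unique(grid, h, w):
    """Check the defining property of a coded spatial grid pattern:
    every h x w sub-window (cf. sub-window 576 of pattern 578) occurs
    at most once, so any imaged window locates itself in the pattern."""
    rows, cols = grid.shape
    seen = set()
    for r in range(rows - h + 1):
        for c in range(cols - w + 1):
            key = grid[r:r + h, c:c + w].tobytes()
            if key in seen:
                return False  # a repeated window breaks the coding
            seen.add(key)
    return True

# A grid of all-distinct elements is trivially unique at window size 1x1,
# while a constant grid repeats every window (illustrative data only):
distinct = np.arange(16, dtype=np.uint8).reshape(4, 4)
constant = np.zeros((4, 4), dtype=np.uint8)
```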
- the grid 586 is color coded with stripes perpendicular to the projector plane.
- the pattern of FIG. 26 will not necessarily provide a pattern that can be decoded in a single shot, but the color information may help to simplify the analysis.
- an array 588 of colored shapes such as squares or circles, for example, is used to form the pattern.
- an exemplary sinusoidal pattern 720 is shown.
- the lines 734 are perpendicular to epipolar lines on the projector plane.
- the sinusoidal pattern 720 is made up of thirty lines 722, which are repeated once to give a total of sixty lines 722.
- Each line 722 has a sinusoidal feature 723 that is approximately 180 degrees out of phase with the line above and the line below. This allows the lines 722 to be placed as close together as possible and also allows a greater depth of field, because the lines can blur on the projected surface or in the acquired image and still be recognized.
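Such a line pattern can be sketched as follows; the wavelength, image width, and the exact per-line phase step are assumptions chosen so that neighboring lines sit roughly 180 degrees apart while each line keeps a unique absolute phase:

```python
import numpy as np

WAVELENGTH = 20           # pixels per sinusoid cycle (assumed value)
WIDTH, N_LINES = 200, 30  # cf. the thirty distinct lines of pattern 720

x = np.arange(WIDTH)
rows = []
for i in range(N_LINES):
    # Offset each line by pi * (N+1)/N: neighbors end up ~186 degrees
    # apart (approximately the 180 degrees described), while every line
    # still carries a unique absolute phase that identifies it.
    phase = i * np.pi * (N_LINES + 1) / N_LINES
    rows.append(np.sin(2 * np.pi * x / WAVELENGTH + phase))
pattern = np.stack(rows)  # shape (30, 200), one sinusoidal line per row
```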
- Each single line 722 can be uniquely decoded using just the phase of that line, provided the line length is at least one wavelength of the sinusoid.
- since the pattern 720 is repeated, it could generally cause ambiguities in the line identification. However, this problem is resolved in this system through the geometry of the camera's field of view and depth of field. For a single view of the camera, i.e. a row of pixels, within the depth of field in which the lines can be optically resolved, no two lines with the same phase can be imaged. For example, the first row of pixels on the camera can only receive reflected light from lines 1-30 of the pattern, whereas further down the camera sensor, another row will only receive reflected light from lines 2-31 of the pattern, and so on. In FIG. 28B an enlarged portion of the pattern 720 is shown with three lines, where the phase between consecutive lines 722 is approximately 180 degrees. It also shows how the phase of each single line is enough to uniquely decode the lines.
- in FIGS. 29A-29B another pattern 730 is shown having square pattern elements.
- the lines 732 are perpendicular to epipolar lines on the projector plane.
- the square pattern 730 contains twenty-seven lines 732 before the pattern 730 is repeated and has a total of 59 lines.
- the code elements 734 of pattern 730 are distinguished by the phase of the square wave from left to right in FIG. 29B.
- the pattern 730 is encoded such that a group of sequential lines 732 are distinguished by the relative phases of its members.
- sequential lines are found by scanning vertically for the lines.
- scanning vertically means scanning along epipolar lines in the camera image plane. Sequential lines within a camera vertical pixel column are paired together and their relative phases are determined.
- FIG. 29B shows an enlarged view of four lines 732 of the square pattern. This embodiment shows that the phase of a single line 732 alone is not able to uniquely decode a line, because the first and third lines have the same absolute phase.
- each sinusoidal period represents a plurality of pattern elements. Since there is a multiplicity of periodic patterns in two dimensions, the pattern elements are non-collinear. In contrast, for the case of the laser line scanner that emits a line of light, all of the pattern elements lie on a straight line.
- although the line has width and the tail of the line cross section may have less optical power than the peak of the signal, these aspects of the line are not evaluated separately in finding surface coordinates of an object and therefore do not represent separate pattern elements.
- although the line may contain multiple pattern elements, these pattern elements are collinear.
- the various pattern techniques may be combined as shown in FIGS. 30-31 to form either a binary (FIG. 30) checkerboard uncoded pattern 590 or a colored (FIG. 31) checkerboard uncoded pattern 592.
- a photometric stereo technique may be used where a plurality of images 594 are taken of the object 501 while the light source 596 is moved to a plurality of locations.
- in FIG. 33 another embodiment is shown of a system 700 for acquiring three-dimensional coordinates of an object 702.
- the device 704 is independently operable when detached from the AACMM 100.
- the device 704 includes a controller 706 and an optional display 708.
- the display 708 may be integrated in the housing of the device 704 or may be a separate component that is coupled to the device 704 when it is used independently from the AACMM 100. In embodiments where the display 708 is separable from the device 704, the display 708 may include a controller (not shown) that provides additional functionality to facilitate independent operation of the device 704. In one embodiment, the controller 706 is disposed within the separable display.
- the controller 706 includes a communications circuit configured to wirelessly transmit data, such as images or coordinate data via a communications link 712 to the AACMM 100, to a separate computing device 710 or a combination of both.
- the computing device 710 may be, but is not limited to, a computer, a laptop, a tablet computer, a personal digital assistant (PDA), or a cell phone, for example.
- the display 708 may allow the operator to see the acquired images, or the point cloud of acquired coordinates of the object 702.
- the controller 706 decodes the patterns in the acquired image to determine the three-dimensional coordinates of the object.
- the images are acquired by the device 704 and transmitted to either the AACMM 100, the computing device 710 or a combination of both.
- the device 704 may further include a location device assembly 714.
- the location device assembly may include one or more inertial navigation sensors, such as a Global Positioning System (GPS) sensor, a gyroscopic sensor, or an accelerometer sensor. Such sensors may be electrically coupled to the controller 706. Gyroscopic and accelerometer sensors may be single-axis or multiple-axis devices.
- the location device assembly 714 is configured to allow the controller 706 to measure or maintain the orientation of the device 704 when detached from the AACMM 100.
- a gyroscope within the location device assembly 714 may be a MEMS gyroscopic device, a solid-state ring-laser device, a fiber-optic gyroscope, or another type of gyroscope.
- a method is used to combine images obtained from multiple scans.
- the images are each obtained by using coded patterns so that only a single image is needed to obtain three-dimensional coordinates associated with a particular position and orientation of the device 704.
- One way to combine multiple images captured by the device 704 is to provide at least some overlap between adjacent images so that point cloud features may be matched. This matching function may be assisted by the inertial navigation devices described above.
- the reference markers are small markers having an adhesive or sticky backing, for example, circular markers that are placed on an object or objects being measured. Even a relatively small number of such markers can be useful in registering multiple images, especially if the object being measured has a relatively small number of features to use for registration.
- the reference markers may be projected as spots of light onto the object or objects under inspection. For example, a small portable projector capable of emitting a plurality of small dots may be placed in front of the object or objects to be measured. An advantage of projected dots over sticky dots is that the dots do not have to be attached and later removed.
- the device projects the structured light over a contiguous and enclosed area 716 and can acquire an image over the area 716 at a range of 100 mm to 300 mm with an accuracy of 35 microns.
- the perpendicular area 716 of projection is approximately 150 to 200 mm.
- the camera or cameras 510 may be a digital camera having a 1.2-5.0 megapixel CMOS or CCD sensor.
- the first step in decoding an image of the pattern is to extract the centers of gravity (cog) 724 (FIG. 28C) of the projected pattern 720 features in the Y direction. This is carried out by calculating a moving average of the pixel grayscale values while moving downwards in the Y direction, processing a single column at a time. When a pixel value in an image rises above the moving average value, a starting point for a feature is found. After a starting point is found, the width of the feature continues to increase until a pixel value falls below the moving average value.
- a weighted average is then calculated using the pixel values and their Y positions between the start and end points to give the cog 724 of the pattern feature 723 in the image.
- the distances between the start and end points are also recorded for later use.
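The moving-average feature extraction described above can be sketched for one image column as follows; the window size and the synthetic test column are assumptions, and the real implementation's moving average need not be this simple centered mean:

```python
import numpy as np

def extract_cogs(column, window=9):
    """Find feature centers of gravity (cogs) along one image column.
    A moving average of the grayscale values acts as a local threshold;
    runs of pixels above it are features, and a value-weighted mean of
    their Y positions gives the cog (window size is an assumed value)."""
    avg = np.convolve(column, np.ones(window) / window, mode="same")
    cogs, widths = [], []
    y, n = 0, len(column)
    while y < n:
        if column[y] > avg[y]:                    # feature starts
            start = y
            while y < n and column[y] > avg[y]:   # grow until below average
                y += 1
            ys = np.arange(start, y)
            w = column[start:y].astype(float)
            cogs.append(float((ys * w).sum() / w.sum()))  # weighted mean
            widths.append(y - start)              # recorded for later stages
        else:
            y += 1
    return cogs, widths

# One synthetic column with two bright stripes on a dark background:
col = np.zeros(40)
col[10:13] = [50, 100, 50]
col[25:28] = [50, 100, 50]
cogs, widths = extract_cogs(col)
```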
- the resulting cogs 724 are used next to find the pattern lines 722. This is done by moving in a left to right direction (when viewed from the direction shown in the FIGS.) starting with the first column of the image. For each cog 724 in this column the neighboring column to the immediate right is searched for a cog 724 that is within a particular distance. If two matching cogs 724 are found then a potential line has been determined.
- FIG. 28C also shows the detected lines, where they are all longer than a single wavelength of the pattern. In one embodiment there is little or no delta between neighboring columns' cogs.
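Chaining cogs from neighboring columns into candidate lines, as described above, can be sketched like this; the matching tolerance and the test data are illustrative assumptions:

```python
def link_lines(cogs_by_column, max_delta=1.5):
    """Chain cogs from neighboring columns into candidate pattern lines.
    cogs_by_column[i] holds the cog Y positions found in column i; a cog
    extends a line when it lies within max_delta (an assumed tolerance)
    of that line's cog in the previous column."""
    lines = []        # finished lines: lists of (column, y) pairs
    open_lines = []   # lines still growing at the previous column
    for col, cogs in enumerate(cogs_by_column):
        next_open = []
        unused = list(cogs)
        for line in open_lines:
            _, last_y = line[-1]
            # search this column for a cog close to the line's last cog
            match = next((y for y in unused if abs(y - last_y) <= max_delta), None)
            if match is not None:
                unused.remove(match)
                line.append((col, match))
                next_open.append(line)
            else:
                lines.append(line)        # line ended in the previous column
        for y in unused:                  # leftover cogs start new lines
            next_open.append([(col, y)])
        open_lines = next_open
    return lines + open_lines

# Two horizontal lines drifting slightly across four columns:
cols = [[10.0, 30.0], [10.4, 29.8], [10.9, 30.1], [11.2, 30.5]]
lines = link_lines(cols)
```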
- the next step in the decoding process is to extract the projected pattern features along the lines in the X direction in the form of block centers.
- Each pattern contains both wide blocks and narrow blocks.
- This process proceeds in a similar fashion to extracting the features in the Y direction, however the moving average is also calculated using the widths found in the first stage and the direction of movement is along the line.
- features are extracted both in the areas where the widths are above the moving average value and in the areas where the widths are below the moving average.
- the widths and X positions are used to calculate a weighted average to find the center of the block 726 in the X direction.
- the Y positions of the cogs 724 between moving average crossings are also used to calculate a center for the block 726 in the Y direction. This is carried out by taking the average of the Y coordinates of the cogs.
- the start and end points of each line are also modified based on the features extracted in this step to ensure that both points are where the crossing of the moving average occurs. In one embodiment, only complete blocks are used in later processing steps.
- the lines and blocks are then processed further to ensure that the distances between the block centers 726 on each line are within a predetermined tolerance. This is accomplished by taking the delta between the X center positions of two neighboring blocks on a line and checking that the delta is below the tolerance. If the delta is above the tolerance, then the line is broken up into smaller lines. If the break is required between the last two blocks on a line, then the last block is removed and no additional line is created. If the break is required between the first and second or second and third blocks on a line, then the blocks to the left of the break are also discarded and no additional line is created. For a break in any other place along the line, the line is broken into two: a new line is created and the appropriate blocks are transferred to it. After this stage of processing, the two patterns require different steps to finish decoding.
- the sinusoidal pattern 720 may now be decoded with one additional step of processing using the block centers on the lines.
- the modulus of each block X center and the wavelength of the pattern 720 on a line 722 are calculated and the average of these values gives the phase of the line 722.
- the phase of the line 722 may then be used to decode the line in the pattern 720 which in turn allows for the determination of an X, Y, Z coordinate position for all cogs 724 on that line 722.
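The phase computation and line lookup can be sketched as below; the wavelength, the number of lines, and the assumption that line i is offset by i·wavelength/n_lines are illustrative, since the text does not give the pattern's exact phase table:

```python
def line_phase(block_x_centers, wavelength):
    """Phase of one detected line: the average of each block center's
    X position modulo the pattern wavelength (cf. the sinusoidal decode)."""
    residues = [x % wavelength for x in block_x_centers]
    return sum(residues) / len(residues)

def decode_line(phase, wavelength, n_lines):
    """Map a phase to a line index, assuming (as a sketch) that line i of
    the repeating pattern is offset by i * wavelength / n_lines."""
    step = wavelength / n_lines
    return round(phase / step) % n_lines

WAVELENGTH, N_LINES = 60.0, 30
centers = [14.1, 74.0, 133.9]  # block centers roughly one wavelength apart
phase = line_phase(centers, WAVELENGTH)
line_id = decode_line(phase, WAVELENGTH, N_LINES)
```

Once the line index is known, the projector row is identified and triangulation gives an X, Y, Z coordinate for every cog on that line.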
- the lines 732 must first be connected vertically before any decoding can take place. This allows a group of lines to be identified, rather than just a single line as with the sinusoidal pattern. Connections 736 are found between lines 732 by using the blocks 734 and the cogs contained in the block calculated in the first stage of processing. The first cog in each block on a line 732 is tested to see if there is another cog directly below it in the same column. If there is no cog below, then there is no connection with another line at this point and processing continues. If there is a cog below, then the Y distance between the two cogs is determined and compared to a desired maximum spacing between lines. If the distance is less than this value, the two lines are considered connected at that point; the connection 736 is stored and processing continues to the next block. In one embodiment, a line connection 736 is unique such that no two lines will have more than one connection 736 between them.
- the next step of processing for the square pattern 730 is phase calculation between connected lines.
- Each pair of lines 732 is first processed to determine the length of overlap between them. In one embodiment there is at least one wavelength of overlap between the pair of lines to allow the calculation of the relative phase. If the lines have the desired overlap, then the cog at the center of the area of overlap is found.
- the blocks 738 that contain the center cog and the cog directly below are determined, and the relative phase between the block X centers is calculated for that line connection. This process is repeated for all connections between lines. In one embodiment, the process is repeated in only the downwards direction in the Y axis, because the code is based on connections below lines, not above them.
- FIG. 29C shows the blocks 738 that could be used for calculating the relative phase for this set of lines. The relative phases in the embodiment of FIG. 29C are 3, 1 and 2, and these phases would be used in the final stage to decode the top line.
- the next step in decoding the square pattern 730 is performing a look up using the relative phases calculated in the previous step.
- Each line 732 is processed by tracking down the line connections 736 until a connection depth of four is reached. This depth is used because four relative phases are needed to decode the line.
- a hash is determined using the relative phase between the lines 732. When the required connection depth is reached the hash is used to look up the line code. If the hash returns a valid code then this is recorded and stored in a voting system. Every line 732 is processed in this way and all connections that are of the desired depth are used to generate a vote if they are a valid phase combination.
- the final step is then to find the code that received the most votes on each line 732 and assign that code to the line 732. If there is not a unique code that received the most votes, then the line is not assigned a code.
- the lines 732 are identified once a code has been assigned and the X, Y, Z coordinate position for all cogs on that line 732 may now be found.
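The voting lookup can be sketched as follows; the codebook, the chain depth (two relative phases rather than four, to keep the sketch short), and the test phases are all hypothetical:

```python
from collections import Counter

# Hypothetical codebook mapping a tuple of relative phases to a line
# code; the real pattern's table is not given in the text.
CODEBOOK = {(3, 1): 17, (1, 2): 18, (2, 3): 19}

def decode_top_line(phase_chains):
    """Vote over every chain of relative phases measured below a line;
    return the code that wins outright, or None on a tie or when no
    chain is a valid phase combination."""
    votes = Counter()
    for chain in phase_chains:
        code = CODEBOOK.get(tuple(chain))
        if code is not None:
            votes[code] += 1             # each valid chain casts one vote
    if not votes:
        return None
    (best, n), *rest = votes.most_common()
    if rest and rest[0][1] == n:         # the winning code must be unique
        return None
    return best

# Three measured chains: two agree, one is corrupted by noise.
code = decode_top_line([[3, 1], [3, 1], [2, 3]])
```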
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015516035A JP5816773B2 (en) | 2012-06-07 | 2013-05-20 | Coordinate measuring machine with removable accessories |
CN201380029985.4A CN104380033A (en) | 2012-06-07 | 2013-05-20 | Coordinate measurement machines with removable accessories |
DE112013002824.7T DE112013002824T5 (en) | 2012-06-07 | 2013-05-20 | Coordinate measuring machines with removable accessories |
GB1422105.5A GB2517621A (en) | 2012-06-07 | 2013-05-20 | Coordinate measurement machines with removable accessories |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/491,176 | 2012-06-07 | ||
US13/491,176 US8832954B2 (en) | 2010-01-20 | 2012-06-07 | Coordinate measurement machines with removable accessories |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013184340A1 true WO2013184340A1 (en) | 2013-12-12 |
Family
ID=48537024
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2013/041826 WO2013184340A1 (en) | 2012-06-07 | 2013-05-20 | Coordinate measurement machines with removable accessories |
Country Status (5)
Country | Link |
---|---|
JP (1) | JP5816773B2 (en) |
CN (1) | CN104380033A (en) |
DE (1) | DE112013002824T5 (en) |
GB (1) | GB2517621A (en) |
WO (1) | WO2013184340A1 (en) |
Cited By (57)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016010640A1 (en) * | 2014-07-16 | 2016-01-21 | Faro Technologies, Inc. | Measurement device for machining center |
WO2016020826A1 (en) * | 2014-08-07 | 2016-02-11 | Ingenera Sa | Method and relevant device for measuring distance with auto-calibration and temperature compensation |
WO2016039955A1 (en) * | 2014-09-10 | 2016-03-17 | Faro Technologies, Inc. | A portable device for optically measuring three- dimensional coordinates |
WO2016044658A1 (en) * | 2014-09-19 | 2016-03-24 | Hexagon Metrology, Inc. | Multi-mode portable coordinate measuring machine |
DE102015205187A1 (en) * | 2015-03-23 | 2016-09-29 | Siemens Aktiengesellschaft | Method and device for the projection of line pattern sequences |
US9602811B2 (en) | 2014-09-10 | 2017-03-21 | Faro Technologies, Inc. | Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device |
US9618620B2 (en) | 2012-10-05 | 2017-04-11 | Faro Technologies, Inc. | Using depth-camera images to speed registration of three-dimensional scans |
EP3171129A1 (en) * | 2015-11-19 | 2017-05-24 | Hand Held Products, Inc. | High resolution dot pattern |
US9671221B2 (en) | 2014-09-10 | 2017-06-06 | Faro Technologies, Inc. | Portable device for optically measuring three-dimensional coordinates |
US9684078B2 (en) | 2010-05-10 | 2017-06-20 | Faro Technologies, Inc. | Method for optically scanning and measuring an environment |
US9693040B2 (en) | 2014-09-10 | 2017-06-27 | Faro Technologies, Inc. | Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device |
JP2017126192A (en) * | 2016-01-14 | 2017-07-20 | セイコーエプソン株式会社 | Image recognition device, image recognition method and image recognition unit |
US9752864B2 (en) | 2014-10-21 | 2017-09-05 | Hand Held Products, Inc. | Handheld dimensioning system with feedback |
US9762793B2 (en) | 2014-10-21 | 2017-09-12 | Hand Held Products, Inc. | System and method for dimensioning |
US9769463B2 (en) | 2014-09-10 | 2017-09-19 | Faro Technologies, Inc. | Device and method for optically scanning and measuring an environment and a method of control |
US9769454B2 (en) | 2014-06-20 | 2017-09-19 | Stmicroelectronics S.R.L. | Method for generating a depth map, related system and computer program product |
US9779546B2 (en) | 2012-05-04 | 2017-10-03 | Intermec Ip Corp. | Volume dimensioning systems and methods |
US9779276B2 (en) | 2014-10-10 | 2017-10-03 | Hand Held Products, Inc. | Depth sensor based auto-focus system for an indicia scanner |
US9786101B2 (en) | 2015-05-19 | 2017-10-10 | Hand Held Products, Inc. | Evaluating image values |
US9784566B2 (en) | 2013-03-13 | 2017-10-10 | Intermec Ip Corp. | Systems and methods for enhancing dimensioning |
US9823059B2 (en) | 2014-08-06 | 2017-11-21 | Hand Held Products, Inc. | Dimensioning system with guided alignment |
US9835486B2 (en) | 2015-07-07 | 2017-12-05 | Hand Held Products, Inc. | Mobile dimensioner apparatus for use in commerce |
US9841311B2 (en) | 2012-10-16 | 2017-12-12 | Hand Held Products, Inc. | Dimensioning system |
US9857167B2 (en) | 2015-06-23 | 2018-01-02 | Hand Held Products, Inc. | Dual-projector three-dimensional scanner |
US9879976B2 (en) | 2010-01-20 | 2018-01-30 | Faro Technologies, Inc. | Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features |
US9897434B2 (en) | 2014-10-21 | 2018-02-20 | Hand Held Products, Inc. | Handheld dimensioning system with measurement-conformance feedback |
US9940721B2 (en) | 2016-06-10 | 2018-04-10 | Hand Held Products, Inc. | Scene change detection in a dimensioner |
US9939259B2 (en) | 2012-10-04 | 2018-04-10 | Hand Held Products, Inc. | Measuring object dimensions using mobile computer |
US10007858B2 (en) | 2012-05-15 | 2018-06-26 | Honeywell International Inc. | Terminals and methods for dimensioning objects |
US10025314B2 (en) | 2016-01-27 | 2018-07-17 | Hand Held Products, Inc. | Vehicle positioning and object avoidance |
US10060729B2 (en) | 2014-10-21 | 2018-08-28 | Hand Held Products, Inc. | Handheld dimensioner with data-quality indication |
US10060722B2 (en) | 2010-01-20 | 2018-08-28 | Faro Technologies, Inc. | Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations |
US10066982B2 (en) | 2015-06-16 | 2018-09-04 | Hand Held Products, Inc. | Calibrating a volume dimensioner |
US10070116B2 (en) | 2014-09-10 | 2018-09-04 | Faro Technologies, Inc. | Device and method for optically scanning and measuring an environment |
US10067231B2 (en) | 2012-10-05 | 2018-09-04 | Faro Technologies, Inc. | Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner |
US10134120B2 (en) | 2014-10-10 | 2018-11-20 | Hand Held Products, Inc. | Image-stitching for dimensioning |
US10140724B2 (en) | 2009-01-12 | 2018-11-27 | Intermec Ip Corporation | Semi-automatic dimensioning with imager on a portable device |
US10163216B2 (en) | 2016-06-15 | 2018-12-25 | Hand Held Products, Inc. | Automatic mode switching in a volume dimensioner |
US10175037B2 (en) | 2015-12-27 | 2019-01-08 | Faro Technologies, Inc. | 3-D measuring device with battery pack |
US10203402B2 (en) | 2013-06-07 | 2019-02-12 | Hand Held Products, Inc. | Method of error correction for 3D imaging device |
US10247547B2 (en) | 2015-06-23 | 2019-04-02 | Hand Held Products, Inc. | Optical pattern projector |
US10249030B2 (en) | 2015-10-30 | 2019-04-02 | Hand Held Products, Inc. | Image transformation for indicia reading |
US10321127B2 (en) | 2012-08-20 | 2019-06-11 | Intermec Ip Corp. | Volume dimensioning system calibration systems and methods |
US10339352B2 (en) | 2016-06-03 | 2019-07-02 | Hand Held Products, Inc. | Wearable metrological apparatus |
US10393506B2 (en) | 2015-07-15 | 2019-08-27 | Hand Held Products, Inc. | Method for a mobile dimensioning device to use a dynamic accuracy compatible with NIST standard |
US10584962B2 (en) | 2018-05-01 | 2020-03-10 | Hand Held Products, Inc | System and method for validating physical-item security |
US10656617B2 (en) | 2014-07-16 | 2020-05-19 | Faro Technologies, Inc. | Measurement device for machining center |
US10775165B2 (en) | 2014-10-10 | 2020-09-15 | Hand Held Products, Inc. | Methods for improving the accuracy of dimensioning-system measurements |
US10909708B2 (en) | 2016-12-09 | 2021-02-02 | Hand Held Products, Inc. | Calibrating a dimensioner using ratios of measurable parameters of optically-perceptible geometric elements |
US11029762B2 (en) | 2015-07-16 | 2021-06-08 | Hand Held Products, Inc. | Adjusting dimensioning results using augmented reality |
US11047672B2 (en) | 2017-03-28 | 2021-06-29 | Hand Held Products, Inc. | System for optically dimensioning |
WO2022016873A1 (en) * | 2020-07-23 | 2022-01-27 | Zhejiang Hanchine Ai Tech. Co., Ltd. | Multi-line laser three-dimensional imaging method and system based on random lattice |
US11350077B2 (en) | 2018-07-03 | 2022-05-31 | Faro Technologies, Inc. | Handheld three dimensional scanner with an autoaperture |
WO2022207201A1 (en) * | 2021-03-29 | 2022-10-06 | Sony Semiconductor Solutions Corporation | Depth sensor device and method for operating a depth sensor device |
US11561088B2 (en) | 2018-06-07 | 2023-01-24 | Pibond Oy | Modeling the topography of a three-dimensional surface |
EP3315902B1 (en) * | 2016-10-27 | 2023-09-06 | Pepperl+Fuchs SE | Measuring device and method for triangulation measurement |
US11763473B2 (en) | 2020-07-23 | 2023-09-19 | Zhejiang Hanchine Ai Tech. Co., Ltd. | Multi-line laser three-dimensional imaging method and system based on random lattice |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6465682B2 (en) * | 2014-03-20 | 2019-02-06 | キヤノン株式会社 | Information processing apparatus, information processing method, and program |
JP6512912B2 (en) * | 2015-04-10 | 2019-05-15 | キヤノン株式会社 | Measuring device for measuring the shape of the object to be measured |
CN106712160B (en) * | 2015-07-30 | 2019-05-21 | 安徽啄木鸟无人机科技有限公司 | A kind of charging method of unmanned plane quick charging system |
WO2017095259A1 (en) * | 2015-12-04 | 2017-06-08 | Андрей Владимирович КЛИМОВ | Method for monitoring linear dimensions of three-dimensional entities |
KR102482062B1 (en) * | 2016-02-05 | 2022-12-28 | 주식회사바텍 | Dental three-dimensional scanner using color pattern |
CN106091930B (en) * | 2016-08-16 | 2019-01-11 | 郑州辰维科技股份有限公司 | A kind of real-time online measuring method based on double camera measuring system and structured light sensor |
WO2018049286A1 (en) * | 2016-09-09 | 2018-03-15 | Quality Vision International, Inc. | Articulated head with multiple sensors for measuring machine |
DE102016220127B4 (en) * | 2016-10-14 | 2020-09-03 | Carl Zeiss Industrielle Messtechnik Gmbh | Method for operating a coordinate measuring machine |
WO2019177066A1 (en) * | 2018-03-16 | 2019-09-19 | 日本電気株式会社 | Three-dimensional shape measuring device, three-dimensional shape measuring method, program, and recording medium |
FR3083605B1 (en) * | 2018-07-06 | 2020-09-18 | Hexagon Metrology Sas | MEASURING ARM WITH MULTIFUNCTIONAL END |
FR3083602B1 (en) * | 2018-07-06 | 2020-09-18 | Hexagon Metrology Sas | MEASURING ARM WITH MULTIFUNCTIONAL END |
WO2020136885A1 (en) * | 2018-12-28 | 2020-07-02 | ヤマハ発動機株式会社 | Three-dimensional measurement device and workpiece processing device |
US20220252392A1 (en) * | 2019-07-02 | 2022-08-11 | Nikon Corporation | Metrology for additive manufacturing |
CN113188450B (en) * | 2021-04-23 | 2023-03-14 | 封泽希 | Scene depth detection method and system based on structured light |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5402582A (en) | 1993-02-23 | 1995-04-04 | Faro Technologies Inc. | Three dimensional coordinate measuring apparatus |
US5611147A (en) | 1993-02-23 | 1997-03-18 | Faro Technologies, Inc. | Three dimensional coordinate measuring apparatus |
US20060017720A1 (en) * | 2004-07-15 | 2006-01-26 | Li You F | System and method for 3D measurement and surface reconstruction |
US20110164114A1 (en) * | 2010-01-06 | 2011-07-07 | Canon Kabushiki Kaisha | Three-dimensional measurement apparatus and control method therefor |
EP2400261A1 (en) * | 2010-06-21 | 2011-12-28 | Leica Geosystems AG | Optical measurement method and system for determining 3D coordination in a measuring object surface |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB0309662D0 (en) * | 2003-04-28 | 2003-06-04 | Crampton Stephen | Robot CMM arm |
GB2490631B (en) * | 2010-01-20 | 2016-11-02 | Faro Tech Inc | Portable articulated arm coordinate measuring machine with multi-bus arm technology |
2013
- 2013-05-20 GB GB1422105.5A patent/GB2517621A/en not_active Withdrawn
- 2013-05-20 WO PCT/US2013/041826 patent/WO2013184340A1/en active Application Filing
- 2013-05-20 CN CN201380029985.4A patent/CN104380033A/en active Pending
- 2013-05-20 JP JP2015516035A patent/JP5816773B2/en not_active Expired - Fee Related
- 2013-05-20 DE DE112013002824.7T patent/DE112013002824T5/en not_active Ceased
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5402582A (en) | 1993-02-23 | 1995-04-04 | Faro Technologies Inc. | Three dimensional coordinate measuring apparatus |
US5611147A (en) | 1993-02-23 | 1997-03-18 | Faro Technologies, Inc. | Three dimensional coordinate measuring apparatus |
US20060017720A1 (en) * | 2004-07-15 | 2006-01-26 | Li You F | System and method for 3D measurement and surface reconstruction |
US20110164114A1 (en) * | 2010-01-06 | 2011-07-07 | Canon Kabushiki Kaisha | Three-dimensional measurement apparatus and control method therefor |
EP2400261A1 (en) * | 2010-06-21 | 2011-12-28 | Leica Geosystems AG | Optical measurement method and system for determining 3D coordination in a measuring object surface |
Non-Patent Citations (2)
Title |
---|
GENG, JASON: "DLP-based structured light 3D imaging technologies and applications", Emerging Digital Micromirror Device Based Systems and Applications III, Proceedings of SPIE, vol. 7932, no. 1, 10 February 2011 (2011-02-10), pages 1-15, XP060006632, DOI: 10.1117/12.873125 * |
GENG, JASON: "DLP-Based Structured Light 3D Imaging Technologies and Applications", Proceedings of SPIE, vol. 7932 |
Cited By (98)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10845184B2 (en) | 2009-01-12 | 2020-11-24 | Intermec Ip Corporation | Semi-automatic dimensioning with imager on a portable device |
US10140724B2 (en) | 2009-01-12 | 2018-11-27 | Intermec Ip Corporation | Semi-automatic dimensioning with imager on a portable device |
US9879976B2 (en) | 2010-01-20 | 2018-01-30 | Faro Technologies, Inc. | Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features |
US10060722B2 (en) | 2010-01-20 | 2018-08-28 | Faro Technologies, Inc. | Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations |
US10281259B2 (en) | 2010-01-20 | 2019-05-07 | Faro Technologies, Inc. | Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features |
US9684078B2 (en) | 2010-05-10 | 2017-06-20 | Faro Technologies, Inc. | Method for optically scanning and measuring an environment |
US9779546B2 (en) | 2012-05-04 | 2017-10-03 | Intermec Ip Corp. | Volume dimensioning systems and methods |
US10467806B2 (en) | 2012-05-04 | 2019-11-05 | Intermec Ip Corp. | Volume dimensioning systems and methods |
US10007858B2 (en) | 2012-05-15 | 2018-06-26 | Honeywell International Inc. | Terminals and methods for dimensioning objects |
US10635922B2 (en) | 2012-05-15 | 2020-04-28 | Hand Held Products, Inc. | Terminals and methods for dimensioning objects |
US10805603B2 (en) | 2012-08-20 | 2020-10-13 | Intermec Ip Corp. | Volume dimensioning system calibration systems and methods |
US10321127B2 (en) | 2012-08-20 | 2019-06-11 | Intermec Ip Corp. | Volume dimensioning system calibration systems and methods |
US9939259B2 (en) | 2012-10-04 | 2018-04-10 | Hand Held Products, Inc. | Measuring object dimensions using mobile computer |
US10067231B2 (en) | 2012-10-05 | 2018-09-04 | Faro Technologies, Inc. | Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner |
US9746559B2 (en) | 2012-10-05 | 2017-08-29 | Faro Technologies, Inc. | Using two-dimensional camera images to speed registration of three-dimensional scans |
US10203413B2 (en) | 2012-10-05 | 2019-02-12 | Faro Technologies, Inc. | Using a two-dimensional scanner to speed registration of three-dimensional scan data |
US9739886B2 (en) | 2012-10-05 | 2017-08-22 | Faro Technologies, Inc. | Using a two-dimensional scanner to speed registration of three-dimensional scan data |
US11815600B2 (en) | 2012-10-05 | 2023-11-14 | Faro Technologies, Inc. | Using a two-dimensional scanner to speed registration of three-dimensional scan data |
US11112501B2 (en) | 2012-10-05 | 2021-09-07 | Faro Technologies, Inc. | Using a two-dimensional scanner to speed registration of three-dimensional scan data |
US9618620B2 (en) | 2012-10-05 | 2017-04-11 | Faro Technologies, Inc. | Using depth-camera images to speed registration of three-dimensional scans |
US9841311B2 (en) | 2012-10-16 | 2017-12-12 | Hand Held Products, Inc. | Dimensioning system |
US10908013B2 (en) | 2012-10-16 | 2021-02-02 | Hand Held Products, Inc. | Dimensioning system |
US9784566B2 (en) | 2013-03-13 | 2017-10-10 | Intermec Ip Corp. | Systems and methods for enhancing dimensioning |
US10203402B2 (en) | 2013-06-07 | 2019-02-12 | Hand Held Products, Inc. | Method of error correction for 3D imaging device |
US10228452B2 (en) | 2013-06-07 | 2019-03-12 | Hand Held Products, Inc. | Method of error correction for 3D imaging device |
US9769454B2 (en) | 2014-06-20 | 2017-09-19 | Stmicroelectronics S.R.L. | Method for generating a depth map, related system and computer program product |
US10656617B2 (en) | 2014-07-16 | 2020-05-19 | Faro Technologies, Inc. | Measurement device for machining center |
WO2016010640A1 (en) * | 2014-07-16 | 2016-01-21 | Faro Technologies, Inc. | Measurement device for machining center |
US9823059B2 (en) | 2014-08-06 | 2017-11-21 | Hand Held Products, Inc. | Dimensioning system with guided alignment |
US9976848B2 (en) | 2014-08-06 | 2018-05-22 | Hand Held Products, Inc. | Dimensioning system with guided alignment |
US10240914B2 (en) | 2014-08-06 | 2019-03-26 | Hand Held Products, Inc. | Dimensioning system with guided alignment |
WO2016020826A1 (en) * | 2014-08-07 | 2016-02-11 | Ingenera Sa | Method and relevant device for measuring distance with auto-calibration and temperature compensation |
US10088296B2 (en) | 2014-09-10 | 2018-10-02 | Faro Technologies, Inc. | Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device |
GB2545603B (en) * | 2014-09-10 | 2020-04-15 | Faro Tech Inc | A portable device for optically measuring three-dimensional coordinates |
WO2016039955A1 (en) * | 2014-09-10 | 2016-03-17 | Faro Technologies, Inc. | A portable device for optically measuring three-dimensional coordinates |
US9915521B2 (en) | 2014-09-10 | 2018-03-13 | Faro Technologies, Inc. | Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device |
US9879975B2 (en) | 2014-09-10 | 2018-01-30 | Faro Technologies, Inc. | Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device |
US9602811B2 (en) | 2014-09-10 | 2017-03-21 | Faro Technologies, Inc. | Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device |
US9693040B2 (en) | 2014-09-10 | 2017-06-27 | Faro Technologies, Inc. | Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device |
US9769463B2 (en) | 2014-09-10 | 2017-09-19 | Faro Technologies, Inc. | Device and method for optically scanning and measuring an environment and a method of control |
US9671221B2 (en) | 2014-09-10 | 2017-06-06 | Faro Technologies, Inc. | Portable device for optically measuring three-dimensional coordinates |
US10401143B2 (en) | 2014-09-10 | 2019-09-03 | Faro Technologies, Inc. | Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device |
US10070116B2 (en) | 2014-09-10 | 2018-09-04 | Faro Technologies, Inc. | Device and method for optically scanning and measuring an environment |
US10499040B2 (en) | 2014-09-10 | 2019-12-03 | Faro Technologies, Inc. | Device and method for optically scanning and measuring an environment and a method of control |
GB2545603A (en) * | 2014-09-10 | 2017-06-21 | Faro Tech Inc | A portable device for optically measuring three-dimensional coordinates |
CN107076551A (en) * | 2014-09-19 | 2017-08-18 | Hexagon Metrology, Inc. | Multi-mode portable coordinate measuring machine |
US11215442B2 (en) | 2014-09-19 | 2022-01-04 | Hexagon Metrology, Inc. | Multi-mode portable coordinate measuring machine |
US10663284B2 (en) | 2014-09-19 | 2020-05-26 | Hexagon Metrology, Inc. | Multi-mode portable coordinate measuring machine |
US10036627B2 (en) | 2014-09-19 | 2018-07-31 | Hexagon Metrology, Inc. | Multi-mode portable coordinate measuring machine |
WO2016044658A1 (en) * | 2014-09-19 | 2016-03-24 | Hexagon Metrology, Inc. | Multi-mode portable coordinate measuring machine |
US10402956B2 (en) | 2014-10-10 | 2019-09-03 | Hand Held Products, Inc. | Image-stitching for dimensioning |
US10775165B2 (en) | 2014-10-10 | 2020-09-15 | Hand Held Products, Inc. | Methods for improving the accuracy of dimensioning-system measurements |
US10121039B2 (en) | 2014-10-10 | 2018-11-06 | Hand Held Products, Inc. | Depth sensor based auto-focus system for an indicia scanner |
US10810715B2 (en) | 2014-10-10 | 2020-10-20 | Hand Held Products, Inc | System and method for picking validation |
US10134120B2 (en) | 2014-10-10 | 2018-11-20 | Hand Held Products, Inc. | Image-stitching for dimensioning |
US9779276B2 (en) | 2014-10-10 | 2017-10-03 | Hand Held Products, Inc. | Depth sensor based auto-focus system for an indicia scanner |
US10859375B2 (en) | 2014-10-10 | 2020-12-08 | Hand Held Products, Inc. | Methods for improving the accuracy of dimensioning-system measurements |
US9762793B2 (en) | 2014-10-21 | 2017-09-12 | Hand Held Products, Inc. | System and method for dimensioning |
US10060729B2 (en) | 2014-10-21 | 2018-08-28 | Hand Held Products, Inc. | Handheld dimensioner with data-quality indication |
US9752864B2 (en) | 2014-10-21 | 2017-09-05 | Hand Held Products, Inc. | Handheld dimensioning system with feedback |
US9897434B2 (en) | 2014-10-21 | 2018-02-20 | Hand Held Products, Inc. | Handheld dimensioning system with measurement-conformance feedback |
US10393508B2 (en) | 2014-10-21 | 2019-08-27 | Hand Held Products, Inc. | Handheld dimensioning system with measurement-conformance feedback |
US9826220B2 (en) | 2014-10-21 | 2017-11-21 | Hand Held Products, Inc. | Dimensioning system with feedback |
US10218964B2 (en) | 2014-10-21 | 2019-02-26 | Hand Held Products, Inc. | Dimensioning system with feedback |
DE102015205187A1 (en) * | 2015-03-23 | 2016-09-29 | Siemens Aktiengesellschaft | Method and device for the projection of line pattern sequences |
US11403887B2 (en) | 2015-05-19 | 2022-08-02 | Hand Held Products, Inc. | Evaluating image values |
US10593130B2 (en) | 2015-05-19 | 2020-03-17 | Hand Held Products, Inc. | Evaluating image values |
US9786101B2 (en) | 2015-05-19 | 2017-10-10 | Hand Held Products, Inc. | Evaluating image values |
US11906280B2 (en) | 2015-05-19 | 2024-02-20 | Hand Held Products, Inc. | Evaluating image values |
US10066982B2 (en) | 2015-06-16 | 2018-09-04 | Hand Held Products, Inc. | Calibrating a volume dimensioner |
US9857167B2 (en) | 2015-06-23 | 2018-01-02 | Hand Held Products, Inc. | Dual-projector three-dimensional scanner |
US10247547B2 (en) | 2015-06-23 | 2019-04-02 | Hand Held Products, Inc. | Optical pattern projector |
US9835486B2 (en) | 2015-07-07 | 2017-12-05 | Hand Held Products, Inc. | Mobile dimensioner apparatus for use in commerce |
US10612958B2 (en) | 2015-07-07 | 2020-04-07 | Hand Held Products, Inc. | Mobile dimensioner apparatus to mitigate unfair charging practices in commerce |
US10393506B2 (en) | 2015-07-15 | 2019-08-27 | Hand Held Products, Inc. | Method for a mobile dimensioning device to use a dynamic accuracy compatible with NIST standard |
US11353319B2 (en) | 2015-07-15 | 2022-06-07 | Hand Held Products, Inc. | Method for a mobile dimensioning device to use a dynamic accuracy compatible with NIST standard |
US11029762B2 (en) | 2015-07-16 | 2021-06-08 | Hand Held Products, Inc. | Adjusting dimensioning results using augmented reality |
US10249030B2 (en) | 2015-10-30 | 2019-04-02 | Hand Held Products, Inc. | Image transformation for indicia reading |
EP3171129A1 (en) * | 2015-11-19 | 2017-05-24 | Hand Held Products, Inc. | High resolution dot pattern |
US10225544B2 (en) | 2015-11-19 | 2019-03-05 | Hand Held Products, Inc. | High resolution dot pattern |
US10175037B2 (en) | 2015-12-27 | 2019-01-08 | Faro Technologies, Inc. | 3-D measuring device with battery pack |
JP2017126192A (en) * | 2016-01-14 | 2017-07-20 | セイコーエプソン株式会社 | Image recognition device, image recognition method and image recognition unit |
US10747227B2 (en) | 2016-01-27 | 2020-08-18 | Hand Held Products, Inc. | Vehicle positioning and object avoidance |
US10025314B2 (en) | 2016-01-27 | 2018-07-17 | Hand Held Products, Inc. | Vehicle positioning and object avoidance |
US10872214B2 (en) | 2016-06-03 | 2020-12-22 | Hand Held Products, Inc. | Wearable metrological apparatus |
US10339352B2 (en) | 2016-06-03 | 2019-07-02 | Hand Held Products, Inc. | Wearable metrological apparatus |
US9940721B2 (en) | 2016-06-10 | 2018-04-10 | Hand Held Products, Inc. | Scene change detection in a dimensioner |
US10417769B2 (en) | 2016-06-15 | 2019-09-17 | Hand Held Products, Inc. | Automatic mode switching in a volume dimensioner |
US10163216B2 (en) | 2016-06-15 | 2018-12-25 | Hand Held Products, Inc. | Automatic mode switching in a volume dimensioner |
EP3315902B1 (en) * | 2016-10-27 | 2023-09-06 | Pepperl+Fuchs SE | Measuring device and method for triangulation measurement |
US10909708B2 (en) | 2016-12-09 | 2021-02-02 | Hand Held Products, Inc. | Calibrating a dimensioner using ratios of measurable parameters of optically-perceptible geometric elements |
US11047672B2 (en) | 2017-03-28 | 2021-06-29 | Hand Held Products, Inc. | System for optically dimensioning |
US10584962B2 (en) | 2018-05-01 | 2020-03-10 | Hand Held Products, Inc | System and method for validating physical-item security |
US11561088B2 (en) | 2018-06-07 | 2023-01-24 | Pibond Oy | Modeling the topography of a three-dimensional surface |
US11350077B2 (en) | 2018-07-03 | 2022-05-31 | Faro Technologies, Inc. | Handheld three dimensional scanner with an autoaperture |
WO2022016873A1 (en) * | 2020-07-23 | 2022-01-27 | Zhejiang Hanchine Ai Tech. Co., Ltd. | Multi-line laser three-dimensional imaging method and system based on random lattice |
US11763473B2 (en) | 2020-07-23 | 2023-09-19 | Zhejiang Hanchine Ai Tech. Co., Ltd. | Multi-line laser three-dimensional imaging method and system based on random lattice |
WO2022207201A1 (en) * | 2021-03-29 | 2022-10-06 | Sony Semiconductor Solutions Corporation | Depth sensor device and method for operating a depth sensor device |
Also Published As
Publication number | Publication date |
---|---|
DE112013002824T5 (en) | 2015-04-02 |
JP5816773B2 (en) | 2015-11-18 |
CN104380033A (en) | 2015-02-25 |
JP2015524916A (en) | 2015-08-27 |
GB2517621A (en) | 2015-02-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10060722B2 (en) | Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations | |
US10281259B2 (en) | Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features | |
US8832954B2 (en) | Coordinate measurement machines with removable accessories | |
US9628775B2 (en) | Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations | |
JP5816773B2 (en) | Coordinate measuring machine with removable accessories | |
US11262194B2 (en) | Triangulation scanner with blue-light projector | |
US9228816B2 (en) | Method of determining a common coordinate system for an articulated arm coordinate measurement machine and a scanner | |
US9500469B2 (en) | Laser line probe having improved high dynamic range | |
US8875409B2 (en) | Coordinate measurement machines with removable accessories | |
US9909856B2 (en) | Dynamic range of a line scanner having a photosensitive array that provides variable exposure | |
US20130286196A1 (en) | Laser line probe that produces a line of light having a substantially even intensity distribution | |
US20140002608A1 (en) | Line scanner using a low coherence light source | |
WO2013188026A1 (en) | Coordinate measurement machines with removable accessories | |
EP3385661B1 (en) | Articulated arm coordinate measurement machine that uses a 2d camera to determine 3d coordinates of smoothly continuous edge features | |
WO2016044014A1 (en) | Articulated arm coordinate measurement machine having a 2d camera and method of obtaining 3d representations | |
WO2014109810A1 (en) | Laser line probe that produces a line of light having a substantially even intensity distribution |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13725881 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015516035 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1120130028247 Country of ref document: DE Ref document number: 112013002824 Country of ref document: DE |
|
ENP | Entry into the national phase |
Ref document number: 1422105 Country of ref document: GB Kind code of ref document: A Free format text: PCT FILING DATE = 20130520 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1422105.5 Country of ref document: GB |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13725881 Country of ref document: EP Kind code of ref document: A1 |