US20030218069A1 - Indicia sensor system for optical reader - Google Patents

Indicia sensor system for optical reader

Info

Publication number
US20030218069A1
US20030218069A1 (application US10/441,473; US44147303A)
Authority
US
United States
Prior art keywords
threshold
imaging data
reading device
optical reading
edge
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/441,473
Inventor
Timothy Meier
Robert Hussey
Charles Barber
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hand Held Products Inc
Original Assignee
Welch Allyn Data Collection Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Welch Allyn Data Collection Inc filed Critical Welch Allyn Data Collection Inc
Priority to US10/441,473
Publication of US20030218069A1
Status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10544Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K7/10821Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices
    • G06K7/10881Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices constructional details of hand-held scanners
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10544Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K7/10821Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices
    • G06K7/10881Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices constructional details of hand-held scanners
    • G06K7/109Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices constructional details of hand-held scanners adaptations to make the hand-held scanner useable as a fixed scanner
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K2207/00Other aspects
    • G06K2207/1018Source control

Abstract

In the present invention, the control unit of an optical reader analyzes image data being generated by the imaging element of the reader and changes the mode of operation of the reader if the image data indicates that machine readable indicia, such as a bar code symbol or a text character, is likely in the field of view of the reader. Normally, analysis of image data includes the step of detecting for edge transitions in the image information. If the control unit determines that the image data includes more than a predetermined number of edge transitions, then the control unit imparts appropriate control over various reader elements to change the mode of operation of the reader. Normally, the control unit changes the mode of operation of the reader from a first mode, wherein the reader does not operate to decode or recognize image data, to a second mode, wherein the reader operates to decode and/or recognize image data.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This is a continuation of U.S. patent application Ser. No. 09/432,282, filed on Nov. 2, 1999, the content of which is relied upon and incorporated herein by reference in its entirety, and the benefit of priority under 35 U.S.C. §120 is hereby claimed. [0001]
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0002]
  • The present invention relates to optical readers in general and particularly to an optical reader configured to change operating modes depending on characteristics of images in the reader's field of view. [0003]
  • 2. Background of the Prior Art [0004]
  • Prior art optical readers are sometimes configured to operate in a “continuous scan mode” so that bar code symbols and other indicia presented to the reader are automatically decoded or otherwise recognized without manually activating a control element such as a trigger to commence indicia recognizing activities. [0005]
  • A continuous scan operating configuration requires repetitive illumination flashing of an LED array in the case of an image sensor based optical reader and repetitive laser scanning in the case of a laser scan engine based optical reader. Repetitive flashing illumination or laser scanning requires a high level of energy consumption and can result in premature component degradation. Furthermore, the repetitive illumination or laser scanning has been observed to be highly distracting to users of such optical readers configured to continuously scan image data. [0006]
  • U.S. Pat. No. 5,550,366 describes a system for automatically activating image scanning in a portable bar code reader when the presence of a bar code in a target area is detected. However, the detection of a bar code in the target area is carried out on a periodic basis and requires, for the detection, activation of a high radiance source of illumination. Accordingly, the system is not responsive in real time to an object being moved into the field of view of the reader, and the high radiance illumination required for operation of the system remains a source of distraction. [0007]
  • There is a need for an optical reader which is configured to automatically and in real time decode or otherwise recognize machine readable indicia that is presented to the reader without manual activation of a control element to commence recognition operations. [0008]
  • SUMMARY OF THE INVENTION
  • According to its major aspects and broadly stated, the invention is a method for operating an optical reader so that control of the reader depends on image information being generated by the optical imaging element, such as an image sensor, of the reader. [0009]
  • In one embodiment, the reader control unit analyzes image information being generated by the imaging element and changes the mode of operation of the reader if the image information indicates that machine readable indicia, such as a bar code symbol or a text character, is in the field of view of the reader. Normally, analysis of image data includes the step of detecting for edges, or edge transitions, in the image information. If the control unit determines that the image information includes more than a predetermined number of edge transitions, then the control unit imparts appropriate control over various reader elements to change the mode of operation of the reader. [0010]
  • When the control unit determines that machine readable indicia is in the field of view of the imaging element, then the control unit may change the mode of operation of the reader from a first mode, wherein the reader does not operate to decode or otherwise recognize machine readable indicia, to a second mode, wherein the reader actively attempts to decode or otherwise recognize machine readable indicia. The second mode may be characterized, for example, by an increased illumination of the field of view, and/or by the activation or enhancement of decoding algorithms being operated to process captured image data and/or by the activation or enhancement of optical character recognition (OCR) algorithms being operated to process captured image data. [0011]
  • The method may be utilized with any type of optical reader, including a basic hand held bar code reader, a multi functional data collection unit having a keyboard, display, and imaging element, a scan stand optical reader, or a fixed mount optical reader mounted to generate image information corresponding to articles which are manually or automatically moved across a point of transaction. [0012]
  • The method may be utilized with an optical reader to supplement or replace the function normally provided by a trigger switch. In most hand held optical readers a trigger switch is manually depressed to commence decoding or recognition operations of the reader. An optical reader programmed in accordance with the invention may commence decoding and/or recognition operations automatically upon the detection of machine readable indicia in the field of view of the reader without a trigger being depressed. [0013]
  • These and other details, advantages and benefits of the present invention will become apparent from the detailed description of the preferred embodiment herein below.[0014]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The preferred embodiment of the invention will now be described, by way of example only, with reference to the accompanying Figures wherein like members bear like reference numerals and wherein: [0015]
  • FIG. 1 is a block electrical diagram of an exemplary optical reading device in which the invention may be incorporated; [0016]
  • FIGS. 2A-2H show perspective views of exemplary optical readers in which the invention may be incorporated; [0017]
  • FIG. 2I shows an example optical reader of the type in which the invention may be incorporated stationed in a scan stand; [0018]
  • FIG. 3 is a flow diagram illustrating operations which may be performed by an optical reading device during execution of the invention; [0019]
  • FIG. 4 is a pixel diagram corresponding to a single bar symbol illustrating one possible variation of the edge detection method of the invention; [0020]
  • FIG. 5 is a pixel diagram corresponding to a multiple bar symbol illustrating a second possible variation of an edge detection method of the invention; [0021]
  • FIG. 6 is a pixel diagram corresponding to a substantially uniform white sheet of paper illustrating a third possible variation of an edge detection method of the invention.[0022]
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • A block diagram illustrating one type of optical reading device in which the invention may be incorporated is described with reference to FIG. 1. [0023]
  • Optical reader 10 includes an illumination assembly 20 for illuminating a target object T, such as a 1D or 2D bar code symbol, and an imaging assembly 30 for receiving an image of object T and generating an electrical output signal indicative of the data optically encoded therein. Illumination assembly 20 may, for example, include an illumination source assembly 22, such as one or more LEDs, together with an illuminating optics assembly 24, such as one or more reflectors, for directing light from light source 22 in the direction of target object T. Illumination assembly 20 may be eliminated if ambient light levels are certain to be high enough to allow high quality images of object T to be taken. Imaging assembly 30 may include an image sensor 32, such as a 1D or 2D CCD, CMOS, NMOS, PMOS, CID or CMD solid state image sensor, together with an imaging optics assembly 34 for receiving and focusing an image of object T onto image sensor 32. The array-based imaging assembly shown in FIG. 1 may be replaced by a laser scanning based imaging assembly comprising a laser source, a scanning mechanism, emit and receive optics, a photodetector and accompanying signal processing circuitry. [0024]
  • Optical reader 10 of FIG. 1 also includes programmable control unit 40 which preferably comprises an integrated circuit microprocessor 42 and an application specific integrated circuit or ASIC 44. Processor 42 and ASIC 44 are both programmable control devices which are able to receive, output and process data in accordance with a program stored in memory unit 45, which may comprise such memory elements as a read/write random access memory or RAM 46 and an erasable read only memory or EROM 47. RAM 46 typically includes at least one volatile memory device but may include one or more long term non-volatile memory devices. Processor 42 and ASIC 44 are also both connected to a common bus 48 through which program data and working data, including address data, may be received and transmitted in either direction to any circuitry that is also connected thereto. Processor 42 and ASIC 44 differ from one another, however, in how they are made and how they are used. [0025]
  • More particularly, processor 42 is preferably a general purpose, off-the-shelf VLSI integrated circuit microprocessor which has overall control of the circuitry of FIG. 1, but which devotes most of its time to decoding image data stored in RAM 46 in accordance with program data stored in EROM 47. Processor 44, on the other hand, is preferably a special purpose VLSI integrated circuit, such as a programmable logic or gate array, which is programmed to devote its time to functions other than decoding image data, and thereby relieve processor 42 from the burden of performing these functions. [0026]
  • The actual division of labor between processors 42 and 44 will naturally depend on the type of off-the-shelf microprocessors that are available, the type of image sensor which is used, the rate at which image data is output by imaging assembly 30, etc. There is nothing in principle, however, that requires that any particular division of labor be made between processors 42 and 44, or even that such a division be made at all. This is because special purpose processor 44 may be eliminated entirely if general purpose processor 42 is fast enough and powerful enough to perform all of the functions contemplated by the present invention. It will, therefore, be understood that neither the number of processors used, nor the division of labor therebetween, is of any fundamental significance for purposes of the present invention. [0027]
  • With processor architectures of the type shown in FIG. 1, a typical division of labor between processors 42 and 44 will be as follows. Processor 42 is preferably devoted primarily to the tasks of decoding image data once such data has been stored in RAM 46, handling menuing options and reprogramming functions, processing commands and data received from control/data input unit 39, which may comprise such elements as trigger 74 and keyboard 78, and providing overall system level coordination. Processor 44 is preferably devoted primarily to controlling the image acquisition process, the A/D conversion process and the storage of image data, including the ability to access memories 46 and 47 via a DMA channel. Processor 44 may also perform many timing and communication operations. Processor 44 may, for example, control the illumination of LEDs 22, the timing of image sensor 32 and an analog-to-digital (A/D) converter 36, the transmission and reception of data to and from a processor external to reader 10 through an RS-232, network, serial bus such as USB, or other compatible I/O interface 37, and the outputting of user perceptible data via an output device 38, such as a beeper, a good read LED and/or a display monitor which may be provided by a liquid crystal display such as display 82. In the alternative, given that off-the-shelf microprocessors having built-in serial interfaces and display controllers are now available, it may be convenient to configure processor 42 to control output, display and I/O functions. Control of output, display and I/O functions may also be shared between processors 42 and 44, as suggested by bus driver I/O and output/display devices 37″ and 38′, or may be duplicated, as suggested by microprocessor serial I/O ports 42A and 42B and I/O and display devices 37″ and 38′. As explained earlier, the specifics of this division of labor are of no significance to the present invention. [0028]
  • FIGS. 2A through 2H show examples of types of housings in which the present invention may be incorporated. FIGS. 2A and 2B show a 1D optical reader 10-1, while FIGS. 2C-2H show 2D optical readers 10-2, 10-3, 10-4. Housing 12 of each of the optical readers 10-1 through 10-4 is adapted to be graspable by a human hand and has incorporated therein at least one trigger switch 74 for activating image capture and decoding and/or image capture and character recognition operations. Readers 10-1, 10-2, 10-3 include hard-wired communication links 78 for communication with external devices such as other data collection devices or a host processor, while reader 10-4 includes an antenna 80 for providing wireless communication with an external device such as another data collection device or a host processor. [0029]
  • In addition to the above elements, readers 10-3 and 10-4 each include a display 82 for displaying information to a user and a keyboard 78 for enabling a user to input commands and data into the reader. [0030]
  • Any one of the readers described with reference to FIGS. 2A through 2H may be mounted in a stationary position as is illustrated in FIG. 2I showing a generic optical reader 10 docked in a scan stand 90. Scan stand 90 adapts portable optical reader 10 for presentation mode scanning. In a presentation mode, reader 10 is held in a stationary position and an indicia bearing article is moved across the field of view of reader 10. [0031]
  • As will become clear from the ensuing description, the invention need not be incorporated in a portable optical reader. The invention may also be incorporated, for example, in association with a control unit for controlling a non-portable fixed mount imaging assembly that captures image data representing image information formed on articles transported by an assembly line, or manually transported across a checkout counter at a retail point of sale location. [0032]
  • Now referring to particular aspects of the invention, a high level flow diagram illustrating operation of an optical reader configured to operate in accordance with the invention is shown in FIG. 3. At block 102, control unit 40 analyzes image information that is generated by the reader's imaging assembly. At block 104, control unit 40 determines if a machine readable indicia is likely represented in the image information, and if unit 40 at block 104 determines that a machine readable indicia is likely contained in the image information, then the unit at block 106 changes the mode of operation of the reading device. [0033]
  • Preferred implementations of each of these steps will now be described in detail. While the analysis of image information (block 102) could be carried out by processing of analog image signals produced by sensor array 32, or by a photodetector in the case that the imaging assembly is laser based, the analysis of image information is preferably carried out by processing of pixel image data captured by control unit 40. In the control system of FIG. 1, control unit 40 captures image data by repeatedly reading the output from A/D converter 36 and writing these data into RAM 46. In the case of a 2D imaging assembly, pixel data corresponding to an entire field of view of the imaging assembly is typically referred to as a “frame” of image data, or a bit map. [0034]
  • Control unit 40 may detect for the presence of machine readable indicia in captured image data by detecting for edge transitions or edges in the image data. An edge of an image is an area of contrast between a darker indicia and a lighter indicia. A plain uniformly reflecting substrate can be expected to have substantially no human recognizable edge transitions. A substrate having a bar code symbol formed thereon, however, can be expected to have several edge transitions because each interface between a space and a dark area of a symbol constitutes an edge. Substrates having machine readable text characters formed thereon can also be expected to have several edge transitions. In one implementation of the invention, control unit 40 determines that a frame of image data captured by an imaging system is likely to contain machine recognizable indicia if the scene contains more than a predetermined number of edge transitions, as sketched below. [0035]
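As an illustration of the edge-counting step described above, the following minimal sketch counts threshold crossings in a single line of gray scale pixel data and flags likely indicia when the count exceeds a predetermined number. The function names, the 8-bit pixel scale, and the particular threshold and edge-count values are illustrative assumptions, not taken from the patent.

```python
def count_edge_transitions(line, threshold):
    """Count how many times a line of pixel values crosses a fixed threshold."""
    crossings = 0
    above = line[0] > threshold
    for value in line[1:]:
        now_above = value > threshold
        if now_above != above:   # the pixel data crossed the threshold
            crossings += 1
            above = now_above
    return crossings

def likely_contains_indicia(line, threshold=128, min_edges=8):
    """Declare likely machine readable indicia if the edge count exceeds a preset number."""
    return count_edge_transitions(line, threshold) > min_edges

# Example: a single dark bar on a light substrate yields two edge transitions.
# count_edge_transitions([200, 200, 40, 40, 200], 128) -> 2
```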
  • The preferred number of predetermined edge transitions that is selected to indicate the likely presence of machine readable indicia in a captured frame of image data may vary within a wide range (from about 3 to 50 or more) depending on such factors as the characteristics of the machine readable indicia which are to be subject to image capture, and on characteristics of the image capturing process. The selection of a relatively small number of edge transitions (such as between about 5 and 15) as the predetermined threshold number of edges indicating the likely presence of machine readable indicia is useful in the case that a reading device according to the invention is configured to detect for the presence of bar code symbols having 50 or more edge transitions formed on a substrate that is moving relative to the reading device during image capture. Selecting a number of edge transitions substantially less than the actual number of edge transitions expected to be found in a still captured image aids in the detection of machine readable indicia in captured images that are blurred as a result of a substrate and/or reader being moved during the image capture process. [0036]
  • In an alternative implementation of the invention, control unit 40 determines that a captured scene likely contains machine recognizable indicia if the number of edge transitions represented in captured frames of image data changes by more than a predetermined amount over the course of one or more consecutively captured frames. Such an implementation is useful, for example, where control unit 40 is employed to capture images from scenes having backgrounds known to have a high number of edges (wood grain surfaces, for example). In one specific example of this type of implementation, control unit 40 can be configured to determine that a first frame is not likely to contain machine recognizable indicia if the frame has edge transitions numbering within an “equilibrium” range number of edge transitions and to determine that a next frame is likely to contain machine recognizable indicia if the next frame contains a number of edge transitions that differs from that of the previous frame by a predetermined amount. [0037]
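A sketch of this frame-to-frame variant follows; it flags a likely indicia only when the edge count departs from the previous frame's count by more than a fixed amount. The class name and the delta value are assumptions made for illustration.

```python
class EdgeCountMonitor:
    """Track edge counts across consecutive frames and flag large changes."""

    def __init__(self, delta=10):
        self.delta = delta        # change in edge count treated as significant
        self.previous = None

    def update(self, edge_count):
        """Return True when the count changes by more than `delta` since the last frame."""
        changed = (self.previous is not None
                   and abs(edge_count - self.previous) > self.delta)
        self.previous = edge_count
        return changed
```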
  • Control unit 40 may detect edge transitions in captured image data in a variety of ways. In one method for edge detection, control unit 40 analyzes a line of image data to determine if a captured image includes an edge or edges. Control unit 40 may detect edges in a captured image by establishing at least one threshold in a line of image data and recording an edge detection each time the line of image data crosses the threshold. In one embodiment of the invention, the at least one threshold may be established based on the average pixel value or on a function of the average pixel value of the line of image data. [0038]
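If the threshold is derived from the line itself, one simple choice consistent with the passage above is the mean pixel value; treating the "function of the average pixel value" as the plain mean is an assumption for this sketch.

```python
def average_based_threshold(line):
    """Use the mean pixel value of the line as the edge-detection threshold."""
    return sum(line) / len(line)

# Usage with the counting helper sketched earlier:
# edges = count_edge_transitions(line, average_based_threshold(line))
```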
  • If control unit 40 captures image data from a 1×N 1D image sensor, then the line of image or pixel data analyzed by control unit 40 comprises image data generated from the row of pixels of the linear pixel array. If control unit 40 captures image data from a 1D laser scanning assembly, then the line of image data analyzed by control unit 40 comprises image data corresponding to a line sweep of a laser scanner. If control unit 40 captures image data from a 2D image sensor, then the line of image data analyzed by control unit 40 may comprise any line grouping of pixels from the initial captured bit map. The line of pixel values analyzed by control unit 40 may comprise, for example, pixel data one or more pixels wide corresponding to a vertical, horizontal, or diagonal linear row of pixels from the sensor array. The line of pixels need not be linear, however. For example, the line of pixels analyzed by control unit 40 may comprise an arcuate or jagged grouping of pixels from a captured bit map. Furthermore, control unit 40 need not analyze every pixel from a selected line. For example, it may be beneficial to ignore pixels (such as every other pixel) in a given line in the interest of increasing processing speed. The line of pixel data analyzed by control unit 40 normally comprises pixel data captured in an initial bit map. It will be understood, however, that a pixel value of a line of image data in accordance with the invention may not be an actual pixel value from an initial bit map, but a representative pixel value determined, for example, based on the values of a grouping of positionally related pixels of an initial bit map. [0039]
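The hypothetical helpers below sketch how lines of pixel data might be pulled from a captured 2D bit map (represented here as a list of pixel rows); the step parameter models skipping pixels, such as every other pixel, to reduce processing time. None of these names come from the patent.

```python
def horizontal_line(bitmap, row, step=1):
    """A horizontal line of pixels, optionally skipping pixels for speed."""
    return bitmap[row][::step]

def vertical_line(bitmap, col, step=1):
    """A vertical line of pixels taken down one column of the bit map."""
    return [r[col] for r in bitmap[::step]]

def diagonal_left(bitmap, step=1):
    """A diagonal line running from the top-left toward the bottom-right."""
    n = min(len(bitmap), len(bitmap[0]))
    return [bitmap[i][i] for i in range(0, n, step)]

def diagonal_right(bitmap, step=1):
    """A diagonal line running from the top-right toward the bottom-left."""
    n = min(len(bitmap), len(bitmap[0]))
    return [bitmap[i][len(bitmap[0]) - 1 - i] for i in range(0, n, step)]
```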
  • FIG. 4 shows pixel data 108 that corresponds to a scene having a single bar symbol. The regions of higher pixel intensity 110 and 112 correspond to the white space, while the region of lower pixel intensity 114 corresponds to the bar. It is seen from threshold 116 that a single threshold may successfully aid in the detection of edge transitions of a scene. If edges are detected based on pixel data crossings of threshold 116, then threshold 116 will result in the recording of two edge transitions 118 and 120, which is the correct number for a scene having a single bar. However, threshold 122 illustrates that if a threshold lies in a range of pixel values about which the pixel values may fluctuate due to noise, detecting for edges using a single threshold may yield erroneous detections of edge transitions. [0040]
  • Threshold 122 also illustrates other potential problems which may arise with use of a constant valued threshold. In the case that a scene is illuminated non-uniformly, or if indicia is formed on a substrate non-uniformly, use of a constant threshold for determining edge transitions can yield inconsistent edge detections of image data corresponding to similar indicia. It is seen that although region 110 and region 112 both correspond to a white substrate, they are illuminated slightly non-uniformly, and therefore application of constant threshold 122 would result in edges being detected in region 110 but not in region 112. [0041]
  • Accordingly, in view of the potential problems involved with the use of a single, constant threshold, it may be beneficial to detect for edges in a row of pixel image data utilizing a plurality of “adaptive” thresholds. FIG. 5 illustrates a thresholding edge detection method utilizing two “adaptive” thresholds, an adaptive maximum threshold 130 and an adaptive minimum threshold 132. [0042]
  • With an adaptive threshold, the threshold at any one pixel may be a function of the values of pixels in proximity to that pixel. Although the adaptive threshold method increases the processing load and computation time, use of an adaptive threshold or thresholds enables an edge detection method, according to the invention, to accurately record edge transitions in the case that there is non-uniform illumination of a scene or in the case that dark regions of machine readable indicia are formed on a substrate non-uniformly. In the specific example of FIG. 5, the maximum threshold 130 is established at a predetermined percentage (such as an 80 percent level) of a tracking line, which tracks local maximum points of the row of pixel data 138 with a provision that results in local maximum points being bypassed if a local maximum point is significantly lower than neighboring local maximum points. The minimum threshold 132, meanwhile, is established at a predetermined percentage (such as 20 percent) above a minimum tracking line, which is established by tracking local minimum points of the row of pixel data 138 with the provision that results in local minimum points that are significantly higher than neighboring local minimum points being bypassed. Edge transitions in the example of FIG. 5 are recorded when the pixel data 138 falls below the minimum threshold 132, e.g., point 134, and when the pixel data rises above maximum threshold 130, e.g., point 136. [0043]
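The sketch below approximates the two-threshold adaptive method: a sliding-window running maximum and minimum stand in for the local-extremum tracking lines (the rule for bypassing outlying local extrema is omitted), the maximum threshold sits at 80 percent of the running maximum, and the minimum threshold sits 20 percent of the local range above the running minimum. The window size and the exact placement of the minimum threshold are assumptions, not taken from the patent.

```python
def adaptive_thresholds(line, window=8, max_frac=0.80, min_lift=0.20):
    """Per-pixel maximum and minimum thresholds derived from nearby pixel values."""
    highs, lows = [], []
    for i in range(len(line)):
        start, stop = max(0, i - window), min(len(line), i + window + 1)
        local_max, local_min = max(line[start:stop]), min(line[start:stop])
        highs.append(max_frac * local_max)
        lows.append(local_min + min_lift * (local_max - local_min))
    return highs, lows

def count_adaptive_edges(line, window=8):
    """Record an edge when the data rises above the maximum threshold or falls below the minimum."""
    highs, lows = adaptive_thresholds(line, window)
    edges, state = 0, None        # state becomes 'high' or 'low' after the first excursion
    for value, hi, lo in zip(line, highs, lows):
        if value > hi:
            if state == 'low':
                edges += 1        # rose above the maximum threshold (e.g., point 136)
            state = 'high'
        elif value < lo:
            if state == 'high':
                edges += 1        # fell below the minimum threshold (e.g., point 134)
            state = 'low'
    return edges
```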
  • FIG. 4 shows pixel data corresponding to a single bar in a scene, FIG. 5 shows pixel data corresponding to a machine readable symbol, while FIG. 6 shows pixel data corresponding to a white sheet of paper. In the example of FIG. 4, corresponding to a single bar, the edge detection method utilizing threshold 116 will record two edges, which according to the invention is not normally a sufficient number of edges to constitute the detection of a machine readable indicia. Normally, the reader is programmed so that a substantial number of edges (normally at least three) are required to constitute the detection of a machine readable indicia. In the example of FIG. 5, corresponding to a machine readable indicia, application of an edge detection method results in sixteen (16) edges being detected. This is normally sufficient to constitute the detection of a machine readable indicia. [0044]
  • In the edge detection methods described thus far with reference to FIGS. 4 and 5, the detection of edges depends only on whether there is detectable fluctuation of pixel image data and not on the magnitude of the fluctuation. Accordingly, as the edge detection method has been described thus far, application of the method may result in edges being detected from image data corresponding to a substantially uniform gray scale scene. It can be observed from the example of FIG. 6, illustrating a row 140 of pixel data corresponding to a substantially uniform white substrate, that use of a single constant threshold for detecting edges, or use of maximum and minimum thresholds as described in connection with the example of FIG. 5, would result in several edge transitions being detected in the pixel data. [0045]
  • To the end that application of the method does not result in edges being detected on a substrate having substantially uniform gray scale images thereon, a pixel variance measurement step may be executed. In a pixel variance measurement step, the pixel values may be analyzed to determine a measure of pixel variance, such as the difference between the maximum and minimum pixel value, or the difference between the average local minimum value and the average local maximum value. If the pixel variance measurement value does not exceed a predetermined value, then it is determined that the scene is one of a substantially uniform reflectance, and either processing ends or edges that are detected are ignored. [0046]
  • In the example illustrated in FIG. 6, the step of pixel variance measurement is substituted for by a specialized minimum threshold establishing step. Specifically, in the example shown in FIG. 6, minimum threshold 142 is established according to a rule which precludes minimum threshold 142 from being established within a predetermined value from the value of maximum threshold 144 at any given pixel location. Using this threshold establishing step, it is seen that the pixel data 140 never falls below minimum threshold 142, and that therefore no edges are detected. This is the correct result for a substantially uniform gray scale image in which substantially no edges are humanly recognizable. [0047]
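Both safeguards can be expressed compactly, as in the hedged sketch below: a variance check that rejects lines whose pixel spread is too small, and a threshold-separation rule that keeps the minimum threshold a fixed distance below the maximum. The numeric values are illustrative only.

```python
def pixel_variance_ok(line, min_spread=30):
    """Reject a line of substantially uniform reflectance (too little pixel variance)."""
    return (max(line) - min(line)) > min_spread

def separated_min_threshold(max_threshold, provisional_min, min_separation=30):
    """Never allow the minimum threshold within `min_separation` of the maximum threshold."""
    return min(provisional_min, max_threshold - min_separation)
```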
  • It has been mentioned that control unit 40, according to the method of the invention, normally analyzes a line of image data at a given time. In the case of a 1D image sensor, there is often only one pixel row from which a line of image data can be determined. However, a 2D image sensor comprises several rows of pixels from which numerous lines of image data can be selected. In a preferred implementation of the invention in a reader having a 2D image sensor, control unit 40 applies several lines of image data determined from an initial bit map to an edge detection method in succession. In one embodiment of the invention, control unit 40 applies lines of image data corresponding to vertical, horizontal, diagonal left, and diagonal right pixel rows to an edge detection method in succession. In another embodiment, control unit 40 applies parallel lines of image data determined from the array to an edge detection method in succession. If application of an edge detection method to any one of the lines of image data yields the detection of an image having a plurality of edge transitions before all lines of image data determined from a given frame are applied to the method, then program control may immediately jump to step 106 to change the mode of operation of the reader without analyzing the remaining lines of image data. [0048]
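Combining the hypothetical helpers sketched above, a multi-line test with early exit might look as follows; the particular sampling lines chosen and the edge-count threshold are assumptions.

```python
def frame_has_indicia(bitmap, min_edges=8, window=8):
    """Test several sampling lines and stop as soon as one shows enough edges."""
    mid_row, mid_col = len(bitmap) // 2, len(bitmap[0]) // 2
    for line in (horizontal_line(bitmap, mid_row),
                 vertical_line(bitmap, mid_col),
                 diagonal_left(bitmap),
                 diagonal_right(bitmap)):
        if not pixel_variance_ok(line):
            continue                            # uniform line: ignore any apparent edges
        if count_adaptive_edges(line, window) > min_edges:
            return True                         # early exit: proceed directly to the mode change
    return False
```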
  • Referring again to the flow diagram of FIG. 3, it is seen that control unit 40 proceeds to block 106 to change the mode of operation of the reader when the control unit determines that a machine readable indicia is in the field of view. [0049]
  • Normally, this change in the reader operating mode will comprise a change in the operation of the reader from a first mode, wherein the reader does not have the capability to recognize machine readable indicia, to a second mode, wherein the reader actively attempts to recognize machine readable indicia. The second mode of operation may be characterized, for example, by the activation of illumination array 22, and/or by activation of a bar code decoding algorithm and/or an optical character recognition algorithm. When a bar code decoding algorithm is activated, control unit 40 may process a frame of image data to locate image data pertaining to a symbology, determine the identity of that symbology, and apply further decoding steps to the image data corresponding to that symbology to determine the message encoded by the symbol. The second mode of operation may also be characterized by the activation of an optical character recognition (OCR) algorithm. When an optical character recognition algorithm is activated, control unit 40 processes image data from a frame to determine the identity of any text characters or other machine readable image data which may be represented in the frame. If, in the second mode of operation, the reader is programmed either to decode a symbol or to perform OCR on other machine readable indicia, then control unit 40 should be configured to execute certain processing steps to determine whether the detected machine readable image data corresponds to a symbol or whether the image data corresponds to non-symbol indicia. Since such processing steps, and the specific steps of various decoding and optical character recognition algorithms, are well known, they will not be discussed further herein. [0050]
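The overall flow of FIG. 3 can be summarized by the control-loop sketch below, which reuses the frame test sketched earlier. The reader API shown (capture_frame, set_illumination, decode, report) is hypothetical, and decode versus OCR dispatch is collapsed into a single decode call for brevity.

```python
def run_reader(reader, min_edges=8):
    """Idle in the first mode; switch to the second mode when indicia appear likely."""
    while True:
        frame = reader.capture_frame()              # block 102: analyze image information
        if frame_has_indicia(frame, min_edges):     # block 104: indicia likely present?
            reader.set_illumination(True)           # block 106: enter the second mode
            result = reader.decode(frame)           # bar code decoding and/or OCR
            if result is not None:
                reader.report(result)
            reader.set_illumination(False)          # return to the first mode
```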
  • While this invention has been described in detail with reference to a preferred embodiment, it should be appreciated that the present invention is not limited to that precise embodiment. Rather, in view of the present disclosure which describes the best mode for practicing the invention, many modifications and variations would present themselves to those skilled in the art without departing from the scope and spirit of this invention, as defined in the following claims.[0051]

Claims (43)

What is claimed is:
1. A method for operating an optical reading device, the optical reading device including an imaging assembly and a control system, the method including the steps of:
capturing a frame of imaging data using the imaging assembly;
applying at least one adaptive threshold to the imaging data to obtain edge transition data;
analyzing the edge transition data to detect the presence of machine readable indicia; and
changing a mode of operation of the device from a first mode of operation to at least one second mode of operation, if the step of analyzing indicates that the imaging data represents machine readable indicia.
2. The method of claim 1, wherein the step of applying further comprises:
obtaining a line of the imaging data;
applying the at least one adaptive threshold for the line of imaging data; and
recording an edge transition when said image information crosses the at least one adaptive threshold.
3. The method of claim 2, wherein the at least one adaptive threshold includes two adaptive thresholds.
4. The method of claim 3, wherein the two adaptive thresholds includes a maximum adaptive threshold and a minimum adaptive threshold.
5. The method of claim 4, wherein the edge transition data indicates an edge transition when the line of imaging data crosses both the maximum adaptive threshold and the minimum adaptive threshold.
6. The method of claim 4, wherein the maximum adaptive threshold is approximately 80% of the line.
7. The method of claim 4, wherein the minimum adaptive threshold is approximately 20% of the line.
8. The method of claim 1, wherein the at least one adaptive threshold is applied on a pixel-by-pixel basis, such that the adaptive threshold for a pixel is a function of pixel values proximate the pixel.
9. The method of claim 1, wherein the step of applying further comprises the step of performing a pixel variance measurement.
10. The method of claim 9, wherein the pixel variance measurement further comprises:
calculating a difference between a maximum pixel value and a minimum pixel value; and
comparing the difference to a predetermined value to determine whether the imaging data represents an edge transition.
11. The method of claim 10, wherein an edge transition is detected if the difference is greater than the predetermined value, and an edge transition is not detected if the difference is less than or equal to the predetermined value.
12. The method of claim 9, wherein the pixel variance measurement further comprises:
calculating a difference between an average local maximum pixel value and an average local minimum pixel value; and
comparing the difference to a predetermined value to determine whether the imaging data represents an edge transition.
13. The method of claim 12, wherein an edge transition is detected if the difference is greater than the predetermined value, and an edge transition is not detected if the difference is less than or equal to the predetermined value.
14. The method of claim 12, wherein the average local maximum pixel value and the average local minimum pixel value are pixel values taken from a line of imaging data.
15. The method of claim 9, wherein the imaging data represents a gray scale image if the difference is less than or equal to the predetermined value.
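A minimal sketch of the pixel variance test in claims 9-15, assuming the simple maximum-minus-minimum form of claim 10; the contrast threshold value is a placeholder, since the claims only call it a predetermined value:

    MIN_CONTRAST = 40   # assumed placeholder for the "predetermined value"

    def has_edge_contrast(line, min_contrast=MIN_CONTRAST):
        """True if the line varies enough to contain edge transitions."""
        variance = max(line) - min(line)
        return variance > min_contrast   # <= threshold: treat as uniform gray scale

    flat_line    = [120, 122, 119, 121, 120]
    barcode_line = [30, 210, 35, 205, 32]
    print(has_edge_contrast(flat_line))     # False: gray scale image, no edges counted
    print(has_edge_contrast(barcode_line))  # True: candidate edge transitions present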
16. The method of claim 1, further comprising the step of recording detected edge transitions when a pixel value crosses the at least one adaptive threshold.
17. The method of claim 16, wherein the step of applying includes the step of setting a maximum threshold and a minimum threshold, and wherein the recording step includes the step of counting an edge transition if the pixel value rises above said maximum threshold or falls below said minimum threshold.
18. The method of claim 17, wherein the maximum threshold is separated from the minimum threshold by a predetermined value.
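Claims 16-18 can be pictured as a hysteresis-style counter: an edge is recorded only when the signal moves from below the minimum threshold to above the maximum threshold, or vice versa, with the two thresholds held a fixed distance apart. The sketch and its threshold values are assumptions, not the claimed embodiment:

    def count_edge_transitions(line, max_threshold, min_threshold):
        """Count crossings of a max/min threshold pair along one line."""
        count = 0
        state = None                     # "high", "low", or not yet known
        for value in line:
            if value > max_threshold and state != "high":
                if state == "low":
                    count += 1           # low-to-high transition
                state = "high"
            elif value < min_threshold and state != "low":
                if state == "high":
                    count += 1           # high-to-low transition
                state = "low"
        return count

    line = [30, 210, 35, 205, 32, 208]
    print(count_edge_transitions(line, max_threshold=170, min_threshold=65))  # -> 5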
19. The method of claim 1, wherein the optical reading device is not configured to decode imaging data in the first mode, but is configured to read optical indicia in the at least one second mode.
20. The method of claim 19, wherein the optical indicia includes bar code indicia.
21. The method of claim 20, wherein the bar code indicia includes a linear bar code and/or a two-dimensional bar code.
22. The method of claim 19, wherein the optical indicia is read using optical character recognition.
23. The method of claim 1, wherein the optical reading device is configured to provide increased illumination in the at least one second mode.
24. The method of claim 1, wherein the step of changing further comprises:
determining a number of edge transitions in the edge transition data; and
changing the mode if the number of edge transitions exceeds a predetermined amount.
25. The method of claim 1, wherein the step of changing further comprises:
determining a number of edge transitions in the edge transition data; and
changing the mode if the number of edge transitions changes by a predetermined amount.
26. The method of claim 1, wherein the step of analyzing further comprises:
sequentially analyzing a plurality of lines of imaging data; and
terminating the step of analyzing if one of the plurality of lines indicates the presence of edge transitions in excess of a predetermined amount.
27. The method of claim 1, wherein the step of analyzing includes the step of sequentially analyzing vertical, horizontal, and diagonal rows of image information.
28. The method of claim 1, wherein the step of analyzing includes the step of sequentially analyzing parallel lines of image information.
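In the spirit of claims 24-28, a frame might be examined one sampled line at a time (horizontal, vertical, or diagonal), stopping as soon as any line shows enough edge transitions; the trigger count below is an assumed placeholder for the claimed predetermined amount:

    EDGE_COUNT_TRIGGER = 8   # assumed value for the "predetermined amount"

    def frame_contains_indicia(frame_lines, count_edges, trigger=EDGE_COUNT_TRIGGER):
        """frame_lines: lines sampled from the frame in any scan pattern
        (horizontal, vertical, diagonal); count_edges: a callable that returns
        the number of edge transitions found on a single line."""
        for line in frame_lines:
            if count_edges(line) > trigger:
                return True              # terminate the analysis early
        return False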
29. An optical imaging device for reading machine readable indicia disposed on an object, the device comprising:
an imaging assembly configured to capture a frame of imaging data corresponding to an image of the object; and
a control system coupled to the imaging assembly, the control system being configured to,
apply at least one adaptive threshold to the imaging data to thereby detect edge transitions in the imaging data,
analyze the edge transitions to determine whether the imaging data includes machine readable indicia, and
decode the machine readable indicia if the step of analyzing indicates the presence of machine readable indicia.
30. The optical imaging device of claim 29, wherein the control system is further configured to change a mode of operation of the optical reader if the step of analyzing indicates the presence of machine readable indicia.
31. The optical reading device of claim 29, wherein the control system is further configured to:
analyze at least one line of image information from the captured frame; and
change a mode of operation of the optical reading device if the number of edge transitions exceeds a predetermined amount.
32. The optical reading device of claim 31, wherein the control system is configured to sequentially analyze a plurality of the lines of image information in the captured frame.
33. The optical reading device of claim 29, wherein the control system sequentially analyzes vertical, horizontal, and diagonal rows of image information.
34. The optical reading device of claim 29, wherein the control system is further configured to:
analyze at least one line of imaging data; and
record the edge transitions of the at least one line of imaging data.
35. The optical reading device of claim 34, wherein the at least one adaptive threshold includes a maximum threshold and a minimum threshold, the control system being configured to count an edge transition if a portion of the at least one line exceeds the maximum threshold or falls below the minimum threshold.
36. The optical reading device of claim 35, wherein a value of the maximum threshold and a value of the minimum threshold differ by a predetermined amount.
37. The optical reading device of claim 29, wherein the control system is further configured to:
analyze at least one line of image information from the captured frame; and
change the mode of operation of the optical reading device if the number of the edge transitions changes by a predetermined amount.
38. The optical reading device of claim 36, wherein the control system is further configured to sequentially analyze a plurality of lines of the captured frame.
39. The optical reading device of claim 36, wherein the control system is further configured to terminate detection of the edge transitions when the number of the edge transitions changes by a predetermined amount.
40. The optical reading device of claim 29, wherein the control system is configured to sequentially analyze vertical, horizontal, and diagonal rows of image information.
41. The optical reading device of claim 29, wherein the control system is further configured to analyze at least one line of imaging data and record the edge transitions based on crossings of the at least one adaptive threshold by the line of imaging data.
42. The optical reading device of claim 41, wherein the at least one adaptive threshold includes a maximum threshold and a minimum threshold, and the control system is configured to record an edge transition if a portion of the line of imaging data exceeds the maximum threshold or falls below the minimum threshold.
43. The optical reading device of claim 42, wherein the maximum threshold and the minimum threshold are separated by a predetermined range of values such that edge transitions are not recorded from imaging data corresponding to a substantially uniform gray scale image.
US10/441,473 1999-11-02 2003-05-20 Indicia sensor system for optical reader Abandoned US20030218069A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/441,473 US20030218069A1 (en) 1999-11-02 2003-05-20 Indicia sensor system for optical reader

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/432,282 US6585159B1 (en) 1999-11-02 1999-11-02 Indicia sensor system for optical reader
US10/441,473 US20030218069A1 (en) 1999-11-02 2003-05-20 Indicia sensor system for optical reader

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/432,282 Continuation US6585159B1 (en) 1999-11-02 1999-11-02 Indicia sensor system for optical reader

Publications (1)

Publication Number Publication Date
US20030218069A1 true US20030218069A1 (en) 2003-11-27

Family

ID=23715498

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/432,282 Expired - Lifetime US6585159B1 (en) 1999-11-02 1999-11-02 Indicia sensor system for optical reader
US10/441,473 Abandoned US20030218069A1 (en) 1999-11-02 2003-05-20 Indicia sensor system for optical reader

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/432,282 Expired - Lifetime US6585159B1 (en) 1999-11-02 1999-11-02 Indicia sensor system for optical reader

Country Status (1)

Country Link
US (2) US6585159B1 (en)

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040069855A1 (en) * 2002-10-15 2004-04-15 Mehul Patel Imaging bar code reader with moving beam simulation
WO2005124657A1 (en) 2004-06-18 2005-12-29 Valtion Teknillinen Tutkimuskeskus Method for detecting a code with the aid of a mobile station
US20080029602A1 (en) * 2006-08-03 2008-02-07 Nokia Corporation Method, Apparatus, and Computer Program Product for Providing a Camera Barcode Reader
US20080142602A1 (en) * 2000-11-24 2008-06-19 Knowles C Harry Laser illumination beam generation system employing despeckling of the laser beam using high-frequency modulation of the laser diode current and optical multiplexing of the component laser beams
US20080300062A1 (en) * 2004-06-04 2008-12-04 Mattel, Inc. Electronic Device for Enhancing an Interactive Experience with a Tangible Medium of Expression
US20090132386A1 (en) * 2006-03-23 2009-05-21 Jari Natunen Integrated system, device and use thereof
US20090127326A1 (en) * 2007-11-20 2009-05-21 Datalogic Scanning, Inc. Enhanced virtual scan line processing
US20090272812A1 (en) * 2007-03-16 2009-11-05 Marty William A Systems, devices, and methods for reading machine-readable characters and human-readable characters
US7651028B2 (en) 2000-11-24 2010-01-26 Metrologic Instruments, Inc. Intelligent system for automatically recognizing objects at a point of sale (POS) station by omni-directional imaging of the objects using a complex of coplanar illumination and imaging subsystems
US7654461B2 (en) 2003-11-13 2010-02-02 Metrologic Instruments, Inc, Automatically-triggered digital video imaging based code symbol reading system employing illumination and imaging subsystems controlled in response to real-time image quality analysis
US7681799B2 (en) 2003-11-13 2010-03-23 Metrologic Instruments, Inc. Method of reading code symbols using a digital image capturing and processing system employing a micro-computing platform with an event-driven multi-tier software architecture
US20100074920A1 (en) * 2006-10-26 2010-03-25 Glykos Finland Oy Peptide vaccine for influenza virus
US7708205B2 (en) 2003-11-13 2010-05-04 Metrologic Instruments, Inc. Digital image capture and processing system employing multi-layer software-based system architecture permitting modification and/or extension of system features and functions by way of third party code plug-ins
US20100177363A1 (en) * 2009-01-09 2010-07-15 Datalogic Scanning, Inc. Prioritized virtual scan line processing
US7841533B2 (en) 2003-11-13 2010-11-30 Metrologic Instruments, Inc. Method of capturing and processing digital images of an object within the field of view (FOV) of a hand-supportable digitial image capture and processing system
US20110026762A1 (en) * 2009-07-31 2011-02-03 Seiko Epson Corporation Marker processing method, marker processing device, marker, object having a marker, and marker processing program
US8002173B2 (en) 2006-07-11 2011-08-23 Intermec Ip Corp. Automatic data collection device, method and article
US8120461B2 (en) 2006-04-03 2012-02-21 Intermec Ip Corp. Automatic data collection device, method and article
US8199689B2 (en) 2005-09-21 2012-06-12 Intermec Ip Corp. Stochastic communication protocol method and system for radio frequency identification (RFID) tags based on coalition formation, such as for tag-to-tag communication
USD723563S1 (en) * 2012-12-21 2015-03-03 Datalogic Ip Tech S.R.L. Reader of coded information
US9710878B2 (en) * 2014-12-17 2017-07-18 Microsoft Technology Licensing, LLC Low power DMA labeling
US10181175B2 (en) 2014-12-17 2019-01-15 Microsoft Technology Licensing, Llc Low power DMA snoop and skip
DE102018209032A1 (en) * 2018-06-07 2019-12-12 Continental Reifen Deutschland Gmbh Apparatus and method for automatically determining the position of the bar code of a tire or green tire and a corresponding computer program, computer program product and a computer-readable data carrier
US10949634B2 (en) 2005-06-03 2021-03-16 Hand Held Products, Inc. Apparatus having hybrid monochrome and color image sensor array
US11317050B2 (en) 2005-03-11 2022-04-26 Hand Held Products, Inc. Image reader comprising CMOS based image sensor array

Families Citing this family (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6631842B1 (en) 2000-06-07 2003-10-14 Metrologic Instruments, Inc. Method of and system for producing images of objects using planar laser illumination beams and image detection arrays
US7387253B1 (en) 1996-09-03 2008-06-17 Hand Held Products, Inc. Optical reader system comprising local host processor and optical reader
US7304670B1 (en) 1997-03-28 2007-12-04 Hand Held Products, Inc. Method and apparatus for compensating for fixed pattern noise in an imaging system
US6777891B2 (en) * 1997-08-26 2004-08-17 Color Kinetics, Incorporated Methods and apparatus for controlling devices in a networked lighting system
US7584893B2 (en) 1998-03-24 2009-09-08 Metrologic Instruments, Inc. Tunnel-type digital imaging system for use within retail shopping environments such as supermarkets
US6585159B1 (en) * 1999-11-02 2003-07-01 Welch Allyn Data Collection, Inc. Indicia sensor system for optical reader
JP4110258B2 (en) * 2000-07-07 2008-07-02 日立オムロンターミナルソリューションズ株式会社 Form handling device
US7954719B2 (en) 2000-11-24 2011-06-07 Metrologic Instruments, Inc. Tunnel-type digital imaging-based self-checkout system for use in retail point-of-sale environments
US7268924B2 (en) 2001-01-22 2007-09-11 Hand Held Products, Inc. Optical reader having reduced parameter determination delay
EP2249284B1 (en) 2001-01-22 2014-03-05 Hand Held Products, Inc. Optical reader having partial frame operating mode
US7331523B2 (en) 2001-07-13 2008-02-19 Hand Held Products, Inc. Adaptive optical image reader
US7494064B2 (en) * 2001-12-28 2009-02-24 Symbol Technologies, Inc. ASIC for supporting multiple functions of a portable data collection device
US7748620B2 (en) 2002-01-11 2010-07-06 Hand Held Products, Inc. Transaction terminal including imaging module
US20030222147A1 (en) 2002-06-04 2003-12-04 Hand Held Products, Inc. Optical reader having a plurality of imaging modules
US8596542B2 (en) 2002-06-04 2013-12-03 Hand Held Products, Inc. Apparatus operative for capture of image data
US7086596B2 (en) 2003-01-09 2006-08-08 Hand Held Products, Inc. Decoder board for an optical reader utilizing a plurality of imaging formats
US7219843B2 (en) * 2002-06-04 2007-05-22 Hand Held Products, Inc. Optical reader having a plurality of imaging modules
CA2453766A1 (en) * 2003-01-02 2004-07-02 Societe Des Loteries Video Du Quebec, Inc. Bar code reader stand
US7637430B2 (en) * 2003-05-12 2009-12-29 Hand Held Products, Inc. Picture taking optical reader
CA2438951A1 (en) * 2003-08-29 2005-02-28 Bob Richards Feeder system and method
US20050150959A1 (en) * 2004-01-09 2005-07-14 John Izzo Optical reader
EP2420954B8 (en) * 2004-12-01 2017-04-12 Datalogic USA, Inc. Data reader with automatic exposure adjustment and methods of operating a data reader
US7237721B2 (en) * 2005-05-24 2007-07-03 Nokia Corporation Image processing for pattern detection
US7686216B2 (en) 2006-06-13 2010-03-30 Hand Held Products, Inc. Method and apparatus for uniquely associating a bar code reading terminal to a cash register in a retail store network
KR100818988B1 (en) * 2006-09-05 2008-04-04 삼성전자주식회사 Method and apparatus for processing image signal
US7775431B2 (en) * 2007-01-17 2010-08-17 Metrologic Instruments, Inc. Method of and apparatus for shipping, tracking and delivering a shipment of packages employing the capture of shipping document images and recognition-processing thereof initiated from the point of shipment pickup and completed while the shipment is being transported to its first scanning point to facilitate early customs clearance processing and shorten the delivery time of packages to point of destination
US7852519B2 (en) 2007-02-05 2010-12-14 Hand Held Products, Inc. Dual-tasking decoder for improved symbol reading
US8496177B2 (en) 2007-06-28 2013-07-30 Hand Held Products, Inc. Bar code reading terminal with video capturing mode
US8570393B2 (en) 2007-11-30 2013-10-29 Cognex Corporation System and method for processing image data relative to a focus of attention within the overall image
US8596541B2 (en) * 2008-02-22 2013-12-03 Qualcomm Incorporated Image capture device with integrated barcode scanning
US8366004B2 (en) * 2008-02-22 2013-02-05 Qualcomm Incorporated Barcode detection based on morphological operations
US8628015B2 (en) 2008-10-31 2014-01-14 Hand Held Products, Inc. Indicia reading terminal including frame quality evaluation processing
US9189670B2 (en) * 2009-02-11 2015-11-17 Cognex Corporation System and method for capturing and detecting symbology features and parameters
US8587595B2 (en) 2009-10-01 2013-11-19 Hand Held Products, Inc. Low power multi-core decoder system and method
US8561903B2 (en) 2011-01-31 2013-10-22 Hand Held Products, Inc. System operative to adaptively select an image sensor for decodable indicia reading
US8608071B2 (en) 2011-10-17 2013-12-17 Honeywell Scanning And Mobility Optical indicia reading terminal with two image sensors
JP6108213B2 (en) * 2013-01-31 2017-04-05 ブラザー工業株式会社 Reader
USD719574S1 (en) * 2014-01-09 2014-12-16 Datalogic Ip Tech S.R.L. Portable terminal


Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5288976A (en) * 1991-07-15 1994-02-22 Nynex Corporation Bar code use in information, transactional and other system and service applications
US5818023A (en) * 1996-03-05 1998-10-06 Metanetics Corporation Portable ID card verification apparatus
US6073849A (en) * 1996-11-01 2000-06-13 Psc Scanning, Inc. Electronic edge detection system using a second derivative signal processor
US6005683A (en) * 1997-12-05 1999-12-21 Hewlett-Packard Company Document edge detection by linear image sensor
US6176429B1 (en) * 1998-07-17 2001-01-23 Psc Scanning, Inc. Optical reader with selectable processing characteristics for reading data in multiple formats
US6206287B1 (en) * 1998-08-21 2001-03-27 Eastman Kodak Company Film optical image bar code reader

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5748804A (en) * 1992-05-14 1998-05-05 United Parcel Service Of America, Inc. Method and apparatus for processing images with symbols with dense edges
US5268580A (en) * 1992-09-02 1993-12-07 Ncr Corporation Bar code enhancement system and method for vision scanners
US5550366A (en) * 1994-06-20 1996-08-27 Roustaei; Alexander Optical scanner with automatic activation
US6236735B1 (en) * 1995-04-10 2001-05-22 United Parcel Service Of America, Inc. Two camera system for locating and storing indicia on conveyed items
US6064763A (en) * 1996-07-26 2000-05-16 Intermec Ip Corporation Time-efficient method of analyzing imaged input data to locate two-dimensional machine-readable symbols or other linear images therein
US6366696B1 (en) * 1996-12-20 2002-04-02 Ncr Corporation Visual bar code recognition method
US5902987A (en) * 1997-02-20 1999-05-11 Intermec Ip Corporation Apparatus and method of rapidly locating edges of machine-readable symbols or other linear images
US6298175B1 (en) * 1997-10-17 2001-10-02 Welch Allyn Data Collection, Inc. Object sensor system comprising controlled light source
US6585159B1 (en) * 1999-11-02 2003-07-01 Welch Allyn Data Collection, Inc. Indicia sensor system for optical reader

Cited By (98)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7806335B2 (en) 2000-11-24 2010-10-05 Metrologic Instruments, Inc. Digital image capturing and processing system for automatically recognizing objects in a POS environment
US7731091B2 (en) 2000-11-24 2010-06-08 Metrologic Instruments, Inc. Digital image capturing and processing system employing automatic object detection and spectral-mixing based illumination techniques
US7762465B2 (en) 2000-11-24 2010-07-27 Metrologic Instruments, Inc. Device for optically multiplexing a laser beam
US8042740B2 (en) 2000-11-24 2011-10-25 Metrologic Instruments, Inc. Method of reading bar code symbols on objects at a point-of-sale station by passing said objects through a complex of stationary coplanar illumination and imaging planes projected into a 3D imaging volume
US7651028B2 (en) 2000-11-24 2010-01-26 Metrologic Instruments, Inc. Intelligent system for automatically recognizing objects at a point of sale (POS) station by omni-directional imaging of the objects using a complex of coplanar illumination and imaging subsystems
US20080142602A1 (en) * 2000-11-24 2008-06-19 Knowles C Harry Laser illumination beam generation system employing despeckling of the laser beam using high-frequency modulation of the laser diode current and optical multiplexing of the component laser beams
US7905413B2 (en) 2000-11-24 2011-03-15 Metrologic Instruments, Inc. Digital image capturing and processing system employing a plurality of coplanar illumination and imaging subsystems for digitally imaging objects in a 3D imaging volume, and a globally-deployed object motion detection subsystem for automatically detecting and analyzing the motion of objects passing through said 3-D imaging volume
US7878407B2 (en) 2000-11-24 2011-02-01 Metrologic Instruments, Inc. POS-based digital image capturing and processing system employing automatic object motion detection and spectral-mixing based illumination techniques
US7819326B2 (en) 2000-11-24 2010-10-26 Metrologic Instruments, Inc. Network of digital image capturing systems installed at retail POS-based stations and serviced by a remote image processing server in communication therewith
US7806336B2 (en) 2000-11-24 2010-10-05 Metrologic Instruments, Inc. Laser beam generation system employing a laser diode and high-frequency modulation circuitry mounted on a flexible circuit
US8172141B2 (en) 2000-11-24 2012-05-08 Metrologic Instruments, Inc. Laser beam despeckling devices
US7770796B2 (en) 2000-11-24 2010-08-10 Metrologic Instruments, Inc. Device for producing a laser beam of reduced coherency using high-frequency modulation of the laser diode current and optical multiplexing of the output laser beam
US7815113B2 (en) 2000-11-24 2010-10-19 Metrologic Instruments, Inc. Method of and system for returning a consumer product in a retail environment so as to prevent or reduce employee theft, as well as provide greater accountability for returned merchandise in retail store environments
US7658330B2 (en) 2000-11-24 2010-02-09 Metrologic Instruments, Inc. Automatic POS-based digital image capturing and processing system employing object motion controlled area-type illumination and imaging operations
US7661597B2 (en) 2000-11-24 2010-02-16 Metrologic Instruments, Inc. Coplanar laser illumination and imaging subsystem employing spectral-mixing and despeckling of laser illumination
US7661595B2 (en) 2000-11-24 2010-02-16 Metrologic Instruments, Inc. Digital image capturing and processing system employing a plurality of area-type illuminating and imaging stations projecting a plurality of coextensive area-type illumination and imaging zones into a 3D imaging volume, and controlling operations therewithin using
US7665665B2 (en) 2000-11-24 2010-02-23 Metrologic Instruments, Inc. Digital illumination and imaging subsystem employing despeckling mechanism employing high-frequency modulation of laser diode drive current and optical beam multiplexing techniques
US7673802B2 (en) 2000-11-24 2010-03-09 Metrologic Instruments, Inc. Automatic POS-based digital image capturing and processing system employing a plurality of area-type illumination and imaging zones intersecting within the 3D imaging volume of the system
US7793841B2 (en) 2000-11-24 2010-09-14 Metrologic Instruments, Inc. Laser illumination beam generation system employing despeckling of the laser beam using high-frequency modulation of the laser diode current and optical multiplexing of the component laser beams
US7784695B2 (en) 2000-11-24 2010-08-31 Metrologic Instruments, Inc. Planar laser illumination module (PLIM) employing high-frequency modulation (HFM) of the laser drive currents and optical multplexing of the output laser beams
US7775436B2 (en) 2000-11-24 2010-08-17 Metrologic Instruments, Inc. Method of driving a plurality of visible and invisible LEDs so as to produce an illumination beam having a dynamically managed ratio of visible to invisible (IR) spectral energy/power during object illumination and imaging operations
US6866198B2 (en) * 2002-10-15 2005-03-15 Symbol Technologies, Inc. Imaging bar code reader with moving beam simulation
US20040069855A1 (en) * 2002-10-15 2004-04-15 Mehul Patel Imaging bar code reader with moving beam simulation
US7922089B2 (en) 2003-11-13 2011-04-12 Metrologic Instruments, Inc. Hand-supportable digital image capture and processing system employing automatic object presence detection to control automatic generation of a linear targeting illumination beam within the field of view (FOV), and manual trigger switching to initiate illumination
US8047438B2 (en) 2003-11-13 2011-11-01 Metrologic Instruments, Inc. Digital image capture and processing system employing an image formation and detection subsystem having an area-type image detection array supporting periodic occurrance of snap-shot type image acquisition cycles at a high-repetition rate during object illumination
US7735737B2 (en) 2003-11-13 2010-06-15 Metrologic Instruments, Inc. Automatically-triggered digital video-imaging based code symbol reading system supporting ambient illumination mode automatically selected by adaptive control process
US7712666B2 (en) 2003-11-13 2010-05-11 Metrologic Instruments, Inc. Automatically-triggered digital video-imaging based code symbol reading system supporting dynamically controlled object illumination and digital video-imaging operations
US7770798B2 (en) 2003-11-13 2010-08-10 Metrologic Instruments, Inc. Automatically-triggered digital video-imaging based code symbol reading system for use in a point-of-sale (POS) environment
US7708205B2 (en) 2003-11-13 2010-05-04 Metrologic Instruments, Inc. Digital image capture and processing system employing multi-layer software-based system architecture permitting modification and/or extension of system features and functions by way of third party code plug-ins
US9785811B2 (en) 2003-11-13 2017-10-10 Metrologic Instruments, Inc. Image capture and processing system supporting a multi-tier modular software architecture
US7789309B2 (en) 2003-11-13 2010-09-07 Metrologic Instruments, Inc. Automatic digital video-imaging based code symbol reading system employing illumination and imaging subsystems controlled within a control loop maintained as long as a code symbol has not been successfully read and the object is detected in the field of view of the system
US7681799B2 (en) 2003-11-13 2010-03-23 Metrologic Instruments, Inc. Method of reading code symbols using a digital image capturing and processing system employing a micro-computing platform with an event-driven multi-tier software architecture
US7654461B2 (en) 2003-11-13 2010-02-02 Metrologic Instruments, Inc, Automatically-triggered digital video imaging based code symbol reading system employing illumination and imaging subsystems controlled in response to real-time image quality analysis
US9355288B2 (en) 2003-11-13 2016-05-31 Metrologic Instruments, Inc. Image capture and processing system supporting a multi-tier modular software architecture
US9104930B2 (en) 2003-11-13 2015-08-11 Metrologic Instruments, Inc. Code symbol reading system
US7815121B2 (en) 2003-11-13 2010-10-19 Metrologic Instruments, Inc. Method of modifying and/or extending the standard features and functions of a digital image capture and processing system
US8844822B2 (en) 2003-11-13 2014-09-30 Metrologic Instruments, Inc. Image capture and processing system supporting a multi-tier modular software architecture
US7841533B2 (en) 2003-11-13 2010-11-30 Metrologic Instruments, Inc. Method of capturing and processing digital images of an object within the field of view (FOV) of a hand-supportable digitial image capture and processing system
US7845559B2 (en) 2003-11-13 2010-12-07 Metrologic Instruments, Inc. Hand-supportable digital image capture and processing system employing visible targeting illumination beam projected from an array of visible light sources on the rear surface of a printed circuit (PC) board having a light transmission aperture, and reflected off multiple folding mirrors and projected through the light transmission aperture into a central portion of the field of view of said system
US7845561B2 (en) 2003-11-13 2010-12-07 Metrologic Instruments, Inc. Digital image capture and processing system supporting a periodic snapshot mode of operation wherein during each image acquisition cycle, the rows of image detection elements in the image detection array are exposed simultaneously to illumination
US7845563B2 (en) 2003-11-13 2010-12-07 Metrologic Instruments, Inc. Digital image capture and processing system employing an illumination subassembly mounted about a light transmission aperture, and a field of view folding mirror disposed beneath the light transmission aperture
US7854384B2 (en) 2003-11-13 2010-12-21 Metrologic Instruments, Inc. Digital image capture and processing engine employing optical waveguide technology for collecting and guiding LED-based illumination during object illumination and image capture modes of operation
US8479992B2 (en) 2003-11-13 2013-07-09 Metrologic Instruments, Inc. Optical code symbol reading system employing an acoustic-waveguide structure for coupling sonic energy, produced from an electro-transducer, to sound wave ports formed in the system housing
US7861936B2 (en) 2003-11-13 2011-01-04 Metrologic Instruments, Inc. digital image capturing and processing system allowing third-parties to extend the features and functions of said system, and modify the standard behavior thereof without permanently modifying the standard features and functions thereof
US8366005B2 (en) 2003-11-13 2013-02-05 Metrologic Instruments, Inc. Hand-supportable digital image capture and processing system supporting a multi-tier modular software architecture
US8317105B2 (en) 2003-11-13 2012-11-27 Metrologic Instruments, Inc. Optical scanning system having an extended programming mode and method of unlocking restricted extended classes of features and functionalities embodied therewithin
US7900839B2 (en) 2003-11-13 2011-03-08 Metrologic Instruments, Inc. Hand-supportable digital image capture and processing system having a printed circuit board with a light transmission aperture, through which the field of view (FOV) of the image detection array and visible targeting illumination beam are projected using a FOV-folding mirror
US8157174B2 (en) 2003-11-13 2012-04-17 Metrologic Instruments, Inc. Digital image capture and processing system employing an image formation and detection system having an area-type image detection array supporting single snap-shot and periodic snap-shot modes of image acquisition during object illumination and imaging operations
US8157175B2 (en) 2003-11-13 2012-04-17 Metrologic Instruments, Inc. Digital image capture and processing system supporting a presentation mode of system operation which employs a combination of video and snapshot modes of image detection array operation during a single cycle of system operation
US8132731B2 (en) 2003-11-13 2012-03-13 Metrologic Instruments, Inc. Digital image capture and processing system having a printed circuit (PC) board with a light transmission aperture, wherein an image detection array is mounted on the rear side of said PC board, and a linear array of light emitting diodes (LEDS) is mounted on the front surface of said PC board, and aligned with an illumination-focusing lens structure integrated within said imaging window
US7950583B2 (en) 2003-11-13 2011-05-31 Metrologic Instruments, Inc Automatic digital video imaging based code symbol reading system employing an automatic object motion controlled illumination subsystem
US7967209B2 (en) 2003-11-13 2011-06-28 Metrologic Instruments, Inc. Method of blocking a portion of illumination rays generated by a countertop-supported digital imaging system, and preventing illumination rays from striking the eyes of the system operator or nearby consumers during operation of said countertop-supported digital image capture and processing system installed at a retail point of sale (POS) station
US7980471B2 (en) 2003-11-13 2011-07-19 Metrologic Instruments, Inc. Method of unlocking restricted extended classes of features and functionalities embodied within a digital image capture and processing system by reading feature/functionality-unlocking type code symbols
US7988053B2 (en) 2003-11-13 2011-08-02 Metrologic Instruments, Inc. Digital image capture and processing system employing an image formation and detection subsystem having image formation optics providing a field of view (FOV) on an area-type image detection array, and a multi-mode illumination subsystem having near and far field LED-based illumination arrays for illuminating near and far field portions of said FOV
US8100331B2 (en) 2003-11-13 2012-01-24 Metrologic Instruments, Inc. Digital image capture and processing system having a printed circuit (PC) board with light transmission aperture, wherein first and second field of view (FOV) folding mirrors project the FOV of a digital image detection array on the rear surface of said PC board, through said light transmission aperture
US7997489B2 (en) 2003-11-13 2011-08-16 Metrologic Instruments, Inc. Countertop-based digital image capture and processing system having an illumination subsystem employing a single array of LEDs disposed behind an illumination focusing lens structure integrated within the imaging window, for generating a field of visible illumination highly confined below the field
US8087588B2 (en) 2003-11-13 2012-01-03 Metrologic Instruments, Inc. Digital image capture and processing system having a single printed circuit (PC) board with a light transmission aperture, wherein a first linear array of visible light emitting diodes (LEDs) are mounted on the rear side of the PC board for producing a linear targeting illumination beam, and wherein a second linear array of visible LEDs are mounted on the front side of said PC board for producing a field of visible illumination within the field of view (FOV) of the system
US8052057B2 (en) 2003-11-13 2011-11-08 Metrologic Instruments, Inc. Method of programming the system configuration parameters of a digital image capture and processing system during the implementation of its communication interface with a host system without reading programming-type bar code symbols
US8011585B2 (en) 2003-11-13 2011-09-06 Metrologic Instruments, Inc. Digital image capture and processing system employing a linear LED-based illumination array mounted behind an illumination-focusing lens component integrated within the imaging window of the system
US8038538B2 (en) 2004-06-04 2011-10-18 Mattel, Inc. Electronic device for enhancing an interactive experience with a tangible medium of expression
US20080300062A1 (en) * 2004-06-04 2008-12-04 Mattel, Inc. Electronic Device for Enhancing an Interactive Experience with a Tangible Medium of Expression
US20070183652A1 (en) * 2004-06-18 2007-08-09 Valtion Teknillinen Tutkimuskeskus Method for detecting a code with the aid of a mobile station
WO2005124657A1 (en) 2004-06-18 2005-12-29 Valtion Teknillinen Tutkimuskeskus Method for detecting a code with the aid of a mobile station
US11863897B2 (en) 2005-03-11 2024-01-02 Hand Held Products, Inc. Image reader comprising CMOS based image sensor array
US11323649B2 (en) 2005-03-11 2022-05-03 Hand Held Products, Inc. Image reader comprising CMOS based image sensor array
US11317050B2 (en) 2005-03-11 2022-04-26 Hand Held Products, Inc. Image reader comprising CMOS based image sensor array
US11323650B2 (en) 2005-03-11 2022-05-03 Hand Held Products, Inc. Image reader comprising CMOS based image sensor array
US11625550B2 (en) 2005-06-03 2023-04-11 Hand Held Products, Inc. Apparatus having hybrid monochrome and color image sensor array
US10949634B2 (en) 2005-06-03 2021-03-16 Hand Held Products, Inc. Apparatus having hybrid monochrome and color image sensor array
US11238251B2 (en) 2005-06-03 2022-02-01 Hand Held Products, Inc. Apparatus having hybrid monochrome and color image sensor array
US11238252B2 (en) 2005-06-03 2022-02-01 Hand Held Products, Inc. Apparatus having hybrid monochrome and color image sensor array
US11604933B2 (en) 2005-06-03 2023-03-14 Hand Held Products, Inc. Apparatus having hybrid monochrome and color image sensor array
US8488510B2 (en) 2005-09-21 2013-07-16 Intermec Ip Corp. Stochastic communication protocol method and system for radio frequency identification (RFID) tags based on coalition formation, such as for tag-to-tag communication
US8199689B2 (en) 2005-09-21 2012-06-12 Intermec Ip Corp. Stochastic communication protocol method and system for radio frequency identification (RFID) tags based on coalition formation, such as for tag-to-tag communication
US8010413B2 (en) 2006-03-23 2011-08-30 Jari Natunen Method, system, and medium for calculating an emissions allowance
US20090132386A1 (en) * 2006-03-23 2009-05-21 Jari Natunen Integrated system, device and use thereof
US8120461B2 (en) 2006-04-03 2012-02-21 Intermec Ip Corp. Automatic data collection device, method and article
US8002173B2 (en) 2006-07-11 2011-08-23 Intermec Ip Corp. Automatic data collection device, method and article
US20080029602A1 (en) * 2006-08-03 2008-02-07 Nokia Corporation Method, Apparatus, and Computer Program Product for Providing a Camera Barcode Reader
US7946491B2 (en) * 2006-08-03 2011-05-24 Nokia Corporation Method, apparatus, and computer program product for providing a camera barcode reader
US20090292615A1 (en) * 2006-09-28 2009-11-26 Jari Natunen Integrated system, device and use thereof
US20100074920A1 (en) * 2006-10-26 2010-03-25 Glykos Finland Oy Peptide vaccine for influenza virus
US20110191867A1 (en) * 2006-10-26 2011-08-04 Glykos Finland Oy Peptide vaccine for influenza virus
US7857220B2 (en) * 2007-03-16 2010-12-28 Intermec Ip Corp. Systems, devices, and methods for reading machine-readable characters and human-readable characters
US20090272812A1 (en) * 2007-03-16 2009-11-05 Marty William A Systems, devices, and methods for reading machine-readable characters and human-readable characters
US20090127326A1 (en) * 2007-11-20 2009-05-21 Datalogic Scanning, Inc. Enhanced virtual scan line processing
US8172145B2 (en) 2007-11-20 2012-05-08 Datalogic ADC, Inc. Enhanced virtual scan line processing
US8459556B2 (en) 2009-01-09 2013-06-11 Datalogic ADC, Inc. Prioritized virtual scan line processing
US20100177363A1 (en) * 2009-01-09 2010-07-15 Datalogic Scanning, Inc. Prioritized virtual scan line processing
US20130313330A1 (en) * 2009-07-31 2013-11-28 Seiko Epson Corporation Marker processing method, marker processing device, marker, object having a marker, and marker processing program
US8971569B2 (en) * 2009-07-31 2015-03-03 Seiko Epson Corporation Marker processing method, marker processing device, marker, object having a marker, and marker processing program
US20140231521A1 (en) * 2009-07-31 2014-08-21 Seiko Epson Corporation Marker processing method, marker processing device, marker, object having a marker, and marker processing program
US8526668B2 (en) * 2009-07-31 2013-09-03 Seiko Epson Corporation Marker processing method, marker processing device, marker, object having a marker, and marker processing program
US20110026762A1 (en) * 2009-07-31 2011-02-03 Seiko Epson Corporation Marker processing method, marker processing device, marker, object having a marker, and marker processing program
USD723563S1 (en) * 2012-12-21 2015-03-03 Datalogic Ip Tech S.R.L. Reader of coded information
US10181175B2 (en) 2014-12-17 2019-01-15 Microsoft Technology Licensing, Llc Low power DMA snoop and skip
US9710878B2 (en) * 2014-12-17 2017-07-18 Microsoft Technology Licensing, LLC Low power DMA labeling
DE102018209032A1 (en) * 2018-06-07 2019-12-12 Continental Reifen Deutschland Gmbh Apparatus and method for automatically determining the position of the bar code of a tire or green tire and a corresponding computer program, computer program product and a computer-readable data carrier

Also Published As

Publication number Publication date
US6585159B1 (en) 2003-07-01

Similar Documents

Publication Publication Date Title
US6585159B1 (en) Indicia sensor system for optical reader
US6575367B1 (en) Image data binarization methods enabling optical reader to read fine print indicia
US6298175B1 (en) Object sensor system comprising controlled light source
US6637658B2 (en) Optical reader having partial frame operating mode
US5949052A (en) Object sensor system for stationary position optical reader
US6264105B1 (en) Bar code reader configured to read fine print barcode symbols
US9582696B2 (en) Imaging apparatus having imaging assembly
US8702000B2 (en) Reading apparatus having partial frame operating mode
US7434733B2 (en) Optical reader having partial frame operating mode
US6123261A (en) Optical scanner and image reader for reading images and decoding optical information including one and two dimensional symbologies at variable depth of field
EP2485177B1 (en) Auto-exposure method using continuous video frames under controlled illumination
EP1714232B1 (en) Adaptive optical image reader
US20060213997A1 (en) Method and apparatus for a cursor control device barcode reader
JP2000501209A (en) Sub-pixel data form reader
EP1916557B1 (en) Optical scanner and image reader for reading images and decoding optical information including one and two dimensional symbologies at variable depth of field
CA2577235A1 (en) Optical scanner and image reader for reading images and decoding optical information including one and two dimensional symbologies at variable depth of field

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION