US20030025926A1 - Image processing apparatus and image processing method - Google Patents

Image processing apparatus and image processing method

Info

Publication number
US20030025926A1
US20030025926A1 (Application No. US09/921,703)
Authority
US
United States
Prior art keywords
image
data
discrimination
discrimination data
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US09/921,703
Inventor
Takahiro Fuchigami
Sunao Tabata
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Corp
Toshiba TEC Corp
Original Assignee
Toshiba TEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba TEC Corp filed Critical Toshiba TEC Corp
Assigned to TOSHIBA TEC KABUSHIKI KAISHA. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: FUCHIGAMI, TAKAHIRO; TABATA, SUNAO
Priority to JP2001290212A (published as JP2003051939A)
Publication of US20030025926A1
Assigned to TOSHIBA TEC KABUSHIKI KAISHA and KABUSHIKI KAISHA TOSHIBA. ASSIGNMENT (ONE-HALF INTEREST). Assignors: TOSHIBA TEC KABUSHIKI KAISHA
Current legal status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K15/00 Arrangements for producing a permanent visual presentation of the output data, e.g. computer output printers
    • G06K15/02 Arrangements for producing a permanent visual presentation of the output data, e.g. computer output printers using printers
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B6/00 Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
    • G02B6/10 Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type
    • G02B6/12 Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type of the integrated circuit kind
    • G02B6/12007 Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type of the integrated circuit kind forming wavelength selective elements, e.g. multiplexer, demultiplexer
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B6/00 Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
    • G02B6/10 Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type
    • G02B6/12 Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type of the integrated circuit kind
    • G02B6/122 Basic optical elements, e.g. light-guiding paths
    • G02B6/125 Bends, branchings or intersections
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K15/00 Arrangements for producing a permanent visual presentation of the output data, e.g. computer output printers
    • G06K15/02 Arrangements for producing a permanent visual presentation of the output data, e.g. computer output printers using printers
    • G06K15/18 Conditioning data for presenting it to the physical printing elements
    • G06K15/1848 Generation of the printable image
    • G06K15/1849 Generation of the printable image using an intermediate representation, e.g. a list of graphical primitives
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K15/00 Arrangements for producing a permanent visual presentation of the output data, e.g. computer output printers
    • G06K15/02 Arrangements for producing a permanent visual presentation of the output data, e.g. computer output printers using printers
    • G06K15/18 Conditioning data for presenting it to the physical printing elements
    • G06K15/1848 Generation of the printable image
    • G06K15/1852 Generation of the printable image involving combining data of different types
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46 Colour picture communication systems
    • H04N1/56 Processing of colour picture signals
    • H04N1/58 Edge or detail enhancement; Noise or error suppression, e.g. colour misregistration correction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46 Colour picture communication systems
    • H04N1/56 Processing of colour picture signals
    • H04N1/60 Colour correction or control
    • H04N1/6072 Colour correction or control adapting to different types of images, e.g. characters, graphs, black and white image portions
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L2224/00 Indexing scheme for arrangements for connecting or disconnecting semiconductor or solid-state bodies and methods related thereto as covered by H01L24/00
    • H01L2224/01 Means for bonding being attached to, or being formed on, the surface to be connected, e.g. chip-to-package, die-attach, "first-level" interconnects; Manufacturing methods related thereto
    • H01L2224/42 Wire connectors; Manufacturing methods related thereto
    • H01L2224/47 Structure, shape, material or disposition of the wire connectors after the connecting process
    • H01L2224/48 Structure, shape, material or disposition of the wire connectors after the connecting process of an individual wire connector
    • H01L2224/4805 Shape
    • H01L2224/4809 Loop shape
    • H01L2224/48091 Arched

Definitions

  • the present invention relates to an image processing apparatus and an image processing method, and more particularly to an image processing apparatus and an image processing method for outputting, with high quality, page description information that has been formed by an information processing apparatus such as a personal computer.
  • when page description information such as DTP data formed by a personal computer is to be output from an image output apparatus such as a printer, the data to be output is sent to an image output apparatus such as a printer or an MFP via a printer controller that receives the page description information and develops it to image data comprising pixel arrays of four colors, Cyan, Magenta, Yellow and Black, which represent ink amounts.
  • the printer controller not only performs development to image data but also produces discrimination data representative of attributes of respective pixels of the image data.
  • Jpn. Pat. Appln. KOKAI Publication No. 9-282472 discloses a technique wherein characters or given discrimination signals representing other attributes, as well as image data, are produced and transmitted, and the image data is subjected to an image process corresponding to the discrimination signals in an image output apparatus.
  • thereby, where image data includes character information, an image process for preventing degradation in quality of characters, for example, is performed and the data is output from the image output apparatus.
  • Jpn. Pat. Appln. KOKAI Publication No. 2000-270213 discloses a technique wherein generated discrimination data is converted to data representing correspondency with image data, thereby reducing the memory capacity needed for storing the discrimination data.
  • in the technique of the above publication, the image development means (i.e. the printer controller) simultaneously produces image data and discrimination data on the basis of page description information, and the image data is output from an image forming apparatus capable of switching image processes according to the discrimination data.
  • an ordinary printer controller is unable to generate desired discrimination data, and thus the printer controller is limited to a specific type.
  • moreover, when an ordinary printer controller is used, image data matching with the characteristics of the output apparatus is not necessarily produced.
  • for example, in the case of a color image having a colored background on which black characters are written, image data is ordinarily produced such that the black character portion is written in black alone and there is no information on the color of the background.
  • when this image data is output as such from a printer and an error occurs in the print position between the black ink and the color inks, a colorless portion forms around the character and the image quality deteriorates.
  • the object of the present invention is to provide an image processing apparatus and an image processing method capable of performing a high-image-quality image process matching with output characteristics of a printer, even in a case where an ordinary printer controller is used.
  • the present invention provides an image processing apparatus comprising: image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data and the first discrimination data generated by the image development means; image data generating means for generating second image data by correcting the first image data generated by the image development means on the basis of the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
  • the invention provides an image processing apparatus comprising: image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data generated by the image development means; image data generating means for generating second image data by correcting the first image data generated by the image development means on the basis of the second discrimination data generated by the discrimination data generating means and the first discrimination data generated by the image development means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the first discrimination data generated by the image development means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
  • the invention provides an image processing apparatus comprising: image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data generated by the image development means or using the first image data and the first discrimination data; image processing means for subjecting the first image data generated by the image development means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means and the first discrimination data generated by the image development means; and image output means for outputting image data processed by the image processing means.
  • the invention provides an image processing apparatus comprising: input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the type of attributes set by the setting means and the first image data and the first discrimination data input by the input means; image data generating means for generating second image data by correcting the first image data input by the input means on the basis of the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
  • the invention provides an image processing apparatus comprising: input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data input by the input means; setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means; image data generating means for generating second image data by correcting the first image data input by the input means on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
  • the invention provides an image processing apparatus comprising: input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data input by the input means; setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means; image processing means for subjecting the first image data input by the input means to a predetermined process on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
  • the invention provides an image processing method for image-processing information described in a page description language, and outputting an image, comprising: generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of the information described in the page description language; generating second discrimination data different from the first discrimination data, using the generated first image data and first discrimination data; generating second image data by correcting the generated first image data on the basis of the generated second discrimination data; subjecting the generated second image data to a predetermined process on the basis of the generated second discrimination data; and outputting processed image data.
  • FIG. 1 is a block diagram showing the structure of an image processing apparatus according to a first embodiment of the present invention
  • FIG. 2 shows an example of the structure of image development means
  • FIG. 3 shows an example of the structure of discrimination data generating means
  • FIG. 4 shows an example of the structure of an edge detection section in the discrimination data generating means
  • FIG. 5 shows an example of the structure of a color detection section in the discrimination data generating means
  • FIG. 6 shows an example of the structure of a synthetic determination section in the discrimination data generating means
  • FIG. 7 shows an example of conversion by a converter
  • FIG. 8 shows an example of the structure of image data generating means
  • FIG. 9A shows an example of first image data
  • FIG. 9B shows an example of second image data in a case where an output value of the first image data has been replaced
  • FIG. 10 is a view for describing a smoothing process
  • FIG. 11 shows an example of the structure of image processing means
  • FIG. 12 shows an example of a correction table
  • FIG. 13 shows an example of the correction table
  • FIG. 14 shows an example of the correction table
  • FIG. 15 shows an example of the correction table
  • FIG. 16 shows an example of the correction table
  • FIG. 17 shows an example of the correction table
  • FIG. 18 shows an example of the correction table
  • FIG. 19 shows an example of the correction table
  • FIG. 20 is a block diagram showing the structure of an image processing apparatus according to a second embodiment
  • FIG. 21 is a block diagram showing the structure of an image processing apparatus according to a third embodiment
  • FIG. 22 is a block diagram showing the structure of an image processing apparatus according to a fourth embodiment
  • FIG. 23 is a block diagram showing the structure of an image processing apparatus according to a fifth embodiment.
  • FIG. 24 is a block diagram showing the structure of an image processing apparatus according to a sixth embodiment.
  • FIG. 1 is a block diagram showing the structure of an image processing apparatus 1 according to a first embodiment of the present invention.
  • This image processing apparatus 1 is usually called a printer.
  • The apparatus receives document data, etc. produced by a personal computer via a network, generates image data comprising toner amount information, and transfers toner onto paper, thus performing image formation.
  • the image processing apparatus 1 comprises image development means (controller unit) 11 , discrimination data generating means 12 , image data generating means 13 , image processing means 14 , and image output means (printer) 15 .
  • the image development means 11 receives DTP (Desk Top Publishing) data formed on a personal computer or document data of a word processor, etc. as page information described in a page description language (PDL).
  • the image development means 11 develops the received data to first image data as bit map data and to first discrimination data representative of attributes of each pixel.
  • the page information contains characters as font data, figures as line description data or painted-out region data, and others as ordinary raster image data.
  • when the page information is output as a print image, it is necessary to develop all data as the same bit map data.
  • the image processing apparatus may be constructed such that the image development means 11 is provided as an external element, i.e. as a printer controller.
  • the discrimination data generating means 12 generates second discrimination data for each pixel, which is necessary for controlling the image processing means 14 , on the basis of the first image data and the first discrimination data.
  • the second discrimination data differs from the first discrimination data and corresponds to an image area discrimination signal that is commonly used in a copying machine, etc.
  • in a copying machine, such discrimination data can be generated from the scanner input image.
  • the image data generating means 13 corrects the first image data on the basis of the second discrimination data generated by the discrimination data generating means 12 , and thus generates second image data.
  • the correction of the image data in this context is effected by an over-print process (which is performed in view of the fact that a white blank portion forms between a black line and a C, M or Y color component background when a print position error occurs at the time of printing out), a trapping process, a character smoothing process, etc.
  • the image processing means 14 performs a process for emphasizing an image (in particular, a character) at the time of printing out.
  • General methods of the process are filtering, gamma correction, etc.
  • a filter coefficient or a gamma correction table is switched in accordance with the second discrimination data.
  • the image output means 15 uses output image data (corresponding to the ink amount of each color in the case of a printer) generated by the image processing means 14 , and transfers ink on a printing medium (paper, etc.).
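  • as a concrete illustration of this data flow (a minimal sketch only, not the patent's implementation), the stages of FIG. 1 can be pictured as the following Python skeleton, assuming CMYK planes held as numpy arrays; the function names, array shapes and stub bodies are all assumptions, and the individual stages are elaborated in the sketches further below.

```python
import numpy as np

# Minimal sketch of the FIG. 1 data flow. Stub bodies are placeholders;
# the later sketches flesh out the individual stages.
def develop(page_description):
    # image development means 11: PDL -> first image data + first discrimination data
    first_image = {c: np.zeros((8, 8), dtype=np.uint8) for c in "CMYK"}
    first_disc = np.full((8, 8), "IMAGE", dtype=object)
    return first_image, first_disc

def generate_discrimination(first_image, first_disc):
    # discrimination data generating means 12: per-pixel second discrimination data
    return {c: np.zeros((8, 8), dtype=np.uint8) for c in "CMYK"}

def generate_image(first_image, second_disc):
    # image data generating means 13: over-print / trapping / smoothing correction
    return first_image

def image_process(second_image, second_disc):
    # image processing means 14: filtering, gamma correction, screen processing
    return second_image

def process_page(page_description):
    first_image, first_disc = develop(page_description)
    second_disc = generate_discrimination(first_image, first_disc)
    second_image = generate_image(first_image, second_disc)
    return image_process(second_image, second_disc)  # to image output means 15

printable = process_page("<PDL data>")
```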
  • FIG. 2 shows an example of the structure of the image development means 11 .
  • the image development means 11 comprises a CPU 21 , a RAM 22 and a page memory 23 .
  • the page information received by the image development means 11 is converted to first image data and first discrimination data by the CPU 21; these data are then developed in the page memory 23 and transmitted pixel by pixel.
  • FIG. 3 shows an example of the structure of the discrimination data generating means 12 .
  • the discrimination data generating means 12 comprises line buffers 31 a and 31 b , an edge detection section 32 , a color detection section 33 and a synthetic determination section 34 .
  • the first image data transmitted from the image development means 11 is input to the line buffer 31 a of the discrimination data generating means 12 .
  • the first image data is accumulated in the line buffer 31 a by several lines, thereby forming block data.
  • the first image data output from the line buffer 31 a is sent to the edge detection section 32 , and it is determined for each color component whether a center pixel (“pixel of interest”) of the block corresponds to an edge portion.
  • the first image data output from the line buffer 31 a is sent to the color detection section 33 , and it is determined based on the chroma whether the pixel of interest has an achromatic color or a chromatic color.
  • the first discrimination data transmitted from the image development means 11 is input to the line buffer 31 b of the discrimination data generating means 12 .
  • the line buffer 31 b is used for establishing synchronism with the first image data.
  • the synthetic determination section 34 outputs second discrimination data by performing synthetic determination on the basis of the edge detection result from the edge detection section 32 , the determination result from the color detection section 33 , and the first discrimination data synchronized by the line buffer 31 b.
  • FIG. 4 shows an example of the structure of the edge detection section 32 in the discrimination data generating means 12 .
  • the edge detection section 32 comprises multipliers 41 a and 41 b, adders 42 a and 42 b, positive number generators 43 a and 43 b, an adder 44 and a comparator 45.
  • the edge detection section 32 is provided for each of the color component image signals C, M, Y and K of the first image data input from the line buffer 31 a , and the edge detection is performed in parallel.
  • the multiplier 41 a multiplies a 3 × 3 matrix of the first image data with the coefficients (edge detection operators) indicated in FIG. 4 by symbol A.
  • the adder 42 a adds calculated values of the multiplier 41 a .
  • the positive number generator 43 a produces an absolute value of the value calculated by the adder 42 a.
  • the multiplier 41 b multiplies a 3 × 3 matrix of the first image data with the coefficients (edge detection operators) indicated in FIG. 4 by symbol B.
  • the adder 42 b adds calculated values of the multiplier 41 b .
  • the positive number generator 43 b produces an absolute value of the value calculated by the adder 42 b.
  • the adder 44 adds the two absolute values obtained by the positive number generators 43 a and 43 b.
  • the comparator 45 compares the added value with a predetermined value, thereby determining the presence/absence of the edge.
  • the comparison result of the comparator 45 is output to the synthetic determination section 34 as an edge determination result EC, EM, EY, EK, in association with a color component image signal C, M, Y, K in the first image data input from the line buffer 31 a.
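  • in code, the per-plane edge test might look as follows; since the actual coefficients of operators A and B appear only in FIG. 4, which is not reproduced here, Sobel operators and the threshold value stand in as assumptions.

```python
import numpy as np

# Edge detection section 32, sketched for one color plane. OP_A/OP_B stand
# in for the FIG. 4 operators (assumed Sobel-like); threshold is illustrative.
OP_A = np.array([[-1, 0, 1],
                 [-2, 0, 2],
                 [-1, 0, 1]])           # operator "A" (horizontal gradient)
OP_B = OP_A.T                           # operator "B" (vertical gradient)

def is_edge(block, threshold=128):
    """block: 3x3 neighbourhood of the pixel of interest in one plane."""
    a = abs(int(np.sum(block * OP_A)))  # multiplier 41a, adder 42a, generator 43a
    b = abs(int(np.sum(block * OP_B)))  # multiplier 41b, adder 42b, generator 43b
    return (a + b) > threshold          # adder 44 and comparator 45

# One detector runs in parallel per color component:
# EC, EM, EY, EK = (is_edge(block[c]) for c in "CMYK")
print(is_edge(np.array([[0, 0, 255], [0, 0, 255], [0, 0, 255]])))  # True
```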
  • FIG. 5 shows an example of the structure of the color detection section 33 in the discrimination data generating means 12 .
  • the color detection section 33 comprises subtracters 51 a , 51 b and 51 c , positive number generators 52 a , 52 b and 52 c , a maximum value selector 53 , a comparator 54 , digitizers 55 a , 55 b , 55 c and 55 d , selectors 56 a , 56 b , 56 c and 56 d , AND gates 57 a , 57 b and 57 c , and a NOT gate 58 .
  • the subtracter 51 a calculates a difference in density between color components (C, Y) of image signals of the first image data input from the line buffer 31 a , and outputs the difference to the positive number generator 52 a .
  • the positive number generator 52 a produces an absolute value of the input density difference between the color components (C, Y), and outputs the absolute value to the maximum value selector 53 .
  • the subtracter 51 b calculates a difference in density between color components (C, M) of image signals of the first image data input from the line buffer 31 a , and outputs the difference to the positive number generator 52 b .
  • the positive number generator 52 b produces an absolute value of the input density difference between the color components (C, M), and outputs the absolute value to the maximum value selector 53 .
  • the subtracter 51 c calculates a difference in density between color components (M, Y) of image signals of the first image data input from the line buffer 31 a , and outputs the difference to the positive number generator 52 c .
  • the positive number generator 52 c produces an absolute value of the input density difference between the color components (M, Y), and outputs the absolute value to the maximum value selector 53 .
  • the maximum value selector 53 selects the maximum of the values input from the positive number generators 52 a, 52 b and 52 c, and outputs the maximum value to the comparator 54.
  • the comparator 54 compares the input maximum value and a predetermined value, and determines whether the color is achromatic or chromatic.
  • the digitizer 55 a digitizes the density of the color component image signal C of the first image data input from the line buffer 31 a .
  • the digitizer 55 b digitizes the density of the color component image signal M of the first image data input from the line buffer 31 a .
  • the digitizer 55 c digitizes the density of the color component image signal Y of the first image data input from the line buffer 31 a .
  • the digitizer 55 d digitizes the density of the color component image signal K of the first image data input from the line buffer 31 a.
  • the digitized result represents which color component is effective in the synthetic determination section.
  • when the digitized result of the image signal K, i.e. the output of the digitizer 55 d, is “1”, a black over-print process for incorporating a background density into the image data of the color components C, M and Y becomes applicable.
  • in addition, an AND value between the digitized result of each image signal C, M, Y and an inverted value of the digitized result of the image signal K is obtained.
  • the digitized result of the digitizer 55 a and an inverted value of the digitized result of the digitizer 55 d are input to the AND gate 57 a to produce an AND value.
  • the digitized result of the digitizer 55 b and an inverted value of the digitized result of the digitizer 55 d are input to the AND gate 57 b to produce an AND value.
  • the digitized result of the digitizer 55 c and an inverted value of the digitized result of the digitizer 55 d are input to the AND gate 57 c to produce an AND value.
  • the selector 56 a receives the comparison result of the comparator 54 and the AND value of the AND gate 57 a , selects one of them, and outputs a select result SC.
  • the selector 56 b receives the comparison result of the comparator 54 and the AND value of the AND gate 57 b , selects one of them, and outputs a select result SM.
  • the selector 56 c receives the comparison result of the comparator 54 and the AND value of the AND gate 57 c , selects one of them, and outputs a select result SY.
  • the selector 56 d receives the comparison result of the comparator 54 , which has been inverted by the NOT gate 58 , and the digitized result of the digitizer 55 d , selects one of them, and outputs a select result SK.
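  • a scalar sketch of this color detection logic follows; the two threshold values and the rule driving the selectors 56 a to 56 d are not specified in the text, so they are assumptions.

```python
# Color detection section 33, sketched for one pixel (densities 0-255).
def detect_color(c, m, y, k, chroma_th=32, density_th=64, use_chroma=True):
    # subtracters 51a-51c and positive number generators 52a-52c feed the
    # maximum value selector 53; comparator 54 decides chromatic/achromatic
    chroma = max(abs(c - y), abs(c - m), abs(m - y))
    chromatic = chroma > chroma_th

    # digitizers 55a-55d binarize each component's density
    bc, bm, by, bk = (int(v > density_th) for v in (c, m, y, k))

    # AND gates 57a-57c: color component present while K is absent
    ac, am, ay = bc & (1 - bk), bm & (1 - bk), by & (1 - bk)

    # selectors 56a-56d pick between the chroma decision and the gated
    # densities (the selection rule here is an assumption)
    if use_chroma:
        sc = sm = sy = int(chromatic)
        sk = int(not chromatic)          # NOT gate 58
    else:
        sc, sm, sy, sk = ac, am, ay, bk
    return sc, sm, sy, sk

print(detect_color(200, 40, 30, 0))      # strongly cyan pixel -> chromatic
```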
  • FIG. 6 shows an example of the structure of the synthetic determination section 34 in the discrimination data generating means 12 .
  • the synthetic determination section 34 comprises converters 61 a , 61 b , 61 c and 61 d , and AND gates 62 a , 62 b , 62 c and 62 d.
  • Signals EC, EM, EY and EK input from the edge detection sections 32 associated with the image signals C, M, Y and K represent the edge detection results of C, M, Y and K.
  • Signals SC, SM, SY and SK input from the color detector 33 represent the color detection results of C, M, Y and K.
  • the converter 61 a receives the edge detection result EC from the edge detector 32 and the first discrimination data from the line buffer 31 b , and outputs desired converted discrimination data based on them.
  • the converter 61 b receives the edge detection result EM from the edge detector 32 and the first discrimination data from the line buffer 31 b , and outputs desired converted discrimination data based on them.
  • the converter 61 c receives the edge detection result EY from the edge detector 32 and the first discrimination data from the line buffer 31 b , and outputs desired converted discrimination data based on them.
  • the converter 61 d receives the edge detection result EK from the edge detector 32 and the first discrimination data from the line buffer 31 b , and outputs desired converted discrimination data based on them.
  • FIG. 7 shows an example of conversion by the converters 61 a , 61 b , 61 c and 61 d .
  • the first discrimination data is classified such that a character described as font data with a predetermined size or less is “TEXT”, an object described as line description data or painted-out data and a character other than “TEXT” are “GRAPHIC”, and an object other than “TEXT” and “GRAPHIC” is “IMAGE”.
  • depending on the combination of the first discrimination data and the edge detection result, the second discrimination data is output, for example, as “NEW-TEXT” (conversion result).
  • when the first discrimination data is “IMAGE” and the edge detection result is “NON-EDGE”, the second discrimination data is output as “NEW-GRAPHIC” (conversion result).
  • the desired discrimination data (second discrimination data) output from the converter 61 a is input to the AND gate 62 a .
  • the AND gate 62 a produces second discrimination data DC as an AND value between the desired discrimination data input from the converter 61 a and the color detection result SC input from the color detection section 33 .
  • the desired discrimination data (second discrimination data) output from the converter 61 b is input to the AND gate 62 b .
  • the AND gate 62 b produces second discrimination data DM as an AND value between the desired discrimination data input from the converter 61 b and the color detection result SM input from the color detection section 33 .
  • the desired discrimination data (second discrimination data) output from the converter 61 c is input to the AND gate 62 c .
  • the AND gate 62 c produces second discrimination data DY as an AND value between the desired discrimination data input from the converter 61 c and the color detection result SY input from the color detection section 33 .
  • the desired discrimination data (second discrimination data) output from the converter 61 d is input to the AND gate 62 d .
  • the AND gate 62 d produces second discrimination data DK as an AND value between the desired discrimination data input from the converter 61 d and the color detection result SK input from the color detection section 33 .
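  • the synthetic determination can be sketched as below; only the two FIG. 7 rules quoted above are reproduced, so the behavior for other combinations and the one-bit encoding of the conversion result are assumptions.

```python
# Synthetic determination section 34, sketched for one pixel.
def convert(first_disc, edge):
    """Converters 61a-61d: FIG. 7 conversion (only the quoted rules)."""
    if first_disc == "IMAGE" and not edge:
        return "NEW-GRAPHIC"
    if edge:                       # e.g. a character contour (assumption)
        return "NEW-TEXT"
    return "NEW-GRAPHIC"           # default case (assumption)

def synthesize(first_disc, edges, colors):
    """edges: {comp: E*}, colors: {comp: S*} as 0/1; returns {comp: D*}."""
    out = {}
    for comp in "CMYK":
        conv = 1 if convert(first_disc, edges[comp]) == "NEW-TEXT" else 0
        out[comp] = conv & colors[comp]        # AND gates 62a-62d
    return out

print(synthesize("TEXT",
                 {"C": 0, "M": 0, "Y": 0, "K": 1},
                 {"C": 0, "M": 0, "Y": 0, "K": 1}))   # -> DK = 1
```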
  • FIG. 8 shows an example of the structure of the image data generating means 13 .
  • the image data generating means 13 comprises line buffers 71 a and 71 b , a background density averaging section 72 , a character density averaging section 73 , and a selector 74 .
  • the line buffer 71 a accumulates n lines of the first image data output from the image development means 11.
  • the line buffer 71 b accumulates n lines of the second discrimination data output from the discrimination data generating means 12.
  • the background density averaging section 72 calculates the average density of each of the color components C, M and Y over the pixels within an n × n area around the pixel of interest for which the second discrimination data DK on the color component K is not “NEW-TEXT”.
  • the character density averaging section 73 calculates the average density of each of the color components C, M, Y and K within an area of m × m pixels (m < n) around the pixel of interest.
  • the selector 74 outputs second image data by properly replacing the pixel values in accordance with the second discrimination data on the pixel of interest.
  • the data C, M, Y of the pixel of interest shown in FIG. 9A is changed to the output value of the background density averaging section 72 as shown in FIG. 9B (over-print process or trapping process).
  • the first image data C, M, Y, K shown in FIG. 9A is replaced with the output value of the second image data C, M, Y, K shown in FIG. 9B.
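  • the replacement of FIGS. 9A and 9B can be sketched as follows: for a pixel whose second discrimination data DK is “NEW-TEXT”, the C, M and Y values are replaced with the background average from the section 72, so that the colored background continues underneath the black character (over-print/trapping). The window size n and the encoding of “NEW-TEXT” as 1 are assumptions.

```python
import numpy as np

# Image data generating means 13, sketched for one pixel of interest.
def correct_pixel(img, dk, x, y, n=7):
    """img: {'C','M','Y','K': 2-D numpy planes}; dk: plane of DK flags
    (1 = "NEW-TEXT"). Returns the second image data for pixel (x, y)."""
    out = {comp: float(img[comp][y, x]) for comp in "CMYK"}
    if dk[y, x]:                                    # selector 74 decision
        h = n // 2
        win = np.s_[max(0, y - h):y + h + 1, max(0, x - h):x + h + 1]
        mask = dk[win] == 0                         # background pixels only
        if mask.any():
            for comp in "CMY":                      # background averaging 72
                out[comp] = float(img[comp][win][mask].mean())
    return out

# Example: a black character pixel at (8, 8) over a uniform color background.
img = {c: np.full((16, 16), 40 if c != "K" else 0, dtype=np.uint8) for c in "CMYK"}
dk = np.zeros((16, 16), dtype=np.uint8)
dk[8, 8] = 1
print(correct_pixel(img, dk, 8, 8))   # C, M, Y take the background average
```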
  • FIG. 11 shows an example of the structure of the image processing means 14 .
  • the image processing means 14 comprises line buffers 101 a and 101 b , a filter section 102 , a gamma correction section 103 , and a screen processing section 104 .
  • the line buffer 101 a accumulates several lines of the second image data generated by the image data generating means 13 for the purpose of filter processing.
  • the line buffer 101 b outputs the second discrimination data on the pixel of interest (center pixel of an image matrix) in synchronism with the second image data.
  • the filter section 102 multiplies each pixel of the image matrix buffered by the line buffer 101 a with a predetermined coefficient, thus calculating the sum.
  • the filter section 102 changes the coefficient for multiplication in accordance with the second discrimination data output synchronously from the line buffer 101 b.
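  • a minimal sketch of this coefficient switching follows; both kernels are illustrative assumptions, as the patent does not reproduce the actual filter coefficients.

```python
import numpy as np

# Filter section 102: the convolution kernel is switched per pixel by the
# second discrimination data (edge emphasis for "NEW-TEXT" pixels, mild
# smoothing otherwise; the kernels themselves are assumptions).
SHARPEN = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)
SMOOTH = np.full((3, 3), 1.0 / 9.0)

def filter_pixel(block, is_new_text):
    """block: 3x3 neighbourhood from line buffer 101a (one color plane)."""
    kernel = SHARPEN if is_new_text else SMOOTH
    return float(np.sum(block * kernel))

print(filter_pixel(np.full((3, 3), 100.0), True))   # 100.0 (flat area)
```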
  • the gamma correction section 103 corrects each pixel of the second image data for each color component, using correction tables as shown in FIGS. 12 to 19 .
  • the gamma correction section 103 switches the correction table in accordance with the second discrimination data output synchronously from the line buffer 101 b.
  • a correction table shown in FIG. 12 relates to correction of color component C in a case where the second discrimination data is “NEW-TEXT”.
  • a correction table shown in FIG. 13 relates to correction of color component C in a case where the second discrimination data is not “NEW-TEXT”.
  • a correction table shown in FIG. 14 relates to correction of color component M in a case where the second discrimination data is “NEW-TEXT”.
  • a correction table shown in FIG. 15 relates to correction of color component M in a case where the second discrimination data is not “NEW-TEXT”.
  • a correction table shown in FIG. 16 relates to correction of color component Y in a case where the second discrimination data is “NEW-TEXT”.
  • a correction table shown in FIG. 17 relates to correction of color component Y in a case where the second discrimination data is not “NEW-TEXT”.
  • a correction table shown in FIG. 18 relates to correction of color component K in a case where the second discrimination data is “NEW-TEXT”.
  • a correction table shown in FIG. 19 relates to correction of color component K in a case where the second discrimination data is not “NEW-TEXT”.
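  • the table switching of the gamma correction section 103 can be sketched as below; the actual curves of FIGS. 12 to 19 are not reproduced here, so the two illustrative curves (a contrast-raising one for “NEW-TEXT” pixels and a near-linear one otherwise) are assumptions.

```python
import numpy as np

# Gamma correction section 103: one 256-entry table per color component and
# per "NEW-TEXT"/other state, as in FIGS. 12-19. Curves are illustrative.
_x = np.arange(256) / 255.0
TABLES = {(comp, is_text):
          np.clip(255.0 * _x ** (0.6 if is_text else 1.0), 0, 255).astype(np.uint8)
          for comp in "CMYK" for is_text in (True, False)}

def gamma_correct(value, comp, is_new_text):
    """value: 0-255 density of one color component of one pixel."""
    return int(TABLES[(comp, is_new_text)][value])

print(gamma_correct(100, "C", True), gamma_correct(100, "C", False))
```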
  • the screen processing section 104 processes each pixel of the corrected second image data input from the gamma correction section 103 in accordance with the second discrimination data input synchronously from the line buffer 101 b , thereby outputting image data of each color component matching with the image output means 15 in the rear stage.
  • the processing is, for example, an error spreading (error diffusion) process for converting image data of 8 bits per pixel (256 tone levels) to image data of 1 bit per pixel (2 tone levels).
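  • as an example of such a screen process, a plain Floyd-Steinberg error diffusion from 8-bit to 1-bit data is sketched below; the text specifies only “an error spreading process”, so the particular diffusion weights are an assumption.

```python
import numpy as np

# Screen processing section 104, sketched as Floyd-Steinberg error
# diffusion: 8 bits per pixel (256 levels) -> 1 bit per pixel (2 levels).
def error_diffuse(plane):
    p = plane.astype(float)
    h, w = p.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for yy in range(h):
        for xx in range(w):
            new = 255.0 if p[yy, xx] >= 128 else 0.0
            out[yy, xx] = 1 if new else 0
            err = p[yy, xx] - new            # spread the quantization error
            if xx + 1 < w:
                p[yy, xx + 1] += err * 7 / 16
            if yy + 1 < h:
                if xx > 0:
                    p[yy + 1, xx - 1] += err * 3 / 16
                p[yy + 1, xx] += err * 5 / 16
                if xx + 1 < w:
                    p[yy + 1, xx + 1] += err * 1 / 16
    return out

print(error_diffuse(np.full((4, 4), 128, dtype=np.uint8)))
```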
  • the image output means 15 transfers the output image data from the screen processing section 104 onto printing medium (paper or the like).
  • the first discrimination data is generated by the image development means and the second discrimination data is generated by the discrimination data generating means, for example, in the following manner.
  • the image development means generates first discrimination data that discriminates whether each pixel is associated with a character or a line figure, and the discrimination data generating means generates second discrimination data that does not discriminate whether each pixel is associated with a character or a line figure, using the first discrimination data generated by the image development means.
  • the character is an object disposed in the first image data as font data.
  • the line figure is an object described by a straight line and a curve.
  • the image development means generates first discrimination data that does not discriminate whether each pixel is associated with a line figure or a plane figure, and the discrimination data generating means generates second discrimination data that discriminates whether each pixel is associated with a line figure or a plane figure, using the first discrimination data generated by the image development means.
  • the plane figure is an object, the entirety or each component of which is painted out with uniform density.
  • the image development means generates first discrimination data that does not discriminate whether each pixel is associated with a contour portion or an inside portion of a plane figure, and the discrimination data generating means generates second discrimination data that discriminates whether each pixel is associated with a contour portion or an inside portion of a plane figure, using the first discrimination data generated by the image development means.
  • the image development means generates first discrimination data that discriminates whether each pixel is associated with a plane figure or a tone image, and the discrimination data generating means generates second discrimination data that does not discriminate whether each pixel is associated with a plane figure or a tone image, using the first discrimination data generated by the image development means.
  • the image development means generates first discrimination data that discriminates that each pixel is associated with a tone image, and the discrimination data generating means generates second discrimination data that discriminates the magnitude of density variation in each pixel, using the first discrimination data generated by the image development means.
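  • the first of these mappings, for example, amounts to a many-to-one relabeling of pixel classes, as in the sketch below; all class names are illustrative assumptions.

```python
# Sketch of the first mapping above: the first discrimination data tells
# characters and line figures apart, while the second merges them into one
# class. All class names are illustrative.
SECOND_FROM_FIRST = {
    "CHARACTER":    "TEXT-LIKE",    # object disposed as font data
    "LINE-FIGURE":  "TEXT-LIKE",    # object described by lines and curves
    "PLANE-FIGURE": "PLANE-FIGURE",
    "TONE-IMAGE":   "TONE-IMAGE",
}

def merge_classes(first_disc):
    return SECOND_FROM_FIRST[first_disc]

print(merge_classes("CHARACTER") == merge_classes("LINE-FIGURE"))  # True
```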
  • the first embodiment comprises the discrimination data generating means for generating the second discrimination data on the basis of the first image data and the first discrimination data generated from the page information described in the page description language, and the image data generating means for correcting the first image data on the basis of the second discrimination data and generating the second image data, thereby performing an image quality enhancing process matching with the output characteristics of the printer.
  • FIG. 20 shows the structure of an image processing apparatus 2 according to a second embodiment.
  • a discrimination data generating means 122 generates second discrimination data without using the first discrimination data generated by image development means 121. Thereby, the independence of the first discrimination data and the second discrimination data is enhanced, and a greater degree of freedom is provided in the circuit configuration.
  • FIG. 21 shows the structure of an image processing apparatus 3 according to a third embodiment.
  • the image data generating means 123 of the image processing apparatus 2 shown in FIG. 20 is omitted. Since the image processing apparatus 3 of the third embodiment does not generate the second image data, the line memory, etc. are not needed and the image processing apparatus can be formed at low cost.
  • FIG. 22 shows the structure of an image processing apparatus 4 according to a fourth embodiment.
  • the controller unit (image development means 11 ) of the image processing apparatus 1 shown in FIG. 1 is omitted and it is provided as an external element.
  • instead, data input means 141, which serves as an interface with the external controller, and discrimination type setting means 146 are provided.
  • the data input means 141 of the image processing apparatus 4 is, for example, an interface unit of a LAN (Local Area Network).
  • the discrimination type setting means 146 is a means for setting the type of the first discrimination data input by the external controller. Specification information of the external controller is input to the discrimination type setting means 146 , and the discrimination type setting means 146 is preset by the operation by a user, a manager, a designer, etc.
  • the discrimination types of the first discrimination data described in connection with the first embodiment are “TEXT”, “GRAPHIC” and “IMAGE”, and the correspondence of these three discrimination types, as shown in FIG. 7, is registered (set) in the discrimination type setting means 146, as sketched in code below.
  • an external controller that generates any kind of discrimination data can be connected to the image processing apparatus 4 .
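  • in code, the setting means can be pictured as a preset translation table between the external controller's discrimination types and the types expected by the FIG. 7 conversion; the class shape and the fallback for an unregistered type are assumptions.

```python
# Discrimination type setting means 146, sketched as a preset translation
# table (registered in advance by a user, manager or designer).
class DiscriminationTypeSetting:
    def __init__(self):
        self.table = {}

    def register(self, controller_type, internal_type):
        self.table[controller_type] = internal_type

    def translate(self, first_disc):
        # fallback for an unregistered type is an assumption
        return self.table.get(first_disc, "IMAGE")

setting = DiscriminationTypeSetting()
for t in ("TEXT", "GRAPHIC", "IMAGE"):        # FIG. 7 correspondence
    setting.register(t, t)
print(setting.translate("TEXT"))
```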
  • FIG. 23 shows the structure of an image processing apparatus 5 according to a fifth embodiment.
  • the controller unit (image development means 121) of the image processing apparatus 2 shown in FIG. 20 is omitted and it is provided as an external element.
  • instead, data input means 151, which serves as an interface with the external controller, and discrimination type setting means 156 are provided.
  • the data input means 151 of the image processing apparatus 5 is, for example, an interface unit of a LAN (Local Area Network).
  • the discrimination type setting means 156 is a means for setting the type of the first discrimination data input by the external controller. Specification information of the external controller is input to the discrimination type setting means 156 , and the discrimination type setting means 156 is preset by the operation by a user, a manager, a designer, etc.
  • the discrimination types of the first discrimination data described in connection with the first embodiment are “TEXT”, “GRAPHIC” and “IMAGE”, and the correspondence of these three discrimination types, as shown in FIG. 7, is registered (set) in the discrimination type setting means 156.
  • an external controller that generates any kind of discrimination data can be connected to the image processing apparatus 5 .
  • FIG. 24 shows the structure of an image processing apparatus 6 according to a sixth embodiment.
  • the controller unit (image development means 131 ) of the image processing apparatus 3 shown in FIG. 21 is omitted and it is provided as an external element.
  • instead, data input means 161, which serves as an interface with the external controller, and discrimination type setting means 165 are provided.
  • the data input means 161 of the image processing apparatus 6 is, for example, an interface unit of a LAN (Local Area Network).
  • the discrimination type setting means 165 is a means for setting the type of the first discrimination data input by the external controller. Specification information of the external controller is input to the discrimination type setting means 165 , and the discrimination type setting means 165 is preset by the operation by a user, a manager, a designer, etc.
  • the discrimination types of the first discrimination data described in connection with the first embodiment are “TEXT”, “GRAPHIC” and “IMAGE”, and the correspondence of these three discrimination types, as shown in FIG. 7, is registered (set) in the discrimination type setting means 165.
  • an external controller that generates any kind of discrimination data can be connected to the image processing apparatus 6 .

Abstract

In an image processing apparatus, first image data and first discrimination data are generated on the basis of page information described in a page description language. Second discrimination data is generated on the basis of the generated first image data and first discrimination data. Second image data is generated on the basis of the generated second discrimination data and the first image data. Image processing is performed based on the generated second image data and second discrimination data, and the processed data is transferred onto a printing medium.

Description

    BACKGROUND OF THE INVENTION
  • The present invention relates to an image processing apparatus and an image processing method, and more particularly to an image processing apparatus and an image processing method for outputting, with high quality, page description information that has been formed by an information processing apparatus such as a personal computer. [0001]
  • In general, when page description information such as DTP data, which is formed by a personal computer, is to be output from an image output apparatus such as a printer, the data to be output is sent to an image output apparatus such as a printer or an MFP via a printer controller that receives the page description information and develops it to image data comprising pixel arrays of four colors, Cyan, Magenta, Yellow and Black, which represent ink amounts. The printer controller not only performs development to image data but also produces discrimination data representative of attributes of respective pixels of the image data. [0002]
  • For example, Jpn. Pat. Appln. KOKAI Publication No. 9-282472 discloses a technique wherein characters or given discrimination signals representing other attributes, as well as image data, are produced and transmitted, and the image data is subjected to an image process corresponding to the discrimination signals in an image output apparatus. Thereby, where image data includes character information, an image process, for example, for preventing degradation in quality of characters is performed and the data is output from the image output apparatus. [0003]
  • On the other hand, Jpn. Pat. Appln. KOKAI Publication No. 2000-270213 discloses a technique wherein generated discrimination data is converted to data representing correspondency with image data, thereby reducing the memory capacity needed for storing the discrimination data. [0004]
  • In the technique disclosed in the above-mentioned Jpn. Pat. Appln. KOKAI Publication No. 9-282472, however, image development means (i.e. printer controller) simultaneously produces image data and discrimination data on the basis of page description information, and the image data is output from an image forming apparatus capable of switching image processes according to the discrimination data. In this case, an ordinary printer controller is unable to generate desired discrimination data, and thus the printer controller is limited to a specific type. [0005]
  • Moreover, when an ordinary printer controller is used, image data matching with the characteristics of the output apparatus is not necessarily produced. For example, in the case of a color image having a colored background on which black characters are written, image data is ordinarily produced such that the black character portion is written in black alone and there is no information on the color of the background. In the case where this image data is output as such from a printer, if an error occurs in the print position between the black ink and the color inks, a colorless portion forms around the character and the image quality deteriorates. [0006]
  • BRIEF SUMMARY OF THE INVENTION
  • The object of the present invention is to provide an image processing apparatus and an image processing method capable of performing a high-image-quality image process matching with output characteristics of a printer, even in a case where an ordinary printer controller is used. [0007]
  • In order to achieve the object, the present invention provides an image processing apparatus comprising: image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data and the first discrimination data generated by the image development means; image data generating means for generating second image data by correcting the first image data generated by the image development means on the basis of the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means. [0008]
  • The invention provides an image processing apparatus comprising: image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data generated by the image development means; image data generating means for generating second image data by correcting the first image data generated by the image development means on the basis of the second discrimination data generated by the discrimination data generating means and the first discrimination data generated by the image development means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the first discrimination data generated by the image development means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means. [0009]
  • The invention provides an image processing apparatus comprising: image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data generated by the image development means or using the first image data and the first discrimination data; image processing means for subjecting the first image data generated by the image development means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means and the first discrimination data generated by the image development means; and image output means for outputting image data processed by the image processing means. [0010]
  • The invention provides an image processing apparatus comprising: input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the type of attributes set by the setting means and the first image data and the first discrimination data input by the input means; image data generating means for generating second image data by correcting the first image data input by the input means on the basis of the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means. [0011]
  • The invention provides an image processing apparatus comprising: input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data input by the input means; setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means; image data generating means for generating second image data by correcting the first image data input by the input means on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means. [0012]
  • The invention provides an image processing apparatus comprising: input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data input by the input means; setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means; image processing means for subjecting the first image data input by the input means to a predetermined process on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means. [0013]
  • The invention provides an image processing method for image-processing information described in a page description language, and outputting an image, comprising: generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of the information described in the page description language; generating second discrimination data different from the first discrimination data, using the generated first image data and first discrimination data; generating second image data by correcting the generated first image data on the basis of the generated second discrimination data; subjecting the generated second image data to a predetermined process on the basis of the generated second discrimination data; and outputting processed image data.[0014]
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING
  • FIG. 1 is a block diagram showing the structure of an image processing apparatus according to a first embodiment of the present invention; [0015]
  • FIG. 2 shows an example of the structure of image development means; [0016]
  • FIG. 3 shows an example of the structure of discrimination data generating means; [0017]
  • FIG. 4 shows an example of the structure of an edge detection section in the discrimination data generating means; [0018]
  • FIG. 5 shows an example of the structure of a color detection section in the discrimination data generating means; [0019]
  • FIG. 6 shows an example of the structure of a synthetic determination section in the discrimination data generating means; [0020]
  • FIG. 7 shows an example of conversion by a converter; [0021]
  • FIG. 8 shows an example of the structure of image data generating means; [0022]
  • FIG. 9A shows an example of first image data; [0023]
  • FIG. 9B shows an example of second image data in a case where an output value of the first image data has been replaced; [0024]
  • FIG. 10 is a view for describing a smoothing process; [0025]
  • FIG. 11 shows an example of the structure of image processing means; [0026]
  • FIG. 12 shows an example of a correction table; [0027]
  • FIG. 13 shows an example of the correction table; [0028]
  • FIG. 14 shows an example of the correction table; [0029]
  • FIG. 15 shows an example of the correction table; [0030]
  • FIG. 16 shows an example of the correction table; [0031]
  • FIG. 17 shows an example of the correction table; [0032]
  • FIG. 18 shows an example of the correction table; [0033]
  • FIG. 19 shows an example of the correction table; [0034]
  • FIG. 20 is a block diagram showing the structure of an image processing apparatus according to a second embodiment; [0035]
  • FIG. 21 is a block diagram showing the structure of an image processing apparatus according to a third embodiment; [0036]
  • FIG. 22 is a block diagram showing the structure of an image processing apparatus according to a fourth embodiment; [0037]
  • FIG. 23 is a block diagram showing the structure of an image processing apparatus according to a fifth embodiment; and [0038]
  • FIG. 24 is a block diagram showing the structure of an image processing apparatus according to a sixth embodiment. [0039]
DETAILED DESCRIPTION OF THE INVENTION
  • Embodiments of the present invention will now be described with reference to the accompanying drawings. [0040]
  • FIG. 1 is a block diagram showing the structure of an image processing apparatus 1 according to a first embodiment of the present invention. This image processing apparatus 1 is usually called a printer. The apparatus receives document data, etc. produced by a personal computer via a network, etc., generates image data comprising toner amount information, and transfers toner onto paper, thus performing image formation. [0041]
  • The image processing apparatus 1 comprises image development means (controller unit) 11, discrimination data generating means 12, image data generating means 13, image processing means 14, and image output means (printer) 15. [0042]
  • The image development means 11 receives DTP (Desk Top Publishing) data formed on a personal computer or document data of a word processor, etc. as page information described in a page description language (PDL). The image development means 11 develops the received data to first image data as bit map data and to first discrimination data representative of attributes of each pixel. [0043]
  • The page information contains characters as font data, figures as line description data or painted-out region data, and others as ordinary raster image data. When the page information is output as a print image, it is necessary to develop all data as the same bit map data. [0044]
  • In addition, it is necessary to develop the attribute data to pixel-by-pixel discrimination data so that the image processing means 14 may perform an appropriate image quality enhancing process in accordance with the attributes of the image data. [0045]
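  • As a rough illustration of this development step, the following Python sketch rasterizes a list of already-decomposed page objects into bit map data and a per-pixel attribute plane. The object representation, the numeric attribute codes and the function name are illustrative assumptions, not the patent's actual data structures.

```python
import numpy as np

# Hypothetical attribute codes for the first discrimination data; the
# embodiment's classes (FIG. 7) are "TEXT", "GRAPHIC" and "IMAGE".
TEXT, GRAPHIC, IMAGE = 0, 1, 2

def develop_page(objects, height, width):
    """Develop page objects into first image data (a CMYK bit map) and
    first discrimination data (one attribute code per pixel). Each object
    is assumed to be a tuple (attribute, mask, cmyk), where mask is a
    boolean height x width array marking the pixels the object covers."""
    first_image = np.zeros((height, width, 4), dtype=np.uint8)  # C, M, Y, K
    first_disc = np.full((height, width), IMAGE, dtype=np.uint8)
    for attribute, mask, cmyk in objects:
        first_image[mask] = cmyk      # paint the object's pixels
        first_disc[mask] = attribute  # record the per-pixel attribute
    return first_image, first_disc
```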
  • Alternatively, the image processing apparatus may be constructed such that the image development means 11 is provided as an external element in the form of a printer controller. [0046]
  • The discrimination data generating means 12 generates second discrimination data for each pixel, which is necessary for controlling the image processing means 14, on the basis of the first image data and the first discrimination data. The second discrimination data differs from the first discrimination data and corresponds to an image area discrimination signal that is commonly used in a copying machine, etc. [0047]
  • Accordingly, even where a scanner is connected to the image processing apparatus 1 for the purpose of use as a copying machine, the second discrimination data can be generated from the scanner input image. [0048]
  • It is necessary, however, to switch the method of generating the second discrimination data depending on which image is treated as the first image data: a scanner input image being processed, or an image developed from the information in the page description language. [0049]
  • The image data generating means 13 corrects the first image data on the basis of the second discrimination data generated by the discrimination data generating means 12, and thus generates second image data. The correction of the image data in this context is effected by, for example, an over-print process or a trapping process, which compensate for the white blank portion that forms between a black line and a C, M or Y color component background due to a print position error at the time of printing out, or by a character smoothing process, etc. [0050]
  • The image processing means 14 performs a process for emphasizing an image (in particular, a character) at the time of printing out. General methods of the process are filtering, gamma correction, etc. A filter coefficient or a gamma correction table is switched in accordance with the second discrimination data. [0051]
  • The image output means 15 uses the output image data (corresponding to the ink amount of each color in the case of a printer) generated by the image processing means 14, and transfers ink onto a printing medium (paper, etc.). [0052]
  • FIG. 2 shows an example of the structure of the image development means 11. The image development means 11 comprises a CPU 21, a RAM 22 and a page memory 23. The page information received by the image development means 11 is converted to first image data and first discrimination data by the CPU 21; the converted data is then developed in the page memory 23 and transmitted pixel by pixel. [0053]
  • FIG. 3 shows an example of the structure of the discrimination data generating means 12. The discrimination data generating means 12 comprises line buffers 31 a and 31 b, an edge detection section 32, a color detection section 33 and a synthetic determination section 34. [0054]
  • The first image data transmitted from the image development means 11 is input to the line buffer 31 a of the discrimination data generating means 12. The first image data is accumulated in the line buffer 31 a by several lines, thereby forming block data. [0055]
  • The first image data output from the line buffer 31 a is sent to the edge detection section 32, and it is determined for each color component whether a center pixel ("pixel of interest") of the block corresponds to an edge portion. [0056]
  • In addition, the first image data output from the line buffer 31 a is sent to the color detection section 33, and it is determined based on the chroma whether the pixel of interest has an achromatic color or a chromatic color. [0057]
  • On the other hand, the first discrimination data transmitted from the image development means 11 is input to the line buffer 31 b of the discrimination data generating means 12. The line buffer 31 b is used for establishing synchronism with the first image data. [0058]
  • The synthetic determination section 34 outputs second discrimination data by performing synthetic determination on the basis of the edge detection result from the edge detection section 32, the determination result from the color detection section 33, and the first discrimination data synchronized by the line buffer 31 b. [0059]
  • FIG. 4 shows an example of the structure of the edge detection section 32 in the discrimination data generating means 12. The edge detection section 32 comprises multipliers 41 a and 41 b, adders 42 a and 42 b, positive number generators 43 a and 43 b, an adder 44 and a comparator 45. The edge detection section 32 is provided for each of the color component image signals C, M, Y and K of the first image data input from the line buffer 31 a, and the edge detection is performed in parallel. [0060]
  • The multiplier 41 a multiplies a 3×3 matrix of the first image data with the coefficients (edge detection operators) shown in FIG. 4 by symbol A. The adder 42 a adds the calculated values of the multiplier 41 a. The positive number generator 43 a produces an absolute value of the value calculated by the adder 42 a. [0061]
  • The multiplier 41 b multiplies a 3×3 matrix of the first image data with the coefficients (edge detection operators) shown in FIG. 4 by symbol B. The adder 42 b adds the calculated values of the multiplier 41 b. The positive number generator 43 b produces an absolute value of the value calculated by the adder 42 b. [0062]
  • Subsequently, the adder 44 adds the two absolute values obtained by the positive number generators 43 a and 43 b. The comparator 45 compares the added value with a predetermined value, thereby determining the presence/absence of an edge. [0063]
  • The comparison result of the comparator 45 is output to the synthetic determination section 34 as an edge determination result EC, EM, EY or EK, in association with the color component image signal C, M, Y or K in the first image data input from the line buffer 31 a. [0064]
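  • The edge detection chain above amounts to a gradient-magnitude test. A minimal Python sketch follows; since the text does not reproduce the coefficient sets A and B of FIG. 4, Sobel-style horizontal and vertical operators and the threshold value are assumptions.

```python
import numpy as np

# Stand-in 3x3 edge detection operators; FIG. 4 defines the actual
# coefficient sets A and B, so Sobel-style operators are assumed here.
OP_A = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
OP_B = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]])

def detect_edge(block, threshold=128):
    """Decide whether the center pixel of a 3x3 block of one color
    component (C, M, Y or K) lies on an edge."""
    a = abs(int(np.sum(block * OP_A)))  # multiplier 41a, adder 42a, generator 43a
    b = abs(int(np.sum(block * OP_B)))  # multiplier 41b, adder 42b, generator 43b
    return (a + b) > threshold          # adder 44 and comparator 45
```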
  • FIG. 5 shows an example of the structure of the color detection section 33 in the discrimination data generating means 12. The color detection section 33 comprises subtracters 51 a, 51 b and 51 c, positive number generators 52 a, 52 b and 52 c, a maximum value selector 53, a comparator 54, digitizers 55 a, 55 b, 55 c and 55 d, selectors 56 a, 56 b, 56 c and 56 d, AND gates 57 a, 57 b and 57 c, and a NOT gate 58. [0065]
  • The subtracter 51 a calculates a difference in density between the color components (C, Y) of image signals of the first image data input from the line buffer 31 a, and outputs the difference to the positive number generator 52 a. The positive number generator 52 a produces an absolute value of the input density difference between the color components (C, Y), and outputs the absolute value to the maximum value selector 53. [0066]
  • The subtracter 51 b calculates a difference in density between the color components (C, M) of image signals of the first image data input from the line buffer 31 a, and outputs the difference to the positive number generator 52 b. The positive number generator 52 b produces an absolute value of the input density difference between the color components (C, M), and outputs the absolute value to the maximum value selector 53. [0067]
  • The subtracter 51 c calculates a difference in density between the color components (M, Y) of image signals of the first image data input from the line buffer 31 a, and outputs the difference to the positive number generator 52 c. The positive number generator 52 c produces an absolute value of the input density difference between the color components (M, Y), and outputs the absolute value to the maximum value selector 53. [0068]
  • The maximum value selector 53 selects the maximum of the values input from the positive number generators 52 a, 52 b and 52 c, and outputs the maximum value to the comparator 54. [0069]
  • The comparator 54 compares the input maximum value with a predetermined value, and determines whether the color is achromatic or chromatic. [0070]
  • On the other hand, the digitizer 55 a digitizes the density of the color component image signal C of the first image data input from the line buffer 31 a. The digitizer 55 b digitizes the density of the color component image signal M of the first image data input from the line buffer 31 a. The digitizer 55 c digitizes the density of the color component image signal Y of the first image data input from the line buffer 31 a. The digitizer 55 d digitizes the density of the color component image signal K of the first image data input from the line buffer 31 a. [0071]
  • The digitized results indicate to the synthetic determination section which color components are effective. When the digitized result of the image signal K, i.e. the output of the digitizer 55 d, is "1", a black over-print process (for incorporating a background density in the image data of the color components C, M and Y) may have been performed at the time of image development. For this reason, an AND value between the digitized result of each image signal C, M, Y and an inverted value of the digitized result of the image signal K is obtained. [0072]
  • Specifically, the digitized result of the digitizer 55 a and an inverted value of the digitized result of the digitizer 55 d are input to the AND gate 57 a to produce an AND value. The digitized result of the digitizer 55 b and an inverted value of the digitized result of the digitizer 55 d are input to the AND gate 57 b to produce an AND value. The digitized result of the digitizer 55 c and an inverted value of the digitized result of the digitizer 55 d are input to the AND gate 57 c to produce an AND value. [0073]
  • The selector 56 a receives the comparison result of the comparator 54 and the AND value of the AND gate 57 a, selects one of them, and outputs a select result SC. The selector 56 b receives the comparison result of the comparator 54 and the AND value of the AND gate 57 b, selects one of them, and outputs a select result SM. The selector 56 c receives the comparison result of the comparator 54 and the AND value of the AND gate 57 c, selects one of them, and outputs a select result SY. The selector 56 d receives the comparison result of the comparator 54, which has been inverted by the NOT gate 58, and the digitized result of the digitizer 55 d, selects one of them, and outputs a select result SK. [0074]
  • This operation is performed since it is necessary to effect switching between the use as a copying machine and the use as a printer. [0075]
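  • The whole section reduces to a few lines of logic, as in the sketch below. The thresholds, argument names and the printer_mode flag that models the copier/printer switching just mentioned are assumptions; the return values correspond to the select results SC, SM, SY and SK.

```python
def detect_color(c, m, y, k, chroma_threshold=32, density_threshold=16,
                 printer_mode=True):
    """Per-pixel color detection; c, m, y, k are plain integer densities."""
    # Subtracters 51a-51c and positive number generators 52a-52c feed the
    # maximum value selector 53; comparator 54 decides achromatic/chromatic.
    chromatic = max(abs(c - y), abs(c - m), abs(m - y)) > chroma_threshold

    # Digitizers 55a-55d binarize the density of each color component.
    dc, dm, dy, dk = (v > density_threshold for v in (c, m, y, k))

    if printer_mode:
        # AND gates 57a-57c: C, M and Y are effective only where K is not
        # set, tolerating a black over-print performed at development time.
        return dc and not dk, dm and not dk, dy and not dk, dk
    # Copier-style use: the selectors pick the chroma decision instead.
    return chromatic, chromatic, chromatic, not chromatic
```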
  • FIG. 6 shows an example of the structure of the synthetic determination section 34 in the discrimination data generating means 12. The synthetic determination section 34 comprises converters 61 a, 61 b, 61 c and 61 d, and AND gates 62 a, 62 b, 62 c and 62 d. [0076]
  • The signals EC, EM, EY and EK input from the edge detection sections 32 associated with the image signals C, M, Y and K represent the edge detection results of C, M, Y and K. The signals SC, SM, SY and SK input from the color detection section 33 represent the color detection results of C, M, Y and K. [0077]
  • The converter 61 a receives the edge detection result EC from the edge detection section 32 and the first discrimination data from the line buffer 31 b, and outputs desired converted discrimination data based on them. [0078]
  • The converter 61 b receives the edge detection result EM from the edge detection section 32 and the first discrimination data from the line buffer 31 b, and outputs desired converted discrimination data based on them. [0079]
  • The converter 61 c receives the edge detection result EY from the edge detection section 32 and the first discrimination data from the line buffer 31 b, and outputs desired converted discrimination data based on them. [0080]
  • The converter 61 d receives the edge detection result EK from the edge detection section 32 and the first discrimination data from the line buffer 31 b, and outputs desired converted discrimination data based on them. [0081]
  • FIG. 7 shows an example of conversion by the converters 61 a, 61 b, 61 c and 61 d. In FIG. 7, the first discrimination data is classified such that a character described as font data with a predetermined size or less is "TEXT", an object described as line description data or painted-out data and a character other than "TEXT" are "GRAPHIC", and an object other than "TEXT" and "GRAPHIC" is "IMAGE". [0082]
  • For example, when the first discrimination data is “TEXT” and the edge detection result is “EDGE”, the second discrimination data is output as “NEW-TEXT” (conversion result). When the first discrimination data is “IMAGE” and the edge detection result is “NON-EDGE”, the second discrimination data is output as “NEW-GRAPHIC” (conversion result). [0083]
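  • Such a converter is essentially a small lookup table keyed by the first discrimination data and the edge detection result. In the sketch below, only the two combinations quoted in the preceding paragraph are taken from the text; the remaining entries are assumptions for illustration.

```python
# Conversion rules in the spirit of FIG. 7, keyed by
# (first discrimination data, edge detection result).
CONVERSION = {
    ("TEXT",    "EDGE"):     "NEW-TEXT",     # quoted in the text
    ("TEXT",    "NON-EDGE"): "NEW-TEXT",     # assumed
    ("GRAPHIC", "EDGE"):     "NEW-TEXT",     # assumed
    ("GRAPHIC", "NON-EDGE"): "NEW-GRAPHIC",  # assumed
    ("IMAGE",   "EDGE"):     "NEW-TEXT",     # assumed
    ("IMAGE",   "NON-EDGE"): "NEW-GRAPHIC",  # quoted in the text
}

def convert(first_disc, edge_result):
    """Converters 61a-61d: turn first discrimination data plus an edge
    detection result into the desired second discrimination data."""
    return CONVERSION[(first_disc, edge_result)]
```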
  • The desired discrimination data (second discrimination data) output from the converter 61 a is input to the AND gate 62 a. The AND gate 62 a produces second discrimination data DC as an AND value between the desired discrimination data input from the converter 61 a and the color detection result SC input from the color detection section 33. [0084]
  • The desired discrimination data (second discrimination data) output from the converter 61 b is input to the AND gate 62 b. The AND gate 62 b produces second discrimination data DM as an AND value between the desired discrimination data input from the converter 61 b and the color detection result SM input from the color detection section 33. [0085]
  • The desired discrimination data (second discrimination data) output from the converter 61 c is input to the AND gate 62 c. The AND gate 62 c produces second discrimination data DY as an AND value between the desired discrimination data input from the converter 61 c and the color detection result SY input from the color detection section 33. [0086]
  • The desired discrimination data (second discrimination data) output from the converter 61 d is input to the AND gate 62 d. The AND gate 62 d produces second discrimination data DK as an AND value between the desired discrimination data input from the converter 61 d and the color detection result SK input from the color detection section 33. [0087]
  • FIG. 8 shows an example of the structure of the image data generating means 13. The image data generating means 13 comprises line buffers 71 a and 71 b, a background density averaging section 72, a character density averaging section 73, and a selector 74. [0088]
  • The line buffer 71 a accumulates n lines of the first image data output from the image development means 11. [0089]
  • The line buffer 71 b accumulates n lines of the second discrimination data output from the discrimination data generating means 12. [0090]
  • The background density averaging section 72 calculates the average density of each of the color components C, M and Y over those pixels within the n×n pixels around the pixel of interest whose second discrimination data DK on the color component K is not "NEW-TEXT". [0091]
  • On the other hand, the character density averaging section 73 calculates the average density of each of the color components C, M, Y and K within an area of m×m pixels (m≦n) around the pixel of interest. [0092]
  • The selector 74 outputs second image data by properly replacing the pixel values in accordance with the second discrimination data on the pixel of interest. [0093]
  • For example, when the second discrimination data DK of the pixel of interest is "NEW-TEXT" and all the pixel values of C, M and Y are zero, the data C, M, Y of the pixel of interest shown in FIG. 9A is changed to the output value of the background density averaging section 72 as shown in FIG. 9B (over-print process or trapping process). Specifically, the first image data C, M, Y, K shown in FIG. 9A is replaced with the output value of the second image data C, M, Y, K shown in FIG. 9B. [0094]
  • Similarly, when the second discrimination data DK of the pixel of interest is "NEW-TEXT", the pixel value of K of the pixel of interest is replaced with the output value of the character density averaging section 73, as shown in a, b and c of FIG. 10 (smoothing process, etc.). [0095]
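  • For a single pixel of interest, the selector logic described above can be sketched as follows. The window sizes n and m are assumptions (the text only requires m≦n), border handling is omitted, and the array layout is invented for illustration.

```python
import numpy as np

def correct_pixel(img, disc_k, i, j, n=5, m=3):
    """Selector 74 for one pixel of interest (i, j). img is an H x W x 4
    CMYK array; disc_k holds the second discrimination data DK per pixel."""
    out = img[i, j].copy()
    if disc_k[i, j] != "NEW-TEXT":
        return out
    # Background density averaging section 72: mean C, M, Y over the
    # n x n window, excluding pixels whose DK is "NEW-TEXT".
    win = img[i - n // 2:i + n // 2 + 1, j - n // 2:j + n // 2 + 1]
    keep = disc_k[i - n // 2:i + n // 2 + 1, j - n // 2:j + n // 2 + 1] != "NEW-TEXT"
    if tuple(out[:3]) == (0, 0, 0) and keep.any():
        out[:3] = win[keep][:, :3].mean(axis=0)  # over-print / trapping
    # Character density averaging section 73: mean K over the m x m window.
    out[3] = img[i - m // 2:i + m // 2 + 1, j - m // 2:j + m // 2 + 1, 3].mean()
    return out
```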
  • The processing of the image data generating means 13 has been described above merely by way of example, and the content of the processing is not limited to the above-described one. [0096]
  • FIG. 11 shows an example of the structure of the image processing means 14. The image processing means 14 comprises line buffers 101 a and 101 b, a filter section 102, a gamma correction section 103, and a screen processing section 104. [0097]
  • The line buffer 101 a accumulates several lines of the second image data generated by the image data generating means 13 for the purpose of filter processing. [0098]
  • The line buffer 101 b outputs the second discrimination data on the pixel of interest (the center pixel of the image matrix) in synchronism with the second image data. [0099]
  • The filter section 102 multiplies each pixel of the image matrix buffered by the line buffer 101 a with a predetermined coefficient, thus calculating the sum. In this case, the filter section 102 changes the coefficient for multiplication in accordance with the second discrimination data output synchronously from the line buffer 101 b. [0100]
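  • A per-pixel version of this coefficient switching might look like the following; the two kernels are assumed stand-ins for the actual design coefficients.

```python
import numpy as np

# Assumed coefficient sets, switched per pixel by the second
# discrimination data: emphasis for character edges, smoothing otherwise.
KERNELS = {
    "NEW-TEXT":    np.array([[0, -1, 0], [-1, 8, -1], [0, -1, 0]]) / 4.0,
    "NEW-GRAPHIC": np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 16.0,
}

def filter_pixel(matrix3x3, second_disc):
    """Filter section 102: multiply the buffered image matrix by the
    selected coefficient set and take the sum."""
    kernel = KERNELS.get(second_disc, KERNELS["NEW-GRAPHIC"])
    return float(np.sum(matrix3x3 * kernel))
```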
  • The gamma correction section 103 corrects each pixel of the second image data for each color component, using correction tables as shown in FIGS. 12 to 19. In this case, the gamma correction section 103 switches the correction table in accordance with the second discrimination data output synchronously from the line buffer 101 b. [0101]
  • A correction table shown in FIG. 12 relates to correction of color component C in a case where the second discrimination data is “NEW-TEXT”. [0102]
  • A correction table shown in FIG. 13 relates to correction of color component C in a case where the second discrimination data is not “NEW-TEXT”. [0103]
  • A correction table shown in FIG. 14 relates to correction of color component M in a case where the second discrimination data is “NEW-TEXT”. [0104]
  • A correction table shown in FIG. 15 relates to correction of color component M in a case where the second discrimination data is not “NEW-TEXT”. [0105]
  • A correction table shown in FIG. 16 relates to correction of color component Y in a case where the second discrimination data is “NEW-TEXT”. [0106]
  • A correction table shown in FIG. 17 relates to correction of color component Y in a case where the second discrimination data is not “NEW-TEXT”. [0107]
  • A correction table shown in FIG. 18 relates to correction of color component K in a case where the second discrimination data is “NEW-TEXT”. [0108]
  • A correction table shown in FIG. 19 relates to correction of color component K in a case where the second discrimination data is not “NEW-TEXT”. [0109]
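  • The eight tables can be modelled as simple lookup arrays selected by color component and by whether the second discrimination data is "NEW-TEXT", as sketched below. The gamma exponents are assumptions standing in for the curves actually plotted in FIGS. 12 to 19.

```python
import numpy as np

def make_table(gamma):
    """Build one 256-entry correction table from an assumed gamma curve."""
    x = np.arange(256) / 255.0
    return np.round(255.0 * x ** gamma).astype(np.uint8)

# One table per color component and per NEW-TEXT/other case, as in
# FIGS. 12 to 19; text pixels are assumed to get a denser mid-tone curve.
TABLES = {(comp, is_text): make_table(0.8 if is_text else 1.0)
          for comp in "CMYK" for is_text in (True, False)}

def gamma_correct(value, comp, second_disc):
    """Gamma correction section 103: switch the correction table in
    accordance with the synchronously supplied second discrimination data."""
    return TABLES[(comp, second_disc == "NEW-TEXT")][value]
```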
  • The screen processing section 104 processes each pixel of the corrected second image data input from the gamma correction section 103 in accordance with the second discrimination data input synchronously from the line buffer 101 b, thereby outputting image data of each color component matching with the image output means 15 in the rear stage. The processing is, for example, an error spreading process for converting image data of 8 bits per pixel (256 tone levels) to image data of 1 bit (2 tone levels). [0110]
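  • The error spreading process named here is commonly known as error diffusion. A minimal sketch for one color plane follows; Floyd-Steinberg weights are assumed, since the patent does not fix particular weights.

```python
import numpy as np

def error_spread(plane):
    """Reduce one 8-bit color plane (256 tone levels) to 1 bit (2 levels),
    pushing each pixel's quantization error onto unprocessed neighbours."""
    buf = plane.astype(np.float32)
    out = np.zeros(plane.shape, dtype=np.uint8)
    h, w = buf.shape
    for i in range(h):
        for j in range(w):
            new = 255.0 if buf[i, j] >= 128.0 else 0.0
            out[i, j] = 1 if new else 0
            err = buf[i, j] - new
            if j + 1 < w:
                buf[i, j + 1] += err * 7 / 16          # right
            if i + 1 < h:
                if j > 0:
                    buf[i + 1, j - 1] += err * 3 / 16  # below left
                buf[i + 1, j] += err * 5 / 16          # below
                if j + 1 < w:
                    buf[i + 1, j + 1] += err * 1 / 16  # below right
    return out
```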
  • The image output means 15 transfers the output image data from the screen processing section 104 onto a printing medium (paper or the like). [0111]
  • In the first embodiment, the first discrimination data is generated by the image development means and the second discrimination data is generated by the discrimination data generating means, for example, in the following manner. [0112]
  • a) The image development means generates first discrimination data that discriminates whether each pixel is associated with a character or a line figure, and the discrimination data generating means generates second discrimination data that does not discriminate whether each pixel is associated with a character or a line figure, using the first discrimination data generated by the image development means. [0113]
  • The character is an object disposed in the first image data as font data. [0114]
  • The line figure is an object described by a straight line and a curve. [0115]
  • b) The image development means generates first discrimination data that does not discriminate whether each pixel is associated with a line figure or a plane figure, and the discrimination data generating means generates second discrimination data that discriminates whether each pixel is associated with a line figure or a plane figure, using the first discrimination data generated by the image development means. [0116]
  • The plane figure is an object, the entirety or each component of which is painted out with uniform density. [0117]
  • c) The image development means generates first discrimination data that does not discriminate whether each pixel is associated with a contour portion or an inside portion of a plane figure, and the discrimination data generating means generates second discrimination data that discriminates whether each pixel is associated with a contour portion or an inside portion of a plane figure, using the first discrimination data generated by the image development means. [0118]
  • d) The image development means generates first discrimination data that discriminates whether each pixel is associated with a plane figure or a tone image, and the discrimination data generating means generates second discrimination data that does not discriminate whether each pixel is associated with a plane figure or a tone image, using the first discrimination data generated by the image development means. [0119]
  • e) The image development means generates first discrimination data that discriminates that each pixel is associated with a tone image, and the discrimination data generating means generates second discrimination data that discriminates the magnitude of density variation in each pixel, using the first discrimination data generated by the image development means. [0120]
  • As has been described above, the first embodiment comprises the discrimination data generating means for generating the second discrimination data on the basis of the first image data and the first discrimination data generated from the page information described in the page description language, and the image data generating means for correcting the first image data on the basis of the second discrimination data and generating the second image data, thereby performing an image quality enhancing process matching with the output characteristics of the printer. [0121]
  • Second to sixth embodiments of the invention will now be described. [0122]
  • FIG. 20 shows the structure of an image processing apparatus 2 according to a second embodiment. [0123]
  • The main difference between the image processing apparatus 2 of the second embodiment and the image processing apparatus 1 shown in FIG. 1 is that a discrimination data generating means 122 generates second discrimination data without using the first discrimination data generated by image development means 121. Thereby, the independence of the first discrimination data and the second discrimination data is enhanced, and a greater degree of freedom is provided in the circuit configuration. [0124]
  • However, when the image data generating means 123 generates the second image data and when the image processing means 124 switches the processing, both the first discrimination data and the second discrimination data need to be referred to. [0125]
  • FIG. 21 shows the structure of an image processing apparatus 3 according to a third embodiment. [0126]
  • In the image processing apparatus 3 of the third embodiment, the image data generating means 123 of the image processing apparatus 2 shown in FIG. 20 is omitted. Since the image processing apparatus 3 of the third embodiment does not generate the second image data, the line memory, etc. are not needed and the image processing apparatus can be formed at low cost. [0127]
  • FIG. 22 shows the structure of an image processing apparatus 4 according to a fourth embodiment. In the image processing apparatus 4 of the fourth embodiment, the controller unit (image development means 11) of the image processing apparatus 1 shown in FIG. 1 is omitted and provided as an external element. In addition, interface means (data input means 141) serving as an interface with the external controller and discrimination type setting means 146 are provided. [0128]
  • The data input means 141 of the image processing apparatus 4 is, for example, an interface unit of a LAN (Local Area Network). [0129]
  • The discrimination type setting means 146 is a means for setting the type of the first discrimination data input by the external controller. Specification information of the external controller is input to the discrimination type setting means 146, and the discrimination type setting means 146 is preset by the operation of a user, a manager, a designer, etc. [0130]
  • The discrimination types of the first discrimination data described in connection with the first embodiment are "TEXT", "GRAPHIC" and "IMAGE", and the correspondence of the three discrimination types as shown in FIG. 7 is registered (set) by the discrimination type setting means 146. [0131]
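  • Conceptually, the discrimination type setting means holds a registered correspondence between the controller's attribute types and the internal classes, as in the sketch below. The class name, its API and the external type names "FONT", "VECTOR" and "RASTER" are hypothetical.

```python
class DiscriminationTypeSetting:
    """Hold the correspondence between the attribute types an external
    controller emits and the internal discrimination types."""

    def __init__(self):
        self.correspondence = {}

    def register(self, external_type, internal_type):
        self.correspondence[external_type] = internal_type

    def resolve(self, external_type):
        return self.correspondence[external_type]

# Preset by a user, manager or designer from the controller's
# specification information; the external names are hypothetical.
setting = DiscriminationTypeSetting()
for ext, internal in (("FONT", "TEXT"), ("VECTOR", "GRAPHIC"),
                      ("RASTER", "IMAGE")):
    setting.register(ext, internal)
```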
  • With this structure, an external controller that generates any kind of discrimination data can be connected to the image processing apparatus 4. [0132]
  • FIG. 23 shows the structure of an image processing apparatus 5 according to a fifth embodiment. In the image processing apparatus 5 of the fifth embodiment, the controller unit (image development means 121) of the image processing apparatus 2 shown in FIG. 20 is omitted and provided as an external element. In addition, interface means (data input means 151) serving as an interface with the external controller and discrimination type setting means 156 are provided. [0133]
  • The data input means 151 of the image processing apparatus 5 is, for example, an interface unit of a LAN (Local Area Network). [0134]
  • The discrimination type setting means 156 is a means for setting the type of the first discrimination data input by the external controller. Specification information of the external controller is input to the discrimination type setting means 156, and the discrimination type setting means 156 is preset by the operation of a user, a manager, a designer, etc. [0135]
  • The discrimination types of the first discrimination data described in connection with the first embodiment are "TEXT", "GRAPHIC" and "IMAGE", and the correspondence of the three discrimination types as shown in FIG. 7 is registered (set) by the discrimination type setting means 156. [0136]
  • With this structure, an external controller that generates any kind of discrimination data can be connected to the image processing apparatus 5. [0137]
  • FIG. 24 shows the structure of an image processing apparatus 6 according to a sixth embodiment. In the image processing apparatus 6 of the sixth embodiment, the controller unit (image development means 131) of the image processing apparatus 3 shown in FIG. 21 is omitted and provided as an external element. In addition, interface means (data input means 161) serving as an interface with the external controller and discrimination type setting means 165 are provided. [0138]
  • The data input means 161 of the image processing apparatus 6 is, for example, an interface unit of a LAN (Local Area Network). [0139]
  • The discrimination type setting means 165 is a means for setting the type of the first discrimination data input by the external controller. Specification information of the external controller is input to the discrimination type setting means 165, and the discrimination type setting means 165 is preset by the operation of a user, a manager, a designer, etc. [0140]
  • The discrimination types of the first discrimination data described in connection with the first embodiment are "TEXT", "GRAPHIC" and "IMAGE", and the correspondence of the three discrimination types as shown in FIG. 7 is registered (set) by the discrimination type setting means 165. [0141]
  • With this structure, an external controller that generates any kind of discrimination data can be connected to the image processing apparatus 6. [0142]
  • As has been described above, according to the embodiments of the present invention, a high-image-quality image process matching with output characteristics of a printer can be performed, even in a case where an ordinary printer controller is used. [0143]

Claims (20)

What is claimed is:
1. An image processing apparatus comprising:
image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language;
discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data and the first discrimination data generated by the image development means;
image data generating means for generating second image data by correcting the first image data generated by the image development means on the basis of the second discrimination data generated by the discrimination data generating means;
image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means; and
image output means for outputting image data processed by the image processing means.
2. An image processing apparatus according to claim 1, wherein the image development means generates first discrimination data that discriminates whether each pixel is associated with a character, or a line figure described by a straight line and a curve.
3. An image processing apparatus according to claim 1, wherein the discrimination data generating means generates second discrimination data that does not discriminate whether each pixel is associated with a character, or a line figure described by a straight line and a curve, using the first image data generated by the image development means.
4. An image processing apparatus according to claim 1, wherein the image development means generates first discrimination data that does not discriminate whether each pixel is associated with a line figure described by a straight line and a curve, or a plane figure, the entirety or each component of which is painted out with uniform density.
5. An image processing apparatus according to claim 1, wherein the discrimination data generating means generates second discrimination data that discriminates whether each pixel is associated with a line figure described by a straight line and a curve, or a plane figure, the entirety or each component of which is painted out with uniform density, using the first image data generated by the image development means.
6. An image processing apparatus according to claim 1, wherein the image development means generates first discrimination data that does not discriminate between a contour portion and an inside portion of a plane figure painted out with uniform density.
7. An image processing apparatus according to claim 1, wherein the discrimination data generating means generates second discrimination data that discriminates between a contour portion and an inside portion of a plane figure painted out with uniform density, using the first image data generated by the image development means.
8. An image processing apparatus according to claim 1, wherein the image development means generates first discrimination data that discriminates between a plane figure painted out with uniform density and a tone image.
9. An image processing apparatus according to claim 1, wherein the discrimination data generating means generates second discrimination data that does not discriminate between a plane figure painted out with uniform density and a tone image, using the first image data generated by the image development means.
10. An image processing apparatus according to claim 1, wherein the image development means generates first discrimination data that discriminates that each pixel is associated with a tone image.
11. An image processing apparatus according to claim 1, wherein the discrimination data generating means generates second discrimination data that discriminates the magnitude of density variation in each pixel, using the first image data generated by the image development means.
12. An image processing apparatus according to claim 1, wherein the discrimination data generating means generates, when the first image data generated by the image development means is color image data comprising plural color components, second discrimination data which represents attributes of each pixel for each color component and is different from the first discrimination data, using the color image data.
13. An image processing apparatus according to claim 1, wherein the image data generating means generates, where the first image data generated by the image development means is color image data comprising plural color components and where at least one color component of each pixel of the color image data is associated with a character or a line figure described by a straight line and a curve, second image data by replacing the data other than said color component with data of a peripheral pixel of said pixel and thus correcting the first image data.
14. An image processing apparatus according to claim 1, wherein the image data generating means generates second image data by subjecting a pixel of a line figure described by a straight line and a curve or a character of the first image data generated by the image development means to a smoothing process for providing a smooth density variation, on the basis of the second discrimination data generated by the discrimination data generating means.
15. An image processing apparatus comprising:
image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language;
discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data generated by the image development means;
image data generating means for generating second image data by correcting the first image data generated by the image development means on the basis of the second discrimination data generated by the discrimination data generating means and the first discrimination data generated by the image development means;
image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the first discrimination data generated by the image development means and the second discrimination data generated by the discrimination data generating means; and
image output means for outputting image data processed by the image processing means.
16. An image processing apparatus comprising:
image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language;
discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data generated by the image development means or using the first image data and the first discrimination data;
image processing means for subjecting the first image data generated by the image development means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means and the first discrimination data generated by the image development means; and
image output means for outputting image data processed by the image processing means.
17. An image processing apparatus comprising:
input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data;
setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means;
discrimination data generating means for generating second discrimination data different from the first discrimination data, using the type of attributes set by the setting means and the first image data and the first discrimination data input by the input means;
image data generating means for generating second image data by correcting the first image data input by the input means on the basis of the second discrimination data generated by the discrimination data generating means;
image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means; and
image output means for outputting image data processed by the image processing means.
18. An image processing apparatus comprising:
input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data;
discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data input by the input means;
setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means;
image data generating means for generating second image data by correcting the first image data input by the input means on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means;
image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; and
image output means for outputting image data processed by the image processing means.
19. An image processing apparatus comprising:
input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data;
discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data input by the input means;
setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means;
image processing means for subjecting the first image data input by the input means to a predetermined process on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; and
image output means for outputting image data processed by the image processing means.
20. An image processing method for image-processing information described in a page description language, and outputting an image, comprising:
generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of the information described in the page description language;
generating second discrimination data different from the first discrimination data, using the generated first image data and first discrimination data;
generating second image data by correcting the generated first image data on the basis of the generated second discrimination data;
subjecting the generated second image data to a predetermined process on the basis of the generated second discrimination data; and
outputting processed image data.
US09/921,703 2000-09-21 2001-08-06 Image processing apparatus and image processing method Abandoned US20030025926A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2001290212A JP2003051939A (en) 2001-08-06 2001-09-21 Image processing apparatus and image processing method


Publications (1)

Publication Number Publication Date
US20030025926A1 US20030025926A1 (en) 2003-02-06

Family

ID=34640449

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/921,703 Abandoned US20030025926A1 (en) 2000-09-21 2001-08-06 Image processing apparatus and image processing method


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6642993B2 (en) 2001-12-27 2003-11-04 Kabushiki Kaisha Toshiba Image processing device and method for controlling the same
US20040234134A1 (en) * 2003-05-19 2004-11-25 Kabushiki Kaisha Toshiba Image processing apparatus and image processing method
US20050185071A1 (en) * 2004-01-23 2005-08-25 Sanyo Electric Co., Ltd. Image signal processing apparatus
US8704447B2 (en) 2009-05-28 2014-04-22 Citizen Holdings Co., Ltd. Light source device


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5724444A (en) * 1994-09-16 1998-03-03 Kabushiki Kaisha Toshiba Image forming apparatus
US5875036A (en) * 1990-11-08 1999-02-23 Canon Kabushiki Kaisha Image processing apparatus which separates an image signal according to density level
US20010013953A1 (en) * 1999-12-27 2001-08-16 Akihiko Uekusa Image-processing method, image-processing device, and storage medium
US6549657B2 (en) * 1995-04-06 2003-04-15 Canon Kabushiki Kaisha Image processing apparatus and method
US6631207B2 (en) * 1998-03-18 2003-10-07 Minolta Co., Ltd. Image processor including processing for image data between edge or boundary portions of image data





Similar Documents

Publication Publication Date Title
US20030025926A1 (en) Image processing apparatus and image processing method
US7760934B2 (en) Color to grayscale conversion method and apparatus utilizing a high pass filtered chrominance component
US5428377A (en) Color spatial filtering for thermal ink jet printers
US8477324B2 (en) Image processor and image processing method that uses s-shaped gamma curve
EP1014694B1 (en) Automated enhancement of print quality based on feature size, shape, orientation, and color
JP4498233B2 (en) Image processing apparatus and image processing method
EP1073260B1 (en) Image processing device, image forming device incorporating the same, and storage medium for storing program used thereby
EP1349371A2 (en) Image processing apparatus, image processing program and storage medium storing the program
JP2012518303A (en) Image processing system for processing digital image and image processing method for processing digital image
US5784496A (en) Error sum method and apparatus for intercolor separation control in a printing system
JP4386216B2 (en) Color printing system and control method thereof
US7315398B2 (en) Multi-level error diffusion with color image data
JP4153568B2 (en) How to determine the colorant to be used for printing gray areas
JP4377249B2 (en) Ink consumption reduction error diffusion
US7295347B2 (en) Image processing method for generating multi-level data
JP6736299B2 (en) Printing device, printing method, and program
US20100079818A1 (en) Image forming apparatus to improve image quality and image quality improvement method
US6249354B1 (en) Image processing apparatus and method
US20030197897A1 (en) Quantization apparatus and method, and inkjet printing apparatus
JP2000341547A (en) Device and method for image processing
WO2022065299A1 (en) Image formation device
JP2001309188A (en) Image processing unit and image processing method
JP2007087190A (en) Image processing device and program
JP2003051939A (en) Image processing apparatus and image processing method
JP4124900B2 (en) Color printer and control method thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: TOSHIBA TEC KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FUCHIGAMI, TAKAHIRO;TABATA, SUNAO;REEL/FRAME:012054/0696

Effective date: 20010717

AS Assignment

Owner name: TOSHIBA TEC KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT (ONE-HALF INTEREST);ASSIGNOR:TOSHIBA TEC KABUSHIKI KAISHA;REEL/FRAME:014118/0099

Effective date: 20030530

Owner name: KABUSHIKI KAISHA TOSHIBA, JAPAN

Free format text: ASSIGNMENT (ONE-HALF INTEREST);ASSIGNOR:TOSHIBA TEC KABUSHIKI KAISHA;REEL/FRAME:014118/0099

Effective date: 20030530

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION