US20060109263A1 - Universal computing device

Universal computing device

Info

Publication number
US20060109263A1
Authority
US
United States
Prior art keywords
input device
data
location
image
representative
Prior art date
Legal status
Abandoned
Application number
US11/329,060
Inventor
Jian Wang
Chunhui Zhang
Current Assignee
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date
Filing date
Publication date
Application filed by Microsoft Corp filed Critical Microsoft Corp
Priority to US11/329,060
Publication of US20060109263A1
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC reassignment MICROSOFT TECHNOLOGY LICENSING, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MICROSOFT CORPORATION

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03545 Pens or stylus

Definitions

  • This disclosure relates to a computer input device for generating smooth electronic ink. More particularly, the disclosure relates to an input device that may be used on divergent platforms while providing a common user interface.
  • One user interface that is bridging the gap between advanced computing systems and the functionality of printed paper is a stylus-based user interface.
  • One approach for the stylus-based user interface is to use resistive technology (common in today's PDAs).
  • Another approach is to use active sensors in a notebook computer.
  • One of the next goals of the computing world is to bring the user interface for operating the computer back to the user.
  • a drawback associated with the use of a stylus is that such devices are tied to the computing device containing the sensor board. In other words, the stylus may only be used to generate inputs when used in conjunction with the required sensor board. Moreover, detection of a stylus is affected by the proximity of the stylus to the sensing board.
  • aspects of the present invention address one or more of the issues identified above, thereby providing a common user interface to users across divergent computing platforms.
  • aspects of the present invention relate to an input device for generating electronic ink, and/or generating other inputs, independent of the device for which the data is intended.
  • the input device may be formed in the shape of a pen, and may or may not include an ink cartridge to facilitate movement of the input device in a familiar manner.
  • FIG. 1 shows a general description of a computer that may be used in conjunction with embodiments of the present invention.
  • FIG. 2 illustrates an input device (including all of the components) in accordance with an illustrative embodiment of the present invention.
  • FIG. 3 provides three illustrative embodiments of a camera system for use in accordance with aspects of the present invention.
  • FIG. 4 illustrates a technique (a maze pattern) for encoding locations within a document.
  • FIG. 5 provides an illustration of a trace pattern from which electronic ink may be generated.
  • FIG. 6 shows the hardware architecture of a system in accordance with one embodiment of the present invention.
  • FIG. 7 illustrates a further combination of components incorporated in an input device for generating electronic ink in accordance with another illustrative embodiment.
  • FIG. 8 illustrates uses of an input device in accordance with several illustrative embodiments of the present invention.
  • aspects of the present invention relate to an input device that may be used with a variety of different computing platforms, from controlling a desktop or notebook computer, writing on a whiteboard, or controlling a PDA or cellular phone, to creating ink that may be ported to any of these platforms.
  • the following description is divided into a number of sections as follows: terms, general-purpose operating environment, universal pen and camera, active coding, passive coding, internal sensors, additional components, sample implementations.
  • Pen: any writing implement that may or may not include the ability to store ink.
  • a stylus with no ink capability may be used as a pen in accordance with embodiments of the present invention.
  • Camera: an image capture system.
  • Active Coding: incorporation of codes within the object or surface over which the input device is positioned, for the purpose of determining the positioning and/or movement of the input device using appropriate processing algorithms.
  • Passive Coding: detecting movement/positioning of the input device using image data, other than codes incorporated for that purpose, obtained from the object or surface over which the input device is moved, using appropriate processing algorithms.
  • Input Device: a device for entering information, which may be configured for generating and processing information.
  • Active Input Device: an input device that actively measures signals and generates data indicative of positioning and/or movement of the input device using sensors incorporated within the input device.
  • Passive Input Device: an input device for which movement is detected using sensors incorporated other than within the input device.
  • Computing Device: a desktop computer, a laptop computer, a Tablet PC™, a personal data assistant, a telephone, or any device which is configured to process information, including input from an input device.
  • FIG. 1 is a functional block diagram of an example of a general-purpose digital computing environment that can be used to implement various aspects of the present invention.
  • a computer 100 includes a processing unit 110 , a system memory 120 , and a system bus 130 that couples various system components including the system memory to the processing unit 110 .
  • the system bus 130 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
  • the system memory 120 includes read only memory (ROM) 140 and random access memory (RAM) 150 .
  • a basic input/output system 160 (BIOS), containing the basic routines that help to transfer information between elements within the computer 100 , such as during start-up, is stored in the ROM 140 .
  • the computer 100 also includes a hard disk drive 170 for reading from and writing to a hard disk (not shown), a magnetic disk drive 180 for reading from or writing to a removable magnetic disk 190 , and an optical disk drive 191 for reading from or writing to a removable optical disk 192 such as a CD ROM or other optical media.
  • the hard disk drive 170 , magnetic disk drive 180 , and optical disk drive 191 are connected to the system bus 130 by a hard disk drive interface 192 , a magnetic disk drive interface 193 , and an optical disk drive interface 194 , respectively.
  • the drives and their associated computer-readable media provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for the personal computer 100 . It will be appreciated by those skilled in the art that other types of computer readable media that can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, random access memories (RAMs), read only memories (ROMs), and the like, may also be used in the example operating environment.
  • a number of program modules can be stored on the hard disk drive 170 , magnetic disk 190 , optical disk 192 , ROM 140 or RAM 150 , including an operating system 195 , one or more application programs 196 , other program modules 197 , and program data 198 .
  • a user can enter commands and information into the computer 100 through input devices such as a keyboard 101 and pointing device 102 .
  • Other input devices may include a microphone, joystick, game pad, satellite dish, scanner or the like.
  • These and other input devices are often connected to the processing unit 110 through a serial port interface 106 that is coupled to the system bus, but may be connected by other interfaces, such as a parallel port, game port or a universal serial bus (USB).
  • these devices may be coupled directly to the system bus 130 via an appropriate interface (not shown).
  • a monitor 107 or other type of display device is also connected to the system bus 130 via an interface, such as a video adapter 108 .
  • personal computers typically include other peripheral output devices (not shown), such as speakers and printers.
  • a pen digitizer 165 and accompanying pen or stylus 166 are provided in order to digitally capture freehand input.
  • the pen digitizer 165 may be coupled to the processing unit 110 directly, via a parallel port or other interface and the system bus 130 as known in the art.
  • Although the digitizer 165 is shown apart from the monitor 107, it is preferred that the usable input area of the digitizer 165 be co-extensive with the display area of the monitor 107. Further still, the digitizer 165 may be integrated in the monitor 107, or may exist as a separate device overlaying or otherwise appended to the monitor 107.
  • the computer 100 can operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 109 .
  • the remote computer 109 can be a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 100 , although only a memory storage device 111 has been illustrated in FIG. 1 .
  • the logical connections depicted in FIG. 1 include a local area network (LAN) 112 and a wide area network (WAN) 113 .
  • When used in a LAN networking environment, the computer 100 is connected to the local network 112 through a network interface or adapter 114.
  • When used in a WAN networking environment, the personal computer 100 typically includes a modem 115 or other means for establishing communications over the wide area network 113, such as the Internet.
  • the modem 115, which may be internal or external, is connected to the system bus 130 via the serial port interface 106.
  • program modules depicted relative to the personal computer 100 may be stored in the remote memory storage device.
  • network connections shown are illustrative and other techniques for establishing a communications link between the computers can be used.
  • the existence of any of various well-known protocols such as TCP/IP, Ethernet, FTP, HTTP and the like is presumed, and the system can be operated in a client-server configuration to permit a user to retrieve web pages from a web-based server.
  • Any of various conventional web browsers can be used to display and manipulate data on web pages.
  • FIG. 2 provides an illustrative embodiment of an input device for use in accordance with various aspects of the invention.
  • the following describes a number of different elements and/or sensors. Various sensor combinations may be used to practice aspects of the present invention. Further, additional sensors may be included as well, including a magnetic sensor, an accelerometer, a gyroscope, a microphone, or any sensor that might detect the position of the input device relative to a surface or object.
  • pen 201 includes ink cartridge 202 , pressure sensor 203 , camera 204 , inductive element 205 , processor 206 , memory 207 , transceiver 208 , power supply 209 , docking station 210 , cap 211 , and display 212 .
  • Pen 201 may serve as an input device for a range of devices including a desktop computer, a laptop computer, a Tablet PC™, a personal data assistant, a telephone, or any device which may process and/or display information.
  • the input device 201 may include an ink cartridge 202 for performing standard pen and paper writing or drawing. Moreover, the user can generate electronic ink with the input device while operating the device in the manner typical of a pen. Thus, the ink cartridge 202 may provide a comfortable, familiar medium for generating handwritten strokes on paper while movement of the pen is recorded and used to generate electronic ink. Ink cartridge 202 may be moved into a writing position from a withdrawn position using any of a number of known techniques. Alternatively, ink cartridge 202 may be replaced with a cartridge that does not contain ink, such as a plastic cartridge with a rounded tip, but that will allow the user to move the pen about a surface without damaging the pen or the surface.
  • an inductive element or elements may be included to aid in detecting relative movement of the input device by, for example, providing signals indicative of the position of the input device, in a manner similar to those generated by a stylus.
  • Pressure sensor 203 may be included for designating an input, such as might be indicated when the pen 201 is depressed while positioned over an object, thereby facilitating the selection of an object or other indication, much as might be achieved by clicking a mouse button, for example.
  • the pressure sensor 203 may detect the depressive force with which the user makes strokes with the pen for use in varying the width of the electronic ink generated. Further, sensor 203 may trigger operation of the camera.
  • camera 204 may operate independent of the setting of pressure sensor 203 .
  • switches may also be included to effect various settings for controlling operation of the input device.
  • one or more switches may be provided on the outside of the input device and used to power on the input device, to activate the camera or light source, to control the sensitivity of the sensor or the brightness of the light source, to set the input device in a sketch mode in which conversion to text is not performed, to set the device to store the input data internally, to process and store the input data, to transmit the data to a processing unit such as a computing device with which the input device is capable of communicating, or to control any setting that might be desired.
  • Camera 204 may be included to capture images of the surface over which the pen is moved.
  • Inductive element 205 also may be included to enhance performance of the pen when used as a stylus in an inductive system.
  • Processor 206 may be comprised of any known processor for performing functions associated with various aspects of the invention, as will be described in more detail below.
  • memory 207 may include a RAM, a ROM, or any memory device for storing data and/or software for controlling the device or processing data.
  • the input device may further include a transceiver 208 .
  • the transceiver permits information exchange with other devices. For example, Bluetooth or other wireless technologies may be used to facilitate communications.
  • the other devices may include a computing device, which may itself include input devices.
  • Power supply 209 may be included, and may provide power if the pen 201 is to be used independent of and remotely from the host device, the device in which the data is to be processed, stored and/or displayed.
  • the power supply 209 may be incorporated into the input device 201 in any number of locations, and may be positioned for immediate replacement, should the power supply be replaceable, or to facilitate its recharging should the power supply be rechargeable.
  • the pen may be coupled to alternate power supplies, such as an adapter for electrically coupling the pen 201 to a car battery, a recharger connected to a wall outlet, the power supply of a computer, or any other power supply.
  • Docking station link 212 may be used to transfer information between the input device and a second device, such as an external host computer.
  • the docking station link may also include structure for recharging the power supply 209 when attached to a docking station, not shown, or connected to a power supply.
  • a USB or other connection may removably connect the input device to a host computer through the docking station link, or through an alternative port.
  • a hardwire connection may also be used to connect the pen to a device for transferring and receiving data. In a hardwired configuration, the docking station link would be omitted in favor of wiring connecting the input device directly to a host.
  • the docking station link may be omitted or replaced with another system for communicating with another device (Bluetooth or 802.11b, for example).
  • the input device 201 may further include a removable cap 211 which may be equipped with a metal tip for facilitating resistive sensing, so that input device 201 may be used with a device that includes a sensing board, for example.
  • the shell of input device 201 may be comprised of plastic, metal, a resin, a combination thereof, or any material that may provide protection to the components or the overall structure of the input device.
  • the chassis may include a metal compartment for electrically shielding some or all of the sensitive electronic components of the device.
  • the input device may be of an elongated shape, which may correspond to the shape of a pen. The device may, however, be formed in any number of shapes consistent with its use as an input device and/or ink generating device.
  • FIGS. 3 A-C depict three illustrative embodiments of a camera for use in accordance with aspects of the present invention.
  • the input device 201 may be used to generate electronic ink by detecting movement of the pen using, for example, a camera.
  • Camera 304 may be included to capture images of the surface over which the pen is moved and, through image analysis, to detect the amount of movement of the pen over the surface being scanned. The movements may then be correlated with the document, and electronic ink may be transposed onto, added to, or associated with the document (e.g., input annotations may be stored apart from the original document).
  • camera 304 includes an image sensor 320 comprised of, for example, an array of image sensing elements.
  • the camera may be comprised of a CMOS image sensor with the capability of scanning a 1.79 mm by 1.79 mm square area at a resolution of 32 pixels by 32 pixels.
  • the minimum exposure frame rate for one such image sensor may be approximately 330 Hz, while the illustrative image sensor may operate at a processing rate of 110 Hz.
  • the image sensor selected may comprise a color image sensor, a grayscale image sensor, or may operate to detect intensities exceeding a single threshold.
  • selection of the camera or its component parts may vary based on the desired operating parameters associated with the camera, based on such considerations as performance, costs or other considerations, as may be dictated by such factors as the resolution required to accurately calculate the location of the input device.
  • a light source 321 may illuminate the surface over which the input device is moved.
  • the light source may, for example, be comprised of a single light emitting diode (LED), an LED array, or other light emitting devices.
  • the light source may produce light of a single color, including white, or may produce multiple colors.
  • a half mirror 322 may be included within the camera to direct light as desired.
  • the camera 304 may further include one or more optical devices 323 for focusing light from the light source 321 onto the surface scanned 324 and/or to focus the light reflected from that surface to the image sensor 320 .
  • light emitted from light source 321 is reflected by half-mirror 322, a mirror that reflects or transmits light depending on the direction of the impinging light.
  • the reflected light is then directed through lens system 323 and transmitted to the reflective surface below.
  • the light is then reflected off of that surface, through lens system 323 , strikes half-mirror 322 at a transmission angle passing through the mirror, and impinges on sensing array 320 .
  • cameras including a wide range of components may be used to capture the image data, including cameras incorporating a lesser, or a greater, number of components. Variations in the arrangement of components may also be numerous.
  • the light source and the sensing array may be positioned together such that they both face the surface from which the image is to be captured. In that case, because no reflections within the camera are required, the half-mirror may be removed from the system.
  • the light source 321 is positioned a distance from the lens 323 and sensor 320 .
  • the light source may be removed and ambient light reflecting off the object surface is focused by lens 323 onto the sensor 320 .
  • variations in the components incorporated into the camera, or their placement, may be employed in a manner consistent with aspects of the present invention.
  • the placement and/or orientation of the camera and/or cartridge may be varied from that shown in FIG. 2 to allow for the use of a wide range of camera and/or ink configurations and orientations.
  • camera 304, or any of its component parts, may be located in openings adjacent to that provided for the ink cartridge, rather than within the same opening as illustrated.
  • camera 304 may be positioned in the center of the input device with the ink cartridge positioned to the side of the camera.
  • the light source 321 may be incorporated within the structure housing the remaining components of the camera, or one or more components may be positioned separate from the others.
  • a light projecting feature may also be enabled, using a light source and/or optical system, with additional structure and/or software, or modifications to the illustrated components as necessary.
  • the surface of an object over which the input device is positioned may include image data that indicates the relative position of areas of the surface.
  • the surface being scanned may comprise the display of a host computer or other external computing device, which may correspond to the monitor of a desktop computer, a laptop computer, a Tablet PC™, a personal data assistant, a telephone, a digital camera, or any device which may display information.
  • a blank document or other image generated on the screen of a Tablet PC™ may include data corresponding to a code that represents the relative position of that portion of the document within the entire document, or relative to any other portion of the image.
  • the information may be comprised of images, which may include alphanumeric characters, a coding pattern, or any discernable pattern of image data that may be used to indicate relative position.
  • the image or images selected for use in designating the location of areas within the surface of the object may depend on the sensitivities of the scanning device incorporated into the camera, such as the pixel resolution of the sensor, and/or the pixel resolution of the image data contained within the surface being scanned.
  • the location information extracted from the object may then be used to track movement of the input device over the object. Using that information, electronic ink or other information corresponding to movement of the input device may be accurately generated.
  • Location information may be used both to detect the position within the image at which the input is to be effected and to provide an indication of movement of the input device over the object surface.
  • the resulting information may be used interactively with word processing software to generate changes in a document, for example.
  • the object used in combination with the input device may be composed of paper with positional information included in the background, for example.
  • the positional information may be incorporated in any form of code, optical representation, or other form that may be sensed by a sensor associated with the input device and used to represent the relative location of the specific site on the paper.
  • FIG. 4 illustrates a technique for encoding locations within a document.
  • the background of the image may include thin lines that, when viewed in large groups, form a maze-like pattern.
  • Each grouping of lines within the maze design, comprised of a few thin lines with unique orientations and relative positions, for example, may indicate the position of that portion of the maze pattern relative to other portions of the document.
  • Decoding of the maze pattern found in a captured image may be performed in accordance with numerous decoding schemes. In one embodiment, a particular arrangement and grouping of lines may be decoded to generate positional information.
  • an indication of the position of the captured data may be derived by extracting a code from the image corresponding to the sampled pattern, and using that code to address a look-up table containing data identifying the location of that area.
  • Reference to the coding technique employing a maze pattern is provided for illustrative purposes, and alternative active coding techniques, including, but not limited to, the visual coding techniques in U.S. Ser. No. 10/284,412, entitled “Active Embedded Interaction Code,” invented by Jian Wang, Qiang Wang, Chunhui Zhang, and Yue Li, whose contents are expressly incorporated by reference for all essential subject matter, may also be used consistent with aspects of the invention.
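  • As a concrete illustration of the look-up-table approach described above, the following minimal Python sketch resolves a decoded patch code to a document location. It assumes the pattern-decoding step has already reduced a captured image patch to an integer code; the table here simply derives each code from its location, whereas a real system would use the embedded pattern's own code design.

    from typing import Dict, Tuple

    def build_location_table(cols: int, rows: int) -> Dict[int, Tuple[int, int]]:
        """Map each region's unique code to its (x, y) location in the document.
        The codes here are stand-ins derived from the location itself."""
        return {y * cols + x: (x, y) for y in range(rows) for x in range(cols)}

    def locate(patch_code: int, table: Dict[int, Tuple[int, int]]) -> Tuple[int, int]:
        """Resolve a decoded patch code to a document location."""
        return table[patch_code]

    table = build_location_table(64, 64)
    print(locate(130, table))  # -> (2, 2): third column, third row of regions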
  • images captured by the image sensor may be analyzed to determine the location of the input device at the time of image capture. Successive images may be used to calculate the relative positions of the input device at different times. Correlation of this information may yield an accurate trace of the input device over the substrate. Using this trace information electronic ink accurately representing handwritten strokes may be generated, for example.
  • FIG. 5 provides an illustration of a trace pattern from which electronic ink may be generated.
  • a first captured image may contain a portion of a maze pattern indicative of a first position p1 of the input device at a first time, t1.
  • the next captured image may contain a portion of the coded image data, a different portion of the maze pattern in this example, providing location information for a second position p2 at a second time, t2.
  • a third captured image may contain a third portion of the maze pattern, thereby indicating positioning of the input device at a third position p3 at time t3. Using this data, the three points may indicate a trace of the input device from time t1 through t3.
  • an inking pattern traced by the input device may be generated.
  • the complexity of the algorithm applied may dictate the accuracy of the ink generated.
  • a basic inking algorithm may simply connect the dots with straight lines of unvarying thickness.
  • Algorithms factoring in previous sampling points, the time between samplings or other data indicative of the velocity or acceleration with which the input device was moved, data indicative of the depressive force used (obtained from other sensors, for example), or any other relevant data may be processed to provide ink that more accurately represents the actual movement of the input device.
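  • A minimal Python sketch of such an inking algorithm follows. The width rule and its constants are illustrative assumptions rather than the patent's method: the stroke thins as velocity rises and thickens with depressive force.

    import math

    def stroke_widths(samples, base_width=1.0, k_speed=0.01, k_pressure=1.0):
        """Derive a width for each stroke segment from sampled
        (time, x, y, pressure) tuples."""
        widths = []
        for (t0, x0, y0, p0), (t1, x1, y1, p1) in zip(samples, samples[1:]):
            dt = max(t1 - t0, 1e-6)                    # guard against equal timestamps
            speed = math.hypot(x1 - x0, y1 - y0) / dt  # units per second
            pressure = (p0 + p1) / 2.0
            widths.append(base_width * (1.0 + k_pressure * pressure)
                          / (1.0 + k_speed * speed))
        return widths

    samples = [(0.00, 0.0, 0.0, 0.4), (0.01, 2.0, 1.0, 0.6), (0.02, 5.0, 3.0, 0.5)]
    print(stroke_widths(samples))  # the faster second segment yields the thinner width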
  • Optical scanning performed by camera 304 may generate data necessary to determine the position of the input device at various times and that information may be used to generate electronic ink.
  • comparisons of the image captured at time t1 with that of the image captured at time t2 may provide data indicating the distance of movement of the pen from one point to another during the period t1 to t2.
  • Those two points of data, and/or the relative distance moved may then be used to generate a trace of the movement of the input device for generating electronic ink representative of handwritten strokes.
  • Comparisons of two or more images, or portions of captured images, for calculating the relative movement might be accomplished by a difference analysis.
  • features appearing in more than one image may be compared and the relative movement of the feature or features from one location to another within those images may provide an accurate indication of pen movement, for example.
  • the processing algorithm may be modified to compensate for variations in the sampling period to more accurately indicate the correlation between movement of the input device with the actual time required for each movement.
  • Information indicative of the velocity of motion may assist in generating ink of the appropriate thickness.
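  • The difference analysis described above can be illustrated with a short Python/NumPy sketch that estimates the (dx, dy) displacement between two small grayscale frames by minimizing the sum of squared differences over a search window. The 32×32 frame size echoes the sensor described earlier; the search radius is an assumption.

    import numpy as np

    def estimate_shift(prev, curr, max_shift=4):
        """Return the (dx, dy) that best aligns curr with prev, by
        exhaustive search over a small window of candidate shifts."""
        h, w = prev.shape
        best, best_err = (0, 0), float("inf")
        for dy in range(-max_shift, max_shift + 1):
            for dx in range(-max_shift, max_shift + 1):
                y0, y1 = max(0, -dy), h - max(0, dy)   # overlapping rows in prev
                x0, x1 = max(0, -dx), w - max(0, dx)   # overlapping columns in prev
                a = prev[y0:y1, x0:x1].astype(float)
                b = curr[y0 + dy:y1 + dy, x0 + dx:x1 + dx].astype(float)
                err = np.mean((a - b) ** 2)            # mean of squared differences
                if err < best_err:
                    best, best_err = (dx, dy), err
        return best

    rng = np.random.default_rng(0)
    frame = rng.integers(0, 255, (32, 32))
    moved = np.roll(frame, (1, 2), axis=(0, 1))        # simulate dy=1, dx=2 movement
    print(estimate_shift(frame, moved))                # -> (2, 1)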
  • the surface over which the input device is moved may include the display of a computing device, a mouse pad, a desktop, or any non-uniform reflective surface from which objects or image data may be extracted indicating movement of the input device over that surface.
  • the tracking algorithm with which the captured image data may be processed may be fixed, or may vary depending on the characteristics of the images captured.
  • the processor may detect grains in the wood of a desktop, for example, and based on a comparison of a sequence of images captured by the camera, the relative location of particular patterns of grain within successive images may be used to determine the location of the input at various times and/or the relative movement of the input device over that surface.
  • a more complex tracking algorithm may be required where features within the images are less easily discerned and the image more uniform.
  • Alternative passive coding techniques, including, but not limited to, the coding techniques found in U.S. Ser. No. 10/284,451, entitled “Passive Embedded Interaction Code,” invented by Jian Wang, Yingnong Dang, Jiang Wu and Xiaoxu Ma, whose contents are expressly incorporated by reference for all essential subject matter, may also be employed consistent with aspects of the invention.
  • FIG. 6 shows the hardware architecture of a system in accordance with one embodiment of the present invention.
  • Processor 606 may be comprised of any known processor for performing functions associated with various aspects of the invention.
  • the processor may include an FPSLIC AT94S40, and may be comprised of an FPGA with an AVR core. That particular device may include a 20 MHz clock and operate at a speed of 20 MIPS.
  • selection of a processor for use in input device 601 may be dictated by the cost and/or processing speed requirements of the system.
  • the processor 606 may perform image analysis, should such analysis be conducted within the input device.
  • processing may be performed by a second processor, such as a digital signal processor (DSP) incorporated into the device 601 .
  • the processor 606 may further operate to perform steps critical to reducing power consumption to conserve power stored in power supply 609 , such as powering down various components when the input device is inactive, which may be based on data indicating movement and/or positioning of the device.
  • the processor 606 may further operate to calibrate and regulate the performance of various components, including adjustments to the intensity of light source or to the sensitivity of the sensing array of camera 604 , for example.
  • the processor may choose from among a plurality of stored image processing algorithms, and may be controlled to select the image analysis algorithm most suitable for detecting movement, in accordance with, for example, characteristics associated with the surface over which the device is moved.
  • the algorithm may be selected automatically based on performance considerations programmed into the input device.
  • the input device may be controlled, and settings established, based on a user's selections, input for example via actuations of the force sensor or via handwritten strokes corresponding to commands.
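  • The automatic selection described above might, for example, be driven by simple image statistics, as in the hypothetical Python sketch below; the thresholds and algorithm names are illustrative assumptions, not values from the patent.

    import numpy as np

    def choose_tracking_algorithm(frame: np.ndarray) -> str:
        """Pick a processing algorithm from coarse statistics of a frame."""
        if frame.std() < 5.0:
            return "inertial-only"          # nearly uniform surface: too few features
        high_freq = np.abs(np.diff(frame.astype(float), axis=1)).mean()
        if high_freq > 40.0:
            return "maze-pattern-decoder"   # dense fine structure: try the active code
        return "feature-tracker"            # ordinary texture: passive correlation

    blank = np.full((32, 32), 128.0)        # e.g. featureless glossy paper
    print(choose_tracking_algorithm(blank)) # -> inertial-only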
  • Memory 607 may include one or more RAMs, ROMs, FLASH memories, or any memory device or devices for storing data, storing software for controlling the device, or storing software for processing data.
  • data representing location information may be processed within the input device 601 and stored in memory 607 for transfer to a host computer 620 .
  • the captured image data may be buffered in memory 607 within the input device 601 for transfer to a host device 620 for processing or otherwise.
  • Transceiver, or communication unit, 608 may include a transmission unit and receiving unit.
  • information representing movement of the input device may be transmitted to a host computer 620, such as the previously described desktop computer, laptop computer, Tablet PC™, personal digital assistant, telephone, or other such device for which user inputs and electronic ink might be useful.
  • the transceiver 608 may communicate with an external device using any wireless communication technique, including Bluetooth technology for performing short-range wireless communications, infrared communications, or even cellular or other long-range wireless technologies.
  • the transceiver 608 may control the transmission of data over a direct link to a host computer, such as over a USB connection, or indirectly through a connection with docking cradle 630 .
  • the input device may also be hardwired to a particular host computer using a dedicated connection.
  • the transceiver 608 may also be used to receive information and/or software, which in one embodiment may be used for improving performance of the input device. For example, program information for updating the control functions of the processor may be uploaded via any of the previously described techniques.
  • software may also be transmitted to the input device; for example, software for analyzing the image data and/or for calibrating the input device may be downloaded from an external device.
  • Processor 606 may operate in accordance with an interaction model.
  • An interaction model may be implemented in the form of software for maintaining a consistent experience in which electronic ink is generated regardless of the external device for which the unit performs the functions of an input device.
  • the interaction model may process captured data for conversion into a form universally suitable for use on any number of host devices including a desktop computer, a laptop computer, a Tablet PC™, a personal data assistant, a telephone, a whiteboard, or any device that might store, display or record data input via the input device.
  • the processor 606 may recognize the device to which it is connected, or for which the data representing handwritten inputs are intended, and based on such recognition, select processing that converts input data into a form suitable for the specific device recognized.
  • a conversion to a form useful for each potential recipient computing device would be contained within the input device and made available as necessary.
  • Recognition of the intended recipient device may be attained as a result of communication between the devices, should they be connected wirelessly or directly. Alternatively, the user may enter the identity of the device or devices for which the data is intended directly into the input device.
  • where the input device includes a display, data may be processed using a default processing algorithm suitable for use with the display and/or a multitude of other devices.
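  • The following Python sketch illustrates one way such an interaction model might convert device-independent stroke data into a host-specific form. The host names and output formats are hypothetical stand-ins for the per-device conversions described above.

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class Stroke:
        points: List[Tuple[float, float]]   # device-independent document coordinates

    def convert_for_host(strokes: List[Stroke], host: str) -> str:
        """Convert universal stroke data into a form suited to the recognized host."""
        if host == "tablet-pc":
            # full-fidelity ink: keep every sampled point
            return ";".join(" ".join(f"{x:.1f},{y:.1f}" for x, y in s.points)
                            for s in strokes)
        if host == "telephone":
            # constrained display: reduce each stroke to its endpoints
            return ";".join(f"{s.points[0]}->{s.points[-1]}" for s in strokes)
        # default processing for unrecognized hosts
        return f"{len(strokes)} stroke(s) captured"

    ink = [Stroke([(0.0, 0.0), (1.5, 2.0), (3.0, 2.5)])]
    print(convert_for_host(ink, "telephone"))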
  • input device 701 also may include one or more inertial sensors 715 for sensing pen movement, position, or orientation, in addition to the previously described components represented with like reference numerals.
  • input device 701 may include a gyroscope for providing data representing the angular velocity of the pen in a plurality of directions.
  • the input device 701 may include one or more accelerometers, or sets of accelerometers, measuring the acceleration or gravitational forces upon the pen.
  • Data representing movement of the pen may also be obtained using a magnetic sensor which measures movements of the pen by detecting variations in measurements of the earth's magnetic field, described herein as an inertial sensor because it detects movement of the input device based on data other than image data.
  • Data from either or any of the inertial sensors incorporated with or into the input device may be used in combination with data from the camera to obtain data representing movement or positioning of the input device, and thereby produce data for generating electronic ink.
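  • One simple way to combine camera-derived and inertial displacement estimates is a confidence-weighted blend, sketched below in Python; the blending rule is an illustrative assumption rather than the patent's method.

    def fuse_displacement(camera_dxdy, inertial_dxdy, camera_confidence):
        """Blend the two displacement estimates: confidence near 1.0 trusts
        the image data, near 0.0 falls back to the inertial sensors."""
        w = max(0.0, min(1.0, camera_confidence))
        cx, cy = camera_dxdy
        ix, iy = inertial_dxdy
        return (w * cx + (1 - w) * ix, w * cy + (1 - w) * iy)

    # e.g. a blurred frame yields low confidence, so inertial data dominates
    print(fuse_displacement((2.0, 1.0), (1.6, 1.2), camera_confidence=0.25))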
  • the Input Device May Operate Using Active Coding
  • A. Active Coding Provides Location Information for the Entry of Input Information on a Display or Other Writing Surface
  • a surface of an object over which the input device is positioned and/or moved may include coded image data that indicates the location or relative position of each area within that surface.
  • the object may comprise the display of a computing device, such as a laptop computer.
  • a document may be recalled from memory and displayed on the screen. Embedded within that document, such as in the background, may lie coded information indicating the position of each area of the document.
  • the background of the document may include a maze pattern, with a sufficiently large portion of that pattern uniquely identifying each region within the entire document.
  • the input device may be used in combination with the coded location information to add annotations or edits to the document at specified locations even if the display of the laptop does not include sensors for detecting movement of an input device over a screen.
  • the input device may function as an “active input device” such that sensors associated with the input device generate data indicative of position or location of that device.
  • the image sensor incorporated within the input device captures image data representing the surface of the display screen over which the input device is positioned and/or moved.
  • the sensor captures images including location codes indicating the relative position of the input device.
  • As the user moves about the displayed image, entering annotations and/or making edits to the electronic document displayed, the input device generates signals representing those inputs and data representing the location within the document at which those inputs are to be incorporated. Control of the laptop may also be effected using the input device in place of a mouse, or to perform other standard input functions including the movement of a cursor and the actuation of selections.
  • the input device may be used in conjunction with word processing software to edit the document by, for example, deleting text and inserting new text.
  • a user positions the input device over the screen at the desired location.
  • the user may position the input device proximate the screen and move the device in a motion to strike through the image of the text displayed.
  • the image may be processed to determine both that the pen has been moved in a striking motion, and to identify the text corresponding to the location at which the user moved the input device. Accordingly, the inputs may be used to delete that data.
  • the user may wish to insert new text.
  • the user may draw a symbol for inserting text, such as a caret or upside-down “V”, at the location at which the new text is to be inserted.
  • Processing software for converting inputs into image data and/or commands, stored in the input device or host computer recognizes the symbol as a control signal for inserting text. With the aid of the input device, the user may then write text to be inserted by hand.
  • the user may add notes with highlighting indicating the original text to which the annotations pertain. For example, the user may select the text to be highlighted using a pull-down menu, or a highlighting button, displayed on the screen. Next, the input device is dragged over text to be selected for highlighting. Then comments to be associated with the highlighted/selected text may be written on the screen of the display at a location adjacent the highlighted text. When the operation is complete, the user may select the prompts necessary for completing entry of annotations. All of these modifications to the document may be created using the input device regardless of whether the display includes sensors for detecting movement of the input device.
  • Modifications to the document may be displayed and/or incorporated within the document in the form of image data, electronic ink or data converted into text. Conversion of the inputs into text may occur in a manner invisible to the user, such that text appears in the display of the document on screen as it is entered. Alternatively, the handwriting of the user may appear within the body of the document. To achieve instantaneous display of edits, information representing the movement of the pen and the location of such edits may be transmitted to the laptop device on an ongoing basis.
  • the identity of the person entering the inputs may also be recorded.
  • the input device may generate information identifying the user and/or the particular input device.
  • the identity information may be attached to the generated input data.
  • identification information may be provided as a separate signal transmitted to the host device.
  • the input device may also function to detect positioning using codes incorporated within a surface of any object over which the device may be moved.
  • an image incorporating location codes may be created and/or edited using the input device in combination with the monitor of a desktop computer, a Tablet PC™, a personal data assistant, a telephone, or any device which may display information.
  • Coded information may also be incorporated within a transparent sheet laid over the display screen of such devices, or incorporated within a surface that may be used in combination with a display, including protective films.
  • Coded information may also be incorporated on a writing surface or on writing material, such as paper, to uniquely identify the locations on that surface.
  • positional information may be incorporated in the background of the paper.
  • the positional information may include any form of indication or code representing the relative location of the specific site on the paper.
  • the input device may be used in association with coded paper to record information corresponding to the handwriting of a user at the appropriate location. For example, armed with only the input device and a writing surface incorporating coded position information, a user riding in a taxi may use the input device to draft a letter to a client. As the user writes on paper with the input device, gestures corresponding to text or other input information are recognized by detecting changes in the location of the input device at certain times.
  • the inputs may then be converted into electronic ink or other electronic data for use in generating information corresponding to those gestures. Conversion of the inputs may be performed as those inputs are generated, either within the input device or by a host computing device coupled to the input device, if received there. Alternatively, such conversion may be performed at a later time. For example, the information generated using the input device may be stored in memory and transmitted to a recipient and/or host computer for suitable processing at a later time.
  • Data generated using the input device may be incorporated into a document at locations identified by the location codes.
  • the layout of a document such as the previously described letter, may be achieved using the location information identifying the location within the document at which the information is to be entered. For example, the address of the drafter, address of the recipient, body and closing of the letter, and remaining components, may be entered on the paper at the appropriate location.
  • using the coded location information captured by the scanner, the words or other images forming the contents of the corresponding electronic document are incorporated at the appropriate locations, as sketched below.
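  • A small Python sketch of such routing follows: a decoded page location determines which part of the letter the handwritten input belongs to. The region boundaries are hypothetical.

    def region_for_location(x: float, y: float) -> str:
        """Map a decoded page location (page-normalized 0..1 coordinates,
        y growing downward) to the part of the letter it belongs to."""
        if y < 0.15:
            return "sender-address"
        if y < 0.25:
            return "recipient-address"
        if y < 0.85:
            return "body"
        return "closing"

    print(region_for_location(0.3, 0.10))   # strokes near the top -> sender-address
    print(region_for_location(0.5, 0.50))   # mid-page strokes -> body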
  • the input device may also interact with the host computing device for entering commands and making selections and the like.
  • where the computing device is a portable camera or telephone with web-browsing capabilities, the input device may be used in the manner of a stylus or a mouse to select from displayed buttons or menus. Therefore, the input device may be used to activate the browser of the host computer and to select options for retrieving a file, such as the previously described document, even one stored remotely.
  • the user may select downloading of the file containing the information needed by the user.
  • the user may enter annotations to the downloaded file or files via the input device.
  • Those edits may be transmitted to the remote location from which the file was downloaded, where the input device is equipped to perform communications with remote computing devices.
  • the edits may be used to edit the file stored within the input device and/or a host computing device, assuming the input device is in communication with the host computing device.
  • the file displayed on the monitor of a host computing device may be a spreadsheet, generated using spreadsheet software such as EXCEL™.
  • the location codes can be used to associate locations with given cells within the spreadsheet.
  • the user may enter a numerical entry in the cell displayed on the screen.
  • the input device captures images associated with the location of the input device and transmits that information to the host computing device.
  • the processing software located in the host computing device determines the identity of the cell selected for entry based on the detected location codes, and modifies the spreadsheet document contents accordingly.
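  • A minimal Python sketch of resolving a decoded screen location to a spreadsheet cell follows; the grid dimensions are illustrative assumptions.

    def cell_for_location(x: float, y: float,
                          col_width: float = 64.0, row_height: float = 20.0) -> str:
        """Resolve a decoded screen location (in pixels) to a cell name."""
        col = int(x // col_width)            # 0 -> "A", 1 -> "B", ...
        row = int(y // row_height) + 1       # spreadsheet rows are 1-based
        letters = ""
        c = col
        while True:
            letters = chr(ord("A") + c % 26) + letters
            c = c // 26 - 1
            if c < 0:
                break
        return f"{letters}{row}"

    print(cell_for_location(150.0, 47.0))    # -> "C3"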
  • the input device may also be used to recall images or other prestored information associated with particular gestures or combination of gestures.
  • the input device may be used to draw a symbol which the device's processing algorithms are programmed to recognize.
  • the maze pattern may be used to accurately detect movement of the input device over the pattern so that a particular symbol associated with such movement may be detected.
  • the user may control the input device to draw a symbol on the paper previously identified by the user to be associated with the company logo.
  • the maze pattern may identify a combination of movements corresponding to the letter “M” followed immediately by the letter “S” as an instruction to designate entry of a logo of the Microsoft Corporation.
  • prestored information may be entered within a document by entry of a sequence of previously defined inputs.
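  • A hypothetical Python sketch of such gesture-to-content expansion: the shortcut table below is invented for illustration, echoing the “M” followed by “S” example above.

    # Hypothetical mapping from recognized gesture sequences to prestored content.
    SHORTCUTS = {
        ("M", "S"): "<logo: Microsoft Corporation>",
        ("s", "i", "g"): "<signature block>",
    }

    def expand_gestures(recognized):
        """Replace a recognized letter sequence with its prestored content,
        falling back to the literal letters when no shortcut matches."""
        return SHORTCUTS.get(tuple(recognized), "".join(recognized))

    print(expand_gestures(["M", "S"]))   # -> <logo: Microsoft Corporation>
    print(expand_gestures(["h", "i"]))   # -> hi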
  • the input device may also be used as a passive input device.
  • the input device may be used in combination with a computing device that senses movement of the input device using resistive sensing, for example.
  • the input device may function in the manner of a stylus.
  • electronic ink or other image data may be generated when the input device is positioned in very close proximity to the screen.
  • Control functions may be entered in a similar manner.
  • the image displayed on the monitor of the particular computing device may also include data corresponding to a code that represents the relative position of that portion of the document. The location information extracted from the object using the camera may then be used to track movement of the input device, as a substitute to, or in combination with, movement detected using sensors of the computing device.
  • a user may wish to create or modify an image on a portable computing device which already includes the ability to detect positioning of an input device, such as the Tablet PC™ or a personal data assistant.
  • the input device may function solely as a passive input device, such that information representing movement of the input device is generated by the computing device.
  • the sensors of the computing device may not have the ability to detect movement of the pen at a range required by the user in a given situation. For example, accurate detection of user inputs may be hindered when the user is traveling in an unstable vehicle. As the user edits a file by moving the input device over the surface of the display of the computing device, the input device may be jostled and displaced a significant distance from the sensor board.
  • Image data captured by the input device may be used to detect movement of the input device within a plane horizontal to the surface of the computing device, even though the signals generated by the sensors of the computing device have become less accurate. Even if the sensors of the computer device are no longer capable of detecting movement of the input device, the image sensor of the input device, may produce sufficient information to maintain an accurate representation of the movement of the input device to reflect the intended inputs of the user. Thus, even when used in combination with a computing device including the capability of sensing movement of the input device, the input device may function either as a passive input device or as an active input device.
  • the Input Device May Operate Using Passive Coding Techniques
  • A. Passive Coding Provides Location Information for Entering Input Information on a Display or Other Writing Surface
  • the input device may also be used in association with any paper, writing surface or other substrate to record information corresponding to the handwriting of a user. Again, armed with only the input device and a writing surface, the input device may be used to draft a letter to a client. In this case, gesturing is detected on the basis of passive coding, wherein movements of the input device are detected using means other than codes embedded within the image on a surface of the substrate. For example, the user may draft the letter on a plain sheet of paper. As the user writes with the input device, the image sensor captures images of the paper. Objects within the images may be identified, and their movement within the series of captured images is indicative of movement of the input device.
  • Sensed objects may include artifacts or other objects on the surface of the paper, which may correspond to a watermark or other defect of the paper.
  • the paper may include ruled lines which may also be used to calculate movement of the pen over the surface. Even in the absence of paper, relative movement of the input device may be determined.
  • the input device could be moved over the surface of a desk, the grain of the wood providing the objects necessary for detecting relative movement of the input device.
  • a user can draft a letter on paper, or any surface over which movement can be detected optically.
  • the movements of the input device may be stored in memory and/or converted into information representing those gestures.
  • the input device may thus be used as a substitute for a portable computing device.
  • an engineer may turn to her input device as a suitable replacement for recording her thoughts as she travels by train to meet the rest of the design team.
  • the user composes a sketch representing a modification to the suspect electrical circuit in question. She activates the input device, sets it in a mode conducive to generating a sketch (which may, for example, include deactivation of conversions), and begins sketching a simplified design representing a solution to the problem.
  • the input device may then store the file representing the handwritten strokes.
  • notations and references may be jotted down next to relevant portions of the sketch, and those entries incorporated within the image file.
  • the user may switch to a notation mode, in which gestures corresponding to letters are recognized.
  • she may incorporate a description of her proposed solution along with the sketch.
  • the operator may choose to transmit the schematic to the rest of the design team for full consideration prior to the scheduled meeting.
  • Such transmission may be achieved any number of ways, including uploading the revised document from the input device to a portable wireless device such as a cellular telephone.
  • the information may then be used to generate an image file such as a VISIO™ document.
  • the previously described file corresponding to a sketch of a schematic may be displayed on the monitor of a team member's host computing device.
  • the image and accompanying text may be presented on the display of a desktop computer.
  • additional annotations may be added to those displayed.
  • movement of the input device may be detected by measuring the relative movement of objects within images captured by the optical sensor of the input device.
  • Signals generated by the input device may be processed by software stored within the input device, or transmitted to the host computing device for processing. Processing of the detected movement may generate electronic ink, text, or other data representing the notations entered via the input device.
  • the input device may be used in conjunction with a computing device having sensors for detecting movement of the input device, even in the absence of location codes.
  • the input device may be used as a source for generating handwritten notes on a personal data assistant or other computing device designed for use with a stylus. For example, while running errands, a user may be reminded of an item and wish to add it to an existing “to do” list.
  • the user retrieves the list stored in a host computing device, such as a personal data assistant. Positioning the tip of the input device over the display of the personal data assistant, the user is able to traverse through menus and make selections to retrieve the desired list. Presented with this list, the user may input checks on the screen of the host device in empty boxes located next to descriptions of tasks already completed.
  • the input device captures images of the screen including data corresponding to the box and transmits that data to the host computing device.
  • using a processing algorithm for analyzing image data, the host computing device detects the shape of the box as an object for which an entry may be made.
  • the image data may be processed to detect movement of the pen over and within the area of the box, the gestures forming the recognized shape of a “check.”
  • the host device modifies the file associated with the list to include a representation of a check within the box.
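  • A rough sketch of this box detection and check recognition follows, assuming OpenCV (version 4) for contour analysis; the size thresholds and the crude check test are invented for illustration.

    import cv2

    def find_checkboxes(screen_image):
        """Return bounding rectangles of small square contours (candidate boxes)."""
        gray = cv2.cvtColor(screen_image, cv2.COLOR_BGR2GRAY)
        _, binary = cv2.threshold(gray, 128, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
        boxes = []
        for contour in contours:
            # Approximating a box contour yields four corners.
            approx = cv2.approxPolyDP(contour, 0.04 * cv2.arcLength(contour, True), True)
            x, y, w, h = cv2.boundingRect(approx)
            if len(approx) == 4 and 10 <= w <= 40 and abs(w - h) <= 3:
                boxes.append((x, y, w, h))
        return boxes

    def gesture_is_check(stroke_points, box):
        """Crude test: the stroke stays in the box and reverses vertical direction once."""
        x, y, w, h = box
        inside = all(x <= px <= x + w and y <= py <= y + h for px, py in stroke_points)
        ys = [py for _, py in stroke_points]
        reversals = sum(1 for a, b, c in zip(ys, ys[1:], ys[2:]) if (b - a) * (c - b) < 0)
        return inside and reversals == 1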
  • the sensors of the host device detect movement of the input device and generate data representing those inputs.
  • the inputs are converted into text and displayed to the user, along with an empty box.
  • a user of Microsoft Reader may wish to jot down notes next to the relevant text.
  • the image displayed on the monitor of the portable host device is annotated using the input device.
  • the user positions the input device over the monitor of the host computer, a Tablet PC™ for example, and enters handwritten notes next to the relevant text.
  • the gestures are detected by the sensors of the host device and stored as electronic data which is converted into image data and displayed on the screen.
  • the notes may remain in handwritten form or may be converted into alphanumeric characters.
  • the notes may not be seen without actuation of additional functions, such as activating viewing of appended comments or positioning the input device over highlighting or some other indication that annotations are present.
  • the notes may then be stored in a separate file, or stored with a copy of the electronic version of the novel stored within the host computer.
  • information from additional sensors forming part of the input device may be used to supplement or completely replace other forms of movement detection.
  • additional sensors may detect linear acceleration of the input device, angular acceleration, velocity, rotation, depressive force, tilt, changes in electromagnetic fields or any sensed indication of movement or positioning of the input device. Such information may aid in an effort to produce more accurate movement detection.
  • the additional sensors may provide the only information available at a given time.
  • the input device may be used in conjunction with a generally uniform surface, such as blank paper. In such cases, the image captured by the optical sensor may provide insufficient information to consistently and accurately detect movement of the input device.
  • additional information from the additional sensors may be used to provide more refined motion detection.
  • the algorithm or algorithms used to determine position and/or movement may incorporate calculations to factor in the additional information and thereby supplement the movement and/or location detection performed by the optical motion detection system.
  • the additional sensors may provide the only information with which to detect movement. For example, if the user attempts to sketch out a drawing on the uniform white surface of a laminated countertop, the optical sensing system may not capture sufficient data representative of movement. In that case, the additional sensors may provide sufficient information to generate an acceptably accurate representation of the input information.
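  • One simple way to realize this fallback is a weighted blend of the optical estimate with a displacement dead-reckoned from the accelerometer, as in the sketch below; the confidence weighting and sensor interfaces are assumptions for illustration.

    def fuse_displacement(optical_dxdy, accel_xy, velocity_xy, dt, optical_confidence):
        """Blend optical displacement with displacement integrated from acceleration.

        optical_confidence in [0, 1] drops toward 0 on featureless surfaces
        (such as a uniform white countertop), where the inertial estimate
        must carry the measurement on its own.
        """
        # Dead-reckon from the accelerometer: v += a * dt, d = v * dt.
        vx = velocity_xy[0] + accel_xy[0] * dt
        vy = velocity_xy[1] + accel_xy[1] * dt
        inertial_dx, inertial_dy = vx * dt, vy * dt
        w = optical_confidence
        dx = w * optical_dxdy[0] + (1 - w) * inertial_dx
        dy = w * optical_dxdy[1] + (1 - w) * inertial_dy
        return (dx, dy), (vx, vy)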
  • the optical sensor unit may not capture an accurate representation of the image provided.
  • additional information from the additional sensors may be used to complement data obtained from images of the object over which the input device is moved.
  • sensors within the input device may provide an indication of movement of the pen within the plane of the display, i.e., in the horizontal and vertical directions.
  • an input device used in conjunction with a laptop computer is positioned on the tray table in front of the user.
  • An image of a document, with a maze pattern incorporated into the background, is displayed on the screen of the laptop.
  • Annotations entered by the user are shown in solid blue ink.
  • the seat belt sign comes on as the airplane experiences turbulence.
  • because the image sensor may not accurately detect the lines forming the displayed maze pattern under these conditions, movement along the x and y axes is measured by the additional sensors incorporated within the input device.
  • FIG. 8 illustrates uses of an input device in accordance with several illustrative embodiments of the present invention, as a document is created, transmitted and edited using an input device in various environments.
  • the following description is merely an illustration of uses of the input device and is not intended to limit the structure or functionality of the present invention.
  • the input device may be used to extend the life of a document by allowing the creation and/or editing of documents in a wide range of environments and for use in association with numerous devices.
  • a document 802 may be electronically created on the screen of one computing device, such as Tablet PC 803 illustrated.
  • the input device may be used to generate a handwritten draft of a document.
  • Electronic ink corresponding to the information entered on the screen of the Tablet PC 803 is generated as the input device functions as a stylus for the Tablet PC 803 .
  • the electronic ink may be converted into text form and stored in the Tablet PC 803 .
  • the electronic file representing the document may be transmitted to a second computing device, such as desktop PC 804 .
  • the document may be edited on the screen of the desktop device using the input device, which operates as an independent input unit. Because the input device senses its relative location within the displayed image of the document, edits entered on the screen of the desktop device may be reflected in the electronic document, even if the display does not include elements for sensing positioning of the input device.
  • the edits generated using the input device may be transmitted to the desktop PC 804 as they are generated, or may be stored within the input device for transmission to any PC at a later time. The edits may be entered into the version of the document stored in the desktop PC 804.
  • the document created may also be output in hard-copy form by a printer, such as printer 805 linked to the desktop PC 804 .
  • the hard-copy 806 version of the document may include information or codes designating the relative location of the input device at any location in the document, using a maze-pattern, for example.
  • the hard-copy may be marked-up by one or more users, each having an input device, with the edits of each user generated by that user's separate input device. Along with information representing edits, information identifying the pen used to generate those edits may be provided as well.
  • the inputs may be reflected using underlined colored text such as that found in applications for tracking changes made to documents.
  • the edits/inputs may be forwarded from the desktop PC 804 to the Tablet PC 803 , for incorporation into that document. Alternatively, the edits may be stored within the input device and uploaded at a later time.
  • the document may also be output on plain paper, or on any substrate not including indications of relative positioning of the input device.
  • the hard-copy may be marked-up by one or more users having an input device, and the edits of each user generated by the input device.
  • position or movement of the pen may be determined using passive coding techniques for optically sensing movement of the input device over the paper.
  • location/movement may be determined using a comparison algorithm in which the relative position of objects within each frame of image data are detected and used to determine movement of the input device.
  • the resulting edits may be transmitted to the computing device in which the document originated, for example, for updating of the original data file.
  • the edits may be transmitted through a computing device, such as the Pocket PC 807 for transmission to the destination device either through a wireless or wired communication or upon docking the device containing edits in the computing device.
  • the electronic document may also be transmitted to a second computing device, such as the Tablet PC illustrated.
  • the document may be edited on the screen of the tablet device using the input device as a simple stylus.
  • Those inputs may be forwarded from the Tablet PC to the computing device storing the original copy of the document as annotations to the document or as edits for incorporation into that document, for example.
  • the relocation of various components within the input device structure may be implemented without greatly impacting the accuracy with which the camera or the inertial sensors detect movement of the pen and produce electronic ink.
  • the image sensor may be replaced by or supplemented with a sensing device for detecting properties of the surface or object over which the input device may be moved.
  • the maze pattern may be formed on the surface of an object such that the pattern can be detected based on the radiation of energy outside the visible light spectrum, the reflectance of such energy transmitted to the object, or other such sensing techniques.
  • any property of the surface may be sensed and used to determine position and/or movement of the input device over the surface of an object.
  • a microphone sensing system may be employed such that the microphone detects acoustic reflections or emissions from the object over which the input device is positioned.
  • the input device may include a suitable display.
  • the display of a host computing device may be used to review documents and images created. The user may select formatting of the document before or after the information, such as text, is input, or may review the document and make changes to the format of the document. Viewing the document created on such a display, in the context of the above example, the user may insert a header including his or her address in the appropriate location.

Abstract

A universal input device is described. The universal input device provides a common user interface for a variety of different computing platforms, including printed documents. Using the present system, one may use the universal input device to control various computing devices as well as capture handwritten electronic ink and have the electronic ink be associated with new or stored documents.

Description

    RELATED APPLICATIONS
  • This application is a continuation application based on co-pending application Ser. No. 10/284,417 filed on Oct. 31, 2002 entitled “Universal Computing Device”, which is related to U.S. Ser. No. 10/284,412, entitled “Active Embedded Interaction Code,” invented by Jian Wang, Qiang Wang, Chunhui Zhang, and Yue Li, and to U.S. Ser. No. 10/284,451, entitled “Passive Embedded Interaction Code,” invented by Jian Wang, Yingnong Dang, Jiang Wu and Xiaoxu Ma, whose contents are hereby incorporated by reference.
  • BACKGROUND
  • 1. Technical Field
  • This disclosure relates to a computer input device for generating smooth electronic ink. More particularly, the disclosure relates to an input device that may be used on divergent platforms while providing a common user interface.
  • 2. Related Art
  • Computing systems have dramatically changed the way in which we live. The first wave of computers was prohibitively expensive, and was only cost effective for use in business settings. As computers became more affordable, the use of personal computers both in the workplace and at home has become so widespread that computers have become as common as desks in the office and kitchen tables in the home. Microprocessors have been incorporated in all aspects of our daily lives, from use in televisions and other entertainment systems to devices for regulating the operation of our automobiles.
  • The evolution of computing devices, from data-crunching devices that occupied entire floors of large office facilities to laptop computers or other portable computing devices, has dramatically impacted the manner in which documents are generated and information is stored. Such portable computing devices have enabled individuals to type letters, draft memoranda, take notes, create images, and perform numerous tasks in places other than the office. Professionals and nonprofessionals alike are empowered to perform tasks while on the move, using devices that fulfill their computing needs in any location.
  • Typical computer systems, especially computer systems using graphical user interface (GUI) systems, such as Microsoft Windows, are optimized for accepting user input from one or more discrete input devices such as a keyboard (for entering text), and a pointing device (such as a mouse) with one or more buttons for activating user selections.
  • One of the original goals of the computing world was to have a computer on every desk. To a large extent, this goal has been realized by the personal computer becoming ubiquitous in the office workspace. With the advent of notebook computers and high-capacity personal data assistants, the office workspace has been expanded to include a variety of non-traditional venues in which work is accomplished. To an increasing degree, computer users must become masters of the divergent user interfaces for each of their computing devices. From the mouse and keyboard interface of the standard personal computer, to the simplified resistive stylus interface of personal data assistants, and even to the minimalistic keys of a cellular telephone, a user is confronted with a variety of different user interfaces that must be mastered before the underlying technology can be used.
  • Despite the advances in technology, most users tend to use documents printed on paper as their primary editing tool. Some advantages of printed paper include its readability and portability. Others include the ability to share annotated paper documents and the ease with which one can archive printed paper. One user interface that is bridging the gap between advanced computing systems and the functionality of printed paper is a stylus-based user interface. One approach for the stylus-based user interface is to use resistive technology (common in today's PDAs). Another approach is to use active sensors in a notebook computer. One of the next goals of the computing world is to bring the user interface for operating the computer back to the user.
  • A drawback associated with the use of a stylus is that such devices are tied to the computing device containing the sensor board. In other words, the stylus may only be used to generate inputs when used in conjunction with the required sensor board. Moreover, detection of a stylus is affected by the proximity of the stylus to the sensing board.
  • There is a need in the art for a portable computing device that may function as an input device for any one of a variety of computing devices and which may operate in a variety of situations.
  • SUMMARY
  • Aspects of the present invention address one or more of the issues identified above, thereby providing a common user interface to users across divergent computing platforms. Aspects of the present invention relate to an input device for generating electronic ink, and/or generating other inputs, independent of the device for which the data is intended. The input device may be formed in the shape of a pen, and may or may not include an ink cartridge to facilitate movement of the input device in a familiar manner.
  • The foregoing summary of aspects of the invention, as well as the following detailed description of various embodiments, is better understood when read in conjunction with the accompanying drawings, which are included by way of example, and not by way of limitation with regard to the claimed invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows a general description of a computer that may be used in conjunction with embodiments of the present invention.
  • FIG. 2 illustrates an input device (including all of the components) in accordance with an illustrative embodiment of the present invention.
  • FIG. 3 provides three illustrative embodiments of a camera system for use in accordance with aspects of the present invention.
  • FIG. 4 illustrates an illustrative technique (maze pattern) for encoding the location of the document.
  • FIG. 5 provides an illustration of a trace pattern from which electronic ink may be generated.
  • FIG. 6 shows the hardware architecture of a system in accordance with one embodiment of the present invention.
  • FIG. 7 illustrates a further combination of components incorporated in an input device for generating electronic ink in accordance with another illustrative embodiment.
  • FIG. 8 illustrates uses of an input device in accordance with several illustrative embodiments of the present invention.
  • DETAILED DESCRIPTION OF THE DRAWINGS
  • Aspects of the present invention relate to an input device that may be used on a variety of different computing platforms, from controlling a desktop or notebook computer, writing on a whiteboard, or controlling a PDA or cellular phone, to creating ink that may be ported to any of these platforms. The following description is divided into a number of sections as follows: terms, general-purpose operating environment, universal pen and camera, active coding, passive coding, internal sensors, additional components, and sample implementations.
  • Terms
  • Pen—any writing implement that may or may not include the ability to store ink. In some examples a stylus with no ink capability may be used as a pen in accordance with embodiments of the present invention.
  • Camera—an image capture system.
  • Active Coding—incorporation of codes within the object or surface over which the input device is positioned for the purpose of determining positioning and/or movement of the input device using appropriate processing algorithms.
  • Passive Coding—detecting movement/positioning of the input device using image data, other than codes incorporated for that purpose, obtained from the object or surfaces over which the input device is moved using appropriate processing algorithms.
  • Input Device—a device for entering information, which may be configured for generating and processing information.
  • Active Input Device—an input device that actively measures signals and generates data indicative of positioning and/or movement of the input device using sensors incorporated within the input device.
  • Passive Input Device—an input device for which movement is detected using sensors incorporated other than within the input device.
  • Computing Device—a desktop computer, a laptop computer, Tablet PC™, a personal data assistant, a telephone, or any device which is configured to process information including an input device.
  • General Purpose Operating Environment
  • FIG. 1 is a functional block diagram of an example of a general-purpose digital computing environment that can be used to implement various aspects of the present invention. In FIG. 1, a computer 100 includes a processing unit 110, a system memory 120, and a system bus 130 that couples various system components including the system memory to the processing unit 110. The system bus 130 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. The system memory 120 includes read only memory (ROM) 140 and random access memory (RAM) 150.
  • A basic input/output system 160 (BIOS), containing the basic routines that help to transfer information between elements within the computer 100, such as during start-up, is stored in the ROM 140. The computer 100 also includes a hard disk drive 170 for reading from and writing to a hard disk (not shown), a magnetic disk drive 180 for reading from or writing to a removable magnetic disk 190, and an optical disk drive 191 for reading from or writing to a removable optical disk 192 such as a CD ROM or other optical media. The hard disk drive 170, magnetic disk drive 180, and optical disk drive 191 are connected to the system bus 130 by a hard disk drive interface 192, a magnetic disk drive interface 193, and an optical disk drive interface 194, respectively. The drives and their associated computer-readable media provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for the personal computer 100. It will be appreciated by those skilled in the art that other types of computer readable media that can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, random access memories (RAMs), read only memories (ROMs), and the like, may also be used in the example operating environment.
  • A number of program modules can be stored on the hard disk drive 170, magnetic disk 190, optical disk 192, ROM 140 or RAM 150, including an operating system 195, one or more application programs 196, other program modules 197, and program data 198. A user can enter commands and information into the computer 100 through input devices such as a keyboard 101 and pointing device 102. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner or the like. These and other input devices are often connected to the processing unit 110 through a serial port interface 106 that is coupled to the system bus, but may be connected by other interfaces, such as a parallel port, game port or a universal serial bus (USB). Further still, these devices may be coupled directly to the system bus 130 via an appropriate interface (not shown). A monitor 107 or other type of display device is also connected to the system bus 130 via an interface, such as a video adapter 108. In addition to the monitor, personal computers typically include other peripheral output devices (not shown), such as speakers and printers. In a preferred embodiment, a pen digitizer 165 and accompanying pen or stylus 166 are provided in order to digitally capture freehand input. Although a direct connection between the pen digitizer 165 and the serial port is shown, in practice, the pen digitizer 165 may be coupled to the processing unit 110 directly, via a parallel port or other interface and the system bus 130 as known in the art. Furthermore, although the digitizer 165 is shown apart from the monitor 107, it is preferred that the usable input area of the digitizer 165 be co-extensive with the display area of the monitor 107. Further still, the digitizer 165 may be integrated in the monitor 107, or may exist as a separate device overlaying or otherwise appended to the monitor 107.
  • The computer 100 can operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 109. The remote computer 109 can be a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 100, although only a memory storage device 111 has been illustrated in FIG. 1. The logical connections depicted in FIG. 1 include a local area network (LAN) 112 and a wide area network (WAN) 113. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
  • When used in a LAN networking environment, the computer 100 is connected to the local network 112 through a network interface or adapter 114. When used in a WAN networking environment, the personal computer 100 typically includes a modem 115 or other means for establishing communications over the wide area network 113, such as the Internet. The modem 115, which may be internal or external, is connected to the system bus 130 via the serial port interface 106. In a networked environment, program modules depicted relative to the personal computer 100, or portions thereof, may be stored in the remote memory storage device.
  • It will be appreciated that the network connections shown are illustrative and other techniques for establishing a communications link between the computers can be used. The existence of any of various well-known protocols such as TCP/IP, Ethernet, FTP, HTTP and the like is presumed, and the system can be operated in a client-server configuration to permit a user to retrieve web pages from a web-based server. Any of various conventional web browsers can be used to display and manipulate data on web pages.
  • Universal Pen and Camera
  • FIG. 2 provides an illustrative embodiment of an input device for use in accordance with various aspects of the invention. The following describes a number of different elements and/or sensors. Various sensor combinations may be used to practice aspects of the present invention. Further, additional sensors may be included as well, including a magnetic sensor, an accelerometer, a gyroscope, a microphone, or any sensor that might detect the position of the input device relative to a surface or object. In FIG. 2, pen 201 includes ink cartridge 202, pressure sensor 203, camera 204, inductive element 205, processor 206, memory 207, transceiver 208, power supply 209, docking station 210, cap 211, and display 212. The various components may be electrically coupled as necessary using, for example, a bus (not shown). Pen 201 may serve as an input device for a range of devices including a desktop computer, a laptop computer, Tablet PC™, a personal data assistant, a telephone, or any device which may process and/or display information.
  • The input device 201 may include an ink cartridge 202 for performing standard pen and paper writing or drawing. Moreover, the user can generate electronic ink with the input device while operating the device in the manner typical of a pen. Thus, the ink cartridge 202 may provide a comfortable, familiar medium for generating handwritten strokes on paper while movement of the pen is recorded and used to generate electronic ink. Ink cartridge 202 may be moved into a writing position from a withdrawn position using any of a number of known techniques. Alternatively, ink cartridge 202 may be replaced with a cartridge that does not contain ink, such as a plastic cartridge with a rounded tip, but that will allow the user to move the pen about a surface without damaging the pen or the surface. Additionally, an inductive element or elements may be included to aid in detecting relative movement of the input device by, for example, providing signals indicative of the position of the input device in a manner similar to those generated by a stylus. Pressure sensor 203 may be included for designating an input, such as might be indicated when the pen 201 is depressed while positioned over an object, thereby facilitating the selection of an object or an indication such as might be achieved by actuating a mouse button, for example. Alternatively, the pressure sensor 203 may detect the depressive force with which the user makes strokes with the pen for use in varying the width of the electronic ink generated. Further, sensor 203 may trigger operation of the camera. In alternative modes, camera 204 may operate independently of the setting of pressure sensor 203.
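  • By way of illustration, the mapping from a raw pressure reading to ink width could be as simple as the following sketch; the sensor range and width limits are invented values, not taken from the specification.

    def stroke_width(pressure, min_width=0.5, max_width=3.0, max_pressure=1023):
        """Scale a raw pressure-sensor reading linearly to an ink width in pixels."""
        pressure = max(0, min(pressure, max_pressure))
        return min_width + (max_width - min_width) * pressure / max_pressure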
  • Moreover, in addition to the pressure sensor, which may act as a switch, additional switches may also be included to effect various settings for controlling operation of the input device. For example, one or more switches may be provided on the outside of the input device and used to power on the input device, to activate the camera or light source, to control the sensitivity of the sensor or the brightness of the light source, to set the input device in a sketch mode in which conversion to text is not performed, to set the device to store the input data internally, to process and store the input data, to transmit the data to a processing unit such as a computing device with which the input device is capable of communicating, or to control any setting that might be desired.
  • Camera 204 may be included to capture images of the surface over which the pen is moved. Inductive element 205 also may be included to enhance performance of the pen when used as a stylus in an inductive system. Processor 206 may be comprised of any known processor for performing functions associated with various aspects of the invention, as will be described in more detail below. Similarly, memory 207 may include a RAM, a ROM, or any memory device for storing data and/or software for controlling the device or processing data. The input device may further include a transceiver 208. The transceiver permits information exchange with other devices. For example, Bluetooth or other wireless technologies may be used to facilitate communications. The other devices may include a computing device, which may further include input devices.
  • Power supply 209 may be included, and may provide power if the pen 201 is to be used independent of and remotely from the host device, the device in which the data is to be processed, stored and/or displayed. The power supply 209 may be incorporated into the input device 201 in any number of locations, and may be positioned for immediate replacement, should the power supply be replaceable, or to facilitate its recharging should the power supply be rechargeable. Alternatively, the pen may be coupled to alternate power supplies, such as an adapter for electrically coupling the pen 201 to a car battery, a recharger connected to a wall outlet, to the power supply of a computer, or to any other power supply.
  • Docking station link 212 may be used to transfer information between the input device and a second device, such as an external host computer. The docking station link may also include structure for recharging the power supply 209 when attached to a docking station, not shown, or connected to a power supply. A USB or other connection may removably connect the input device to a host computer through the docking station link, or through an alternative port. Alternatively, a hardwire connection may also be used to connect the pen to a device for transferring and receiving data. In a hardwired configuration, the docking station link would be omitted in favor of wiring connecting the input device directly to a host. The docking station link may be omitted or replaced with another system for communicating with another device (Bluetooth or 802.11b, for example).
  • The input device 201 may further include a removable cap 211 which may be equipped with a metal tip for facilitating resistive sensing, so that input device 201 may be used with a device that includes a sensing board, for example. The shell of input device 201 may be comprised of plastic, metal, a resin, a combination thereof, or any material that may provide protection to the components or the overall structure of the input device. The chassis may include a metal compartment for electrically shielding some or all of the sensitive electronic components of the device. The input device may be of an elongated shape, which may correspond to the shape of a pen. The device may, however, be formed in any number of shapes consistent with its use as an input device and/or ink generating device.
  • FIGS. 3A-C depict three illustrative embodiments of a camera for use in accordance with aspects of the present invention. As described, the input device 201 may be used to generate electronic ink by detecting movement of the pen using, for example, a camera. The camera may be included to capture images of the surface over which the pen is moved and, through image analysis, detect the amount of movement of the pen over the surface being scanned. The movements may be correlated with the document to electronically transpose, add, or associate electronic ink with the document (e.g., to store input annotations apart from the original document).
  • As shown in FIG. 3A, in one embodiment, camera 304 includes an image sensor 320 comprised of, for example, an array of image sensing elements. For example, the camera may be comprised of a CMOS image sensor with the capability of scanning a 1.79 mm by 1.79 mm square area at a resolution of 32 pixels by 32 pixels. The minimum exposure frame rate for one such image sensor may be approximately 330 Hz, while the illustrative image sensor may operate at a processing rate of 110 Hz. The image sensor selected may comprise a color image sensor, a grayscale image sensor, or may operate to detect intensities exceeding a single threshold. However, selection of the camera or its component parts may vary based on the desired operating parameters associated with the camera, based on such considerations as performance, costs or other considerations, as may be dictated by such factors as the resolution required to accurately calculate the location of the input device.
  • A light source, 321, may illuminate the surface over which the input device is moved. The light source may, for example, be comprised of a single light emitting diode (LED), an LED array, or other light emitting devices. The light source may produce light of a single color, including white, or may produce multiple colors. A half mirror 322 may be included within the camera to direct light as desired. The camera 304 may further include one or more optical devices 323 for focusing light from the light source 321 onto the surface scanned 324 and/or to focus the light reflected from that surface to the image sensor 320.
  • As illustrated in FIG. 3A, light emitted from light source 321 is reflected by half-mirror 322, a mirror that reflects or transmits light depending on the direction of the impinging light. The reflected light is then directed through lens system 323 and transmitted to the reflective surface below. The light is then reflected off of that surface, passes through lens system 323, strikes half-mirror 322 at a transmission angle passing through the mirror, and impinges on sensing array 320. Of course, cameras including a wide range of components may be used to capture the image data, including cameras incorporating a lesser, or a greater, number of components. Variations in the arrangement of components may also be numerous. To provide just one example, in a simplified arrangement, the light source and the sensing array may be positioned together such that they both face the surface from which the image is to be captured. In that case, because no reflections within the camera are required, the half-mirror may be removed from the system. As shown in FIG. 3B, in a simplified configuration the light source 321 is positioned a distance from the lens 323 and sensor 320. In a further simplified arrangement, as shown in FIG. 3C, the light source may be removed and ambient light reflecting off the object surface is focused by lens 323 onto the sensor 320.
  • Thus, variations in the components incorporated into the camera, or their placement, may be employed in a manner consistent with aspects of the present invention. For example, the placement and/or orientation of the camera and/or cartridge may be varied from that shown in FIG. 2 to allow for the use of a wide range of camera and/or ink configurations and orientations. For example, camera 304, or any of its component parts, may be located in openings adjacent to that provided for the ink cartridge, rather than within the same opening as illustrated. As an additional example, camera 304 may be positioned in the center of the input device with the ink cartridge positioned to the side of the camera. Similarly, the light source 321 may be incorporated within the structure housing the remaining components of the camera, or one or more components may be positioned separate from the others. Furthermore, a light projecting feature may also be enabled, using a light source and/or optical system, with additional structure and/or software, or modifications to the illustrated components as necessary.
  • Active Coding
  • To aid in the detection and/or positioning of the input device, the surface of an object over which the input device is positioned may include image data that indicates the relative position of areas of the surface. In one exemplary embodiment, the surface being scanned may comprise the display of a host computer or other external computing device, which may correspond to the monitor of a desktop computer, a laptop computer, Tablet PC™, a personal data assistant, a telephone, digital camera, or any device which may display information. Accordingly, a blank document or other image generated on the screen of a Tablet PC™ may include data corresponding to a code that represents the relative position of that portion of the document within the entire document, or relative to any other portion of the image. The information may be comprised of images, which may include alphanumeric characters, a coding pattern, or any discernable pattern of image data that may be used to indicate relative position. The image or images selected for use in designating the location of areas within the surface of the object may depend on the sensitivities of the scanning device incorporated into the camera, such as the pixel resolution of the sensor, and/or the pixel resolution of the image data contained within the surface being scanned. The location information extracted from the object may then be used to track movement of the input device over the object. Using that information, electronic ink or other information corresponding to movement of the input device may be accurately generated. Location information may be used both to detect the position within the image at which the input is to be effected and to provide an indication of movement of the input device over the object surface. The resulting information may be used interactively with word processing software to generate changes in a document, for example.
  • In an alternate embodiment, the object used in combination with the input device may be composed of paper with positional information included in the background, for example. The positional information may be incorporated in any form of code, optical representation, or other form that may be sensed by a sensor associated with the input device and used to represent the relative location of the specific site on the paper.
  • FIG. 4 illustrates an illustrative technique for encoding the location of the document. In this example, the background of the image may include thin lines that, when viewed in large groups form a maze-like pattern. Each grouping of lines within the maze design, comprised of a few thin lines with unique orientations and relative positions, for example, may indicate the position of that portion of the maze pattern relative to other portions of the document. Decoding of the maze pattern found in a captured image may be performed in accordance with numerous decoding schemes. In one embodiment, a particular arrangement and grouping of lines may be decoded to generate positional information. In another embodiment, an indication of the position of the captured data may be derived by extracting a code from the image corresponding to the sampled pattern, and using that code to address a look-up table containing data identifying the location of that area. Reference to the coding technique employing a maze pattern is provided for illustrative purposes, and alternative active coding techniques, including, but not limited to the visual coding techniques in U.S. Ser. No. 10/284,412, entitled, “Active Embedded Interaction Code,” invented by Jian Wang, Qiang Wang, Chunhui Zhang, and Yue Li, whose contents are expressly incorporated by reference for all essential subject matter, may also be used consistent with aspects of the invention.
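  • The look-up-table variant of this decoding can be sketched as follows; extract_code() here is a toy stand-in for the actual pattern analysis, and the table keyed by unique codes is an assumption for illustration.

    import numpy as np

    def extract_code(patch):
        """Toy stand-in for pattern analysis: threshold a patch into a bit code."""
        bits = (patch > patch.mean()).astype(int).ravel()
        return int("".join(map(str, bits)), 2)

    def locate(patch, code_to_position):
        """Map a captured portion of the location pattern to a document (x, y)."""
        position = code_to_position.get(extract_code(patch))
        if position is None:
            raise ValueError("patch did not match any known location code")
        return position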
  • Passive Coding
  • Even in the absence of location codes, images captured by the image sensor may be analyzed to determine the location of the input device at the time of image capture. Successive images may be used to calculate the relative positions of the input device at different times. Correlation of this information may yield an accurate trace of the input device over the substrate. Using this trace information electronic ink accurately representing handwritten strokes may be generated, for example.
  • FIG. 5 provides an illustration of a trace pattern from which electronic ink may be generated. In this example, a first captured image may contain a portion of a maze pattern indicative of a first position p1 of the input device at a first time, t1. The next captured image may contain a portion of the coded image data, a different portion of the maze pattern in this example, providing location information of a second position p2 at a second time, t2. A third captured image may contain a third portion of the maze pattern, thereby indicating positioning of the input device at a third position p3 at time t3. Using this data, the three points may indicate a trace of the input device from time t1 through t3. Applying algorithms for estimating the inking pattern traced by the input device, electronic ink may be generated. The complexity of the algorithm applied may dictate the accuracy of the ink generated. For example, a basic inking algorithm may simply connect the dots with straight lines of unvarying thickness. Algorithms factoring previous sampling points, the time between samplings or other data indicative of the velocity or acceleration at which the input was moved, data indicative of the depressive force used, or any other relevant data, may be processed to provide ink that more accurately represents the actual movement of the input device (for example, from other sensors).
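  • A sketch of such an inking algorithm over timed samples (x, y, t) follows; the assumption that faster strokes yield thinner ink, and the constants used, are illustrative only.

    def render_ink(samples, base_width=2.0, k=0.01):
        """Turn (x, y, t) samples into line segments with velocity-scaled widths."""
        segments = []
        for (x1, y1, t1), (x2, y2, t2) in zip(samples, samples[1:]):
            dt = max(t2 - t1, 1e-6)                   # tolerate irregular sampling
            speed = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5 / dt
            width = max(0.5, base_width - k * speed)  # thinner ink at higher speed
            segments.append(((x1, y1), (x2, y2), width))
        return segments

    # For the three points p1..p3 sampled at t1..t3:
    # render_ink([(0, 0, 0.00), (5, 2, 0.01), (9, 7, 0.02)])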
  • Optical scanning performed by camera 304 may generate the data necessary to determine the position of the input device at various times, and that information may be used to generate electronic ink. In one illustrative embodiment, comparison of the image captured at time t1 with the image captured at time t2 may provide data indicating the distance the pen moved from one point to another during the period t1 to t2. Those two points of data, and/or the relative distance moved, may then be used to generate a trace of the movement of the input device for generating electronic ink representative of handwritten strokes. Comparison of two or more images, or portions of captured images, for calculating the relative movement might be accomplished by a difference analysis. In that case, features appearing in more than one image may be compared, and the relative movement of the feature or features from one location to another within those images may provide an accurate indication of pen movement, for example. Should an irregular sampling period be used, the processing algorithm may be modified to compensate for variations in the sampling period to more accurately indicate the correlation between movement of the input device and the actual time required for each movement. Information indicative of the velocity of motion may assist in generating ink of the appropriate thickness.
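  • One standard form of such a difference analysis is phase correlation, sketched below with NumPy; this particular method is an assumption offered as an alternative to the patch-matching sketch given earlier, not a technique named by the specification.

    import numpy as np

    def phase_correlate(frame_a, frame_b):
        """Return the (dy, dx) translation that best aligns frame_b with frame_a."""
        fa = np.fft.fft2(frame_a)
        fb = np.fft.fft2(frame_b)
        cross_power = fa * np.conj(fb)
        cross_power /= np.abs(cross_power) + 1e-12   # keep phase information only
        correlation = np.fft.ifft2(cross_power).real
        dy, dx = np.unravel_index(np.argmax(correlation), correlation.shape)
        # Shifts beyond half the frame wrap around to negative displacements.
        h, w = frame_a.shape
        if dy > h // 2:
            dy -= h
        if dx > w // 2:
            dx -= w
        return dy, dx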
  • In accordance with such an embodiment, the surface over which the input device is moved may include the display of a computing device, a mouse pad, a desktop, or any non-uniform reflective surface from which objects or image data may be extracted indicating movement of the input device over that surface. The tracking algorithm with which the captured image data may be processed may be fixed or may vary depending on the characteristics of the images captured. Using a simple tracking algorithm, the processor may detect grains in the wood of a desktop, for example, and based on a comparison of a sequence of images captured by the camera, the relative location of particular patterns of grain within successive images may be used to determine the location of the input device at various times and/or the relative movement of the input device over that surface. A more complex tracking algorithm may be required where features within the images are less easily discerned and the image is more uniform. Alternative passive coding techniques, including, but not limited to, the coding techniques found in U.S. Ser. No. 10/284,451, entitled “Passive Embedded Interaction Code,” invented by Jian Wang, Yingnong Dang, Jiang Wu and Xiaoxu Ma, whose contents are expressly incorporated by reference for all essential subject matter, may also be employed consistent with aspects of the invention.
  • Hardware Architecture
  • FIG. 6 shows the hardware architecture of a system in accordance with one embodiment of the present invention. Many of the same or related components illustrated in previous embodiments will be represented using like reference numerals. Processor 606 may be comprised of any known processor for performing functions associated with various aspects of the invention. For example, the processor may include an FPSLIC AT94S40, and may be comprised of an FPGA with an AVR core. That particular device may include a 20 MHz clock and operate at a speed of 20 MIPS. Of course, selection of a processor for use in input device 601 may be dictated by the cost and/or processing speed requirements of the system. The processor 606 may perform image analysis, should such analysis be conducted within the input device. Alternatively, processing may be performed by a second processor, such as a digital signal processor (DSP) incorporated into the device 601. The processor 606 may further operate to perform steps critical to reducing power consumption to conserve power stored in power supply 609, such as powering down various components when the input device is inactive, which may be based on data indicating movement and/or positioning of the device. The processor 606 may further operate to calibrate and regulate the performance of various components, including adjustments to the intensity of the light source or to the sensitivity of the sensing array of camera 604, for example. Also, the processor, or a coupled digital signal processor, may choose from among a plurality of stored image processing algorithms, and may be controlled to select the image analysis algorithm most suitable for detecting movement in accordance with, for example, characteristics associated with the surface over which the device is moved. Thus, the algorithm may be selected automatically based on performance considerations programmed into the input device. Alternatively, the input device may be controlled, and settings established, based on user selections input, for example, via actuations of the force sensor or based on handwritten strokes corresponding to commands.
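  • The algorithm selection described above might be dispatched as in the sketch below; the detector objects, texture threshold, and method names are hypothetical.

    import numpy as np

    def choose_tracker(frame, maze_decoder, feature_tracker, inertial_tracker):
        """Pick the motion-detection strategy suited to the captured surface."""
        if maze_decoder.pattern_present(frame):   # active coding available
            return maze_decoder
        if np.std(frame) > 10:                    # enough texture for passive coding
            return feature_tracker
        return inertial_tracker                   # featureless surface: inertial fallback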
  • In one embodiment, memory 607 may include one or more RAMs, ROMs, FLASH memories, or any memory device or devices for storing data, storing software for controlling the device, or storing software for processing data. As noted, data representing location information may be processed within the input device 601 and stored in memory 607 for transfer to a host computer 620. Alternatively, the captured image data may be buffered in memory 607 within the input device 601 for transfer to a host device 620 for processing or otherwise.
  • Transceiver, or communication unit, 608 may include a transmission unit and a receiving unit. As noted, information representing movement of the input device, either processed into a form suitable for generating and/or displaying electronic ink or otherwise, may be transmitted to a host computer 620, such as the previously described desktop computer, laptop computer, Tablet PC™, personal digital assistant, telephone, or other such device for which user inputs and electronic ink might be useful. The transceiver 608 may communicate with an external device using any wireless communication technique, including Bluetooth technology for performing short-range wireless communications, infrared communications, or even cellular or other long-range wireless technologies. Alternatively, the transceiver 608 may control the transmission of data over a direct link to a host computer, such as over a USB connection, or indirectly through a connection with docking cradle 630. The input device may also be hardwired to a particular host computer using a dedicated connection. The transceiver 608 may also be used to receive information and/or software, which, in one embodiment, may be used for improving performance of the input device. For example, program information for updating the control functions of the processor or processors may be uploaded via any of the previously described techniques. Moreover, software for analyzing the image data and/or for calibrating the input device may be downloaded from an external device.
  • Processor 606 may operate in accordance with an interaction model. An interaction model may be implemented in the form of software for maintaining a consistent experience in which electronic ink is generated regardless of the external device for which the unit performs the functions of an input device. The interaction model may process captured data for conversion into a form universally suitable for use on any number of host devices including a desktop computer, a laptop computer, Tablet PC™, a personal data assistant, a telephone, a whiteboard, or any device that might store, display or record data input via the input device. The processor 606 may recognize the device to which it is connected, or for which the data representing handwritten inputs are intended, and based on such recognition, select processing that converts input data into a form suitable for the specific device recognized. In that case, a conversion to a form useful for each potential recipient computing device would be contained within the input device and made available as necessary. Recognition of the intended recipient device may be attained as a result of communication between the devices, should they be connected wirelessly or directly. Alternatively, the user may enter the identity of the device or devices for which the data is intended directly into the input device. Of course, if the input device includes a display, data may be processed using a default processing algorithm suitable for use with the display and/or a multitude of other devices.
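  • One way to realize such an interaction model is a table of converters keyed by the recognized recipient device, with a display-friendly default; the converter names below are invented for illustration.

    def convert_for_device(strokes, device_id, converters, default_converter):
        """Convert captured stroke data into the form the recipient device expects."""
        return converters.get(device_id, default_converter)(strokes)

    # Usage sketch (hypothetical converter functions):
    # converters = {"tablet_pc": to_serialized_ink, "phone": to_text}
    # payload = convert_for_device(strokes, recognized_id, converters, to_bitmap)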
  • Inertial Sensors
  • As illustrated in FIG. 7, input device 701 also may include one or more inertial sensors 715 for sensing pen movement, position, or orientation, in addition to the previously described components represented with like reference numerals. For example, input device 701 may include a gyroscope for providing data representing the angular velocity of the pen in a plurality of directions. The input device 701 may include one or more accelerometers, or sets of accelerometers, measuring the acceleration or gravitational forces upon the pen. Data representing movement of the pen may also be obtained using a magnetic sensor which measures movements of the pen by detecting variations in measurements of the earth's magnetic field, described herein as an inertial sensor because it detects movement of the input device based on data other than image data. Data from either or any of the inertial sensors incorporated with or into the input device, which may include gyroscopes, accelerometers, magnetic sensor, inductive elements or any device or devices for measuring movement of the input device, may be used in combination with data from the camera to obtain data representing movement or positioning of the input device, and thereby produce data for generating electronic ink.
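  • A minimal dead-reckoning sketch for the inertial path appears below: accelerometer samples are double-integrated into position estimates. A real device would also correct for gravity, bias, and drift; those steps are omitted here.

    def integrate_motion(accel_samples, dt):
        """Integrate (ax, ay) samples taken every dt seconds into (x, y) positions."""
        vx = vy = x = y = 0.0
        positions = []
        for ax, ay in accel_samples:
            vx += ax * dt
            vy += ay * dt
            x += vx * dt
            y += vy * dt
            positions.append((x, y))
        return positions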
  • Sample Implementations
  • I. The Input Device May Operate Using Active Coding
  • A. Active Coding Provides Location Information for the Entry of Input Information on a Display or Other Writing Surface
  • As noted, a surface of an object over which the input device is positioned and/or moved may include coded image data that indicates the location or relative position of each area within that surface. The object may comprise the display of a computing device, such as a laptop computer. In one embodiment, a document may be recalled from memory and displayed on the screen. Embedded within that document, such as in the background, may lie coded information indicating the position of each area of the document. For example, the background of the document may include a maze pattern, a sufficiently large portion of which uniquely identifies each region within the entire document. The input device may be used in combination with the coded location information to add annotations or edits to the document at specified locations, even if the display of the laptop does not include sensors for detecting movement of an input device over a screen. Thus, the input device may function as an “active input device” such that sensors associated with the input device generate data indicative of position or location of that device.
  • In one example, the image sensor incorporated within the input device captures image data representing the surface of the display screen over which the input device is positioned and/or moved. The sensor captures images including location codes indicating the relative position of the input device. As the user moves about the displayed image, entering annotations and/or making edits to the electronic document displayed, the input device generates signals representing those inputs and data representing the location within the document at which those inputs are to be incorporated. Control of the laptop may also be effected using the input device in place of a mouse, or to perform other standard input functions, including the movement of a cursor and the actuation of selections.
  • The input device may be used in conjunction with word processing software to edit the document by, for example, deleting text and inserting new text. To edit the document displayed on the screen of a computing device, a user positions the input device over the screen at the desired location. To delete text, the user may position the input device proximate the screen and move the device in a motion to strike through the image of the text displayed. By sensing the location codes, the image may be processed both to determine that the pen has been moved in a striking motion and to identify the text corresponding to the location at which the user moved the input device. Accordingly, the inputs may be used to delete that data.
  • Next, the user may wish to insert new text. In a familiar manner, the user may draw a symbol for inserting text, such as a “caret” or upside-down “V,” at the location at which the new text is to be inserted. Processing software for converting inputs into image data and/or commands, stored in the input device or host computer, recognizes the symbol as a control signal for inserting text. With the aid of the input device, the user may then write the text to be inserted by hand.
  • In an alternative embodiment, the user may add notes with highlighting indicating the original text to which the annotations pertain. For example, the user may select the text to be highlighted using a pull-down menu, or a highlighting button, displayed on the screen. Next, the input device is dragged over text to be selected for highlighting. Then comments to be associated with the highlighted/selected text may be written on the screen of the display at a location adjacent the highlighted text. When the operation is complete, the user may select the prompts necessary for completing entry of annotations. All of these modifications to the document may be created using the input device regardless of whether the display includes sensors for detecting movement of the input device.
  • Modifications to the document may be displayed and/or incorporated within the document in the form of image data, electronic ink or data converted into text. Conversion of the inputs into text may occur in a manner invisible to the user, such that text appears in the display of the document on screen as it is entered. Alternatively, the handwriting of the user may appear within the body of the document. To achieve instantaneous display of edits, information representing the movement of the pen and the location of such edits may be transmitted to the laptop device on an ongoing basis.
  • As noted, the identity of the person entering the inputs may also be recorded. For example, the input device may generate information identifying the user and/or the particular input device. The identity information may be attached to the generated input data. Alternatively, such identification information may be provided as a separate signal transmitted to the host device.
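How the identity might travel with the input data is not specified above; the record below is one plausible shape for it, with every field name an assumption made for illustration.

    import json, time

    def make_input_record(stroke, device_id, user_id=None):
        # Attach identity metadata to the generated stroke data.
        return {
            "device_id": device_id,      # identifies the particular input device
            "user_id": user_id,          # optionally identifies the writer
            "timestamp": time.time(),
            "stroke": stroke,            # list of (x, y) document locations
        }

    record = make_input_record([(12, 30), (14, 31)], "pen-07", user_id="jwang")
    print(json.dumps(record))            # ready to transmit to the host device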
  • While the above illustrative embodiment identifies the surface over which the input device is moved as the display of a laptop device, the input device may also function to detect positioning using codes incorporated within a surface of any object over which the device may be moved. Thus, an image incorporating location codes may be created and/or edited using the input device in combination with the monitor of a desktop computer, a Tablet PC™, a personal data assistant, a telephone, or any device which may display information. Coded information may also be incorporated within a transparent sheet laid over the display screen of such devices, or incorporated within a surface that may be used in combination with a display, including protective films.
  • Coded information may also be incorporated on a writing surface or on writing material, such as paper, to uniquely identify the locations on that surface. For example, positional information may be incorporated in the background of the paper. As noted, the positional information may include any form of indication or code representing the relative location of a specific site on the paper. Accordingly, the input device may be used in association with coded paper to record information corresponding to the handwriting of a user at the appropriate location. For example, armed with only the input device and a writing surface incorporating coded position information, a user riding in a taxi may draft a letter to a client. As the user writes on the paper with the input device, gestures corresponding to text or other input information are recognized by detecting changes in the location of the input device at certain times. The inputs may then be converted into electronic ink or other electronic data for use in generating information corresponding to those gestures. Conversion of the inputs may be performed as those inputs are generated, either within the input device or within a host computing device coupled to the input device. Alternatively, such conversion may be performed at a later time. For example, the information generated using the input device may be stored in memory and transmitted to a recipient and/or host computer for suitable processing at a later time.
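The passage notes that gestures are recognized by detecting changes in the location of the input device at certain times. A minimal sketch of that idea, assuming timestamped location samples and an illustrative pause threshold, segments the samples into strokes that can then be converted to electronic ink immediately or later:

    def to_strokes(samples, max_gap=0.15):
        """samples: list of (t, x, y) sorted by t -> list of strokes."""
        strokes, current, last_t = [], [], None
        for t, x, y in samples:
            if last_t is not None and t - last_t > max_gap:
                strokes.append(current)      # pause detected: stroke ended
                current = []
            current.append((x, y))
            last_t = t
        if current:
            strokes.append(current)
        return strokes

    samples = [(0.00, 0, 0), (0.05, 1, 0), (0.10, 2, 1),   # first stroke
               (0.60, 5, 5), (0.65, 6, 5)]                 # pause, then second
    print(len(to_strokes(samples)))                        # -> 2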
  • Data generated using the input device, whether those inputs are handwritten letters, symbols, words or other written images, may be incorporated into a document at locations identified by the location codes. Thus, even in the absence of a formatted template, the layout of a document, such as the previously described letter, may be achieved using the location information identifying the location within the document at which the information is to be entered. For example, the address of the drafter, address of the recipient, body and closing of the letter, and remaining components, may be entered on the paper at the appropriate location. Using the coded location information captured by the scanner, the words or other images forming the contents of the corresponding electronic document are incorporated at the appropriate locations.
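One way to picture the layout behavior described above: recognized content is routed into a document model keyed by the coded location at which it was written. The region table below is purely hypothetical.

    REGIONS = {                          # y-ranges on the coded page, illustrative
        "sender_address": (0, 100),
        "recipient_address": (100, 180),
        "body": (180, 700),
        "closing": (700, 800),
    }

    def region_for(y):
        for name, (top, bottom) in REGIONS.items():
            if top <= y < bottom:
                return name
        return "body"                    # fallback for out-of-range strokes

    document = {name: [] for name in REGIONS}
    for text, (x, y) in [("Dear Client,", (40, 200)), ("Sincerely,", (40, 720))]:
        document[region_for(y)].append(text)
    print(document["closing"])           # -> ['Sincerely,']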
  • Using detected location information, the input device may also interact with the host computing device for entering commands and making selections and the like. Where the computing device is a portable camera or a telephone with web browsing capabilities, the input device may be used in the manner of a stylus or a mouse to select from displayed buttons or menus. Therefore, the input device may be used to activate the browser of the host computer and to select options for retrieving a file, such as the previously described document, even one stored remotely. Using the input device, the user may select and download the file containing the needed information. Next, the user may enter annotations to the downloaded file or files via the input device. Those edits may be transmitted to the remote location from which the file was downloaded, where the input device is equipped to perform communications with remote computing devices. Alternatively, the edits may be used to edit the file stored within the input device and/or a host computing device, assuming the input device is in communication with the host computing device.
  • In another embodiment, the file displayed on the monitor of a host computing device may be a spreadsheet, generated using spreadsheet software such as EXCEL™. The location codes can be used to associate locations with given cells within the spreadsheet. The user may enter a numerical entry in the cell displayed on the screen. At that time, the input device captures images associated with the location of the input device and transmits that information to the host computing device. The processing software located in the host computing device, for example, and working in combination with the spreadsheet software, determines the identity of the cell selected for entry based on the detected location codes, and modifies the spreadsheet document contents accordingly.
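Determining the identity of the selected cell from a decoded location reduces to simple geometry once the spreadsheet's layout is known. The sketch below assumes a fixed grid; a real system would query the spreadsheet software for its actual cell geometry.

    COL_WIDTH, ROW_HEIGHT = 80, 20       # illustrative pixels per cell
    ORIGIN_X, ORIGIN_Y = 40, 100         # assumed top-left corner of cell A1

    def cell_at(x, y):
        """Map a decoded screen location to a spreadsheet cell name."""
        col = (x - ORIGIN_X) // COL_WIDTH
        row = (y - ORIGIN_Y) // ROW_HEIGHT
        if col < 0 or row < 0 or col > 25:
            return None                  # outside this sketch's A-Z grid
        return f"{chr(ord('A') + col)}{row + 1}"

    print(cell_at(130, 145))             # -> "B3"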
  • The input device may also be used to recall images or other prestored information associated with particular gestures or combinations of gestures. For example, the input device may be used to draw a symbol which the processing algorithms are programmed to recognize. The maze pattern may be used to accurately detect movement of the input device over the pattern so that a particular symbol associated with such movement may be detected. For example, the user may control the input device to draw a symbol on the paper previously identified by the user as being associated with the company logo. The maze pattern may identify a combination of movements corresponding to the letter “M” followed immediately by the letter “S” as an instruction to designate entry of a logo of the Microsoft Corporation. As a result, such prestored information may be entered within a document by entry of a sequence of previously defined inputs.
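The “M” then “S” example above amounts to a lookup from a recognized gesture sequence to prestored content. Assuming a gesture recognizer already exists, the expansion step might be no more than the following (the asset name is invented):

    SHORTCUTS = {
        ("M", "S"): "<image: company_logo>",   # hypothetical prestored asset
    }

    def expand(gesture_sequence, shortcuts=SHORTCUTS):
        """Return prestored content for a recognized sequence, if any."""
        return shortcuts.get(tuple(gesture_sequence))

    print(expand(["M", "S"]))                  # -> the stored logo reference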
  • B. Active Coding May Provide Location Information where the Host Computing Device Includes Sensors for Sensing Movement of the Input Device
  • The input device may also be used as a passive input device. In that mode, the input device may be used in combination with a computing device that senses movement of the input device using resistive sensing, for example. When used in combination with a device that includes a sensor board for detecting movement of an input device, such as a Tablet PC™ or personal data assistant, the input device may function in the manner of a stylus. Using the input device, electronic ink or other image data may be generated while the input device is positioned in very close proximity to the screen. Control functions may be entered in a similar manner. Additionally, the image displayed on the monitor of the particular computing device may also include data corresponding to a code that represents the relative position of that portion of the document. The location information extracted from the object using the camera may then be used to track movement of the input device, as a substitute for, or in combination with, movement detected using sensors of the computing device.
  • For example, a user may wish to create or modify an image on a portable computing device which already includes the ability to detect positioning of an input device, such as the Tablet PC™ or a personal data assistant. The input device may function solely as a passive input device, such that information representing movement of the input device is generated by the computing device. The sensors of the computing device, however, may not have the ability to detect movement of the pen at the range required by the user in a given situation. For example, accurate detection of user inputs may be hindered when the user is traveling in an unstable vehicle. As the user edits a file by moving the input device over the surface of the display of the computing device, the input device may be jostled and displaced a significant distance from the sensor board. Image data captured by the input device may be used to detect movement of the input device within a plane parallel to the surface of the computing device, even though the signals generated by the sensors of the computing device have become less accurate. Even if the sensors of the computing device are no longer capable of detecting movement of the input device, the image sensor of the input device may produce sufficient information to maintain an accurate representation of the movement of the input device to reflect the intended inputs of the user. Thus, even when used in combination with a computing device including the capability of sensing movement of the input device, the input device may function either as a passive input device or as an active input device.
  • II. The Input Device May Operate Using Passive Coding Techniques
  • A. Passive Coding Provides Location Information for Entering Input Information on a Display or Other Writing Surface
  • The input device may also be used in association with any paper, writing surface or other substrate, to record information corresponding to the handwriting of a user. Again, armed with only the input device and a writing surface, a user may employ the input device to draft a letter to a client. In this case, gesturing is detected on the basis of passive coding, wherein movements of the input device are detected by means other than codes embedded within the image of a surface of the substrate. For example, the user may draft the letter on a plain sheet of paper. As the user writes with the input device, the image sensor captures images of the paper. Objects within the images may be identified, and their displacement across the series of captured images is indicative of movement of the input device. Sensed objects may include artifacts or other objects on the surface of the paper, which may correspond to a watermark or other imperfection of the paper. Alternatively, the paper may include ruled lines which may also be used to calculate movement of the pen over the surface. Even in the absence of paper, relative movement of the input device may be determined. The input device could be moved over the surface of a desk, the grain of the wood providing the objects necessary for detecting relative movement of the input device. In a manner similar to that previously described, a user can draft a letter on paper, or on any surface over which movement can be detected optically. The movements of the input device may be stored in memory and/or converted into information representing those gestures.
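The frame-to-frame comparison this paragraph relies on can be illustrated with a brute-force search for the shift that best aligns two successive camera frames; real implementations would use faster correlation or feature tracking, and the frame data here is synthetic.

    import numpy as np

    def estimate_shift(prev, curr, max_shift=4):
        """Return the (dx, dy) that minimizes mean squared difference."""
        best, best_err = (0, 0), float("inf")
        for dy in range(-max_shift, max_shift + 1):
            for dx in range(-max_shift, max_shift + 1):
                shifted = np.roll(np.roll(curr, -dy, axis=0), -dx, axis=1)
                err = np.mean((prev.astype(float) - shifted) ** 2)
                if err < best_err:
                    best, best_err = (dx, dy), err
        return best

    rng = np.random.default_rng(0)
    frame = rng.random((32, 32))                 # e.g. paper grain texture
    moved = np.roll(np.roll(frame, 2, axis=0), 3, axis=1)  # pen moved by (3, 2)
    print(estimate_shift(frame, moved))          # -> (3, 2)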
  • In yet another embodiment, the input device may be used as a substitute for a portable computing device. For example, having just crafted a solution to the circuit failures associated with her company's pacemaker, but with no laptop or other computing device available, an engineer may turn to her input device as a suitable replacement for recording her thoughts as she travels by train to meet the rest of the design team. Making the most of the time available (and with the ink cartridge removed or the cap in place), on the back of the chair in front of her, she composes a sketch representing a modification to the suspect electrical circuit in question. She activates the input device, sets it in a mode conducive to generating a sketch (which may, for example, include deactivation of conversions), and begins sketching a simplified design representing a solution to the problem. The input device may then store the file representing the handwritten strokes. Switching out of the sketch mode, notations and references may be jotted down next to relevant portions of the sketch, and those entries incorporated within the image file. For example, the user may switch to a notation mode, in which gestures corresponding to letters are recognized. Thus, she may incorporate a description of her proposed solution along with the sketch. Rather than wait until reaching the medical research center, the operator may choose to transmit the schematic to the rest of the design team for full consideration prior to the scheduled meeting. Such transmission may be achieved in any number of ways, including uploading the revised document from the input device to a portable wireless device such as a cellular telephone. The information may then be used to generate an image file such as a VISIO™ document.
  • Once transmitted to the remaining members of the team, the previously described file corresponding to a sketch of a schematic may be displayed on the monitor of a team member's host computing device. For example, the image and accompanying text may be presented on the display of a desktop computer. By placing the input device in proximity to the image of the file displayed on the monitor, additional annotations may be added to those displayed. In that case, movement of the input device may be detected by measuring the relative movement of objects within images captured by the optical sensor of the input device. Signals generated by the input device may be processed by software stored within the input device, or transmitted to the host computing device for processing. Processing of the detected movement may generate electronic ink, text, or other data representing the notations entered via the input device.
  • B. Passive Coding May Provide Location Information where the Host Computing Device Includes Sensors for Sensing Movement of the Input Device
  • The input device may be used in conjunction with a computing device having sensors for detecting movement of the input device, even in the absence of location codes. For example, the input device may be used as a source for generating handwritten notes on a personal data assistant or other computing device designed for use with a stylus. Therefore, while running errands a user may be reminded of an item and wish to add it to an existing “to do” list. The user retrieves the list stored in a host computing device, such as a personal data assistant. Positioning the tip of the input device over the display of the personal data assistant, the user is able to traverse through menus and make selections to retrieve the desired list. Presented with this list, the user may input checks on the screen of the host device in empty boxes located next to descriptions of tasks already completed. The input device captures images of the screen including data corresponding to the box and transmits that data to the host computing device. Using a processing algorithm for analyzing image data, the host computing device then detects the shape of the box as an object for which an entry may be made. To successfully enter check marks, the image data may be processed to detect movement of the pen over and within the area of the box, the gestures forming the recognized shape of a “check.” The host device then modifies the file associated with the list to include a representation of a check within the box. Positioning the input device over the space following the last item in the list, the user enters text describing an additional item. The sensors of the host device detect movement of the input device and generate data representing those inputs. The inputs are converted into text and displayed to the user, along with an empty box.
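The check-mark recognition sketched below is one guess at the gesture test the paragraph describes: a stroke inside the box that goes down to a turning point and then back up, with the upstroke the longer leg. All thresholds and the coordinate convention (y increasing downward) are assumptions.

    def looks_like_check(stroke):
        """stroke: list of (x, y) with y increasing downward."""
        if len(stroke) < 3:
            return False
        ys = [y for _, y in stroke]
        turn = ys.index(max(ys))                 # lowest point of the stroke
        down, up = stroke[:turn + 1], stroke[turn:]
        goes_down = len(down) >= 2 and down[-1][1] > down[0][1]
        goes_up = len(up) >= 2 and up[-1][1] < up[0][1]
        return goes_down and goes_up and len(up) >= len(down)

    stroke = [(0, 0), (2, 3), (4, 6), (6, 4), (8, 1), (10, -2)]
    print(looks_like_check(stroke))              # -> True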
  • Similarly, a user of Microsoft Reader, for example, such as a student reading an assigned novel, may wish to jot down notes next to the relevant text. The image displayed on the monitor of the portable host device is annotated using the input device. For example, the user positions the input device over the monitor of the host computer, a Tablet PC™ for example, and enters handwritten notes next to the relevant text. The gestures are detected by the sensors of the host device and stored as electronic data which is converted into image data and displayed on the screen. The notes may remain in handwritten form or may be converted into alphanumeric characters. The notes may not be seen without actuation of additional functions, such as activating viewing of appended comments or positioning the input device over highlighting or some other indication that annotations are present. The notes may then be stored in a separate file, or stored with a copy of the electronic version of the novel stored within the host computer.
  • III. Additional Sensors May Produce Additional Information Indicative of the Relative Position of the Input Device
  • In yet another embodiment, information from additional sensors forming part of the input device may be used to supplement or completely replace other forms of movement detection. Such additional sensors may detect linear acceleration of the input device, angular acceleration, velocity, rotation, depressive force, tilt, changes in electromagnetic fields or any other sensed indication of movement or positioning of the input device. Such information may aid in the effort to produce more accurate movement detection. Alternatively, the additional sensors may provide the only information available at a given time. For example, the input device may be used in conjunction with a generally uniform surface, such as blank paper. In such cases, the image captured by the optical sensor may provide insufficient information to consistently and accurately detect movement of the input device. If optical motion detection becomes more difficult, such as when objects for tracking movement of the input device become harder to detect, additional information from the additional sensors may be used to provide more refined motion detection. Specifically, the algorithm or algorithms used to determine position and/or movement may incorporate calculations that factor in the additional information, thereby supplementing movement and/or location detection when optical motion detection degrades.
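A common way to blend the optical and inertial estimates the paragraph describes is a complementary filter: weight the optical displacement by a confidence value and fall back toward the inertially derived displacement as that confidence drops. The weighting scheme below is illustrative, not the patent's algorithm.

    def fuse(optical_dx, inertial_dx, optical_confidence):
        """Blend two per-frame displacement estimates for one axis."""
        w = max(0.0, min(1.0, optical_confidence))   # clamp to [0, 1]
        return w * optical_dx + (1.0 - w) * inertial_dx

    # Textured paper: the camera dominates.
    print(fuse(1.9, 2.4, optical_confidence=0.9))    # -> 1.95
    # Blank countertop: the inertial estimate takes over.
    print(fuse(0.0, 2.4, optical_confidence=0.1))    # -> 2.16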
  • If optical detection fails to provide useful results, then the additional sensors may provide the only information with which to detect movement. For example, if the user attempts to sketch out a drawing on the uniform white surface of a laminated countertop, the optical sensing system may fail to capture sufficient data representative of movement. In that case, the additional sensors may provide sufficient information to generate an acceptably accurate representation of input information.
  • For example, if the input device moves a sufficient distance from the surface being scanned, the optical sensor unit may not capture an accurate representation of the image provided. In that case, additional information from the additional sensors may be used to complement data obtained from the image of the object over which the input device is moved. Thus, even if the input device moves an inch or more from the display over which it is being moved (along the “z-axis”), sensors within the input device may provide an indication of movement of the pen within the plane of the display, i.e., in the horizontal and vertical directions.
  • For example, an input device used in conjunction with a laptop computer is positioned on the tray table in front of the user. An image of a document, with a maze pattern incorporated into the background, is displayed on the screen of the laptop. Annotations entered by the user are shown in solid blue ink. The seat belt sign comes on as the airplane experiences turbulence. As the user reaches over the keyboard of the laptop computer and adds another word to the annotation, his hand quickly drifts away from the surface of the screen. Although the image sensor may not accurately detect the lines forming the maze pattern displayed, movement along the x and y axes is measured by the additional sensors incorporated within the input device.
  • The Impact of the Input Device on the Life of a Document
  • FIG. 8 illustrates uses of an input device in accordance with several illustrative embodiments of the present invention, as a document is created, transmitted and edited using an input device in various environments. The following description is merely an illustration of uses of the input device and is not intended to limit the structure or functionality of the present invention.
  • The input device may be used to extend the life of a document by allowing the creation and/or editing of documents in a wide range of environments and for use in association with numerous devices. Using an input device 801, a document 802 may be electronically created on the screen of one computing device, such as Tablet PC 803 illustrated. For example, the input device may be used to generate a handwritten draft of a document. Electronic ink corresponding to the information entered on the screen of the Tablet PC 803 is generated as the input device functions as a stylus for the Tablet PC 803. The electronic ink may be converted into text form and stored in the Tablet PC 803.
  • The electronic file representing the document may be transmitted to a second computing device, such as desktop PC 804. In that environment, the document may be edited on the screen of the desktop device with the input device operating as an independent input unit. Because the input device senses its relative location within the displayed image of the document, edits entered on the screen of the desktop device may be reflected in the electronic document, even if the display does not include elements for sensing positioning of the input device. The edits generated using the input device may be transmitted to the desktop PC 804 as they are generated, or may be stored within the input device for transmission to any PC at a later time. The edits may be entered into the version of the document stored in the desktop PC 804.
  • The document created may also be output in hard-copy form by a printer, such as printer 805 linked to the desktop PC 804. The hard-copy 806 version of the document may include information or codes designating the relative location of the input device at any location in the document, using a maze pattern, for example. The hard-copy may be marked up by one or more users, each having an input device, with the edits of each user generated by the separate input devices. Along with information representing edits, information identifying the pen used to generate those edits may be provided as well. For example, the inputs may be reflected using underlined colored text such as that found in applications for tracking changes made to documents. The edits/inputs may be forwarded from the desktop PC 804 to the Tablet PC 803 for incorporation into that document. Alternatively, the edits may be stored within the input device and uploaded at a later time.
  • The document may also be output on plain paper, or on any substrate not including indications of relative positioning of the input device. Again, the hard-copy may be marked up by one or more users having an input device, and the edits of each user generated by the input device. In this example, position or movement of the pen may be determined using passive coding techniques for optically sensing movement of the input device over the paper. As noted, location/movement may be determined using a comparison algorithm in which the relative positions of objects within each frame of image data are detected and used to determine movement of the input device. The resulting edits may be transmitted to the computing device in which the document originated, for example, for updating of the original data file. The edits may be transmitted through a computing device, such as the Pocket PC 807, to the destination device either through wireless or wired communication, or upon docking the device containing the edits in the computing device.
  • The electronic document may also be transmitted to a second computing device, such as the Tablet PC illustrated. In that environment, the document may be edited on the screen of the tablet device using the input device as a simple stylus. Those inputs may be forwarded from the Tablet PC to the computing device storing the original copy of the document as annotations to the document or as edits for incorporation into that document, for example.
  • Additional Components
  • While the description above and accompanying figures depict embodiments utilizing specific components, the addition of components and/or removal of any component depicted is within the scope of the present invention. Similarly, the relocation of various components within the input device structure may be implemented without greatly impacting the accuracy with which the camera or the inertial sensors detect movement of the pen and produce electronic ink. For example, the image sensor may be replaced by or supplemented with a sensing device for detecting properties of the surface or object over which the input device may be moved. Thus, the maze pattern may be formed on the surface of an object such that the pattern can be detected based on the radiation of energy outside the visible light spectrum, the reflectance of such energy transmitted to the object, or other such sensing techniques. Any property of the surface may be sensed and used to determine position and/or movement of the input device over the surface of an object. As a further example, a microphone sensing system may be employed such that the microphone detects acoustic reflections or emissions from the object over which the input device is positioned.
  • The illustrative embodiments described and illustrated above have described an input device implemented in the shape of a pen. Aspects of the present invention are applicable, however, to input devices of any number of shapes and sizes.
  • Use of such an input device should enable personal computing in any location. Thus, users equipped with the described input device may generate or edit data files regardless of where they may be. Documents and other information may be generated, edited or recorded in an office setting, in a classroom, in a hotel, while in transit, or even on the beach.
  • As noted, the input device may include a suitable display. Alternatively, the display of a host computing device may be used to review documents and images created. The user may select formatting of the document before or after the information, such as text, is input, or may review the document and make changes to the format of the document. Viewing the document created on such a display, in the context of the above example, the user may insert a header including his or her address in the appropriate location.
  • Although the invention has been defined using the appended claims, these claims are illustrative in that the invention is intended to include the elements and steps described herein in any combination or subcombination. Accordingly, there are any number of alternative combinations for defining the invention, which incorporate one or more elements from the specification, including the description, claims, and drawings, in various combinations or subcombinations. It will be apparent to those skilled in the relevant technology, in light of the present specification, that alternate combinations of aspects of the invention, either alone or in combination with one or more elements or steps defined herein, may be utilized as modifications or alterations of the invention or as part of the invention. The written description of the invention contained herein is intended to cover all such modifications and alterations. For instance, in various embodiments, a certain order of the data has been shown. However, any reordering of the data is encompassed by the present invention. Also, where certain units of properties such as size (e.g., in bytes or bits) are used, any other units are also envisioned.

Claims (20)

1. An input device for generating data representative of hand written strokes, the input device comprising:
an image capturing unit for capturing an image of an area of an object over which the input device is positioned and generating captured image data, said data representative of hand written strokes being determined from location information extracted from said captured image data;
a processor processing the captured image data; and
a memory storing data,
wherein the object comprises a display of a computing device including an image pattern providing location information of an area on the display.
2. An input device according to claim 1, wherein the image of the area over which the input device is positioned includes image data representative of a location of the area of the object.
3. An input device according to claim 2, wherein the image data representative of a location of the area of the object includes an image pattern representative of the location of the area of the object.
4. An input device according to claim 1, wherein the image data representative of a location of the area of the object includes a coded reference representative of the location of the area of the object.
5. An input device according to claim 1, wherein the processor processes the captured image data and creates an image file representing handwritten inputs.
6. An input device according to claim 1, wherein the image capturing unit captures multiple images of multiple areas of the object, and
the processor processes the captured images, detects data within the captured images representative of the location of the area over which the input device is located, and processes the detected data to generate information representative of user inputs.
7. An input device according to claim 6, wherein the data within the captured images representative of a location of the area of the object includes an image pattern representative of each location of each area of the object.
8. An input device according to claim 6, wherein the data within the captured images representative of a location of the area of the object includes a portion of a maze-like pattern.
9. An input device according to claim 6, wherein the data within the captured images representative of a location of the area of the object includes a coded reference representative of the location of the area of the object.
10. An input device according to claim 6, wherein the data within the captured images representative of a location of the area of the object includes a feature of the object appearing in more than one captured image.
11. An input device according to claim 10, wherein the processor performs a comparison of the locations of a feature of the object in multiple captured images to determine movement of the input device over the object.
12. An input device according to claim 1, wherein the image pattern representative of the location of an area within the display is generated as part of the image produced on the display of the computing device.
13. An input device according to claim 1, wherein the image pattern representative of the location of an area within the display is formed in the structure of the display.
14. An input device according to claim 1, wherein the object comprises a writing surface.
15. An input device according to claim 14, wherein the writing surface includes an image pattern representative of the location of an area within the writing surface.
16. An input device according to claim 1, wherein the object comprises a non-uniform reflective surface.
17. An input device according to claim 1, the input device further including an inertial sensor for generating supplemental data representing movement of the input device.
18. An input device according to claim 17, wherein the processor processes the captured image data and the supplemental data and creates an image file representing handwritten inputs.
19. An input device according to claim 1, the input device further including a communication unit transmitting data representing movement of the input device to an external processing unit for generating signals representative of handwritten inputs.
20. An input device according to claim 7, the input device further including a communication unit transmitting data representative of handwritten inputs to an external processing unit.
US11/329,060 2002-10-31 2006-01-11 Universal computing device Abandoned US20060109263A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/329,060 US20060109263A1 (en) 2002-10-31 2006-01-11 Universal computing device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/284,417 US7009594B2 (en) 2002-10-31 2002-10-31 Universal computing device
US11/329,060 US20060109263A1 (en) 2002-10-31 2006-01-11 Universal computing device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/284,417 Continuation US7009594B2 (en) 2002-10-31 2002-10-31 Universal computing device

Publications (1)

Publication Number Publication Date
US20060109263A1 true US20060109263A1 (en) 2006-05-25

Family

ID=32093521

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/284,417 Expired - Fee Related US7009594B2 (en) 2002-10-31 2002-10-31 Universal computing device
US11/329,060 Abandoned US20060109263A1 (en) 2002-10-31 2006-01-11 Universal computing device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/284,417 Expired - Fee Related US7009594B2 (en) 2002-10-31 2002-10-31 Universal computing device

Country Status (6)

Country Link
US (2) US7009594B2 (en)
EP (1) EP1416423A3 (en)
JP (1) JP2004164609A (en)
KR (1) KR101026630B1 (en)
CN (1) CN1320506C (en)
BR (1) BR0304250A (en)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030231189A1 (en) * 2002-05-31 2003-12-18 Microsoft Corporation Altering a display on a viewing device based upon a user controlled orientation of the viewing device
US20060012581A1 (en) * 2004-07-15 2006-01-19 N-Trig Ltd. Tracking window for a digitizer system
US20060123049A1 (en) * 2004-12-03 2006-06-08 Microsoft Corporation Local metadata embedding solution
US20060182343A1 (en) * 2005-02-17 2006-08-17 Microsoft Digital pen calibration by local linearization
US20060190818A1 (en) * 2005-02-18 2006-08-24 Microsoft Corporation Embedded interaction code document
US20070001950A1 (en) * 2005-06-30 2007-01-04 Microsoft Corporation Embedding a pattern design onto a liquid crystal display
US20070003150A1 (en) * 2005-06-30 2007-01-04 Microsoft Corporation Embedded interaction code decoding for a liquid crystal display
US20070139399A1 (en) * 2005-11-23 2007-06-21 Quietso Technologies, Llc Systems and methods for enabling tablet PC/pen to paper space
US20090027354A1 (en) * 2004-07-15 2009-01-29 N-Trig Ltd. Automatic switching for a dual mode digitizer
US20090073144A1 (en) * 2007-09-18 2009-03-19 Acer Incorporated Input apparatus with multi-mode switching function
US20090119573A1 (en) * 2005-04-22 2009-05-07 Microsoft Corporation Global metadata embedding and decoding
US7532366B1 (en) 2005-02-25 2009-05-12 Microsoft Corporation Embedded interaction code printing with Microsoft Office documents
US20090233593A1 (en) * 2008-03-12 2009-09-17 Dish Network L.L.C. Apparatus and methods for controlling an entertainment device using a mobile communication device
US20090233715A1 (en) * 2008-03-12 2009-09-17 Dish Network L.L.C. Apparatus and methods for authenticating a user of an entertainment device using a mobile communication device
US7684618B2 (en) 2002-10-31 2010-03-23 Microsoft Corporation Passive embedded interaction coding
US7729539B2 (en) 2005-05-31 2010-06-01 Microsoft Corporation Fast error-correcting of embedded interaction codes
US7817816B2 (en) 2005-08-17 2010-10-19 Microsoft Corporation Embedded interaction code enabled surface type identification
US7826074B1 (en) 2005-02-25 2010-11-02 Microsoft Corporation Fast embedded interaction code printing with custom postscript commands
US7920753B2 (en) 2005-05-25 2011-04-05 Microsoft Corporation Preprocessing for information pattern analysis
US20120019487A1 (en) * 2009-01-14 2012-01-26 Takashi Kazamaki Electronic device and information processing method

Families Citing this family (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7113171B2 (en) * 1997-06-10 2006-09-26 Mark Vayda Universal input device
US8279169B2 (en) * 1997-06-10 2012-10-02 Mark Vayda Universal input device and system
SE0200419L (en) * 2002-02-12 2003-08-13 Anoto Ab Electronic pen and sensor arrangement and control device for such
US7009594B2 (en) * 2002-10-31 2006-03-07 Microsoft Corporation Universal computing device
US7262764B2 (en) * 2002-10-31 2007-08-28 Microsoft Corporation Universal computing device for surface applications
US7289105B2 (en) * 2003-06-04 2007-10-30 Vrbia, Inc. Real motion detection sampling and recording for tracking and writing instruments using electrically-active viscous material and thin films
US20050024346A1 (en) * 2003-07-30 2005-02-03 Jean-Luc Dupraz Digital pen function control
JPWO2005029380A1 (en) * 2003-09-17 2008-06-12 アステラス製薬株式会社 Drug survey information collection system
US20050125717A1 (en) * 2003-10-29 2005-06-09 Tsakhi Segal System and method for off-line synchronized capturing and reviewing notes and presentations
US7583842B2 (en) * 2004-01-06 2009-09-01 Microsoft Corporation Enhanced approach of m-array decoding and error correction
US7263224B2 (en) * 2004-01-16 2007-08-28 Microsoft Corporation Strokes localization by m-array decoding and fast image matching
FR2866458B1 (en) * 2004-02-12 2006-07-14 Commissariat Energie Atomique METHOD FOR RECOGNIZING THE TRACK OF A TIP OF A BODY ON A SUPPORT
GB2412153A (en) * 2004-03-20 2005-09-21 Hewlett Packard Development Co Digital pen with a memory tag reader/writer
US20050289453A1 (en) * 2004-06-21 2005-12-29 Tsakhi Segal Apparatys and method for off-line synchronized capturing and reviewing notes and presentations
US7656395B2 (en) * 2004-07-15 2010-02-02 Microsoft Corporation Methods and apparatuses for compound tracking systems
JP2008508621A (en) * 2004-08-03 2008-03-21 シルバーブルック リサーチ ピーティワイ リミテッド Walk-up printing
US7349554B2 (en) * 2004-09-02 2008-03-25 Microsoft Corporation Maze pattern analysis
US20060215913A1 (en) * 2005-03-24 2006-09-28 Microsoft Corporation Maze pattern analysis with image matching
US20060242562A1 (en) * 2005-04-22 2006-10-26 Microsoft Corporation Embedded method for embedded interaction code array
US7580576B2 (en) * 2005-06-02 2009-08-25 Microsoft Corporation Stroke localization and binding to electronic document
KR100724939B1 (en) 2005-06-20 2007-06-04 삼성전자주식회사 Method for implementing user interface using camera module and mobile communication terminal therefor
JP4770614B2 (en) * 2006-07-11 2011-09-14 株式会社日立製作所 Document management system and document management method
US8330773B2 (en) * 2006-11-21 2012-12-11 Microsoft Corporation Mobile data and handwriting screen capture and forwarding
JP5060798B2 (en) * 2007-02-23 2012-10-31 任天堂株式会社 Information processing program and information processing apparatus
US8040320B2 (en) * 2007-11-05 2011-10-18 Eldad Shemesh Input device and method of operation thereof
JP5130930B2 (en) * 2008-01-31 2013-01-30 富士ゼロックス株式会社 Electronic writing instrument
US8243028B2 (en) * 2008-06-13 2012-08-14 Polyvision Corporation Eraser assemblies and methods of manufacturing same
EP2219100A1 (en) * 2009-02-12 2010-08-18 Siemens Aktiengesellschaft Electronic operational pen for an operating device with a touch screen
JP2011113191A (en) * 2009-11-25 2011-06-09 Seiko Epson Corp Information processing device and information processing system
KR101669618B1 (en) * 2010-01-15 2016-10-26 삼성전자주식회사 Display apparatus and display method thereof
EP2756370A1 (en) * 2011-08-29 2014-07-23 Stefan Valícek Multifunctional pencil input peripheral computer controller
US9690877B1 (en) * 2011-09-26 2017-06-27 Tal Lavian Systems and methods for electronic communications
IN2014MN01629A (en) * 2012-02-29 2015-05-08 Qualcomm Inc
TW201423497A (en) * 2012-12-12 2014-06-16 Hon Hai Prec Ind Co Ltd Digital pen and digital writing module
CN103870019A (en) * 2012-12-17 2014-06-18 鸿富锦精密工业(深圳)有限公司 Digital pen and digital writing module
EP3079052A4 (en) 2014-12-18 2017-08-16 Wacom Co., Ltd. Digital ink generating device, digital ink generating method, and digital ink reproduction device
KR102312888B1 (en) * 2015-05-13 2021-10-15 삼성전자주식회사 Input apparatus, electronic device having the same and control method thereof
CZ306281B6 (en) * 2016-02-25 2016-11-09 O.Pen S.R.O. Wireless positioning pen with pressure tip
CN105607766B (en) * 2016-03-15 2017-12-22 深圳市华鼎星科技有限公司 A kind of adjustable capacitance pressure transducer and true person's handwriting stylus
DE202016103403U1 (en) * 2016-06-28 2017-09-29 Stabilo International Gmbh Spring-loaded battery contact with sensor protection
CN110895452B (en) * 2019-03-25 2020-07-10 广西北部湾在线投资控股有限公司 State detection platform based on cloud server
US11157099B2 (en) * 2019-08-05 2021-10-26 Adx Research, Inc. Electronic writing device and a method for operating the same
TWM607724U (en) * 2020-11-03 2021-02-11 吳榮 Multi-use auxiliary device

Citations (83)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4561105A (en) * 1983-01-19 1985-12-24 Communication Intelligence Corporation Complex pattern recognition method and system
US4829583A (en) * 1985-06-03 1989-05-09 Sino Business Machines, Inc. Method and apparatus for processing ideographic characters
US5247137A (en) * 1991-10-25 1993-09-21 Mark Epperson Autonomous computer input device and marking instrument
US5280289A (en) * 1992-04-23 1994-01-18 Hughes Aircraft Company Automatic signal thresholding system
US5294792A (en) * 1991-12-31 1994-03-15 Texas Instruments Incorporated Writing tip position sensing and processing apparatus
US5406479A (en) * 1993-12-20 1995-04-11 Imatron, Inc. Method for rebinning and for correcting cone beam error in a fan beam computed tomographic scanner system
US5414538A (en) * 1993-10-07 1995-05-09 Xerox Corporation Image-dependent exposure enhancement
US5587558A (en) * 1992-01-24 1996-12-24 Seiko Instruments Inc. Coordinate detecting apparatus having acceleration detectors
US5644652A (en) * 1993-11-23 1997-07-01 International Business Machines Corporation System and method for automatic handwriting recognition with a writer-independent chirographic label alphabet
US5661506A (en) * 1994-11-10 1997-08-26 Sia Technology Corporation Pen and paper information recording system using an imaging pen
US5686718A (en) * 1995-03-15 1997-11-11 Sharp Kabushiki Kaisha Recording method, decoding method, and decoding apparatus for digital information
US5748808A (en) * 1994-07-13 1998-05-05 Yashima Electric Co., Ltd. Image reproducing method and apparatus capable of storing and reproducing handwriting
US5754280A (en) * 1995-05-23 1998-05-19 Olympus Optical Co., Ltd. Two-dimensional rangefinding sensor
US5756981A (en) * 1992-02-27 1998-05-26 Symbol Technologies, Inc. Optical scanner for reading and decoding one- and-two-dimensional symbologies at variable depths of field including memory efficient high speed image processing means and high accuracy image analysis means
US5774602A (en) * 1994-07-13 1998-06-30 Yashima Electric Co., Ltd. Writing device for storing handwriting
US5822465A (en) * 1992-09-01 1998-10-13 Apple Computer, Inc. Image encoding by vector quantization of regions of an image and codebook updates
US5855483A (en) * 1994-11-21 1999-01-05 Compaq Computer Corp. Interactive play with a computer
US5898166A (en) * 1995-05-23 1999-04-27 Olympus Optical Co., Ltd. Information reproduction system which utilizes physical information on an optically-readable code and which optically reads the code to reproduce multimedia information
US5902968A (en) * 1996-02-20 1999-05-11 Ricoh Company, Ltd. Pen-shaped handwriting input apparatus using accelerometers and gyroscopes and an associated operational device for determining pen movement
US5926567A (en) * 1995-03-01 1999-07-20 Compaq Computer Corporation Method and apparatus for storing and rapidly displaying graphic data
US5960124A (en) * 1994-07-13 1999-09-28 Yashima Electric Co., Ltd. Image reproducing method for reproducing handwriting
US5995084A (en) * 1997-01-17 1999-11-30 Tritech Microelectronics, Ltd. Touchpad pen-input and mouse controller
US6005973A (en) * 1993-12-01 1999-12-21 Motorola, Inc. Combined dictionary based and likely character string method of handwriting recognition
US6014462A (en) * 1996-07-08 2000-01-11 Ricoh Company, Ltd. Image processing apparatus and method for distinguishing alphanumeric symbols on a white background and those on a mesh pattern and individually processing such image data
US6081261A (en) * 1995-11-01 2000-06-27 Ricoh Corporation Manual entry interactive paper and electronic document handling and processing system
US6141014A (en) * 1995-04-20 2000-10-31 Hitachi, Ltd. Bird's-eye view forming method, map display apparatus and navigation system
US6157935A (en) * 1996-12-17 2000-12-05 Tran; Bao Q. Remote data access and management system
US6173084B1 (en) * 1997-06-06 2001-01-09 U.S. Philips Corporation Noise reduction in an image
US6181329B1 (en) * 1997-12-23 2001-01-30 Ricoh Company, Ltd. Method and apparatus for tracking a hand-held writing instrument with multiple sensors that are calibrated by placing the writing instrument in predetermined positions with respect to the writing surface
US6202060B1 (en) * 1996-10-29 2001-03-13 Bao Q. Tran Data management system
US6226636B1 (en) * 1998-11-20 2001-05-01 Philips Electronics North America Corp. System for retrieving images using a database
US6243071B1 (en) * 1993-11-03 2001-06-05 Apple Computer, Inc. Tool set for navigating through an electronic book
US6278968B1 (en) * 1999-01-29 2001-08-21 Sony Corporation Method and apparatus for adaptive speech recognition hypothesis construction and selection in a spoken language translation system
US20010024193A1 (en) * 1999-12-23 2001-09-27 Christer Fahraeus Written command
US6335727B1 (en) * 1993-03-12 2002-01-01 Kabushiki Kaisha Toshiba Information input device, position information holding device, and position recognizing system including them
US20020024499A1 (en) * 1998-03-27 2002-02-28 International Business Machines Corporation Flexibly interfaceable portable computing device
US20020148655A1 (en) * 2001-04-12 2002-10-17 Samsung Electronics Co., Ltd. Electronic pen input device and coordinate detecting method therefor
US20020163511A1 (en) * 2000-11-29 2002-11-07 Sekendur Oral Faith Optical position determination on any surface
US20020163510A1 (en) * 2001-05-04 2002-11-07 Microsoft Corporation Method of generating digital ink thickness information
US20030063045A1 (en) * 2001-10-02 2003-04-03 Harris Corporation Pen cartridge that transmits acceleration signal for recreating handwritten signatures and communications
US20030063072A1 (en) * 2000-04-04 2003-04-03 Brandenberg Carl Brock Method and apparatus for scheduling presentation of digital content on a personal communication device
US6577299B1 (en) * 1998-08-18 2003-06-10 Digital Ink, Inc. Electronic portable pen apparatus and method
US20030118233A1 (en) * 2001-11-20 2003-06-26 Andreas Olsson Method and device for identifying objects in digital images
US6585154B1 (en) * 2000-08-03 2003-07-01 Yaakov Ostrover System, method and devices for documents with electronic copies attached thereto
US20030146883A1 (en) * 1997-08-28 2003-08-07 Visualabs Inc. 3-D imaging system
US6651894B2 (en) * 2000-12-12 2003-11-25 Ricoh Company, Ltd. Imaging method, imaging apparatus, and image information management system
US20040032393A1 (en) * 2001-04-04 2004-02-19 Brandenberg Carl Brock Method and apparatus for scheduling presentation of digital content on a personal communication device
US20040046744A1 (en) * 1999-11-04 2004-03-11 Canesta, Inc. Method and apparatus for entering data using a virtual input device
US6731271B1 (en) * 1999-03-19 2004-05-04 Canon Kabushiki Kaisha Coordinate input device and its control method, and computer readable memory
US20040085286A1 (en) * 2002-10-31 2004-05-06 Microsoft Corporation Universal computing device
US6744967B2 (en) * 2001-12-20 2004-06-01 Scientific-Atlanta, Inc. Program position user interface for personal video recording time shift buffer
US6752317B2 (en) * 1998-04-01 2004-06-22 Xerox Corporation Marking medium area with encoded identifier for producing action through network
US20040128511A1 (en) * 2000-12-20 2004-07-01 Qibin Sun Methods and systems for generating multimedia signature
US20040136083A1 (en) * 2002-10-31 2004-07-15 Microsoft Corporation Optical system design for a universal computing device
US20040143559A1 (en) * 2003-01-17 2004-07-22 Ayala Francisco J. System and method for developing artificial intelligence
US20040140964A1 (en) * 2002-10-31 2004-07-22 Microsoft Corporation Universal computing device for surface applications
US20040140965A1 (en) * 2002-10-31 2004-07-22 Microsoft Corporation Universal computing device
US20040153649A1 (en) * 1995-07-27 2004-08-05 Rhoads Geoffrey B. Digital authentication with digital and analog documents
US6847356B1 (en) * 1999-08-13 2005-01-25 Canon Kabushiki Kaisha Coordinate input device and its control method, and computer readable memory
US20050024324A1 (en) * 2000-02-11 2005-02-03 Carlo Tomasi Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
US20050052469A1 (en) * 1999-12-16 2005-03-10 Matt Crosby Method and apparatus for rendering a low-resolution thumbnail image suitable for a low resolution display having a reference back to an original digital negative and an edit list of operations
US6874420B2 (en) * 1999-10-22 2005-04-05 Cc1, Inc. System and method for register mark recognition
US6880755B2 (en) * 1999-12-06 2005-04-19 Xerox Coporation Method and apparatus for display of spatially registered information using embedded data
US6919892B1 (en) * 2002-08-14 2005-07-19 Avaworks, Incorporated Photo realistic talking head creation system and method
US6935562B2 (en) * 1999-12-06 2005-08-30 Xerox Corporation Operations on images having glyph carpets
US6970183B1 (en) * 2000-06-14 2005-11-29 E-Watch, Inc. Multimedia surveillance and monitoring system including network configuration
US6987534B1 (en) * 1999-08-30 2006-01-17 Fuji Jukogyo Kabushiki Kaisha Brightness adjusting apparatus for stereoscopic camera
US6993185B2 (en) * 2002-08-30 2006-01-31 Matsushita Electric Industrial Co., Ltd. Method of texture-based color document segmentation
US7023426B1 (en) * 2002-09-16 2006-04-04 Hewlett-Packard Development Company, L.P. User input device
US20060082557A1 (en) * 2000-04-05 2006-04-20 Anoto Ip Lic Hb Combined detection of position-coding pattern and bar codes
US20060125805A1 (en) * 2004-03-17 2006-06-15 James Marggraff Method and system for conducting a transaction using recognized text
US20060267965A1 (en) * 2005-05-25 2006-11-30 Advanced Digital Systems, Inc. System and method for associating handwritten information with one or more objects via discontinuous regions of a printed pattern
US7440583B2 (en) * 2003-04-25 2008-10-21 Oki Electric Industry Co., Ltd. Watermark information detection method
US7505982B2 (en) * 2004-12-03 2009-03-17 Microsoft Corporation Local metadata embedding solution
US20090110308A1 (en) * 2002-10-31 2009-04-30 Microsoft Corporation Decoding and error correction in 2-d arrays
US7528848B2 (en) * 2005-06-30 2009-05-05 Microsoft Corporation Embedded interaction code decoding for a liquid crystal display
US20090119573A1 (en) * 2005-04-22 2009-05-07 Microsoft Corporation Global metadata embedding and decoding
US7532366B1 (en) * 2005-02-25 2009-05-12 Microsoft Corporation Embedded interaction code printing with Microsoft Office documents
US7536051B2 (en) * 2005-02-17 2009-05-19 Microsoft Corporation Digital pen calibration by local linearization
US7542976B2 (en) * 2005-04-22 2009-06-02 Microsoft Corporation Local metadata embedding and decoding
US7570813B2 (en) * 2004-01-16 2009-08-04 Microsoft Corporation Strokes localization by m-array decoding and fast image matching
US7580576B2 (en) * 2005-06-02 2009-08-25 Microsoft Corporation Stroke localization and binding to electronic document
US7583842B2 (en) * 2004-01-06 2009-09-01 Microsoft Corporation Enhanced approach of m-array decoding and error correction

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5051736A (en) 1989-06-28 1991-09-24 International Business Machines Corporation Optical stylus and passive digitizing tablet data input system
DE69523024T2 (en) 1994-12-16 2002-03-14 Hyundai Electronics America Digitizing pen and operating procedures
JPH08255233A (en) 1995-03-16 1996-10-01 Toshiba Corp Bar code encoding system
JP3671517B2 (en) * 1996-04-24 2005-07-13 ユニマテック株式会社 Fluorine-containing copolymer elastomer, production method and composition thereof
JPH10256921A (en) 1997-03-13 1998-09-25 Olympus Optical Co Ltd Method and device for modulating and demodulating digital data
JP3186643B2 (en) * 1997-05-08 2001-07-11 日本電気株式会社 Wireless device comprising a charger and a charger and a portable wireless device
SE517445C2 (en) 1999-10-01 2002-06-04 Anoto Ab Position determination on a surface provided with a position coding pattern
MXPA02004131A (en) * 1999-10-25 2004-04-02 Silverbrook Res Pty Ltd Electronically controllable pen with code sensor.
KR100752817B1 (en) 1999-12-23 2007-08-29 아노토 아베 General information management system
SE0000951L (en) 2000-03-21 2001-09-22 Anoto Ab Device and method for spatial relationship determination
US6751352B1 (en) 2000-05-25 2004-06-15 Hewlett-Packard Development Company, L.P. Method and apparatus for generating and decoding a visually significant barcode
US6681060B2 (en) 2001-03-23 2004-01-20 Intel Corporation Image retrieval using distance measure
JP4102105B2 (en) 2002-05-24 2008-06-18 株式会社日立製作所 Document entry system using electronic pen
TWI285868B (en) * 2003-01-20 2007-08-21 Ind Tech Res Inst Method and apparatus to enhance response time of display

Patent Citations (88)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4561105A (en) * 1983-01-19 1985-12-24 Communication Intelligence Corporation Complex pattern recognition method and system
US4829583A (en) * 1985-06-03 1989-05-09 Sino Business Machines, Inc. Method and apparatus for processing ideographic characters
US5247137A (en) * 1991-10-25 1993-09-21 Mark Epperson Autonomous computer input device and marking instrument
US5294792A (en) * 1991-12-31 1994-03-15 Texas Instruments Incorporated Writing tip position sensing and processing apparatus
US5587558A (en) * 1992-01-24 1996-12-24 Seiko Instruments Inc. Coordinate detecting apparatus having acceleration detectors
US5756981A (en) * 1992-02-27 1998-05-26 Symbol Technologies, Inc. Optical scanner for reading and decoding one- and-two-dimensional symbologies at variable depths of field including memory efficient high speed image processing means and high accuracy image analysis means
US5280289A (en) * 1992-04-23 1994-01-18 Hughes Aircraft Company Automatic signal thresholding system
US5822465A (en) * 1992-09-01 1998-10-13 Apple Computer, Inc. Image encoding by vector quantization of regions of an image and codebook updates
US6335727B1 (en) * 1993-03-12 2002-01-01 Kabushiki Kaisha Toshiba Information input device, position information holding device, and position recognizing system including them
US5414538A (en) * 1993-10-07 1995-05-09 Xerox Corporation Image-dependent exposure enhancement
US6243071B1 (en) * 1993-11-03 2001-06-05 Apple Computer, Inc. Tool set for navigating through an electronic book
US5644652A (en) * 1993-11-23 1997-07-01 International Business Machines Corporation System and method for automatic handwriting recognition with a writer-independent chirographic label alphabet
US6005973A (en) * 1993-12-01 1999-12-21 Motorola, Inc. Combined dictionary based and likely character string method of handwriting recognition
US5406479A (en) * 1993-12-20 1995-04-11 Imatron, Inc. Method for rebinning and for correcting cone beam error in a fan beam computed tomographic scanner system
US5748808A (en) * 1994-07-13 1998-05-05 Yashima Electric Co., Ltd. Image reproducing method and apparatus capable of storing and reproducing handwriting
US5960124A (en) * 1994-07-13 1999-09-28 Yashima Electric Co., Ltd. Image reproducing method for reproducing handwriting
US5774602A (en) * 1994-07-13 1998-06-30 Yashima Electric Co., Ltd. Writing device for storing handwriting
US5661506A (en) * 1994-11-10 1997-08-26 Sia Technology Corporation Pen and paper information recording system using an imaging pen
US5855483A (en) * 1994-11-21 1999-01-05 Compaq Computer Corp. Interactive play with a computer
US5926567A (en) * 1995-03-01 1999-07-20 Compaq Computer Corporation Method and apparatus for storing and rapidly displaying graphic data
US5686718A (en) * 1995-03-15 1997-11-11 Sharp Kabushiki Kaisha Recording method, decoding method, and decoding apparatus for digital information
US6141014A (en) * 1995-04-20 2000-10-31 Hitachi, Ltd. Bird's-eye view forming method, map display apparatus and navigation system
US5898166A (en) * 1995-05-23 1999-04-27 Olympus Optical Co., Ltd. Information reproduction system which utilizes physical information on an optically-readable code and which optically reads the code to reproduce multimedia information
US5754280A (en) * 1995-05-23 1998-05-19 Olympus Optical Co., Ltd. Two-dimensional rangefinding sensor
US20040153649A1 (en) * 1995-07-27 2004-08-05 Rhoads Geoffrey B. Digital authentication with digital and analog documents
US6081261A (en) * 1995-11-01 2000-06-27 Ricoh Corporation Manual entry interactive paper and electronic document handling and processing system
US5902968A (en) * 1996-02-20 1999-05-11 Ricoh Company, Ltd. Pen-shaped handwriting input apparatus using accelerometers and gyroscopes and an associated operational device for determining pen movement
US6014462A (en) * 1996-07-08 2000-01-11 Ricoh Company, Ltd. Image processing apparatus and method for distinguishing alphanumeric symbols on a white background and those on a mesh pattern and individually processing such image data
US6202060B1 (en) * 1996-10-29 2001-03-13 Bao Q. Tran Data management system
US20020069220A1 (en) * 1996-12-17 2002-06-06 Tran Bao Q. Remote data access and management system utilizing handwriting input
US6157935A (en) * 1996-12-17 2000-12-05 Tran; Bao Q. Remote data access and management system
US5995084A (en) * 1997-01-17 1999-11-30 Tritech Microelectronics, Ltd. Touchpad pen-input and mouse controller
US6173084B1 (en) * 1997-06-06 2001-01-09 U.S. Philips Corporation Noise reduction in an image
US20030146883A1 (en) * 1997-08-28 2003-08-07 Visualabs Inc. 3-D imaging system
US6181329B1 (en) * 1997-12-23 2001-01-30 Ricoh Company, Ltd. Method and apparatus for tracking a hand-held writing instrument with multiple sensors that are calibrated by placing the writing instrument in predetermined positions with respect to the writing surface
US20020024499A1 (en) * 1998-03-27 2002-02-28 International Business Machines Corporation Flexibly interfaceable portable computing device
US6752317B2 (en) * 1998-04-01 2004-06-22 Xerox Corporation Marking medium area with encoded identifier for producing action through network
US6577299B1 (en) * 1998-08-18 2003-06-10 Digital Ink, Inc. Electronic portable pen apparatus and method
US6226636B1 (en) * 1998-11-20 2001-05-01 Philips Electronics North America Corp. System for retrieving images using a database
US6278968B1 (en) * 1999-01-29 2001-08-21 Sony Corporation Method and apparatus for adaptive speech recognition hypothesis construction and selection in a spoken language translation system
US6731271B1 (en) * 1999-03-19 2004-05-04 Canon Kabushiki Kaisha Coordinate input device and its control method, and computer readable memory
US6847356B1 (en) * 1999-08-13 2005-01-25 Canon Kabushiki Kaisha Coordinate input device and its control method, and computer readable memory
US6987534B1 (en) * 1999-08-30 2006-01-17 Fuji Jukogyo Kabushiki Kaisha Brightness adjusting apparatus for stereoscopic camera
US6874420B2 (en) * 1999-10-22 2005-04-05 Cc1, Inc. System and method for register mark recognition
US20040046744A1 (en) * 1999-11-04 2004-03-11 Canesta, Inc. Method and apparatus for entering data using a virtual input device
US6935562B2 (en) * 1999-12-06 2005-08-30 Xerox Corporation Operations on images having glyph carpets
US6880755B2 (en) * 1999-12-06 2005-04-19 Xerox Corporation Method and apparatus for display of spatially registered information using embedded data
US20050052469A1 (en) * 1999-12-16 2005-03-10 Matt Crosby Method and apparatus for rendering a low-resolution thumbnail image suitable for a low resolution display having a reference back to an original digital negative and an edit list of operations
US20010024193A1 (en) * 1999-12-23 2001-09-27 Christer Fahraeus Written command
US20050024324A1 (en) * 2000-02-11 2005-02-03 Carlo Tomasi Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
US20030063072A1 (en) * 2000-04-04 2003-04-03 Brandenberg Carl Brock Method and apparatus for scheduling presentation of digital content on a personal communication device
US20060082557A1 (en) * 2000-04-05 2006-04-20 Anoto Ip Lic Hb Combined detection of position-coding pattern and bar codes
US6970183B1 (en) * 2000-06-14 2005-11-29 E-Watch, Inc. Multimedia surveillance and monitoring system including network configuration
US6585154B1 (en) * 2000-08-03 2003-07-01 Yaakov Ostrover System, method and devices for documents with electronic copies attached thereto
US20020163511A1 (en) * 2000-11-29 2002-11-07 Sekendur Oral Faith Optical position determination on any surface
US6651894B2 (en) * 2000-12-12 2003-11-25 Ricoh Company, Ltd. Imaging method, imaging apparatus, and image information management system
US20040128511A1 (en) * 2000-12-20 2004-07-01 Qibin Sun Methods and systems for generating multimedia signature
US20040032393A1 (en) * 2001-04-04 2004-02-19 Brandenberg Carl Brock Method and apparatus for scheduling presentation of digital content on a personal communication device
US20020148655A1 (en) * 2001-04-12 2002-10-17 Samsung Electronics Co., Ltd. Electronic pen input device and coordinate detecting method therefor
US20020163510A1 (en) * 2001-05-04 2002-11-07 Microsoft Corporation Method of generating digital ink thickness information
US20030063045A1 (en) * 2001-10-02 2003-04-03 Harris Corporation Pen cartridge that transmits acceleration signal for recreating handwritten signatures and communications
US20030118233A1 (en) * 2001-11-20 2003-06-26 Andreas Olsson Method and device for identifying objects in digital images
US6744967B2 (en) * 2001-12-20 2004-06-01 Scientific-Atlanta, Inc. Program position user interface for personal video recording time shift buffer
US6919892B1 (en) * 2002-08-14 2005-07-19 Avaworks, Incorporated Photo realistic talking head creation system and method
US6993185B2 (en) * 2002-08-30 2006-01-31 Matsushita Electric Industrial Co., Ltd. Method of texture-based color document segmentation
US7023426B1 (en) * 2002-09-16 2006-04-04 Hewlett-Packard Development Company, L.P. User input device
US20040140964A1 (en) * 2002-10-31 2004-07-22 Microsoft Corporation Universal computing device for surface applications
US20090110308A1 (en) * 2002-10-31 2009-04-30 Microsoft Corporation Decoding and error correction in 2-d arrays
US7009594B2 (en) * 2002-10-31 2006-03-07 Microsoft Corporation Universal computing device
US20040136083A1 (en) * 2002-10-31 2004-07-15 Microsoft Corporation Optical system design for a universal computing device
US20040085286A1 (en) * 2002-10-31 2004-05-06 Microsoft Corporation Universal computing device
US20040140965A1 (en) * 2002-10-31 2004-07-22 Microsoft Corporation Universal computing device
US7133031B2 (en) * 2002-10-31 2006-11-07 Microsoft Corporation Optical system design for a universal computing device
US7142197B2 (en) * 2002-10-31 2006-11-28 Microsoft Corporation Universal computing device
US7262764B2 (en) * 2002-10-31 2007-08-28 Microsoft Corporation Universal computing device for surface applications
US20040143559A1 (en) * 2003-01-17 2004-07-22 Ayala Francisco J. System and method for developing artificial intelligence
US7440583B2 (en) * 2003-04-25 2008-10-21 Oki Electric Industry Co., Ltd. Watermark information detection method
US7583842B2 (en) * 2004-01-06 2009-09-01 Microsoft Corporation Enhanced approach of m-array decoding and error correction
US7570813B2 (en) * 2004-01-16 2009-08-04 Microsoft Corporation Strokes localization by m-array decoding and fast image matching
US20060125805A1 (en) * 2004-03-17 2006-06-15 James Marggraff Method and system for conducting a transaction using recognized text
US7505982B2 (en) * 2004-12-03 2009-03-17 Microsoft Corporation Local metadata embedding solution
US7536051B2 (en) * 2005-02-17 2009-05-19 Microsoft Corporation Digital pen calibration by local linearization
US7532366B1 (en) * 2005-02-25 2009-05-12 Microsoft Corporation Embedded interaction code printing with Microsoft Office documents
US20090119573A1 (en) * 2005-04-22 2009-05-07 Microsoft Corporation Global metadata embedding and decoding
US7542976B2 (en) * 2005-04-22 2009-06-02 Microsoft Corporation Local metadata embedding and decoding
US20060267965A1 (en) * 2005-05-25 2006-11-30 Advanced Digital Systems, Inc. System and method for associating handwritten information with one or more objects via discontinuous regions of a printed pattern
US7580576B2 (en) * 2005-06-02 2009-08-25 Microsoft Corporation Stroke localization and binding to electronic document
US7528848B2 (en) * 2005-06-30 2009-05-05 Microsoft Corporation Embedded interaction code decoding for a liquid crystal display

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030231189A1 (en) * 2002-05-31 2003-12-18 Microsoft Corporation Altering a display on a viewing device based upon a user controlled orientation of the viewing device
US7684618B2 (en) 2002-10-31 2010-03-23 Microsoft Corporation Passive embedded interaction coding
US20060012581A1 (en) * 2004-07-15 2006-01-19 N-Trig Ltd. Tracking window for a digitizer system
US7649524B2 (en) * 2004-07-15 2010-01-19 N-Trig Ltd. Tracking window for a digitizer system
US20090027354A1 (en) * 2004-07-15 2009-01-29 N-Trig Ltd. Automatic switching for a dual mode digitizer
US20060123049A1 (en) * 2004-12-03 2006-06-08 Microsoft Corporation Local metadata embedding solution
US7505982B2 (en) 2004-12-03 2009-03-17 Microsoft Corporation Local metadata embedding solution
US7536051B2 (en) 2005-02-17 2009-05-19 Microsoft Corporation Digital pen calibration by local linearization
US20060182343A1 (en) * 2005-02-17 2006-08-17 Microsoft Corporation Digital pen calibration by local linearization
US20060190818A1 (en) * 2005-02-18 2006-08-24 Microsoft Corporation Embedded interaction code document
US7826074B1 (en) 2005-02-25 2010-11-02 Microsoft Corporation Fast embedded interaction code printing with custom postscript commands
US7532366B1 (en) 2005-02-25 2009-05-12 Microsoft Corporation Embedded interaction code printing with Microsoft Office documents
US8156153B2 (en) 2005-04-22 2012-04-10 Microsoft Corporation Global metadata embedding and decoding
US20090119573A1 (en) * 2005-04-22 2009-05-07 Microsoft Corporation Global metadata embedding and decoding
US7920753B2 (en) 2005-05-25 2011-04-05 Microsoft Corporation Preprocessing for information pattern analysis
US7729539B2 (en) 2005-05-31 2010-06-01 Microsoft Corporation Fast error-correcting of embedded interaction codes
US7528848B2 (en) 2005-06-30 2009-05-05 Microsoft Corporation Embedded interaction code decoding for a liquid crystal display
US20070003150A1 (en) * 2005-06-30 2007-01-04 Microsoft Corporation Embedded interaction code decoding for a liquid crystal display
US20070001950A1 (en) * 2005-06-30 2007-01-04 Microsoft Corporation Embedding a pattern design onto a liquid crystal display
US7817816B2 (en) 2005-08-17 2010-10-19 Microsoft Corporation Embedded interaction code enabled surface type identification
US20070139399A1 (en) * 2005-11-23 2007-06-21 Quietso Technologies, Llc Systems and methods for enabling tablet PC/pen to paper space
US20090073144A1 (en) * 2007-09-18 2009-03-19 Acer Incorporated Input apparatus with multi-mode switching function
US8564574B2 (en) * 2007-09-18 2013-10-22 Acer Incorporated Input apparatus with multi-mode switching function
US20090233715A1 (en) * 2008-03-12 2009-09-17 Dish Network L.L.C. Apparatus and methods for authenticating a user of an entertainment device using a mobile communication device
US8152642B2 (en) 2008-03-12 2012-04-10 Echostar Technologies L.L.C. Apparatus and methods for authenticating a user of an entertainment device using a mobile communication device
US20090233593A1 (en) * 2008-03-12 2009-09-17 Dish Network L.L.C. Apparatus and methods for controlling an entertainment device using a mobile communication device
US8639287B2 (en) 2008-03-12 2014-01-28 Echostar Technologies L.L.C. Apparatus and methods for controlling an entertainment device using a mobile communication device
US8758138B2 (en) 2008-03-12 2014-06-24 Echostar Technologies L.L.C. Apparatus and methods for authenticating a user of an entertainment device using a mobile communication device
US9210355B2 (en) 2008-03-12 2015-12-08 Echostar Technologies L.L.C. Apparatus and methods for controlling an entertainment device using a mobile communication device
US20120019487A1 (en) * 2009-01-14 2012-01-26 Takashi Kazamaki Electronic device and information processing method
US8553015B2 (en) * 2009-01-14 2013-10-08 Sharp Kabushiki Kaisha Electronic device and information processing method

Also Published As

Publication number Publication date
EP1416423A3 (en) 2005-07-13
EP1416423A2 (en) 2004-05-06
CN1320506C (en) 2007-06-06
US20040085286A1 (en) 2004-05-06
US7009594B2 (en) 2006-03-07
KR20040038643A (en) 2004-05-08
CN1499446A (en) 2004-05-26
JP2004164609A (en) 2004-06-10
KR101026630B1 (en) 2011-04-04
BR0304250A (en) 2004-09-08

Similar Documents

Publication Publication Date Title
US7009594B2 (en) Universal computing device
US7262764B2 (en) Universal computing device for surface applications
CA2491771C (en) Universal computing device
US7133031B2 (en) Optical system design for a universal computing device
US7627703B2 (en) Input device with audio capabilities
US20070003168A1 (en) Computer input device
EP1403777A2 (en) Method and system for identifying a paper form using a digital pen
AU2002335029A1 (en) A combined writing instrument and digital documentor apparatus and method of use
WO2008150919A9 (en) Electronic annotation of documents with preexisting content
JP2010519622A (en) Note capture device
EP1403755A2 (en) Method and system for creating a document having metadata
JP2008181510A (en) Handwritten entry information collection and management system using a digital pen
JP2007183987A (en) Writing input device

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034766/0001

Effective date: 20141014