WO2002059716A2 - Pointing systems for addressing objects - Google Patents
Pointing systems for addressing objects
- Publication number
- WO2002059716A2 (PCT/US2001/050804)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- methods
- pointing
- attitude
- action
- mobile unit
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/025—Services making use of location information using location based information parameters
- H04W4/026—Services making use of location information using location based information parameters using orientation information, e.g. compass
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1632—External expansion units, e.g. docking stations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1698—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a sending/receiving arrangement to establish a cordless communication link, e.g. radio or infrared link, integrated cellular phone
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/953—Querying, e.g. by the use of web search engines
- G06F16/9537—Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/52—Network services specially adapted for the location of the user terminal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
- H04M1/72457—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to geographic location
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/029—Location-based management or tracking services
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/18—Information format or content conversion, e.g. adaptation by the network of the transmitted or received information for the purpose of wireless delivery to users or terminals
- H04W4/185—Information format or content conversion, e.g. adaptation by the network of the transmitted or received information for the purpose of wireless delivery to users or terminals by embedding added-value information into content, e.g. geo-tagging
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/20—Services signaling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/51—Relative positioning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
Definitions
- the field of these inventions is best characterized as pointing systems for addressing objects and is more specifically characterized as computer pointing systems for addressing objects and for manipulating information relating to such objects.
- 'pointing systems' include apparatus and methods arranged to address objects.
- a thing is the subject of a pointing system address action when the pointing system is manipulated in a manner which causes it to suggest an alignment or association (generally a spatial relationship) with the thing via pointing and position references of the system.
- An object can be said to be 'addressed' when a pointing system is pointing thereto.
- a common gesture in communication involves the pointing of one's finger toward an object of interest to indicate the object. For example:
- a computer has a 'pointing device', which is commonly a mouse type peripheral but may be a track-ball, PowerPoint®, touch screen, et cetera.
- With a computer pointing device, a user is provided the opportunity to interface with the display of a computer by making 'point-and-click' actions, among others.
- a cursor position within the display region is driven by tactile inputs from a user's hand. Such is the case with a mouse type peripheral, where spatial position is driven in two dimensions by the movements of a handheld orb.
- a 'Touch Screen' type pointing system is interesting because it is not a cursor icon device which is doing the pointing but rather the tip of a physical object, a user's finger. Contact made with the screen as a finger taps the screen's surface causes an event which may be detected and measured by transduction apparatus, typically a resistive matrix membrane.
- the present invention concerns a pointer for addressing real objects anywhere, rather than objects represented in two-dimensional space on a computer's display screen, such as an icon.
- Page Lohr Associates P.O. Box 757, La Jolla, CA 92038 (619) 7024471 Comes now, Thomas Ellenby, Peter Ellenby, John Ellenby, Jeffrey Alan Jay, and Joseph Page with inventions of pointing systems including devices and methods of addressing objects. It is a primary function of these systems to provide users means of indicating to a computer an object of interest and to further process information relating to addressed objects.
- Inventions presented in this disclosure are best characterized as pointing systems which operate in three orthonormal spatial dimensions. These pointing systems may be used to address objects and trigger computer responses relating to or depending upon the objects being addressed. Devices of these inventions may be envisaged as a mouse for the 'real world'.
- a device which is freely movable, a mobile unit may be manipulated in a fashion to cause it to point towards or address an object of interest.
- a determination of position and orientation, among other parameters of the mobile unit, uniquely defines an instantaneous address state of the device.
- a search of a database is performed to determine which objects are being addressed.
- Data relating to an object is stored in a database along with a geometric construct which describes a spatial body associated with the object, herein called a 'geometric descriptor'.
- a mobile telephone equipped with a Global Positioning System and an electronic compass is arranged to pass position and attitude measurement information to a computer.
- a computer, prepared with pre-programmed operating instruction and data relating to objects including geometric descriptors which define spatial bodies associated with various objects, is set to be responsive to particular address states as determined by the mobile telephone.
- the pointing state of the telephone described by an address indicator may become known to the computer periodically upon certain stimuli, for example, expiration of a pre-set time period kept by a timer.
- in response to receipt of an address indicator, the computer performs a database search to determine which objects have geometric descriptors intersected by the address indicator.
- a result set is prepared in accordance with a program running on the computer in view of certain filters which may be applied to the recalled object data.
- the result set may be used as a computer takes an action, which relates to the results and thus the address state of the mobile unit, in agreement with particular programming running thereon.
- a user presses a special function key to cause the computer to report the telephone number of the Sheraton Grand to the user via a display screen and further to place a call to the hotel. Additional information relating to the hotel, such as vacancy, pricing, and preferred room availability, may also be available and passed to the mobile telephone user.
- a computer action may also include those which do not occur at the telephone but rather at the hotel.
- a user may cause a dinner reservation to be transmitted to the hotel in response to pointing to the hotel and pressing a certain reservation function key.
- Figure 1 is a block diagram of a system overview
- FIG. 2 is also a block diagram which details one element of the overall system
- Figure 3 is a similar block diagram directed to an alternative version of a system where one subsystem is associated with a different element;
- Figure 4 is yet another alternative where a subsystem is distributed among two primary system elements;
- Figure 5 diagrams a global positioning system in relation to the Earth;
- Figure 6 presents a simple representation of the Earth's magnetic field model;
- Figure 7 sets forth a coordinate definition for directional references used herein;
- Figure 8 is a block diagram directed to components of a direction sensing system
- Figure 9 illustrates a mobile telephone in combination with a compass element
- Figure 10 similarly shows a telephone in use with a compass
- Figures 11 - 13 show similar illustrations of a mobile telephone used in combination with a dipole compass
- Figure 14 graphically describes a server having a database and special data structure;
- Figures 15 - 23 are each block diagrams directed to method steps, in particular,
- Figure 15 describes the four primary steps of any method of these inventions;
- Figure 16 illustrates a 'determine address state' step;
- Figure 17 further details the substep 'collect parameters';
- Figure 18 illustrates a 'prepare request' step;
- Figure 19 further details the substep 'build request';
- Figure 20 illustrates a 'transmit request' step;
- Figure 21 illustrates a 'process request' step;
- Figure 22 further details the substep 'database search';
- Figure 24 is an illustration of an important object used in examples of this disclosure.
- Figure 25 is a similar illustration with a graphical element relating thereto also included;
- Figure 26 shows a high contrast version and is presented for extra clarity
- FIG. 27 shows a similar version with added detail
- Figure 28 is a halftoned photographic image illustrating a person using a device of the invention
- Figure 29 is a similar image with exaggerated contrast for more perfect clarity
- Figure 30 shows another user in relation to objects being addressed
- Figure 31 is a photo illustration of the same user addressing other objects.
- Figure 32 illustrates a plurality of users in a special relationship with a single addressed object
- Figure 33 depicts certain geometry of critical importance
- Figures 35 - 44 also show geometries of important elements of these inventions.
- Figure 45 shows a mobile unit addressing a restaurant type object
- Figures 46 - 48 illustrate multi-media data being played at a user interface
- Figure 50 shows a user engaging a restaurant via a mobile unit of these inventions
- Figure 51 is a line drawing to show the graphical user interface and element thereon;
- Figure 52 illustrates a particular use of the device to manage a switching means;
- Figure 53 further demonstrates switching between toolbar icons
- Figures 54 - 61 are line drawings directed to still further techniques of moving a cursor about a graphical display.
- An application server may be a general purpose computer operable for execution of computer code.
- Address Indicator
- 'Address indicator' is a term used to describe a collection of parameters relating to the physical nature of a device.
- An 'address indicator' is a geometric construct used to describe the address state of devices of these inventions.
- Attitude
- Attitude and orientation may be used as synonyms in this document.
- Attitude is a specification of direction in an arbitrary coordinate system. It generally relates to a linear reference which may be incorporated as part of a physical device.
- one possible coordinate system includes that where attitude is given as a combination of two angular measures, which may be represented by θ and φ:
- θ representing the directions of a compass, i.e. those in the horizontal plane, 0° to 360°, where North is 0°;
- φ representing an angle measured from the horizontal plane, from -90° to +90°, to uniquely define any direction which may exist in three-space.
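Under this two-angle convention, an attitude can be converted into a unit direction vector. The axis frame below (x = East, y = North, z = Up) is an assumption for illustration; the disclosure leaves the coordinate system arbitrary.

```python
import math

def attitude_to_vector(theta_deg: float, phi_deg: float):
    """Convert an attitude (theta, phi) into a unit direction vector.

    Assumed frame (not fixed by the disclosure): x = East, y = North,
    z = Up; theta is measured clockwise from North in the horizontal
    plane, phi upward from the horizontal plane.
    """
    theta, phi = math.radians(theta_deg), math.radians(phi_deg)
    return (math.cos(phi) * math.sin(theta),   # East component
            math.cos(phi) * math.cos(theta),   # North component
            math.sin(phi))                     # Up component
```

With this convention, theta = 0° and phi = 0° yields a vector pointing due North in the horizontal plane.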
- 'database' means a data storage facility, without limit to conventional database systems which include a considerable amount of data management computer code. Accordingly, a simple comma-delimited text file may serve as a data store in some special versions of these inventions.
- Geometric Descriptor
- a geometric descriptor is a construct having an association with a particular object used to define a region of space which may be similar to the region of space occupied by the associated object.
- Space described by a geometric descriptor may change as a function of time.
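As a sketch of such a time-dependent descriptor, consider a hypothetical ferry whose associated spatial body moves along its route; the function name, speed, and geometry below are all illustrative assumptions, not taken from the disclosure.

```python
def ferry_descriptor(t_seconds: float) -> dict:
    """Hypothetical time-varying geometric descriptor: a ferry on a
    straight crossing, so the associated spatial body (here a sphere)
    moves with time. Speed and geometry are assumed values."""
    speed = 5.0  # metres per second along the x axis (assumed)
    return {"center": (speed * t_seconds, 0.0, 0.0), "radius": 20.0}
```

A database search against such a descriptor would evaluate it at the time of the address action before testing for intersection.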
- Mobile Unit
- Use of the term 'mobile unit' is not meant to limit the scope of these inventions.
- a unit may be considered mobile if it is operable in changing either its position reference or its direction reference.
- devices having a fixed position but variable direction reference are intended as mobile devices.
- devices having a fixed direction reference but a variable position reference are included as devices of these inventions.
- Although preferred versions have mobile units which are highly mobile, e.g. telephone handsets, 'mobile unit' is not to be applied so as to exclude less mobile devices from the definition of these inventions.
- In brief, a unit is a 'mobile unit' when either its position reference or its direction reference is variable.
- An 'object' may be a real or virtual entity having some spatial extent associated therewith.
- a request is formed in and transmitted from a mobile unit to a server as a request for information and processing services.
- a server computer includes a computer which operates to receive requests from and provide services to client computing devices such as mobile units which may be remote from but in communication with such server computer.
- Special Function Facility
- a special function facility is a module which is arranged to perform an application specific function.
- a special function facility may be physically located in a mobile unit, in a server computer or in a network but is in communication with either a server computer or a mobile unit whereby it may receive instructions or cues therefrom and perform a special function in response thereto.
- Wireless Network
- 'wireless network' is used throughout to promote a better understanding of preferred versions. However, use of 'wireless network' is not meant to exclude a case which is counter intuitive in view of the word 'network'. Although a network is generally comprised of many nodes, the special case where there is but one node is not meant to be excluded. It is entirely possible to configure devices of these inventions, all elements being identical, where the 'wireless network' has but a single node. That is, mobile devices are in communication with a server via a wireless link but there exists only one transmission point to which all mobile units are coupled via the wireless communication link. Therefore, these inventions are meant to include the special case where the wireless network includes only one node.
- Wireless Application Protocol is the name presently used in referring to the standard for wireless communication. By committee, a protocol was designed and agreed upon in order that developers of computer applications and network administrators provide products which cooperate together. The protocol, like most others, continues development and is accompanied by changes from time-to-time. Accordingly, it will be recognized that references to WAP in this document will be broadly interpreted as the standard which prevails at the time in question rather than at the time of this writing.
- a reference to 'WAP' includes those versions of wireless application protocols which are sure to come, regardless of whether they are called 'WAP' or not. What is meant by WAP is the most current and prevailing version of a wireless application protocol.
- a mobile unit 1 which may be in the form of a handheld computing appliance such as a mobile telephone 2, is in wireless communication 3 with a network of receiving station(s) 4 to which are connected computers 5 arranged to direct data traffic via a gateway 6.
- a 'request' initiated in the mobile telephone depends on the physical nature (address state) of the telephone.
- the request 7 is passed to an application server 8 arranged to receive such requests which may contain parameters and values as inputs to general and application specific computer programming.
- a content assembler 10 receives data 11 from a database 12 of information relating to objects which might be addressed by mobile units from time to time. Assembled content 13 is then passed back to the gateway where encoders 14 may be used to prepare data for wireless transmission as encoded content 15 back to the mobile unit.
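A minimal sketch of the request a mobile unit might assemble before handing it to the gateway follows; the field names and the JSON encoding are assumptions for illustration only, since the disclosure does not prescribe a wire format.

```python
import json

def build_request(lat, lon, alt, theta, phi, function_key):
    """Assemble a hypothetical request carrying the mobile unit's
    address state; field names are illustrative, not drawn from the
    disclosure or from WAP."""
    return json.dumps({
        "position": {"lat": lat, "lon": lon, "alt": alt},  # point reference
        "attitude": {"theta": theta, "phi": phi},          # direction reference
        "function": function_key,                          # e.g. a function key press
    })
```

The application server would parse such a request, run the database search against the encoded address state, and return assembled content along the reverse path.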
- Figure 2 illustrates some important details regarding the elements of a mobile unit 21 in preferred versions.
- Embodied as a handheld wireless telephone 22, mobile units of preferred versions of these inventions have seven critical elements.
- a computer processing facility 23, a point reference 24 in communication 25 with a position determining means 26, and a direction reference 27 in communication 28 with an attitude determining means 29.
- FIG. 3 suggests that hardware of a position determining means may physically be incorporated within a wireless network apparatus.
- a mobile telephone 31 sends radio signals 32 to receiving stations 33 having special devices 34 to measure the time of arrival of such radio signals. Timing of received signals contains implied information about the location of the transmitting devices.
- position determining hardware 34 may be said to be in mathematical communication 35 with a point reference 36, the point reference being within the mobile device, the position determining hardware being in a wireless network. From this presentation, one will be reminded throughout the remainder of this disclosure that the physical location of specified elements is not to be implied with strict limitations. To emphasize the point, an example may be drawn to the case where another component is distributed rather than physically confined to the mobile unit. Figure 4 suggests that alternative relationships may occur in other versions with regard to attitude determining means. Where a system is configured with a radio direction finding technique of attitude determination, parts of the attitude determining means lie in the mobile unit 41 while other elements lie in the wireless network. Thus it can be said that the attitude determining means 43 is physically located in both places.
- Position Determining Means
  - i) Global Positioning System (GPS)
  - ii) e-911
- D. Attitude Determining Means
  - i) Solid State Magnetoresistive Sensors
  - ii) Simple Magnetic Dipole Compass
- E. Computer Processing Facility
  - i) Input/Output for sensors
  - ii) A Wireless Application Protocol browser
  - iii) Code in Read Only Memory
- F. Local Database
- G. User Interfaces
  - i) Inputs: a) tactile switches; b) wheels/trackballs; c) touch pad; d) DeltaTheta detection; e) Voice recognition; f) other sensors
  - ii) Outputs: a) audio; b) visual
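The element outline above can be summarized as a simple data structure; the attribute names and default selections below are illustrative only, not part of the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class MobileUnit:
    """Sketch of the mobile unit element outline; names and defaults
    are illustrative assumptions."""
    position_source: str = "GPS"               # or "e-911"
    attitude_source: str = "magnetoresistive"  # or "dipole compass"
    inputs: list = field(default_factory=lambda: ["tactile switches",
                                                  "touch pad"])
    outputs: list = field(default_factory=lambda: ["audio", "visual"])
```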
- a 'mobile unit' is a portable computing appliance.
- portable computing devices are wireless communications devices, wireless devices, hand-held mobile computers, et cetera.
- the term 'wireless' refers to the device's ability to communicate with other computers connected to a network of fixed or orbiting transceiving stations via electromagnetic communication.
- Devices such as a 'personal digital assistant' PDA, a mobile telephone, a personal navigation aid, are examples of mobile units into which
- devices of these inventions may be integrated. As functionality of some of these devices begins to overlap the functionality of the others, it becomes difficult to draw a distinction between them. For example, personal mobile telephones are commonly enabled with a programmable telephone directory. Personal digital assistants also incorporate the identical function of storing telephone numbers. Since the concepts taught herein can be incorporated into a plurality of types of devices, no attempt is made to describe further the class of product in which these concepts are best placed. Accordingly, the term 'mobile unit' is used to indicate any portable computing platform and the reader will appreciate that restriction to one type or another should not be made. Further, examples presented herein may be drawn to one or another type of mobile unit without the implication that restriction to that type is intended.
- Mobile units of these inventions have the following essential elements.
- Mobile units have a point reference and have a direction reference.
- these elements are sometimes merely geometric constructs without association to physical objects, they serve as important structural elements to which other system elements have a strong and concrete relationship.
- a point reference lies roughly in the geometric center of a physical case or enclosure of which a mobile unit is comprised.
- a direction reference may be arranged to correspond to a longitudinal axis of the body of a mobile unit case.
- a mobile unit enclosure can be formed of hard plastic or a similar material suitable for containing electronic components therein; sometimes the point and direction references are arranged with fixed relationships to the mobile unit enclosure.
- an enclosure is an elongated member having a longitudinal axis suggesting the enclosure has a natural "pointing" direction.
- a telephone handset typically has a length which is significantly longer than its width and thickness. Accordingly, a longitudinal axis is said to run along telephone handset length at the geometric center of the phone cross section. When naturally held in a single hand, the telephone handset serves as a good pointing device.
- a mobile telephone handset includes a protruding antenna. The antenna further suggests a pointing direction as it is typically an elongated member which also runs along the direction of a longitudinal axis, albeit sometimes with a slight offset to the edge of the enclosure.
- Mobile units of these inventions also may have computing or computer processing facility.
- the computer processing facility is arranged to receive input signals from specially arranged devices in communication with the computer
- these input signals include those which yield information relating to the position of the point reference and the pointing attitude of the direction reference of the mobile unit.
- the computer processor facility is in communication with or coupled to position determining means and attitude determining means.
- position and attitude determining means are described in more detail below. It is important to note that in addition to position and attitude determining means, other systems which measure the physical state of the mobile unit may also be coupled to a computing facility.
- mobile units of these inventions have at least one user interface.
- An output type user interface enables the computing facility to present information to a user.
- An input type allows a user to interrupt code and send signals to the processor.
- Mobile units of these inventions may be in communication with one or more databases of stored information.
- 'Communication' may be described as a chain of links including communication via a wireless network, through the Internet, to a server computer, and finally to a database; a return path being similar but in reverse order.
- a mobile unit can be best visualized as being coupled to the communication chain via a wireless network such as those presently used for mobile telecommunications.
- best versions of mobile units of these inventions include the same transceivers used in modern mobile telephones; for example, those developed by Qualcomm Corporation as CDMA technologies.
- a wireless network includes electronic means for communication with mobile devices which operate within a network's coverage.
- a mobile device may exchange information with surrounding fixed sites via electromagnetic communication without hardwire connections to the network.
- small computing devices are free to move while remaining connected to powerful information and communication systems.
- a wireless network is generally comprised of a plurality of fixed transceiving stations where one station 'hands off' a signal to an adjacent station as a mobile device moves from one coverage region, sometimes called a 'cell', to another.
- Each of the fixed transceiving stations may be in communication with the others or with a central processing facility whereby messages and handling instructions are passed therebetween.
- wireless networks may also be directly in communication with wireline networks; for example telephone wireline networks and the network of computers known as the Internet.
- Page Lohr Associates, P.O. Box 757, La Jolla, CA 92038 (619) 7024471
- Examples of wireless networks include GSM, Cellular, and PCS type networks. Continuous improvements have resulted in very sophisticated systems which are being installed presently. These include the new ability to handle packet type data traffic to support interaction with other digital systems.
- Advanced networks known as the Universal Mobile Telecommunications System, or 'UMTS', constitute the European member of the family of third generation (3G) mobile standards. The goal of UMTS is to enable networks that offer true global roaming and can support a wide range of voice, data and multimedia services. Proposed data rates offered by UMTS are: fast moving (vehicular) 144 kbit/s; slow moving (pedestrian) 384 kbit/s; fixed (in-building) 2 Mbit/s. Commercial UMTS networks are expected from May 2001 (Japan).
- Wireless networks can be coupled to the Internet to provide mobile devices with an extensive, perhaps unlimited, source of information and computing power.
- the Internet is a network of computers in a continuous conversation unlikely to end soon. Although any single computer may terminate its connection and thus leave the conversation, the others remain in communication without effect from the absence of the exiting party. Similarly, a new computer may join the conversation at any time without causing upset to the communication scheme that connects the others.
- the Internet is therefore an efficient means for computers to be connected to other computers and to exchange data and information.
- Computers of different types running different operating systems may be coupled to the Internet by way of communication rules known as 'Internet Protocol' (IP) and standards such as HyperText Transfer Protocol (HTTP), HyperText Markup Language (HTML), and eXtensible Markup Language (XML).
- WAP - Wireless Application Protocol - is the bridge between mobile communication and the Internet. Because mobile devices have attributes with limitations particular to those devices, for example limited power, limited screen size, limited bandwidth and limited keypads, among others, mobile devices are not well positioned to communicate directly with Internet standards including the HTML standards. However, the Wireless Application Protocol is a world standard aimed at bridging the gap between mobile devices and today's Internet. A mobile telephone may be used to make an Internet request for information.
- the wireless network is arranged to communicate further with a WAP gateway which processes requests and passes them to an application server which may be positioned as a common Internet server computer.
- a server computer of the instant inventions may be a computer in communication with the Internet whereby other computers can make requests thereof and receive information therefrom.
- a mobile device passes requests including information parameters to the application server.
- the request thus carries information relating to a user's needs.
- the request carries information about the physical state of the mobile unit, i.e. its address state.
- position and attitude parameters may be included as part of an encoded request.
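As a concrete sketch, position and attitude parameters might be encoded into a request's query string. The path `/locate` and every parameter name below are hypothetical illustrations, not part of any actual WAP specification:

```python
from urllib.parse import urlencode

def build_address_request(lat, lon, alt, azimuth, pitch, roll):
    """Encode the mobile unit's address state as query parameters.
    The path '/locate' and all parameter names are hypothetical."""
    params = {
        "lat": f"{lat:.6f}",      # latitude of the point reference, degrees
        "lon": f"{lon:.6f}",      # longitude, degrees
        "alt": f"{alt:.1f}",      # altitude, meters
        "az": f"{azimuth:.1f}",   # pointing azimuth of the direction reference
        "pitch": f"{pitch:.1f}",  # degrees above horizontal
        "roll": f"{roll:.1f}",
    }
    return "/locate?" + urlencode(params)

# e.g. a unit pointing due east, tipped up five degrees
request = build_address_request(37.7749, -122.4194, 16.0, 90.0, 5.0, 0.0)
```

An application server receiving such a request can parse the same parameters back out and use them as search keys.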
- an application processor can digest the information passed to it from the requesting client (mobile device) and determine an appropriate response.
- an application server searches a database of pre-programmed information to recall information which relates to the position and attitude parameters passed via the request. Therefore some database searches of these inventions are dependent upon the position and attitude information passed from the mobile unit.
- a database record comprises at least a geometric descriptor associated with an object and information elements associated with the same object.
- the database record is the 'glue' which connects information relating to an object to the object's physical being via the geometric descriptor.
- geometric descriptors may be used as indexing means by which information relating to objects may be recalled in a database search action.
- a point reference may be a mere geometric construct. However, it is essential structure with regard to apparatus of these inventions and its importance should not be discounted because of an apparent lack of size or concreteness.
- the abstract nature of a point reference allows it to be assigned a location which may or may not correspond to a physical element. Thus, a point reference may, for example, be assigned a location described as two meters to the left of a certain object, where that location is merely a location in space occupied by nothing.
- position measurements are made. These measurements are made with respect to the point reference and some arbitrary frame of reference. For example, a frame of reference coordinate system may be adopted whereby location or position is described by latitude, longitude and altitude values.
- Preferred versions of these inventions include a point reference which is coupled to a position determining means whereby the position determining means operates to measure the position of the point reference in a particular frame of reference.
- a direction reference may also be merely a geometric construct.
- a direction reference has an endpoint and infinite extent along a line in one direction away from the endpoint.
- a direction reference is essential structure with regard to apparatus of these inventions and its importance should not be diminished because of its apparent lack of size or concreteness.
- a direction reference may be assigned such that it may or may not correspond to a physical element such as an elongated pointer.
- attitude measurements are made. These measurements are made with respect to the direction reference and some arbitrary frame of reference. For example, a frame of reference coordinate system may be adopted whereby tilt, roll and pitch values may be measured and specified.
- Preferred versions of these inventions include a direction reference coupled to an attitude determining means whereby the attitude determining means operates to measure orientation of the direction reference.
- a spatial relationship exists between the point reference and the direction reference.
- a point reference may be arranged to be coincident with an origin of a vector which represents the direction reference.
- Position Determining Means: A position determining means is coupled to and arranged to determine the position of the point reference. Further, the position determining means is coupled in a manner which allows position measurements to be passed into requests conveyed to an application server.
- the position determining means is at least partly integrated within the mobile unit.
- a position determining means is arranged to determine the position of the mobile unit (via a point reference) but the hardware from which it is comprised is part of the wireless network or other external apparatus. Accordingly, a position determining means may exist within the mobile unit or within the wireless network without loss of generality. For purposes of this disclosure, the limitation of a 'position determining means' is met when means are arranged to determine the position of a point reference associated with a mobile unit.
- a first preferred arrangement has a Global Positioning System GPS receiver contained in the mobile unit case.
- a GPS receiver is sufficiently small in size whereby it is easily incorporated as a portion of an electronic hand-held device. The accuracy of position measurements made by a GPS receiver is quite good and within
- the global positioning system is a satellite-based navigation system consisting of a network of orbiting satellites 51 that are eleven thousand nautical miles in space and in six different orbital paths.
- the satellites are constantly moving, making two complete orbits around the Earth in just under 24 hours or about 1.8 miles per second.
- GPS satellites are referred to as NAVSTAR satellites. Transmitter power is approximately 50 watts, or less. Each satellite 52 transmits two signals 53, L1 and L2; civilian GPS uses the 'L1' frequency of 1575.42 MHz. Each satellite is expected to last approximately 10 years. Replacements are constantly being built and launched into orbit.
- the orbital paths 55 of these satellites take them between roughly 60 degrees North and 60 degrees South latitudes. Accordingly, one 54 can receive satellite signals anywhere in the world, at any time. As one moves close to the poles, the GPS satellite signals remain available. They just won't be directly overhead anymore.
- GPS works in all weather conditions.
- the GPS signal contains a 'pseudo-random code', ephemeris and almanac data.
- the pseudo-random code identifies which satellite is transmitting — in other words, an I.D. code.
- satellites are identified by their PRN (pseudo-random number), from 1 through 32, and this is the number displayed on a GPS receiver to indicate which satellite(s) are being received.
- Ephemeris data is constantly transmitted by each satellite and contains important information such as status of the satellite (healthy or unhealthy), current date, and time. Without this part of the message, your GPS receiver would have no idea what the current time and date are. This part of the signal is essential to determining a position.
- Each satellite transmits a message which essentially says, "I'm satellite #X, my position is currently Y, and this message was sent at time Z.”
- GPS receivers read the message and save the ephemeris and almanac data for continual use. This information can also be used to set (or correct) the clock within the GPS receiver.
- a GPS receiver compares the time a signal was transmitted by a satellite with the time it was received by the GPS receiver. The time difference tells the GPS receiver how far away that particular satellite is. If distance measurements from a few more satellites are added, triangulation techniques yield a position measurement. This is exactly what a GPS receiver does. With a minimum of three or more satellites, a GPS receiver can determine a latitude/longitude position — sometimes called a 2D position fix. With four or more satellites, a GPS receiver can determine a 3D position which includes latitude, longitude, and altitude. By using a series of position measurements, a GPS receiver can also accurately provide speed and direction of travel (referred to as 'ground speed' and 'ground track').
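The ranging principle can be illustrated with a toy two-dimensional solver. Real GPS receivers solve in three dimensions plus a receiver clock-bias term, which is why a fourth satellite is needed for a 3D fix; this sketch assumes exact ranges to known anchor positions:

```python
def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """Recover an (x, y) position from three anchor points and measured
    ranges. Subtracting pairs of circle equations eliminates the squared
    unknowns, leaving two linear equations to solve."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # (x-x1)^2 + (y-y1)^2 = r1^2 minus the same for anchors 2 and 3:
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # nonzero when anchors are not collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

In the real system the "ranges" come from signal transit times multiplied by the speed of light, and the anchor positions come from the ephemeris data described above.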
- a typical civilian GPS receiver provides 60 to 225 feet accuracy, depending on the number of satellites available and the geometry of those satellites. More sophisticated and expensive GPS receivers, costing several thousand dollars or more, can provide accuracy within a centimeter by using more than one GPS frequency. However, a typical civilian GPS receiver's accuracy can be improved to fifteen feet or better (in some cases under three feet) through a process known as Differential GPS (DGPS).
- DGPS employs a second receiver to compute corrections to the GPS satellite measurements.
- the U.S. Coast Guard and U.S. Army Corps of Engineers (and many foreign government departments as well) transmit DGPS corrections through marine beacon stations. These beacons operate in the 283.5 - 325.0 kHz frequency range and are free of charge.
- A DGPS beacon receiver may be coupled to a GPS receiver via a three-wire connection, which relays corrections in a standard serial data format called 'RTCM SC-104'.
- An alternative position determining means may be arranged as part of the wireless network, in consideration of the time of arrival of radio signals at multiple fixed receiving stations.
- 'e911' positioning systems are being considered for installation in wireless networks for use in determining the locations of callers, for example in times of emergency.
- the load on the mobile unit is lightened as power and space requirements are removed from the mobile unit and placed at the wireless network, which has much greater tolerance of these parameters.
- an encoded request is received at the wireless stations with attitude information but without position information.
- the wireless network computers then compute position information and attach that to the encoded request.
- that encoded request is transmitted to the WAP gateway where the request has position and attitude information therein.
- attitude determining means is coupled to and arranged to determine the attitude of the mobile unit direction reference. Further, the attitude determining means is coupled in a manner which allows attitude measurements to be passed into requests conveyed to an application server.
- attitude determining means are integrated within the mobile unit. In alternative versions, it is arranged to determine the attitude of the mobile unit but the hardware from which it is comprised is part of a wireless network or other external apparatus.
- attitude determining means include solid state devices arranged to sense the magnetic fields natural about the Earth's dipole. A determination of the strength of magnetic fields in three orthogonal directions allows one to compute a pointing attitude.
- Mobile units of these inventions can be integrated with such solid state devices and supporting hardware whereby pointing actions can be monitored and measured.
- These solid state sensor packages are available as off-the-shelf items ready for integration; for example, the Honeywell HMC2003 three-axis magnetic compassing sensor attitude reference, based on magnetoresistive principles.
- the Earth's magnetic field intensity is about 0.5 to 0.6 gauss and has a component parallel to the Earth's surface that always points toward magnetic north. This is the basis for all magnetic compasses.
- Anisotropic Magnetoresistance (AMR) sensors are best suited for electronic compasses since their range of sensitivity is centered within the Earth's field.
- the Earth's magnetic field can be approximated with the dipole model shown in Figure 6. This figure illustrates that the Earth's 61 magnetic field 62 points down toward north in the northern hemisphere, is horizontal and points north at the equator, and points up toward north in the southern hemisphere. In all cases, the direction of the Earth's field is always pointing to magnetic north.
- From the relation Azimuth = arctan(y/x), the required magnetometer resolution can be estimated.
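A minimal sketch of that computation, assuming the x axis points along the unit's heading and the y axis to its left (the axis convention is an assumption, not specified here). `atan2` is used in place of a bare arctangent so that all four quadrants and the x = 0 case are handled:

```python
import math

def azimuth_from_field(bx, by):
    """Flat-and-level azimuth in degrees from the two horizontal
    magnetic field components (in gauss). atan2 resolves the quadrant
    that arctan(y/x) alone cannot, and tolerates bx == 0."""
    return math.degrees(math.atan2(by, bx)) % 360.0
```

Whether the result increases clockwise or counterclockwise depends on the chosen axis signs; a real device calibrates this against a known heading.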
- Solid state magnetoresistive sensors are available today that reliably resolve 0.07 milligauss signals, giving a five times margin of detection sensitivity. Often compasses are not confined to a flat and level plane. As devices of the present invention are preferably hand held, it is difficult to determine an azimuth associated with the reference or heading direction since the compass is not always horizontal to the Earth's surface. Errors introduced by tilt angles can be quite large and depend on the tilt angle.
- a typical method for correcting the compass tilt is to use an inclinometer, or tilt sensor, to determine roll and pitch angles illustrated in Figure 7.
- the terms 'roll' 71 and 'pitch' 72 are commonly used in aviation: roll refers to the rotation around a forward direction indicated in the drawing as 'X', and pitch refers to the rotation around a left-right direction indicated as 'Y' in the figure.
- Liquid filled tilt sensors use electrodes to monitor fluid movement as the sensor changes angles. Newer solid state accelerometer tilt sensors are available that measure the Earth's gravitational field by means of an electromechanical circuit. The outputs of these devices are electrical signals equivalent to angles of tilt.
- the equation for azimuth in a horizontal plane can be used to determine the azimuth with a tilt bias.
- a block diagram for a tilt compensated compass is shown in Figure 8. After the azimuth is determined, the declination correction can be applied to find true north according to the geographic region of operation.
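A sketch of the tilt compensation step, using one common rotation convention: the three-axis field reading is de-rotated into the horizontal plane using roll and pitch, the flat-plane azimuth equation is applied, and declination is added to reference true north. The axis and sign conventions here are assumptions and must be matched to the particular sensors used:

```python
import math

def tilt_compensated_azimuth(bx, by, bz, roll_deg, pitch_deg, declination_deg=0.0):
    """Azimuth in degrees from a three-axis magnetometer reading plus
    roll/pitch from a tilt sensor. Conventions are illustrative."""
    r = math.radians(roll_deg)
    p = math.radians(pitch_deg)
    # De-rotate the field vector into the horizontal plane:
    xh = (bx * math.cos(p)
          + by * math.sin(r) * math.sin(p)
          + bz * math.cos(r) * math.sin(p))
    yh = by * math.cos(r) - bz * math.sin(r)
    # Flat-plane azimuth, then declination correction for the region:
    return (math.degrees(math.atan2(yh, xh)) + declination_deg) % 360.0
```

With zero roll and pitch the expressions reduce to the flat-and-level case, as expected from the block diagram.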
- Solid state magnetic sensors 81 cooperate with tilt sensors 82 to arrive at signals which are processed digitally 83 in a computing apparatus. The entire device has been made quite small, and versions are easily integrated with common wristwatches.
- a simple dipole compass may consist of a pointer needle having a magnetic bias which is set to float in a liquid where it may freely turn about an axis. When exposed to the Earth's magnetic fields, the dipole aligns itself such that it points to magnetic north. Although a simple dipole compass is very ineffective when a telephone is held upright as shown, the compass becomes operable when the phone is held substantially level in a horizontal plane.
- Figure 10 presents a mobile telephone 101 lying in a near horizontal plane having a simple dipole compass 102 on its backside.
- a pointing needle 103 is aligned with the Earth's magnetic fields. Used in proper fashion, the telephone, and consequently the compass, is rotated about a vertical axis 104 to cause the floating needle to become further aligned with indicia 105 on the compass bezel 106 which indicates North. In this way, a user can determine the pointing attitude of the telephone and reference direction 107 with respect to Earth's magnetic North.
- a mobile telephone 111 is presented substantially in a horizontal plane with reference direction 112.
- the telephone bottom portion is configured with special mechanical interlock devices 113 whereby an add-in 115 unit may be firmly coupled to the telephone at the bottom by inserting 114.
- the add-in unit is accompanied by a simple compass 116 having pointing needle 117.
- used. Instead, it serves as a great simplification if the dipole is configured as a disk rather than a needle and is made to freely rotate in fluid and become aligned with the Earth's magnetic field. As the phone is pointed in various directions, the disk aligns itself appropriately to reveal directional information. Indicia on the disk directly reference the direction in which the mobile unit is pointing.
- a floating disk which may be integrated into an add-in module 121 allows the telephone 122 and consequently the reference direction to be pointed in a direction of interest while a disk 123 rotates about a vertical axis 124 to align indicia 125 with a reference mark 126.
- indicia shown appears in a viewing window having magnifying properties thus improving readability of very small compass devices.
- a user can enter that information by way of tactile manipulation of a keypad or other user input interface.
- A simple dipole compass as described can be affixed to the back of a telephone by a user as an 'after market' addition. A common telephone purchased at the time of this writing is easily modified with the addition of a compass so attached to its backside.
- Figure 13 shows a streamlined mobile unit 131 having a reference direction 132 and a dipole disk compass integrated into the back side of the unit housing. Indicia 133 are visible through a window 134 with a lens.
- Power, weight, screen size, et cetera are all limited by the portable nature of the devices.
- the key pad generally used with mobile telephones is limited in the number of keys available for user input.
- Because the encoding protocol used in wireless devices is aimed at cooperating with such brief keypads, user input is limited in that regard also. Therefore, a special scheme is devised whereby the directions about a compass are translated to cooperate with a 10-key keypad of a mobile unit as follows.
- Envisage the 10-key keypad superimposed upon a simple compass rose including the eight primary points of the compass directions: North, South, East, West, Northeast, Southeast, Northwest, Southwest.
- North is meant to correspond to the '2' key of the keypad which lies in the upper middle of the keypad in a standard arrangement.
- South is aligned with the '8' key, '6' being East, and '4', West. It then follows that the '7' key is Southwest, and the other keys assignments logically follow.
- this scheme, which assigns a key to each of the eight points of a compass, leaves the '5' key and the '0' key unused.
- the scheme is further arranged to provide a prefix to the points described above.
- Using the '0' key as a prefix to any of the other keys indicates the direction assigned to the key minus 25 degrees.
- the '7' key is assigned to Southwest, which is 225 degrees. If a '0' prefix is used before the '7' key, then the direction being referred to is 200 degrees.
- the '5' key is used as a prefix to indicate the value of the assigned key plus 25 degrees. Therefore, composing a '5' and a '7' causes the direction input to be 250 degrees.
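The complete keypad scheme described above can be captured in a small decoding routine. This is a sketch; the key-to-direction assignments follow the text, with the remaining keys filled in by the keypad's geometry:

```python
# Each of the eight compass points maps to the key that sits in the
# corresponding position on a standard 10-key keypad.
KEY_BEARINGS = {
    "2": 0,    # North
    "3": 45,   # Northeast
    "6": 90,   # East
    "9": 135,  # Southeast
    "8": 180,  # South
    "7": 225,  # Southwest
    "4": 270,  # West
    "1": 315,  # Northwest
}

def keys_to_bearing(keys):
    """Decode a one- or two-key input into a bearing in degrees.
    A '0' prefix subtracts 25 degrees; a '5' prefix adds 25 degrees."""
    if len(keys) == 2 and keys[0] in "05":
        offset = -25 if keys[0] == "0" else 25
        return (KEY_BEARINGS[keys[1]] + offset) % 360
    return KEY_BEARINGS[keys]
```

For example, '7' alone yields 225 degrees, '07' yields 200 degrees, and '57' yields 250 degrees, matching the worked examples above.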
- a computer processor arranged to run programming in the form of instruction sets is provided within mobile units.
- the computer processing facility includes typical supporting elements such as: memory, bus, display, input/output, power supply support et cetera.
- a general processing facility may be preprogrammed via stored code in a ROM type memory, or may be a generalized processor arranged to execute stored code as well as code received from external devices. i) Input / Output for sensors
- a computer processing facility may be configured with special means of communication between devices such as sensors and other measurement apparatus.
- programming running on the computer processing facility may include support for interrupts and messaging techniques which interact with or respond to signals present at input and output ports.
- 'WAP', a Wireless Application Protocol browser: Development of the 'Wireless Application Protocol', or 'WAP', is being driven by the WAP Forum, initially founded by Motorola, Nokia, Ericsson and Unwired Planet, now more precisely known as 'Openwave'. Since its inception, the WAP Forum has grown dramatically and now comprises over 80 members drawn from the world's leading mobile telecommunications and software companies.
- WAP is a technology designed to provide users of mobile terminals with rapid and efficient access to the Internet.
- WAP is a protocol optimized, not only for use on the narrow band radio channels used by second generation digital wireless systems but also for the limited display capabilities and functionality of the display systems used by today's mobile terminals.
- WAP integrates telephony services with microbrowsing and enables easy-to-use interactive Internet access from the mobile handset.
- Typical WAP applications include over-the-air e-commerce transactions, online banking, information provisioning and messaging. WAP will enable operators to develop innovative services to provide differentiation in competitive market environments.
- Devices of these inventions therefore may include a module known as a WAP browser.
- This browser is implemented in software and allows devices to communicate with the WAP gateway by way of wireless networks.
- While a 'device' may be thought of as consisting of hardware elements only, it may be instructive to include software as part of the device.
- Software or computer instruction code may be stored in a memory such as a RAM module which is part of the computer processing facility.
- a memory device such as a CD-ROM may be employed to run programming particular to a certain application. That an infinite number of applications are possible should not disturb the notion that a pointing device responsive to position and attitude measure is a unique invention in and of itself without regard to any particular application associated with that functionality.
- a local database can be envisioned as a database separate from a system main database.
- a specialized request is sent to a server computer who returns as a response a small portion of a primary database.
- this database may be a subset of the data stored in the primary database but may be useful in the mobile unit as a temporary database of limited extent for quick searching and other operations which do not generate a new call/request through a network.
- a system may be arranged to take a preliminary position measurement. This measurement may result in the determination that the user is in a particular city; for example, San Francisco.
- a remote server may send a data set of anticipated targets, i.e. those targets in San Francisco, to a database cache.
- When a more precise position measurement and an attitude measurement are combined to suggest an address indicator, the cache may be searched first as it has been preloaded with a primary data set. In this scheme, it is not necessary to transmit a request and response through a network; rather, the entire process would be handled within the mobile unit and with regard to a limited database held there.
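A cache-first lookup along these lines might be sketched as follows. The function names and string keys are illustrative stand-ins for real address indicators and network requests:

```python
def lookup(address_indicator, local_cache, remote_fetch):
    """Return (info, source) for an address indicator, preferring the
    preloaded local cache and falling back to a network request only
    on a cache miss. remote_fetch stands in for a request/response
    exchanged with a remote server."""
    if address_indicator in local_cache:
        return local_cache[address_indicator], "cache"
    result = remote_fetch(address_indicator)  # call through the network
    local_cache[address_indicator] = result   # retain for later searches
    return result, "network"
```

When the cache has been preloaded with the anticipated targets for the user's city, most pointing actions resolve locally and generate no network traffic at all.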
- a local database may also be set-up as an empty template which operates to receive data therein in response to actions applied to the mobile unit.
- a mobile unit may be set into a program mode to collect data in accordance with a particular scheme.
- a salesman of lawn fertilizer may be interested in generating a mailing list of customers who are particularly in need of lawn fertilizer products. While driving down a residential street, the salesman may use a mobile device of these inventions to address houses with poor quality or unhealthy lawns. Upon being addressed, mailing information associated with a particular house may be pulled from the primary database containing all houses and entered into a local database of houses to be sent an advertisement relating to lawn fertilizer.
- actions applied to a mobile unit stimulate construction of a local database. It is impossible to present an exhaustive list of such applications.
- User interfaces may be included as elements in a mobile unit. Interfaces of these inventions may be classified as either an input type interface or an output type interface. Input type interfaces are arranged to convert physical conditions external to the system into electronic signals which may be processed by the systems. Output interfaces convert system electronic signals into physical embodiments perceptible to observers and users of the systems. Electronic transducers and transducer systems are coupled to mobile unit computing processing facilities by input or output communications ports and operate as either input or output type interfaces. i) Input Interfaces Some examples of input type user interfaces include, but are not limited to: tactile switches, wheels, trackballs, keypads, touch pads, angular displacement detectors, and voice recognition systems.
- a) Tactile Switch Perhaps the most important input interface is a simple tactile switch. To perform and realize a 'click' event, a simple switch operable under influence of finger actions is arranged. Analogous to the buttons of a mouse type peripheral computer device, a tactile switch may be arranged as a button which may be easily engaged by a human finger.
- a tactile switch yields exceptional utility because it may be associated with any of a great plurality of computer actions which may be offered at appropriate times during the runtime of a computer program.
- Code may be prepared such that the computer is set to respond in a particular way whenever a click event, i.e. an operation of the tactile switch, is detected. Accordingly, best versions of systems of these
- Page Lohr Associates, P.O. Box 757, La Jolla, CA 92038 (619) 7024471 inventions include an apparatus having a tactile switch arranged to generate click events which are detected by computer processing facilities.
- An illustrative example includes the 'send' key of a mobile telephone. Upon stimulation of the send key, a mobile telephone is set into an operation wherein a connection to another telephone is created.
- a keypad may include a 'get' key which activates a processing step and forms a request for information.
- Wheels and trackballs are tactile devices providing a continuous or analog signal of increasing amplitude rather than a discrete or non-linear signal associated with a switch.
- a wheel or trackball may be associated with functions such as zoom or volume controls which are more closely associated with analog type adjustments.
- a wheel or trackball tactile device is employed to provide a continuous analog input.
- Another type of tactile device which may be used in various apparatus of these inventions is the touch pad type device.
- a touch pad allows one to drag a fingertip across a sensitive surface which provides position indication to the system.
- a button or two are placed in close proximity for the purpose of 'click', 'left-click' and 'double-click' type interactions.
- a touch screen device marries an image display screen, where information displayed thereon is associated with a position on the screen, and that position on the screen is addressable via touch actions generated with a user's fingertips. As such, 'clicks', or more precisely screen 'taps', serve as stimulus or events to launch programming. d) Angular Displacement Detection System
- Another input-type user interface includes a system to detect displacements which are rotational or angular in nature. These may be accelerometers, gyroscopes or electronic compass devices. When a mobile unit is manipulated in a predetermined manner, i.e. moved in a manner described as an angular displacement, the mobile unit may cause an associated response. For example, a display showing four options in a listbox with one option having a focus property indicated by a highlighted background. Upon wanting to choose one of the non-selected items in the list, a user can cause the mobile unit to be rotated about a horizontal axis to cause the forward
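The listbox-focus behavior described here might be sketched as follows, assuming a pitch change reported by a tilt sensor. The threshold value and sign conventions are illustrative assumptions:

```python
def update_focus(focus_index, pitch_delta_deg, n_items, threshold=15.0):
    """Move the highlighted item in an n_items listbox when the unit
    is tipped forward or back past a threshold angle, clamping the
    focus to the ends of the list."""
    if pitch_delta_deg <= -threshold:
        return max(0, focus_index - 1)            # tipped back: move up
    if pitch_delta_deg >= threshold:
        return min(n_items - 1, focus_index + 1)  # tipped forward: move down
    return focus_index                            # small motion: no change
```

A real implementation would debounce the sensor and reset the reference angle after each focus change so one sustained tilt does not scroll repeatedly.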
- Voice recognition systems may be employed to drive computer commands in a normal fashion. Because mobile telephones are well equipped with audio devices such as a speaker and microphone, some versions of these inventions will employ voice recognition to interface with the underlying computer processing facility. f) Other sensors
- Audio: audio indicators, for example buzzers and speakers, may be used to communicate with human users by way of audio cues and signals.
- a mobile unit can be set into an operational mode whereby a pan operation or scan motion allows the device to produce a 'beep' response in connection with the unit being aligned with a selected type of target.
- Mobile units of some preferred embodiments may include output interfaces for providing a visual feedback signal to a user.
- visual type output interfaces such as a simple text display may serve to provide text information.
- Preferred pixelized displays including color pixel elements are quite common and becoming very inexpensive. Even single LEDs may be appropriate for use as user interfaces in some simple versions of these inventions.
- a wireless network may be thought of as a communication link between two terminal ends.
- a mobile telephone handset forms a first of the terminal ends.
- a personal digital assistant PDA, or a simple laptop computer may describe the appliance which forms the terminal end of a wireless network link.
- a second terminal end is typically a wireline telephone, but may alternatively be: another wireless telephone handset, a computer, a PDA, et cetera. Either terminal end communicates with another via transmission of radio signals. Radio signals may propagate from a terminal end to a receiving station.
- a receiving station sometimes referred to as a 'cell site', may be connected to a wireline network. In special cases, cell sites may operate to direct certain transmissions into the Internet via a WAP gateway.
- a wireless network is sometimes referred to as 'mobile Internet' or the 'wireless web'.
- mobile units of the invention may communicate with server computers in agreement with wireless protocol presently in service.
- While WAP is presently the leading technology, it is not an essential element whereby its absence would cause defects in any of the devices suggested here. It is hereby acknowledged that WAP is designed and directed to the second generation of wireless networks and simple display screens. It is not yet certain that the protocol used in newer wireless networks such as UMTS or HDR will bear the name 'WAP'. It is however certain that some standard will prevail and that that standard will allow wireless devices to communicate with application servers connected to the Internet. A good faith effort to meet 'best mode' requirements suggests that a detailed description of WAP be provided herein.
- mobile units convey requests which include a description of the physical state of the mobile unit, in particular position and attitude information, and that information is used in execution of a special database search to retrieve data relating to objects having a spatial relationship with the requesting mobile unit.
- Wireless networks of these inventions can be set-up to include special function facilities.
- While generalized hardware in a wireless network may include transmitters, computers and wireline interconnects, specialized hardware may be integrated to perform certain special functions.
- a terminal in the network may include machinery which can be triggered to perform a desired task.
- Upon designation of a command to open and flood a portion of the lock, the wireless network transmits the request to the special function facility, i.e. a pump and gate system, and the lock is operated without the attention of the lock master.
- This example illustrates how a wireless network terminal in the form of a special function facility may cooperate with devices of the invention to allow a user to choose and operate a machine remotely by merely pointing at the system and interacting with a graphical user interface.
- Point-to-call Facility Another example of a special function facility of particular importance is herein referred to as a 'point-to-call' facility. Mobile units placed in a certain operational mode may trigger a request which is transmitted to the wireless network.
- This request directs the wireless network to place a telephone type connection to any telephone in the world.
- a user may initiate a telephone call.
- the portion of the wireless network which operates to process these types of requests may be considered the special function facility.
- Well trained wireless engineers will note that these requests may be captured and processed without need to install additional equipment; i.e. the wireless computers in place today may be arranged to handle 'point-to-call' requests from mobile customers.
- a server computing unit of these inventions is charged with tasks including handling requests from a client unit.
- a mobile unit or handset unit transmits a request over a wireless connection to a remote server computer which hosts a database and modules to handle such requests.
- a server computer which hosts a database and modules to handle such requests.
- server computers are arranged to receive and process requests.
- this task includes receiving address indicator information, forming a database query, performing a database search, receiving search results, and transmitting those results to the requester.
- server computers of these inventions include provision for receiving requests, means for executing programming code, means of forming connections and communications with databases, and means for transmitting results back to a requester.
- Special components of server computers may include a special function facility in the form of a programming module arranged to perform a particular desired function. For example, a server may be set-up to log all activity of a particular user. Where security requires records be kept of system transactions, a special function facility may be set to record transactions made by selected users.
- a server computer in normal operation receives requests, performs database searches, transmits results, and carries out the special function of logging the transactions taken.
- a database is considered an essential element of devices of the invention.
- A highly distinctive data structure is formed where general information relating to an object is connected to and associated with a multi-dimensional spatial description of the object. This connection between information and spatial description allows the database to be searched in a manner which allows data to be recalled in response to an alignment of a handheld unit with regard to the spatial description. This important aspect should not be overlooked as it remains a key to a full understanding of these
- the essential 'glue' is the association between a geometric descriptor which describes a space associated with an object and general information relating the object. In this way, information may be recalled in response to a test for the condition whereby the address state of a mobile unit forms an intersection with a geometric descriptor.
- an application server 141 having a search module 142 cooperates with database 143. Data may be kept in a matrix of high order whereby a record 144 relates to a unitary object comprising at least a geometric descriptor element 145 and additional multi-media data elements 146.
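The record structure suggested above, coupling a geometric descriptor element with multi-media information elements, might be sketched as follows. The field names are illustrative assumptions:

```python
# A minimal sketch of a database record of the kind described: each
# record couples a geometric descriptor with 'information elements'.
from dataclasses import dataclass, field

@dataclass
class GeometricDescriptor:
    # an axis-aligned box given as (min_corner, max_corner)
    # in some reference coordinate system (assumed representation)
    min_corner: tuple
    max_corner: tuple

@dataclass
class ObjectRecord:
    name: str
    descriptor: GeometricDescriptor          # element 145
    information_elements: dict = field(default_factory=dict)  # elements 146

bridge = ObjectRecord(
    name="Golden Gate Bridge",
    descriptor=GeometricDescriptor((0.0, 0.0, 0.0), (2.7, 0.03, 0.23)),
    information_elements={"text": "Suspension bridge spanning the Golden Gate",
                          "audio": "foghorn.wav"},
)
```

The information elements here hold multi-media data of the kinds listed in the text (audio, video, text, graphics, and so on).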
- a geometric descriptor is a definition of the spatial extent which is associated with an object. For example, a certain building on a particular city block may be said to occupy a cubic shaped space which may be specified mathematically. Thus the building is said to have a geometric descriptor associated therewith which defines the space substantially occupied by the building.
- In addition to a geometric descriptor, a database record also has multi-media data associated with the object. Digitally recorded information, such as audio files known as '.wav' or midi, video files such as MPEG, simple text lists, text fields, graphics, photographs, and control objects, among others, are examples of multi-media data which may be included in an object record as 'information elements'.
- both information elements and geometric descriptors may be arranged in hierarchical datasets.
- a single field element of any record may be defined as another record containing a plurality of field elements.
- data and data structures may be arranged in a 'nested' fashion without
- Mobile units are set into various operational modes via computer code running on the computing facility. Because programming code is highly dynamic and easily changed from one application to another, mobile units may be operated in various operational modes.
- a mobile unit computing facility includes programming directed to providing adjustments in functionality whereby various subsets of instructions are selectably executed, each corresponding to a different 'operational mode' or application specific arrangement.
- a system user may, by keypad entry or other control, change the operational mode of a mobile unit from a currently running mode to another mode, in accordance with various user desires which change from time-to-time. Operational modes may also change automatically. Certain device arrangements may provide for a change in operational mode when a mobile unit is carried into a region, as detected by a position determining means, where actions of interest are anticipated in view of the current position of the mobile unit.
- a sailboat racing operational mode may include a menu list of functions which are not useful, indeed not applicable for air travelers.
- an automatic change in the operational mode may be stimulated in response to a change in position of the mobile device without explicit cue from a user.
- the most preferred methods of these inventions include the following steps illustrated in the block diagram of Figure 15: determine address state 151; form request 152; transmit request 153; and take action 154. More precisely, a determination of the address state of a mobile unit is made; a request which relates to the address state is formed in accordance with the current operational mode; the request is transmitted to a server running application software; and an action based upon the received request is taken.
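The four steps of Figure 15 might be sketched as a simple pipeline. Each function body is a stand-in assumption; only the step names follow the text:

```python
# Sketch of the four-step method: determine address state (151),
# form request (152), transmit request (153), take action (154).

def determine_address_state():
    # position and attitude would come from GPS / compass hardware;
    # the values here are placeholders
    return {"position": (37.81, -122.48), "attitude": {"heading": 280.0}}

def form_request(address_state, mode="default"):
    # the request relates to the address state and the operational mode
    return {"mode": mode, "address_indicator": address_state}

def transmit_request(request):
    # in a full system this crosses the wireless network to a server
    return request

def take_action(request):
    # the server would search the database and act on the result
    return "searched objects addressed in mode '%s'" % request["mode"]

state = determine_address_state()      # step 151
request = form_request(state)          # step 152
delivered = transmit_request(request)  # step 153
result = take_action(delivered)        # step 154
```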
- While this brief description serves well as a guideline, a more complete understanding will be realized in consideration of the following more detailed presentation.
- Mobile unit devices of these inventions are said to have an 'address state'.
- an address state is defined by physical conditions in which a mobile unit exists, and in particular, conditions with respect to certain prearranged references sometimes including a position reference and a direction reference.
- a mobile unit's address state may be specified by a data set herein referred to as an 'address indicator'.
- a computer processor executes four substeps and may repeat that execution.
- a template which suggests which parameters are necessary for a currently running application, and how those parameters are to be determined is received 163 as input 162.
- values for all required parameters are collected in a 'collect parameters' substep 164.
- These parameter values are then combined 165 into a single data set to form the address indicator.
- the address indicator is passed as output 168 of the 'determine address state' step to the request generator.
- the substeps may be repeated via a loop 167 command which causes each step to be reexecuted in turn.
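The substeps of Figure 16 might be sketched as follows: receive a template (163), collect parameter values (164), combine them into an address indicator (165), emit it as output (168), and optionally loop (167). The template contents and the measurement stub are assumptions:

```python
# Sketch of the 'determine address state' substeps. A template names
# each required parameter and how it is to be obtained.

def determine_address_state(template, measure, cycles=1):
    indicators = []
    for _ in range(cycles):  # loop 167: re-execute each substep in turn
        # collect parameters (164): measure where required, else use
        # the value given in the template
        values = {name: (measure(name) if how == "measure" else how)
                  for name, how in template.items()}
        indicators.append(values)  # combine (165) and output (168)
    return indicators

template = {"position": "measure", "attitude": "measure",
            "range": (0.5, 5.0)}  # assumed default range gate
stub_readings = {"position": (37.8, -122.4), "attitude": 280.0}
readings = determine_address_state(template, measure=stub_readings.get)
```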
- an address indicator is a description of the physical nature of a mobile unit, the address indicator may include many parameters. Values for these parameters may be found via several approaches, including at least: physical measurement;
- Inputs 172 include instructions relating to which parameters are to be collected and how to collect them.
- Physical measurement 173 techniques include those such as performing a position measurement with a global positioning system, GPS. In a time difference of arrival scheme, radio signals received from orbiting satellites form the basis of this physical measurement system. Other parameters may also be found by electronic measurement apparatus. Defaults may be set in advance and applied 174 in the 'determine address state' steps. For example, a range gate setting may specify that objects subject to address will always lie further than 0.5 miles and closer than 5 miles. In this regard, part of the description of a mobile unit address state includes a preset default parameter.
- User inputs may also be used in routines executed in the 'determine address state' step.
- a user may operate a computer interface to provide 175 values for any of the parameters which make up an address indicator. In some versions, this may be viewed as an override where a sub-system measures the parameter but a subsequent user entry replaces the measured value.
- Logical deduction routines 176 may be executed which derive a value for an address indicator parameter in response to assumptions coded into the logic routines. For example, where a series of events suggests to the computer that a certain activity is happening, the computer may adjust parameters to provide a more favorable address indicator which facilitates recall of the best information relating to the detected activity. Each of the above mentioned techniques may be employed in combination to arrive at values for parameters which make up the address indicator and form an output in the 'collect parameters' substep.
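The four sources of parameter values (measurement 173, defaults 174, user entry 175, logical deduction 176) might be combined as sketched below. The precedence ordering, with a user entry overriding a measured value as described above, is otherwise an assumption:

```python
# Sketch of the 'collect parameters' substep: a value may come from a
# user entry (175), a physical measurement (173), a logical deduction
# routine (176), or a preset default (174), tried in that order.

def collect_parameter(name, measured=None, default=None,
                      user_entry=None, deduce=None):
    if user_entry is not None:   # user override (175)
        return user_entry
    if measured is not None:     # physical measurement (173)
        return measured
    if deduce is not None:       # logical deduction (176)
        return deduce(name)
    return default               # preset default (174)

# the range gate defaults to the 0.5-5 mile window mentioned in the text
range_gate = collect_parameter("range_gate", default=(0.5, 5.0))
# a user entry replaces the measured heading value
heading = collect_parameter("heading", measured=281.5, user_entry=270.0)
```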
- Figure 18 illustrates a request generation module 181 having input 182 from the determine address state step. An address indicator is received 183 and a request is formed 184 in agreement with values of address indicator parameters and further in view of a current operational
- a computing module is the basis of a request generator.
- the computing module executes instructions to formulate a brief data set which may be transmitted to a server as output 185 of the prepare request module.
- the request generator may produce a request in the form of a uniform resource locator, a URL.
- requests of these inventions may be a well known and highly used format of usual Internet protocol.
- a request may include datasets particular to systems of the invention and not related to other Internet techniques.
- a system can be configured to use the XML standard where data objects are described in a document type definition such that receiving applications of any platform can properly recognize the information.
- a request is prepared in agreement with the operational mode and the parameters which an address indicator includes.
- the 'build request' module 191 may be further described as follows with reference to Figure 19, a block diagram drawn to that process.
- Inputs 192 to the build request module include both instructions from any active operational mode, and information received as an address indicator.
- the operational mode suggests the format of the request to be built.
- the request therefore may include portions known as a header, data specific elements, scripts, user IDs, and return type address information.
- a request is built by assembling these components in agreement with formatting rules inherited from an operational mode program.
- the substep includes actions: set forth a header 193, insert address indicator data 194, apply scripts 195, and prepare return ID information 196. Together these components form the output 197 of the build request module.
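Since a request may take the form of a URL, the build-request substeps of Figure 19 might be sketched as follows. The host name and parameter names are hypothetical:

```python
# Sketch of the 'build request' substeps: set forth a header (193),
# insert address indicator data (194), apply scripts (195), and
# prepare return ID information (196).
from urllib.parse import urlencode

def build_request(address_indicator, user_id, script=None):
    header = "https://pointing-server.invalid/query"  # assumed host (193)
    params = dict(address_indicator)   # insert indicator data (194)
    if script:
        params["script"] = script      # apply scripts (195)
    params["return_id"] = user_id      # return ID information (196)
    return header + "?" + urlencode(params)

url = build_request({"lat": 37.81, "lon": -122.48, "heading": 280},
                    user_id="unit-42")
```

The resulting string is in the well known URL format of usual Internet protocol, as noted in the text.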
- requests are then passed to or transmitted to a server computer in a 'transmit request' substep 201 where requests are received as inputs 202.
- mobile units may have requests prepared as encoded messages. These encoded messages can be transmitted to remote server facilities via wireless transmission means.
- a radio transmission originates in a mobile unit, propagates through space to a base station including a high data rate radio receiver, is routed via a WAP internet gateway, via landline, i.e. copper or fiber network, and
- a server computer may be running within the mobile unit and a 'transmit request' step merely suggests passing request parameters into processing routines resident on a local device or devices. Accordingly, the step is complete and well performed whenever an existing request finds its way from a request forming module to a server configured to process such requests. Combinations of these are also anticipated.
- a server may be a local server with a limited data set, it can handle certain requests while others are passed into the network with the destination of a remote server. Appropriate routing of requests is handled as part of the transmit request step.
- the transmit request step may be envisaged as including two steps as follows: a select appropriate route step 203, and an execute transmission sequence 204 step.
- the output 205 of the transmit request module is a request having been passed to a server.
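The route selection of the transmit request step, where a local server with a limited data set handles certain requests and others are passed to a remote server, might be sketched as follows. The criterion used to split the two routes is an assumption for illustration:

```python
# Sketch of the 'transmit request' substeps: select an appropriate
# route (203), then execute the transmission sequence (204).

def select_route(request, local_topics=("home", "vehicle")):
    # a local server with a limited data set handles certain requests;
    # the rest are passed into the network toward a remote server
    return "local" if request.get("topic") in local_topics else "remote"

def transmit_request(request):
    route = select_route(request)  # step 203
    # step 204 would hand the request to the chosen transport here;
    # this sketch simply reports the routing decision
    return route, request

route, _ = transmit_request({"topic": "home", "command": "open door"})
```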
- Requests are processed 211 in a manner regulated by a stratagem set forth in an application.
- substeps 'receive request' 213, 'execute instructions' 214, 'search database' 215, 'take action' 216, and 'form reply' 217 are performed. Although these substeps may have variations due to differences in operational modes, their general nature will be fully appreciated.
- requests are received at a server.
- a server is configured, in some cases, to receive requests from many mobile units.
- receiving a request includes managing a plurality of transactions with a plurality of requesting parties and the computing overhead associated therewith. Request handling and management services permit the process request module to address these complex transactions.
- a received request may have therewithin a script or computer code unit which instructs the server to behave in a modified way or to process a function to arrive at some result.
- a part of a 'process request' step includes special processes to be run at the server.
- Sometimes such function will be employed to shift a computing load to the server thereby freeing the mobile unit processor from load which may be difficult to handle there.
- Another occasion where special instructions may be used prior to a database search is when a user has indicated that special data
- One processing action taken in all versions of these inventions is a 'database search' based upon information in the request which relates to the address indicator.
- This search is generally used to produce a result set of information relating to objects being addressed by the mobile unit.
- a primary objective of any database search of these inventions is to determine objects which are being addressed and to retrieve information which relates thereto.
- an object is said to be 'addressed' whenever the address indicator which describes a device's physical nature forms an intersection with any portion of an object's geometric descriptor.
- actions are taken where the actions may depend upon the specific objects being addressed. In most instances, actions depend upon information relating to the addressed objects as recalled from the database. These actions may be taken at the server, at the mobile unit, in places therebetween, or in completely unconnected locations.
- drawing Figure 22 illustrates the major steps of a database search 221.
- Inputs 222 prepared in prior processing are received into the search module. These inputs may be in a strict or highly regulated form. For example, certain databases have a language which cooperates with retrieving select information from the database via a database 'query'. For example, Structured Query Language, or SQL, specifies a form that can be run against any database schema complying with the language rules. An input to the database step of this section therefore may be a highly structured database query string. This string is prepared in agreement with any
- a SQL string is processed in an 'examine records' step 223.
- Records may exist in a database in a plurality of tables where some tables may have known relationships with respect to others.
- a primary table may have information recorded therein which relates to fundamental properties of objects which are further common to all objects, for example, data fields such as 'date created', 'information source', and 'expiration date', et cetera.
- Other tables may be arranged and connected to a primary table whereby those tables contain information in a structure which applies only to a class of object. For example a restaurant class object may contain data fields: 'food category'; 'quality rating'; and 'price range'.
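The table layout described above, a primary table holding fields common to all objects joined to a class table holding restaurant-specific fields, might be sketched with an in-memory database. The column names follow the examples in the text; the join key and sample values are assumptions:

```python
# Sketch of a primary table plus a restaurant class table, joined on
# an assumed object_id key.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE objects (
        id INTEGER PRIMARY KEY,
        name TEXT,
        date_created TEXT,
        information_source TEXT,
        expiration_date TEXT
    );
    CREATE TABLE restaurants (
        object_id INTEGER REFERENCES objects(id),
        food_category TEXT,
        quality_rating INTEGER,
        price_range TEXT
    );
""")
db.execute("INSERT INTO objects VALUES "
           "(1, 'Trattoria', '2001-01-01', 'survey', '2002-01-01')")
db.execute("INSERT INTO restaurants VALUES (1, 'Italian', 4, '$$')")
row = db.execute("""
    SELECT o.name, r.food_category
    FROM objects o JOIN restaurants r ON r.object_id = o.id
""").fetchone()
```

An object of another class, such as a baseball stadium, would simply have no row in the restaurants table.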
- Objects belonging to a class different than the restaurant class may not find those fields applicable.
- An object such as a baseball stadium would not have any data relating to 'food category'.
- a SQL command iterates through all records which may be of interest in view of conditions which arise in an operational mode. This of course means that a plurality of tables, as well as a plurality of records, may be addressed while the SQL command is being executed. A well guided iteration through the database information occurs in the 'examine records' step.
- a step to detect intersection 224 considers a data record's geometric descriptor in view of conditions defined in the search command to detect an intersection with the address indicator. When an intersection occurs, the object is marked as a 'hit' object, or in other words, an object currently being addressed by the system.
- data associated with the hit object in a plurality of tables is marked for recall and may be placed into a dataset type container.
- a database search continues after finding a hit object. Records are examined one at a time in turn, as directed by the query command; each additional target identified as an addressed target has information relating thereto recalled and
- the search is concluded by emitting an output 226 which includes a completed dataset with information relating to all hit objects.
- the output may also include markers which may be used in computer processing routines for management of related operations. For example, a marker may be included to describe the success, failure or completeness of the search or to describe the dataset such as the total number of records found. These indicators are produced as a clean-up method which may be performed as database search overhead.
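The search loop above, examining records (223), detecting intersections between the address indicator and each geometric descriptor (224), and emitting a dataset with summary markers, might be sketched as follows. A two-dimensional ray against axis-aligned-box test stands in, as an assumption, for the general intersection check:

```python
# Sketch of the database search: a pointing ray built from position
# and heading is tested against each record's box-shaped geometric
# descriptor using the standard slab method.
import math

def ray_hits_box(origin, heading_deg, box):
    """Does a ray from `origin` at compass-style `heading_deg`
    (0 = north, measured clockwise) intersect the axis-aligned box
    ((xmin, ymin), (xmax, ymax))?"""
    dx = math.sin(math.radians(heading_deg))
    dy = math.cos(math.radians(heading_deg))
    (xmin, ymin), (xmax, ymax) = box
    tmin, tmax = 0.0, float("inf")
    for o, d, lo, hi in ((origin[0], dx, xmin, xmax),
                         (origin[1], dy, ymin, ymax)):
        if abs(d) < 1e-12:
            if not (lo <= o <= hi):   # ray parallel to this slab
                return False
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            tmin = max(tmin, min(t1, t2))
            tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

def search(records, origin, heading_deg):
    # examine records (223), detect intersection (224), collect hits,
    # and emit the dataset with summary markers as search overhead
    hits = [r for r in records if ray_hits_box(origin, heading_deg, r["box"])]
    return {"hits": hits, "count": len(hits), "status": "success"}

records = [{"name": "market", "box": ((10, 10), (12, 12))},
           {"name": "stadium", "box": ((-5, 8), (-3, 9))}]
result = search(records, origin=(11, 0), heading_deg=0.0)  # pointing north
```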
- Actions may include a wide range of tasks and operational execution which may be initiated by a computer. This is very important because in some cases it is not the computer which actually performs the task but rather the computer merely commands that it be done. For example, where an object being addressed is the garage door of a person's home, the action to be taken may be to open the door. Thus the computer may generate an 'Open' command and pass it to the door opening machinery. In this sense, the door opener is not an integral part of the system, but is a system in communication with devices of the inventions whereby it may receive commands therefrom.
- An action module receives database search results including the dataset as inputs 232.
- the action module receives 233 the dataset and information relating to objects being addressed for processing therein. This information may be used in or to control actions taken in the module.
- Tests may be performed to determine properties of the data contained in the dataset. As part of a take action process, tests are performed 234 against the dataset and the information contained therein. The results of these tests may suggest precisely which actions are to be taken. For example, a test may be performed to determine whether some of the objects being addressed by a user are objects which cannot be readily seen by the user due to the object's position behind another object in the user's line of sight or view. In this case, i.e.
- a certain action may be triggered to provide a graphical representation of the object in relation to other nearby objects, a map. This is further described in a prior U.S. application having serial number 09/384,469. Other tests may be executed to
- determine the true nature and state of the dataset and these test results may be used to trigger various actions.
- an instruction set may be called and executed in an 'execute instructions' 235 step. These instructions may produce a result locally at the server or may cause an external operation to be triggered.
- the server performs a 'build reply' 236 step.
- An appropriate reply is prepared in view of data recalled and in further view of any instruction particular to an operational mode running on the system.
- the reply may include data elements and further organization of said elements as suggested by templates which may be particular to an application.
- a reply template may command that an XML reply be sent where the document type definition of the XML reply is used to arrange recalled data in a fashion whereby it can be well received in the requesting client.
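The build reply step, arranging recalled data as XML for the requesting client, might be sketched as follows. The element and attribute names are illustrative assumptions, not a published document type definition:

```python
# Sketch of the 'build reply' step (236): wrap the hit objects from
# the database search in a simple XML document.
import xml.etree.ElementTree as ET

def build_reply(hits):
    reply = ET.Element("reply")
    for hit in hits:
        obj = ET.SubElement(reply, "object", name=hit["name"])
        ET.SubElement(obj, "info").text = hit.get("info", "")
    return ET.tostring(reply, encoding="unicode")

xml = build_reply([{"name": "Golden Gate Bridge",
                    "info": "Suspension bridge"}])
```

The resulting string would then be passed to the transmit reply step (237) for routing back to the requesting unit.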
- the 'Take Action' step ends in a 'Transmit Reply' 237 step where the reply is passed as output 238 into a routing algorithm and sent back to the unit where the request was initiated.
- a user manipulates a mobile unit to cause it to point at an object of interest. Since a mobile unit may be encased in an elongated shell with an easily discernable pointing reference, a user may simply hold the mobile unit while moving the wrist to cause the device to address an object.
- a user would merely align the antenna of her mobile telephone so that it points to an object such as a supermarket. In this way, a user effects the step 'addressing an object'.
- the user may indicate to the computer that the addressed supermarket object is to be processed further.
- a user may stimulate a click event while the supermarket is being addressed.
- a click event is one whereby a user operates a switch to provide indication to the computer.
- a click event may cause the computer to divert into an instruction set where any of a great plurality of processes may occur.
- the essence of this step is independent of the action taken. The act of providing an indication to a computer while simultaneously addressing an object of interest, thereby setting the computer into any action relating to the particular object being addressed, is fundamental to this step.
- the template includes a listing of the elements required in the application, and also provides default values for elements where appropriate. For elements requiring values but where
- an address indicator value may be left empty, set by default, set by reference, or measured.
- an address indicator template can be modified in agreement with user inputs. For example, in a mapping application, a user may only be interested in objects which are relatively near the user's location. Accordingly, a 'range' parameter set by default to eight miles may be changed to three miles in an interactive procedure whereby a user resets the value for the range parameter in the template. Other parameters can receive values in their respective ways; either omission, measurement, or reference.
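The template modification described above, where a default eight-mile range is reset to three miles by user input, might be sketched as follows. The template keys are assumptions:

```python
# Sketch of modifying an address indicator template in agreement with
# user input; parameters not overridden keep their default or
# measurement-directive values.

template = {"position": "measure", "attitude": "measure", "range": 8.0}

def apply_user_input(template, **overrides):
    updated = dict(template)   # leave the original template unchanged
    updated.update(overrides)  # user entries replace defaults
    return updated

narrowed = apply_user_input(template, range=3.0)
```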
- a server produces a response which sets forth or triggers an action.
- Actions may be widely varied in their embodiments, but generally they may be classified with regard to where the action occurs. By illustration, various types of actions are described. One will appreciate the exact number of different actions which may be taken is unlimited and no attempt is made here to catalogue them. It is sufficient to say that any action which can be set into being is contemplated as being part of these inventions so long as it is done so in view of the preceding and following steps. Although sometimes an action is taken entirely within a server, in other instances the action may be taken up outside the server.
- Action taken in Server Sometimes an action is one which can be taken entirely within the server. Actions taken at the server may include, by way of example, performing special operations on the result set produced in the database search. A result set may be modified and updated and returned to the database. A record recalled in the database search may be updated to reflect changes to objects which occur from time-to-time.
- a request transmitted from a mobile unit may include instructions which cause records associated with a certain geometric descriptor (being addressed) to be changed to reflect the termination of the business.
- the database record requires an update. Therefore a request process may include instruction to take action within the application server or a database connected to said server.
- An action taken by a server may include one whereby the action is within a group of related actions performed serially in view of a plurality of requests.
- the server may include a module configured and arranged to construct an activity history with regard to a particular client or group of clients.
- a business advertising executive user may set forth on a journey to document billboards in ideal locations for advertising products of concern to the business. While driving about a city, the executive chooses preferred billboards, pointing and clicking a mobile device toward each chosen billboard.
- the server can be arranged to build a data set of chosen billboards adding each to the list as it is addressed and identified via the address indicator and database search. This illustrates how server actions to a group of requests are processed at the server to yield a useful product.
- the database search produces the identity of the addressed object and that identity is added (as one of a group of server actions) to a data set which forms documentation desired by the user.
- Requests may be of a form recognized by the server to cause them to be handled in view of special procedure.
- a group of people belonging to a certain social club and registered as such may cause requests to be sent which notify the server of the requestor's present location. Any member of the club could then ask the server to expose the locations of other club members in order that meetings are more easily and frequently brought about.
- each incoming request from any member of the group causes a server action to be taken whereby the server updates a list of club member locations.
- a server may produce a command and convey that command to the wireless network whereby an action is taken there.
- a good example of this case includes what is described herein as a 'point-to-call' function.
- Users of mobile units can find themselves in the position of wishing to contact the entity residing in some building of interest.
- the building, being addressed by the user has contact information in the database.
- the server passes a command into the wireless network to place a telephone call to the addressed entity.
- a server may convey a command to a mobile unit, or a plurality of linked mobile units, to effect an action at the mobile unit.
- devices of these inventions may operate in a manner including such action at a mobile unit.
- a person who is 'it' attempts to locate other players who are hiding. By using a mobile unit of these inventions to point-and-click on various locations where players may be hiding, the user causes a request to be sent to the server where a response includes a command to alert the user as to the status of the point-and-click action in relation to the game scheme. If an opponent player is hiding in the subject location (addressed location), a server response includes a command to drive an alert signal at the mobile unit. For example, an audio 'Buzz' sound can be provided to indicate a failed attempt to find a hiding person while a 'BeepBeepBeep' sound can be provided to indicate a successful attempt to discover a player's hiding location.
- game versions of these inventions illustrate where actions can be taken at the mobile unit portion of systems via a command sent from the server computer. iv) Action at Addressed Object
- An action may be taken at the object being addressed.
- an application is arranged to provide a command from the server to the object being addressed in order that an action be taken there.
- information concerning contact information for restaurants being addressed is recalled.
- a server may contact a subject restaurant by telephone, alternatively by e-mail, or even dynamic web page interaction, to cause a reservation to be made with all
- systems of these inventions include reporting applications where a user points-and- clicks on an incident scene to cause a report to be generated and transmitted to appropriate authorities.
- a server causes a report to be logged at a city facilities unit.
- the railway operations people are provided an alert at their central office.
- authorities appropriate for handling a response to those types of emergencies are contacted.
- a geometric descriptor includes the combination of a mathematical definition of a geometric construct or body and an association with an object.
- a certain building on a particular city block may be said to occupy a cubic shaped space which may be specified mathematically in some reference coordinate system.
- the association with the building (an object) together with the specified shape (mathematical definition) forms a 'geometric descriptor'.
- the geometric descriptor defines the space substantially occupied by the building.
- a mathematical definition of a shape and location alone cannot form a complete geometric descriptor without a connection to some object.
- all geometric descriptors are comprised of at least a description of some spatial extent, a precise position specification and an association with an object.
- Geometric descriptors of these inventions may be set and preloaded into a database as a field type data element; i.e. part of a record.
- a geometric descriptor may be a simple shape without complex detail; in other conditions, a geometric descriptor may include considerable detail with great attention to precise and intricate geometries.
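The parts required of every geometric descriptor named above (a spatial extent, a precise position specification, and an association with an object) can be sketched as a simple record type suitable for storage as part of a database record. The class and field names below are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class GeometricDescriptor:
    """A spatial extent, anchored at a precise position, associated with an object."""
    object_name: str   # association with an object (required; shape alone is not enough)
    lat: float         # precise position specification
    lon: float
    alt_m: float
    extent: tuple      # spatial extent, e.g. (width, depth, height) in meters

# A cubic-shaped space occupied by a certain building on a city block:
building = GeometricDescriptor("Example Tower", 32.8328, -117.2713, 10.0,
                               (30.0, 30.0, 90.0))
```

In a simple case the extent is a coarse box; a high-detail descriptor (as for the bridge of Figure 27) would replace the tuple with a richer mesh or composite of shapes.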
- Figure 24 contains an image of interest including several objects, specifically the San Francisco Bay 241, the sky above Marin 242, and the famous Golden Gate Bridge 243.
- Geometric descriptors may be configured and arranged for each of these objects.
- An example of a geometric descriptor having an association with the Golden Gate Bridge is presented in the perspective of the image viewpoint in Figure 25. Again, the image contains the bay 251, the sky 252, and the bridge 253.
- the image shows a graphical representation of a geometric descriptor associated with the bridge and superimposed thereon.
- a heavy black outline 254 suggests the periphery of such geometric descriptor. It will be understood that the geometric descriptor actually extends beyond the limits of the image 255.
- While the precision and level of detail of the geometric descriptor shown in Figures 24 - 26 are moderate, it is easy to imagine that an object can be more precisely modeled, and thus a geometric descriptor of considerable detail may be formed for the same object, the Golden Gate Bridge. This becomes necessary in some applications where high resolution demands precise definition of the spatial extent occupied by an object.
- Figure 27 shows a different geometric descriptor which can be associated with the Golden Gate Bridge in applications requiring extra detail. It is noted that the geometric descriptor carefully accounts for the roadway 271, the tower 272, the base members 273 and 274, and finally, the distant base 275.
- FIG. 28 illustrates this without ambiguity.
- a user points a device of these inventions 281 towards awnings 282 and 283, and a banner 284 presented as advertisement by a restaurant.
- These awning and banner objects may be included in a group of geometric descriptors associated with the Karl Strauss Brewery & Grill of La Jolla. They may be envisaged more clearly via the presentation of Figure 29.
- the device is shown as 291, the awnings as 292 and 293, the banner being 294.
- a pointing vector is shown as dotted line 295.
- geometric descriptors which describe a physical building
- additional geometric descriptors may also be associated with the same object, i.e. the restaurant.
- by pointing at either of the geometric descriptors, a user causes the restaurant to be addressed.
- a computer may take an action whereby the user receives a menu of the afternoon specials presented on the display of the device for convenient review.
- a geometric descriptor may include geometric shapes which are not three dimensional but rather infinitely thin. This is the case for the banner which may have a geometric descriptor that is planar in nature. Thus a geometric descriptor is not always descriptive of a space but may also describe planar, linear, or even a point geometry.
- A housing domain may exist which is comprised of five separate buildings, each building having four single family units, each single family unit having three rooms therein.
- the domain may have a master geometric descriptor; each building may also have a geometric descriptor, that geometric descriptor being a slave to the master geometric descriptor associated with the domain; each single family unit likewise has a geometric descriptor which is said to 'belong to' the geometric descriptor associated with the building, and further to the geometric descriptor associated with the domain.
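The master/slave ('belongs to') relation among descriptors can be modeled with a simple parent link. This is a hypothetical sketch of the five-building domain just described; the class and naming scheme are invented for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Descriptor:
    """A geometric descriptor with a link to the descriptor it 'belongs to'."""
    name: str
    parent: Optional["Descriptor"] = None

domain = Descriptor("housing domain")   # master geometric descriptor
buildings = [Descriptor(f"building {i}", parent=domain) for i in range(1, 6)]
units = [Descriptor(f"{b.name} unit {j}", parent=b) for b in buildings for j in range(1, 5)]
rooms = [Descriptor(f"{u.name} room {k}", parent=u) for u in units for k in range(1, 4)]
print(len(buildings), len(units), len(rooms))  # 5 20 60
```

Following the parent chain from any room reaches its unit, its building, and finally the master descriptor of the domain.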
- Figure 30 depicts a user 301 of a system of these inventions.
- a hand-held mobile unit device 302 is pointed in a direction indicated by arrow 303 towards a building which houses several unrelated businesses. It is easy to appreciate that the user's position as described by latitude, longitude and altitude values is well defined. It is similarly easy to appreciate that the pointing direction corresponds to a compass heading roughly south, or more precisely 254°.
- FIG. 31 shows the young boy 311 equipped with a mobile unit of these inventions 312 pointing in a direction indicated by arrow 313 below the horizon indicated by dotted line 314 towards the juice bar.
Altitude
- In high precision devices of these inventions, it is not enough to merely have latitude, longitude, and heading information. This is due to the fact that two systems may have identical latitude and longitude values while pointing along the same compass heading, but have very different altitude and pitch values.
- Figure 32 shows a woman 321 using a mobile device 322 positioned directly above a lady 323 with a similar hand-held device 324, both pointing, as indicated by arrows 325 and 327 respectively, to the same object, the 'Mr. Juice' juice bar.
- the altitude and pitch angles greatly affect the outcome of a determination of an address state and corresponding database search for addressed objects.
- attitude determining means which includes a pitch sensor and a GPS type positioning means which includes an altitude determination.
- an 'address state' is a description of the pointing nature of a device.
- Some versions of these inventions may include address state parameters as follows: position, which is preferably described as latitude, longitude, and altitude measures; and attitude, which is preferably described as heading, pitch, and roll. Although in simplified versions position and attitude may be sufficient to completely describe an address state of a certain device, other versions may include additional description of an address state.
- Page Lohr Associates, P.O. Box 757, La Jolla, CA 92038 (619) 702-4471
- Figure 33 shows a point reference and direction reference represented by point 331 and arrow 332.
- the point is arranged at the endpoint of the direction vector for convenience.
- a position determining means is coupled to the point reference represented by the point 331 and makes a measurement of the location including latitude, longitude and altitude of the point.
- an attitude determining means is coupled to the direction reference represented by the arrow 332 and makes a measurement of the pointing nature including compass heading, pitch and roll of the direction reference.
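The address state parameters enumerated above (position as latitude, longitude, and altitude from the position determining means; attitude as heading, pitch, and roll from the attitude determining means) might be collected in a record like the following sketch. The example values echo the 254° heading and the 23° pitch mentioned elsewhere in the text; the class and field names are illustrative.

```python
from dataclasses import dataclass

@dataclass
class AddressState:
    # Position, from a GPS-type position determining means
    lat: float       # latitude, degrees
    lon: float       # longitude, degrees
    alt_m: float     # altitude, meters
    # Attitude, from compass / tilt-sensor attitude determining means
    heading: float   # degrees clockwise from north
    pitch: float     # Tilt X, degrees above the horizon
    roll: float      # Tilt Y, degrees

state = AddressState(lat=32.85, lon=-117.27, alt_m=12.0,
                     heading=254.0, pitch=23.0, roll=0.0)
```

A fuller version could add the range gate and time-of-day parameters discussed below.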
- preferred versions include an address vector having spatial extent in three dimensions.
- Figure 34 shows an illustration of an interesting representation of an address indicator.
- a point 341 is joined with a pointing vector 342.
- An angular measure indicated as α 343 describes an extension to previous address indicators.
- the angular measure suggests a conic shaped volume to represent the address indicator; i.e. the address indicator has extent in a transverse sense, extent which increases as a function of the distance from the point of origin.
- This is a natural extension for an address indicator because it approximates a beam of light, which has long been used by all to point towards things, for example a flashlight, or even the headlights of an automobile.
- pointers having finite transverse extent are particularly useful.
- an address indicator may be arranged to have minimal and maximal distance limits associated therewith to set forth a range gate.
- Figure 35 shows a conic section address indicator specified by a point 351 in combination with a pointing vector 352.
- an elliptical element 353 suggests a minimum distance parameter while an elliptical element 354 suggests a maximum distance parameter.
- an address indicator can be said to exist as the volume of space occupied between the limit surface elements: the conic surface 355, minimum ellipse 353 and maximum ellipse 354.
- Angular measures in two orthogonal directions, i.e. shown as α and β in the drawing, each different in value, can be set to describe a special address indicator.
- the address indicator of Figure 36, which is rectangular in cross section, is provided with a maximal distance limit without a minimum limit to yield a pyramid-shaped address indicator. Careful readers will understand that the true shape of the maximum limit may be a spherical section rather than the mere rectangle shown in the drawing for simplicity.
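A minimal membership test for the conic, range-gated address indicator of Figures 34 - 35 might look as follows. It assumes the range gate is measured as straight-line distance from the point reference, and all function and parameter names are illustrative.

```python
import math

def in_address_indicator(origin, direction, point, half_angle_deg, r_min, r_max):
    """Test whether `point` lies inside a conic address indicator:
    a cone of given half-angle about `direction`, range-gated
    between r_min and r_max from `origin` (Figure 35)."""
    v = [p - o for p, o in zip(point, origin)]
    dist = math.sqrt(sum(c * c for c in v))
    if dist == 0 or not (r_min <= dist <= r_max):
        return False                       # outside the range gate
    norm = math.sqrt(sum(c * c for c in direction))
    cos_angle = sum(a * b for a, b in zip(v, direction)) / (dist * norm)
    cos_angle = max(-1.0, min(1.0, cos_angle))
    return math.degrees(math.acos(cos_angle)) <= half_angle_deg

# Pointing east (+x) with a 10-degree half-angle cone, range gate 5..100 meters:
print(in_address_indicator((0, 0, 0), (1, 0, 0), (50, 4, 0), 10, 5, 100))   # True
print(in_address_indicator((0, 0, 0), (1, 0, 0), (50, 20, 0), 10, 5, 100))  # False
```

A geometric descriptor would be considered addressed when at least one of its points passes this test, per the single-shared-point rule stated below.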
- FIG. 37 illustrates a case where a simple address indicator in the form of a vector 371 comprising a point reference 372 and a direction reference 373 is directed toward a certain object of interest represented by a geometric descriptor.
- the pointer is aligned such that a (dashed) portion 374 of it intersects a first geometric figure 375, a rectangular cylinder, while not forming an intersection with a second geometric figure 376, a circular cylinder.
- Such action taken by a user is said to cause the rectangular cylinder to be addressed.
- the circular cylinder is not being addressed because the pointer does not form an intersection therewith.
- a geometric descriptor must share at least a single point with an address indicator.
- Figure 38 shows the case where an address indicator 381 is coincident with a geometric descriptor 382 at only a single point 383. At no point does the circular cylinder geometric descriptor 384 coexist with any portion of the address indicator, and therefore the circular cylinder is not being addressed by the system illustrated.
- Some geometric descriptors are defined as infinitely thin constructs; a pointer 392 may intersect a planar geometric descriptor 393 at a single point 394, as shown in drawing Figure 39.
- an address state of a mobile unit is configured to include a range or range gate.
- a description of range limits, or complementary minimum and maximum distance limits with respect to the point reference, may be used to define a particular region of interest. For example, two objects may lie on a single line with respect to a user's perspective, one object further from the user and the second nearer to the user.
- a user may set an address indicator range gate. Figure 41 illustrates this case more completely.
- Position reference 411 and direction reference 412 form an address indicator which passes through two objects.
- An address indicator may be provided with a range parameter having a minimal distance limit 413 and a maximal distance limit 414.
- the cubic object 415 would otherwise be said to form an intersection with the pointing vector, as the address indicator passes therethrough; however, it is not within the range gate, so it does not form an intersection for purposes of this discussion.
- the portion of the address indicator within the range gate i.e. marked as line segment 416 in the drawing, does pass through the circularly cylindrical object 417. Thus it is said to be addressed.
- Point reference 421 and direction reference 422 together with maximal distance limit 423 form an address indicator to represent an address state.
- An intersection is formed between object 424 and the address indicator at line segment 425. Because object 426 lies outside the range, no intersection exists between the pointing vector and that circular cylinder object despite the fact that the object lies on the direction reference 422.
- 427 is said to be an address indicator which represents an address state having a range limit.
- Figure 43 is provided to show an address indicator having finite transverse extent comprised of a point reference 431, direction reference 432 together with a range gate forming a conic section 433, having intersection represented as portion 434 with a circular cylindrical geometric descriptor 435.
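The intersection tests of Figures 37 - 43 for a simple rectangular geometric descriptor can be sketched with the standard slab method, with the range gate folded in as limits on the ray parameter. This is a sketch under stated assumptions: the descriptor is an axis-aligned box, and the direction reference is unit length so the ray parameter is in meters.

```python
def ray_intersects_box(origin, direction, box_min, box_max,
                       r_min=0.0, r_max=float("inf")):
    """Slab test: does the address vector intersect an axis-aligned box
    (a simple rectangular geometric descriptor) within the range gate?"""
    t_near, t_far = r_min, r_max
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:
            if not (lo <= o <= hi):
                return False              # parallel to this slab and outside it
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            if t1 > t2:
                t1, t2 = t2, t1
            t_near, t_far = max(t_near, t1), min(t_far, t2)
            if t_near > t_far:
                return False
    return True

# A cube from (40,-5,-5) to (50,5,5), pointer along +x:
print(ray_intersects_box((0, 0, 0), (1, 0, 0), (40, -5, -5), (50, 5, 5)))            # True
print(ray_intersects_box((0, 0, 0), (1, 0, 0), (40, -5, -5), (50, 5, 5), r_max=30))  # False
```

The second call mirrors the cubic object of Figure 41: the pointing vector passes through it, but the maximal distance limit excludes it from being addressed.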
- a time-of-day parameter may be omitted from a description of the address state of a system.
- a time of day parameter is very important to many other applications. For example, in applications where a menu is displayed for restaurant type objects being addressed, it is important to alternatively display the dinner or lunch menu in agreement with the time-of-day. For bars and clubs, a 'happy hour' includes specials valid only for certain hours; those specials should only be presented during appropriate hours. In certain cases, an object has a geometric descriptor which changes shape and/or position in time. The highly regular trains in Japan move with very certain and well defined regularity.
- a geometric descriptor may be configured to move with the train as a function of time. Users of systems of these inventions are thereby enabled the function of addressing a moving train to learn more information about the train. Accordingly, some preferred systems include an address state having a time-of-day parameter.
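The time-of-day dependent response described above (lunch versus dinner menu, happy-hour specials presented only during appropriate hours) reduces to a simple dispatch on the clock value. The hour boundaries below are invented for illustration.

```python
from datetime import time

def menu_for(now):
    """Pick which menu to present for a restaurant-type object being
    addressed, based on a time-of-day parameter of the address state.
    The hour boundaries are illustrative assumptions."""
    if time(11, 0) <= now < time(16, 0):
        return "lunch"
    if time(16, 0) <= now < time(18, 0):
        return "happy hour"
    return "dinner"

print(menu_for(time(12, 30)))  # lunch
print(menu_for(time(19, 0)))   # dinner
```

A moving object such as the Japanese trains mentioned above would extend this idea: the descriptor's position, not just the response, becomes a function of the clock.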
- Tilt X (Pitch) 23°; Tilt Y (Roll) 0°
- the system is equipped with means for providing those values. Although one system may use a different arrangement than another, arriving at parameter values for an address state as described is essential to the step of determining the address state of a mobile unit.
- a mobile unit may be provided a clock or a data link to a clock.
- a database record also has information elements associated with an object. Included in these information elements are a special class herein referred to as multi-media information elements.
- a response may include recalling multi-media information elements from the database and presenting that information at a user interface of the mobile unit.
- the device may 'recognize' the object via a test for intersection and provide an audio announcement to the user regarding the object's identity.
- FIG. 45 where a mobile unit 451 is illustrated as being pointed, via pointing vector 452, towards a restaurant building 453.
- the system may be set in a special mode to provide an automatic response whenever a known object has been addressed.
- objects upon being addressed cause an identification step to be executed whereby the mobile unit recalls an object identity audio clip from the database and plays that clip at an output type user interface, for example a speaker 454.
- the speaker produces sound waves 455 to alert the user that the device is being pointed at "Tony Anita's Pizza".
- This example illustrates a first type of multi-media information element which may be associated with a particular object, stored in a database, recalled in response to an object being addressed, and presented at a user interface.
- multi-media information which similarly can be presented to a user.
- FIG. 46 shows a mobile unit 461 having a pixel display screen 462 as an output user interface.
- Such a display allows the restaurant to invite a user for lunch in a clever and attractive advertisement played in response to the restaurant object being addressed by the user.
- Figure 47 includes a mobile unit 471 being pointed via address indicator 472 at the pizza restaurant 473, where a response further shows a text list presented on display screen 474, including a pizza menu 475, and specifically 'Four Cheese' pizza 476.
- the 'Four Cheese' Pizza may be ordered directly by selecting the item from the list and clicking a trigger thereby causing a request function to send a message to the restaurant.
- while examples above generally include objects which are concrete and of readily discernible structure, for example buildings and the like, under some circumstances an 'object' may not have any physical structure at all but may nevertheless have a geometric descriptor associated therewith. In these cases, an object may be referred to as a 'virtual object'.
- An example is the restricted airspace over a sitting president's residence, The White House.
- a rectangular cubic volume of space delimits a region in which unauthorized air travel is strictly prohibited. This is one example of an object having a discrete spatial extent which may be described by a geometric descriptor whereby the object is merely space and has no physical part or concrete structure.
- an infinitely thin planar region may form an object of interest to which a geometric descriptor may be associated and thus systems of these inventions may address.
- An example of this type of object is the boundary of a baseball playing space known as the foul ball plane. Extending in a vertical plane from home plate and into the cheap seats, the foul ball planes (there are two on each field) mark the limits of the playing field.
- a foul ball plane may be a virtual type object in certain versions of these inventions.
- an 'object' may be a group of things. Use of the singular form of the word 'object' is not intended to imply there be a limit of only one 'thing' in the object.
- a collection of buildings such as a group of related apartment units may form a single object for purposes of a geometric descriptor. Thus a large plurality of buildings in a group may be included as a single object having one geometric descriptor.
Special Topic 5: Database Filtering
- An important aspect of data management, with regard to limited-bandwidth systems and further in view of the position-dependent nature of data of interest, includes forming a data subset and caching it in a readily accessible fast memory. For example, when a mobile device of the invention is located in San Francisco, data relating to objects in Detroit are of no significant consequence. It is unlikely that the General Motors headquarters building would be addressed by users in San Francisco (although, strictly speaking, it is possible). Accordingly, programming can be arranged to read a data set and extract portions of data therefrom, whereby the extracted data depends upon the user's current position. That position-dependent dataset is then transmitted to a special memory which is limited in size but fast in access operations; further, that memory can be within the hand-held device, thus reducing round-trip requests/responses on the network.
- some preferred mobile units may additionally contain a memory which supports this function.
- Preferred methods include steps whereby a predetermined dataset is transmitted to a mobile unit for fast access upon multiple address steps.
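The position-dependent filtering step might be sketched as follows. The record layout, the equirectangular distance approximation, and the 50 km radius are all assumptions for illustration.

```python
import math

def nearby_subset(records, user_lat, user_lon, radius_km):
    """Extract the position-dependent portion of the full dataset for
    caching in the mobile unit's fast local memory. Uses an
    equirectangular distance approximation for brevity."""
    def dist_km(lat, lon):
        dx = math.radians(lon - user_lon) * math.cos(math.radians(user_lat))
        dy = math.radians(lat - user_lat)
        return 6371.0 * math.hypot(dx, dy)   # mean Earth radius, km
    return [r for r in records if dist_km(r["lat"], r["lon"]) <= radius_km]

records = [
    {"name": "Golden Gate Bridge", "lat": 37.8199, "lon": -122.4783},
    {"name": "GM headquarters",    "lat": 42.3293, "lon": -83.0398},
]
# A user in San Francisco caches only nearby objects; Detroit drops out:
cache = nearby_subset(records, 37.77, -122.42, radius_km=50)
print([r["name"] for r in cache])  # ['Golden Gate Bridge']
```

The cached subset would then serve multiple address steps without round trips on the network.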
- Figure 51 illustrates a mobile telephone 511 used by the lady of figure 50.
- the device includes an output type user interface in the form of a pixelized display 512, whereon the name of the object being addressed appears in an identity header.
- a 'toolbar' 514 of important icons which relate to the object being addressed includes a 'menu request' icon 515 which is selected via the highlighted selection cursor 516.
- the display arrives in this condition automatically because the address state of the telephone includes an address indicator 517 which is pointing at a geometric descriptor associated with a restaurant type object (see Figure 50).
- the computer knows the object being addressed is a restaurant type object because the results of a database search produce a dataset of addressed objects where one record corresponds to the Brewery and a field in that record identifies the class of the object as belonging to the restaurant class. Provision for sub-classes may additionally be included, such that the computer could be notified that the addressed object is a restaurant of the type serving American food. Presentation of toolbars as well as other information can be made responsive to the fields containing information relating to objects being addressed.
- Figure 52 sets forth illustration of such an action used to drive movement of a selection cursor.
- Pretty lady 521 holding mobile telephone 522 simply twists 523 her wrist slightly in a clockwise manner which is detected by tilt sensors.
- Note the pointing vector 524 remains without change and the banner object 525 remains addressed throughout the twist action.
- before and after address states are considered. It is not an instantaneous address state which triggers action but rather a particular change to the address state from one instant to the next.
- FIG. 54 shows a mobile unit in the form of a telephone.
- the telephone has a body 541 which is elongated in nature where the length is considerably greater than the width and thickness (not shown).
- a radio frequency antenna 542 extends from and protrudes outwardly from the telephone body to give a natural feel and a bias with regard to a pointing direction; the antenna suggests a natural pointing direction for the telephone.
- a display screen 543 is an output type user interface of the type having a pixelized array of discrete picture elements. The pixelized display screen nicely supports use of icon devices in a toolbar arrangement 544.
- the mobile unit may have a point reference 545 in the geometric center of the device.
- the mobile unit reference direction 546 is arranged parallel with the antenna on the axis of the elongated telephone body.
- the drawing includes five regions separated by thin lines and a special marker dashed line 547.
- a display screen toolbar as well as a direction origin may be set as follows.
- the mobile unit 551 having a display screen 552 with toolbar 553 having an icon in the center arbitrarily initiated with the 'focus' or the selection cursor 554 (the terminology which includes the word 'focus' is consistent with that used in programming arts and languages to refer to a programming object having the attention of the current process).
- mobile unit pointing direction 555 causes an origin direction 556 to become set.
- the field indicated by stippling in the drawing is the addressed field 557.
- Figure 56 illustrates a mobile unit 561 having been initialized as described above and further having been rotated about a vertical axis by approximately ten degrees counterclockwise whereby the telephone pointing direction 562 no longer points towards the origin direction 563 but rather now points to a newly addressed field 564.
- This rotational displacement (to newly addressed field 575) is detected via the attitude determining means and causes the computer to shift the selection cursor 565 to the adjacent icon to the left, the same direction as the angular displacement.
- a set-up step includes establishing an origin direction 591. While the mobile unit addresses the middle field, a 'hold' function is realized and the character remains unchanged. To advance to another character in a character set, one could enter a 'Forward Slow Scroll' 593 function by addressing the field right of and adjacent to the center field. A further rotation causes a 'Forward Fast Scroll' 594 function to be initiated, whereby the characters are rapidly changed from one to a succeeding character. Similarly, a 'Reverse Slow Scroll' 595 and a 'Reverse Fast Scroll' 596 function are achieved via rotational displacements of between about five and fifteen degrees, and between about fifteen and twenty-five degrees, respectively, in a counterclockwise sense.
- Figure 60 shows a special grid of labels in a middle row 601 including: 'Reverse Fast Scroll'; 'Reverse Slow Scroll'; 'Hold'; 'Forward Slow Scroll'; and 'Forward Fast Scroll'
- a top row 602 corresponds to a 'Last Letter' function and a bottom row 603 corresponds to a 'Next Letter' function.
- Figure 61 shows the grid of Figure 60 in a perspective view with a mobile unit of these inventions 611 pointing, as indicated by arrow 612, towards the 'Forward Slow Scroll' 613 field.
- the reader will be reminded that the present pointing state is compared to an origin pointing state illustrated as 614.
- the mobile unit is tilted upward at least five degrees, causing the 'Last Letter' 615 field to be addressed. It should be noted that this mode ignores position altogether and can be used wherever a mobile unit is, at any time.
- This very important example illustrates that abstract objects such as fields assigned to certain functions may be addressed with mobile units of these inventions. More particularly, a user can enter and exit functional modes of the computer by changing the address state of the mobile unit.
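The displacement-to-function bands described for the scroll grid (roughly five to fifteen degrees from the origin direction for slow scroll, fifteen to twenty-five for fast scroll, in either rotational sense) can be sketched as a simple lookup. The exact boundary handling is an assumption.

```python
def scroll_function(d_heading):
    """Map angular displacement from the origin direction (degrees;
    positive = clockwise) to one of the grid functions of Figures 59-60.
    Band boundaries follow the 'about five to fifteen' and 'fifteen to
    twenty-five' degree ranges described in the text."""
    if 5 <= d_heading < 15:
        return "Forward Slow Scroll"
    if 15 <= d_heading <= 25:
        return "Forward Fast Scroll"
    if -15 < d_heading <= -5:
        return "Reverse Slow Scroll"
    if -25 <= d_heading <= -15:
        return "Reverse Fast Scroll"
    return "Hold"

print(scroll_function(0))    # Hold
print(scroll_function(8))    # Forward Slow Scroll
print(scroll_function(-20))  # Reverse Fast Scroll
```

A pitch displacement of at least five degrees up or down would analogously select the 'Last Letter' or 'Next Letter' rows.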
- a 'point-to-call' function can be better understood in view of the following more complete description.
- telephone services may cooperate well with functions provided by 'point-and-click' activity.
- when a person wishes to contact someone by telephone, it is a requirement that a numeric address, a telephone number, be entered in order that the call be routed to the desired recipient. Without a telephone number, it is impossible to connect the call.
- a user may employ the services of a directory assistance at extra costs, both money and time, to the caller.
- it requires the sometimes difficult step of explaining to an operator the correct title of the intended recipient which is not always known to the caller. Due to these difficulties, among others, this process is quite unpopular.
- a user may easily place a telephone call with the aid of concepts presented here.
- the mobile telephone having a pointing reference is directed by the user towards an object to which a telephone call is to be placed.
- Objects may include such entities as hotels, restaurants, ticket agencies, et cetera. Any object which has a telephone associated therewith can become the subject of this special operational mode.
- the mobile unit determines the object being addressed, recalls from the database data relating to the object, including a telephone number, and completes the point-to-call action by initiating a voice connection to the addressed object via wireless link.
- the activity remains mostly transparent to the user, who merely has to point the device and click a switch to place a call.
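The point-to-call sequence might be orchestrated as in this sketch: determine the object being addressed, recall its telephone number from the database, and initiate the voice connection. The database interface, the record fields, and the (fictional 555) telephone number are stand-ins invented for illustration.

```python
def point_to_call(address_state, database, place_call):
    """Sketch of the 'point-to-call' sequence. `database` stands in for
    the server-side search over geometric descriptors; `place_call`
    stands in for the wireless voice link."""
    obj = database.find_addressed_object(address_state)
    if obj is None:
        return None                 # nothing addressed; no call placed
    number = obj["phone"]           # recalled from the object's record
    place_call(number)              # voice connection via wireless link
    return number

class FakeDB:
    """Illustrative stub returning a fixed addressed-object record."""
    def find_addressed_object(self, state):
        return {"name": "Karl Strauss Brewery & Grill", "phone": "+1-555-0100"}

calls = []
print(point_to_call(object(), FakeDB(), calls.append))  # +1-555-0100
```

The user merely points the device and clicks a switch; everything between those two acts stays transparent.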
- a group of persons operating in conjunction with others from the group may act as follows. By 'registering' with the special operation mode manager, a person alerts the system to the desire to be found by others in the predefined group. When another group member attempts to learn of the whereabouts of others via a point-and-click action, the system may respond by providing indication of the presence or absence of group members or individuals.
- users are provided via computer functionality the ability to create private virtual objects. For example a user may wish to place a billboard for others to see where the billboard is only known to particular registered users.
- the data associated with the virtual object for example text data, may also be created and provided by an initiating user. By applying point-and-click actions, a user in 'the know' enjoys the opportunity to address the virtual object set up by his friend.
- devices may be arranged to determine position and the pointing attitude of a hand-held device which thereafter is connected to a product offered for sale.
- Gaming strategy may be developed from infinite sets of rules whereby rules relate in part to positions of things in relation to positions of other things. When formed in cooperation with systems taught here, gaming strategy offers a completely new dimension to computer game theory. Computer games which bring the user's immediate environment into the action and objectives of the game will be enjoyed by all who carry a telephone. A few examples herefollowing suggest how games will be created to employ the powerful notion of 'point-and-click' in the real world.
- players having mobile devices set out from the start of the game and travel to well dispersed positions, each player taking up a different location.
- Each player's mobile device, in communication with a central processing unit, i.e. via a network, reports the position to the game managing code.
- a geometric descriptor is formed for each player with regard to the position reported.
- the person who is 'it' must address locations where it is suspected that players are hiding.
- the game managing code makes a determination whether a player is hiding therein. On finding a player in this manner, both the person 'it' and the player are properly notified.
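The server-side check for the hide-and-seek game might look like this sketch, which gives each registered player a small spherical geometric descriptor around the reported position. The radius is an assumption; the 'Buzz'/'BeepBeepBeep' alert strings follow the earlier example.

```python
def check_hiding_spot(addressed_point, players, radius_m=3.0):
    """Game-managing-code check: a small sphere around each registered
    player's reported position serves as that player's geometric
    descriptor; an addressed point inside it is a successful find."""
    for name, pos in players.items():
        if sum((a - b) ** 2 for a, b in zip(addressed_point, pos)) <= radius_m ** 2:
            return name, "BeepBeepBeep"   # successful discovery alert
    return None, "Buzz"                   # failed-attempt alert

players = {"Alice": (10.0, 20.0, 0.0)}
print(check_hiding_spot((11.0, 21.0, 0.0), players))  # ('Alice', 'BeepBeepBeep')
print(check_hiding_spot((50.0, 50.0, 0.0), players))  # (None, 'Buzz')
```

The returned alert string is what the server would command the mobile unit to play, notifying both the person who is 'it' and the found player.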
- devices of these inventions are quite useful in surveying techniques and procedure. For example, excavation projects are troubled with the issue of unintentionally digging into facilities which are easily damaged.
- the city workers had to take great care not to puncture a pipe carrying Jet5 type jet fuel; Point Loma is near the San Diego International Airport at Lindbergh Field.
- the project suffered considerable delays because the pipes carrying jet fuel could not be easily located.
- a geologist can locate previously mapped mineral fields. By simply arriving in a mining field and pointing toward various suspect locations, a geologist can receive detailed data found in previous explorations without having to read
- a shipping company can provide a ship with a computerized database of information relating to underwater formations including reefs and wrecks.
- a ship captain and navigator can point a device toward suspected underwater features to precisely locate them. To the casual observer this may at first seem unremarkable.
- the systems are extremely powerful.
- attitude and position determining means may be arranged in methods and apparatus for the purpose of addressing objects of interest, and further for providing information relating to those objects being addressed, and still further for manipulating information relating to objects being addressed.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2002559779A JP2004531791A (en) | 2001-01-24 | 2001-10-29 | Pointing system for addressing objects |
EP01997159A EP1354260A4 (en) | 2001-01-24 | 2001-10-29 | Pointing systems for addressing objects |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/769,012 US7031875B2 (en) | 2001-01-24 | 2001-01-24 | Pointing systems for addressing objects |
US09/769,012 | 2001-01-24 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2002059716A2 true WO2002059716A2 (en) | 2002-08-01 |
WO2002059716A3 WO2002059716A3 (en) | 2003-06-05 |
Family
ID=25084149
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2001/050804 WO2002059716A2 (en) | 2001-01-24 | 2001-10-29 | Pointing systems for addressing objects |
Country Status (5)
Country | Link |
---|---|
US (2) | US7031875B2 (en) |
EP (1) | EP1354260A4 (en) |
JP (4) | JP2004531791A (en) |
NZ (1) | NZ560769A (en) |
WO (1) | WO2002059716A2 (en) |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1839193A1 (en) * | 2004-12-31 | 2007-10-03 | Nokia Corporation | Provision of target specific information |
WO2009024881A1 (en) * | 2007-08-23 | 2009-02-26 | Telefonaktiebolaget Lm Ericsson (Publ) | System and method for gesture-based command and control of targets in wireless network |
WO2009024882A1 (en) * | 2007-08-23 | 2009-02-26 | Telefonaktiebolaget Lm Ericsson (Publ) | Method and apparatus for sending data relating to a target to a mobile device |
JP2009245444A (en) * | 2002-11-20 | 2009-10-22 | Koninkl Philips Electronics Nv | User interface system based on pointing device |
US7698387B2 (en) | 2004-07-06 | 2010-04-13 | Fujitsu Limited | Server system, user terminal, service providing method and service providing system using the server system and the user terminal for providing position-based service to the user terminal |
WO2011142700A1 (en) | 2010-05-12 | 2011-11-17 | Telefonaktiebolaget L M Ericsson (Publ) | Method, computer program and apparatus for determining an object in sight |
US8218873B2 (en) | 2000-11-06 | 2012-07-10 | Nant Holdings Ip, Llc | Object information derived from object images |
US8224078B2 (en) | 2000-11-06 | 2012-07-17 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8224077B2 (en) | 2000-11-06 | 2012-07-17 | Nant Holdings Ip, Llc | Data capture and identification system and process |
WO2012163427A1 (en) * | 2011-06-01 | 2012-12-06 | Sony Ericsson Mobile Communications Ab | Catch the screen |
US8588527B2 (en) | 2000-11-06 | 2013-11-19 | Nant Holdings Ip, Llc | Object information derived from object images |
EP2688318A1 (en) * | 2012-07-17 | 2014-01-22 | Alcatel Lucent | Conditional interaction control for a virtual object |
US8810598B2 (en) | 2011-04-08 | 2014-08-19 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US9071641B2 (en) | 2009-09-30 | 2015-06-30 | Biglobe Inc. | Mapping system that displays nearby regions based on direction of travel, speed, and orientation |
TWI514337B (en) * | 2009-02-20 | 2015-12-21 | 尼康股份有限公司 | Carrying information machines, photographic devices, and information acquisition systems |
US9310892B2 (en) | 2000-11-06 | 2016-04-12 | Nant Holdings Ip, Llc | Object information derived from object images |
US10140317B2 (en) | 2013-10-17 | 2018-11-27 | Nant Holdings Ip, Llc | Wide area augmented reality location-based services |
US10535279B2 (en) | 2010-02-24 | 2020-01-14 | Nant Holdings Ip, Llc | Augmented reality panorama supporting visually impaired individuals |
US10617568B2 (en) | 2000-11-06 | 2020-04-14 | Nant Holdings Ip, Llc | Image capture and identification system and process |
Families Citing this family (242)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7301536B2 (en) * | 1993-09-10 | 2007-11-27 | Geovector Corporation | Electro-optic vision systems |
US6654060B1 (en) * | 1997-01-07 | 2003-11-25 | Canon Kabushiki Kaisha | Video-image control apparatus and method and storage medium |
US7749089B1 (en) | 1999-02-26 | 2010-07-06 | Creative Kingdoms, Llc | Multi-media interactive play system |
US6411892B1 (en) * | 2000-07-13 | 2002-06-25 | Global Locate, Inc. | Method and apparatus for locating mobile receivers using a wide area reference network for propagating ephemeris |
US7445550B2 (en) | 2000-02-22 | 2008-11-04 | Creative Kingdoms, Llc | Magical wand and interactive play experience |
US6761637B2 (en) | 2000-02-22 | 2004-07-13 | Creative Kingdoms, Llc | Method of game play using RFID tracking device |
US7878905B2 (en) | 2000-02-22 | 2011-02-01 | Creative Kingdoms, Llc | Multi-layered interactive play experience |
US7688306B2 (en) * | 2000-10-02 | 2010-03-30 | Apple Inc. | Methods and apparatuses for operating a portable device based on an accelerometer |
US7066781B2 (en) | 2000-10-20 | 2006-06-27 | Denise Chapman Weston | Children's toy with wireless tag/transponder |
US20120154438A1 (en) * | 2000-11-06 | 2012-06-21 | Nant Holdings Ip, Llc | Interactivity Via Mobile Image Recognition |
US8817045B2 (en) | 2000-11-06 | 2014-08-26 | Nant Holdings Ip, Llc | Interactivity via mobile image recognition |
US7130466B2 (en) * | 2000-12-21 | 2006-10-31 | Cobion Ag | System and method for compiling images from a database and comparing the compiled images with known images |
US7031875B2 (en) * | 2001-01-24 | 2006-04-18 | Geo Vector Corporation | Pointing systems for addressing objects |
US6834249B2 (en) * | 2001-03-29 | 2004-12-21 | Arraycomm, Inc. | Method and apparatus for controlling a computing system |
US7299256B2 (en) * | 2001-04-17 | 2007-11-20 | Hewlett-Packard Development Company, L.P. | Creating a virtual link between a physical location and its web representation |
US20070066396A1 (en) | 2002-04-05 | 2007-03-22 | Denise Chapman Weston | Retail methods for providing an interactive product to a consumer |
US6967566B2 (en) | 2002-04-05 | 2005-11-22 | Creative Kingdoms, Llc | Live-action interactive adventure game |
CN1310480C (en) * | 2002-04-12 | 2007-04-11 | 西门子公司 | Method for commonly controlling bandwidth of a group of individual information flows |
US7203911B2 (en) * | 2002-05-13 | 2007-04-10 | Microsoft Corporation | Altering a display on a viewing device based upon a user proximity to the viewing device |
US20040073557A1 (en) * | 2002-06-14 | 2004-04-15 | Piccionelli Gregory A. | Method for obtaining information associated with item at specific location |
US7674184B2 (en) | 2002-08-01 | 2010-03-09 | Creative Kingdoms, Llc | Interactive water attraction and quest game |
US20050222802A1 (en) * | 2002-08-27 | 2005-10-06 | Yasuhiro Tamura | Mobile terminal apparatus |
KR100449743B1 (en) * | 2002-10-04 | 2004-09-22 | 삼성전자주식회사 | Chained image display apparatus having mutual examining function |
AU2002339684A1 (en) * | 2002-11-05 | 2004-06-07 | Nokia Corporation | Mobile electronic three-dimensional compass |
AU2003280203A1 (en) * | 2002-12-20 | 2004-07-14 | Koninklijke Philips Electronics N.V. | Providing a user with location-based information |
US20040121789A1 (en) * | 2002-12-23 | 2004-06-24 | Teddy Lindsey | Method and apparatus for communicating information in a global distributed network |
KR100485839B1 (en) * | 2003-01-30 | 2005-04-28 | 삼성전자주식회사 | Portable device for indicating specific location and controlling method thereof |
US20040154461A1 (en) * | 2003-02-07 | 2004-08-12 | Nokia Corporation | Methods and apparatus providing group playing ability for creating a shared sound environment with MIDI-enabled mobile stations |
US6795768B2 (en) * | 2003-02-20 | 2004-09-21 | Motorola, Inc. | Handheld object selector |
US9446319B2 (en) | 2003-03-25 | 2016-09-20 | Mq Gaming, Llc | Interactive gaming toy |
US20040243307A1 (en) * | 2003-06-02 | 2004-12-02 | Pieter Geelen | Personal GPS navigation device |
EP1537383B1 (en) * | 2003-07-16 | 2019-06-19 | Harman Becker Automotive Systems GmbH | Transmission of special routes to a navigation device |
WO2005015806A2 (en) * | 2003-08-08 | 2005-02-17 | Networks In Motion, Inc. | Method and system for collecting synchronizing and reporting telecommunication call events and work flow related information |
KR100590586B1 (en) * | 2003-09-15 | 2006-06-15 | 에스케이 텔레콤주식회사 | Mobile Telecommunication Terminal Has Electrical Compass Module and Playing Stand-Alone Type Mobile Game Method Using Electrical Compass Module Thereof |
KR100673080B1 (en) * | 2003-09-15 | 2007-01-22 | 에스케이 텔레콤주식회사 | Mobile Telecommunication Terminal Has Electrical Compass Module and Playing Network Type Mobile Game Method Using Electrical Compass Module Thereof |
KR100590583B1 (en) * | 2003-09-15 | 2006-06-15 | 에스케이 텔레콤주식회사 | Mobile Telecommunication Terminal Has Electrical Compass Module and Playing Mobile Game Method Using Electrical Compass Module Thereof |
US8665325B2 (en) * | 2003-10-08 | 2014-03-04 | Qwest Communications International Inc. | Systems and methods for location based image telegraphy |
US20050101314A1 (en) * | 2003-11-10 | 2005-05-12 | Uri Levi | Method and system for wireless group communications |
US7245923B2 (en) * | 2003-11-20 | 2007-07-17 | Intelligent Spatial Technologies | Mobile device and geographic information system background and summary of the related art |
US8060112B2 (en) | 2003-11-20 | 2011-11-15 | Intellient Spatial Technologies, Inc. | Mobile device and geographic information system background and summary of the related art |
US9858590B1 (en) | 2003-12-30 | 2018-01-02 | Google Inc. | Determining better ad selection, scoring, and/or presentation techniques |
JP4591353B2 (en) * | 2004-01-08 | 2010-12-01 | 日本電気株式会社 | Character recognition device, mobile communication system, mobile terminal device, fixed station device, character recognition method, and character recognition program |
US8629836B2 (en) | 2004-04-30 | 2014-01-14 | Hillcrest Laboratories, Inc. | 3D pointing devices with orientation compensation and improved usability |
JP2007535773A (en) | 2004-04-30 | 2007-12-06 | ヒルクレスト・ラボラトリーズ・インコーポレイテッド | Free space pointing device and pointing method |
US7746321B2 (en) | 2004-05-28 | 2010-06-29 | Erik Jan Banning | Easily deployable interactive direct-pointing system and presentation control system and calibration method therefor |
CN100510623C (en) * | 2004-07-15 | 2009-07-08 | 阿莫善斯有限公司 | Mobile terminal device |
US8137195B2 (en) | 2004-11-23 | 2012-03-20 | Hillcrest Laboratories, Inc. | Semantic gaming and application transformation |
JP4011101B2 (en) * | 2004-11-24 | 2007-11-21 | ソフトバンクモバイル株式会社 | Information processing method, information processing apparatus, and information processing program |
US8301159B2 (en) * | 2004-12-31 | 2012-10-30 | Nokia Corporation | Displaying network objects in mobile devices based on geolocation |
US7720436B2 (en) | 2006-01-09 | 2010-05-18 | Nokia Corporation | Displaying network objects in mobile devices based on geolocation |
US20070189544A1 (en) | 2005-01-15 | 2007-08-16 | Outland Research, Llc | Ambient sound responsive media player |
US7542816B2 (en) * | 2005-01-27 | 2009-06-02 | Outland Research, Llc | System, method and computer program product for automatically selecting, suggesting and playing music media files |
US20070276870A1 (en) * | 2005-01-27 | 2007-11-29 | Outland Research, Llc | Method and apparatus for intelligent media selection using age and/or gender |
US20060173556A1 (en) * | 2005-02-01 | 2006-08-03 | Outland Research,. Llc | Methods and apparatus for using user gender and/or age group to improve the organization of documents retrieved in response to a search query |
WO2006092647A1 (en) * | 2005-03-04 | 2006-09-08 | Nokia Corporation | Offering menu items to a user |
US7353034B2 (en) | 2005-04-04 | 2008-04-01 | X One, Inc. | Location sharing and tracking using mobile phones or other wireless devices |
US20060256008A1 (en) * | 2005-05-13 | 2006-11-16 | Outland Research, Llc | Pointing interface for person-to-person information exchange |
US9420423B1 (en) | 2005-04-12 | 2016-08-16 | Ehud Mendelson | RF beacon deployment and method of use |
US10117078B1 (en) | 2005-04-12 | 2018-10-30 | Ehud Mendelson | Medical information communication method |
US7899583B2 (en) | 2005-04-12 | 2011-03-01 | Ehud Mendelson | System and method of detecting and navigating to empty parking spaces |
US8836580B2 (en) * | 2005-05-09 | 2014-09-16 | Ehud Mendelson | RF proximity tags providing indoor and outdoor navigation and method of use |
US20060241864A1 (en) * | 2005-04-22 | 2006-10-26 | Outland Research, Llc | Method and apparatus for point-and-send data transfer within an ubiquitous computing environment |
US7822549B2 (en) * | 2005-05-05 | 2010-10-26 | Sapir Itzhak | Global positioning using planetary constants |
US20060256007A1 (en) * | 2005-05-13 | 2006-11-16 | Outland Research, Llc | Triangulation method and apparatus for targeting and accessing spatially associated information |
US20070150188A1 (en) * | 2005-05-27 | 2007-06-28 | Outland Research, Llc | First-person video-based travel planning system |
JP4507992B2 (en) * | 2005-06-09 | 2010-07-21 | ソニー株式会社 | Information processing apparatus and method, and program |
US7453395B2 (en) * | 2005-06-10 | 2008-11-18 | Honeywell International Inc. | Methods and systems using relative sensing to locate targets |
US7728869B2 (en) * | 2005-06-14 | 2010-06-01 | Lg Electronics Inc. | Matching camera-photographed image with map data in portable terminal and travel route guidance method |
US9285897B2 (en) | 2005-07-13 | 2016-03-15 | Ultimate Pointer, L.L.C. | Easily deployable interactive direct-pointing system and calibration method therefor |
US8313379B2 (en) | 2005-08-22 | 2012-11-20 | Nintendo Co., Ltd. | Video game system with wireless modular handheld controller |
US7942745B2 (en) | 2005-08-22 | 2011-05-17 | Nintendo Co., Ltd. | Game operating device |
JP4805633B2 (en) | 2005-08-22 | 2011-11-02 | 任天堂株式会社 | Game operation device |
US7927216B2 (en) | 2005-09-15 | 2011-04-19 | Nintendo Co., Ltd. | Video game system with wireless modular handheld controller |
US8870655B2 (en) | 2005-08-24 | 2014-10-28 | Nintendo Co., Ltd. | Wireless game controllers |
JP4262726B2 (en) | 2005-08-24 | 2009-05-13 | 任天堂株式会社 | Game controller and game system |
US20070047726A1 (en) * | 2005-08-25 | 2007-03-01 | Cisco Technology, Inc. | System and method for providing contextual information to a called party |
EP2764899A3 (en) | 2005-08-29 | 2014-12-10 | Nant Holdings IP, LLC | Interactivity via mobile image recognition |
US8308563B2 (en) | 2005-08-30 | 2012-11-13 | Nintendo Co., Ltd. | Game system and storage medium having game program stored thereon |
US8157651B2 (en) | 2005-09-12 | 2012-04-17 | Nintendo Co., Ltd. | Information processing program |
US7418341B2 (en) * | 2005-09-12 | 2008-08-26 | Intelligent Spatial Technologies | System and method for the selection of a unique geographic feature |
US20070067387A1 (en) * | 2005-09-19 | 2007-03-22 | Cisco Technology, Inc. | Conferencing system and method for temporary blocking / restoring of individual participants |
US20070067303A1 (en) * | 2005-09-21 | 2007-03-22 | Jukka Linjama | System and method for user interaction |
US20070083918A1 (en) * | 2005-10-11 | 2007-04-12 | Cisco Technology, Inc. | Validation of call-out services transmitted over a public switched telephone network |
JP4851771B2 (en) * | 2005-10-24 | 2012-01-11 | 京セラ株式会社 | Information processing system and portable information terminal |
US8243895B2 (en) | 2005-12-13 | 2012-08-14 | Cisco Technology, Inc. | Communication system with configurable shared line privacy feature |
US20060227047A1 (en) * | 2005-12-13 | 2006-10-12 | Outland Research | Meeting locator system and method of using the same |
US20070075127A1 (en) * | 2005-12-21 | 2007-04-05 | Outland Research, Llc | Orientation-based power conservation for portable media devices |
US8280405B2 (en) * | 2005-12-29 | 2012-10-02 | Aechelon Technology, Inc. | Location based wireless collaborative environment with a visual user interface |
DK1806643T3 (en) * | 2006-01-06 | 2014-12-08 | Drnc Holdings Inc | Method of introducing commands and / or characters to a portable communication device equipped with an inclination sensor |
JP4530419B2 (en) | 2006-03-09 | 2010-08-25 | 任天堂株式会社 | Coordinate calculation apparatus and coordinate calculation program |
JP4151982B2 (en) | 2006-03-10 | 2008-09-17 | 任天堂株式会社 | Motion discrimination device and motion discrimination program |
US20070214041A1 (en) * | 2006-03-10 | 2007-09-13 | Cisco Technologies, Inc. | System and method for location-based mapping of soft-keys on a mobile communication device |
US20070214040A1 (en) * | 2006-03-10 | 2007-09-13 | Cisco Technology, Inc. | Method for prompting responses to advertisements |
WO2007108099A1 (en) * | 2006-03-20 | 2007-09-27 | Fujitsu Limited | Connection device, connection method, and connection program |
JP4684147B2 (en) | 2006-03-28 | 2011-05-18 | 任天堂株式会社 | Inclination calculation device, inclination calculation program, game device, and game program |
US20070239531A1 (en) * | 2006-03-30 | 2007-10-11 | Francoise Beaufays | Controlling the serving of serially rendered ads, such as audio ads |
US7917391B2 (en) * | 2006-04-19 | 2011-03-29 | Market Hardware, Inc. | Integrated marketing portal for businesses |
FI20060470A0 (en) * | 2006-05-12 | 2006-05-12 | Nokia Corp | Orientation-based retrieval of messages |
US7761110B2 (en) * | 2006-05-31 | 2010-07-20 | Cisco Technology, Inc. | Floor control templates for use in push-to-talk applications |
US8538676B2 (en) * | 2006-06-30 | 2013-09-17 | IPointer, Inc. | Mobile geographic information system and method |
US7602302B2 (en) * | 2006-08-08 | 2009-10-13 | Garmin Ltd. | Animal tracking apparatus and method |
US8145234B1 (en) * | 2006-09-13 | 2012-03-27 | At&T Mobility Ii Llc | Secure user plane location (SUPL) roaming |
US8683386B2 (en) * | 2006-10-03 | 2014-03-25 | Brian Mark Shuster | Virtual environment for computer game |
US7899161B2 (en) | 2006-10-11 | 2011-03-01 | Cisco Technology, Inc. | Voicemail messaging with dynamic content |
US20080094357A1 (en) * | 2006-10-20 | 2008-04-24 | Qualcomm Incorporated | Design for the mouse for any portable device |
US20080109517A1 (en) * | 2006-11-08 | 2008-05-08 | Cisco Technology, Inc. | Scheduling a conference in situations where a particular invitee is unavailable |
US7564406B2 (en) * | 2006-11-10 | 2009-07-21 | Sirf Technology, Inc. | Method and apparatus in standalone positioning without broadcast ephemeris |
US8687785B2 (en) | 2006-11-16 | 2014-04-01 | Cisco Technology, Inc. | Authorization to place calls by remote users |
US8028961B2 (en) | 2006-12-22 | 2011-10-04 | Central Signal, Llc | Vital solid state controller |
US7996019B2 (en) | 2006-12-26 | 2011-08-09 | Motorola Mobilty, Inc. | Intelligent location-based services |
JP5127242B2 (en) | 2007-01-19 | 2013-01-23 | 任天堂株式会社 | Acceleration data processing program and game program |
US7720919B2 (en) * | 2007-02-27 | 2010-05-18 | Cisco Technology, Inc. | Automatic restriction of reply emails |
CN101802879A (en) * | 2007-04-03 | 2010-08-11 | 人类网络实验室公司 | Method and apparatus for acquiring local position and overlaying information |
US20080268876A1 (en) * | 2007-04-24 | 2008-10-30 | Natasha Gelfand | Method, Device, Mobile Terminal, and Computer Program Product for a Point of Interest Based Scheme for Improving Mobile Visual Searching Functionalities |
US8456300B2 (en) * | 2007-05-09 | 2013-06-04 | Sony Ericsson Mobile Communications Ab | Methods, electronic devices, and computer program products for generating presence information associated with a user of an electronic device based on environmental information |
US7963526B2 (en) * | 2007-05-15 | 2011-06-21 | Freudenberg-Nok General Partnership | Prelubricated multi-lipped radial shaft seal with large radial offset accommodation |
JP2008299619A (en) * | 2007-05-31 | 2008-12-11 | Toshiba Corp | Mobile device, data transfer method, and data transfer system |
US8817061B2 (en) | 2007-07-02 | 2014-08-26 | Cisco Technology, Inc. | Recognition of human gestures by a mobile phone |
JP5024668B2 (en) * | 2007-07-10 | 2012-09-12 | 富士ゼロックス株式会社 | Image forming apparatus and information processing apparatus |
US7885145B2 (en) * | 2007-10-26 | 2011-02-08 | Samsung Electronics Co. Ltd. | System and method for selection of an object of interest during physical browsing by finger pointing and snapping |
US8073198B2 (en) * | 2007-10-26 | 2011-12-06 | Samsung Electronics Co., Ltd. | System and method for selection of an object of interest during physical browsing by finger framing |
US9582937B2 (en) * | 2008-01-02 | 2017-02-28 | Nokia Technologies Oy | Method, apparatus and computer program product for displaying an indication of an object within a current field of view |
US7427980B1 (en) | 2008-03-31 | 2008-09-23 | International Business Machines Corporation | Game controller spatial detection |
US7529542B1 (en) | 2008-04-21 | 2009-05-05 | International Business Machines Corporation | Method of establishing communication between two or more real world entities and apparatuses performing the same |
US20090315766A1 (en) | 2008-06-19 | 2009-12-24 | Microsoft Corporation | Source switching for devices supporting dynamic direction information |
US8700301B2 (en) | 2008-06-19 | 2014-04-15 | Microsoft Corporation | Mobile computing devices, architecture and user interfaces based on dynamic direction information |
US20090319166A1 (en) * | 2008-06-20 | 2009-12-24 | Microsoft Corporation | Mobile computing services based on devices with dynamic direction information |
US8467991B2 (en) | 2008-06-20 | 2013-06-18 | Microsoft Corporation | Data services based on gesture and location information of device |
US20090315775A1 (en) * | 2008-06-20 | 2009-12-24 | Microsoft Corporation | Mobile computing services based on devices with dynamic direction information |
US8010313B2 (en) | 2008-06-27 | 2011-08-30 | Movea Sa | Hand held pointing device with roll compensation |
US20140184509A1 (en) | 2013-01-02 | 2014-07-03 | Movea Sa | Hand held pointing device with roll compensation |
FR2933212B1 (en) | 2008-06-27 | 2013-07-05 | Movea Sa | MOVING CAPTURE POINTER RESOLVED BY DATA FUSION |
JP5866199B2 (en) | 2008-07-01 | 2016-02-17 | ヒルクレスト・ラボラトリーズ・インコーポレイテッド | 3D pointer mapping |
EP2146490A1 (en) * | 2008-07-18 | 2010-01-20 | Alcatel, Lucent | User device for gesture based exchange of information, methods for gesture based exchange of information between a plurality of user devices, and related devices and systems |
US9191238B2 (en) * | 2008-07-23 | 2015-11-17 | Yahoo! Inc. | Virtual notes in a reality overlay |
US20100083189A1 (en) * | 2008-09-30 | 2010-04-01 | Robert Michael Arlein | Method and apparatus for spatial context based coordination of information among multiple devices |
US9547352B2 (en) * | 2008-09-30 | 2017-01-17 | Avaya Inc. | Presence-based power management |
US20110063167A1 (en) * | 2009-09-15 | 2011-03-17 | Qualcomm Incorporated | Using magnetometer with a positioning system |
US9482755B2 (en) | 2008-11-17 | 2016-11-01 | Faro Technologies, Inc. | Measurement system having air temperature compensation between a target and a laser tracker |
JP5539388B2 (en) * | 2008-12-22 | 2014-07-02 | インテリジェント スペイシャル テクノロジーズ,インク. | System and method for searching a 3D scene by pointing a reference object |
CA2748026A1 (en) * | 2008-12-22 | 2010-07-01 | Intelligent Spatial Technologies, Inc. | System and method for initiating actions and providing feedback by pointing at object of interest |
CA2748031A1 (en) * | 2008-12-22 | 2010-07-01 | Intelligent Spatial Technologies, Inc. | System and method for linking real-world objects and object representations by pointing |
US8483519B2 (en) * | 2008-12-22 | 2013-07-09 | Ipointer Inc. | Mobile image search and indexing system and method |
US20100169157A1 (en) * | 2008-12-30 | 2010-07-01 | Nokia Corporation | Methods, apparatuses, and computer program products for providing targeted advertising |
US20100228612A1 (en) * | 2009-03-09 | 2010-09-09 | Microsoft Corporation | Device transaction model and services based on directional information of device |
DE102009015918A1 (en) * | 2009-03-25 | 2010-09-30 | Kuhn, Andrè | Input arrangement for display system e.g. navigation system, use in motor vehicles, has control device controlling screen, and connecting device provided between direction determination device and position determination device |
FR2946824B1 (en) * | 2009-06-15 | 2015-11-13 | Oberthur Technologies | ELECTRONIC ENTITY AND MICROCIRCUIT CARD FOR ELECTRONIC ENTITY. |
US20100332324A1 (en) | 2009-06-25 | 2010-12-30 | Microsoft Corporation | Portal services based on interactions with points of interest discovered via directional device information |
US8427508B2 (en) * | 2009-06-25 | 2013-04-23 | Nokia Corporation | Method and apparatus for an augmented reality user interface |
US8872767B2 (en) | 2009-07-07 | 2014-10-28 | Microsoft Corporation | System and method for converting gestures into digital graffiti |
US8818274B2 (en) | 2009-07-17 | 2014-08-26 | Qualcomm Incorporated | Automatic interfacing between a master device and object device |
US20110015940A1 (en) * | 2009-07-20 | 2011-01-20 | Nathan Goldfein | Electronic physician order sheet |
US8599066B1 (en) * | 2009-09-29 | 2013-12-03 | Mark A. Wessels | System, method, and apparatus for obtaining information of a visually acquired aircraft in flight |
KR100957575B1 (en) * | 2009-10-01 | 2010-05-11 | (주)올라웍스 | Method, terminal and computer-readable recording medium for performing visual search based on movement or pose of terminal |
WO2011041836A1 (en) * | 2009-10-08 | 2011-04-14 | Someones Group Intellectual Property Holdings Pty Ltd Acn 131 335 325 | Method, system and controller for sharing data |
US8494544B2 (en) * | 2009-12-03 | 2013-07-23 | Osocad Remote Limited Liability Company | Method, apparatus and computer program to perform location specific information retrieval using a gesture-controlled handheld mobile device |
US8400548B2 (en) | 2010-01-05 | 2013-03-19 | Apple Inc. | Synchronized, interactive augmented reality displays for multifunction devices |
EP2529596B1 (en) * | 2010-01-29 | 2014-07-16 | Koninklijke Philips N.V. | Interactive lighting control system and method |
JP2011199750A (en) * | 2010-03-23 | 2011-10-06 | Olympus Corp | Image capturing terminal, external terminal, image capturing system, and image capturing method |
US8376586B2 (en) * | 2010-03-26 | 2013-02-19 | Robert J. Abbatiello | Low-divergence light pointer apparatus for use through and against transparent surfaces |
US8700592B2 (en) | 2010-04-09 | 2014-04-15 | Microsoft Corporation | Shopping search engines |
US8638222B2 (en) | 2010-04-19 | 2014-01-28 | Microsoft Corporation | Controllable device selection based on controller location |
US9377885B2 (en) | 2010-04-21 | 2016-06-28 | Faro Technologies, Inc. | Method and apparatus for locking onto a retroreflector with a laser tracker |
US8537371B2 (en) | 2010-04-21 | 2013-09-17 | Faro Technologies, Inc. | Method and apparatus for using gestures to control a laser tracker |
US9772394B2 (en) | 2010-04-21 | 2017-09-26 | Faro Technologies, Inc. | Method and apparatus for following an operator and locking onto a retroreflector with a laser tracker |
US8422034B2 (en) | 2010-04-21 | 2013-04-16 | Faro Technologies, Inc. | Method and apparatus for using gestures to control a laser tracker |
US8619265B2 (en) | 2011-03-14 | 2013-12-31 | Faro Technologies, Inc. | Automatic measurement of dimensional data with a laser tracker |
US9400170B2 (en) | 2010-04-21 | 2016-07-26 | Faro Technologies, Inc. | Automatic measurement of dimensional data within an acceptance region by a laser tracker |
US8724119B2 (en) | 2010-04-21 | 2014-05-13 | Faro Technologies, Inc. | Method for using a handheld appliance to select, lock onto, and track a retroreflector with a laser tracker |
US9785987B2 (en) | 2010-04-22 | 2017-10-10 | Microsoft Technology Licensing, Llc | User interface for information presentation system |
JP5038465B2 (en) * | 2010-05-25 | 2012-10-03 | 任天堂株式会社 | Information processing program, information processing apparatus, information processing method, and information processing system |
EP2393056A1 (en) * | 2010-06-02 | 2011-12-07 | Layar B.V. | Acquiring, ranking and displaying points of interest for use in an augmented reality service provisioning system and graphical user interface for displaying such ranked points of interests |
US20120041966A1 (en) * | 2010-07-15 | 2012-02-16 | Virtual Beam, Inc. | Directional information search from a mobile device |
US9043296B2 (en) | 2010-07-30 | 2015-05-26 | Microsoft Technology Licensing, Llc | System of providing suggestions based on accessible and contextual information |
KR101357262B1 (en) * | 2010-08-13 | 2014-01-29 | 주식회사 팬택 | Apparatus and Method for Recognizing Object using filter information |
US8559869B2 (en) | 2011-09-21 | 2013-10-15 | Daniel R. Ash, JR. | Smart channel selective repeater |
US8565735B2 (en) * | 2010-10-29 | 2013-10-22 | Jeffrey L. Wohlwend | System and method for supporting mobile unit connectivity to venue specific servers |
US20120117181A1 (en) * | 2010-11-05 | 2012-05-10 | Verizon Patent And Licensing, Inc. | System for and method of providing mobile applications management |
US11175375B2 (en) | 2010-11-12 | 2021-11-16 | Position Imaging, Inc. | Position tracking system and method using radio signals and inertial sensing |
US10416276B2 (en) | 2010-11-12 | 2019-09-17 | Position Imaging, Inc. | Position tracking system and method using radio signals and inertial sensing |
US20120120230A1 (en) * | 2010-11-17 | 2012-05-17 | Utah State University | Apparatus and Method for Small Scale Wind Mapping |
JP5768361B2 (en) * | 2010-11-22 | 2015-08-26 | ソニー株式会社 | Transmission device, reception device, and content transmission / reception system |
US20120127012A1 (en) * | 2010-11-24 | 2012-05-24 | Samsung Electronics Co., Ltd. | Determining user intent from position and orientation information |
CN103003783B (en) | 2011-02-01 | 2016-01-20 | 松下电器(美国)知识产权公司 | Function expanding device, method for developing functions, Function Extension program and integrated circuit |
US8274508B2 (en) * | 2011-02-14 | 2012-09-25 | Mitsubishi Electric Research Laboratories, Inc. | Method for representing objects with concentric ring signature descriptors for detecting 3D objects in range images |
GB2518769A (en) | 2011-03-03 | 2015-04-01 | Faro Tech Inc | Target apparatus and method |
EP2684145A4 (en) * | 2011-03-07 | 2014-09-03 | Kba2 Inc | Systems and methods for analytic data gathering from image providers at an event or geographic location |
US9164173B2 (en) | 2011-04-15 | 2015-10-20 | Faro Technologies, Inc. | Laser tracker that uses a fiber-optic coupler and an achromatic launch to align and collimate two wavelengths of light |
US9686532B2 (en) | 2011-04-15 | 2017-06-20 | Faro Technologies, Inc. | System and method of acquiring three-dimensional coordinates using multiple coordinate measurement devices |
GB2504890A (en) | 2011-04-15 | 2014-02-12 | Faro Tech Inc | Enhanced position detector in laser tracker |
US9482529B2 (en) | 2011-04-15 | 2016-11-01 | Faro Technologies, Inc. | Three-dimensional coordinate scanner and method of operation |
US8884877B2 (en) * | 2011-04-29 | 2014-11-11 | Movea | Pointing device |
US8352639B2 (en) | 2011-05-06 | 2013-01-08 | Research In Motion Limited | Method of device selection using sensory input and portable electronic device configured for same |
US9945940B2 (en) | 2011-11-10 | 2018-04-17 | Position Imaging, Inc. | Systems and methods of wireless position tracking |
US20140257862A1 (en) * | 2011-11-29 | 2014-09-11 | Wildfire Defense Systems, Inc. | Mobile application for risk management |
CN103999551B (en) * | 2011-12-14 | 2018-08-07 | 飞利浦灯具控股公司 | Method and apparatus for controlling illumination |
KR101874853B1 (en) * | 2011-12-29 | 2018-07-05 | 주식회사 알티캐스트 | Method and device of synchronization between a mobile device and display device, mobile device, display device |
GB2515922A (en) | 2012-01-27 | 2015-01-07 | Faro Tech Inc | Inspection method with barcode identification |
KR101533320B1 (en) * | 2012-04-23 | 2015-07-03 | 주식회사 브이터치 | Apparatus for acquiring 3 dimension object information without pointer |
US10269182B2 (en) * | 2012-06-14 | 2019-04-23 | Position Imaging, Inc. | RF tracking with active sensory feedback |
WO2014005066A1 (en) * | 2012-06-28 | 2014-01-03 | Experience Proximity, Inc., Dba Oooii | Systems and methods for navigating virtual structured data relative to real-world locales |
TWI526041B (en) * | 2012-07-17 | 2016-03-11 | 廣達電腦股份有限公司 | Interaction system and interaction method |
US10180490B1 (en) | 2012-08-24 | 2019-01-15 | Position Imaging, Inc. | Radio frequency communication system |
US9298970B2 (en) | 2012-11-27 | 2016-03-29 | Nokia Technologies Oy | Method and apparatus for facilitating interaction with an object viewable via a display |
US10234539B2 (en) | 2012-12-15 | 2019-03-19 | Position Imaging, Inc. | Cycling reference multiplexing receiver system |
FR3000242A1 (en) | 2012-12-21 | 2014-06-27 | France Telecom | METHOD FOR MANAGING A GEOGRAPHIC INFORMATION SYSTEM SUITABLE FOR USE WITH AT LEAST ONE POINTING DEVICE, WITH CREATION OF ASSOCIATIONS BETWEEN DIGITAL OBJECTS |
US9733729B2 (en) | 2012-12-26 | 2017-08-15 | Movea | Method and device for sensing orientation of an object in space in a fixed frame of reference |
US9482741B1 (en) | 2013-01-18 | 2016-11-01 | Position Imaging, Inc. | System and method of locating a radio frequency (RF) tracking device using a calibration routine |
US10856108B2 (en) | 2013-01-18 | 2020-12-01 | Position Imaging, Inc. | System and method of locating a radio frequency (RF) tracking device using a calibration routine |
US9928652B2 (en) | 2013-03-01 | 2018-03-27 | Apple Inc. | Registration between actual mobile device position and environmental model |
US9679414B2 (en) | 2013-03-01 | 2017-06-13 | Apple Inc. | Federated mobile device positioning |
US9041914B2 (en) | 2013-03-15 | 2015-05-26 | Faro Technologies, Inc. | Three-dimensional coordinate scanner and method of operation |
US9264474B2 (en) | 2013-05-07 | 2016-02-16 | KBA2 Inc. | System and method of portraying the shifting level of interest in an object or location |
US8954122B2 (en) * | 2013-07-03 | 2015-02-10 | BluFlux RF Technologies, LLC | Electronic device case with antenna |
US10634761B2 (en) | 2013-12-13 | 2020-04-28 | Position Imaging, Inc. | Tracking system with mobile reader |
JP6851133B2 (en) | 2014-01-03 | 2021-03-31 | Harman International Industries, Incorporated | User-directed personal information assistant |
US9497728B2 (en) | 2014-01-17 | 2016-11-15 | Position Imaging, Inc. | Wireless relay station for radio frequency-based tracking system |
US9473745B2 (en) | 2014-01-30 | 2016-10-18 | Google Inc. | System and method for providing live imagery associated with map locations |
US10200819B2 (en) | 2014-02-06 | 2019-02-05 | Position Imaging, Inc. | Virtual reality and augmented reality functionality for mobile devices |
JP2015152483A (en) * | 2014-02-17 | 2015-08-24 | NEC Networks & System Integration Corp. | Location information acquisition system and location information acquisition method |
WO2015125210A1 (en) * | 2014-02-18 | 2015-08-27 | Hitachi Maxell, Ltd. | Information display device and information display program |
US9395174B2 (en) | 2014-06-27 | 2016-07-19 | Faro Technologies, Inc. | Determining retroreflector orientation by optimizing spatial fit |
CN104142823B (en) * | 2014-06-30 | 2016-04-27 | Tencent Technology (Shenzhen) Co., Ltd. | Input device and control system |
CN105472768B (en) * | 2014-09-10 | 2019-10-01 | Huawei Technologies Co., Ltd. | Method for establishing a wireless communication connection, and terminal device |
US9525968B2 (en) * | 2014-10-07 | 2016-12-20 | Broadsoft, Inc. | Methods, systems, and computer readable media for using bluetooth beacon information to obtain and publish fine grained user location information |
US10324474B2 (en) | 2015-02-13 | 2019-06-18 | Position Imaging, Inc. | Spatial diversity for relative position tracking |
US11132004B2 (en) | 2015-02-13 | 2021-09-28 | Position Imaging, Inc. | Spatial diversity for relative position tracking |
US10642560B2 (en) | 2015-02-13 | 2020-05-05 | Position Imaging, Inc. | Accurate geographic tracking of mobile devices |
US11501244B1 (en) | 2015-04-06 | 2022-11-15 | Position Imaging, Inc. | Package tracking systems and methods |
US10148918B1 (en) | 2015-04-06 | 2018-12-04 | Position Imaging, Inc. | Modular shelving systems for package tracking |
US10853757B1 (en) | 2015-04-06 | 2020-12-01 | Position Imaging, Inc. | Video for real-time confirmation in package tracking systems |
US11416805B1 (en) | 2015-04-06 | 2022-08-16 | Position Imaging, Inc. | Light-based guidance for package tracking systems |
US10444323B2 (en) | 2016-03-08 | 2019-10-15 | Position Imaging, Inc. | Expandable, decentralized position tracking systems and methods |
US11436553B2 (en) | 2016-09-08 | 2022-09-06 | Position Imaging, Inc. | System and method of object tracking using weight confirmation |
US10455364B2 (en) | 2016-12-12 | 2019-10-22 | Position Imaging, Inc. | System and method of personalized navigation inside a business enterprise |
US10634503B2 (en) | 2016-12-12 | 2020-04-28 | Position Imaging, Inc. | System and method of personalized navigation inside a business enterprise |
US10634506B2 (en) | 2016-12-12 | 2020-04-28 | Position Imaging, Inc. | System and method of personalized navigation inside a business enterprise |
WO2018126247A2 (en) | 2017-01-02 | 2018-07-05 | Mojoose, Inc. | Automatic signal strength indicator and automatic antenna switch |
US11120392B2 (en) | 2017-01-06 | 2021-09-14 | Position Imaging, Inc. | System and method of calibrating a directional light source relative to a camera's field of view |
CN107656286B (en) * | 2017-09-26 | 2019-07-23 | Wuhan University | Object localization method and system in a highly oblique, long-range observation environment |
CN113424197A (en) | 2018-09-21 | 2021-09-21 | Position Imaging, Inc. | Machine learning assisted self-improving object recognition system and method |
WO2020146861A1 (en) | 2019-01-11 | 2020-07-16 | Position Imaging, Inc. | Computer-vision-based object tracking and guidance module |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5508707A (en) * | 1994-09-28 | 1996-04-16 | U S West Technologies, Inc. | Method for determining position by obtaining directional information from spatial division multiple access (SDMA)-equipped and non-SDMA-equipped base stations |
US5884224A (en) * | 1997-03-07 | 1999-03-16 | J.R. Simplot Company | Mobile mounted remote sensing/application apparatus for interacting with selected areas of interest within a field |
US6009629A (en) * | 1996-03-13 | 2000-01-04 | Leica Geosystems Ag | Process for determining the direction of the earth's magnetic field |
US6173239B1 (en) * | 1998-09-30 | 2001-01-09 | Geo Vector Corporation | Apparatus and methods for presentation of information relating to objects being addressed |
US6381603B1 (en) * | 1999-02-22 | 2002-04-30 | Position Iq, Inc. | System and method for accessing local information by using referencing position system |
US6396475B1 (en) * | 1999-08-27 | 2002-05-28 | Geo Vector Corp. | Apparatus and methods of the remote address of objects |
US20020171581A1 (en) * | 1998-04-28 | 2002-11-21 | Leonid Sheynblat | Method and apparatus for providing location-based information via a computer network |
Family Cites Families (88)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2994971A (en) | 1959-01-28 | 1961-08-08 | Gilbert Co A C | Instructional sky scanner |
US3769894A (en) | 1967-11-22 | 1973-11-06 | Brunswick Corp | Golf game |
US3729315A (en) | 1970-10-01 | 1973-04-24 | Brunswick Corp | Method of making scenes for a golf game |
USRE28847E (en) | 1972-06-28 | 1976-06-08 | Honeywell Inc. | Inside helmet sight display apparatus |
US3923370A (en) | 1974-10-15 | 1975-12-02 | Honeywell Inc | Head mounted displays |
US3990296A (en) | 1975-01-08 | 1976-11-09 | Actron, A Division Of Mcdonnell Douglas Corporation | Acoustical holography imaging device |
SE429690B (en) | 1979-11-19 | 1983-09-19 | Hans Erik Ove Olofsson | AIRPLANE REFERRED WORLD REGISTRATION WITH THE HELP OF CAMERA, WHICH ALLOWS ELECTRONIC REGISTRATION, AND THEIR CONCERNING ENCOURAGEMENT PROCEDURE |
US4322726A (en) | 1979-12-19 | 1982-03-30 | The Singer Company | Apparatus for providing a simulated view to hand held binoculars |
US4425581A (en) | 1981-04-17 | 1984-01-10 | Corporation For Public Broadcasting | System for overlaying a computer generated video signal on an NTSC video signal |
US4439755A (en) | 1981-06-04 | 1984-03-27 | Farrand Optical Co., Inc. | Head-up infinity display and pilot's sight |
US4489389A (en) | 1981-10-02 | 1984-12-18 | Harris Corporation | Real time video perspective digital map display |
JPS58121091A (en) | 1982-01-14 | 1983-07-19 | Ikegami Tsushinki Co., Ltd. | Stereoscopic display system |
US4710873A (en) | 1982-07-06 | 1987-12-01 | Marvin Glass & Associates | Video game incorporating digitized images of being into game graphics |
US4645459A (en) | 1982-07-30 | 1987-02-24 | Honeywell Inc. | Computer generated synthesized imagery |
US4835532A (en) | 1982-07-30 | 1989-05-30 | Honeywell Inc. | Nonaliasing real-time spatial transform image processing system |
US4572203A (en) | 1983-01-27 | 1986-02-25 | Feinstein Steven B | Contact agents for ultrasonic imaging |
US4662635A (en) | 1984-12-16 | 1987-05-05 | Craig Enokian | Video game with playback of live events |
US4684990A (en) | 1985-04-12 | 1987-08-04 | Ampex Corporation | Method and apparatus for combining multiple video images in three dimensions |
US4736306A (en) | 1985-04-29 | 1988-04-05 | The United States Of America As Represented By The United States Department Of Energy | System for conversion between the boundary representation model and a constructive solid geometry model of an object |
US4947323A (en) | 1986-05-22 | 1990-08-07 | University Of Tennessee Research Corporation | Method and apparatus for measuring small spatial dimensions of an object |
US4805121A (en) | 1986-05-30 | 1989-02-14 | Dba Systems, Inc. | Visual training apparatus |
US4807158A (en) | 1986-09-30 | 1989-02-21 | Daleco/Ivex Partners, Ltd. | Method and apparatus for sampling images to simulate movement within a multidimensional space |
FR2610752B1 (en) | 1987-02-10 | 1989-07-21 | Sagem | METHOD FOR REPRESENTING THE PERSPECTIVE IMAGE OF A FIELD AND SYSTEM FOR IMPLEMENTING SAME |
GB8704560D0 (en) | 1987-02-26 | 1987-04-01 | Nautech Ltd | Hand bearing compass |
US4855822A (en) | 1988-01-26 | 1989-08-08 | Honeywell, Inc. | Human engineered remote driving system |
US5072218A (en) | 1988-02-24 | 1991-12-10 | Spero Robert E | Contact-analog headup display method and apparatus |
US4970666A (en) | 1988-03-30 | 1990-11-13 | Land Development Laboratory, Inc. | Computerized video imaging system for creating a realistic depiction of a simulated object in an actual environment |
US4939661A (en) | 1988-09-09 | 1990-07-03 | World Research Institute For Science And Technology | Apparatus for a video marine navigation plotter with electronic charting and methods for use therein |
GB8826550D0 (en) | 1988-11-14 | 1989-05-17 | Smiths Industries Plc | Image processing apparatus and methods |
US5020902A (en) | 1989-06-22 | 1991-06-04 | Kvh Industries, Inc. | Rangefinder with heads-up display |
US4992866A (en) | 1989-06-29 | 1991-02-12 | Morgan Jack B | Camera selection and positioning system and method |
NL8901695A (en) | 1989-07-04 | 1991-02-01 | Koninkl Philips Electronics Nv | METHOD FOR DISPLAYING NAVIGATION DATA FOR A VEHICLE IN AN ENVIRONMENTAL IMAGE OF THE VEHICLE, NAVIGATION SYSTEM FOR CARRYING OUT THE METHOD AND VEHICLE FITTING A NAVIGATION SYSTEM. |
US5410649A (en) | 1989-11-17 | 1995-04-25 | Texas Instruments Incorporated | Imaging computer system and network |
US5269065A (en) | 1990-03-20 | 1993-12-14 | Casio Computer Co., Ltd. | Compass including means for displaying constellation data |
US5124915A (en) | 1990-05-29 | 1992-06-23 | Arthur Krenzel | Computer-aided data collection system for assisting in analyzing critical situations |
US5528232A (en) | 1990-06-15 | 1996-06-18 | Savi Technology, Inc. | Method and apparatus for locating items |
US5189630A (en) | 1991-01-15 | 1993-02-23 | Barstow David R | Method for encoding and broadcasting information about live events using computer pattern matching techniques |
JP3247126B2 (en) | 1990-10-05 | 2002-01-15 | Texas Instruments Incorporated | Method and apparatus for providing a portable visual display |
US5467444A (en) | 1990-11-07 | 1995-11-14 | Hitachi, Ltd. | Method of three-dimensional display of object-oriented figure information and system thereof |
CA2060406C (en) | 1991-04-22 | 1998-12-01 | Bruce Edward Hamilton | Helicopter virtual image display system incorporating structural outlines |
FR2675977B1 (en) | 1991-04-26 | 1997-09-12 | Inst Nat Audiovisuel | METHOD FOR MODELING A SHOOTING SYSTEM AND METHOD AND SYSTEM FOR PRODUCING COMBINATIONS OF REAL IMAGES AND SYNTHESIS IMAGES. |
JPH0693937B2 (en) | 1991-05-30 | 1994-11-24 | Sega Enterprises, Ltd. | Video synchronizer for competitive game machines |
US5182641A (en) | 1991-06-17 | 1993-01-26 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Composite video and graphics display for camera viewing systems in robotics and teleoperation |
AU652051B2 (en) | 1991-06-27 | 1994-08-11 | Eastman Kodak Company | Electronically interpolated integral photography system |
US5367578A (en) | 1991-09-18 | 1994-11-22 | Ncr Corporation | System and method for optical recognition of bar-coded characters using template matching |
GB9121707D0 (en) | 1991-10-12 | 1991-11-27 | British Aerospace | Improvements in computer-generated imagery |
FR2683918B1 (en) | 1991-11-19 | 1994-09-09 | Thomson Csf | MATERIAL CONSTITUTING A RIFLE SCOPE AND WEAPON USING THE SAME. |
US5462275A (en) | 1991-12-20 | 1995-10-31 | Gordon Wilson | Player interactive live action football game |
US5252950A (en) | 1991-12-20 | 1993-10-12 | Apple Computer, Inc. | Display with rangefinder |
US5631973A (en) | 1994-05-05 | 1997-05-20 | Sri International | Method for telemanipulation with telepresence |
US5333874A (en) | 1992-05-06 | 1994-08-02 | Floyd L. Arnold | Sports simulator |
US5553864A (en) | 1992-05-22 | 1996-09-10 | Sitrick; David H. | User image integration into audiovisual presentation system and methodology |
US5342051A (en) | 1992-10-30 | 1994-08-30 | Accu-Sport International, Inc. | Apparatus and method for tracking the flight of a golf ball |
US5354063A (en) | 1992-12-04 | 1994-10-11 | Virtual Golf, Inc. | Double position golf simulator |
US5311203A (en) | 1993-01-29 | 1994-05-10 | Norton M Kent | Viewing and display apparatus |
US5815411A (en) | 1993-09-10 | 1998-09-29 | Criticom Corporation | Electro-optic vision system which exploits position and attitude |
US5457447A (en) | 1993-03-31 | 1995-10-10 | Motorola, Inc. | Portable power source and RF tag utilizing same |
US5454043A (en) | 1993-07-30 | 1995-09-26 | Mitsubishi Electric Research Laboratories, Inc. | Dynamic and static hand gesture recognition through low-level image analysis |
US5625765A (en) | 1993-09-03 | 1997-04-29 | Criticom Corp. | Vision systems including devices and methods for combining images for extended magnification schemes |
JPH07324935A (en) * | 1994-05-31 | 1995-12-12 | Sony Corp | Terrestrial-magnetism direction sensor and its manufacture |
JPH0863326A (en) | 1994-08-22 | 1996-03-08 | Hitachi Ltd | Image processing device/method |
US5528518A (en) | 1994-10-25 | 1996-06-18 | Laser Technology, Inc. | System and method for collecting data used to form a geographic information system database |
WO1996015517A2 (en) | 1994-11-02 | 1996-05-23 | Visible Interactive Corporation | Interactive personal interpretive device and system for retrieving information about a plurality of objects |
US5703961A (en) | 1994-12-29 | 1997-12-30 | Worldscape L.L.C. | Image transformation and synthesis methods |
US5796386A (en) | 1995-01-23 | 1998-08-18 | International Business Machines Corporation | Precise calibration procedure for sensor-based view point control system |
JPH09114851A (en) | 1995-10-20 | 1997-05-02 | Fuji Xerox Co Ltd | Information managing device |
JP3264614B2 (en) | 1996-01-30 | 2002-03-11 | Fuji Photo Optical Co., Ltd. | Observation device |
EP0914644B2 (en) * | 1996-07-26 | 2009-01-21 | Siemens Aktiengesellschaft | Device for receiving and for displaying cartographic data |
JPH1049290A (en) * | 1996-08-05 | 1998-02-20 | Sony Corp | Device and method for processing information |
JP3548357B2 (en) * | 1996-11-18 | 2004-07-28 | Honda Motor Co., Ltd. | Map information transmission device |
US5902347A (en) | 1996-11-19 | 1999-05-11 | American Navigation Systems, Inc. | Hand-held GPS-mapping device |
JP3500261B2 (en) * | 1996-12-20 | 2004-02-23 | Seiko Epson Corporation | Processing terminal device, information providing system, information obtaining method, and information providing method |
DE69810768D1 (en) * | 1997-06-03 | 2003-02-20 | Stephen Bide | PORTABLE NAVIGATION SYSTEM WITH DIRECTION DETECTOR, POSITION DETECTOR AND DATABASE |
JPH1168653A (en) * | 1997-08-22 | 1999-03-09 | Yoshio Watanabe | Mobile communication equipment |
JPH11160076A (en) * | 1997-12-01 | 1999-06-18 | Honda Motor Co Ltd | Notifying device for vehicle |
JPH11187469A (en) * | 1997-12-24 | 1999-07-09 | Casio Comput Co Ltd | Communication system |
JPH11259525A (en) * | 1998-03-06 | 1999-09-24 | Ntt Data Corp | Position dependent multi-media information provision system |
JP3587430B2 (en) * | 1998-03-18 | 2004-11-10 | Toshiba Corporation | Navigation system with wireless communication function |
JPH11288341A (en) * | 1998-04-01 | 1999-10-19 | Canon Inc | Device and method for navigation |
JP2000055682A (en) * | 1998-08-04 | 2000-02-25 | Alpine Electronics Inc | Navigation method |
DE19837568A1 (en) * | 1998-08-19 | 1999-06-17 | Gunar Schorcht | Pocket computer for personal digital and navigation assistant |
DE69915588T2 (en) * | 1998-09-17 | 2005-02-03 | Koninklijke Philips Electronics N.V. | REMOTE CONTROL DEVICE WITH LOCAL INTERFACE |
CA2373511C (en) * | 1999-05-19 | 2014-07-08 | Digimarc Corporation | Methods and systems for controlling computers or linking to internet resources from physical and electronic objects |
JP4808850B2 (en) * | 1999-05-19 | 2011-11-02 | Digimarc Corporation | Method and system for computer control from physical/electronic objects, i.e., linking to Internet resources |
EP1059510A1 (en) * | 1999-06-10 | 2000-12-13 | Texas Instruments Incorporated | Wireless location |
JP2001012964A (en) * | 1999-06-28 | 2001-01-19 | Casio Comput Co Ltd | System, apparatus and method for generating document and recording medium |
JP2002132806A (en) * | 2000-10-18 | 2002-05-10 | Fujitsu Ltd | Server system, and information providing service system and method |
US7031875B2 (en) * | 2001-01-24 | 2006-04-18 | Geo Vector Corporation | Pointing systems for addressing objects |
- 2001
  - 2001-01-24 US US09/769,012 patent/US7031875B2/en not_active Expired - Lifetime
  - 2001-10-29 EP EP01997159A patent/EP1354260A4/en not_active Withdrawn
  - 2001-10-29 WO PCT/US2001/050804 patent/WO2002059716A2/en active Search and Examination
  - 2001-10-29 NZ NZ560769A patent/NZ560769A/en unknown
  - 2001-10-29 JP JP2002559779A patent/JP2004531791A/en active Pending
- 2005
  - 2005-12-12 US US11/301,603 patent/US20060161379A1/en not_active Abandoned
- 2007
  - 2007-12-12 JP JP2007321215A patent/JP2008171410A/en active Pending
- 2010
  - 2010-04-20 JP JP2010096943A patent/JP5448998B2/en not_active Expired - Fee Related
- 2012
  - 2012-02-22 JP JP2012036051A patent/JP2012128869A/en active Pending
Non-Patent Citations (1)
Title |
---|
See also references of EP1354260A2 * |
Cited By (148)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8948544B2 (en) | 2000-11-06 | 2015-02-03 | Nant Holdings Ip, Llc | Object information derived from object images |
US9014512B2 (en) | 2000-11-06 | 2015-04-21 | Nant Holdings Ip, Llc | Object information derived from object images |
US10639199B2 (en) | 2000-11-06 | 2020-05-05 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9785651B2 (en) | 2000-11-06 | 2017-10-10 | Nant Holdings Ip, Llc | Object information derived from object images |
US9613284B2 (en) | 2000-11-06 | 2017-04-04 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9805063B2 (en) | 2000-11-06 | 2017-10-31 | Nant Holdings Ip Llc | Object information derived from object images |
US10635714B2 (en) | 2000-11-06 | 2020-04-28 | Nant Holdings Ip, Llc | Object information derived from object images |
US10617568B2 (en) | 2000-11-06 | 2020-04-14 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US10509821B2 (en) | 2000-11-06 | 2019-12-17 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US8218873B2 (en) | 2000-11-06 | 2012-07-10 | Nant Holdings Ip, Llc | Object information derived from object images |
US8218874B2 (en) | 2000-11-06 | 2012-07-10 | Nant Holdings Ip, Llc | Object information derived from object images |
US8224078B2 (en) | 2000-11-06 | 2012-07-17 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8224079B2 (en) | 2000-11-06 | 2012-07-17 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8224077B2 (en) | 2000-11-06 | 2012-07-17 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US8326031B2 (en) | 2000-11-06 | 2012-12-04 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US10509820B2 (en) | 2000-11-06 | 2019-12-17 | Nant Holdings Ip, Llc | Object information derived from object images |
US8335351B2 (en) | 2000-11-06 | 2012-12-18 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8437544B2 (en) | 2000-11-06 | 2013-05-07 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8457395B2 (en) | 2000-11-06 | 2013-06-04 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8463031B2 (en) | 2000-11-06 | 2013-06-11 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9785859B2 (en) | 2000-11-06 | 2017-10-10 | Nant Holdings Ip Llc | Image capture and identification system and process |
US8467600B2 (en) | 2000-11-06 | 2013-06-18 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8467602B2 (en) | 2000-11-06 | 2013-06-18 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8478037B2 (en) | 2000-11-06 | 2013-07-02 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8478047B2 (en) | 2000-11-06 | 2013-07-02 | Nant Holdings Ip, Llc | Object information derived from object images |
US8478036B2 (en) | 2000-11-06 | 2013-07-02 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8488880B2 (en) | 2000-11-06 | 2013-07-16 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8494271B2 (en) | 2000-11-06 | 2013-07-23 | Nant Holdings Ip, Llc | Object information derived from object images |
US8494264B2 (en) | 2000-11-06 | 2013-07-23 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US8498484B2 (en) | 2000-11-06 | 2013-07-30 | Nant Holdings Ip, Llc | Object information derived from object images |
US8520942B2 (en) | 2000-11-06 | 2013-08-27 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9578107B2 (en) | 2000-11-06 | 2017-02-21 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US8548245B2 (en) | 2000-11-06 | 2013-10-01 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8548278B2 (en) | 2000-11-06 | 2013-10-01 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8582817B2 (en) | 2000-11-06 | 2013-11-12 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US8588527B2 (en) | 2000-11-06 | 2013-11-19 | Nant Holdings Ip, Llc | Object information derived from object images |
US10500097B2 (en) | 2000-11-06 | 2019-12-10 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US10095712B2 (en) | 2000-11-06 | 2018-10-09 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US8712193B2 (en) | 2000-11-06 | 2014-04-29 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8718410B2 (en) | 2000-11-06 | 2014-05-06 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8774463B2 (en) | 2000-11-06 | 2014-07-08 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8792750B2 (en) | 2000-11-06 | 2014-07-29 | Nant Holdings Ip, Llc | Object information derived from object images |
US8798368B2 (en) | 2000-11-06 | 2014-08-05 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8798322B2 (en) | 2000-11-06 | 2014-08-05 | Nant Holdings Ip, Llc | Object information derived from object images |
US10089329B2 (en) | 2000-11-06 | 2018-10-02 | Nant Holdings Ip, Llc | Object information derived from object images |
US8824738B2 (en) | 2000-11-06 | 2014-09-02 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US8837868B2 (en) | 2000-11-06 | 2014-09-16 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8842941B2 (en) | 2000-11-06 | 2014-09-23 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8849069B2 (en) | 2000-11-06 | 2014-09-30 | Nant Holdings Ip, Llc | Object information derived from object images |
US8855423B2 (en) | 2000-11-06 | 2014-10-07 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8861859B2 (en) | 2000-11-06 | 2014-10-14 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8867839B2 (en) | 2000-11-06 | 2014-10-21 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8873891B2 (en) | 2000-11-06 | 2014-10-28 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8885982B2 (en) | 2000-11-06 | 2014-11-11 | Nant Holdings Ip, Llc | Object information derived from object images |
US8885983B2 (en) | 2000-11-06 | 2014-11-11 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9808376B2 (en) | 2000-11-06 | 2017-11-07 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8938096B2 (en) | 2000-11-06 | 2015-01-20 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8948459B2 (en) | 2000-11-06 | 2015-02-03 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8463030B2 (en) | 2000-11-06 | 2013-06-11 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US10772765B2 (en) | 2000-11-06 | 2020-09-15 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8923563B2 (en) | 2000-11-06 | 2014-12-30 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9536168B2 (en) | 2000-11-06 | 2017-01-03 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8948460B2 (en) | 2000-11-06 | 2015-02-03 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9014516B2 (en) | 2000-11-06 | 2015-04-21 | Nant Holdings Ip, Llc | Object information derived from object images |
US9014515B2 (en) | 2000-11-06 | 2015-04-21 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9014514B2 (en) | 2000-11-06 | 2015-04-21 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9014513B2 (en) | 2000-11-06 | 2015-04-21 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US10080686B2 (en) | 2000-11-06 | 2018-09-25 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9020305B2 (en) | 2000-11-06 | 2015-04-28 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9025813B2 (en) | 2000-11-06 | 2015-05-05 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9025814B2 (en) | 2000-11-06 | 2015-05-05 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9031290B2 (en) | 2000-11-06 | 2015-05-12 | Nant Holdings Ip, Llc | Object information derived from object images |
US9031278B2 (en) | 2000-11-06 | 2015-05-12 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9036947B2 (en) | 2000-11-06 | 2015-05-19 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9036949B2 (en) | 2000-11-06 | 2015-05-19 | Nant Holdings Ip, Llc | Object information derived from object images |
US9036862B2 (en) | 2000-11-06 | 2015-05-19 | Nant Holdings Ip, Llc | Object information derived from object images |
US9036948B2 (en) | 2000-11-06 | 2015-05-19 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9046930B2 (en) | 2000-11-06 | 2015-06-02 | Nant Holdings Ip, Llc | Object information derived from object images |
US9844467B2 (en) | 2000-11-06 | 2017-12-19 | Nant Holdings Ip Llc | Image capture and identification system and process |
US9087240B2 (en) | 2000-11-06 | 2015-07-21 | Nant Holdings Ip, Llc | Object information derived from object images |
US9104916B2 (en) | 2000-11-06 | 2015-08-11 | Nant Holdings Ip, Llc | Object information derived from object images |
US9110925B2 (en) | 2000-11-06 | 2015-08-18 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9116920B2 (en) | 2000-11-06 | 2015-08-25 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9135355B2 (en) | 2000-11-06 | 2015-09-15 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9844469B2 (en) | 2000-11-06 | 2017-12-19 | Nant Holdings Ip Llc | Image capture and identification system and process |
US9141714B2 (en) | 2000-11-06 | 2015-09-22 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9148562B2 (en) | 2000-11-06 | 2015-09-29 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9154695B2 (en) | 2000-11-06 | 2015-10-06 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9154694B2 (en) | 2000-11-06 | 2015-10-06 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9152864B2 (en) | 2000-11-06 | 2015-10-06 | Nant Holdings Ip, Llc | Object information derived from object images |
US9170654B2 (en) | 2000-11-06 | 2015-10-27 | Nant Holdings Ip, Llc | Object information derived from object images |
US9182828B2 (en) | 2000-11-06 | 2015-11-10 | Nant Holdings Ip, Llc | Object information derived from object images |
US9844468B2 (en) | 2000-11-06 | 2017-12-19 | Nant Holdings Ip Llc | Image capture and identification system and process |
US9235600B2 (en) | 2000-11-06 | 2016-01-12 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9244943B2 (en) | 2000-11-06 | 2016-01-26 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9262440B2 (en) | 2000-11-06 | 2016-02-16 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9288271B2 (en) | 2000-11-06 | 2016-03-15 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US9311552B2 (en) | 2000-11-06 | 2016-04-12 | Nant Holdings IP, LLC. | Image capture and identification system and process |
US9311554B2 (en) | 2000-11-06 | 2016-04-12 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9310892B2 (en) | 2000-11-06 | 2016-04-12 | Nant Holdings Ip, Llc | Object information derived from object images |
US9311553B2 (en) | 2000-11-06 | 2016-04-12 | Nant Holdings IP, LLC. | Image capture and identification system and process |
US9317769B2 (en) | 2000-11-06 | 2016-04-19 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9324004B2 (en) | 2000-11-06 | 2016-04-26 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9330327B2 (en) | 2000-11-06 | 2016-05-03 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9330328B2 (en) | 2000-11-06 | 2016-05-03 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9330326B2 (en) | 2000-11-06 | 2016-05-03 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9336453B2 (en) | 2000-11-06 | 2016-05-10 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9342748B2 (en) | 2000-11-06 | 2016-05-17 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9360945B2 (en) | 2000-11-06 | 2016-06-07 | Nant Holdings Ip, Llc | Object information derived from object images |
US9844466B2 (en) | 2000-11-06 | 2017-12-19 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9824099B2 (en) | 2000-11-06 | 2017-11-21 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US8970725B2 (en) | 2002-11-20 | 2015-03-03 | Koninklijke Philips N.V. | User interface system based on pointing device |
US8971629B2 (en) | 2002-11-20 | 2015-03-03 | Koninklijke Philips N.V. | User interface system based on pointing device |
US8537231B2 (en) | 2002-11-20 | 2013-09-17 | Koninklijke Philips N.V. | User interface system based on pointing device |
US7940986B2 (en) | 2002-11-20 | 2011-05-10 | Koninklijke Philips Electronics N.V. | User interface system based on pointing device |
JP2009245444A (en) * | 2002-11-20 | 2009-10-22 | Koninkl Philips Electronics Nv | User interface system based on pointing device |
US7698387B2 (en) | 2004-07-06 | 2010-04-13 | Fujitsu Limited | Server system, user terminal, service providing method and service providing system using the server system and the user terminal for providing position-based service to the user terminal |
US9451219B2 (en) | 2004-12-31 | 2016-09-20 | Nokia Technologies Oy | Provision of target specific information |
US9596414B2 (en) | 2004-12-31 | 2017-03-14 | Nokia Technologies Oy | Provision of target specific information |
EP1839193A1 (en) * | 2004-12-31 | 2007-10-03 | Nokia Corporation | Provision of target specific information |
EP2264621A3 (en) * | 2004-12-31 | 2011-11-23 | Nokia Corp. | Provision of target specific information |
EP2264622A3 (en) * | 2004-12-31 | 2011-12-21 | Nokia Corp. | Provision of target specific information |
WO2009024881A1 (en) * | 2007-08-23 | 2009-02-26 | Telefonaktiebolaget Lm Ericsson (Publ) | System and method for gesture-based command and control of targets in wireless network |
WO2009024882A1 (en) * | 2007-08-23 | 2009-02-26 | Telefonaktiebolaget Lm Ericsson (Publ) | Method and apparatus for sending data relating to a target to a mobile device |
TWI514337B (en) * | 2009-02-20 | 2015-12-21 | 尼康股份有限公司 | Carrying information machines, photographic devices, and information acquisition systems |
US9071641B2 (en) | 2009-09-30 | 2015-06-30 | Biglobe Inc. | Mapping system that displays nearby regions based on direction of travel, speed, and orientation |
US11348480B2 (en) | 2010-02-24 | 2022-05-31 | Nant Holdings Ip, Llc | Augmented reality panorama systems and methods |
US10535279B2 (en) | 2010-02-24 | 2020-01-14 | Nant Holdings Ip, Llc | Augmented reality panorama supporting visually impaired individuals |
WO2011142700A1 (en) | 2010-05-12 | 2011-11-17 | Telefonaktiebolaget L M Ericsson (Publ) | Method, computer program and apparatus for determining an object in sight |
US9020753B2 (en) | 2010-05-12 | 2015-04-28 | Telefonaktiebolaget L M Ericsson (Publ) | Method, computer program and apparatus for determining an object in sight |
US11514652B2 (en) | 2011-04-08 | 2022-11-29 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US10403051B2 (en) | 2011-04-08 | 2019-09-03 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US11107289B2 (en) | 2011-04-08 | 2021-08-31 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US10127733B2 (en) | 2011-04-08 | 2018-11-13 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US11854153B2 (en) | 2011-04-08 | 2023-12-26 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US8810598B2 (en) | 2011-04-08 | 2014-08-19 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US9396589B2 (en) | 2011-04-08 | 2016-07-19 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US9824501B2 (en) | 2011-04-08 | 2017-11-21 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US10726632B2 (en) | 2011-04-08 | 2020-07-28 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US11869160B2 (en) | 2011-04-08 | 2024-01-09 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
WO2012163427A1 (en) * | 2011-06-01 | 2012-12-06 | Sony Ericsson Mobile Communications Ab | Catch the screen |
US9143882B2 (en) | 2011-06-01 | 2015-09-22 | Sony Corporation | Catch the screen |
EP2688318A1 (en) * | 2012-07-17 | 2014-01-22 | Alcatel Lucent | Conditional interaction control for a virtual object |
WO2014012717A1 (en) * | 2012-07-17 | 2014-01-23 | Alcatel Lucent | Conditional interaction control for a virtual object |
US9571999B2 (en) | 2012-07-17 | 2017-02-14 | Alcatel Lucent | Conditional interaction control for a virtual object |
US10664518B2 (en) | 2013-10-17 | 2020-05-26 | Nant Holdings Ip, Llc | Wide area augmented reality location-based services |
US11392636B2 (en) | 2013-10-17 | 2022-07-19 | Nant Holdings Ip, Llc | Augmented reality position-based service, methods, and systems |
US10140317B2 (en) | 2013-10-17 | 2018-11-27 | Nant Holdings Ip, Llc | Wide area augmented reality location-based services |
Also Published As
Publication number | Publication date |
---|---|
WO2002059716A3 (en) | 2003-06-05 |
JP5448998B2 (en) | 2014-03-19 |
EP1354260A2 (en) | 2003-10-22 |
EP1354260A4 (en) | 2006-01-11 |
JP2012128869A (en) | 2012-07-05 |
US20020140745A1 (en) | 2002-10-03 |
JP2010205281A (en) | 2010-09-16 |
NZ560769A (en) | 2009-04-30 |
JP2004531791A (en) | 2004-10-14 |
US20060161379A1 (en) | 2006-07-20 |
US7031875B2 (en) | 2006-04-18 |
JP2008171410A (en) | 2008-07-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7031875B2 (en) | Pointing systems for addressing objects | |
Spohrer | Information in places | |
US7290000B2 (en) | Server, user terminal, information providing service system, and information providing service method | |
Burigat et al. | Location-aware visualization of VRML models in GPS-based mobile guides | |
Simon et al. | A mobile application framework for the geospatial web | |
Krüger et al. | The connected user interface: Realizing a personal situated navigation service | |
CN104335268B (en) | For changing the mthods, systems and devices for providing 3-D transition animation for map view | |
CN102754097A (en) | Method and apparatus for presenting a first-person world view of content | |
US20170115749A1 (en) | Systems And Methods For Presenting Map And Other Information Based On Pointing Direction | |
EP2202616B1 (en) | A method and apparatus to browse and access downloaded contextual information | |
US20080091654A1 (en) | Constellation Search Apparatus, Constellation Search Program, And Computer-Readable Storage Medium Storing Constellation Search Program | |
US8380427B2 (en) | Showing realistic horizons on mobile computing devices | |
Vlahakis et al. | Design and Application of an Augmented Reality System for continuous, context-sensitive guided tours of indoor and outdoor cultural sites and museums. | |
Laakso | Evaluating the use of navigable three-dimensional maps in mobile devices | |
Simon et al. | Beyond location based–the spatially aware mobile phone | |
Lautenschläger | Design and implementation of a campus navigation application with augmented reality for smartphones | |
Simon et al. | The point to discover GeoWand | |
AU2002248278A1 (en) | Pointing systems for addressing objects | |
Martín et al. | A context-aware application based on ubiquitous location | |
Chittaro et al. | Location-aware visualization of a 3D world to select tourist information on a mobile device | |
Inoue et al. | Use of human geographic recognition to reduce GPS error in mobile mapmaking learning | |
Luimula et al. | Techniques for location selection on a mobile device | |
Simon et al. | Enabling spatially aware mobile applications | |
Agostini | Mixed Reality Mobile Technologies and Tools in Education and Training | |
Ranacher | The social network I go...: design and implementation of a web-based social network service with a geographic-coordinative background / submitted by Peter Ranacher |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AK | Designated states | Kind code of ref document: A2. Designated state(s): AU CA FI JP KP NZ |
| AL | Designated countries for regional patents | Kind code of ref document: A2. Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| WWE | Wipo information: entry into national phase | Ref document number: 2001997159. Country of ref document: EP |
| DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | |
| WWE | Wipo information: entry into national phase | Ref document number: 2002559779. Country of ref document: JP |
| WWE | Wipo information: entry into national phase | Ref document number: 2002248278. Country of ref document: AU |
| WWE | Wipo information: entry into national phase | Ref document number: 527804. Country of ref document: NZ |
| WWP | Wipo information: published in national office | Ref document number: 2001997159. Country of ref document: EP |
| WWW | Wipo information: withdrawn in national office | Ref document number: 2001997159. Country of ref document: EP |
| DPE2 | Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101) | |