WO1999042946A2 - Apparatus and methods for presentation of information relating to objects being addressed - Google Patents
- Publication number
- WO1999042946A2 (PCT/US1999/003531)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- geometric
- addressed
- attitude
- determining
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
Definitions
- the following invention disclosure is generally concerned with devices and techniques for presenting recorded information relating to objects, and specifically concerned with presenting recorded information relating to objects having an association with a particular location.
- Prior Art: Systems have been devised to display images of objects which may be in the field-of-view of a vision system. Images may be formed in response to a determination of the position and attitude of the vision system, which locates the field-of-view with respect to the objects being addressed. Details may be fully appreciated in consideration of U.S. Patent Nos. 5,625,765; 5,682,332; and 5,742,521. While these systems are highly useful and sophisticated, they may require complex imaging apparatus and techniques for forming composite images which are aligned to actual objects.
- the present invention includes devices and methods for presenting information relating to objects having an association with a particular geometry and location.
- devices are arranged with a pointing reference and user interface.
- a device which is pointed toward an object known to a computer database, responds by determining which objects are being addressed and presenting information which relates to the objects at the interface.
- the device makes a determination of the position and attitude of the device.
- a reference address indicator associated with the determined position and attitude is defined and used in a search of a database.
- the database comprised of data elements having identifiers or descriptors associated with position and other spatial definition may form a geometric intersection with the reference address indicator.
- the search produces output which includes information about objects which are being addressed by the device. The information is presented to a user via a user interface.
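- The flow just summarized (measure position and attitude, define an address indicator, search the database, present results) can be sketched in a few lines. The sketch below is purely illustrative and not part of the disclosure; it assumes each record's geometric descriptor is a simple sphere and the address indicator is a ray:

```python
def addressed(position, direction, database):
    """Return records whose descriptor sphere intersects the address
    indicator -- here a simple ray from `position` along unit `direction`.
    Each record is (name, sphere_center, sphere_radius, stored_info)."""
    hits = []
    for name, center, radius, info in database:
        oc = [c - p for c, p in zip(center, position)]
        t = sum(o * d for o, d in zip(oc, direction))   # along-ray projection
        if t < 0:
            continue                                    # object is behind us
        off = [o - t * d for o, d in zip(oc, direction)]
        if sum(c * c for c in off) <= radius ** 2:
            hits.append((name, info))
    return hits

db = [("hotel", (10.0, 0.0, 0.0), 2.0, "A seaside hotel."),
      ("tower", (0.0, 10.0, 0.0), 1.0, "A radio tower.")]
print(addressed((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), db))
# [('hotel', 'A seaside hotel.')]
```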
- Figure 1 is a block diagram illustrating major elements of a device of the invention
- Figure 2 is a block diagram showing the configuration of a database of the invention
- Figure 3 is a geometric construct of interest
- Figure 4 shows a similar geometric construct which illustrates an important geometry
- Figures 5 - 15 similarly show geometries of importance.
- In each of the preferred embodiments of the invention there is provided an apparatus for, and method of, presenting information relating to an object being addressed. It will be appreciated that each of the embodiments described may include both an apparatus and a method, and that the apparatus and method of one preferred embodiment may be different from the apparatus and method of another embodiment.
- a geometric descriptor is a mathematical definition of a geometric body.
- a geometric descriptor of the invention is used in association with an object which may be addressed by systems of the invention.
- An information element is a database record which relates to a particular object of interest.
- An information element comprises many forms of multi-media data including but not limited to: text, audio recordings, video streams, pictures, photographs, icons, Java applets, etc.
- each information element has associated therewith a geometric descriptor.
- Address is a term used herein as a verb, most commonly with the gerund -ing, to indicate a relationship between a device of the invention and an object; the object being the subject of the address.
- a device of the invention which is pointing at an object is said to be 'addressing' the object.
- An address indicator may be a geometric body, examples include vectors and cones, which has a pointing direction associated therewith. In addition to a reference point and reference pointing direction, some address indicators, for example a cone, subtend a solid angle or otherwise have spatial extent.
- a range gate is a geometric segment which is a subset of an address indicator having a first endpoint or planar region at some minimum distance from a point reference and a second endpoint or planar region at some maximum distance from the same point reference.
- objects refer to any element which may be of interest to a user.
- An object may be a real tangible object or may be a figurative element in space.
- the term 'object' should be read in a liberal sense. Although buildings and mountains suggest concrete forms of objects, objects for purposes of this disclosure include abstract forms as well. For example, the region of airspace over an airport which may be restricted is considered an 'object'. Indeed any region of space may be considered an object whether it actually contains a tangible object therein or not.
- apparatus include the following elements as described hereafter.
- Devices of the invention include a point reference and a directional reference. These may be mere structural constructs.
- the actual point and directional references may or may not correspond to any tangible object or element of the device. Alternatively, they may be collocated with actual physical elements of the device. In either case, an important relationship is made between them and a position and attitude determining means which are also included in devices of the invention.
- a position determining means is arranged to measure the position of the point reference. Since in many embodiments of the invention the position determining means is a global positioning system (GPS) receiver, the point reference lies at the center of the sphere which is defined by the resolution limits of the positioning system. For practical purposes, a handheld receiver which includes a GPS antenna may be said to have the point reference within the handheld unit. The position determining means therefore measures the position of the handheld unit. Many forms of alternate positioning systems may be used to accomplish the identical task. The particular positioning system employed may be chosen for a specific task at hand; for example, a global positioning system would not be appropriate for a small space such as a warehouse, so a radio triangulation technique may be preferred. The essence of the invention is not changed by the particular choice of positioning system.
- versions of the invention should not be limited to one particular type of positioning system.
- the limitation described by 'position determining means' is met when the position of the point reference is measured and made available to a computer processor. Therefore, by use of the term "position determining means" it is meant that any conceivable means for determining the position of a point reference and making that position known to a computer is anticipated. Experts will recognize that there are many thousands of possible ways of determining position and it will not serve a further understanding of the invention to attempt to catalogue them here. The reader will appreciate that the broadest possible definition of "position determining means" is intended here.
- Systems of the invention also include an attitude determining means.
- An attitude determining means is arranged to determine the pointing direction or orientation of a directional reference.
- an electronic compass will serve as an attitude determining means.
- More sophisticated versions will include an attitude determining means which is operable for measuring inclination as well as direction in a horizontal plane.
- Although an electronic flux gate compass or laser gyroscope system may be used in certain versions of the invention, it does not improve the description to limit the attitude determining means to any particular device. Similar to the position determining means described above, the limitation described as 'attitude determining means' is fully met by any device or system which may be used to determine the attitude of a directional reference and make that information known to a computer processor.
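- As a concrete illustration (an assumption of this sketch, not a limitation of the disclosure), a compass azimuth and an inclinometer reading can be combined into a unit pointing vector in a local east-north-up frame:

```python
import math

def direction_vector(azimuth_deg, inclination_deg):
    """Convert a compass azimuth (degrees clockwise from north) and an
    inclination (degrees above the horizontal) into a unit vector in a
    local east-north-up (ENU) frame."""
    az = math.radians(azimuth_deg)
    inc = math.radians(inclination_deg)
    east = math.cos(inc) * math.sin(az)
    north = math.cos(inc) * math.cos(az)
    up = math.sin(inc)
    return (east, north, up)

# Pointing due east with no tilt yields the ENU east axis.
e, n, u = direction_vector(90.0, 0.0)
print(round(e, 6), round(n, 6), round(u, 6))   # 1.0 0.0 0.0
```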
- a user interface of the invention serves to convey information to a user of the device.
- a simple speaker driven by computer audio systems is operational for producing audio information and description to a user.
- a display screen driven by video systems of a computer functions to present video or graphic information to a user.
- other systems include non-display type visual systems such as simple light emitting diodes, or non-speaker audio systems such as buzzers, tactile outputs such as vibrating systems, et cetera.
- a user interface includes a transducer which is electronically driven by the computer which produces some physical disturbance which can be detected by a user's senses.
- systems of the invention include a computer programmed to execute specific routines.
- a computer is arranged to receive inputs from the position and attitude determining means. From these inputs, the computer defines a geometric body as an address indicator in association with the device reference point and pointing direction. From this geometric body definition, the computer performs a database search and determines if any of the geometric objects described in the information element geometric descriptors intersects the address indicator. Information elements which are determined to intersect said address indicator have data associated therewith which may be recalled and played back to the user interface as appropriate and in agreement with other criteria which may be selected.
- a database is arranged to accommodate data relating to objects of interest.
- Data relating to objects is prepared and stored in a predetermined and well organized fashion.
- the data may be stored in many formats and configurations and may be of the nature sometimes referred to as 'multi-media'.
- a database of the invention is comprised of a plurality of information elements. Each information element relates to a particular object which may be of interest to users of devices of the invention. Each information element contains a descriptor which describes a geometry and location relating to the object for which the stored information pertains.
- a geometric descriptor may describe a geometry which includes: a single point; a polygon which defines a planar region; a solid such as a sphere; or even a three-dimensional object of arbitrary shape. Thus the rules which perfectly describe those geometries well known in the sciences are used in geometric descriptors of the invention. In all cases, a geometric descriptor includes at least one point and more frequently includes a set of many points. Methods of the invention are best described as being comprised of the following steps.
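- One illustrative way to store information elements of this kind (the class and field names below are invented, not taken from the disclosure) is to tag each geometric descriptor with its kind:

```python
from dataclasses import dataclass
from typing import List, Tuple, Union

@dataclass
class PointDescriptor:
    point: Tuple[float, float, float]

@dataclass
class SphereDescriptor:
    center: Tuple[float, float, float]
    radius: float

@dataclass
class PolygonDescriptor:
    vertices: List[Tuple[float, float, float]]   # boundary of a planar region

Descriptor = Union[PointDescriptor, SphereDescriptor, PolygonDescriptor]

@dataclass
class InformationElement:
    identifier: str
    descriptor: Descriptor
    media: dict   # e.g. {"text": ..., "audio": ..., "image": ...}

hotel = InformationElement(
    "hotel",
    SphereDescriptor(center=(10.0, 0.0, 5.0), radius=12.0),
    {"text": "Historic seaside hotel, built 1921."},
)
print(hotel.descriptor.radius)   # 12.0
```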
- an object is addressed.
- the device pointing reference is merely pointed toward the object.
- the device is necessarily pointing in some direction at all times. Although it is not a necessity that the device be pointing to an object known to the database, the device is always pointing at something and thus it is said that it is "addressing" something at all times.
- the position of the device reference point is determined.
- a GPS employed locally at the device operates to measure the global position of the reference point. Although convenient measurement units might be latitude, longitude and altitude, others similarly provide workable solutions. Data from the position determining step is passed to the computer processor.
- the attitude of the device directional reference is determined.
- a compass may be used to determine which direction the device pointing reference is being pointed.
- Data from the attitude determining step is similarly passed to the computer processor.
- Data received at the computer processor from the position and attitude determining means is used to define an address indicator.
- a search of database information elements is commenced.
- a search operation reads database geometric descriptors and performs a coincidence test to see if an address indicator intersects any of the points in a geometry described. Items meeting that criterion are recalled for further processing and presentation at a user interface.
- Figure 1 illustrates a block diagram of major components of devices of the invention.
- a point reference 1 is a geometric construct to which measurements of position are directed. The point may correspond to an actual device such as a GPS antenna or may alternatively be merely a point in space having a convenient location within a physical device.
- a directional reference 2 similarly forms a geometric construct at a device of the invention but is otherwise arbitrary with respect to any physical element or part of a device of the invention.
- a position determining means 3 is in communication with the point reference and is arranged to measure its position. The position determining means is further in communication with a computer.
- the position measurement is made without regard to any particular coordinate system in various versions of the invention, but some versions using GPS devices preferably use a latitude, longitude and altitude scheme which allows one to define position anywhere on or above the Earth's surface. Determination of position is the essence of the function performed by the device; the particular coordinate system may be chosen for convenience.
- An attitude determining means 4 is arranged in communication with a directional reference 2 and with a computer. The attitude determining means measures the pointing direction of the directional reference and reports that information to the computer processor.
- a computer processor 5 is coupled to and receives measurement data from position and attitude determining means.
- the computer is further connected to a user interface 6 and presents information to a user by way of the interface.
- the computer includes a database 7 which may contain preprogrammed information.
- a database 21 of the invention has a special construction.
- the database may include a great plurality of basic units, each referred to throughout as an information element 22.
- An information element may contain stored information in various formats 23.
- Each information element contains a descriptor 24 which defines a geometric body of interest. Additional information elements 25, each having their own descriptors and stored information, further make up the database.
- the database is comprised of any number of information elements, the last element being the Nth element 26.
- Drawing Figure 3 illustrates a simple geometric construction showing a point reference 31, a directional reference 32, a rectangular cylinder 33 and a circular cylinder 34. A portion of space 35 indicated by a dotted line is shared by the rectangular cylinder and an address indicator 36.
- the address indicator, in this case a simple vector, has an endpoint coincident with the point reference and is collinear with the directional reference.
- a computer routine is executed to determine which objects are intersected by the vector and which are not.
- the square cylinder is intersected by the vector but the circular cylinder is not.
- a device having a reference point 31 and directional reference 32 is said to be addressing the square cylinder.
- a computer having programmed information as to the location and shape of the cylinders can tell when a vector is intersecting the space of interest and when it is not. This fact depends on the condition that the cylinders remain stationary after the computer is programmed.
- the computer only needs the preprogrammed information and a measurement of the device point and direction references.
- the computer does not require input from any real object which may be associated with the space of interest and does not need to detect or probe it in any way.
- consider a hotel occupying the square cylindrical space: the hotel is implicitly addressed whenever the device addresses that space. If a construction crew removes the hotel and the computer is not updated, the computer still assumes the presence of the building, because the space defined by the information element (hotel) geometric descriptor remains despite the actual presence, or absence, of the building therein.
- devices of the invention merely determine what space is being addressed and imply that particular objects are being addressed by way of the association of objects to spatial definitions or geometric descriptors.
- the mere fact that information contained in the database is accurate suggests and implies the presence of the hotel. It is the geometric descriptor which is preprogrammed into the computer which dictates if an intersection exists or not. The actual presence of an object does not affect whether the device is addressing it or not. It is useful to point out that one may generally rely on a hotel remaining in its set position. One may rely on this technique for most items of interest. For example, the Empire State Building presently occupies a space which is well defined. It is a very reliable fact that the Empire State Building will similarly occupy that same space tomorrow.
- a device of the invention which is pointed at the Empire State Building, the position of the device and attitude being measured and defining an address vector, can reasonably deduce that the building is there. In this way, devices of the invention 'know' what they are addressing simply by measuring their own position and attitude and comparing that information with information in a database.
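- The intersection determination illustrated in Figure 3 can be sketched with the standard slab test for a ray against an axis-aligned box such as the square cylinder. This is one possible implementation, not the disclosure's own:

```python
def ray_hits_box(origin, direction, box_min, box_max):
    """Slab test: does a ray intersect an axis-aligned box?
    Returns True when some t >= 0 puts origin + t*direction inside the box."""
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:
            if o < lo or o > hi:          # parallel to this slab and outside it
                return False
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            if t1 > t2:
                t1, t2 = t2, t1
            t_near, t_far = max(t_near, t1), min(t_far, t2)
            if t_near > t_far:
                return False
    return True

# A box straddling the pointing axis is addressed; one off to the side is not.
print(ray_hits_box((0, 0, 0), (1, 0, 0), (5, -1, -1), (7, 1, 1)))    # True
print(ray_hits_box((0, 0, 0), (1, 0, 0), (5, 3, -1), (7, 5, 1)))     # False
```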
- FIG 4 illustrates a scheme whereby the vector defined by the reference point 41 and the reference direction 42 is coincident with the square cylinder 43 at a single point 44.
- the circular cylinder 45 is not intersected by the vector and is not said to be coincident therewith. It is not a requirement that an object be three dimensional; quite the contrary, a two dimensional or one dimensional object forms a perfect basis for an intersection with an address indicator in the form of a vector.
- Figure 5 illustrates a point reference 51 and a direction reference 52 forming a vector which intersects a plane 53 at a single point 54.
- Figure 6 shows a reference point 61 and reference direction 62 which define an address indicator in the form of a vector having an intersection with a spatial element 63 at line segment 64.
- a geometric descriptor used in devices of the invention to associate object data with position and shape may change in time. Although the trains in Japan are moving objects, they move in a highly reliable way in accordance with a rigid schedule. Therefore, a geometric descriptor might include information about changes of position with respect to time. When a device of the invention is pointed at a moving train, inquiry to the database may yield an intersection with a 'moving' spatial element, i.e. an object having a position descriptor which is dynamic with respect to time.
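- Such a dynamic descriptor might expose its geometry as a function of time. The sketch below is illustrative only; the constant-velocity schedule is invented:

```python
from dataclasses import dataclass

@dataclass
class MovingSphereDescriptor:
    """A sphere whose center moves at constant velocity from a scheduled
    start point -- a crude stand-in for a train on a known timetable."""
    start: tuple      # position at t = 0 (e.g. scheduled departure)
    velocity: tuple   # units per second along the track
    radius: float

    def center_at(self, t):
        return tuple(s + v * t for s, v in zip(self.start, self.velocity))

    def contains(self, point, t):
        c = self.center_at(t)
        d2 = sum((p - ci) ** 2 for p, ci in zip(point, c))
        return d2 <= self.radius ** 2

train = MovingSphereDescriptor(start=(0.0, 0.0, 0.0),
                               velocity=(30.0, 0.0, 0.0),   # 30 m/s due east
                               radius=50.0)
# Ten seconds after departure the train occupies space near x = 300 m.
print(train.contains((300.0, 0.0, 0.0), t=10.0))   # True
print(train.contains((300.0, 0.0, 0.0), t=0.0))    # False
```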
- Figure 7 shows an additional construction of interest. Although the term 'vector' implies a line segment with infinite extent in one direction, in some cases only a certain portion of the vector is of interest. Some operations described hereafter will refer to a "range gate".
- a range gate has two delimiters which define a portion of the vector which is of particular importance.
- Figure 7 shows a reference point 71, a reference direction 72, a first delimiter 73, a second delimiter 74, a cube body 75, a line segment 76, a circular cylinder 77, and a vector 78. Although the vector 78 intersects and passes through both the cube body and the circular cylinder, only the portion of the vector within the range gate, i.e. line segment 76, is of particular interest.
- a range gate is created to designate which portions of the vector are of greatest interest.
- a user interface may present information regarding both the cylinder and the cube, but information relating to the cube is presented with priority.
- Figure 8 shows another important range gate.
- a range gate may include all the points along a vector from the reference point to a selected maximum distance. For example a user may specify all targets "within fifty meters of the device". Objects which are greater than fifty meters away from the user are not included in any recall effort.
- Figure 8 illustrates this concept.
- a reference point 81 and line segment 82 form the basis for a system having a range gate starting at the reference point and ending 83 at some predetermined distance from the reference point.
- a cubic object 84 has a portion 85 of the vector passing through it.
- circular cylindrical object 86 also has a portion of the vector intersecting that object. Of course, the vector 87 continues on without limit. The difference between the cubic object and the circular cylindrical object is that the cubic object lies within the range gate region of the address indicator and the circular cylindrical object does not.
- a computer search engine arranged to be responsive to criteria describing such a range gate is useful in restricting objects which are presented.
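- A range gate of either kind, two delimiters or zero to a maximum distance, reduces to a test on the ray parameter t. The helper below is an illustrative sketch assuming a unit-length direction vector:

```python
def in_range_gate(origin, direction, point, t_min, t_max):
    """Is `point`'s projection onto the ray within the gate [t_min, t_max]?
    `direction` must be a unit vector so t is a distance in the same units."""
    t = sum((p - o) * d for p, o, d in zip(point, origin, direction))
    return t_min <= t <= t_max

origin, direction = (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)
# "Within fifty meters of the device": gate from 0 to 50.
print(in_range_gate(origin, direction, (30.0, 0.0, 0.0), 0.0, 50.0))   # True
print(in_range_gate(origin, direction, (80.0, 0.0, 0.0), 0.0, 50.0))   # False
```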
- Figure 9 illustrates a reference point 91 and a direction vector 92 which passes through 93 a first object, continues through space 94 and passes through a second object 95.
- both objects, a cubic object 96 and a circular cylindrical object 97, form an intersection with the vector and lie within a range which lies on the address indicator somewhere past the point indicated as 98.
- a search engine therefore identifies both objects as being addressed by the system.
- a display can handle this occurrence by listing all objects being addressed simultaneously.
- a list may include a scheme whereby closer objects are listed first while more distant objects appear nearer the end of the list.
- a user may select from the list an object of particular interest and request from the computer more information relating to that object.
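- Listing addressed objects nearest-first is a simple sort on distance from the point reference; the object names below are invented for illustration:

```python
import math

def nearest_first(position, hits):
    """Order addressed objects by distance from the device point reference.
    `hits` is a list of (name, location) pairs."""
    def dist(item):
        _, loc = item
        return math.dist(position, loc)
    return sorted(hits, key=dist)

hits = [("distant tower", (0.0, 120.0, 0.0)),
        ("nearby cafe", (0.0, 15.0, 0.0)),
        ("mid-block hotel", (0.0, 60.0, 0.0))]
ordered = nearest_first((0.0, 0.0, 0.0), hits)
print([name for name, _ in ordered])
# ['nearby cafe', 'mid-block hotel', 'distant tower']
```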
- An address indicator may be any geometric construct including but not limited to: a point; a line; a vector; a line segment; a plane; a planar section; a cone; a conic section; et cetera.
- the search criteria may simply determine if any point of an address indicator is shared with any point described in an information element's geometric descriptor.
- Figure 10 shows an address indicator which is in the shape of a cone.
- Reference point 100 is joined by a surface 101 which describes a cone having an axis 102.
- the conic axis may be arranged to be colinear with the system reference pointing direction.
- ellipse 103 is useful to indicate a cross section of the cone.
- the careful observer might argue that the "cone" shown is not truly a cone in the sense that it is wider in one dimension than in an orthogonal dimension. This loose interpretation of a cone is intended to illustrate that the geometric shape of an address vector may be of complex form. In some systems of the invention, it is useful to have an address vector which is adjustable.
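- A conic address indicator reduces to an angle test: a point lies inside the cone when the angle between the cone axis and the apex-to-point vector is at most the half-angle. The sketch below assumes a circular cone, though as noted above the shape may be more complex:

```python
import math

def in_cone(apex, axis, half_angle_deg, point):
    """True when `point` lies inside a circular cone with the given apex,
    unit axis direction, and half-angle in degrees."""
    v = [p - a for p, a in zip(point, apex)]
    length = math.sqrt(sum(c * c for c in v))
    if length == 0:
        return True                      # the apex itself
    cos_angle = sum(vi * ai for vi, ai in zip(v, axis)) / length
    return cos_angle >= math.cos(math.radians(half_angle_deg))

apex, axis = (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)
print(in_cone(apex, axis, 10.0, (100.0, 5.0, 0.0)))    # True: ~2.9 deg off axis
print(in_cone(apex, axis, 10.0, (100.0, 40.0, 0.0)))   # False: ~21.8 deg off axis
```

Widening the half-angle models the adjustable indicator described above: a broader cone addresses more objects at once.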
- Figure 11 shows a conic shape similar to that of Figure 10 whereby the extent of the limiting surface has been increased.
- Reference point 110 forms the apex of a cone having a surface 111 which is quite portly in comparison to the cone of Figure 10.
- the conic axis 112 is associated with the system pointing direction.
- Devices of the invention may include an adjustment setting which can be set by a user to alter the shape and size of an address indicator.
- Such adjustment may be used to configure the address indicator to take a shape having a width which is greater in extent than its height.
- Figure 12 shows a reference point 120 and address indicator surface 121 symmetric about pointing reference 122.
- Figure 13 shows how an address vector may be said to be intersecting an object.
- Reference point 130 is the apex of a conic address indicator having a surface 131 and a reference pointing direction 132 and cross section 133.
- Cylindrical object 134 contains spatial extent 135 which is shared with the address indicator.
- a device of the invention having a conic address indicator as shown is addressing the object.
- it is not required that the reference pointing direction itself intersect the object; rather, any portion of the address indicator is sufficient to form an intersection.
- the database search can be made to be responsive to this condition.
- Range gates cooperate with address indicators having spatial extent.
- Figure 14 shows a reference point 140, conic surface 141, pointing reference 142 and cross section 143.
- a conic section having cross sections 144 and 145 form a range gate which may be used to limit database searches.
- Figure 15 shows an example where a system reference pointing direction is not colinear with the axis of an address indicator.
- Reference point 150 is coupled with conic surface 151, axis 152, and cross section 153 to form a conic shaped address indicator.
- a system reference direction 154 on the surface of the cone may be used for system attitude measurements.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- User Interface Of Digital Computer (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU29714/99A AU2971499A (en) | 1998-02-18 | 1999-02-17 | Apparatus and methods for presentation of information relating to objects being addressed |
CA002321448A CA2321448A1 (en) | 1998-02-18 | 1999-02-17 | Apparatus and methods for presentation of information relating to objects being addressed |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US7504798P | 1998-02-18 | 1998-02-18 | |
US60/075,047 | 1998-02-18 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO1999042946A2 true WO1999042946A2 (en) | 1999-08-26 |
WO1999042946A3 WO1999042946A3 (en) | 1999-10-28 |
Family
ID=22123200
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US1999/003688 WO1999042947A2 (en) | 1998-02-18 | 1999-02-17 | Apparatus and methods for presentation of information relating to objects being addressed |
PCT/US1999/003531 WO1999042946A2 (en) | 1998-02-18 | 1999-02-17 | Apparatus and methods for presentation of information relating to objects being addressed |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US1999/003688 WO1999042947A2 (en) | 1998-02-18 | 1999-02-17 | Apparatus and methods for presentation of information relating to objects being addressed |
Country Status (3)
Country | Link |
---|---|
AU (2) | AU2971499A (en) |
CA (1) | CA2321448A1 (en) |
WO (2) | WO1999042947A2 (en) |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6470264B2 (en) | 1997-06-03 | 2002-10-22 | Stephen Bide | Portable information-providing apparatus |
WO2003047285A1 (en) * | 2001-11-20 | 2003-06-05 | Siemens Aktiengesellschaft | Method and device for displaying data |
US6978295B2 (en) | 2001-01-31 | 2005-12-20 | Fujitsu Limited | Server apparatus for space information service, space information service providing method, and charge processing apparatus and charging method for space information service |
US7096233B2 (en) | 2001-01-31 | 2006-08-22 | Fujitsu Limited | Server, user terminal, information providing service system and information providing service method for providing information in conjunction with a geographical mapping application |
US7290000B2 (en) | 2000-10-18 | 2007-10-30 | Fujitsu Limited | Server, user terminal, information providing service system, and information providing service method |
US7427980B1 (en) | 2008-03-31 | 2008-09-23 | International Business Machines Corporation | Game controller spatial detection |
EP2202616A2 (en) | 2000-05-16 | 2010-06-30 | Nokia Corporation | A method and apparatus to browse and access downloaded contextual information |
US8218873B2 (en) | 2000-11-06 | 2012-07-10 | Nant Holdings Ip, Llc | Object information derived from object images |
US8224078B2 (en) | 2000-11-06 | 2012-07-17 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8224077B2 (en) | 2000-11-06 | 2012-07-17 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US8588527B2 (en) | 2000-11-06 | 2013-11-19 | Nant Holdings Ip, Llc | Object information derived from object images |
US8810598B2 (en) | 2011-04-08 | 2014-08-19 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US9310892B2 (en) | 2000-11-06 | 2016-04-12 | Nant Holdings Ip, Llc | Object information derived from object images |
US9526658B2 (en) | 2010-02-24 | 2016-12-27 | Nant Holdings Ip, Llc | Augmented reality panorama supporting visually impaired individuals |
US9679414B2 (en) | 2013-03-01 | 2017-06-13 | Apple Inc. | Federated mobile device positioning |
US9928652B2 (en) | 2013-03-01 | 2018-03-27 | Apple Inc. | Registration between actual mobile device position and environmental model |
US10140317B2 (en) | 2013-10-17 | 2018-11-27 | Nant Holdings Ip, Llc | Wide area augmented reality location-based services |
US10617568B2 (en) | 2000-11-06 | 2020-04-14 | Nant Holdings Ip, Llc | Image capture and identification system and process |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090315766A1 (en) | 2008-06-19 | 2009-12-24 | Microsoft Corporation | Source switching for devices supporting dynamic direction information |
US8467991B2 (en) | 2008-06-20 | 2013-06-18 | Microsoft Corporation | Data services based on gesture and location information of device |
US20100228612A1 (en) * | 2009-03-09 | 2010-09-09 | Microsoft Corporation | Device transaction model and services based on directional information of device |
US8872767B2 (en) | 2009-07-07 | 2014-10-28 | Microsoft Corporation | System and method for converting gestures into digital graffiti |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5454043A (en) * | 1993-07-30 | 1995-09-26 | Mitsubishi Electric Research Laboratories, Inc. | Dynamic and static hand gesture recognition through low-level image analysis |
US5696837A (en) * | 1994-05-05 | 1997-12-09 | Sri International | Method and apparatus for transforming coordinate systems in a telemanipulation system |
US5796386A (en) * | 1995-01-23 | 1998-08-18 | International Business Machines Corporation | Precise calibration procedure for sensor-based view point control system |
US5801704A (en) * | 1994-08-22 | 1998-09-01 | Hitachi, Ltd. | Three-dimensional input device with displayed legend and shape-changing cursor |
US5929848A (en) * | 1994-11-02 | 1999-07-27 | Visible Interactive Corporation | Interactive personal interpretive device and system for retrieving information about a plurality of objects |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5528518A (en) * | 1994-10-25 | 1996-06-18 | Laser Technology, Inc. | System and method for collecting data used to form a geographic information system database |
JPH09114851A (en) * | 1995-10-20 | 1997-05-02 | Fuji Xerox Co Ltd | Information managing device |
JP3264614B2 (en) * | 1996-01-30 | 2002-03-11 | 富士写真光機株式会社 | Observation device |
US5902347A (en) * | 1996-11-19 | 1999-05-11 | American Navigation Systems, Inc. | Hand-held GPS-mapping device |
1999
- 1999-02-17 CA CA002321448A patent/CA2321448A1/en not_active Abandoned
- 1999-02-17 AU AU29714/99A patent/AU2971499A/en not_active Abandoned
- 1999-02-17 AU AU28710/99A patent/AU2871099A/en not_active Abandoned
- 1999-02-17 WO PCT/US1999/003688 patent/WO1999042947A2/en active Application Filing
- 1999-02-17 WO PCT/US1999/003531 patent/WO1999042946A2/en active Application Filing
Cited By (142)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6470264B2 (en) | 1997-06-03 | 2002-10-22 | Stephen Bide | Portable information-providing apparatus |
EP2202616A2 (en) | 2000-05-16 | 2010-06-30 | Nokia Corporation | A method and apparatus to browse and access downloaded contextual information |
EP3147759A1 (en) | 2000-05-16 | 2017-03-29 | Nokia Technologies Oy | A method and apparatus to browse and access downloaded contextual information |
EP2202617A2 (en) | 2000-05-16 | 2010-06-30 | Nokia Corporation | A method and apparatus to browse and access downloaded contextual information |
US7290000B2 (en) | 2000-10-18 | 2007-10-30 | Fujitsu Limited | Server, user terminal, information providing service system, and information providing service method |
US9087240B2 (en) | 2000-11-06 | 2015-07-21 | Nant Holdings Ip, Llc | Object information derived from object images |
US8478037B2 (en) | 2000-11-06 | 2013-07-02 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9116920B2 (en) | 2000-11-06 | 2015-08-25 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8218873B2 (en) | 2000-11-06 | 2012-07-10 | Nant Holdings Ip, Llc | Object information derived from object images |
US8218874B2 (en) | 2000-11-06 | 2012-07-10 | Nant Holdings Ip, Llc | Object information derived from object images |
US8224078B2 (en) | 2000-11-06 | 2012-07-17 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8224077B2 (en) | 2000-11-06 | 2012-07-17 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US8224079B2 (en) | 2000-11-06 | 2012-07-17 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8326031B2 (en) | 2000-11-06 | 2012-12-04 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9135355B2 (en) | 2000-11-06 | 2015-09-15 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8437544B2 (en) | 2000-11-06 | 2013-05-07 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8457395B2 (en) | 2000-11-06 | 2013-06-04 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8463030B2 (en) | 2000-11-06 | 2013-06-11 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8463031B2 (en) | 2000-11-06 | 2013-06-11 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8467602B2 (en) | 2000-11-06 | 2013-06-18 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8467600B2 (en) | 2000-11-06 | 2013-06-18 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8478036B2 (en) | 2000-11-06 | 2013-07-02 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9141714B2 (en) | 2000-11-06 | 2015-09-22 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8478047B2 (en) | 2000-11-06 | 2013-07-02 | Nant Holdings Ip, Llc | Object information derived from object images |
US8488880B2 (en) | 2000-11-06 | 2013-07-16 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8494271B2 (en) | 2000-11-06 | 2013-07-23 | Nant Holdings Ip, Llc | Object information derived from object images |
US8494264B2 (en) | 2000-11-06 | 2013-07-23 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US8498484B2 (en) | 2000-11-06 | 2013-07-30 | Nant Holdings Ip, Llc | Object information derived from object images |
US8520942B2 (en) | 2000-11-06 | 2013-08-27 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8548245B2 (en) | 2000-11-06 | 2013-10-01 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8548278B2 (en) | 2000-11-06 | 2013-10-01 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8582817B2 (en) | 2000-11-06 | 2013-11-12 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US8588527B2 (en) | 2000-11-06 | 2013-11-19 | Nant Holdings Ip, Llc | Object information derived from object images |
US8712193B2 (en) | 2000-11-06 | 2014-04-29 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8718410B2 (en) | 2000-11-06 | 2014-05-06 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8774463B2 (en) | 2000-11-06 | 2014-07-08 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8792750B2 (en) | 2000-11-06 | 2014-07-29 | Nant Holdings Ip, Llc | Object information derived from object images |
US8798368B2 (en) | 2000-11-06 | 2014-08-05 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8798322B2 (en) | 2000-11-06 | 2014-08-05 | Nant Holdings Ip, Llc | Object information derived from object images |
US10772765B2 (en) | 2000-11-06 | 2020-09-15 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8824738B2 (en) | 2000-11-06 | 2014-09-02 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US8837868B2 (en) | 2000-11-06 | 2014-09-16 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8842941B2 (en) | 2000-11-06 | 2014-09-23 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8849069B2 (en) | 2000-11-06 | 2014-09-30 | Nant Holdings Ip, Llc | Object information derived from object images |
US8855423B2 (en) | 2000-11-06 | 2014-10-07 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8861859B2 (en) | 2000-11-06 | 2014-10-14 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8867839B2 (en) | 2000-11-06 | 2014-10-21 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8873891B2 (en) | 2000-11-06 | 2014-10-28 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8885982B2 (en) | 2000-11-06 | 2014-11-11 | Nant Holdings Ip, Llc | Object information derived from object images |
US8885983B2 (en) | 2000-11-06 | 2014-11-11 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8923563B2 (en) | 2000-11-06 | 2014-12-30 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8938096B2 (en) | 2000-11-06 | 2015-01-20 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8948460B2 (en) | 2000-11-06 | 2015-02-03 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8948459B2 (en) | 2000-11-06 | 2015-02-03 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US8948544B2 (en) | 2000-11-06 | 2015-02-03 | Nant Holdings Ip, Llc | Object information derived from object images |
US9014516B2 (en) | 2000-11-06 | 2015-04-21 | Nant Holdings Ip, Llc | Object information derived from object images |
US9014513B2 (en) | 2000-11-06 | 2015-04-21 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9014515B2 (en) | 2000-11-06 | 2015-04-21 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9014514B2 (en) | 2000-11-06 | 2015-04-21 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9014512B2 (en) | 2000-11-06 | 2015-04-21 | Nant Holdings Ip, Llc | Object information derived from object images |
US9020305B2 (en) | 2000-11-06 | 2015-04-28 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9025814B2 (en) | 2000-11-06 | 2015-05-05 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9025813B2 (en) | 2000-11-06 | 2015-05-05 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9031290B2 (en) | 2000-11-06 | 2015-05-12 | Nant Holdings Ip, Llc | Object information derived from object images |
US9031278B2 (en) | 2000-11-06 | 2015-05-12 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9036949B2 (en) | 2000-11-06 | 2015-05-19 | Nant Holdings Ip, Llc | Object information derived from object images |
US9036862B2 (en) | 2000-11-06 | 2015-05-19 | Nant Holdings Ip, Llc | Object information derived from object images |
US9036948B2 (en) | 2000-11-06 | 2015-05-19 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9036947B2 (en) | 2000-11-06 | 2015-05-19 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9046930B2 (en) | 2000-11-06 | 2015-06-02 | Nant Holdings Ip, Llc | Object information derived from object images |
US10639199B2 (en) | 2000-11-06 | 2020-05-05 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9104916B2 (en) | 2000-11-06 | 2015-08-11 | Nant Holdings Ip, Llc | Object information derived from object images |
US9110925B2 (en) | 2000-11-06 | 2015-08-18 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US10635714B2 (en) | 2000-11-06 | 2020-04-28 | Nant Holdings Ip, Llc | Object information derived from object images |
US8335351B2 (en) | 2000-11-06 | 2012-12-18 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US10617568B2 (en) | 2000-11-06 | 2020-04-14 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9148562B2 (en) | 2000-11-06 | 2015-09-29 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9154694B2 (en) | 2000-11-06 | 2015-10-06 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9154695B2 (en) | 2000-11-06 | 2015-10-06 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9152864B2 (en) | 2000-11-06 | 2015-10-06 | Nant Holdings Ip, Llc | Object information derived from object images |
US9170654B2 (en) | 2000-11-06 | 2015-10-27 | Nant Holdings Ip, Llc | Object information derived from object images |
US9182828B2 (en) | 2000-11-06 | 2015-11-10 | Nant Holdings Ip, Llc | Object information derived from object images |
US9235600B2 (en) | 2000-11-06 | 2016-01-12 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9244943B2 (en) | 2000-11-06 | 2016-01-26 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9262440B2 (en) | 2000-11-06 | 2016-02-16 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9288271B2 (en) | 2000-11-06 | 2016-03-15 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US9311554B2 (en) | 2000-11-06 | 2016-04-12 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9311552B2 (en) | 2000-11-06 | 2016-04-12 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9310892B2 (en) | 2000-11-06 | 2016-04-12 | Nant Holdings Ip, Llc | Object information derived from object images |
US9311553B2 (en) | 2000-11-06 | 2016-04-12 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9317769B2 (en) | 2000-11-06 | 2016-04-19 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9324004B2 (en) | 2000-11-06 | 2016-04-26 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9330326B2 (en) | 2000-11-06 | 2016-05-03 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9330328B2 (en) | 2000-11-06 | 2016-05-03 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9330327B2 (en) | 2000-11-06 | 2016-05-03 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9336453B2 (en) | 2000-11-06 | 2016-05-10 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9342748B2 (en) | 2000-11-06 | 2016-05-17 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9360945B2 (en) | 2000-11-06 | 2016-06-07 | Nant Holdings Ip Llc | Object information derived from object images |
US10509820B2 (en) | 2000-11-06 | 2019-12-17 | Nant Holdings Ip, Llc | Object information derived from object images |
US10509821B2 (en) | 2000-11-06 | 2019-12-17 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US9536168B2 (en) | 2000-11-06 | 2017-01-03 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9578107B2 (en) | 2000-11-06 | 2017-02-21 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US10500097B2 (en) | 2000-11-06 | 2019-12-10 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9613284B2 (en) | 2000-11-06 | 2017-04-04 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US10095712B2 (en) | 2000-11-06 | 2018-10-09 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US9785859B2 (en) | 2000-11-06 | 2017-10-10 | Nant Holdings Ip Llc | Image capture and identification system and process |
US9785651B2 (en) | 2000-11-06 | 2017-10-10 | Nant Holdings Ip, Llc | Object information derived from object images |
US9805063B2 (en) | 2000-11-06 | 2017-10-31 | Nant Holdings Ip Llc | Object information derived from object images |
US9808376B2 (en) | 2000-11-06 | 2017-11-07 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9824099B2 (en) | 2000-11-06 | 2017-11-21 | Nant Holdings Ip, Llc | Data capture and identification system and process |
US10089329B2 (en) | 2000-11-06 | 2018-10-02 | Nant Holdings Ip, Llc | Object information derived from object images |
US9844469B2 (en) | 2000-11-06 | 2017-12-19 | Nant Holdings Ip Llc | Image capture and identification system and process |
US9844468B2 (en) | 2000-11-06 | 2017-12-19 | Nant Holdings Ip Llc | Image capture and identification system and process |
US9844467B2 (en) | 2000-11-06 | 2017-12-19 | Nant Holdings Ip Llc | Image capture and identification system and process |
US9844466B2 (en) | 2000-11-06 | 2017-12-19 | Nant Holdings Ip Llc | Image capture and identification system and process |
US10080686B2 (en) | 2000-11-06 | 2018-09-25 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US6978295B2 (en) | 2001-01-31 | 2005-12-20 | Fujitsu Limited | Server apparatus for space information service, space information service providing method, and charge processing apparatus and charging method for space information service |
US7096233B2 (en) | 2001-01-31 | 2006-08-22 | Fujitsu Limited | Server, user terminal, information providing service system and information providing service method for providing information in conjunction with a geographical mapping application |
WO2003047285A1 (en) * | 2001-11-20 | 2003-06-05 | Siemens Aktiengesellschaft | Method and device for displaying data |
US7427980B1 (en) | 2008-03-31 | 2008-09-23 | International Business Machines Corporation | Game controller spatial detection |
US11348480B2 (en) | 2010-02-24 | 2022-05-31 | Nant Holdings Ip, Llc | Augmented reality panorama systems and methods |
US10535279B2 (en) | 2010-02-24 | 2020-01-14 | Nant Holdings Ip, Llc | Augmented reality panorama supporting visually impaired individuals |
US9526658B2 (en) | 2010-02-24 | 2016-12-27 | Nant Holdings Ip, Llc | Augmented reality panorama supporting visually impaired individuals |
US10403051B2 (en) | 2011-04-08 | 2019-09-03 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US8810598B2 (en) | 2011-04-08 | 2014-08-19 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US9396589B2 (en) | 2011-04-08 | 2016-07-19 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US11107289B2 (en) | 2011-04-08 | 2021-08-31 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US9824501B2 (en) | 2011-04-08 | 2017-11-21 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US11967034B2 (en) | 2011-04-08 | 2024-04-23 | Nant Holdings Ip, Llc | Augmented reality object management system |
US11869160B2 (en) | 2011-04-08 | 2024-01-09 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US11854153B2 (en) | 2011-04-08 | 2023-12-26 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US10726632B2 (en) | 2011-04-08 | 2020-07-28 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US11514652B2 (en) | 2011-04-08 | 2022-11-29 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US10127733B2 (en) | 2011-04-08 | 2018-11-13 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US10909763B2 (en) | 2013-03-01 | 2021-02-02 | Apple Inc. | Registration between actual mobile device position and environmental model |
US9679414B2 (en) | 2013-03-01 | 2017-06-13 | Apple Inc. | Federated mobile device positioning |
US9928652B2 (en) | 2013-03-01 | 2018-03-27 | Apple Inc. | Registration between actual mobile device position and environmental model |
US11532136B2 (en) | 2013-03-01 | 2022-12-20 | Apple Inc. | Registration between actual mobile device position and environmental model |
US10217290B2 (en) | 2013-03-01 | 2019-02-26 | Apple Inc. | Registration between actual mobile device position and environmental model |
US10140317B2 (en) | 2013-10-17 | 2018-11-27 | Nant Holdings Ip, Llc | Wide area augmented reality location-based services |
US10664518B2 (en) | 2013-10-17 | 2020-05-26 | Nant Holdings Ip, Llc | Wide area augmented reality location-based services |
US11392636B2 (en) | 2013-10-17 | 2022-07-19 | Nant Holdings Ip, Llc | Augmented reality position-based service, methods, and systems |
Also Published As
Publication number | Publication date |
---|---|
WO1999042947A3 (en) | 1999-12-02 |
AU2971499A (en) | 1999-09-06 |
CA2321448A1 (en) | 1999-08-26 |
WO1999042946A3 (en) | 1999-10-28 |
WO1999042947A2 (en) | 1999-08-26 |
AU2871099A (en) | 1999-09-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6173239B1 (en) | Apparatus and methods for presentation of information relating to objects being addressed | |
US6522292B1 (en) | Information systems having position measuring capacity | |
WO1999042946A2 (en) | Apparatus and methods for presentation of information relating to objects being addressed | |
US6396475B1 (en) | Apparatus and methods of the remote address of objects | |
US9913098B2 (en) | Mobile device and geographic information system background and summary of the related art | |
US20030184594A1 (en) | Apparatus and methods for interfacing with remote addressing systems | |
US20090319177A1 (en) | Predictive services for devices supporting dynamic direction information | |
EP2498236B1 (en) | System, server, terminal apparatus, program and method for information providing | |
EP2202616B1 (en) | A method and apparatus to browse and access downloaded contextual information | |
JP2011090482A (en) | Device, method and control program for displaying electronic signboard | |
US6671738B1 (en) | System and method of associating an object with a world wide web (WWW) site | |
Carswell et al. | Mobile visibility querying for LBS | |
CA2946686A1 (en) | Location error radius determination | |
WO2009111578A2 (en) | Mobile device and geographic information system background and summary of the related art | |
KR20220066987A (en) | Systems and methods for selecting a poi to associate with a navigation maneuver | |
KR20050087844A (en) | Providing a user with location-based information | |
WO2001071282A1 (en) | Information systems having directional interference facility | |
KR101568741B1 (en) | Information System based on mobile augmented reality | |
Razak et al. | Interactive android-based indoor parking lot vehicle locator using QR-code | |
US20140344681A1 (en) | Tour Guidance Systems | |
US20150358782A1 (en) | Catch the screen | |
US20180115619A1 (en) | System and method for attaching digital documents to physical objects | |
Gardiner et al. | EgoViz–a mobile based spatial interaction system | |
Rakhmania et al. | Implementation of Location Base Service Method Using WI-FI Network For Object Recognition at Museum | |
Ghiani et al. | Supporting Mobile Users in Selecting Target Devices. |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): AU CA CH CN JP KR NZ |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
AK | Designated states |
Kind code of ref document: A3 Designated state(s): AU CA CH CN JP KR NZ |
|
AL | Designated countries for regional patents |
Kind code of ref document: A3 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
|
ENP | Entry into the national phase in: |
Ref country code: CA Ref document number: 2321448 Kind code of ref document: A Format of ref document f/p: F
|
ENP | Entry into the national phase in: |
Ref country code: JP Ref document number: 2000 532809 Kind code of ref document: A Format of ref document f/p: F |
|
NENP | Non-entry into the national phase in: |
Ref country code: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 29714/99 Country of ref document: AU |
|
122 | Ep: pct application non-entry in european phase | ||
NENP | Non-entry into the national phase in: |
Ref country code: JP |