US20110115606A1 - Touch sensitive panel in vehicle for user identification - Google Patents


Info

Publication number
US20110115606A1
Authority
US
United States
Prior art keywords
touch pad
user
pad input
identifying
finger
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/912,637
Inventor
Qiang Fu
Jeyhan Karaoguz
Tom W. Kwan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Avago Technologies International Sales Pte Ltd
Original Assignee
Broadcom Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Broadcom Corp filed Critical Broadcom Corp
Priority to US12/912,637
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KARAOGUZ, JEYHAN, KWAN, TOM W., FU, QIANG
Publication of US20110115606A1
Assigned to BANK OF AMERICA, N.A., AS COLLATERAL AGENT reassignment BANK OF AMERICA, N.A., AS COLLATERAL AGENT PATENT SECURITY AGREEMENT Assignors: BROADCOM CORPORATION
Assigned to AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. reassignment AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BROADCOM CORPORATION
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS Assignors: BANK OF AMERICA, N.A., AS COLLATERAL AGENT

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints

Definitions

  • the present invention relates generally to vehicle operations; and more particularly to electronics for controlling vehicle operations.
  • Vehicles include numerous electrical systems, including vehicle control systems, entertainment systems, navigation systems, and seating systems, among others.
  • the vehicle control systems include engine/motor control systems, drive train control systems, suspension control systems, braking systems, lighting systems, cruise control systems, fuel systems, etc.
  • Some of these systems are customizable for particular users. For example, seat positions can be memorized for particular users and enacted upon identification of the user, typically by the user pressing a corresponding numbered button.
  • Other user identification techniques include assigning a key fob to a particular user, identifying when the key fob is in use with the vehicle, and enacting settings particular to the key fob. Button depression requires user interaction. Key fobs may be exchanged between users. Both of these user identification techniques therefore have shortcomings.
  • FIG. 1A is a diagram illustrating a steering wheel having touch pads installed thereon and operative therewith according to one or more embodiments of the present invention
  • FIG. 1B is a diagram illustrating a gear shift selector having a touch pad installed thereon and operative therewith according to one or more embodiments of the present invention
  • FIG. 2 is a block diagram illustrating a vehicle control system processing module and a plurality of touch pad modules constructed and operating according to one or more embodiments of the present invention
  • FIG. 3 is a block diagram illustrating a vehicle control system processing module, a plurality of touch pad modules, and a plurality of other vehicle systems constructed and operating according to one or more embodiments of the present invention
  • FIG. 4 is a block diagram illustrating a touch pad and touch pad circuitry constructed according to one or more embodiments of the present invention
  • FIG. 5A is a diagram illustrating how a user's hand may overlay a touch pad according to one or more embodiments of the present invention
  • FIG. 5B is a diagram illustrating the manner in which a user's hand upon the touch pad produces a particular pattern of capacitance upon the touch sensitive elements of the touch pad;
  • FIG. 6 is a flowchart illustrating operations of a vehicle control system to identify a user using touch pad input and to alter vehicle settings according to one or more embodiments of the present invention
  • FIG. 7 is a flowchart illustrating alternative operations of a vehicle control system to identify a user using touch pad input and to alter vehicle settings according to one or more embodiments of the present invention
  • FIG. 8 is a flowchart illustrating processing touch pad input to determine user finger/hand characteristics according to one or more embodiments of the present invention.
  • FIG. 9A is a flowchart illustrating processing touch pad input to determine heat transfer characteristics of a user's fingers and using the heat transfer characteristics to identify a user according to one or more embodiments of the present invention
  • FIG. 9B is a flowchart illustrating processing touch pad input to determine pulse rate characteristics of a user's fingers and using the pulse rate characteristics to identify a user according to one or more embodiments of the present invention
  • FIG. 10A is a flowchart illustrating the use of vehicle location data to assist in identifying a user according to one or more embodiments of the present invention
  • FIG. 10B is a flowchart illustrating the use of voice data to assist in identifying a user according to one or more embodiments of the present invention
  • FIG. 11 is a flowchart illustrating multiple modes of user identification operations of a vehicle control system according to one or more embodiments of the present invention.
  • FIG. 12 is a flowchart illustrating the operation of a vehicle control system in deleting non-matched users after expiration of a user identification period according to one or more embodiments of the present invention.
  • FIG. 13 is a flowchart illustrating the use of user preference data to assist in identifying a user by a vehicle control system according to one or more embodiments of the present invention.
  • FIG. 1A is a diagram illustrating a steering wheel having touch pads installed thereon and operative therewith according to one or more embodiments of the present invention.
  • FIG. 1B is a diagram illustrating a gear shift selector (gear shifter) having a touch pad installed thereon and operative therewith according to one or more embodiments of the present invention.
  • touch pads may be used in an automobile, a boat, an aircraft, or any other vehicle in order to identify a user of the vehicle. These touch pads may be used for user identification and/or to receive user input. While only a steering wheel and gear shift selector are shown in FIGS. 1A and 1B , touch pads could be located on other parts of a vehicle such as a door handle, arm rest, throttle control, dash panel, or other suitable location that a user could touch for the purposes and operations described herein.
  • the steering wheel includes a steering wheel hub 104 and a steering wheel ring 102 of a conventional design.
  • the steering wheel ring 102 includes four touch pads 106 , 108 , 110 , and 112 .
  • Each of these touch pads 106, 108, 110, and 112 includes a plurality of touch sensitive elements that are able to detect the presence of a user's hands/fingers. For example, in the ten o'clock/two o'clock driving position, a user's hands would be detected by touch pads 106 and 108. Likewise, in the four o'clock/eight o'clock driving position, a user's hands would be detected at touch pads 110 and 112.
  • a user's hand may be detected at any of the touch pads 106 , 108 , 110 , or 112 .
  • operation according to the present invention may detect the position of the user's hands, the distance between the user's fingers, the position of the user's knuckles, the pressure applied to the steering wheel, relative heat transfer characteristics of the user's hands, and other unique identifiers that could be employed to distinguish the user from a number of users.
  • Detection and identification of a user via his or her touch of the steering wheel may be employed to establish particular vehicle settings based upon the user identification. These settings may include entertainment system settings, environmental system settings such as temperature, suspension system settings, seat settings, mirror settings, and other settings that may be pre-programmed for a user and that are initiated upon identification of the user.
  • each of the touch pads 106, 108, 110, and 112 may also be used as input to control operations of the vehicle.
  • each of these touch pads could be used to adjust the audio control of an audio device of the vehicle.
  • the touch pad input received via touch pad 106 , 108 , 110 , and/or 112 may be employed to adjust the volume, tuning, track selection, or other settings of an audio system.
  • these touch pads may be used to adjust the environmental system settings within the vehicle.
  • these touch pads may serve as input devices to establish telephone calls via a coupled cellular telephone within the vehicle or via an Internet connection supported by the vehicle.
  • these touch pads 106 , 108 , 110 , and 112 of the steering wheel may be employed for any other types of input as well that may be used for an automobile or other vehicle.
  • a gear shift controller 152 includes one or more touch pads 154 .
  • the touch pad 154 may be used to identify a user of a vehicle in which the gear selector 152 is installed.
  • the gear shift 152 may be installed within an automobile, a boat, an aircraft, or another vehicle.
  • the touch pad 154 may further be enabled to initiate input to the vehicle in which the gear shift 152 resides.
  • the touch pad 154 may be used to adjust not only the particular gear that the automobile is in but the suspension mode that the vehicle is in as well.
  • the input provided via touch pads 106, 108, 110, 112, and/or 154 of the devices of FIGS. 1A and 1B may also be used to adjust the suspension characteristics of the vehicle, the cruise control characteristics of the vehicle, mirror settings of the vehicle, lighting settings of the vehicle, and/or other vehicle settings.
  • a sequence of touches provided via the various touch pads 106 , 108 , 110 , 112 , and/or 154 may be employed to initiate various operations of a computer of a particular vehicle.
  • the touch input may be used to alter certain settings of a vehicle computer by tapping particular touch pads once or twice and then tapping other pads once or twice in order to initiate a control sequence.
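As a hypothetical sketch of how such a tap-sequence control scheme might work, the following groups consecutive taps per pad and looks up a mapped command. The pad names and the command mapping are illustrative assumptions, not taken from the patent.

```python
# Hypothetical tap-sequence decoder; pad names and commands are assumed.
TAP_COMMANDS = {
    (("left", 2), ("right", 1)): "volume_up",
    (("left", 1), ("right", 2)): "volume_down",
    (("left", 1), ("right", 1)): "next_track",
}

def decode_tap_sequence(taps):
    """Collapse a raw tap stream such as ['left', 'left', 'right'] into
    ((pad, count), ...) groups and look up the mapped command."""
    groups = []
    for pad in taps:
        if groups and groups[-1][0] == pad:
            groups[-1][1] += 1  # another tap on the same pad
        else:
            groups.append([pad, 1])
    key = tuple((pad, count) for pad, count in groups)
    return TAP_COMMANDS.get(key)  # None if the sequence is unmapped
```

Unrecognized sequences return None, so stray touches simply do nothing rather than triggering a control.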
  • Touch pad input may be capacitance, inductance, RF propagation characteristics, or a combination of these as measured at the touch sensitive elements of the touch pads.
  • the touch pads may be employed to identify users based upon relative characteristics of the users' fingers or hands as they grasp or rest upon portions of a vehicle.
  • the touch pads 106 , 108 , 110 , 112 , and 154 may capture finger print patterns. The information regarding the user that is received via touch pads 106 , 108 , 110 , 112 , and 154 may be relayed to a vehicle control system processing module, as will be described further herein.
  • FIG. 2 is a block diagram illustrating a vehicle control system constructed and operating according to one or more embodiments of the present invention.
  • the vehicle control system of FIG. 2 includes a vehicle control system processing module 202 that couples to a plurality of touch pad modules 214 A, 214 B, 214 C, and 214 D via one or more communication links 220 .
  • the vehicle control system processing module 202 includes a wireless interface 204 , processing circuitry 206 , one or more wired interfaces 210 , and memory 208 .
  • the vehicle control system processing module 202 typically also includes a user interface 212 and may include other components that are not shown, such as at least one video interface, at least one audio interface, and a video camera/video camera interface.
  • the wireless interfaces 204 support wireless communications with various intra-vehicle components and various extra-vehicle components.
  • the wireless interfaces 204 may support communications via cellular networks, Wireless Wide Area Network (WWAN) networks, Wireless Local Area Networks (WLANs), Wireless Personal Area Networks (WPANs), satellite communication networks, millimeter wave networks, etc. and may support proprietary communication formats.
  • the processing circuitry 206 may include one or more of a system processor, a digital signal processor, a processing module, dedicated hardware, an application specific integrated circuit, or other circuitry that is capable of executing software instructions and processing data.
  • the memory 208 may be RAM, ROM, FLASH RAM, FLASH ROM, optical memory, magnetic memory, or another type of memory that is capable of storing data and/or instructions and allowing the processing circuitry to access the same.
  • the wired interfaces 210 may include a USB interface, a FireWire interface, a serial interface, a parallel interface, an optical interface, or another type of interface supported by a medium that is copper, metal, or optical.
  • the user interface 212 may include a keypad, video display, cursor control, touch pad, or other type of interface that allows a user to interact with the vehicle control system processing module 202.
  • Each of the touch pad modules 214 A, 214 B, 214 C, and 214 D includes a respective touch pad interface (touch pad communication interface) 216 A, 216 B, 216 C, and 216 D.
  • These touch pad interfaces 216 A, 216 B, 216 C, and 216 D support communications with the vehicle control system processing module 202 via the communication links 220 .
  • the communication links 220 may be wired, wireless, or a combination of wired and wireless links.
  • Each touch pad module 214 A, 214 B, 214 C, and 214 D further includes respective touch pad circuitry 218 A, 218 B, 218 C, and 218 D, which is processing circuitry that interfaces with respective touch pads 220 A, 220 B, 220 C, and 220 D of the touch pad modules.
  • the touch pad circuitry 218 A, 218 B, 218 C, and 218 D is capable of processing touch pad input received from the touch pads 220 A, 220 B, 220 C, and 220 D, as will be further described with reference to FIG. 4 .
  • the touch pad circuitry 218 A, 218 B, 218 C, and 218 D is processing circuitry capable of executing software instructions to perform desired functions.
  • FIG. 3 is a block diagram illustrating a vehicle control system and a plurality of other vehicle systems constructed and operating according to one or more embodiments of the present invention.
  • the system of FIG. 3 includes a plurality of touch pad modules 214 A, 214 B, 214 C, and 214 D.
  • the vehicle systems include an entertainment system 302 , a navigation system 304 , a suspension system 306 , a seating system 308 , a mirror control system 310 , a steering wheel system 312 , a climate control system 314 , a suspension system 316 , an engine control system 318 , a lighting system 320 , and a communication system 322 . All of these components are communicatively coupled via one or more communication links 220, which are wired and/or wireless.
  • the communication system 322 supports extra-vehicular communications for the vehicle, which may be cellular communications, satellite communications, etc.
  • a user of the vehicle is identified based upon touch pad input received at one or more touch pads of one or more of the touch pad modules 214 A, 214 B, 214 C, and 214 D.
  • the touch pad input is processed by processing circuitry of the touch pad modules 214 A, 214 B, 214 C, and/or 214 D and/or the vehicle control system processing module 202 to identify a user of the vehicle.
  • the operation of one or more of the other systems 302 - 320 is modified. For example, identification of the user may be accomplished via touch pads of a steering wheel and/or in combination with touch pads of a gear shifter.
  • suspension settings, seat settings, mirror settings, climate control settings, steering wheel settings, and entertainment settings are altered to correspond to those of the identified user.
  • the reader will appreciate that any settings of any of the systems 302 - 320 of FIG. 3 may be modified based upon those corresponding to a particular user identity.
  • FIG. 4 is a block diagram illustrating a touch pad and touch pad circuitry constructed according to one or more embodiments of the present invention.
  • a touch pad 402 includes a plurality of touch sensitive elements 404 each of which corresponds to a particular location of the touch pad 402 .
  • the touch pad 402 includes an array of touch sensitive elements 404 , each of which may be a particular capacitively coupled location, inductively coupled location, or a radio frequency (RF) touch sensitive element.
  • Touch pad circuitry 406 couples via a grid structure to the plurality of touch sensitive elements 404 to sense the particular capacitive, inductive, or RF characteristics at each of the touch sensitive elements.
  • Touch pad circuitry 406 scans the plurality of touch sensitive elements 404 via access of particular row-column combinations at particular times.
  • the frequency or voltage at which the touch pad circuitry 406 scans the plurality of touch sensitive elements 404 may be altered over time. Choosing the scanning frequency or scanning voltage may be based upon a particular operational use of the touch pad. For example, the manner in which the touch pad is scanned will change based upon a particular operation of the touch pad, e.g., a first scanning frequency/scanning voltage may be employed for user identification while a second scanning frequency/scanning voltage may be employed for receiving user input.
  • the scanning done by the touch pad circuitry 406 of the plurality of touch sensitive elements may be made using a spread spectrum scanned frequency technique. Such technique may be employed to more efficiently capture information from the touch pad 402 at the various touch sensitive elements 404 or to determine which particular scanning frequencies are more successful than others in capturing input information.
  • each row and column corresponding to a particular touch sensitive element 404 may be altered based upon a detected capacitance (inductance/RF propagation) at the location.
  • one particular touch sensitive element 404 may have a fixed capacitance that does not vary over time. Such fixed capacitance may indicate that the particular touch sensitive element 404 is inoperable or that it receives no discernable input. In such case, by not scanning the particular touch sensitive element, other touch sensitive elements may be more frequently scanned or energy may be saved by not scanning all touch sensitive elements.
  • some portions of the touch pad may be disabled while others are enabled at differing points in time. Enablement of some touch sensitive elements and not others may be based upon a custom configuration of the touch pad for a particular input function provided.
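The row/column scan with selectively disabled elements described above can be sketched as follows. The function and parameter names are assumptions for illustration; `read_element` stands in for the measurement performed by the touch pad circuitry.

```python
# Illustrative row/column scan that skips touch sensitive elements
# flagged as inoperable (fixed capacitance) or deliberately disabled.
def scan_touch_pad(read_element, rows, cols, disabled=frozenset()):
    """Read the characteristic (e.g., capacitance) at every enabled
    row/column intersection and return a {(row, col): value} map."""
    samples = {}
    for r in range(rows):
        for c in range(cols):
            if (r, c) in disabled:
                continue  # save energy/scan time on disabled elements
            samples[(r, c)] = read_element(r, c)
    return samples
```

Skipping disabled intersections frees scan cycles that can instead revisit the active elements more frequently, which matches the energy/scan-rate trade-off described above.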
  • the touch pad 402 may also be calibrated by the touch pad circuitry 406 based upon the environmental factors such as temperature, humidity, and surrounding noise as detected by measured capacitance, inductance, or RF propagation characteristics. Calibration of the touch pad 402 allows the touch pad 402 to be more efficient and more effectively receive touch pad input for user identification and/or for other input purposes.
  • the calibration of the touch pad 402 by the touch pad circuitry 406 may be initiated at particular points in time.
  • the touch pad circuitry 406 may simply initiate calibration of the touch pad 402 upon the expiration of a timer such that the touch pad is calibrated at a particular regular time interval.
  • the touch pad 402 may be calibrated after a period of inactivity, i.e., the touch pad circuitry 406 performs calibration when it determines that no input is present on the touch pad 402 .
  • the touch pad 402 may be calibrated by the touch pad circuitry 406 using other input criteria as well.
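A minimal sketch of the calibration triggers described above combines the timer criterion with the inactivity criterion. The interval and idle-time values are assumptions chosen only for illustration.

```python
# Hypothetical calibration trigger: calibrate when the periodic timer
# has expired AND the touch pad has been idle (all times in seconds).
def should_calibrate(last_cal, last_activity, now, interval=600.0, idle=30.0):
    """Return True when calibration should be initiated."""
    timer_expired = (now - last_cal) >= interval
    pad_idle = (now - last_activity) >= idle
    return timer_expired and pad_idle
```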
  • FIG. 5A is a diagram illustrating how a user's hand may overlay a touch pad according to one or more embodiments of the present invention.
  • the touch pad 402 has a plurality of touch sensitive elements 404 and is mounted upon a portion of a vehicle so that it is adjacent a user's hand when the user holds the portion of the vehicle.
  • the outline 502 of the user's hand is shown as overlaying the touch pad 402 and the plurality of touch sensitive elements 404 . While the touch pad 402 of FIG. 5A is generally illustrated as planar, the touch pad 402 may wrap around a steering wheel, gear shifter, door handle, or another vehicle component.
  • FIG. 5B is a diagram illustrating the manner in which a user's hand upon the touch pad produces a particular pattern of capacitance (inductance/RF propagation) upon the touch pad.
  • a relative capacitance, inductance, or RF propagation pattern of the user's hand 502 is shown on touch pad 402 .
  • the depiction in FIG. 5B illustrates, in general, only the relative capacitance at each of the user's finger positions upon the touch pad 402. For example, where the user's fingers physically touch the touch pad 402, stronger capacitance lines 552 are shown. Where the user's fingers merely overlay the touch pad 402, lesser capacitance, inductance, or RF propagation characteristic lines 554 are shown. While the other capacitance lines on the touch pad 402 in FIG. 5B are not numbered, corresponding capacitance lines would be present for the other fingers as well.
  • the capacitance pattern of the user's hand 502 upon the touch pad 402 is a signature of a particular user.
  • differing users can be identified.
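One way such a signature match might be sketched is a nearest-neighbor comparison between the measured pattern and each stored user's signature, with a rejection threshold for unknown users. The vector representation, threshold value, and profile layout are all assumptions, not the patent's method.

```python
import math

# Hypothetical signature matcher over capacitance-pattern vectors.
def identify_user(measurement, profiles, threshold=5.0):
    """Return the stored user whose signature vector is closest to the
    measurement (Euclidean distance), or None if no stored profile is
    within the acceptance threshold."""
    best_user, best_dist = None, float("inf")
    for user, signature in profiles.items():
        dist = math.dist(measurement, signature)
        if dist < best_dist:
            best_user, best_dist = user, dist
    return best_user if best_dist <= threshold else None
```

Returning None for out-of-threshold measurements leaves room for the enrollment path described later, where an unrecognized user can be prompted to create a profile.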
  • Because the touch pad 402 may serve as an input device, the time-varying capacitance of the touch sensitive elements 404 of the touch pad 402 may be used to indicate touch pad input.
  • the characteristics measured at each touch sensitive element 404 over time enable the device to identify a user or to receive particular input via the touch pad 402.
  • FIG. 6 is a flowchart illustrating operations of a vehicle control system processing module and a touch pad module to identify a user using touch pad input and to alter vehicle settings according to one or more embodiments of the present invention.
  • Operations 600 begin when touch pad input is received from at least one touch sensitive element of a touch pad, step 602 .
  • the touch pad input has components from a plurality of touch sensitive elements of the touch pad.
  • the touch pad input is processed by touch pad circuitry to determine user finger characteristics, step 604 .
  • the user finger characteristics are then transmitted to the vehicle control system processing module via a communications interface, step 606 .
  • the vehicle control system processing module then processes the user finger characteristics to identify a user via pattern matching operations, step 608 .
  • the vehicle control system processing module may then alter vehicle settings based upon user identity, step 610 . Alteration of vehicle settings at step 610 may include the vehicle control system processing module sending direction(s) to the various vehicle systems described with reference to FIG. 3 .
  • the pattern recognition used at step 608 may be based upon user finger characteristics, hand characteristics, or a combination of these. These characteristics and processing employed to determine these characteristics are described further herein.
  • heat transfer characteristics of a user's fingers are also determined based upon touch pad input and the heat transfer characteristics can be used to assist in identifying a user.
  • Pulse rate characteristics of a user's fingers can be determined based upon the touch pad input and can be used to assist in identifying a user.
  • Location data can be received from a navigation system and can be used to assist in identifying a user.
  • Voice data can be received from a microphone and can be used to assist in identifying a user.
  • FIG. 7 is a flowchart illustrating alternative operations of a vehicle control system to identify a user using touch pad input and to alter vehicle settings according to one or more embodiments of the present invention.
  • Operations 700 begin when touch pad input is received from at least one touch sensitive element of a touch pad, step 702 .
  • Processing circuitry processes the touch pad input to determine user finger characteristics, step 704 .
  • the processing circuitry then processes the user finger characteristics (and other information) to identify a user via pattern matching operations, step 706 .
  • the processing circuitry then alters vehicle settings based upon the user identity, step 708 , and the process ends.
  • In this embodiment, all operations are performed by a single element of the vehicle, e.g., the vehicle control system processing module 202 or a touch pad module 214 A- 214 D of FIG. 2 , with that device sending directions to the vehicle systems to alter vehicle system settings.
  • FIG. 8 is a flowchart illustrating processing touch pad input to determine user finger/hand characteristics according to one or more embodiments of the present invention.
  • Processing the touch pad input by processing circuitry to determine user finger/hand characteristics can be performed by one or more of the following: identifying at least one finger orientation based upon the touch pad input, step 802 ; identifying at least one finger spacing based upon the touch pad input, step 804 ; identifying at least one finger width based upon the touch pad input, step 806 ; identifying a plurality of finger knuckle/joint locations based upon the touch pad input, step 808 ; identifying a plurality of finger lengths based upon the touch pad input, step 810 .
  • User finger characteristics e.g., at least one finger orientation, at least one finger spacing, at least one finger width, a plurality of finger knuckle/joint locations, and a plurality of finger lengths, may be determined by either or both of the vehicle control system processing module and the touch pad circuitry.
  • the touch pad input can be processed by either/both the vehicle control system processing module and the touch pad circuitry to determine these characteristics. Once determined, these characteristics are compared to stored data of the same type for stored users for identification. Upon initial setup, these characteristics are stored for a particular user.
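As a sketch of how finger widths and spacings (steps 804 and 806 above) might be extracted, the following scans one row of element readings for contiguous runs above a touch threshold. The threshold and the element-count units are assumptions.

```python
# Illustrative finger-feature extraction from a single scanned row of
# touch sensitive element readings; threshold is an assumed value.
def finger_features(row, threshold=0.5):
    """Return ([width, ...], [gap, ...]) for the finger contacts found
    in one row, measured in touch-element counts."""
    runs, start = [], None
    for i, value in enumerate(row):
        if value >= threshold and start is None:
            start = i  # a finger contact begins
        elif value < threshold and start is not None:
            runs.append((start, i - 1))  # the contact ends
            start = None
    if start is not None:
        runs.append((start, len(row) - 1))  # contact reaches the row edge
    widths = [end - begin + 1 for begin, end in runs]
    spacings = [runs[k + 1][0] - runs[k][1] - 1 for k in range(len(runs) - 1)]
    return widths, spacings
```

Repeating this over every row (or along the axis perpendicular to the fingers) would yield the finger orientation and knuckle/joint features named in the other steps.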
  • FIG. 9A is a flowchart illustrating processing touch pad input to determine heat transfer characteristics of a user's fingers and using the heat transfer characteristics to identify a user according to one or more embodiments of the present invention.
  • the touch pad input is processed by processing circuitry of the touch pad module and/or the vehicle control system processing module.
  • Heat transfer characteristics of a user's fingers are determined based upon the touch pad input, step 902 .
  • the heat transfer characteristics are used to assist in identifying the user, step 904 . These heat transfer characteristics can be used in conjunction with user finger characteristics to identify the user.
  • FIG. 9B is a flowchart illustrating processing touch pad input to determine pulse rate characteristics of a user's fingers and using the pulse rate characteristics to identify a user according to one or more embodiments of the present invention.
  • the touch pad input is processed by touch pad processing circuitry and/or the vehicle control system processing module.
  • Pulse rate characteristics of a user's fingers are determined based upon the touch pad input, step 952 .
  • the pulse rate characteristics are used to assist in identifying the user, step 954 . These pulse rate characteristics can be used in conjunction with user finger characteristics to identify the user.
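A simplified sketch of pulse-rate estimation from one element's time series is to count local maxima over a known sampling window. A real implementation would need filtering and artifact rejection; the sampling rate and signal model here are assumptions.

```python
# Hypothetical pulse-rate estimate by peak counting on a sampled
# capacitance time series from one touch sensitive element.
def pulse_rate_bpm(series, sample_hz):
    """Estimate beats per minute by counting strict local maxima."""
    peaks = sum(
        1 for i in range(1, len(series) - 1)
        if series[i - 1] < series[i] > series[i + 1]
    )
    duration_min = len(series) / sample_hz / 60.0
    return peaks / duration_min
```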
  • FIG. 10A is a flowchart illustrating the use of location data to assist in identifying a user according to one or more embodiments of the present invention.
  • Location data is received from a navigation system of the vehicle, for example, step 1002 .
  • the location data may be GPS data, for example.
  • the location data is transmitted to the vehicle control system processing module via the communications interface to assist in identifying the user, step 1004 .
  • the location data can be used in conjunction with user finger characteristics to identify the user. For example, one user of the vehicle may drive the vehicle to work while other users of the vehicle may only occasionally visit such location. In such case, using the location data makes identifying the user much easier.
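The fusion of location data with finger characteristics suggested above might be sketched as weighting each candidate's finger-match score by a per-user location prior. The scores, probabilities, and default prior are illustrative assumptions.

```python
# Hypothetical score fusion: finger-match score x location prior.
def combine_scores(finger_scores, location_prior, default_prior=0.1):
    """Weight each candidate's finger-match score by the probability of
    that user being at the vehicle's current location, then pick the
    highest-scoring candidate."""
    combined = {
        user: score * location_prior.get(user, default_prior)
        for user, score in finger_scores.items()
    }
    return max(combined, key=combined.get)
```

In the work-commute example above, the commuting user's high location prior can break a near-tie between two similar finger-characteristic matches.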
  • FIG. 10B is a flowchart illustrating the use of voice data to assist in identifying a user according to one or more embodiments of the present invention.
  • Voice data is received from a microphone of the vehicle control system processing module or another vehicle component, step 1052 .
  • the voice data is transmitted to the vehicle control system processing module for processing to assist in identifying the user, step 1054 .
  • the voice data can be used in conjunction with user finger characteristics to identify the user.
  • the voice data may be processed prior to transmission to the vehicle control system processing module. Alternately, the voice data may be captured by the vehicle control system processing module and used by the vehicle control system processing module to identify a user to augment other data used to identify the user.
  • FIG. 11 is a flowchart illustrating multiple modes of user identification operations of a vehicle control system according to one or more embodiments of the present invention.
  • Operations 1100 begin when a user identification operations mode is selected, step 1102 .
  • a menu is provided to a user, step 1110 .
  • the menu allows the user to select a name and, optionally, other user profile data, such as entertainment system settings, suspension system settings, engine control system settings, etc.
  • Touch pad input is then captured and processed to determine finger/hand characteristics, step 1112 .
  • A user identity and user preference profile (user preference data) is established after fully interacting with the user, step 1114 .
  • the user profile is stored, step 1116 , and the process returns to the step of user identification operations mode selection, step 1102 .
  • the user profile includes a user ID, user system preferences, user touch pad characteristics, e.g., finger characteristics, hand characteristics, heat transfer characteristics, pulse characteristics, vehicle location characteristics, etc.
  • With the operations beginning at step 1106 , touch pad input is captured, step 1118 .
  • the system partially interacts with the user to correlate processed touch pad input to user profiles, step 1120 .
  • a user is selected based upon touch pad input and user interaction, step 1122 .
  • Such partial interaction may query the user to indicate that a correct user ID was selected based upon finger/hand characteristics, for example. However, the extent of user interaction is much less than that of the initial user identification mode 1104 .
  • With the operations beginning at step 1108 , touch pad input is captured, step 1124 .
  • the system correlates the processed touch pad input to user profiles without user interaction, step 1126 .
  • A user is selected based upon only the touch pad input and user profiles, without additional user interaction, step 1128 . Thus, with the operations beginning at step 1108 , no user interaction is required.
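The three identification modes of FIG. 11 (initial with full interaction, partial with a confirmation query, and automatic with no interaction) can be sketched as follows. Here extract() and similarity() are toy stand-ins for the finger/hand pattern matching described elsewhere in this document:

```python
def extract(touch_samples):
    """Toy feature extraction: reduce raw touch pad samples to their mean."""
    return sum(touch_samples) / len(touch_samples)

def similarity(a, b):
    """Toy similarity: higher (less negative) means a closer match."""
    return -abs(a - b)

def identify(mode, touch_samples, profiles, interact=None):
    """Sketch of FIG. 11's modes.

    mode: 'initial' (full interaction, steps 1110-1116),
          'partial' (confirm best match, steps 1118-1122), or
          'auto'    (no interaction, steps 1124-1128).
    interact: callback used only by the interactive modes.
    """
    features = extract(touch_samples)
    if mode == "initial":
        name = interact("Please enter your name")  # full user interaction
        profiles[name] = features                  # store the new profile
        return name
    best = max(profiles, key=lambda u: similarity(features, profiles[u]))
    if mode == "partial" and not interact(f"Are you {best}?"):
        return None                                # user rejected the match
    return best                                    # 'auto': no interaction
```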
  • FIG. 12 is a flowchart illustrating the operation of a vehicle control system in deleting non-matched users after expiration of a user identification period according to one or more embodiments of the present invention.
  • Operations 1200 begin when a user profile is retrieved, step 1202 .
  • a determination is made regarding whether the user profile has been accessed prior to expiration of a deletion period, step 1204 . If No is determined at step 1204 , the user profile is deleted for the particular user, step 1206 . If Yes at step 1204 , the user profile has been accessed prior to expiration of deletion period and the user profile is not deleted. From both a Yes determination at step 1204 and after step 1206 , a determination is made regarding whether the process is complete, step 1208 . If a Yes determination is made at step 1208 , the process ends. If No, the next user profile is selected, step 1210 , and the process repeats to the determination step 1204 .
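The deletion loop of FIG. 12 amounts to filtering out profiles whose last access predates the deletion period. A minimal sketch, assuming a 90-day retention window (the disclosure does not fix a specific period) and a last_access timestamp per profile:

```python
import time

# Assumed 90-day retention window; the deletion period is not specified here.
DELETION_PERIOD_S = 90 * 24 * 3600

def prune_profiles(profiles, now=None):
    """Delete user profiles not accessed within the deletion period (FIG. 12).

    profiles maps user ID -> {"last_access": epoch seconds, ...}.
    Returns only the surviving profiles.
    """
    now = time.time() if now is None else now
    return {uid: prof for uid, prof in profiles.items()
            if now - prof["last_access"] <= DELETION_PERIOD_S}
```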
  • FIG. 13 is a flowchart illustrating the use of user preference data to assist in identifying a user by a vehicle control system according to one or more embodiments of the present invention.
  • User preference data is identified at step 1302 .
  • the user preference information includes vehicle system selection preferences, seat position settings, etc.
  • the user preference data is used to assist in identifying the user by comparing current vehicle settings and/or other pertinent information to the user preference data, step 1304 .
  • If, for example, only two users' stored preferences are consistent with the current vehicle settings, then at step 1304 only those two users may be prime candidates for pattern matching of finger/hand characteristics.
  • Further, some users may be commonly active during particular hours of the day, and these users may be favored for pattern matching during those hours.
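Narrowing the candidate set using preference data and hours of activity, as described above, can be sketched as a pre-filter run ahead of finger/hand pattern matching. The profile fields below are assumptions for illustration:

```python
def candidate_users(profiles, current_settings, hour):
    """Narrow pattern matching to users whose stored preferences match the
    current vehicle settings, or who are typically active at this hour.

    profiles maps user -> {"preferences": {...}, "active_hours": set_of_ints}.
    """
    candidates = []
    for user, prof in profiles.items():
        settings_match = all(current_settings.get(k) == v
                             for k, v in prof["preferences"].items())
        active_now = hour in prof["active_hours"]
        if settings_match or active_now:
            candidates.append(user)
    return candidates or list(profiles)  # never empty: fall back to everyone
```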
  • The terms “circuit” and “circuitry” as used herein may refer to an independent circuit or to a portion of a multifunctional circuit that performs multiple underlying functions.
  • processing circuitry may be implemented as a single chip processor or as a plurality of processing chips.
  • a first circuit and a second circuit may be combined in one embodiment into a single circuit or, in another embodiment, operate independently perhaps in separate chips.
  • The term “chip,” as used herein, refers to an integrated circuit. Circuits and circuitry may comprise general or specific purpose hardware, or may comprise such hardware and associated software such as firmware or object code.
  • The terms “substantially” and “approximately” provide an industry-accepted tolerance for their corresponding terms and/or relativity between items. Such an industry-accepted tolerance ranges from less than one percent to fifty percent and corresponds to, but is not limited to, component values, integrated circuit process variations, temperature variations, rise and fall times, and/or thermal noise. Such relativity between items ranges from a difference of a few percent to magnitude differences.
  • The terms “coupled to” and/or “coupling” include direct coupling between items and/or indirect coupling between items via an intervening item (e.g., an item includes, but is not limited to, a component, an element, a circuit, and/or a module) where, for indirect coupling, the intervening item does not modify the information of a signal but may adjust its current level, voltage level, and/or power level.
  • The term “inferred coupling” includes direct and indirect coupling between two items in the same manner as “coupled to.”
  • The term “operable to” indicates that an item includes one or more of power connections, input(s), output(s), etc., to perform one or more of its corresponding functions and may further include inferred coupling to one or more other items.
  • The term “associated with” includes direct and/or indirect coupling of separate items and/or one item being embedded within another item.
  • The term “compares favorably” indicates that a comparison between two or more items, signals, etc., provides a desired relationship. For example, when the desired relationship is that signal 1 has a greater magnitude than signal 2, a favorable comparison may be achieved when the magnitude of signal 1 is greater than that of signal 2 or when the magnitude of signal 2 is less than that of signal 1.

Abstract

A vehicle control system includes at least one touch pad having a plurality of touch sensitive elements and processing circuitry communicatively coupled to the at least one touch pad. The processing circuitry is operable to receive touch pad input from the at least one touch pad, the touch pad input corresponding to a user's touch of at least some of the plurality of touch sensitive elements. The processing circuitry further processes the touch pad input to determine user finger characteristics. The processing circuitry further processes the user finger characteristics to identify the user via pattern recognition and alters at least one vehicle setting based upon the identified user. Vehicle settings may be one or more of entertainment system settings, navigation system settings, suspension system settings, seat settings, mirror settings, steering wheel settings, climate control system settings, engine control system settings, lighting system settings, and communication system settings.

Description

    CROSS-REFERENCE TO PRIORITY APPLICATION
  • The present U.S. Utility Patent Application claims priority pursuant to 35 U.S.C. §119(e) to U.S. Provisional Application Ser. No. 61/261,702, entitled “TOUCH PAD USER IDENTIFICATION, GAMING INPUT, AND PREFERENCE INPUT,” (Attorney Docket No. BP20924), filed Nov. 16, 2009, pending, which is hereby incorporated herein by reference in its entirety and made part of the present U.S. Utility Patent Application for all purposes.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates generally to vehicle operations; and more particularly to electronics for controlling vehicle operations.
  • 2. Description of the Related Art
  • Vehicles include numerous electrical systems, including vehicle control systems, entertainment systems, navigation systems, and seating systems, among others. The vehicle control systems include engine/motor control systems, drive train control systems, suspension control systems, braking systems, lighting systems, cruise control systems, fuel systems, etc. Some of these systems are customizable for particular users. For example, seat positions can be memorized for particular users and enacted upon identification of the user, typically by the user pressing a corresponding numbered button. Other user identification techniques include assigning a key fob to a particular user, identifying when the key fob is in use with the vehicle, and enacting settings particular to the key fob. Button depression requires user interaction. Key fobs may be exchanged between users. Both of these user identification techniques therefore have shortcomings.
  • BRIEF SUMMARY OF THE INVENTION
  • The present invention is directed to apparatus and methods of operation that are further described in the following Brief Description of the Drawings, the Detailed Description, and the claims. Other features and advantages of the present invention will become apparent from the following detailed description of the invention made with reference to the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1A is a diagram illustrating a steering wheel having touch pads installed thereon and operative therewith according to one or more embodiments of the present invention;
  • FIG. 1B is a diagram illustrating a gear shift selector having a touch pad installed thereon and operative therewith according to one or more embodiments of the present invention;
  • FIG. 2 is a block diagram illustrating a vehicle control system processing module and a plurality of touch pad modules constructed and operating according to one or more embodiments of the present invention;
  • FIG. 3 is a block diagram illustrating a vehicle control system processing module, a plurality of touch pad modules, and a plurality of other vehicle systems constructed and operating according to one or more embodiments of the present invention;
  • FIG. 4 is a block diagram illustrating a touch pad and touch pad circuitry constructed according to one or more embodiments of the present invention;
  • FIG. 5A is a diagram illustrating how a user's hand may overlay a touch pad according to one or more embodiments of the present invention;
  • FIG. 5B is a diagram illustrating the manner in which a user's hand upon the touch pad produces a particular pattern of capacitance upon the touch sensitive elements of the touch pad;
  • FIG. 6 is a flowchart illustrating operations of a vehicle control system to identify a user using touch pad input and to alter vehicle settings according to one or more embodiments of the present invention;
  • FIG. 7 is a flowchart illustrating alternative operations of a vehicle control system to identify a user using touch pad input and to alter vehicle settings according to one or more embodiments of the present invention;
  • FIG. 8 is a flowchart illustrating processing touch pad input to determine user finger/hand characteristics according to one or more embodiments of the present invention;
  • FIG. 9A is a flowchart illustrating processing touch pad input to determine heat transfer characteristics of a user's fingers and using the heat transfer characteristics to identify a user according to one or more embodiments of the present invention;
  • FIG. 9B is a flowchart illustrating processing touch pad input to determine pulse rate characteristics of a user's fingers and using the pulse rate characteristics to identify a user according to one or more embodiments of the present invention;
  • FIG. 10A is a flowchart illustrating the use of vehicle location data to assist in identifying a user according to one or more embodiments of the present invention;
  • FIG. 10B is a flowchart illustrating the use of voice data to assist in identifying a user according to one or more embodiments of the present invention;
  • FIG. 11 is a flowchart illustrating multiple modes of user identification operations of a vehicle control system according to one or more embodiments of the present invention;
  • FIG. 12 is a flowchart illustrating the operation of a vehicle control system in deleting non-matched users after expiration of a user identification period according to one or more embodiments of the present invention; and
  • FIG. 13 is a flowchart illustrating the use of user preference data to assist in identifying a user by a vehicle control system according to one or more embodiments of the present invention.
  • DETAILED DESCRIPTION
  • FIG. 1A is a diagram illustrating a steering wheel having touch pads installed thereon and operative therewith according to one or more embodiments of the present invention. FIG. 1B is a diagram illustrating a gear shift selector (gear shifter) having a touch pad installed thereon and operative therewith according to one or more embodiments of the present invention. Referring to both FIGS. 1A and 1B, touch pads may be used in an automobile, a boat, an aircraft, or any other vehicle in order to identify a user of the vehicle. These touch pads may be used for user identification and/or to receive user input. While only a steering wheel and gear shift selector are shown in FIGS. 1A and 1B, touch pads could be located on other parts of a vehicle such as a door handle, arm rest, throttle control, dash panel, or other suitable location that a user could touch for the purposes and operations described herein.
  • Referring particularly to FIG. 1A, the steering wheel includes a steering wheel hub 104 and a steering wheel ring 102 of a conventional design. However, according to the present invention, the steering wheel ring 102 includes four touch pads 106, 108, 110, and 112. Each of these touch pads 106, 108, 110, and 112 includes a plurality of touch sensitive elements that are able to detect the presence of a user's hands/fingers. For example, in the ten o'clock/two o'clock driving position, a user's hands would be detected by touch pads 106 and 108. In the four o'clock/eight o'clock driving position, a user's hands would be detected at touch pads 110 and 112. Further, during single-handed operation, a user's hand may be detected at any of the touch pads 106, 108, 110, or 112. Upon detection of a user's hands/fingers, operations according to the present invention may determine the position of the user's hands, the spacing between the fingers, the position of the user's knuckles, the pressure applied to the steering wheel, the relative heat transfer characteristics of the user's hands, and other unique identifiers that could be employed to distinguish the user from a number of users.
  • Detection and identification of a user via his or her touch of the steering wheel may be employed to establish particular vehicle settings based upon the user identification. These settings may include entertainment system settings, environmental system settings such as temperature, suspension system settings, seat settings, mirror settings, and other settings that may be pre-programmed for a user and that are initiated upon identification of the user.
  • Further, input received via each of the touch pads 106, 108, 110, and 112 may also be used to control operations of the vehicle. For example, each of these touch pads could be used to adjust the audio control of an audio device of the vehicle. The touch pad input received via touch pad 106, 108, 110, and/or 112 may be employed to adjust the volume, tuning, track selection, or other settings of an audio system. In another configuration, these touch pads may be used to adjust the environmental system settings within the vehicle. Moreover, these touch pads may serve as input devices to establish telephone calls via a coupled cellular telephone within the vehicle or via an Internet connection supported by the vehicle. As the reader will appreciate, the touch pads 106, 108, 110, and 112 of the steering wheel may be employed for any other types of input that may be useful for an automobile or other vehicle.
  • Referring to FIG. 1B, a gear shift controller 152 includes one or more touch pads 154. The touch pad 154 may be used to identify a user of a vehicle in which the gear selector 152 is installed. The gear shift 152 may be installed within an automobile, a boat, an aircraft, or another vehicle. The touch pad 154 may further be enabled to initiate input to the vehicle in which the gear shift 152 resides. For example, the touch pad 154 may be used to adjust not only the particular gear that the automobile is in but the suspension mode that the vehicle is in as well.
  • The input provided via touch pads 106, 108, 110, 112, and/or 154 of the devices of FIGS. 1A and 1B may also be used to adjust the suspension characteristics of the vehicle, the cruise control characteristics of the vehicle, mirror settings of the vehicle, lighting settings of the vehicle, and/or other vehicle settings. A sequence of touches provided via the various touch pads 106, 108, 110, 112, and/or 154 may be employed to initiate various operations of a computer of a particular vehicle. For example, the touch input may be used to alter certain settings of a vehicle computer by tapping particular touch pads once or twice and then tapping other pads once or twice in order to initiate a control sequence.
  • Touch pad input may be capacitance, inductance, RF propagation characteristics, or a combination of these as measured at the touch sensitive elements of the touch pads. As will be further described herein, the touch pads may be employed to identify users based upon relative characteristics of the users' fingers or hands as they grasp or rest upon portions of a vehicle. Alternatively, the touch pads 106, 108, 110, 112, and 154 may capture finger print patterns. The information regarding the user that is received via touch pads 106, 108, 110, 112, and 154 may be relayed to a vehicle control system processing module, as will be described further herein.
  • FIG. 2 is a block diagram illustrating a vehicle control system constructed and operating according to one or more embodiments of the present invention. The vehicle control system of FIG. 2 includes a vehicle control system processing module 202 that couples to a plurality of touch pad modules 214A, 214B, 214C, and 214D via one or more communication links 220. The vehicle control system processing module 202 includes a wireless interface 204, processing circuitry 206, one or more wired interfaces 210, and memory 208. The vehicle control system processing module 202 typically also includes a user interface 212 and may include other components that are not shown, such as at least one video interface, at least one audio interface, and a video camera/video camera interface. The wireless interface 204 supports wireless communications with various intra-vehicle components and various extra-vehicle components. The wireless interface 204 may support communications via cellular networks, Wireless Wide Area Networks (WWANs), Wireless Local Area Networks (WLANs), Wireless Personal Area Networks (WPANs), satellite communication networks, millimeter wave networks, etc., and may support proprietary communication formats.
  • The processing circuitry 206 may include one or more of a system processor, a digital signal processor, a processing module, dedicated hardware, an application specific integrated circuit, or other circuitry that is capable of executing software instructions and processing data. The memory 208 may be RAM, ROM, FLASH RAM, FLASH ROM, optical memory, magnetic memory, or another type of memory that is capable of storing data and/or instructions and allowing the processing circuitry to access the same. The wired interfaces 210 may include a USB interface, a FireWire interface, a serial interface, a parallel interface, an optical interface, or another type of interface supported by copper, metal, or optical media. The user interface 212 may include a keypad, video display, cursor control, touch pad, or another type of interface that allows a user to interface with the vehicle control system processing module 202.
  • The touch pad modules 214A, 214B, 214C, and 214D include touch pad interfaces (touch pad communication interfaces) 216A, 216B, 216C, and 216D, respectively. These touch pad interfaces 216A, 216B, 216C, and 216D support communications with the vehicle control system processing module 202 via the communication links 220. The communication links 220 may be wired, wireless, or a combination of wired and wireless links. Each touch pad module 214A, 214B, 214C, and 214D further includes respective touch pad circuitry 218A, 218B, 218C, and 218D, which is processing circuitry that interfaces with respective touch pads 220A, 220B, 220C, and 220D of the touch pad modules. The touch pad circuitry 218A, 218B, 218C, and 218D is capable of processing touch pad input received from the touch pads 220A, 220B, 220C, and 220D, as will be further described with reference to FIG. 4, and is capable of executing software instructions to perform desired functions.
  • FIG. 3 is a block diagram illustrating a vehicle control system and a plurality of other vehicle systems constructed and operating according to one or more embodiments of the present invention. The system of FIG. 3 includes a plurality of touch pad modules 214A, 214B, 214C, and 214D. The vehicle systems include an entertainment system 302, a navigation system 304, a suspension system 306, a seating system 308, a mirror control system 310, a steering wheel system 312, a climate control system 314, a suspension system 316, an engine control system 318, a lighting system 320, and a communication system 322. All of these components are communicatively coupled via one or more communication links 220, which may be wired and/or wireless. The communication system 322 supports extra-vehicular communications for the vehicle, which may be cellular communications, satellite communications, etc.
  • As will be further described herein, in embodiments of the present invention a user of the vehicle is identified based upon touch pad input received at one or more touch pads of one or more of the touch pad modules 214A, 214B, 214C, and 214D. The touch pad input is processed by processing circuitry of the touch pad modules 214A, 214B, 214C, and/or 214D and/or the vehicle control system processing module 202 to identify a user of the vehicle. Based upon the user identification, the operation of one or more of the other systems 302-320 is modified. For example, identification of the user may be accomplished via touch pads of a steering wheel alone or in combination with touch pads of a gear shifter. Once the user is identified, suspension settings, seat settings, mirror settings, climate control settings, steering wheel settings, and entertainment settings are altered to correspond to those of the identified user. The reader will appreciate that any settings of any of the systems 302-320 of FIG. 3 may be modified to correspond to a particular user identity.
  • FIG. 4 is a block diagram illustrating a touch pad and touch pad circuitry constructed according to one or more embodiments of the present invention. A touch pad 402 includes a plurality of touch sensitive elements 404 each of which corresponds to a particular location of the touch pad 402. With the embodiment of FIG. 4, the touch pad 402 includes an array of touch sensitive elements 404, each of which may be a particular capacitively coupled location, inductively coupled location, or a radio frequency (RF) touch sensitive element. Touch pad circuitry 406 couples via a grid structure to the plurality of touch sensitive elements 404 to sense the particular capacitance, inductive, or RF characteristics at each of the touch sensitive elements.
  • Touch pad circuitry 406 scans the plurality of touch sensitive elements 404 via access of particular row-column combinations at particular times. The frequency or voltage at which the touch pad circuitry 406 scans the plurality of touch sensitive elements 404 may be altered over time. Choosing the scanning frequency or scanning voltage may be based upon a particular operational use of the touch pad. For example, the manner in which the touch pad is scanned will change based upon a particular operation of the touch pad, e.g., a first scanning frequency/scanning voltage may be employed for user identification while a second scanning frequency/scanning voltage may be employed for receiving user input.
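The mode-dependent row/column scanning described above can be sketched as a parameter table plus a frame scan. The specific frequency and voltage values below are hypothetical, and read_element stands in for the hardware access:

```python
# Hypothetical scan parameters; the text specifies only that the scanning
# frequency/voltage differ between user identification and user input modes.
SCAN_PARAMS = {
    "identify": {"freq_hz": 200_000, "volts": 3.3},
    "input":    {"freq_hz": 50_000,  "volts": 1.8},
}

def scan_frame(rows, cols, read_element, mode):
    """Scan every row/column intersection using the mode's parameters,
    returning one capacitance frame as a list of rows.

    read_element(row, col, params) abstracts the per-element measurement.
    """
    params = SCAN_PARAMS[mode]
    return [[read_element(r, c, params) for c in range(cols)]
            for r in range(rows)]
```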
  • The scanning done by the touch pad circuitry 406 of the plurality of touch sensitive elements may be made using a spread spectrum scanned frequency technique. Such technique may be employed to more efficiently capture information from the touch pad 402 at the various touch sensitive elements 404 or to determine which particular scanning frequencies are more successful than others in capturing input information.
  • Further, the scanning of each row and column corresponding to a particular touch sensitive element 404 may be altered based upon a detected capacitance (inductance/RF propagation) at the location. For example, one particular touch sensitive element 404 may have a fixed capacitance that does not vary over time. Such fixed capacitance may indicate that the particular touch sensitive element 404 is inoperable or that it receives no discernable input. In such case, by not scanning the particular touch sensitive element, other touch sensitive elements may be more frequently scanned or energy may be saved by not scanning all touch sensitive elements.
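Skipping elements whose capacitance never varies can be sketched as a scan mask computed from recent reading histories; the tolerance value is an assumption:

```python
def update_scan_mask(history, tolerance=1e-3):
    """Given per-element reading histories, mark elements whose readings
    never vary (stuck or unused) so later scans can skip them.

    history: one list of recent readings per touch sensitive element.
    Returns a list of booleans: True means keep scanning that element.
    """
    return [max(samples) - min(samples) > tolerance for samples in history]
```

Elements masked out this way free scan time for active elements or save energy, as the text describes.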
  • According to another aspect of the present invention, some portions of the touch pad may be disabled while others are enabled at differing points in time. Enablement of some touch sensitive elements and not others may be based upon a custom configuration of the touch pad for a particular input function provided.
  • The touch pad 402 may also be calibrated by the touch pad circuitry 406 based upon the environmental factors such as temperature, humidity, and surrounding noise as detected by measured capacitance, inductance, or RF propagation characteristics. Calibration of the touch pad 402 allows the touch pad 402 to be more efficient and more effectively receive touch pad input for user identification and/or for other input purposes. The calibration of the touch pad 402 by the touch pad circuitry 406 may be initiated at particular points in time. The touch pad circuitry 406 may simply initiate calibration of the touch pad 402 upon the expiration of a timer such that the touch pad is calibrated at a particular regular time interval. Alternatively, the touch pad 402 may be calibrated after a period of inactivity, i.e., the touch pad circuitry 406 performs calibration when it determines that no input is present on the touch pad 402. With other operations or embodiments, the touch pad 402 may be calibrated by the touch pad circuitry 406 using other input criteria as well.
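The two calibration triggers described above (a regular timer and a period of inactivity) can be sketched as a small state machine:

```python
class Calibrator:
    """Decide when to recalibrate the touch pad: on a regular timer, or
    after a period with no touch activity (both triggers from the text).
    Interval values are supplied by the caller; none are specified here."""

    def __init__(self, interval_s, idle_s):
        self.interval_s = interval_s  # regular calibration interval
        self.idle_s = idle_s          # inactivity threshold
        self.last_cal = 0.0
        self.last_touch = 0.0

    def tick(self, now, touching):
        """Return True if calibration should run at time `now` (seconds)."""
        if touching:
            self.last_touch = now
            return False              # never calibrate while input is present
        if (now - self.last_cal >= self.interval_s
                or now - self.last_touch >= self.idle_s):
            self.last_cal = now
            return True
        return False
```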
  • FIG. 5A is a diagram illustrating how a user's hand may overlay a touch pad according to one or more embodiments of the present invention. The touch pad 402 has a plurality of touch sensitive elements 404 and is mounted upon a portion of a vehicle so that it is adjacent to a user's hand when the user holds that portion of the vehicle. The outline 502 of the user's hand is shown overlaying the touch pad 402 and the plurality of touch sensitive elements 404. While the touch pad 402 of FIG. 5A is generally illustrated as planar, the touch pad 402 may wrap around a steering wheel, gear shifter, door handle, or another vehicle component.
  • FIG. 5B is a diagram illustrating the manner in which a user's hand upon the touch pad produces a particular pattern of capacitance (inductance/RF propagation) upon the touch pad. A relative capacitance, inductance, or RF propagation pattern of the user's hand 502 is shown on touch pad 402. The depiction in FIG. 5B generally illustrates only the relative capacitance at each of the user's finger positions upon the touch pad 402. For example, where the user's fingers physically touch the touch pad 402, stronger capacitance lines 552 are shown. Where the user's fingers merely overlay the touch pad 402, lesser capacitance, inductance, or RF propagation characteristic lines 554 are shown. While the capacitance lines for the other fingers are not numbered in FIG. 5B, corresponding capacitance lines would be present for those fingers as well.
  • The capacitance pattern of the user's hand 502 upon the touch pad 402 serves as a signature of a particular user: the size of a user's hands, the positions of the knuckles, and the relative angle at which the user grips the relevant location in the vehicle all differ from user to user. Thus, based upon variations in the capacitive pattern upon the touch pad 402, differing users can be identified. Further, considering that the touch pad 402 may serve as an input device, the variation over time of the capacitance of the touch sensitive elements 404 of the touch pad 402 may be used to indicate touch pad input. Based upon the scanning frequency, the scanning voltage, and other scanning factors of the touch pad 402 at the various touch sensitive elements 404, the characteristics measured at each touch sensitive element 404 over time enable the system to identify a user or to receive particular input via the touch pad 402.
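Comparing a captured capacitance pattern against a stored signature can be sketched with a normalized cross-correlation over the two frames, one simple stand-in for the pattern matching described here:

```python
def correlate(frame_a, frame_b):
    """Normalized cross-correlation between two capacitance frames.

    Returns 1.0 for identical patterns, values near 0 for unrelated
    patterns, and negative values for inverted patterns.
    """
    a = [v for row in frame_a for v in row]   # flatten frame A
    b = [v for row in frame_b for v in row]   # flatten frame B
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a)
           * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den if den else 0.0
```

A system might declare a match when the correlation against a stored user's frame exceeds some threshold.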
  • FIG. 6 is a flowchart illustrating operations of a vehicle control system processing module and a touch pad module to identify a user using touch pad input and to alter vehicle settings according to one or more embodiments of the present invention. Operations 600 begin when touch pad input is received from at least one touch sensitive element of a touch pad, step 602. The touch pad input has components from a plurality of touch sensitive elements of the touch pad. The touch pad input is processed by touch pad circuitry to determine user finger characteristics, step 604. The user finger characteristics are then transmitted to the vehicle control system processing module via a communications interface, step 606. The vehicle control system processing module then processes the user finger characteristics to identify a user via pattern matching operations, step 608. The vehicle control system processing module may then alter vehicle settings based upon the user identity, step 610. Alteration of vehicle settings at step 610 may include the vehicle control system processing module sending direction(s) to the various vehicle systems described with reference to FIG. 3.
  • The pattern recognition used at step 608 may be based upon user finger characteristics, hand characteristics, or a combination of these. These characteristics and processing employed to determine these characteristics are described further herein. In another embodiment, heat transfer characteristics of a user's fingers are also determined based upon touch pad input and the heat transfer characteristics can be used to assist in identifying a user. Pulse rate characteristics of a user's fingers can be determined based upon the touch pad input and can be used to assist in identifying a user. Location data can be received from a navigation system and can be used to assist in identifying a user. Voice data can be received from a microphone and can be used to assist in identifying a user.
  • FIG. 7 is a flowchart illustrating alternative operations of a vehicle control system to identify a user using touch pad input and to alter vehicle settings according to one or more embodiments of the present invention. Operations 700 begin when touch pad input is received from at least one touch sensitive element of a touch pad, step 702. Processing circuitry processes the touch pad input to determine user finger characteristics, step 704. The processing circuitry then processes the user finger characteristics (and other information) to identify a user via pattern matching operations, step 706. The processing circuitry then alters vehicle settings based upon the user identity, step 708, and the process ends. In FIG. 7, all operations are performed by a single element of the vehicle, e.g., the vehicle control system processing module 202 or a touch pad module 214A-214D of FIG. 2, with that element sending directions to vehicle systems to alter vehicle system settings.
  • FIG. 8 is a flowchart illustrating processing touch pad input to determine user finger/hand characteristics according to one or more embodiments of the present invention. Processing the touch pad input by processing circuitry to determine user finger/hand characteristics can be performed by one or more of the following: identifying at least one finger orientation based upon the touch pad input, step 802; identifying at least one finger spacing based upon the touch pad input, step 804; identifying at least one finger width based upon the touch pad input, step 806; identifying a plurality of finger knuckle/joint locations based upon the touch pad input, step 808; and identifying a plurality of finger lengths based upon the touch pad input, step 810.
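Two of these characteristic determinations, finger spacing (step 804) and finger width (step 806), can be sketched from contact geometry. The representation below — each finger's contact given as a list of x-coordinates of activated elements — and the sample data are assumptions for illustration, not drawn from the patent.

```python
# Sketch of steps 804/806: derive finger width and spacing from contact
# regions. Each inner list holds the x-coordinates of active elements
# for one finger; units are element indices and purely illustrative.

def finger_widths(contacts):
    # Step 806: extent of each finger's contact region.
    return [max(xs) - min(xs) for xs in contacts]

def finger_spacings(contacts):
    # Step 804: gap between centers of adjacent contact regions.
    centers = [sum(xs) / len(xs) for xs in contacts]
    return [b - a for a, b in zip(centers, centers[1:])]

contacts = [[1, 2, 3], [6, 7, 8], [11, 12, 13]]  # three fingers
```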
  • User finger characteristics, e.g., at least one finger orientation, at least one finger spacing, at least one finger width, a plurality of finger knuckle/joint locations, and a plurality of finger lengths, may be determined from the touch pad input by either or both of the vehicle control system processing module and the touch pad circuitry. Once determined, these characteristics are compared to stored data of the same type for stored users for identification. Upon initial setup, these characteristics are stored for a particular user.
  • FIG. 9A is a flowchart illustrating processing touch pad input to determine heat transfer characteristics of a user's fingers and using the heat transfer characteristics to identify a user according to one or more embodiments of the present invention. The touch pad input is processed by processing circuitry of the touch pad module and/or the vehicle control system processing module. Heat transfer characteristics of a user's fingers are determined based upon the touch pad input, step 902. The heat transfer characteristics are used to assist in identifying the user, step 904. These heat transfer characteristics can be used in conjunction with user finger characteristics to identify the user.
  • FIG. 9B is a flowchart illustrating processing touch pad input to determine pulse rate characteristics of a user's fingers and using the pulse rate characteristics to identify a user according to one or more embodiments of the present invention. The touch pad input is processed by touch pad processing circuitry and/or the vehicle control system processing module. Pulse rate characteristics of a user's fingers are determined based upon the touch pad input, step 952. The pulse rate characteristics are used to assist in identifying the user, step 954. These pulse rate characteristics can be used in conjunction with user finger characteristics to identify the user.
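One way the pulse rate determination of step 952 could work is by counting periodic peaks in the capacitance time series at a touch sensitive element. The sketch below is a hypothetical illustration: the peak-counting rule, sample rate, and waveform are invented, and a real system would need filtering and noise rejection.

```python
# Sketch: estimate pulse rate from a time series of capacitance samples
# at one touch sensitive element by counting local peaks above the mean.

def pulse_rate_bpm(samples, sample_rate_hz):
    """Count local maxima above the mean and convert to beats/minute."""
    mean = sum(samples) / len(samples)
    peaks = sum(
        1
        for prev, cur, nxt in zip(samples, samples[1:], samples[2:])
        if cur > prev and cur > nxt and cur > mean
    )
    # Scale peak count over the capture window to a per-minute rate.
    return peaks * 60.0 * sample_rate_hz / len(samples)

# 10 seconds at 10 Hz with one synthetic peak per second.
samples = [0.0, 0.1, 0.3, 0.1, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0] * 10
```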
  • FIG. 10A is a flowchart illustrating the use of location data to assist in identifying a user according to one or more embodiments of the present invention. Location data is received from a navigation system of the vehicle, for example, step 1002. The location data may be GPS data, for example. The location data is transmitted to the vehicle control system processing module via the communications interface to assist in identifying the user, step 1004. The location data can be used in conjunction with user finger characteristics to identify the user. For example, one user of the vehicle may drive the vehicle to work while other users of the vehicle may only occasionally visit that location. In such a case, the location data makes identifying the user much easier.
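The work-commute example can be sketched as a location prior that breaks near-ties in the finger-pattern match. The scoring scheme, weights, and data below are all hypothetical assumptions, not the patent's method.

```python
# Sketch: boost each candidate's touch-pattern match score by how often
# that user has been observed at the current location. Values invented.

def identify_with_location(match_scores, location, location_history):
    """Return the candidate with the best location-weighted score."""
    best_user, best_score = None, float("-inf")
    for user, score in match_scores.items():
        visits = location_history.get(user, {}).get(location, 0)
        weighted = score + 0.1 * visits   # hypothetical prior weight
        if weighted > best_score:
            best_user, best_score = user, weighted
    return best_user

match_scores = {"alice": 0.50, "bob": 0.52}      # near-tie on touch data
history = {"alice": {"work": 40}, "bob": {"work": 1}}
```

At "work" the prior favors alice despite bob's slightly better touch match; elsewhere the touch score alone decides.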
  • FIG. 10B is a flowchart illustrating the use of voice data to assist in identifying a user according to one or more embodiments of the present invention. Voice data is received from a microphone of the vehicle control system processing module or another vehicle component, step 1052. The voice data is transmitted to the vehicle control system processing module for processing to assist in identifying the user, step 1054. The voice data can be used in conjunction with user finger characteristics to identify the user. The voice data may be processed prior to transmission to the vehicle control system processing module. Alternately, the voice data may be captured by the vehicle control system processing module and used by the vehicle control system processing module to identify a user to augment other data used to identify the user.
  • FIG. 11 is a flowchart illustrating multiple modes of user identification operations of a vehicle control system according to one or more embodiments of the present invention. Operations 1100 begin when a user identification operations mode is selected, step 1102. When initial user identification mode is selected, step 1104, a menu is provided to a user, step 1110. The menu allows the user to select a name and, optionally, other user profile data, such as entertainment system settings, suspension system settings, engine control system settings, etc. Touch pad input is then captured and processed to determine finger/hand characteristics, step 1112. A user identity and user preference profile are established after fully interacting with the user, step 1114. The user profile is stored, step 1116, and the process returns to the step of user identification operations mode selection, step 1102. The user profile includes a user ID, user system preferences, and user touch pad characteristics, e.g., finger characteristics, hand characteristics, heat transfer characteristics, pulse characteristics, vehicle location characteristics, etc.
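The user profile stored at step 1116 could be modeled as a simple record. The field names below are illustrative labels for the items the paragraph lists, not identifiers from the patent.

```python
# Sketch of a stored user profile (step 1116). Field names are
# hypothetical; the contents mirror the characteristics listed above.
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    user_id: str
    system_preferences: dict = field(default_factory=dict)   # seat, radio, ...
    finger_characteristics: dict = field(default_factory=dict)
    hand_characteristics: dict = field(default_factory=dict)
    heat_transfer: dict = field(default_factory=dict)
    pulse: dict = field(default_factory=dict)
    frequent_locations: list = field(default_factory=list)

profile = UserProfile(
    user_id="alice",
    system_preferences={"seat_position": 2, "station": "FM 101.1"},
    finger_characteristics={"finger_widths": [14, 16, 15, 13]},
)
```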
  • When intermediate user identification mode is selected, step 1106, touch pad input is captured, step 1118. The system partially interacts with the user to correlate processed touch pad input to user profiles, step 1120. A user is selected based upon touch pad input and user interaction, step 1122. Such partial interaction may, for example, query the user to confirm that the correct user ID was selected based upon finger/hand characteristics. However, the extent of user interaction is much less than that of the initial user identification mode of step 1104.
  • When automatic user identification mode is selected, step 1108, touch pad input is captured, step 1124. The system correlates the processed touch pad input to user profiles without user interaction, step 1126. A user is selected based only upon the touch pad input and user profiles, without additional user interaction, step 1128. Thus, with the operations beginning at step 1108, no user interaction is required.
  • FIG. 12 is a flowchart illustrating the operation of a vehicle control system in deleting non-matched users after expiration of a user identification period according to one or more embodiments of the present invention. Operations 1200 begin when a user profile is retrieved, step 1202. A determination is made regarding whether the user profile has been accessed prior to expiration of a deletion period, step 1204. If No is determined at step 1204, the user profile is deleted for the particular user, step 1206. If Yes at step 1204, the user profile has been accessed prior to expiration of the deletion period and the user profile is not deleted. From both a Yes determination at step 1204 and after step 1206, a determination is made regarding whether the process is complete, step 1208. If a Yes determination is made at step 1208, the process ends. If No, the next user profile is selected, step 1210, and the process repeats from determination step 1204.
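The FIG. 12 loop amounts to filtering out profiles whose last access falls outside the deletion period. The sketch below assumes hypothetical integer timestamps and field names for illustration.

```python
# Sketch of the FIG. 12 cleanup (steps 1202-1210): keep only profiles
# accessed within the deletion period. Timestamps are illustrative.

def purge_stale_profiles(profiles, now, deletion_period):
    """Return the profiles accessed within the deletion period."""
    return {
        user: p
        for user, p in profiles.items()
        if now - p["last_access"] <= deletion_period   # step 1204 test
    }

profiles = {
    "alice": {"last_access": 95},   # accessed recently -> kept
    "bob":   {"last_access": 10},   # expired -> deleted (step 1206)
}
remaining = purge_stale_profiles(profiles, now=100, deletion_period=30)
```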
  • FIG. 13 is a flowchart illustrating the use of user preference data to assist in identifying a user by a vehicle control system according to one or more embodiments of the present invention. User preference data is identified at step 1302. The user preference information includes vehicle system selection preferences, seat position settings, etc. The user preference data is used to assist in identifying the user by comparing current vehicle settings and/or other pertinent information to the user preference data, step 1304. For example, if the current vehicle settings match the stored preferences of only two users, then at step 1304 only those two users may be prime candidates for pattern matching of finger/hand characteristics. As another example, some users may be commonly active during particular hours of the day, and these users are favored for pattern matching during those hours of the day.
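The candidate-narrowing step can be sketched as a filter over stored profiles. The settings keys and user data below are invented for illustration; the point is that preference comparison shrinks the pool before the more expensive finger/hand pattern matching runs.

```python
# Sketch of step 1304: current vehicle settings narrow the candidate
# list before finger/hand pattern matching. Data is hypothetical.

def narrow_candidates(current_settings, profiles):
    """Keep only users whose stored preferences match the settings."""
    return [
        user
        for user, prefs in profiles.items()
        if all(prefs.get(k) == v for k, v in current_settings.items())
    ]

profiles = {
    "alice": {"seat_position": 2, "mirror": "high"},
    "bob":   {"seat_position": 2, "mirror": "low"},
    "carol": {"seat_position": 5, "mirror": "low"},
}
candidates = narrow_candidates({"seat_position": 2}, profiles)
```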
  • The terms “circuit” and “circuitry” as used herein may refer to an independent circuit or to a portion of a multifunctional circuit that performs multiple underlying functions. For example, depending on the embodiment, processing circuitry may be implemented as a single chip processor or as a plurality of processing chips. Likewise, a first circuit and a second circuit may be combined in one embodiment into a single circuit or, in another embodiment, operate independently perhaps in separate chips. The term “chip,” as used herein, refers to an integrated circuit. Circuits and circuitry may comprise general or specific purpose hardware, or may comprise such hardware and associated software such as firmware or object code.
  • The present invention has also been described above with the aid of method steps illustrating the performance of specified functions and relationships thereof. The boundaries and sequence of these functional building blocks and method steps have been arbitrarily defined herein for convenience of description. Alternate boundaries and sequences can be defined so long as the specified functions and relationships are appropriately performed. Any such alternate boundaries or sequences are thus within the scope and spirit of the claimed invention.
  • The present invention has been described above with the aid of functional building blocks illustrating the performance of certain significant functions. The boundaries of these functional building blocks have been arbitrarily defined for convenience of description. Alternate boundaries could be defined as long as the certain significant functions are appropriately performed. Similarly, flow diagram blocks may also have been arbitrarily defined herein to illustrate certain significant functionality. To the extent used, the flow diagram block boundaries and sequence could have been defined otherwise and still perform the certain significant functionality. Such alternate definitions of both functional building blocks and flow diagram blocks and sequences are thus within the scope and spirit of the claimed invention. One of average skill in the art will also recognize that the functional building blocks, and other illustrative blocks, modules and components herein, can be implemented as illustrated or by discrete components, application specific integrated circuits, processors executing appropriate software and the like or any combination thereof.
  • As may be used herein, the terms “substantially” and “approximately” provide an industry-accepted tolerance for their corresponding terms and/or relativity between items. Such an industry-accepted tolerance ranges from less than one percent to fifty percent and corresponds to, but is not limited to, component values, integrated circuit process variations, temperature variations, rise and fall times, and/or thermal noise. Such relativity between items ranges from a difference of a few percent to magnitude differences. As may also be used herein, the terms “coupled to” and/or “coupling” include direct coupling between items and/or indirect coupling between items via an intervening item (e.g., an item includes, but is not limited to, a component, an element, a circuit, and/or a module) where, for indirect coupling, the intervening item does not modify the information of a signal but may adjust its current level, voltage level, and/or power level. As may further be used herein, inferred coupling (i.e., where one element is coupled to another element by inference) includes direct and indirect coupling between two items in the same manner as “coupled to.” As may even further be used herein, the term “operable to” indicates that an item includes one or more of power connections, input(s), output(s), etc., to perform one or more of its corresponding functions and may further include inferred coupling to one or more other items. As may still further be used herein, the term “associated with” includes direct and/or indirect coupling of separate items and/or one item being embedded within another item. As may be used herein, the term “compares favorably” indicates that a comparison between two or more items, signals, etc., provides a desired relationship. 
For example, when the desired relationship is that signal 1 has a greater magnitude than signal 2, a favorable comparison may be achieved when the magnitude of signal 1 is greater than that of signal 2 or when the magnitude of signal 2 is less than that of signal 1.
  • Moreover, although described in detail for purposes of clarity and understanding by way of the aforementioned embodiments, the present invention is not limited to such embodiments. It will be obvious to one of average skill in the art that various changes and modifications may be practiced within the spirit and scope of the invention, as limited only by the scope of the appended claims.

Claims (20)

1. A vehicle control system comprising:
at least one touch pad having a plurality of touch sensitive elements; and
processing circuitry communicatively coupled to the at least one touch pad, the processing circuitry operable to:
receive touch pad input from the at least one touch pad, the touch pad input corresponding to a user's touch of at least some of the plurality of touch sensitive elements;
process the touch pad input to determine user finger characteristics;
process the user finger characteristics to identify the user via pattern recognition; and
alter at least one vehicle setting based upon the identified user.
2. The vehicle control system of claim 1, wherein the at least one touch pad is located on one or more of:
a steering wheel;
a gear shifter;
a throttle control;
a door handle; and
an arm rest.
3. The vehicle control system of claim 1, wherein the at least one vehicle setting is selected from the group consisting of:
entertainment system settings;
navigation system settings;
suspension system settings;
seat settings;
mirror settings;
steering wheel settings;
climate control system settings;
engine control system settings;
lighting system settings; and
communication system settings.
4. The vehicle control system of claim 1, wherein:
the processing circuitry comprises touch pad circuitry and system processing circuitry;
the touch pad circuitry is operable to:
process the touch pad input to determine the user finger characteristics; and
initiate transmission of the determined user finger characteristics to the system processing circuitry; and
the system processing circuitry is operable to process the determined user finger characteristics to identify the user via pattern recognition.
5. The vehicle control system of claim 4, wherein in processing the touch pad input to determine the user finger characteristics, the touch pad circuitry is operable to perform operations selected from the group consisting of:
identifying at least one finger orientation based upon the touch pad input;
identifying at least one finger spacing based upon the touch pad input;
identifying at least one finger width based upon the touch pad input;
identifying a plurality of finger knuckle/joint locations based upon the touch pad input; and
identifying a plurality of finger lengths based upon the touch pad input.
6. The vehicle control system of claim 1, wherein in processing the touch pad input to determine the user finger characteristics, the processing circuitry is operable to perform operations selected from the group consisting of:
identifying at least one finger orientation based upon the touch pad input;
identifying at least one finger spacing based upon the touch pad input;
identifying at least one finger width based upon the touch pad input;
identifying a plurality of finger knuckle/joint locations based upon the touch pad input; and
identifying a plurality of finger lengths based upon the touch pad input.
7. The vehicle control system of claim 1:
wherein in processing the touch pad input, the processing circuitry is further operable to determine heat transfer characteristics of a user's fingers; and
the processing circuitry is further operable to use the heat transfer characteristics to assist in identifying the user.
8. The vehicle control system of claim 1:
wherein in processing the touch pad input, the processing circuitry is further operable to determine pulse rate characteristics of a user's fingers; and
the processing circuitry is further operable to use the pulse rate characteristics to assist in identifying the user.
9. The vehicle control system of claim 1, wherein the processing circuitry is further operable to:
receive voice data from a microphone of the vehicle control system; and
use the voice data to assist in identifying the user.
10. The vehicle control system of claim 1, wherein the processing circuitry is further operable to:
receive vehicle location data from a navigation system; and
use the vehicle location data to assist in identifying the user.
11. A method for operating a vehicle control system comprising:
receiving touch pad input from at least one touch pad of a vehicle, the at least one touch pad having a plurality of touch sensitive elements, the touch pad input corresponding to a user's touch of at least some of the plurality of touch sensitive elements;
processing the touch pad input to determine user finger characteristics;
processing the user finger characteristics to identify the user via pattern recognition; and
altering at least one vehicle setting based upon the identified user.
12. The method of claim 11, wherein receiving touch pad input comprises receiving the touch pad input from at least one touch pad located on one or more of:
a steering wheel;
a gear shifter;
a throttle control;
a door handle; and
an arm rest.
13. The method of claim 11, wherein the at least one vehicle setting is selected from the group consisting of:
entertainment system settings;
navigation system settings;
suspension system settings;
seat settings;
mirror settings;
steering wheel settings;
climate control system settings;
engine control system settings;
lighting system settings; and
communication system settings.
14. The method of claim 11:
wherein:
processing the touch pad input to determine user finger characteristics is performed by touch pad circuitry; and
processing the user finger characteristics to identify the user via pattern recognition is performed by system processing circuitry; and
further comprising transmitting the user finger characteristics from the touch pad circuitry to the system processing circuitry.
15. The method of claim 14, wherein processing the touch pad input to determine user finger characteristics is selected from the group consisting of:
identifying at least one finger orientation based upon the touch pad input;
identifying at least one finger spacing based upon the touch pad input;
identifying at least one finger width based upon the touch pad input;
identifying a plurality of finger knuckle/joint locations based upon the touch pad input; and
identifying a plurality of finger lengths based upon the touch pad input.
16. The method of claim 11, wherein processing the touch pad input to determine user finger characteristics is selected from the group consisting of:
identifying at least one finger orientation based upon the touch pad input;
identifying at least one finger spacing based upon the touch pad input;
identifying at least one finger width based upon the touch pad input;
identifying a plurality of finger knuckle/joint locations based upon the touch pad input; and
identifying a plurality of finger lengths based upon the touch pad input.
17. The method of claim 11:
wherein processing the touch pad input includes determining heat transfer characteristics of a user's fingers based upon the touch pad input; and
further comprising using the heat transfer characteristics to assist in identifying the user.
18. The method of claim 11:
wherein processing the touch pad input includes determining pulse rate characteristics of a user's fingers based upon the touch pad input; and
further comprising using the pulse rate characteristics to assist in identifying the user.
19. The method of claim 11, further comprising:
receiving voice data from a microphone of the vehicle control system; and
using the voice data to assist in identifying the user.
20. The method of claim 11, further comprising:
receiving vehicle location data from a navigation system; and
using the vehicle location data to assist in identifying the user.
US12/912,637 2009-11-16 2010-10-26 Touch sensitive panel in vehicle for user identification Abandoned US20110115606A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/912,637 US20110115606A1 (en) 2009-11-16 2010-10-26 Touch sensitive panel in vehicle for user identification

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US26170209P 2009-11-16 2009-11-16
US12/912,637 US20110115606A1 (en) 2009-11-16 2010-10-26 Touch sensitive panel in vehicle for user identification

Publications (1)

Publication Number Publication Date
US20110115606A1 true US20110115606A1 (en) 2011-05-19

Family

ID=44010905

Family Applications (12)

Application Number Title Priority Date Filing Date
US12/894,011 Active 2031-03-08 US8535133B2 (en) 2009-11-16 2010-09-29 Video game with controller sensing player inappropriate activity
US12/912,651 Abandoned US20110118029A1 (en) 2009-11-16 2010-10-26 Hand-held gaming device with touch sensitive panel(s) for gaming input
US12/912,342 Abandoned US20110118024A1 (en) 2009-11-16 2010-10-26 Adjusting operation of touch sensitive panel of game controller
US12/912,472 Abandoned US20110115741A1 (en) 2009-11-16 2010-10-26 Touch sensitive panel supporting stylus input
US12/912,645 Active 2031-03-13 US8449393B2 (en) 2009-11-16 2010-10-26 Hand-held gaming device with configurable touch sensitive panel(s)
US12/912,405 Active 2032-05-21 US8614621B2 (en) 2009-11-16 2010-10-26 Remote control for multimedia system having touch sensitive panel for user ID
US12/912,422 Abandoned US20110118025A1 (en) 2009-11-16 2010-10-26 Game controller with touch pad user interface
US12/912,637 Abandoned US20110115606A1 (en) 2009-11-16 2010-10-26 Touch sensitive panel in vehicle for user identification
US12/912,595 Abandoned US20110118027A1 (en) 2009-11-16 2010-10-26 Altering video game operations based upon user id and-or grip position
US12/943,768 Active 2033-02-04 US8838060B2 (en) 2009-11-16 2010-11-10 Device communications via intra-body communication path
US12/945,556 Active 2031-09-08 US9007331B2 (en) 2009-11-16 2010-11-12 Touch sensitive panel detecting hovering finger
US13/867,316 Active US8845424B2 (en) 2009-11-16 2013-04-22 Hand-held gaming device with configurable touch sensitive panel(s)


Country Status (1)

Country Link
US (12) US8535133B2 (en)

WO2012094740A1 (en) * 2011-01-12 2012-07-19 Smart Technologies Ulc Method for supporting multiple menus and interactive input system employing same
WO2012105273A1 (en) * 2011-02-04 2012-08-09 Panasonic Corporation Electronic equipment
US9201520B2 (en) 2011-02-11 2015-12-01 Microsoft Technology Licensing, Llc Motion and context sharing for pen-based computing inputs
AU2012201543B2 (en) * 2011-03-15 2015-04-09 Aristocrat Technologies Australia Pty Limited An environmental controller, an environment control system and an environment control method
US20120287065A1 (en) * 2011-05-10 2012-11-15 Kyocera Corporation Electronic device
JP2012247911A (en) * 2011-05-26 2012-12-13 Sony Corp Information processing apparatus, information processing method, and program
US8928336B2 (en) 2011-06-09 2015-01-06 Ford Global Technologies, Llc Proximity switch having sensitivity control and method therefor
US8975903B2 (en) 2011-06-09 2015-03-10 Ford Global Technologies, Llc Proximity switch having learned sensitivity and method therefor
US9417754B2 (en) 2011-08-05 2016-08-16 P4tents1, LLC User interface system, method, and computer program product
US10004286B2 (en) 2011-08-08 2018-06-26 Ford Global Technologies, Llc Glove having conductive ink and method of interacting with proximity sensor
US9143126B2 (en) 2011-09-22 2015-09-22 Ford Global Technologies, Llc Proximity switch having lockout control for controlling movable panel
US9195351B1 (en) * 2011-09-28 2015-11-24 Amazon Technologies, Inc. Capacitive stylus
US20130104039A1 (en) * 2011-10-21 2013-04-25 Sony Ericsson Mobile Communications Ab System and Method for Operating a User Interface on an Electronic Device
EP2587347A3 (en) * 2011-10-25 2016-01-20 Broadcom Corporation Portable computing device including a three-dimensional touch screen
US8750852B2 (en) * 2011-10-27 2014-06-10 Qualcomm Incorporated Controlling access to a mobile device
US10112556B2 (en) 2011-11-03 2018-10-30 Ford Global Technologies, Llc Proximity switch having wrong touch adaptive learning and method
US8994228B2 (en) 2011-11-03 2015-03-31 Ford Global Technologies, Llc Proximity switch having wrong touch feedback
US8878438B2 (en) 2011-11-04 2014-11-04 Ford Global Technologies, Llc Lamp and proximity switch assembly and method
US9331743B2 (en) * 2011-12-08 2016-05-03 Microsoft Technology Licensing, Llc Biological entity communication channel
US20130147602A1 (en) * 2011-12-12 2013-06-13 Cisco Technology, Inc. Determination of user based on electrical measurement
US20130154958A1 (en) * 2011-12-20 2013-06-20 Microsoft Corporation Content system with secondary touch controller
US20130176270A1 (en) * 2012-01-09 2013-07-11 Broadcom Corporation Object classification for touch panels
US8902181B2 (en) 2012-02-07 2014-12-02 Microsoft Corporation Multi-touch-movement gestures for tablet computing devices
US9354748B2 (en) 2012-02-13 2016-05-31 Microsoft Technology Licensing, Llc Optical stylus interaction
US9013425B2 (en) * 2012-02-23 2015-04-21 Cypress Semiconductor Corporation Method and apparatus for data transmission via capacitance sensing device
US9075566B2 (en) 2012-03-02 2015-07-07 Microsoft Technology Licensing, Llc Flexible hinge spine
US9298236B2 (en) 2012-03-02 2016-03-29 Microsoft Technology Licensing, Llc Multi-stage power adapter configured to provide a first power level upon initial connection of the power adapter to the host device and a second power level thereafter upon notification from the host device to the power adapter
US9870066B2 (en) 2012-03-02 2018-01-16 Microsoft Technology Licensing, Llc Method of manufacturing an input device
US9158383B2 (en) 2012-03-02 2015-10-13 Microsoft Technology Licensing, Llc Force concentrator
US8873227B2 (en) 2012-03-02 2014-10-28 Microsoft Corporation Flexible hinge support layer
US9660644B2 (en) 2012-04-11 2017-05-23 Ford Global Technologies, Llc Proximity switch assembly and activation method
US9831870B2 (en) 2012-04-11 2017-11-28 Ford Global Technologies, Llc Proximity switch assembly and method of tuning same
US9944237B2 (en) 2012-04-11 2018-04-17 Ford Global Technologies, Llc Proximity switch assembly with signal drift rejection and method
US9568527B2 (en) 2012-04-11 2017-02-14 Ford Global Technologies, Llc Proximity switch assembly and activation method having virtual button mode
US9219472B2 (en) 2012-04-11 2015-12-22 Ford Global Technologies, Llc Proximity switch assembly and activation method using rate monitoring
US9197206B2 (en) 2012-04-11 2015-11-24 Ford Global Technologies, Llc Proximity switch having differential contact surface
US9531379B2 (en) 2012-04-11 2016-12-27 Ford Global Technologies, Llc Proximity switch assembly having groove between adjacent proximity sensors
US9184745B2 (en) 2012-04-11 2015-11-10 Ford Global Technologies, Llc Proximity switch assembly and method of sensing user input based on signal rate of change
US8933708B2 (en) 2012-04-11 2015-01-13 Ford Global Technologies, Llc Proximity switch assembly and activation method with exploration mode
US9065447B2 (en) 2012-04-11 2015-06-23 Ford Global Technologies, Llc Proximity switch assembly and method having adaptive time delay
US9520875B2 (en) 2012-04-11 2016-12-13 Ford Global Technologies, Llc Pliable proximity switch assembly and activation method
US9287864B2 (en) 2012-04-11 2016-03-15 Ford Global Technologies, Llc Proximity switch assembly and calibration method therefor
US9559688B2 (en) 2012-04-11 2017-01-31 Ford Global Technologies, Llc Proximity switch assembly having pliable surface and depression
WO2013158118A1 (en) 2012-04-20 2013-10-24 Empire Technology Development Llc Online game experience using multiple devices
US9201547B2 (en) 2012-04-30 2015-12-01 Apple Inc. Wide dynamic range capacitive sensing
US20130300590A1 (en) 2012-05-14 2013-11-14 Paul Henry Dietz Audio Feedback
US9136840B2 (en) 2012-05-17 2015-09-15 Ford Global Technologies, Llc Proximity switch assembly having dynamic tuned threshold
US8981602B2 (en) 2012-05-29 2015-03-17 Ford Global Technologies, Llc Proximity switch assembly having non-switch contact and method
US9337832B2 (en) 2012-06-06 2016-05-10 Ford Global Technologies, Llc Proximity switch and method of adjusting sensitivity therefor
US9684382B2 (en) 2012-06-13 2017-06-20 Microsoft Technology Licensing, Llc Input device configuration having capacitive and pressure sensors
US9459160B2 (en) 2012-06-13 2016-10-04 Microsoft Technology Licensing, Llc Input device sensor configuration
JP5923394B2 (en) * 2012-06-20 2016-05-24 NTT Docomo, Inc. Recognition device, recognition method, and recognition system
US9641172B2 (en) 2012-06-27 2017-05-02 Ford Global Technologies, Llc Proximity switch assembly having varying size electrode fingers
US9652090B2 (en) 2012-07-27 2017-05-16 Apple Inc. Device for digital communication through capacitive coupling
US9557845B2 (en) 2012-07-27 2017-01-31 Apple Inc. Input device for and method of communication with capacitive devices through frequency variation
CN104756170A (en) * 2012-08-07 2015-07-01 韦伯图纳公司 Multi-media ad targeting and content recommendation with viewer identity detection system
US8964379B2 (en) 2012-08-20 2015-02-24 Microsoft Corporation Switchable magnetic lock
US8922340B2 (en) 2012-09-11 2014-12-30 Ford Global Technologies, Llc Proximity switch based door latch release
US8796575B2 (en) 2012-10-31 2014-08-05 Ford Global Technologies, Llc Proximity switch assembly having ground layer
US10817096B2 (en) * 2014-02-06 2020-10-27 Apple Inc. Force sensor incorporated into display
US20140168140A1 (en) * 2012-12-18 2014-06-19 Logitech Europe S.A. Method and system for discriminating stylus and touch interactions
US9244576B1 (en) 2012-12-21 2016-01-26 Cypress Semiconductor Corporation User interface with child-lock feature
US20140201205A1 (en) * 2013-01-14 2014-07-17 Disney Enterprises, Inc. Customized Content from User Data
US9176538B2 (en) 2013-02-05 2015-11-03 Microsoft Technology Licensing, Llc Input device configurations
JP2016507119A (en) 2013-02-08 2016-03-07 Apple Inc. Force determination based on capacitive sensing
US10578499B2 (en) 2013-02-17 2020-03-03 Microsoft Technology Licensing, Llc Piezo-actuated virtual buttons for touch surfaces
US9632594B2 (en) * 2013-03-11 2017-04-25 Barnes & Noble College Booksellers, Llc Stylus sensitive device with stylus idle functionality
US9785259B2 (en) * 2013-03-11 2017-10-10 Barnes & Noble College Booksellers, Llc Stylus-based slider functionality for UI control of computing device
US9766723B2 (en) 2013-03-11 2017-09-19 Barnes & Noble College Booksellers, Llc Stylus sensitive device with hover over stylus control functionality
US9946365B2 (en) 2013-03-11 2018-04-17 Barnes & Noble College Booksellers, Llc Stylus-based pressure-sensitive area for UI control of computing device
US9311204B2 (en) 2013-03-13 2016-04-12 Ford Global Technologies, Llc Proximity interface development system having replicator and method
US9143715B2 (en) 2013-03-14 2015-09-22 Intel Corporation Remote control with capacitive touchpad
SG10201606730SA (en) * 2013-03-15 2016-10-28 Tactual Labs Co Fast multi-touch noise reduction
JP5697113B2 (en) * 2013-04-26 2015-04-08 Panasonic Intellectual Property Corporation of America Electronics
US9440143B2 (en) 2013-07-02 2016-09-13 Kabam, Inc. System and method for determining in-game capabilities based on device information
FR3008510B1 (en) 2013-07-12 2017-06-23 Blinksight DEVICE AND METHOD FOR CONTROLLING ACCESS TO AT LEAST ONE MACHINE
US9671889B1 (en) 2013-07-25 2017-06-06 Apple Inc. Input member with capacitive sensor
US10845901B2 (en) 2013-07-31 2020-11-24 Apple Inc. Touch controller architecture
US9415306B1 (en) * 2013-08-12 2016-08-16 Kabam, Inc. Clients communicate input technique to server
KR20150020865A (en) * 2013-08-19 2015-02-27 Samsung Electronics Co., Ltd. Method and apparatus for processing an input of electronic device
US9117100B2 (en) 2013-09-11 2015-08-25 Qualcomm Incorporated Dynamic learning for object tracking
US10025489B2 (en) 2013-09-16 2018-07-17 Microsoft Technology Licensing, Llc Detecting primary hover point for multi-hover point device
US9686581B2 (en) 2013-11-07 2017-06-20 Cisco Technology, Inc. Second-screen TV bridge
US9623322B1 (en) 2013-11-19 2017-04-18 Kabam, Inc. System and method of displaying device information for party formation
US9933879B2 (en) 2013-11-25 2018-04-03 Apple Inc. Reconfigurable circuit topology for both self-capacitance and mutual capacitance sensing
US10139959B2 (en) * 2013-11-26 2018-11-27 Apple Inc. Self-calibration of force sensors and inertial compensation
US9295916B1 (en) 2013-12-16 2016-03-29 Kabam, Inc. System and method for providing recommendations for in-game events
US20150177945A1 (en) * 2013-12-23 2015-06-25 Uttam K. Sengupta Adapting interface based on usage context
US9227141B2 (en) 2013-12-31 2016-01-05 Microsoft Technology Licensing, Llc Touch screen game controller
US9448631B2 (en) 2013-12-31 2016-09-20 Microsoft Technology Licensing, Llc Input device haptics and pressure sensing
JP6349838B2 (en) * 2014-01-21 2018-07-04 Seiko Epson Corporation POSITION DETECTION DEVICE, POSITION DETECTION SYSTEM, AND POSITION DETECTION DEVICE CONTROL METHOD
JP6276867B2 (en) 2014-02-12 2018-02-07 Apple Inc. Force determination using sheet sensor and capacitive array
US20150242024A1 (en) * 2014-02-21 2015-08-27 Polar Electro Oy Radio Frequency Sensor
US10120420B2 (en) 2014-03-21 2018-11-06 Microsoft Technology Licensing, Llc Lockable display and techniques enabling use of lockable displays
US10222935B2 (en) 2014-04-23 2019-03-05 Cisco Technology Inc. Treemap-type user interface
US9870083B2 (en) 2014-06-12 2018-01-16 Microsoft Technology Licensing, Llc Multi-device multi-user sensor correlation for pen and computing device interaction
US9727161B2 (en) 2014-06-12 2017-08-08 Microsoft Technology Licensing, Llc Sensor correlation for pen and touch-sensitive computing device interaction
US9400880B2 (en) 2014-06-17 2016-07-26 Qualcomm Incorporated Method and apparatus for biometric-based security using capacitive profiles
US10712116B1 (en) 2014-07-14 2020-07-14 Triggermaster, Llc Firearm body motion detection training system
US10324733B2 (en) 2014-07-30 2019-06-18 Microsoft Technology Licensing, Llc Shutdown notifications
US20160034051A1 (en) * 2014-07-31 2016-02-04 Cisco Technology, Inc. Audio-visual content navigation with movement of computing device
US20160034171A1 (en) * 2014-08-04 2016-02-04 Flextronics Ap, Llc Multi-touch gesture recognition using multiple single-touch touch pads
US9424048B2 (en) 2014-09-15 2016-08-23 Microsoft Technology Licensing, Llc Inductive peripheral retention device
US9946371B2 (en) * 2014-10-16 2018-04-17 Qualcomm Incorporated System and method for using touch orientation to distinguish between users of a touch panel
US10038443B2 (en) 2014-10-20 2018-07-31 Ford Global Technologies, Llc Directional proximity switch assembly
KR102380228B1 (en) 2014-11-14 2022-03-30 Samsung Electronics Co., Ltd. Method for controlling device and the device
US10065111B1 (en) * 2014-12-16 2018-09-04 Oculus Vr, Llc Mapping user interactions with a controller to a hand position
US9763088B2 (en) * 2014-12-31 2017-09-12 Ruckus Wireless, Inc. Mesh network with personal pre-shared keys
US9654103B2 (en) 2015-03-18 2017-05-16 Ford Global Technologies, Llc Proximity switch assembly having haptic feedback and method
US9548733B2 (en) 2015-05-20 2017-01-17 Ford Global Technologies, Llc Proximity sensor assembly having interleaved electrode configuration
WO2016191392A1 (en) * 2015-05-22 2016-12-01 Tactual Labs Co. Transmitting and receiving system and method for bidirectional orthogonal signaling sensors
US9898091B2 (en) 2015-06-03 2018-02-20 Oculus Vr, Llc Virtual reality system with head-mounted display, camera and hand-held controllers
US10416799B2 (en) 2015-06-03 2019-09-17 Microsoft Technology Licensing, Llc Force sensing and inadvertent input control of an input device
US10222889B2 (en) 2015-06-03 2019-03-05 Microsoft Technology Licensing, Llc Force inputs and cursor control
US9870052B2 (en) * 2015-06-11 2018-01-16 Oculus Vr, Llc Hand-held controller with pressure-sensing switch for virtual-reality systems
US9999833B2 (en) * 2015-06-11 2018-06-19 Oculus Vr, Llc Hand-held controllers with capacitive touch sensors for virtual-reality systems
GB2539705B (en) 2015-06-25 2017-10-25 Aimbrain Solutions Ltd Conditional behavioural biometrics
US10007421B2 (en) * 2015-08-03 2018-06-26 Lenovo (Singapore) Pte. Ltd. Natural handwriting detection on a touch surface
US9660968B2 (en) 2015-09-25 2017-05-23 Intel Corporation Methods and apparatus for conveying a nonce via a human body communication conduit
US10325134B2 (en) * 2015-11-13 2019-06-18 Fingerprint Cards Ab Method and system for calibration of an optical fingerprint sensing device
US20170140233A1 (en) * 2015-11-13 2017-05-18 Fingerprint Cards Ab Method and system for calibration of a fingerprint sensing device
US9891773B2 (en) 2015-12-17 2018-02-13 Synaptics Incorporated Detecting hover distance with a capacitive sensor
US20170185980A1 (en) * 2015-12-24 2017-06-29 Capital One Services, Llc Personalized automatic teller machine
US10061385B2 (en) 2016-01-22 2018-08-28 Microsoft Technology Licensing, Llc Haptic feedback for a touch input device
US10025492B2 (en) 2016-02-08 2018-07-17 Microsoft Technology Licensing, Llc Pointing detection
AU2017224830A1 (en) * 2016-02-25 2018-10-11 Box Dark Industries Pty. Ltd. Articulated gaming controller
KR102559030B1 (en) 2016-03-18 2023-07-25 Samsung Electronics Co., Ltd. Electronic device including a touch panel and method for controlling the same
US10007343B2 (en) 2016-03-31 2018-06-26 Apple Inc. Force sensor in an input device
GB2552032B (en) 2016-07-08 2019-05-22 Aimbrain Solutions Ltd Step-up authentication
US10086267B2 (en) 2016-08-12 2018-10-02 Microsoft Technology Licensing, Llc Physical gesture input configuration for interactive software and video games
KR102425576B1 (en) 2016-09-13 2022-07-26 Samsung Electronics Co., Ltd. Wearable device and the operation method thereof
US10579784B2 (en) 2016-11-02 2020-03-03 Biocatch Ltd. System, device, and method of secure utilization of fingerprints for user authentication
US10379806B2 (en) 2016-11-04 2019-08-13 International Business Machines Corporation Dynamic selection for touch sensor
US10444927B2 (en) 2016-11-04 2019-10-15 Microsoft Technology Licensing, Llc Stylus hover and position communication protocol
US10372520B2 (en) 2016-11-22 2019-08-06 Cisco Technology, Inc. Graphical user interface for visualizing a plurality of issues with an infrastructure
US10739943B2 (en) 2016-12-13 2020-08-11 Cisco Technology, Inc. Ordered list user interface
US10397262B2 (en) 2017-07-20 2019-08-27 Biocatch Ltd. Device, system, and method of detecting overlay malware
CN107485854B (en) * 2017-08-03 2022-03-01 Huizhou TCL Mobile Communication Co., Ltd. Gamepad control method, storage medium and gamepad
JP6719433B2 (en) * 2017-09-22 2020-07-08 Hitachi, Ltd. Moving body control system and moving body control method
US10437365B2 (en) 2017-10-11 2019-10-08 Pixart Imaging Inc. Driver integrated circuit of touch panel and associated driving method
US10773153B2 (en) * 2017-11-02 2020-09-15 Michael Callahan Method and system for a personal interaction game platform
US10599259B2 (en) * 2017-11-20 2020-03-24 Google Llc Virtual reality / augmented reality handheld controller sensing
US10862867B2 (en) 2018-04-01 2020-12-08 Cisco Technology, Inc. Intelligent graphical user interface
CN108845613B (en) * 2018-04-09 2021-07-09 Guangzhou Shiyuan Electronic Technology Co., Ltd. Interactive intelligent tablet computer and data processing method and device thereof
CN108777854A (en) * 2018-05-25 2018-11-09 Bestechnic (Shanghai) Co., Ltd. System and method for a wireless headset to achieve high-quality stereo transmission
US11056923B2 (en) * 2018-06-05 2021-07-06 Avago Technologies International Sales Pte. Limited Wireless charging relay and method
US10866683B2 (en) 2018-08-27 2020-12-15 Apple Inc. Force or touch sensing on a mobile device using capacitive or pressure sensing
US10814222B2 (en) 2018-09-21 2020-10-27 Logitech Europe S.A. Gaming controller with adaptable input configurations
CN109350962A (en) * 2018-10-08 2019-02-19 Interface Technology (Chengdu) Co., Ltd. Touch device
US11490491B2 (en) * 2018-12-07 2022-11-01 Sony Interactive Entertainment Inc. Entertainment apparatus, light emission controlling apparatus, operation device, light emission controlling method and program
US10635202B1 (en) * 2018-12-18 2020-04-28 Valve Corporation Dynamic sensor assignment
US10905946B2 (en) 2019-02-28 2021-02-02 Valve Corporation Continuous controller calibration
US20200285291A1 (en) * 2019-03-06 2020-09-10 Sony Interactive Entertainment Inc. Low battery switchover
US11281373B2 (en) * 2019-05-07 2022-03-22 Yifang Liu Multi-perspective input for computing devices
US20220238566A1 (en) * 2019-06-05 2022-07-28 Touch Biometrix Limited Apparatus and method
US11216065B2 (en) * 2019-09-26 2022-01-04 Lenovo (Singapore) Pte. Ltd. Input control display based on eye gaze
US11504610B2 (en) * 2020-02-14 2022-11-22 Valve Corporation Dynamically enabling or disabling controls of a controller
CN111462557B (en) * 2020-04-09 2022-03-01 Second Affiliated Hospital of Army Medical University of Chinese People's Liberation Army Game-based teaching application system for cardiovascular disease clinical cases
US11606353B2 (en) 2021-07-22 2023-03-14 Biocatch Ltd. System, device, and method of generating and utilizing one-time passwords
US20230084581A1 (en) * 2021-09-16 2023-03-16 Voyetra Turtle Beach Inc. Video game controller with a graphical user interface

Citations (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4525859A (en) * 1982-09-03 1985-06-25 Bowles Romald E Pattern recognition system
US5812252A (en) * 1995-01-31 1998-09-22 Arete Associates Fingerprint--Acquisition apparatus for access control; personal weapon and other systems controlled thereby
US5812067A (en) * 1994-05-10 1998-09-22 Volkswagen Ag System for recognizing authorization to use a vehicle
US5982913A (en) * 1997-03-25 1999-11-09 The United States Of America As Represented By The National Security Agency Method of verification using a subset of claimant's fingerprint
US6100811A (en) * 1997-12-22 2000-08-08 Trw Inc. Fingerprint actuation of customized vehicle features
US6225890B1 (en) * 1998-03-20 2001-05-01 Trimble Navigation Limited Vehicle use control
US6351695B1 (en) * 1999-04-23 2002-02-26 Ronald Weiss Verified common carrier truck operation log
US6603462B2 (en) * 2001-03-21 2003-08-05 Multidigit, Inc. System and method for selecting functions based on a finger feature such as a fingerprint
US6628810B1 (en) * 1997-03-13 2003-09-30 Koninklijke Philips Electronics N.V. Hand biometrics sensing device
US20040017934A1 (en) * 2002-07-29 2004-01-29 Kocher Robert William Method and apparatus for contactless hand recognition
US6940391B1 (en) * 2000-03-21 2005-09-06 Mitsubishi Denki Kabushiki Kaisha Vehicle key system
US20050271279A1 (en) * 2004-05-14 2005-12-08 Honda Motor Co., Ltd. Sign based human-machine interaction
US6990219B2 (en) * 2000-12-15 2006-01-24 Nippon Telegraph And Telephone Corporation Image capturing method and apparatus and fingerprint collation method and apparatus
US20070055888A1 (en) * 2005-03-31 2007-03-08 Miller Brian S Biometric control of equipment
US7268665B2 (en) * 2003-11-18 2007-09-11 Kabushiki Kaisha Tokai Rika Denki Seisakusho Vehicle anti-theft apparatus
US7280678B2 (en) * 2003-02-28 2007-10-09 Avago Technologies General Ip Pte Ltd Apparatus and method for detecting pupils
US20070299577A1 (en) * 2006-05-24 2007-12-27 Denso Corporation User assistance system for vehicle
US20080004113A1 (en) * 2006-06-30 2008-01-03 Jason Avery Enhanced controller with modifiable functionality
US20080069412A1 (en) * 2006-09-15 2008-03-20 Champagne Katrina S Contoured biometric sensor
US20080252412A1 (en) * 2005-07-11 2008-10-16 Volvo Technology Corporation Method for Performing Driver Identity Verification
US20090010502A1 (en) * 2005-09-30 2009-01-08 Daimler Ag Vehicle Occupant Protection System
US20090073112A1 (en) * 2007-09-14 2009-03-19 International Business Machines Corporation Method and system for dynamically configurable tactile feedback for navigational support
US7602947B1 (en) * 1996-05-15 2009-10-13 Lemelson Jerome H Facial-recognition vehicle security system
US20090289780A1 (en) * 2008-05-21 2009-11-26 Danette Sue Tenorio-Fox SenCora print system
US7660442B2 (en) * 2006-09-01 2010-02-09 Handshot, Llc Method and system for capturing fingerprints, palm prints and hand geometry
US20100039224A1 (en) * 2008-05-26 2010-02-18 Okude Kazuhiro Biometrics information matching apparatus, biometrics information matching system, biometrics information matching method, person authentication apparatus, and person authentication method
US20100234074A1 (en) * 2006-10-02 2010-09-16 Nokia Corporation Keypad emulation
US20100248822A1 (en) * 2009-03-27 2010-09-30 Microsoft Corporation Personalization using a hand-pressure signature
US7956890B2 (en) * 2004-09-17 2011-06-07 Proximex Corporation Adaptive multi-modal integrated biometric identification detection and surveillance systems
US8131026B2 (en) * 2004-04-16 2012-03-06 Validity Sensors, Inc. Method and apparatus for fingerprint image reconstruction
US8175346B2 (en) * 2006-07-19 2012-05-08 Lumidigm, Inc. Whole-hand multispectral biometric imaging
US8285009B2 (en) * 2004-02-12 2012-10-09 Nec Infrontia Corporation Fingerprint input apparatus

Family Cites Families (71)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4857916A (en) * 1987-02-26 1989-08-15 Bellin Robert W System and method for identifying an individual utilizing grasping pressures
US6343991B1 (en) * 1997-10-01 2002-02-05 Brad A. Armstrong Game control with analog pressure sensor
JP2845175B2 (en) * 1995-08-25 1999-01-13 Optec Co., Ltd. Game console controller
US5896125A (en) * 1995-11-06 1999-04-20 Niedzwiecki; Richard H. Configurable keyboard to personal computer video game controller adapter
US6408087B1 (en) * 1998-01-13 2002-06-18 Stmicroelectronics, Inc. Capacitive semiconductor user input device
US7663607B2 (en) * 2004-05-06 2010-02-16 Apple Inc. Multipoint touchscreen
EP2256605B1 (en) * 1998-01-26 2017-12-06 Apple Inc. Method and apparatus for integrating manual input
JP3171575B2 (en) * 1998-07-31 2001-05-28 Sony Computer Entertainment Inc. Entertainment system and program supply medium
US6028950A (en) * 1999-02-10 2000-02-22 The National Registry, Inc. Fingerprint controlled set-top box
US6369706B1 (en) * 1999-05-10 2002-04-09 Gateway, Inc. System and method for protecting a digital information appliance from environmental influences
US7047419B2 (en) * 1999-09-17 2006-05-16 Pen-One Inc. Data security system
IL134527A (en) * 2000-02-14 2011-08-31 Bioguard Components And Technology Ltd Biometrics interface
US6565441B1 (en) * 2000-04-07 2003-05-20 Arista Enterprises Inc. Dedicated wireless digital video disc (DVD) controller for video game consoles
US20060250213A1 (en) * 2000-07-28 2006-11-09 Cain George R Jr Biometric data controlled configuration
US6819219B1 (en) * 2000-10-13 2004-11-16 International Business Machines Corporation Method for biometric-based authentication in wireless communication for access control
TW507158B (en) * 2001-01-05 2002-10-21 Darfon Electronics Corp Detecting device and method of mouse touch pad
US8939831B2 (en) * 2001-03-08 2015-01-27 Brian M. Dugan Systems and methods for improving fitness equipment and exercise
US6563940B2 (en) * 2001-05-16 2003-05-13 New Jersey Institute Of Technology Unauthorized user prevention device and method
US6902481B2 (en) * 2001-09-28 2005-06-07 Igt Decoupling of the graphical presentation of a game from the presentation logic
JP2003140823A (en) * 2001-11-08 2003-05-16 Sony Computer Entertainment Inc Information input device and information processing program
US7352356B2 (en) * 2001-12-13 2008-04-01 United States Of America Refreshable scanning tactile graphic display for localized sensory stimulation
US20050084138A1 (en) * 2002-02-13 2005-04-21 Inkster D R. System and method for identifying a person
US20030220142A1 (en) * 2002-05-21 2003-11-27 Mark Siegel Video Game controller with display screen
US20040021681A1 (en) * 2002-07-30 2004-02-05 Liao Chin-Hua Arthur Dual-touch-screen mobile computer
US7180508B2 (en) * 2002-09-17 2007-02-20 Tyco Electronics Corporation Dynamic corrections for a non-linear touchscreen
US7050798B2 (en) * 2002-12-16 2006-05-23 Microsoft Corporation Input device with user-balanced performance and power consumption
US8170945B2 (en) * 2004-01-15 2012-05-01 Bgc Partners, Inc. System and method for providing security to a game controller device for electronic trading
US7180401B2 (en) * 2004-12-03 2007-02-20 Kulite Semiconductor Products, Inc. Personal identification apparatus using measured tactile pressure
US20060244733A1 (en) * 2005-04-28 2006-11-02 Geaghan Bernard O Touch sensitive device and method using pre-touch information
KR20060131542A (en) * 2005-06-16 2006-12-20 LG Electronics Inc. Apparatus and method for power saving of a touch screen
US20060284853A1 (en) * 2005-06-16 2006-12-21 Xm Satellite Radio, Inc. Context sensitive data input using finger or fingerprint recognition
KR100668341B1 (en) * 2005-06-29 2007-01-12 Samsung Electronics Co., Ltd. Method and apparatus for function selection by user's hand grip shape
US7942745B2 (en) * 2005-08-22 2011-05-17 Nintendo Co., Ltd. Game operating device
US7649522B2 (en) * 2005-10-11 2010-01-19 Fish & Richardson P.C. Human interface input acceleration system
US7868874B2 (en) * 2005-11-15 2011-01-11 Synaptics Incorporated Methods and systems for detecting a position-based attribute of an object using digital codes
US20070111796A1 (en) * 2005-11-16 2007-05-17 Microsoft Corporation Association of peripherals communicatively attached to a console device
US10048860B2 (en) * 2006-04-06 2018-08-14 Google Technology Holdings LLC Method and apparatus for user interface adaptation
KR100827234B1 (en) * 2006-05-30 2008-05-07 Samsung Electronics Co., Ltd. Fault-tolerant method and apparatus for touch sensor
US20070299670A1 (en) * 2006-06-27 2007-12-27 Sbc Knowledge Ventures, Lp Biometric and speech recognition system and method
US9069417B2 (en) * 2006-07-12 2015-06-30 N-Trig Ltd. Hover and touch detection for digitizer
JP5294442B2 (en) * 2006-09-13 2013-09-18 Nintendo Co., Ltd. GAME DEVICE AND GAME PROGRAM
US8232970B2 (en) * 2007-01-03 2012-07-31 Apple Inc. Scan sequence generator
US7848825B2 (en) * 2007-01-03 2010-12-07 Apple Inc. Master/slave mode for sensor processing devices
US8094128B2 (en) * 2007-01-03 2012-01-10 Apple Inc. Channel scan logic
US20090017910A1 (en) * 2007-06-22 2009-01-15 Broadcom Corporation Position and motion tracking of an object
US20080231604A1 (en) * 2007-03-22 2008-09-25 Cypress Semiconductor Corp. Method for extending the life of touch screens
JP5285234B2 (en) * 2007-04-24 2013-09-11 Nintendo Co., Ltd. Game system, information processing system
US8027518B2 (en) * 2007-06-25 2011-09-27 Microsoft Corporation Automatic configuration of devices based on biometric data
WO2009006557A1 (en) * 2007-07-03 2009-01-08 Cypress Semiconductor Corporation Method for improving scan time and sensitivity in touch sensitive user interface device
US8031175B2 (en) * 2008-04-21 2011-10-04 Panasonic Corporation Touch sensitive remote control system that detects hand size characteristics of user and adapts mapping to screen display
US8299889B2 (en) * 2007-12-07 2012-10-30 Cisco Technology, Inc. Home entertainment system providing presence and mobility via remote control authentication
TW200930015A (en) * 2007-12-26 2009-07-01 Htc Corp A user interface of portable device and operating method thereof
US20090176565A1 (en) * 2008-01-07 2009-07-09 Bally Gaming, Inc. Gaming devices for biometrically identifying a player
WO2009089050A1 (en) * 2008-01-08 2009-07-16 Cirque Corporation Game controller touchpad providing touch stick functionality and relative and absolute position input
US8195220B2 (en) * 2008-02-01 2012-06-05 Lg Electronics Inc. User interface for mobile devices
EP2113828B1 (en) * 2008-04-30 2017-10-11 InnoLux Corporation Display device with touch screen
US20090284532A1 (en) * 2008-05-16 2009-11-19 Apple Inc. Cursor motion blurring
US8355003B2 (en) * 2008-06-13 2013-01-15 Microsoft Corporation Controller lighting activation by proximity and motion
KR20100006219A (en) * 2008-07-09 2010-01-19 Samsung Electronics Co., Ltd. Method and apparatus for user interface
US20100062833A1 (en) * 2008-09-10 2010-03-11 Igt Portable Gaming Machine Emergency Shut Down Circuitry
JP2010067117A (en) * 2008-09-12 2010-03-25 Mitsubishi Electric Corp Touch panel device
US8116453B2 (en) * 2008-12-29 2012-02-14 Bank Of America Corporation Gaming console-specific user authentication
US8217913B2 (en) * 2009-02-02 2012-07-10 Apple Inc. Integrated touch screen
US8264455B2 (en) * 2009-02-03 2012-09-11 Microsoft Corporation Mapping of physical controls for surface computing
US8154529B2 (en) * 2009-05-14 2012-04-10 Atmel Corporation Two-dimensional touch sensors
US20110009195A1 (en) * 2009-07-08 2011-01-13 Gunjan Porwal Configurable representation of a virtual button on a game controller touch screen
US20110028194A1 (en) * 2009-07-31 2011-02-03 Razer (Asia-Pacific) Pte Ltd System and method for unified-context mapping of physical input device controls to application program actions
US8334849B2 (en) * 2009-08-25 2012-12-18 Pixart Imaging Inc. Firmware methods and devices for a mutual capacitance touch sensing device
US8264471B2 (en) * 2009-09-22 2012-09-11 Sony Mobile Communications Ab Miniature character input mechanism
US8773366B2 (en) * 2009-11-16 2014-07-08 3M Innovative Properties Company Touch sensitive device using threshold voltage signal
US20120052929A1 (en) * 2010-08-31 2012-03-01 Khamvong Thammasouk Interactive phone case

Patent Citations (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4525859A (en) * 1982-09-03 1985-06-25 Bowles Romald E Pattern recognition system
US5812067A (en) * 1994-05-10 1998-09-22 Volkswagen Ag System for recognizing authorization to use a vehicle
US5812252A (en) * 1995-01-31 1998-09-22 Arete Associates Fingerprint--Acquisition apparatus for access control; personal weapon and other systems controlled thereby
US7602947B1 (en) * 1996-05-15 2009-10-13 Lemelson Jerome H Facial-recognition vehicle security system
US6628810B1 (en) * 1997-03-13 2003-09-30 Koninklijke Philips Electronics N.V. Hand biometrics sensing device
US5982913A (en) * 1997-03-25 1999-11-09 The United States Of America As Represented By The National Security Agency Method of verification using a subset of claimant's fingerprint
US6100811A (en) * 1997-12-22 2000-08-08 Trw Inc. Fingerprint actuation of customized vehicle features
US6225890B1 (en) * 1998-03-20 2001-05-01 Trimble Navigation Limited Vehicle use control
US6351695B1 (en) * 1999-04-23 2002-02-26 Ronald Weiss Verified common carrier truck operation log
US6940391B1 (en) * 2000-03-21 2005-09-06 Mitsubishi Denki Kabushiki Kaisha Vehicle key system
US6990219B2 (en) * 2000-12-15 2006-01-24 Nippon Telegraph And Telephone Corporation Image capturing method and apparatus and fingerprint collation method and apparatus
US6603462B2 (en) * 2001-03-21 2003-08-05 Multidigit, Inc. System and method for selecting functions based on a finger feature such as a fingerprint
US20040017934A1 (en) * 2002-07-29 2004-01-29 Kocher Robert William Method and apparatus for contactless hand recognition
US7280678B2 (en) * 2003-02-28 2007-10-09 Avago Technologies General IP Pte Ltd Apparatus and method for detecting pupils
US7268665B2 (en) * 2003-11-18 2007-09-11 Kabushiki Kaisha Tokai Rika Denki Seisakusho Vehicle anti-theft apparatus
US8285009B2 (en) * 2004-02-12 2012-10-09 Nec Infrontia Corporation Fingerprint input apparatus
US8131026B2 (en) * 2004-04-16 2012-03-06 Validity Sensors, Inc. Method and apparatus for fingerprint image reconstruction
US20050271279A1 (en) * 2004-05-14 2005-12-08 Honda Motor Co., Ltd. Sign based human-machine interaction
US7956890B2 (en) * 2004-09-17 2011-06-07 Proximex Corporation Adaptive multi-modal integrated biometric identification detection and surveillance systems
US7809954B2 (en) * 2005-03-31 2010-10-05 Brian Scott Miller Biometric control of equipment
US20070055888A1 (en) * 2005-03-31 2007-03-08 Miller Brian S Biometric control of equipment
US8344849B2 (en) * 2005-07-11 2013-01-01 Volvo Technology Corporation Method for performing driver identity verification
US20080252412A1 (en) * 2005-07-11 2008-10-16 Volvo Technology Corporation Method for Performing Driver Identity Verification
US20090010502A1 (en) * 2005-09-30 2009-01-08 Daimler Ag Vehicle Occupant Protection System
US20070299577A1 (en) * 2006-05-24 2007-12-27 Denso Corporation User assistance system for vehicle
US20080004113A1 (en) * 2006-06-30 2008-01-03 Jason Avery Enhanced controller with modifiable functionality
US8175346B2 (en) * 2006-07-19 2012-05-08 Lumidigm, Inc. Whole-hand multispectral biometric imaging
US7660442B2 (en) * 2006-09-01 2010-02-09 Handshot, Llc Method and system for capturing fingerprints, palm prints and hand geometry
US20080069412A1 (en) * 2006-09-15 2008-03-20 Champagne Katrina S Contoured biometric sensor
US20100234074A1 (en) * 2006-10-02 2010-09-16 Nokia Corporation Keypad emulation
US20090073112A1 (en) * 2007-09-14 2009-03-19 International Business Machines Corporation Method and system for dynamically configurable tactile feedback for navigational support
US20090289780A1 (en) * 2008-05-21 2009-11-26 Danette Sue Tenorio-Fox SenCora print system
US20100039224A1 (en) * 2008-05-26 2010-02-18 Okude Kazuhiro Biometrics information matching apparatus, biometrics information matching system, biometrics information matching method, person authentication apparatus, and person authentication method
US20100248822A1 (en) * 2009-03-27 2010-09-30 Microsoft Corporation Personalization using a hand-pressure signature

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120068952A1 (en) * 2010-05-25 2012-03-22 Motorola Mobility, Inc. User computer device with temperature sensing capabilities and method of operating same
US9103732B2 (en) 2010-05-25 2015-08-11 Google Technology Holdings LLC User computer device with temperature sensing capabilities and method of operating same
US9159221B1 (en) * 2012-05-25 2015-10-13 George Stantchev Steering wheel with remote control capabilities
US9426274B2 (en) * 2012-09-27 2016-08-23 Intel Corporation Device, method, and system for portable configuration of vehicle controls
US20140088793A1 (en) * 2012-09-27 2014-03-27 Dennis M. Morgan Device, method, and system for portable configuration of vehicle controls
US9696839B1 (en) * 2013-03-15 2017-07-04 Adac Plastics, Inc. Vehicle door control
US9602624B2 (en) 2013-09-30 2017-03-21 AT&T Intellectual Property I, L.P. Facilitating content management based on profiles of members in an environment
US9819764B2 (en) 2013-09-30 2017-11-14 At&T Intellectual Property I, L.P. Facilitating content management based on profiles of members in an environment
WO2015069311A1 (en) * 2013-11-08 2015-05-14 Seyamak Vaziri Capacitive track pad transmission shift knob
US20150199941A1 (en) * 2014-01-15 2015-07-16 Nokia Corporation 3d touch sensor reader
US9753562B2 (en) 2014-01-15 2017-09-05 Nokia Technologies Oy Dynamic threshold for local connectivity setup
GB2528086A (en) * 2014-07-09 2016-01-13 Jaguar Land Rover Ltd Identification method and apparatus
EP3168778A1 (en) * 2015-11-13 2017-05-17 Thunder Power New Energy Vehicle Development Company Limited Vehicle fingerprint bookmark
US9937797B2 (en) 2015-11-13 2018-04-10 Thunder Power New Energy Vehicle Development Company Limited Vehicle fingerprint bookmark
US9944241B2 (en) 2015-11-13 2018-04-17 Thunder Power New Energy Vehicle Development Company Limited Vehicle fingerprint bookmark
US10061960B2 (en) 2015-11-13 2018-08-28 Thunder Power New Energy Vehicle Development Company Limited Vehicle fingerprint bookmark
US10194019B1 (en) * 2017-12-01 2019-01-29 Qualcomm Incorporated Methods and systems for initiating a phone call from a wireless communication device
WO2020111308A1 (en) * 2018-11-28 2020-06-04 Korea Electronics Technology Institute Intuitive interaction method and system for augmented reality display for vehicle

Also Published As

Publication number Publication date
US8838060B2 (en) 2014-09-16
US20110118023A1 (en) 2011-05-19
US20110118028A1 (en) 2011-05-19
US20110118030A1 (en) 2011-05-19
US20110118027A1 (en) 2011-05-19
US8449393B2 (en) 2013-05-28
US20110115741A1 (en) 2011-05-19
US20110115604A1 (en) 2011-05-19
US20130237322A1 (en) 2013-09-12
US20110118029A1 (en) 2011-05-19
US20110118025A1 (en) 2011-05-19
US8614621B2 (en) 2013-12-24
US20110118024A1 (en) 2011-05-19
US9007331B2 (en) 2015-04-14
US8535133B2 (en) 2013-09-17
US8845424B2 (en) 2014-09-30
US20110115742A1 (en) 2011-05-19

Similar Documents

Publication Publication Date Title
US20110115606A1 (en) Touch sensitive panel in vehicle for user identification
EP3246801B1 (en) Combined fingerprint recognition touch screen device, method of driving the touch screen device, and electronic device including the touch screen device
EP3340879B1 (en) Vehicle security accessory and methods of identity authentication
US9285900B2 (en) Touch pen using delay device and touch input method thereof and touch input system and method thereof
US8195106B2 (en) Vehicle control and communication via device in proximity
US5054112A (en) Electronic data collection system
US8451810B2 (en) Wireless LAN system, a terminal and a recording medium readable by a computer
US20070224939A1 (en) Vehicle control and communication via device in proximity
US20070281735A1 (en) Remote communication device for wireless system
US20190050124A1 (en) Vehicle based trainable transceiver and authentication of user
CN1503956A (en) Mobile communication terminal
CN100373846C (en) User identifying method for remote controller and a remote controller
WO2010045554A1 (en) Vehicle biometric systems and methods
CN105264538A (en) Authentication for recognition systems
EP2917901B1 (en) Frequency shifting method for universal transmitters
US20130166106A1 (en) Portable information processing apparatus, host apparatus, and vehicle control method
JP2007162246A (en) Keyless door lock device
JP5495582B2 (en) Wireless communication apparatus, wireless communication system, control method, and program
US10629010B2 (en) Identification system and method for remote transmitter operation
KR102592678B1 (en) Method for controlling antenna characteristics and an electronic device thereof
JP2020530600A (en) Methods and terminals for controlling shared devices
KR101160452B1 (en) A system and method for wireless controlling a system using multiple finger scan
JP2006160189A (en) Tire information management system
US11908225B2 (en) Ultrasonic sensing
US20230136324A1 (en) Electronic device and method of presenting destination for item delivery

Legal Events

Date Code Title Description
AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FU, QIANG;KARAOGUZ, JEYHAN;KWAN, TOM W.;SIGNING DATES FROM 20100927 TO 20101022;REEL/FRAME:025206/0810

AS Assignment

Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA

Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001

Effective date: 20160201


STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001

Effective date: 20170120


AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001

Effective date: 20170119