US20150256669A1 - System and method for distraction mitigation - Google Patents

System and method for distraction mitigation

Info

Publication number
US20150256669A1
US20150256669A1
Authority
US
United States
Prior art keywords
mobile device
zone
vehicle
distraction
mitigation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US14/200,902
Other versions
US9674337B2 (en)
Inventor
Alexandre James
Mark John Rigley
Robert Vincent
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BlackBerry Ltd
Original Assignee
2236008 Ontario Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 2236008 Ontario Inc filed Critical 2236008 Ontario Inc
Priority to US14/200,902 priority Critical patent/US9674337B2/en
Assigned to QNX SOFTWARE SYSTEMS LIMITED reassignment QNX SOFTWARE SYSTEMS LIMITED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: JAMES, ALEXANDRE, Rigley, Mark John, VINCENT, ROBERT
Priority to EP15157968.7A priority patent/EP2953385B1/en
Assigned to 2236008 ONTARIO LIMITED reassignment 2236008 ONTARIO LIMITED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: QNX SOFTWARE SYSTEMS LIMITED
Assigned to 2236008 ONTARIO INC. reassignment 2236008 ONTARIO INC. CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE CORPORATE IDENTIFIER INADVERTENTLY LISTED ON THE ASSIGNMENT AND COVERSHEET AS "LIMITED" PREVIOUSLY RECORDED ON REEL 035700 FRAME 0845. ASSIGNOR(S) HEREBY CONFIRMS THE IDENTIFIER SHOULD HAVE STATED "INC.". Assignors: QNX SOFTWARE SYSTEMS LIMITED
Publication of US20150256669A1 publication Critical patent/US20150256669A1/en
Application granted granted Critical
Publication of US9674337B2 publication Critical patent/US9674337B2/en
Assigned to BLACKBERRY LIMITED reassignment BLACKBERRY LIMITED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: 2236008 ONTARIO INC.
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • H04M1/72577
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72463User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions to restrict the functionality of the device
    • H04M1/72569
    • H04W4/046
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30Services specially adapted for particular environments, situations or purposes
    • H04W4/40Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W4/42Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for mass transport vehicles, e.g. buses, trains or aircraft
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W48/00Access restriction; Network selection; Access point selection
    • H04W48/02Access restriction performed under specific conditions
    • H04W48/04Access restriction performed under specific conditions based on user or terminal location or mobility data, e.g. moving direction, speed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/60Substation equipment, e.g. for use by subscribers including speech amplifiers
    • H04M1/6033Substation equipment, e.g. for use by subscribers including speech amplifiers for providing handsfree use or a loudspeaker mode in telephone sets
    • H04M1/6041Portable telephones adapted for handsfree use
    • H04M1/6075Portable telephones adapted for handsfree use adapted for handsfree use in a vehicle
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72463User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions to restrict the functionality of the device
    • H04M1/724631User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions to restrict the functionality of the device by limiting the access to the user interface, e.g. locking a touch-screen or a keypad

Definitions

  • the present disclosure relates to the field of vehicle operator distraction mitigation.
  • a location of a mobile device within one of one or more zones may be determined and the behavior of a mobile device and/or in-vehicle systems with which the mobile device interacts may be modified based on the determined location.
  • FIG. 1 is a schematic representation of an overhead view of an automobile cabin in which a system and method for distraction mitigation may be used.
  • FIG. 2 is a representation of a method for distraction mitigation.
  • FIG. 3 is a further representation of a method for distraction mitigation.
  • FIG. 4 is a schematic representation of a system for distraction mitigation.
  • FIG. 5 is a further schematic representation of a system for distraction mitigation.
  • FIG. 6 is another schematic representation of a system for distraction mitigation.
  • a system and method for distraction mitigation wherein a location of a mobile device within one of one or more zones in a vehicle may be determined and the behavior of the mobile device and/or in-vehicle systems, with which the mobile device interacts, may be modified based on the determined location.
  • the location of a smartphone may be determined to be within the area (a.k.a. zone) of an automobile cabin associated with a driver of the automobile and the smartphone user interface may be locked to prevent direct operation of the smartphone by the driver while still allowing indirect operation via, for example, a Bluetooth connection to an automobile infotainment system.
  • FIG. 1 is a schematic representation of an overhead view of an automobile cabin in which a system and method for distraction mitigation may be used.
  • the example automobile cabin 100 may include an infotainment system 102 (a.k.a. an in-vehicle system) and multiple audio transducers 104 A, 104 B, 104 C and 104 D (collectively or generically audio transducers 104 ).
  • the infotainment system 102 may include support for one or more functions such as radio tuner, media player, hands-free telephony, navigation system and other similar infotainment functions.
  • Some of the functions of the infotainment system 102 , for example media player or hands-free telephony, may be provided through interaction with one or more mobile devices such as driver's mobile device 106 A and passengers' mobile devices 106 B and 106 C (collectively or generically mobile devices 106 ).
  • One or more of the audio transducers 104 may emit encoded signals 110 A, 110 B, 110 C and 110 D (collectively or generically encoded signals 110 ) that may be received by each of the mobile devices 106 .
  • Each mobile device 106 may decode the received signals to extract location information and zone information.
  • the location information may include any information that may be used by a mobile device 106 to determine its location relative to the locations of audio transducers 104 .
  • the zone information may include descriptive information that defines one or more zones in the vehicle cabin relative to the locations of audio transducers 104 .
  • the zones may include a driver's zone 108 A, a front passenger zone 108 B and a rear seat passengers' zone 108 C (collectively or generically the zones 108 ).
  • the encoded signals 110 emitted by the audio transducers 104 may be received by each mobile device 106 via an audio receiver and one or more microphones.
  • the encoded signals 110 may be emitted by radio frequency (RF) transmitters, ultrasonic transmitters, or other forms of emitters suitable to emitting encoded signals 110 and the mobile devices 106 may receive the encoded signals 110 using corresponding signal receivers suitable to receive the audio, RF, ultrasonic or other form of encoded signals 110 .
  • the location information contained in each encoded signal 110 may include identification of the audio transducer 104 (or alternatively other form of emitter) and the audio transducer's 104 location.
  • the audio transducer's location may, for example, be represented in X and Y coordinates in a Cartesian plane.
  • the X and Y coordinates may be expressed in distance units (e.g. centimeters) or in grid units (e.g. 2 gridlines over by 3 gridlines down) relative to a point of reference (e.g. 0,0).
  • the zone information contained in each encoded signal 110 may include one or more of: a zone identifier, a zone boundary definition, a zone type, and zone restrictions.
  • the zone identifier may be in the form of an enumerated identifier (e.g. 1, 2, 3, . . . or A, B, C, . . . ), in the form of a named identifier (e.g. driver, front passenger, rear passengers) or in other similar forms that allow the zones to be distinguished from each other.
  • the zone boundary definition may be in the form of the dimensions and relative position of a geometric figure that delineates the boundary of the zone (e.g. a 60 × 90 cm rectangular area having a front left vertex at offset 0.25 cm from the point of reference).
  • the zone type may be in the form of types, or classes, each having pre-defined privileges or restrictions (e.g. fully restricted operation, partially restricted operation or unrestricted operation) or in the form of roles (e.g. driver, passenger) having associated privileges or restrictions.
  • the zone restrictions may be in the form of restrictions to specific operations such as, for example, no direct manual operation allowed, no voice control allowed, no texting or messaging allowed, no email allowed, no media source change allowed, no volume setting change allowed and other similar restrictions.
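The zone-information fields described above (identifier, boundary definition, type, and restrictions) could be decoded into a small structure. The following Python sketch is illustrative only: the field names, the string-valued restriction labels, and the (left, top, width, height) rectangle encoding are assumptions, not the patent's wire format.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ZoneInfo:
    # Illustrative container for decoded zone information.
    zone_id: str        # enumerated or named identifier, e.g. "driver"
    boundary_cm: tuple  # (left, top, width, height) relative to the reference point
    zone_type: str      # e.g. "fully_restricted", "partially_restricted"
    restrictions: tuple # e.g. ("no_manual_operation", "no_texting")

# Example corresponding to the driver's zone and the 60 x 90 cm rectangle above.
driver_zone = ZoneInfo(
    zone_id="driver",
    boundary_cm=(0.25, 0.0, 60.0, 90.0),
    zone_type="fully_restricted",
    restrictions=("no_manual_operation", "no_texting"),
)
```

A mobile device decoding an encoded signal 110 would populate one such record per zone and keep the set for later containment tests.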
  • the location information and zone information (e.g. encoded signal 110 ) contained in the one or more emitted audio signals may be psychoacoustically hidden, or masked, such that a human may not perceive the added location information and zone information.
  • Time and frequency based psychoacoustic masking models may be used to determine where additional audio signal content (e.g. encoded signal 110 ) may be added to the emitted audio signals in order for the additional audio signal to be imperceptible.
  • the additional audio signal content (e.g. encoded signal 110 ) may include the location information and/or zone information.
  • a frequency based psychoacoustic masking model may calculate a masked signal-to-noise ratio (SNR) across frequency bands based on one or more currently emitted audio signals.
  • two or more emitted audio signals with different signal content may be included in the masked SNR calculation.
  • the human listener may not perceive additional audio content added at a signal level below the masked SNR.
  • the psychoacoustic masking model may include two or more emitted audio signals and an environmental noise signal in the calculated masked SNR.
  • a microphone may be used to capture the environmental noise that may be included in the masked SNR calculation.
  • a vehicle may have an environmental noise level allowing location information and zone information to be reproduced and psychoacoustically masked by the environmental noise even in the absence of an emitted audio signal. In this case the environmental noise signal is used to determine an output signal level for the additional audio signal (e.g. encoded signal 110 ).
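A production psychoacoustic model (time and frequency masking) is far more involved than the bullets above can show, but the level-selection idea can be sketched simply: per frequency band, place the embedded signal a fixed margin below whichever masker is stronger, the emitted audio or the captured cabin noise. The 13 dB margin here is an assumption for illustration, not a value from the patent.

```python
import math

def band_power_db(samples):
    # Mean power of a band-limited segment, in dB relative to full scale.
    power = sum(s * s for s in samples) / max(len(samples), 1)
    return 10.0 * math.log10(power + 1e-12)

def embed_levels_db(audio_bands_db, noise_bands_db, margin_db=13.0):
    # Per band, place the encoded signal margin_db below the stronger masker:
    # the currently emitted audio or the environmental noise. With no emitted
    # audio at all, the cabin noise alone sets the level.
    return [max(a, n) - margin_db
            for a, n in zip(audio_bands_db, noise_bands_db)]
```

For example, a band where the music sits at -20 dB and the noise at -30 dB would carry the encoded signal at -33 dB.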
  • the psychoacoustically masked additional audio content may be added to the emitted audio signals as structured noise and/or tones. Modulating the phase and/or amplitude of a tone may be used to transmit a digital encoded signal.
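Phase modulation of a tone, as mentioned above, can be sketched as binary phase-shift keying: each 1 bit flips the carrier phase by 180 degrees. The carrier frequency, sample rate, bit duration, and the small fixed amplitude (standing in for a level chosen below the masked SNR) are all illustrative assumptions.

```python
import math

def bpsk_tone(bits, carrier_hz=18000.0, fs=48000, samples_per_bit=480,
              amplitude=0.01):
    # Encode bits by flipping the phase of a high-frequency carrier tone.
    samples = []
    for i, bit in enumerate(bits):
        phase = math.pi if bit else 0.0  # 180-degree flip for a 1 bit
        for n in range(samples_per_bit):
            t = (i * samples_per_bit + n) / fs
            samples.append(amplitude *
                           math.cos(2 * math.pi * carrier_hz * t + phase))
    return samples
```

A receiver would recover the bits by correlating against the two phase hypotheses per bit period; amplitude modulation could be layered on in the same way.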
  • the structured noise may be a maximum length sequence, or MLS, that may be modified, or filtered, in the frequency domain to be below the masked SNR.
  • An MLS sequence may be used to determine the latency between a transmitter and a receiver. The latency may be used to calculate the distance between an audio transducer 104 in the vehicle and the mobile device 106 .
  • Each of the two or more emitted signals may include a different MLS in order to distinguish each emitted signal from the others.
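The MLS ranging steps above (generate a maximum length sequence, find the transmit-to-receive latency by correlation, convert latency to distance) can be sketched as follows. The register length, taps, sample rate, and brute-force correlation are assumptions chosen for clarity; a real implementation would use longer sequences and FFT-based correlation.

```python
def mls(n_bits=5, taps=(5, 3)):
    # Maximum length sequence from a linear-feedback shift register,
    # returned as +/-1 values; length is 2**n_bits - 1.
    state = [1] * n_bits
    seq = []
    for _ in range(2 ** n_bits - 1):
        seq.append(1 if state[-1] else -1)
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]
    return seq

def estimate_lag(reference, received):
    # Brute-force cross-correlation; the peak gives the latency in samples.
    n = len(reference)
    best_lag, best_corr = 0, float("-inf")
    for lag in range(len(received) - n + 1):
        corr = sum(reference[i] * received[lag + i] for i in range(n))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return best_lag

def lag_to_distance_m(lag_samples, fs=48000, speed_of_sound=343.0):
    # Convert the latency (in samples) to a transducer-to-device distance.
    return lag_samples / fs * speed_of_sound
```

With a distinct MLS per transducer, the same captured audio yields one lag, and hence one distance, per emitter.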
  • the location of the mobile device 106 may be determined by analyzing the additional audio content (e.g. encoded signal 110 ) contained in each of the two or more emitted audio signals and/or the emitted signal level.
  • the mobile device 106 may analyze the emitted signals to calculate the location.
  • the mobile device 106 may transmit the captured signals to an in-vehicle system so that the in-vehicle system may analyze and calculate the mobile device location.
  • Each mobile device 106 may use the encoded signals 110 received from the audio transducers 104 to determine its location relative to the locations of audio transducers 104 and/or a point of reference. Each mobile device 106 may use any known method of localization including, for example, triangulation, trilateration, ranging or multilateration to determine its location. The determined location may be used by the mobile device 106 to further determine within which zone's boundaries the mobile device 106 is located.
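Trilateration, one of the localization methods named above, and the subsequent zone-containment test can be sketched in two dimensions. Subtracting the three circle equations pairwise yields a 2x2 linear system in (x, y); the rectangle test mirrors the zone boundary example. Coordinate conventions and the noise-free setup are assumptions for illustration.

```python
def trilaterate(p1, d1, p2, d2, p3, d3):
    # 2-D trilateration: linearize the three circle equations (subtract
    # pairwise) and solve the resulting 2x2 system for (x, y).
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1 ** 2 - d2 ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2 ** 2 - d3 ** 2 + x3 ** 2 - x2 ** 2 + y3 ** 2 - y2 ** 2
    det = a1 * b2 - b1 * a2
    return ((c1 * b2 - b1 * c2) / det, (a1 * c2 - c1 * a2) / det)

def in_zone(point, boundary):
    # boundary = (left, top, width, height), as in the zone boundary example.
    x, y = point
    left, top, width, height = boundary
    return left <= x <= left + width and top <= y <= top + height
```

With noisy distances, a least-squares fit over all transducers would replace the exact solve, but the zone test is unchanged.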
  • any of the mobile devices 106 may use corresponding zone type and/or zone restrictions to modify the behavior of the mobile device 106 including, for example, disabling one or more applications, disabling one or more features within any of one or more applications, blocking one or more functions of the mobile device, locking the mobile device, disabling one or more modes of the mobile device user interface, and displaying a warning message.
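The behavior modifications listed above can be expressed as a dispatch from restriction labels to device actions. The restriction names, the stub device, and its method names are hypothetical; a real device would call into its own UI, messaging, and telephony stacks.

```python
class DeviceStub:
    # Stand-in for the mobile device's control surface.
    def __init__(self):
        self.locked = False
        self.disabled = set()

    def lock_screen(self):
        self.locked = True

    def disable_feature(self, name):
        self.disabled.add(name)

def apply_zone_restrictions(restrictions, device):
    # Map each zone restriction to a device action; restrictions the
    # device does not recognize are ignored rather than failing.
    actions = {
        "no_manual_operation": device.lock_screen,
        "no_texting": lambda: device.disable_feature("messaging"),
        "no_email": lambda: device.disable_feature("email"),
    }
    for restriction in restrictions:
        action = actions.get(restriction)
        if action:
            action()
```

Applying the driver-zone restrictions would, for example, lock the screen while leaving indirect operation (e.g. via Bluetooth hands-free) untouched.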
  • the mobile device 106 may share its location and/or occupied zone with one or more in-vehicle systems such as, for example, the infotainment system.
  • the mobile device 106 may connect to and share information with the one or more in-vehicle systems using a data communications mechanism such as, for example, Bluetooth® (Bluetooth SIG, Inc., Kirkland, Wash.), Wi-FiTM (Wi-Fi Alliance, Austin, Tex.), Universal Serial Bus (USB), near field communications (NFC), proprietary protocols and other similar data communications mechanisms.
  • the in-vehicle systems may use the location and/or occupied zone of each mobile device 106 to allow or restrict operations and/or interactions with the mobile device 106 .
  • the in-vehicle systems may alternatively, or in addition, use the location and/or occupied zone of each mobile device 106 to display the position of the mobile device 106 in a user interface to facilitate selection of the mobile device 106 by a user and/or to determine pairing order (e.g. for a Bluetooth connection) or prioritization of multiple mobile devices 106 with the in-vehicle system.
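Zone-based pairing order, as mentioned above, reduces to sorting devices by a priority assigned to their occupied zones. The priority values and zone names here are assumptions; the patent only states that the occupied zone may determine pairing order or prioritization.

```python
ZONE_PRIORITY = {"driver": 0, "front_passenger": 1, "rear_passengers": 2}

def pairing_order(devices):
    # devices: iterable of (device_name, occupied_zone) pairs. Devices in
    # unknown zones sort last; ties keep their original order (stable sort).
    return sorted(devices,
                  key=lambda dev: ZONE_PRIORITY.get(dev[1], len(ZONE_PRIORITY)))
```

An infotainment system could pair devices in this order, so the driver's phone connects to hands-free telephony first.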
  • the encoded signals 110 may be emitted using any one or more triggers such as, for example, vehicle door opening or closing, vehicle start-up, drivetrain engagement, vehicle motion start/stop, periodically or based on a request from any of the mobile devices 106 .
  • Each mobile device 106 may request an emission of the encoded signals 110 based on one or more triggers such as power-on of the mobile device 106 , sensed motion of the mobile device 106 (e.g. using an accelerometer) or other similar triggers.
  • the mobile device 106 may request an emission of the encoded signals 110 when it detects motion in order to determine which zone it is in or whether the occupied zone has changed.
  • Each mobile device 106 may disable or restrict certain operations or functions based on the zone it is located in (a.k.a. the occupied zone).
  • Providing the zone information from the vehicle to the mobile device 106 mitigates the need for the mobile device 106 to be pre-configured with, or to have access to, information identifying the vehicle make, model or configuration.
  • the mobile device does not need to know if the vehicle is configured for left-hand or right-hand side drive, self-driven or chauffeur-driven (e.g. limousine, taxi, or similar), or the number of rows of seats (e.g. optional third-row configuration).
  • having the vehicle provide the zone information facilitates correct operation of the mobile device and in-vehicle system in use cases such as a driver switching between vehicles and driving a rental vehicle.
  • FIG. 2 is a representation of a method for distraction mitigation.
  • the method 200 may be, for example, implemented using the systems 400 and 600 described herein with reference to FIGS. 4 and 6 .
  • the method 200 may include the following acts that are further described above.
  • the zone information may define one or more zones in the vehicle.
  • the occupied zone is a zone of the one or more zones defined by the zone information that the mobile device is determined to be in.
  • FIG. 3 is a further representation of a method for distraction mitigation.
  • the method 300 may be, for example, implemented using the systems 500 and 600 described herein with reference to FIGS. 5 and 6 .
  • the method 300 may include the following acts that are further described above.
  • Emitting ( 302 ) one or more signals containing location information.
  • Emitting ( 304 ) a signal containing zone information.
  • the zone information may define one or more zones in a vehicle.
  • the emitted signals may be received by a mobile device 106 .
  • the occupied zone is a zone of the one or more zones defined by the zone information that the mobile device 106 is determined to be in.
  • Modifying ( 308 ) the behavior of one or more in-vehicle systems responsive to the occupied zone may include modifying their behavior with regard to interactions with the mobile device 106 .
  • modifying the behavior of the one or more in-vehicle systems responsive to the occupied zone may include configuring the in-vehicle systems (e.g. infotainment system) in accordance with user preferences associated with the mobile device 106 in the occupied zone.
  • modifying the behavior of the one or more in-vehicle systems may include any of disabling one or more applications, disabling one or more features within any of one or more applications, blocking one or more functions of the in-vehicle system, locking the in-vehicle system, disabling one or more modes of the in-vehicle system user interface, and displaying a warning message.
  • FIG. 4 is a schematic representation of a system for distraction mitigation.
  • the example system 400 includes a first audio receiver 402 , a second audio receiver 404 , a location unit 406 , a zone detection unit 408 and a distraction controller 410 .
  • the first audio receiver 402 may receive two or more audio signals containing location information via, for example, a microphone 412 .
  • the second audio receiver 404 may receive an audio signal containing zone information via, for example, a microphone 414 .
  • either or both of the first and second audio receivers 402 and 404 may be receivers for RF or ultrasonic signals.
  • the first and second audio receivers 402 and 404 may be a single common receiver and may receive signals via a single common microphone or multiple shared microphones.
  • the location unit 406 may use the location information in the received signals to determine a location of a mobile device 106 in a vehicle.
  • the system 400 may be incorporated into the mobile device 106 .
  • the zone detection unit 408 may use the zone information in the received signal together with the determined location of the mobile device 106 to determine which, if any, zone the mobile device 106 is located in.
  • the zone information may define one or more zones in the vehicle.
  • a zone, of the one or more zones, in which the mobile device 106 is located, may be referred to as the occupied zone.
  • the distraction controller 410 may use the occupied zone to modify the behavior of mobile device 106 .
  • the system 400 may include a transmitter 416 that may transmit either or both the location of the mobile device 106 and an identifier of the occupied zone to an in-vehicle system 510 .
  • FIG. 5 is a further schematic representation of a system for distraction mitigation.
  • the example system 500 includes a location information source 502 , a zone information source 504 , multiple audio transducers 104 , a receiver 506 , a distraction controller 508 and one or more in-vehicle systems 510 that are incorporated into a vehicle.
  • the audio transducers 104 may each emit audio signals containing either or both location information from the location information source 502 and zone information from the zone information source 504 .
  • the location information and zone information may optionally be combined in a common signal using, for example, a multiplexer/combiner 512 .
  • the location and/or zone information may be mixed into an audio signal from an audio signal source 514 , such as infotainment system 102 , using, for example, a mixer/modulator 516 .
  • the audio signal may be a multi-channel signal providing different content to each of the audio transducers 104 .
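The mixer/modulator 516 described above combines the level-shaped encoded signal with one channel of the audio feed. A minimal sketch, assuming float samples and a fixed gain already chosen below the masked SNR:

```python
def mix_channel(audio_channel, encoded, gain=0.01):
    # Add one transducer's (already level-shaped) encoded signal to its
    # channel of the multi-channel feed; trailing audio passes through.
    out = list(audio_channel)
    for i in range(min(len(out), len(encoded))):
        out[i] += gain * encoded[i]
    return out
```

Each of the audio transducers 104 would receive its own channel mixed with its own encoded signal (e.g. a distinct MLS per transducer).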
  • the signals emitted by the audio transducers 104 may be received by a mobile device 106 and used to determine a location and occupied zone in which the mobile device 106 is located.
  • the receiver 506 may receive either or both a location and an occupied zone transmitted by the mobile device 106 .
  • the distraction controller 508 may use the received location and/or occupied zone to modify the behavior of the one or more in-vehicle systems 510 .
  • the behavior of the one or more in-vehicle systems 510 may be modified with regard to interactions with the mobile device 106 in order to, for example, mitigate driver distraction.
  • FIG. 6 is a schematic representation of a system for distraction mitigation.
  • the system 600 comprises a processor 602 , memory 604 (the contents of which are accessible by the processor 602 ) and an I/O interface 606 .
  • the processor 602 may comprise a single processor or multiple processors that may be disposed on a single chip, on multiple devices or distributed over more than one system.
  • the processor 602 may be hardware that executes computer executable instructions or computer code embodied in the memory 604 or in other memory to perform one or more features of the system.
  • the processor 602 may include a general purpose processor, a central processing unit (CPU), a graphics processing unit (GPU), an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), a digital circuit, an analog circuit, a microcontroller, any other type of processor, or any combination thereof.
  • the memory 604 may comprise a device for storing and retrieving data, processor executable instructions, or any combination thereof.
  • the memory 604 may include non-volatile and/or volatile memory, such as a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or a flash memory.
  • the memory 604 may comprise a single device or multiple devices that may be disposed on one or more dedicated memory devices or on a processor or other similar device.
  • the memory 604 may include an optical, magnetic (hard-drive) or any other form of data storage device.
  • the memory 604 may store computer code, such as instructions which when executed using the processor 602 may cause the system 600 to render the functionality associated with either of methods 200 and 300 as described herein.
  • the computer code may include instructions executable with the processor 602 .
  • the computer code may be written in any computer language, such as C, C++, assembly language, channel program code, and/or any combination of computer languages.
  • the memory 604 may store information in data structures including, for example, location information, zone information and signal content.
  • the I/O interface 606 may be used to connect devices such as, for example, receivers 402 and 404 , transmitter 416 , emitters (e.g. audio transducers 104 ), receiver 506 , in-vehicle systems 510 and to other components of the system 600 .
  • the systems 400 , 500 and 600 may include more, fewer, or different components than illustrated in FIGS. 4 , 5 and 6 . Furthermore, each one of the components of systems 400 , 500 and 600 may include more, fewer, or different elements than is illustrated in FIGS. 4 , 5 and 6 .
  • Flags, data, databases, tables, entities, and other data structures may be separately stored and managed, may be incorporated into a single memory or database, may be distributed, or may be logically and physically organized in many different ways.
  • the components may operate independently or be part of a same program or hardware.
  • the components may be resident on separate hardware, such as separate removable circuit boards, or share common hardware, such as a same memory and processor for implementing instructions from the memory. Programs may be parts of a single program, separate programs, or distributed across several memories and processors.
  • the functions, acts or tasks illustrated in the figures or described may be executed in response to one or more sets of logic or instructions stored in or on computer readable media.
  • the functions, acts or tasks are independent of the particular type of instructions set, storage media, processor or processing strategy and may be performed by software, hardware, integrated circuits, firmware, micro code and the like, operating alone or in combination.
  • processing strategies may include multiprocessing, multitasking, parallel processing, distributed processing, and/or any other type of processing.
  • the instructions are stored on a removable media device for reading by local or remote systems.
  • the logic or instructions are stored in a remote location for transfer through a computer network or over telephone lines.
  • the logic or instructions may be stored within a given computer such as, for example, a CPU.

Abstract

A system and method for distraction mitigation wherein a location of a mobile device within one of one or more zones in a vehicle may be determined and the behavior of the mobile device and/or in-vehicle systems, with which the mobile device interacts, may be modified based on the determined location. In one example, the location of a smartphone may be determined to be within the area (a.k.a. zone) of an automobile cabin associated with a driver of the automobile and the smartphone user interface may be locked to prevent direct operation of the smartphone by the driver while still allowing indirect operation via, for example, a Bluetooth connection to an automobile infotainment system.

Description

    BACKGROUND
  • 1. Technical Field
  • The present disclosure relates to the field of vehicle operator distraction mitigation. In particular, to a system and method for distraction mitigation wherein a location of a mobile device within one of one or more zones may be determined and the behavior of a mobile device and/or in-vehicle systems with which the mobile device interacts may be modified based on the determined location.
  • 2. Related Art
  • Recent studies and accident statistics suggest that the use of mobile devices such as, for example, smartphones by automobile drivers while driving may be a significant cause of driver distraction and may contribute to an elevated risk of accidents. In some jurisdictions regulations exist, or are being considered, that ban or restrict the use of mobile devices by drivers while operating a vehicle. The effectiveness of these regulations is dependent on voluntary compliance by drivers and detection of non-compliance by law enforcement agencies.
  • It may be desirable to provide a mechanism that restricts or disables operation of one or more functions of a mobile device and/or in-vehicle systems by a driver while operating a vehicle.
  • BRIEF DESCRIPTION OF DRAWINGS
  • The system and method may be better understood with reference to the following drawings and description. The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the disclosure. Moreover, in the figures, like referenced numerals designate corresponding parts throughout the different views.
  • Other systems, methods, features and advantages will be, or will become, apparent to one with skill in the art upon examination of the following figures and detailed description. It is intended that all such additional systems, methods, features and advantages be included with this description and be protected by the following claims.
  • FIG. 1 is a schematic representation of an overhead view of an automobile cabin in which a system and method for distraction mitigation may be used.
  • FIG. 2 is a representation of a method for distraction mitigation.
  • FIG. 3 is a further representation of a method for distraction mitigation.
  • FIG. 4 is a schematic representation of a system for distraction mitigation.
  • FIG. 5 is a further schematic representation of a system for distraction mitigation.
  • FIG. 6 is another schematic representation of a system for distraction mitigation.
  • DETAILED DESCRIPTION
  • A system and method for distraction mitigation are described wherein a location of a mobile device within one of one or more zones in a vehicle may be determined, and the behavior of the mobile device and/or in-vehicle systems, with which the mobile device interacts, may be modified based on the determined location. In one example, the location of a smartphone may be determined to be within the area (a.k.a. zone) of an automobile cabin associated with the driver of the automobile, and the smartphone user interface may be locked to prevent direct operation of the smartphone by the driver while still allowing indirect operation via, for example, a Bluetooth connection to an automobile infotainment system.
  • FIG. 1 is a schematic representation of an overhead view of an automobile cabin in which a system and method for distraction mitigation may be used. The example automobile cabin 100 may include an infotainment system 102 (a.k.a. an in-vehicle system) and multiple audio transducers 104A, 104B, 104C and 104D (collectively or generically audio transducers 104). The infotainment system 102 may include support for one or more functions such as radio tuner, media player, hands-free telephony, navigation system and other similar infotainment functions. Some of the functions of the infotainment system 102, for example media player or hands-free telephony, may be provided through interaction with one or more mobile devices such as driver's mobile device 106A and passengers' mobile devices 106B and 106C (collectively or generically mobile devices 106).
  • One or more of the audio transducers 104 may emit encoded signals 110A, 110B, 110C and 110D (collectively or generically encoded signals 110) that may be received by each of the mobile devices 106. Each mobile device 106 may decode the received signals to extract location information and zone information. The location information may include any information that may be used by a mobile device 106 to determine its location relative to the locations of audio transducers 104. The zone information may include descriptive information that defines one or more zones in the vehicle cabin relative to the locations of audio transducers 104. For example, the zones may include a driver's zone 108A, a front passenger zone 108B and a rear seat passengers' zone 108C (collectively or generically the zones 108).
  • The encoded signals 110 emitted by the audio transducers 104 may be received by each mobile device 106 via an audio receiver and one or more microphones. Alternatively, or in addition, the encoded signals 110 may be emitted by radio frequency (RF) transmitters, ultrasonic transmitters, or other forms of emitters suitable for emitting encoded signals 110, and the mobile devices 106 may receive the encoded signals 110 using corresponding signal receivers suitable for receiving the audio, RF, ultrasonic or other form of encoded signals 110.
  • The location information contained in each encoded signal 110 may include identification of the audio transducer 104 (or alternatively other form of emitter) and the audio transducer's 104 location. The audio transducer's location may, for example, be represented in X and Y coordinates in a Cartesian plane. The X and Y coordinates may be expressed in distance units (e.g. centimeters) or in grid units (e.g. 2 gridlines over by 3 gridlines down) relative to a point of reference (e.g. 0,0).
  • The zone information contained in each encoded signal 110 may include one or more of: a zone identifier, a zone boundary definition, a zone type, and zone restrictions. The zone identifier may be in the form of an enumerated identifier (e.g. 1, 2, 3, . . . or A, B, C, . . . ), in the form of a named identifier (e.g. driver, front passenger, rear passengers) or in other similar forms that allow the zones to be distinguished from each other. The zone boundary definition may be in the form of the dimensions and relative position of a geometric figure that delineates the boundary of the zone (e.g. a 60×90 cm rectangular area having a front left vertex at offset 0.25 cm from the point of reference). The zone type may be in the form of types, or classes, each having pre-defined privileges or restrictions (e.g. fully restricted operation, partially restricted operation or unrestricted operation) or in the form of roles (e.g. driver, passenger) having associated privileges or restrictions. The zone restrictions may be in the form of restrictions to specific operations such as, for example, no direct manual operation allowed, no voice control allowed, no texting or messaging allowed, no email allowed, no media source change allowed, no volume setting change allowed and other similar restrictions.
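The zone information fields described above can be modeled as a simple record. The sketch below is illustrative only: the field names, the centimeter units, and the assumption that every boundary is an axis-aligned rectangle are my own, not values defined by the disclosure.

```python
from dataclasses import dataclass
from enum import Enum


class ZoneType(Enum):
    """Hypothetical zone types with pre-defined restriction levels."""
    FULLY_RESTRICTED = "fully_restricted"
    PARTIALLY_RESTRICTED = "partially_restricted"
    UNRESTRICTED = "unrestricted"


@dataclass(frozen=True)
class ZoneInfo:
    zone_id: str          # e.g. "driver", "front_passenger", "rear_passengers"
    # Boundary as a rectangle: front-left vertex plus width/height, in cm,
    # relative to the cabin's point of reference.
    origin_x_cm: float
    origin_y_cm: float
    width_cm: float
    height_cm: float
    zone_type: ZoneType
    restrictions: frozenset  # e.g. {"no_texting", "no_manual_operation"}

    def contains(self, x_cm: float, y_cm: float) -> bool:
        """Return True if the point lies within this zone's boundary."""
        return (self.origin_x_cm <= x_cm <= self.origin_x_cm + self.width_cm
                and self.origin_y_cm <= y_cm <= self.origin_y_cm + self.height_cm)
```

A 60×90 cm driver zone anchored at the point of reference would then answer boundary queries directly via `contains`.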
  • The location information and zone information (e.g. encoded signal 110) contained in the one or more emitted audio signals may be psychoacoustically hidden, or masked, such that a human may not perceive the added location information and zone information. Time and frequency based psychoacoustic masking models may be used to determine where additional audio signal content (e.g. encoded signal 110) may be added to the emitted audio signals in order for the additional audio signal to be imperceptible. The additional audio signal content (e.g. encoded signal 110) may include the location information and/or zone information. A frequency based psychoacoustic masking model may calculate a masked signal-to-noise ratio (SNR) across frequency bands based on one or more currently emitted audio signals. For example, two or more emitted audio signals with different signal content (e.g. stereo or other multi-channel audio) may be included in the masked SNR calculation. The human listener may not perceive additional audio content added at a signal level below the masked SNR. Alternatively, or in addition, the psychoacoustic masking model may include two or more emitted audio signals and an environmental noise signal in the calculated masked SNR. A microphone may be used to capture the environmental noise that may be included in the masked SNR calculation. In a further alternative, a vehicle may have an environmental noise level allowing location information and zone information to be reproduced and psychoacoustically masked by the environmental noise even in the absence of an emitted audio signal. In this case the environmental noise signal is used to determine an output signal level for the additional audio signal (e.g. encoded signal 110).
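As a rough illustration of the frequency-based approach, the sketch below splits the spectrum of the currently emitted audio into bands and derives a per-band ceiling for added content at a fixed margin below the band energy. A real system would use a full psychoacoustic model (spreading functions, tonality estimation, the absolute threshold of hearing); the band split and `margin_db` value here are arbitrary assumptions, not figures from the disclosure.

```python
import numpy as np


def masked_embedding_levels(emitted, n_bands=16, margin_db=13.0):
    """Crude stand-in for a masking model: return a per-band level ceiling
    (in dB) for hidden content, set a fixed margin below the emitted
    signal's band energy."""
    spectrum = np.abs(np.fft.rfft(np.asarray(emitted, dtype=float))) ** 2
    bands = np.array_split(spectrum, n_bands)
    # Small epsilon avoids log of zero in silent bands.
    band_db = np.array([10.0 * np.log10(np.mean(b) + 1e-12) for b in bands])
    return band_db - margin_db
```

Content shaped (e.g. filtered MLS noise) to stay under these ceilings in each band would, under the model's assumptions, remain inaudible alongside the emitted program material.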
  • The psychoacoustically masked additional audio content (e.g. encoded signal 110) may be added to the emitted audio signals as structured noise and/or tones. Modulating the phase and/or amplitude of a tone may be used to transmit a digital encoded signal. The structured noise may be a maximum length sequence, or MLS, that may be modified, or filtered, in the frequency domain to be below the masked SNR. An MLS sequence may be used to determine the latency between a transmitter and a receiver. The latency may be used to calculate the distance between an audio transducer 104 in the vehicle and the mobile device 106. Each of the two or more emitted signals may include a different MLS in order to distinguish each emitted signal from the others. The location of the mobile device 106 may be determined by analyzing the additional audio content (e.g. encoded signal 110) contained in each of the two or more emitted audio signals and/or the emitted signal level. The mobile device 106 may analyze the emitted signals to calculate the location. In one alternative, the mobile device 106 may transmit the captured signals to an in-vehicle system so that the in-vehicle system may analyze and calculate the mobile device location.
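A minimal sketch of the MLS-based ranging idea described above: generate a maximum-length sequence with a linear-feedback shift register, then estimate the transmitter-to-receiver latency from the lag that maximizes the cross-correlation between the known sequence and the captured audio. NumPy is assumed; the order-10 tap choice, sample rate, and speed of sound are illustrative values, not taken from the disclosure.

```python
import numpy as np


def generate_mls(order=10):
    """Maximum-length sequence via a Fibonacci LFSR. The taps correspond to
    the primitive polynomial x^10 + x^7 + 1, valid for order 10 only."""
    taps = (10, 7)
    state = [1] * order
    seq = []
    for _ in range(2 ** order - 1):
        feedback = state[taps[0] - 1] ^ state[taps[1] - 1]
        seq.append(state[-1])
        state = [feedback] + state[:-1]
    return np.array(seq) * 2.0 - 1.0  # map {0, 1} -> {-1, +1}


def estimate_distance(mls, captured, fs=48000, speed_of_sound=343.0):
    """Estimate emitter-to-microphone distance from the lag (in samples)
    that maximizes the cross-correlation of the known MLS against the
    captured audio."""
    corr = np.correlate(captured, mls, mode="valid")
    lag_samples = int(np.argmax(corr))
    return lag_samples / fs * speed_of_sound
```

With a distinct MLS per transducer, the same correlation step yields one range per emitter, which feeds the localization step below.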
  • Each mobile device 106 may use the encoded signals 110 received from the audio transducers 104 to determine its location relative to the locations of the audio transducers 104 and/or a point of reference. Each mobile device 106 may use any known method of localization including, for example, triangulation, trilateration, ranging or multilateration to determine its location. The determined location may be used by the mobile device 106 to further determine within which zone's boundaries the mobile device 106 is located.
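As one example of the localization methods mentioned above, the least-squares trilateration sketch below recovers a 2D position from the transducer coordinates (carried in the location information) and the measured ranges. It linearizes the range equations by subtracting the first anchor's equation; NumPy is assumed.

```python
import numpy as np


def trilaterate_2d(anchors, distances):
    """Least-squares 2D position from >= 3 anchor (x, y) positions and
    measured distances. Subtracting the first anchor's range equation from
    the others yields a linear system A @ [x, y] = b."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos  # estimated (x, y)
```

With the four cabin transducers as anchors, the over-determined system also averages out some ranging noise.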
  • When any of the mobile devices 106 has determined its location and/or which zone it is in (a.k.a. the occupied zone), it may use corresponding zone type and/or zone restrictions to modify the behavior of the mobile device 106 including, for example, disabling one or more applications, disabling one or more features within any of one or more applications, blocking one or more functions of the mobile device, locking the mobile device, disabling one or more modes of the mobile device user interface, and displaying a warning message.
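The restriction-to-behavior step might be sketched as a simple lookup from zone restrictions to device-side actions. The restriction strings and action names below are hypothetical placeholders, not identifiers defined by the disclosure.

```python
# Hypothetical mapping from zone restriction strings to device-side actions.
RESTRICTION_ACTIONS = {
    "no_manual_operation": "lock_touch_ui",
    "no_texting": "disable_messaging_apps",
    "no_voice_control": "disable_voice_assistant",
    "no_email": "disable_email_apps",
}


def actions_for_zone(restrictions):
    """Resolve each restriction of the occupied zone to a device action;
    unrecognized restrictions conservatively fall back to a warning."""
    return sorted(RESTRICTION_ACTIONS.get(r, "display_warning")
                  for r in restrictions)
```

A device in the driver's zone might thus lock its touch UI and disable messaging, while a device in a passenger zone with no restrictions takes no action at all.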
  • The mobile device 106 may share its location and/or occupied zone with one or more in-vehicle systems such as, for example, the infotainment system. The mobile device 106 may connect to and share information with the one or more in-vehicle systems using a data communications mechanism such as, for example, Bluetooth® (Bluetooth SIG, Inc., Kirkland, Wash.), Wi-Fi™ (Wi-Fi Alliance, Austin, Tex.), Universal Serial Bus (USB), near field communications (NFC), proprietary protocols and other similar data communications mechanisms. The in-vehicle systems may use the location and/or occupied zone of each mobile device 106 to allow or restrict operations and/or interactions with the mobile device 106. The in-vehicle systems may alternatively, or in addition, use the location and/or occupied zone of each mobile device 106 to display the position of the mobile device 106 in a user interface to facilitate selection of the mobile device 106 by a user and/or to determine pairing order (e.g. for a Bluetooth connection) or prioritization of multiple mobile devices 106 with the in-vehicle system.
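One way the shared report could be serialized for transport over any of the links above is a small JSON payload. The field names here are illustrative assumptions, not part of any standard Bluetooth or Wi-Fi profile.

```python
import json


def zone_report_payload(device_id, x_cm, y_cm, occupied_zone):
    """Serialize the mobile device's position and occupied zone for
    transfer to an in-vehicle system. Keys are sorted so the payload is
    byte-stable across runs."""
    return json.dumps({
        "device_id": device_id,
        "location": {"x_cm": x_cm, "y_cm": y_cm},
        "occupied_zone": occupied_zone,
    }, sort_keys=True)
```

The in-vehicle system can parse the payload, restrict interactions for a device reporting the driver's zone, and use the coordinates to place the device in its selection UI.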
  • The encoded signals 110 may be emitted using any one or more triggers such as, for example, vehicle door opening or closing, vehicle start-up, drivetrain engagement, vehicle motion start/stop, periodically or based on a request from any of the mobile devices 106. Each mobile device 106 may request an emission of the encoded signals 110 based on one or more triggers such as power-on of the mobile device 106, sensed motion of the mobile device 106 (e.g. using an accelerometer) or other similar triggers. The mobile device 106 may request an emission of the encoded signals 110 when it detects motion in order to determine which zone it is in or whether the occupied zone has changed. Each mobile device 106 may disable or restrict certain operations or functions based on the zone it is located in (a.k.a. the occupied zone).
  • Providing the zone information from the vehicle to the mobile device 106 obviates the need for the mobile device 106 to be pre-configured with, or to have access to, information identifying the vehicle make, model or configuration. For example, the mobile device does not need to know whether the vehicle is configured for left-hand or right-hand drive, whether it is self-driven or chauffeur-driven (e.g. a limousine, taxi, or similar), or the number of rows of seats (e.g. an optional third-row configuration). In addition, having the vehicle provide the zone information facilitates correct operation of the mobile device and in-vehicle system in use cases such as a driver switching between vehicles or driving a rental vehicle.
  • FIG. 2 is a representation of a method for distraction mitigation. The method 200 may be, for example, implemented using the systems 400 and 600 described herein with reference to FIGS. 4 and 6. The method 200 may include the following acts that are further described above. Receiving (202) one or more signals containing location information. Determining (204) a location of a mobile device within a vehicle. Receiving (206) a signal containing zone information. The zone information may define one or more zones in the vehicle. Determining (208) if the mobile device is located within any of the one or more zones. Modifying (210) the behavior of the mobile device responsive to an occupied zone. The occupied zone is a zone of the one or more zones defined by the zone information that the mobile device is determined to be in.
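The acts of method 200 can be strung together as a small sketch, with `locate` and `modify_behavior` as stand-ins for the device-specific pieces; the zone objects are assumed to expose a boundary test. These names are my own, not terminology from the disclosure.

```python
def distraction_mitigation_step(signals, zones, locate, modify_behavior):
    """Sketch of method 200: determine the device location from the
    received signals (acts 202-204), test it against each zone's boundary
    from the zone information (acts 206-208), and modify behavior for the
    occupied zone (act 210)."""
    x, y = locate(signals)
    for zone in zones:
        if zone.contains(x, y):
            return modify_behavior(zone)
    return None  # device is not inside any defined zone
```

Returning `None` when no zone matches mirrors the method's conditional act 208: behavior is only modified for a device actually located within a defined zone.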
  • FIG. 3 is a further representation of a method for distraction mitigation. The method 300 may be, for example, implemented using the systems 500 and 600 described herein with reference to FIGS. 5 and 6. The method 300 may include the following acts that are further described above. Emitting (302) one or more signals containing location information. Emitting (304) a signal containing zone information. The zone information may define one or more zones in a vehicle. The emitted signals may be received by a mobile device 106. Receiving (306) from the mobile device an identifier of an occupied zone. The occupied zone is a zone of the one or more zones defined by the zone information that the mobile device 106 is determined to be in. Modifying (308) the behavior of one or more in-vehicle systems responsive to the occupied zone. The behavior of the one or more in-vehicle systems may include the behavior with regard to interactions with the mobile device 106. In another example, modifying the behavior of the one or more in-vehicle systems responsive to the occupied zone may include configuring the in-vehicle systems (e.g. infotainment system) in accordance with user preferences associated with the mobile device 106 in the occupied zone. In a further example, modifying the behavior of the one or more in-vehicle systems may include any of disabling one or more applications, disabling one or more features within any of one or more applications, blocking one or more functions of the in-vehicle system, locking the in-vehicle system, disabling one or more modes of the in-vehicle system user interface, and displaying a warning message.
  • FIG. 4 is a schematic representation of a system for distraction mitigation. The example system 400 includes a first audio receiver 402, a second audio receiver 404, a location unit 406, a zone determining unit 408 and a distraction controller 410. The first audio receiver 402 may receive two or more audio signals containing location information via, for example, a microphone 412. The second audio receiver 404 may receive an audio signal containing zone information via, for example, a microphone 414. Alternatively, either or both of the first and second audio receivers 402 and 404 may be receivers for RF or ultrasonic signals. In a further alternative, the first and second audio receivers 402 and 404 may be a single common receiver and may receive signals via a single common microphone or multiple shared microphones. The location unit 406 may use the location information in the received signals to determine a location of a mobile device 106 in a vehicle. The system 400 may be incorporated into the mobile device 106. The zone determining unit 408 may use the zone information in the received signal together with the determined location of the mobile device 106 to determine which, if any, zone the mobile device 106 is located in. The zone information may define one or more zones in the vehicle. A zone, of the one or more zones, in which the mobile device 106 is located, may be referred to as the occupied zone. The distraction controller 410 may use the occupied zone to modify the behavior of the mobile device 106. In addition, the system 400 may include a transmitter 416 that may transmit either or both the location of the mobile device 106 and an identifier of the occupied zone to an in-vehicle system 510.
  • FIG. 5 is a further schematic representation of a system for distraction mitigation. The example system 500 includes a location information source 502, a zone information source 504, multiple audio transducers 104, a receiver 506, a distraction controller 508 and one or more in-vehicle systems 510 that are incorporated into a vehicle. The audio transducers 104 may each emit audio signals containing either or both location information from the location information source 502 and zone information from the zone information source 504. The location information and zone information may optionally be combined in a common signal using, for example, a multiplexer/combiner 512. Alternatively, or in addition, the location and/or zone information may be mixed into an audio signal from an audio signal source 514, such as infotainment system 102, using, for example, a mixer/modulator 516. The audio signal may be a multi-channel signal providing different content to each of the audio transducers 104. The signals emitted by the audio transducers 104 may be received by a mobile device 106 and used to determine a location and occupied zone in which the mobile device 106 is located. The receiver 506 may receive either or both a location and an occupied zone transmitted by the mobile device 106. The distraction controller 508 may use the received location and/or occupied zone to modify the behavior of the one or more in-vehicle systems 510. The behavior of the one or more in-vehicle systems 510 may be modified with regard to interactions with the mobile device 106 in order to, for example, mitigate driver distraction.
  • FIG. 6 is a schematic representation of a system for distraction mitigation. The system 600 comprises a processor 602, memory 604 (the contents of which are accessible by the processor 602) and an I/O interface 606. The processor 602 may comprise a single processor or multiple processors that may be disposed on a single chip, on multiple devices or distributed over more than one system. The processor 602 may be hardware that executes computer executable instructions or computer code embodied in the memory 604 or in other memory to perform one or more features of the system. The processor 602 may include a general purpose processor, a central processing unit (CPU), a graphics processing unit (GPU), an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), a digital circuit, an analog circuit, a microcontroller, any other type of processor, or any combination thereof.
  • The memory 604 may comprise a device for storing and retrieving data, processor executable instructions, or any combination thereof. The memory 604 may include non-volatile and/or volatile memory, such as a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or a flash memory. The memory 604 may comprise a single device or multiple devices that may be disposed on one or more dedicated memory devices or on a processor or other similar device. Alternatively or in addition, the memory 604 may include an optical, magnetic (hard-drive) or any other form of data storage device.
  • The memory 604 may store computer code, such as instructions which when executed using the processor 602 may cause the system 600 to render the functionality associated with either of methods 200 and 300 as described herein. The computer code may include instructions executable with the processor 602. The computer code may be written in any computer language, such as C, C++, assembly language, channel program code, and/or any combination of computer languages. The memory 604 may store information in data structures including, for example, location information, zone information and signal content.
  • The I/O interface 606 may be used to connect devices such as, for example, receivers 402 and 404, transmitter 416, emitters (e.g. audio transducers 104), receiver 506, in-vehicle systems 510 and to other components of the system 600.
  • All of the disclosure, regardless of the particular implementation described, is exemplary in nature, rather than limiting. The systems 400, 500 and 600 may include more, fewer, or different components than illustrated in FIGS. 4, 5 and 6. Furthermore, each one of the components of systems 400, 500 and 600 may include more, fewer, or different elements than are illustrated in FIGS. 4, 5 and 6. Flags, data, databases, tables, entities, and other data structures may be separately stored and managed, may be incorporated into a single memory or database, may be distributed, or may be logically and physically organized in many different ways. The components may operate independently or be part of a same program or hardware. The components may be resident on separate hardware, such as separate removable circuit boards, or share common hardware, such as a same memory and processor for implementing instructions from the memory. Programs may be parts of a single program, separate programs, or distributed across several memories and processors.
  • The functions, acts or tasks illustrated in the figures or described may be executed in response to one or more sets of logic or instructions stored in or on computer readable media. The functions, acts or tasks are independent of the particular type of instructions set, storage media, processor or processing strategy and may be performed by software, hardware, integrated circuits, firmware, micro code and the like, operating alone or in combination. Likewise, processing strategies may include multiprocessing, multitasking, parallel processing, distributed processing, and/or any other type of processing. In one embodiment, the instructions are stored on a removable media device for reading by local or remote systems. In other embodiments, the logic or instructions are stored in a remote location for transfer through a computer network or over telephone lines. In yet other embodiments, the logic or instructions may be stored within a given computer such as, for example, a CPU.
  • While various embodiments of the system and method for distraction mitigation have been described, it will be apparent to those of ordinary skill in the art that many more embodiments and implementations are possible within the scope of the present invention. Accordingly, the invention is not to be restricted except in light of the attached claims and their equivalents.

Claims (24)

1. A method for distraction mitigation comprising:
receiving one or more signals containing location information;
determining a location of a mobile device within a vehicle;
receiving a signal containing zone information defining one or more zones in the vehicle;
determining if the mobile device is located within any of the one or more zones; and
modifying the behavior of the mobile device responsive to an occupied zone, of the one or more zones, that the mobile device is determined to be in.
2. The method for distraction mitigation of claim 1, where the location information and the zone information may be contained in a common signal.
3. The method for distraction mitigation of claim 1, further comprising transmitting an identifier of the occupied zone and mobile device identification information to an in-vehicle system.
4. The method for distraction mitigation of claim 1, where the behavior of the mobile device is modified to mitigate distraction of an operator of the vehicle.
5. The method for distraction mitigation of claim 1, where modifying the behavior of the mobile device includes any of: disabling one or more applications, disabling one or more features within any of one or more applications, blocking one or more functions of the mobile device, locking the mobile device, disabling one or more modes of the mobile device user interface, and displaying a warning message.
6. The method for distraction mitigation of claim 1, where the zone information includes one or more of: a zone identifier, a zone boundary definition, a zone type, and zone restrictions.
7. The method for distraction mitigation of claim 6, where the zone type includes any of: a class having pre-defined privileges or restrictions, and a role having associated privileges or restrictions.
8. A system for distraction mitigation comprising:
a first receiver to receive one or more signals containing location information;
a second receiver to receive a signal containing zone information defining one or more zones in a vehicle;
a location unit to determine a location of the mobile device within the vehicle responsive to the received location information;
a zone determining unit to determine if the mobile device is located within any of the one or more zones responsive to the received zone information and the determined location of the mobile device; and
a distraction controller to modify the behavior of the mobile device.
9. The system for distraction mitigation of claim 8, further comprising a transmitter to transmit an identifier of an occupied zone, that the mobile device is determined to be located in, to an in-vehicle system.
10. The system for distraction mitigation of claim 8, where the first and second receivers are adapted to receive any of: an audio signal, a radio frequency signal and an ultrasonic signal.
11. The system for distraction mitigation of claim 8, where the location unit determines the location of the mobile device using any of: triangulation, trilateration, ranging and multilateration.
12. The system for distraction mitigation of claim 8, where the behavior of the mobile device is modified to mitigate distraction of an operator of the vehicle.
13. The system for distraction mitigation of claim 8, where modifying the behavior of the mobile device includes any of: disabling one or more applications, disabling one or more features within any of one or more applications, blocking one or more functions of the mobile device, locking the mobile device, disabling one or more modes of the mobile device user interface, and displaying a warning message.
14. A method for distraction mitigation comprising:
emitting one or more signals containing location information receivable by a mobile device;
emitting a signal containing zone information defining one or more zones in a vehicle receivable by the mobile device;
receiving from the mobile device an identifier of an occupied zone, of the one or more zones, that the mobile device is located within; and
modifying the behavior of one or more in-vehicle systems responsive to the occupied zone.
15. The method for distraction mitigation of claim 14, where the location information and the zone information may be contained in a common signal.
16. The method for distraction mitigation of claim 14, where the identifier of the occupied zone is a location of the mobile device from which the occupied zone can be determined.
17. The method for distraction mitigation of claim 14, where the behavior of the in-vehicle system is modified to mitigate distraction of an operator of the vehicle.
18. The method for distraction mitigation of claim 14, where modifying the behavior of the in-vehicle system includes any of: disabling one or more applications, disabling one or more features within any of one or more applications, blocking one or more functions of the in-vehicle system, locking the in-vehicle system, disabling one or more modes of the in-vehicle system user interface, displaying a warning message, modifying interactions with the mobile device and configuring the in-vehicle system in accordance with user preferences associated with the mobile device.
19. The method for distraction mitigation of claim 14, where the zone information includes one or more of: a zone identifier, a zone boundary definition, a zone type, and zone restrictions.
20. The method for distraction mitigation of claim 19, where the zone type includes any of: a class having pre-defined privileges or restrictions, and a role having associated privileges or restrictions.
21. A system for distraction mitigation comprising:
a location information source to provide location information from which a location in a vehicle can be determined;
a zone information source to provide zone information defining one or more zones in the vehicle;
a plurality of first emitters to emit one or more signals containing the location information;
a second emitter to emit a signal containing the zone information;
a receiver to receive an identifier of an occupied zone, of the one or more zones, for a mobile device within the vehicle determined from the emitted location information and emitted zone information; and
a distraction controller to modify the behavior of one or more in-vehicle systems responsive to the occupied zone.
22. The system for distraction mitigation of claim 21, where the second emitter comprises one or more of the plurality of first emitters.
23. The system for distraction mitigation of claim 21, where the behavior of the one or more in-vehicle systems is modified to mitigate distraction of an operator of the vehicle.
24. The system for distraction mitigation of claim 21, where modifying the behavior of the one or more in-vehicle systems includes any of: disabling one or more applications, disabling one or more features within any of one or more applications, blocking one or more functions of the in-vehicle system, locking the in-vehicle system, disabling one or more modes of the in-vehicle system user interface, displaying a warning message, modifying interactions with the mobile device and configuring the in-vehicle system in accordance with user preferences associated with the mobile device.
US14/200,902 2014-03-07 2014-03-07 System and method for distraction mitigation Active 2034-09-16 US9674337B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/200,902 US9674337B2 (en) 2014-03-07 2014-03-07 System and method for distraction mitigation
EP15157968.7A EP2953385B1 (en) 2014-03-07 2015-03-06 System and method for distraction mitigation


Publications (2)

Publication Number Publication Date
US20150256669A1 true US20150256669A1 (en) 2015-09-10
US9674337B2 US9674337B2 (en) 2017-06-06



Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017011672A1 (en) 2015-07-14 2017-01-19 Driving Management Systems, Inc. Detecting the location of a phone using rf wireless and ultrasonic signals
US20180252796A1 (en) * 2017-03-03 2018-09-06 Driving Management Systems, Inc. Sonar data communication and localization
CN108650660A (en) * 2018-03-30 2018-10-12 Banma Network Technology Co., Ltd. In-vehicle network configuration system and method
US10880686B1 (en) * 2020-01-07 2020-12-29 BlueOwl, LLC Systems and methods for determining a vehicle driver using at least peer-to-peer network signals
US11700506B2 (en) * 2020-01-07 2023-07-11 BlueOwl, LLC Systems and methods for determining a vehicle driver using at least peer-to-peer network signals

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070200672A1 (en) * 2006-02-24 2007-08-30 Denso International America, Inc. Apparatus for automatically initiating sequence of vehicle functions
US20080214211A1 (en) * 1999-08-27 2008-09-04 Lipovski Gerald John Jack System for enabling or restricting certain cellular telephone device capabilities in certain zones
US20110105097A1 (en) * 2009-10-31 2011-05-05 Saied Tadayon Controlling Mobile Device Functions
US20120006610A1 (en) * 2010-07-09 2012-01-12 Erik Wallace Telematics enhanced mobile device safety interlock
US20120071151A1 (en) * 2010-09-21 2012-03-22 Cellepathy Ltd. System and method for selectively restricting in-vehicle mobile device usage
US20120214515A1 (en) * 2011-02-23 2012-08-23 Davis Bruce L Mobile Device Indoor Navigation
US20120253552A1 (en) * 2010-07-01 2012-10-04 Clay Skelton Systems, devices and methods for vehicles
US20130165178A1 (en) * 2011-12-22 2013-06-27 Samsung Electronics Co., Ltd. Apparatus and method for adjusting volume in a portable terminal
US20140113619A1 (en) * 2009-07-21 2014-04-24 Katasi Llc Method and system for controlling and modifying driving behaviors
US20140314250A1 (en) * 2013-04-22 2014-10-23 Electronics And Telecommunications Research Institute Position estimation system using an audio-embedded time-synchronization signal and position estimation method using the system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2847044B1 (en) 2012-04-30 2018-08-22 Ford Global Technologies, LLC Apparatus and method for detecting a personal communication device in a vehicle
CA2874651A1 (en) 2012-05-30 2013-12-05 Flextronics Ap, Llc Control of device features based on vehicle state
US20130336094A1 (en) * 2012-06-08 2013-12-19 Rutgers, The State University Of New Jersey Systems and methods for detecting driver phone use leveraging car speakers
US20150204965A1 (en) * 2012-07-06 2015-07-23 Toyota Jidosha Kabushiki Kaisha Position specification system and method

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080214211A1 (en) * 1999-08-27 2008-09-04 Lipovski Gerald John Jack System for enabling or restricting certain cellular telephone device capabilities in certain zones
US20070200672A1 (en) * 2006-02-24 2007-08-30 Denso International America, Inc. Apparatus for automatically initiating sequence of vehicle functions
US20140113619A1 (en) * 2009-07-21 2014-04-24 Katasi Llc Method and system for controlling and modifying driving behaviors
US20110105097A1 (en) * 2009-10-31 2011-05-05 Saied Tadayon Controlling Mobile Device Functions
US20120253552A1 (en) * 2010-07-01 2012-10-04 Clay Skelton Systems, devices and methods for vehicles
US20120006610A1 (en) * 2010-07-09 2012-01-12 Erik Wallace Telematics enhanced mobile device safety interlock
US20120071151A1 (en) * 2010-09-21 2012-03-22 Cellepathy Ltd. System and method for selectively restricting in-vehicle mobile device usage
US20120214515A1 (en) * 2011-02-23 2012-08-23 Davis Bruce L Mobile Device Indoor Navigation
US20130165178A1 (en) * 2011-12-22 2013-06-27 Samsung Electronics Co., Ltd. Apparatus and method for adjusting volume in a portable terminal
US20140314250A1 (en) * 2013-04-22 2014-10-23 Electronics And Telecommunications Research Institute Position estimation system using an audio-embedded time-synchronization signal and position estimation method using the system

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9402164B2 (en) * 2011-12-16 2016-07-26 Continental Automotive Systems, Inc. Configurable traffic zone control system
US20150045987A1 (en) * 2011-12-16 2015-02-12 Continental Automotive Systems, Inc. Configurable traffic zone control system
US10685638B2 (en) 2013-05-31 2020-06-16 Nokia Technologies Oy Audio scene apparatus
US10204614B2 (en) * 2013-05-31 2019-02-12 Nokia Technologies Oy Audio scene apparatus
US20160125867A1 (en) * 2013-05-31 2016-05-05 Nokia Technologies Oy An Audio Scene Apparatus
US10052935B2 (en) * 2016-01-20 2018-08-21 Livio, Inc. Feature description data for vehicle zone configuration
CN107027171A (en) * 2016-01-20 2017-08-08 Myine Electronics Co., Ltd. Feature description data for vehicle zone configuration
DE102016204996B3 (en) * 2016-03-24 2017-05-11 Volkswagen Aktiengesellschaft Device, method and computer program for grouping devices by location
US10080101B2 (en) 2016-03-24 2018-09-18 Volkswagen Ag Device, method, and computer program for grouping devices by location
EP3255868A1 (en) * 2016-06-06 2017-12-13 Toyota Motor Engineering & Manufacturing North America, Inc. Methods, systems, and vehicles for disabling a function of one or more mobile devices within a passenger cabin of a vehicle
US11277725B2 (en) 2016-09-01 2022-03-15 Jaguar Land Rover Limited Apparatus and method for interfacing with a mobile device
GB2553325B (en) * 2016-09-01 2020-03-04 Jaguar Land Rover Ltd Apparatus and method for interfacing with a mobile device
GB2553325A (en) * 2016-09-01 2018-03-07 Jaguar Land Rover Ltd Apparatus and method for interfacing with a mobile device
US10212274B2 (en) * 2017-06-08 2019-02-19 Khaled A. ALGHONIEM Systems and methodologies for controlling an electronic device within a vehicle
WO2019018823A1 (en) * 2017-07-20 2019-01-24 Driving Management Systems, Inc. Detection and location of a mobile device using sound
US10988114B2 (en) * 2017-11-30 2021-04-27 Continental Automotive France Method for activating at least one function of a piece of equipment of a vehicle
DE102018219672A1 (en) * 2018-11-16 2020-05-20 Zf Friedrichshafen Ag Method and system for detecting a driver's distraction in a vehicle
CN110366156A (en) * 2019-08-26 2019-10-22 iFLYTEK (Suzhou) Technology Co., Ltd. Vehicle Bluetooth communication processing method, on-board audio management system, and related devices
NO20191341A1 (en) * 2019-11-13 2021-05-14 Kolseth Jon A Method and app for reducing distractions while driving
NO345767B1 (en) * 2019-11-13 2021-07-19 Kolseth Jon A Method and app for reducing distractions while driving
CN111717083A (en) * 2020-06-17 2020-09-29 Guangzhou Xiaopeng Internet of Vehicles Technology Co., Ltd. Vehicle interaction method and vehicle
WO2023056134A1 (en) * 2021-09-30 2023-04-06 Qualcomm Incorporated Method and apparatus for relative positioning of a user equipment in a vehicle

Also Published As

Publication number Publication date
US9674337B2 (en) 2017-06-06
EP2953385B1 (en) 2019-12-04
EP2953385A1 (en) 2015-12-09

Similar Documents

Publication Publication Date Title
US9674337B2 (en) System and method for distraction mitigation
US11303747B2 (en) System and method for limiting usage of a wireless communication device
US10096249B2 (en) Method, apparatus and storage medium for providing collision alert
US20150365743A1 (en) Method and apparatus for including sound from an external environment into a vehicle audio system
US10490072B2 (en) Extended range vehicle horn
US20200296205A1 (en) Anti-distracted driving systems and methods
US20160257198A1 (en) In-vehicle component user interface
US20140365073A1 (en) System and method of communicating with vehicle passengers
US10447846B2 (en) Anti-distracted driving systems and methods
CN103889767A (en) Apparatus and method for control of presentation of media to users of a vehicle
US9561778B2 (en) Method of selecting and stopping a vehicle using vehicle-to-vehicle communication
US20140155052A1 (en) Mobile device services control system and method
JP2015517948A (en) On-board vehicle control system and method
JP2018055446A (en) Vehicle operation management system
US10710456B2 (en) Mobile device monitoring during vehicle operation
US9914418B2 (en) In-vehicle control location
US20210092522A1 (en) System, method, and computer readable storage medium for controlling an in car communication system
KR20150108618A (en) Method for configuring dynamic user interface of head unit in vehicle by using mobile terminal, and head unit and computer-readable recoding media using the same
US9966951B2 (en) System for allowing a user to wirelessly manage software applications of a computing device and plurality of vehicles sensors
US10326878B2 (en) Anti-distracted driving systems and methods
JP2015134556A (en) On-vehicle apparatus and method of suppressing operation in the same
CN112061024A (en) Vehicle external speaker system
US10175809B2 (en) Vehicle electronic mobile device systems
JP2015097012A (en) On-vehicle unit
CN107229445B (en) Playback source volume control method and device

Legal Events

Date Code Title Description
AS Assignment

Owner name: QNX SOFTWARE SYSTEMS LIMITED, CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JAMES, ALEXANDRE;RIGLEY, MARK JOHN;VINCENT, ROBERT;SIGNING DATES FROM 20140227 TO 20140307;REEL/FRAME:032551/0941

AS Assignment

Owner name: 2236008 ONTARIO LIMITED, ONTARIO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:QNX SOFTWARE SYSTEMS LIMITED;REEL/FRAME:035700/0845

Effective date: 20150520

AS Assignment

Owner name: 2236008 ONTARIO INC., ONTARIO

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE CORPORATE IDENTIFIER INADVERTENTLY LISTED ON THE ASSIGNMENT AND COVERSHEET AS "LIMITED" PREVIOUSLY RECORDED ON REEL 035700 FRAME 0845. ASSIGNOR(S) HEREBY CONFIRMS THE IDENTIFIER SHOULD HAVE STATED "INC.";ASSIGNOR:QNX SOFTWARE SYSTEMS LIMITED;REEL/FRAME:035785/0156

Effective date: 20150527

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: BLACKBERRY LIMITED, ONTARIO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:2236008 ONTARIO INC.;REEL/FRAME:053313/0315

Effective date: 20200221

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4