US9475390B2 - Method and device for providing a user interface in a vehicle - Google Patents

Method and device for providing a user interface in a vehicle Download PDF

Info

Publication number
US9475390B2
US9475390B2; application US13/383,185 (US201013383185A)
Authority
US
United States
Prior art keywords
gesture
view
display
driver
passenger
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US13/383,185
Other versions
US20120274549A1 (en)
Inventor
Ulrike Wehling
Thomas Fabian
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Volkswagen AG
Original Assignee
Volkswagen AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Volkswagen AG filed Critical Volkswagen AG
Assigned to VOLKSWAGEN AG. Assignment of assignors interest (see document for details). Assignors: FABIAN, THOMAS; WEHLING, ULRIKE
Publication of US20120274549A1 publication Critical patent/US20120274549A1/en
Application granted granted Critical
Publication of US9475390B2 publication Critical patent/US9475390B2/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60K: ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Arrangement of adaptations of instruments
    • B60K35/10, B60K35/22, B60K35/60, B60K35/654, B60K35/656, B60K35/80
    • B60K37/00 Dashboards
    • B60K37/04 Arrangement of fittings on dashboard
    • B60K37/06 Arrangement of fittings on dashboard of controls, e.g. control knobs
    • B60K15/00 Arrangement in connection with fuel supply of combustion engines or other fuel consuming energy converters, e.g. fuel cells; Mounting or construction of fuel tanks
    • B60K15/03 Fuel tanks
    • B60K2015/03032 Manufacturing of fuel tanks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • Indexing codes: B60K2350/1004, B60K2350/1012, B60K2350/1024, B60K2350/2013, B60K2350/203, B60K2350/2047, B60K2350/206, B60K2350/35, B60K2350/355, B60K2350/901, B60K2350/903, B60K2360/11, B60K2360/141, B60K2360/143, B60K2360/1526, B60K2360/20, B60K2360/21, B60K2360/33, B60K2360/333, B60K2360/55, B60K2360/785


Abstract

In a method for providing a user interface in a vehicle, control objects and/or display objects can be displayed on a display surface (1) arranged in the vehicle, a gesture in a detection space (12) arranged in front of the display surface (1) is detected, the detected gesture is associated with a control object and/or display object and a control command, and the control command is carried out.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a U.S. National Stage Application of International Application No. PCT/EP2010/059744 filed Jul. 7, 2010, which designates the United States of America, and claims priority to German Application No. 10 2009 032 069.5 filed Jul. 7, 2009, the contents of which are hereby incorporated by reference in their entirety.
TECHNICAL FIELD
The present invention relates to a method and a device for providing a user interface in a vehicle. The device comprises a display surface, which is arranged in the vehicle. Operating objects and/or display objects may be displayed by means of the display surface.
BACKGROUND
In a vehicle, in particular in a motor vehicle, a plurality of electronic devices is provided whose operation has to be made possible for the driver or another passenger. Amongst these devices are, for example, a navigation system, a plurality of driver assistance systems, and communications and multi-media applications, which comprise, for example, a telephone system and devices for the playback of music and speech, such as a radio or a CD player.
In order to operate the manifold devices in the vehicle, multi-functional operating systems are often employed, which comprise one or more multi-functional displays and operating elements by which the manifold devices integrated in the vehicle may be operated. In this case, the operation is based on and guided by information displayed on the multi-functional display(s). Furthermore, the operating system may be used to select which information shall be displayed on the multi-functional display.
Known from EP 0 366 132 B1 is a multi-functional operating device in which the selection of function groups and of individual functions is carried out by means of a turn-push switch, wherein the switch may also be actuated in the direction of the axis of rotation. Known from DE 199 44 324 is a multi-functional operating device which comprises a rotary control switch for selecting functions that are represented within a view box of a display. Arranged around the rotary control switch are push-buttons, to which view boxes of the display are also assigned. Finally, known from DE 103 24 579 A1 is an operating device for controlling vehicle equipment comprising a touch-sensitive control panel.
Besides the above-described operating elements, which are arranged separately from the display, it has further been suggested to provide the display itself with a touch-sensitive surface and thereby provide a so-called touchscreen. With such a touchscreen, operation is carried out by the user touching the touchscreen with a finger tip, for example. The position of the contact and, where applicable, the movement during the contact are detected, evaluated and assigned to an operating step. In order to support the user during the operation, virtual switches may be displayed on the display as graphical push buttons. A display device comprising a touch-sensitive surface employed in connection with a navigation system is described in DE 10 2005 020 155 A1, for example.
Very specific requirements arise for operating the manifold devices of a vehicle, since the operation may be carried out by the driver, amongst others. It is therefore very important that the operating activity does not distract the driver while driving. The operating activity should in particular require as little attention from the driver as possible and, in addition, should be quickly accomplishable.
To provide an easy to operate user interface in a vehicle a device for detecting the position of a hand is proposed in DE 100 22 321 A1. The device comprises an apparatus for translating the detected position of the hand into an instruction to a processing unit, by means of which a vehicle function may be controlled.
SUMMARY
According to various embodiments, a method and a device can be provided, by means of which a user interface in a vehicle may be easily and intuitively controlled by a user.
According to an embodiment, in a method for providing a user interface in a vehicle: operating objects and/or display objects are denotable on a display surface arranged in the vehicle; a gesture is detected in a detection space, which is arranged in front of the display surface; the detected gesture is assigned to an operating object and/or a display object and a control command; and the control command is executed.
According to a further embodiment, the gesture may comprise a movement carried out by the hand of a user. According to a further embodiment, the operating object may comprise a motion element controllable by means of the gesture, and the movement carried out during the gesture moves the motion element on the display surface, whereby the control command is generated. According to a further embodiment, a gesture comprising a horizontal movement may be assigned to a control command which causes an animated display object to be shown, by means of which vehicle functions are explainable in an optical and/or acoustic manner. According to a further embodiment, it may be detected which passenger carries out the gesture, and with a waving closer gesture the animated display object is shown, while with a waving away gesture the animated display object is removed. According to a further embodiment, different display contents can be denotable simultaneously for the driver's side and the co-driver's side, and a gesture comprising a horizontal movement in the direction of the co-driver is assigned to a control command that causes the display content for the driver's side also to be displayed for the co-driver's side. According to a further embodiment, a part of a listing can be displayed on the display surface in which a scrolling is producible in a scrolling direction, and a scrolling is created when a gesture has been detected which comprises a movement in the scrolling direction. According to a further embodiment, an operating object for adjusting the loudness of an acoustic output can be displayed, and a control command that causes the loudness to be increased is assigned to a gesture comprising a movement in a defined direction, while a control command that causes the loudness to be decreased is assigned to a gesture comprising a movement in the opposite direction. According to a further embodiment, an operating object for muting an acoustic output can be displayed, and a control command that causes a muting is assigned to a gesture comprising a movement in a defined direction, while a control command that neutralizes the muting is assigned to a gesture comprising a movement in the opposite direction. According to a further embodiment, an operating object for signaling an incoming telephone call can be displayed, and a control command that causes the telephone call to be accepted is assigned to a gesture in which a movement is carried out that corresponds to taking off a handset.
According to another embodiment, a device for providing a user interface in a vehicle may comprise: a display surface for displaying operating objects and/or display objects, wherein the display surface is arranged in the vehicle; a gesture detection device for detecting gestures in a detection space that is arranged in front of the display surface; an analysis apparatus by means of which the detected gestures may be assigned to operating objects and/or display objects and a control command; and a control device by means of which the control command is executable.
According to a further embodiment of the device, the display surface can be configured such that different display contents simultaneously are denotable for different viewing angles.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention is now described by way of exemplary embodiments with respect to the drawings.
FIG. 1 schematically shows an exemplary embodiment of the device,
FIG. 2 shows the display device of a further exemplary embodiment of the device, and
FIG. 3 schematically shows a part of the device of the further exemplary embodiment.
DETAILED DESCRIPTION
With respect to FIG. 1, a first exemplary embodiment of the device and the method is described:
In the method according to various embodiments, operating objects and/or display objects are denotable on a display surface arranged in the vehicle. Furthermore, a gesture is detected in a detection space, which is arranged in front of the display surface.
The detected gesture is assigned to an operating object and/or a display object and a control command. Subsequently, the control command is executed.
Understood as a gesture in the sense of the invention is a certain posture of the hand of a user or a certain movement, which is carried out by the hand of the user. The gestures are carried out within the detection space in front of the display surface such that no contact with the display surface is required. By means of controlling the user interface of the vehicle through gestures, the user is provided with a particularly simple and intuitive input alternative for controlling the display content on the display surface or the devices of the vehicle.
According to an embodiment of the method, the gesture comprises a movement carried out by the hand of a user. The direction of movement of the gesture is in particular connected with a direction of movement or a function which is assigned to the gesture. In particular, the direction of movement may be directly coupled with the operating object. That is to say, the operating object may comprise a motion element controllable by means of the gesture. In this case, the movement carried out during the gesture moves the motion element on the display surface, whereby the control command is created. For example, the operating object may be a rotary control switch or a sliding switch represented on the display surface. Such an operating element may be actuated by a gesture movement that corresponds to sliding or rotating the operating object, wherein the respective movement is carried out contactlessly by the user in the detection space. Such an actuation of an operating object displayed on the display surface offers the advantage that the user does not have to hit a certain surface area, as for example with a touch-sensitive surface. Rather, it is sufficient that he/she moves his/her hand into the detection space and carries out the gesture assigned to the operating object there. The gesture in particular corresponds to a movement which is, for example, carried out with mechanical operating elements, so that the user may easily memorize these gestures.
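For illustration only, the following minimal sketch shows how a contactless gesture movement could drive such a motion element; the class, the millimeter units and the normalized value range are assumptions made for this example and are not taken from the patent:

```python
# Minimal sketch: mapping a contactless hand movement onto a virtual
# sliding switch. All names and the 0.0..1.0 value range are
# illustrative assumptions.

class VirtualSlider:
    def __init__(self, value=0.5, track_length_mm=100.0):
        self.value = value                    # normalized position 0.0 .. 1.0
        self.track_length_mm = track_length_mm

    def apply_gesture(self, dx_mm):
        """Translate a horizontal hand displacement (in mm) into slider
        movement and return the resulting control value."""
        self.value = min(1.0, max(0.0, self.value + dx_mm / self.track_length_mm))
        return self.value

slider = VirtualSlider()
# A hand moved 25 mm to the right inside the detection space:
print(slider.apply_gesture(25.0))   # -> 0.75
```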
According to a further embodiment of the method, a control command is assigned to a gesture comprising a horizontal movement, which causes the presentation of an animated display object by means of which vehicle functions may be explained optically and/or acoustically. In particular, the display object is a virtual assistant, or a so-called avatar, which supports the passengers. For example, with a waving closer gesture the animated display object is shown; with a waving away gesture the animated display object is removed. In order to be able to decide whether the gesture is a waving closer gesture or a waving away gesture, it is additionally detected which passenger has carried out the gesture.
According to a further embodiment of the method, different display contents are denotable simultaneously for the driver's side and the co-driver's side. In this case the display surface is a so-called dual-view display. A gesture comprising a horizontal movement in the direction of the co-driver may be assigned a control command which causes the display content for the driver's side also to be displayed for the co-driver's side. Inversely, a gesture comprising a horizontal movement in the direction of the driver may be assigned a control command which causes the display content for the co-driver's side also to be displayed for the driver's side.
According to a further embodiment of the method, a part of a listing is displayed on the display surface. In order to show another part of the listing, a scrolling in a scrolling direction is producible. With the method according to various embodiments, the scrolling may be created by detecting a gesture which comprises a movement in the scrolling direction. For example, when a listing is shown in which the list entries are arranged vertically one below the other, a scrolling may be created by carrying out a gesture comprising a vertical movement downwards or upwards, respectively, within the detection space.
According to a further embodiment of the method, an operating object for adjusting the loudness of an acoustic output is displayed. In this case, a gesture comprising a movement in a defined direction is assigned a control command which causes the loudness to be increased, and a gesture comprising a movement in the opposite direction is assigned a control command which causes the loudness to be decreased.
Furthermore, an operating object for muting an acoustic output may be displayed. In this case, a gesture comprising a movement in a defined direction is assigned a control command which causes a muting, and a gesture comprising a movement in the opposite direction is assigned a control command which neutralizes the muting. In particular, the direction for increasing the loudness and for neutralizing the muting is a vertical direction upwards, and the opposite direction for decreasing the loudness and for muting the acoustic output is a vertical direction downwards.
According to a further embodiment of the method, an operating object for signaling an incoming telephone call is displayed. In this case, a gesture during which a movement is carried out that corresponds to taking off a handset is assigned a control command which causes the telephone call to be accepted. Furthermore, a gesture during which a movement is carried out that corresponds to hanging up a handset may be assigned a control command which causes an ongoing telephone conversation to be terminated or an incoming call to be rejected.
The device according to various embodiments is configured such that it may carry out the above method in part or as a whole. It comprises a display surface for displaying operating objects and/or display objects. The display surface is arranged in the vehicle. The device further comprises a gesture detection device for detecting gestures in a detection space that is arranged in front of the display surface. By means of an analysis apparatus, the detected gestures may be assigned to operating objects and/or display objects and, in addition, to a control command. By means of a control device, the control command may be executed.
According to an embodiment of the device the display surface is configured such that different display contents simultaneously are denotable for different viewing angles. In particular, here the viewing angles on the one hand correspond to the angle from which the driver is viewing the display surface and on the other hand to the angle from which the co-driver is viewing the display surface.
The device serves to provide a user interface in a vehicle.
It comprises a display device 10 with a display surface 1. The display surface 1 may be provided by a display of any kind of construction. The display device 10 is connected to a control device 8. The control device 8 creates graphics data, which are represented in a visible manner to the passengers in the passenger compartment of the vehicle by means of the display surface 1. In particular, operating objects and display objects for the user interface may be displayed. These operating and display objects support the user in controlling devices of the vehicle. Further, the display objects serve to communicate information.
The control device 8 further is connected to a vehicle bus 9. The control device 8 may exchange data with other devices of the vehicle in a bidirectional manner via this connection. In particular, data may be transmitted to the control device 8, which are processed by the control device 8 such that a display content is created on the display surface 1, which supports the user in controlling the vehicle equipment.
Furthermore, a gesture detection device 11 is provided, with the help of which gestures of a user may be detected in a detection space 12. The detection space 12 is arranged in front of the display surface 1. The gesture detection device 11 is part of an input device for the user interface. The user may carry out gestures in the detection space 12 to control the representation on the display surface 1 as well as to control further devices of the vehicle as described later.
For example, the gesture detection device 11 may comprise infrared light sources and detectors for infrared light. Alternatively, the gesture detection device 11 may comprise an optical system which comprises a camera that records the gesture carried out in the detection space 12. Furthermore, the optical system may comprise a light emitting diode that emits, for example, rectangularly amplitude-modulated light. This light is reflected at the hand of a user who carries out the gesture in the detection space 12 and, after the reflection, reaches a photo diode of the gesture detection device 11. A further light emitting diode also emits rectangularly amplitude-modulated light towards the photo diode, which, however, is shifted in its phase by 180°. Both light signals are superimposed at the photo diode and cancel each other out if they have exactly the same amplitude. If the signals at the photo diode do not cancel each other out, the light emission of the second diode is adjusted via a control circuit such that the overall received signal again sums to zero. When the position of the hand of the user in the detection space 12 changes, the part of the light that reaches the photo diode from the first light emitting diode via the reflection at the hand also changes. This causes the control circuit to adjust the intensity of the second light emitting diode. The control signal is therefore a measure of the reflection, at the hand of the user carrying out the gesture, of the light emitted by the first diode. In this manner, a signal that is representative of the position of the hand of the user may be derived from the control circuit.
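The closed-loop principle described above can be illustrated with a minimal sketch; the simplified proportional controller and the loop gain are assumptions made for this example, not details taken from the patent:

```python
# Minimal sketch of the cancellation principle: a second, 180-degree
# phase-shifted light source is regulated until the summed photodiode
# signal is zero, so its drive level tracks the hand reflection.
# Gain and loop structure are illustrative assumptions.

def track_reflection(reflection_samples, gain=0.5):
    """Return the control values of the second LED for a sequence of
    reflection amplitudes caused by the user's hand."""
    led2 = 0.0
    controls = []
    for reflected in reflection_samples:
        # Superposition at the photodiode: in-phase reflected light
        # minus the anti-phase light of the second LED.
        error = reflected - led2
        led2 += gain * error      # control circuit drives the sum to zero
        controls.append(led2)
    return controls

# Hand approaching the display: reflection grows, control signal follows.
print(track_reflection([0.1, 0.4, 0.8, 0.8, 0.3]))
```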
The gesture detection device 11 is connected to the control device 8 and an analysis apparatus 13. The analysis apparatus 13 further is connected to the display device 10. The display device 10 transmits the display content to the analysis apparatus 13. Depending on the currently displayed operating object and display object, respectively, the analysis apparatus 13 assigns a control command to the detected gesture. The analysis apparatus 13 transmits this control command to the control device 8, which executes the control command. For this purpose, the control device 8 transmits respective data to the vehicle bus 9 and/or changes the display content displayed by the display surface 1.
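For illustration, here is a minimal sketch of this chain: detection, context-dependent assignment by the analysis apparatus, execution by the control device. The dispatch table and all names are hypothetical; the same sketch also anticipates the loudness and muting examples below:

```python
# Minimal sketch of the signal path: gesture detection device reports a
# gesture, the analysis apparatus assigns a control command depending on
# the currently displayed object, the control device executes it.
# All names and table entries are illustrative assumptions.

class AnalysisApparatus:
    # The same hand movement maps to different commands depending on
    # what the display surface currently shows.
    TABLE = {
        ("volume_knob", "rotate_clockwise"): "increase_loudness",
        ("volume_knob", "rotate_counterclockwise"): "decrease_loudness",
        ("mute_object", "move_down"): "mute",
        ("mute_object", "move_up"): "unmute",
    }

    def assign(self, display_content, gesture):
        return self.TABLE.get((display_content, gesture))

class ControlDevice:
    def execute(self, command):
        # In the device this would mean writing to the vehicle bus 9
        # and/or updating the display content; here we just print.
        if command is not None:
            print(f"executing: {command}")

analysis, control = AnalysisApparatus(), ControlDevice()
control.execute(analysis.assign("volume_knob", "rotate_clockwise"))
```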
Described in the following are various examples of gestures and the control commands assigned to these gestures in an exemplary embodiment of the method:
Displayed on the display surface 1 may be an animated display object which optically and/or acoustically describes vehicle functions. The user may render this display object visible by carrying out a waving closer gesture in the detection space 12. Further, the user may remove this display object by carrying out a waving away gesture in the detection space 12. The waving closer or waving away gesture is detected by the gesture detection device 11 and, by means of the analysis apparatus 13, is assigned to the respective animated display object and to a control command which causes the animated display object to be displayed or removed. If the gesture may be carried out by different passengers, who in particular are sitting on the driver seat or the co-driver seat of the vehicle, it may also be detected which passenger has carried out the gesture, in order to be able to determine definitely whether it is a waving closer or a waving away gesture. How it is possible to detect which passenger carries out a gesture is described below with respect to the second exemplary embodiment.
With an operating system that is based on a hierarchical menu structure, it is often the case that a listing comprising several list entries is displayed. However, since the display surface 1 is limited, in many cases only a part of the complete list is shown. In this case the necessity arises to scroll the listing in order to render other list entries visible. For example, when the list entries are displayed one below the other, a scrolling downwards or upwards may be created. In order to create a scrolling in a downward direction, the user may carry out a gesture in the detection space 12 that comprises a downward movement of the hand. In connection with the display of the listing, this gesture is assigned to a control command which causes the downward scrolling. Accordingly, a gesture comprising an upward movement in connection with the listing is interpreted such that a scrolling in an upward direction is created.
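A minimal sketch of such list scrolling, assuming a vertically arranged listing of which only a window of entries fits on the display surface; the function and gesture names are illustrative, not from the patent:

```python
# Minimal sketch: shifting the visible window of a listing in the
# direction of a vertical hand movement. Names are assumptions.

def scroll(entries, offset, visible, gesture):
    """Return the visible window of the listing and the new offset
    after a vertical scrolling gesture."""
    if gesture == "move_down":
        offset = min(len(entries) - visible, offset + 1)
    elif gesture == "move_up":
        offset = max(0, offset - 1)
    return entries[offset:offset + visible], offset

items = [f"Station {i}" for i in range(1, 9)]
view, off = scroll(items, 0, 3, "move_down")
print(view)   # ['Station 2', 'Station 3', 'Station 4']
```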
Furthermore, operating objects may be displayed on the display surface which are actuated by means of gestures. For example, a virtual rotary switch may be displayed on the display surface 1. To actuate the rotary switch, the user carries out with his hand a clockwise rotary movement in the detection space 12, or a movement in the opposite rotary direction. In connection with the representation of the virtual rotary switch for the loudness of an acoustic output, this gesture is assigned to a control command which leads to the loudness being increased or decreased. In the same manner, an operating object may be displayed for muting an acoustic output, which may be activated and neutralized again by means of a gesture.
The control device 8 may further be coupled to a telecommunications device via the vehicle bus 9. The telecommunications device transmits a signal for an incoming call to the control device 8. Thereupon, the control device 8 creates a symbol which is displayed on the display surface 1 and signals the incoming call. When the user now carries out a gesture in the detection space 12 that corresponds to taking off a handset, this gesture is detected by the gesture detection device and, by the analysis apparatus 13, is assigned to a control command which causes the incoming call to be accepted. Accordingly, a gesture corresponding to hanging up a handset leads to an incoming call being rejected or a telecommunications connection being terminated.
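The handset gestures can be summarized as a small state machine; the following sketch uses invented state and gesture names and is only an illustration of the behavior described above:

```python
# Minimal sketch: the same pair of gestures covers accepting or
# rejecting an incoming call and terminating an ongoing conversation.
# State names are illustrative assumptions.

def handle_phone_gesture(state, gesture):
    """Return the new telephone state after a handset gesture."""
    if gesture == "lift_handset" and state == "ringing":
        return "in_call"          # incoming call accepted
    if gesture == "hang_up_handset":
        if state == "ringing":
            return "idle"         # incoming call rejected
        if state == "in_call":
            return "idle"         # ongoing conversation terminated
    return state

print(handle_phone_gesture("ringing", "lift_handset"))     # in_call
print(handle_phone_gesture("in_call", "hang_up_handset"))  # idle
```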
By means of further movement gestures, the detail of a geographic map of a navigation system may be moved, for example. Further, windows that have opened and contain different information may be closed by a movement gesture.
Finally, a gesture comprising a rotary movement of the hand may lead to a geographic map being rotated and aligned, respectively, or to a listing arranged in a perspective manner on a ring being rotated.
With respect to FIG. 2 and FIG. 3 a second exemplary embodiment of the device and the method is described:
The device of the further exemplary embodiment generally corresponds to the device of the first exemplary embodiment described with respect to FIG. 1. However, in the device of the further exemplary embodiment the display device 10 is configured such that different display contents are denotable simultaneously for different viewing angles A and B. Furthermore, with respect to the further exemplary embodiment a device is described, by means of which it may be detected whether the gesture in the detection space 12 is carried out by the driver 2 or the co-driver 3.
On the display surface 1 of the display device 10, different display contents are denotable, by means of an optical barrier, to observers to the right and to the left of the central line of sight. As can be seen from FIG. 2, the driver 2 sitting on the driver seat 4 views the display surface 1 from the viewing angle A, whereas the co-driver 3 sitting on the co-driver seat 5 views the display surface 1 from the viewing angle B. The optical barrier of the display device is arranged such that the driver 2 may see different display content in the angular range A than the co-driver 3 sees from the angular range B. Thus, the display device 10 may be activated by means of the control device such that different information may be displayed for the driver 2 and the co-driver 3 simultaneously.
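For illustration, a minimal sketch modeling the dual-view display as two independent content buffers; the optical barrier itself is hardware and is represented here only by selecting a buffer per viewing side (all names and contents are assumptions):

```python
# Minimal sketch: a dual-view display holds one content buffer per
# viewing angle; an observer's side selects which buffer is visible.

class DualViewDisplay:
    def __init__(self):
        self.content = {"driver": "navigation map", "codriver": "movie"}

    def view_from(self, side):
        """Return what an observer on the given side ('driver' or
        'codriver') currently sees."""
        return self.content[side]

display = DualViewDisplay()
print(display.view_from("driver"))    # navigation map
print(display.view_from("codriver"))  # movie
```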
The driver 2 or the co-driver 3, respectively, may view the display content of the respective other one by means of a gesture carried out in the detection space 12. For this purpose, the co-driver 3, for example, moves his/her hand into the detection space 12 in front of the display surface 1. In doing so, his/her finger tip 6 approaches the display surface 1. The seating position of the passenger who carries out a gesture (here, for example, the co-driver 3) may be determined as follows:
As shown in FIG. 3, an electrode device 7 is located in the seat 5 of the co-driver 3. By means of this electrode device 7, an identification code may be capacitively coupled into the body of the co-driver 3. The identification code may identify the seating position of the co-driver 3 as well as the co-driver 3 him-/herself. The identification code is transmitted via the body of the co-driver 3 and is capacitively coupled out at the finger tip 6, so that it may be transmitted to a receiving device accommodated in the display device 10.
The receiving device is connected to a control device 8, which in turn is coupled with the electrode device 7 in a capacitive manner. For the capacitive coupling between the electrode device 7 and the co-driver 3 on the one hand, and between the co-driver 3 and the receiving device in the display device 10 on the other hand, an electric field with a very limited range of, for example, several centimeters or decimeters is used. The range of this field substantially corresponds to the size of the detection space 12. Relatively low carrier frequencies of several hundred kHz are used for signal transmission, which result in quasi-static fields, i.e. fields to which the physical treatment valid for static fields largely applies. With regard to further details of this signal transmission, reference is made to DE 10 2004 048 956 A1 and the literature cited therein, which are hereby incorporated in the present application by reference. In particular, the circuit devices used in DE 10 2004 048 956 A1 may be applied. The seating position of the driver 2 may be determined in a corresponding manner.
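A minimal sketch of evaluating the transmitted identification code on the receiving side; the code values and the lookup table are invented for this example:

```python
# Minimal sketch: the receiver in the display device demodulates the
# identification code coupled out at the finger tip and looks up which
# seat injected it. Code values are illustrative assumptions.

SEAT_CODES = {0xA1: "driver seat", 0xB2: "co-driver seat"}

def identify_gesturing_passenger(received_code):
    """Map the received identification code to the seating position of
    the passenger performing the gesture."""
    return SEAT_CODES.get(received_code, "unknown")

print(identify_gesturing_passenger(0xB2))  # co-driver seat
```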
When the co-driver 3 now carries out a gesture comprising a horizontal movement in his/her direction inside the detection space 12, this gesture is assigned to a control command which causes the display content currently displayed on the driver's side also to be displayed on the co-driver's side, i.e. in viewing angle B. Conversely, when it is detected that the driver carries out a gesture comprising a horizontal movement towards the driver inside the detection space 12, this gesture is assigned to a control command that causes the display content of the co-driver to be displayed to the driver.
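Finally, a sketch of the view-sharing logic combining both pieces of information, namely who carried out the gesture and the horizontal direction of the movement; the names and the dictionary representation are assumptions for illustration:

```python
# Minimal sketch of the view-sharing command described above: the view
# of the other side replaces the gesturer's own view when the movement
# points toward the gesturer. All names are illustrative assumptions.

def share_view(content, gesturer, direction):
    """content: dict with 'driver' and 'codriver' views; returns the
    updated dict after a view-sharing gesture."""
    if gesturer == "codriver" and direction == "toward_codriver":
        content["codriver"] = content["driver"]
    elif gesturer == "driver" and direction == "toward_driver":
        content["driver"] = content["codriver"]
    return content

views = {"driver": "navigation map", "codriver": "movie"}
print(share_view(views, "codriver", "toward_codriver"))
# {'driver': 'navigation map', 'codriver': 'navigation map'}
```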
REFERENCE NUMERALS
  • 1 display surface
  • 2 driver
  • 3 co-driver
  • 4 driver seat
  • 5 co-driver seat
  • 6 finger tip of a user
  • 7 electrode device
  • 8 control device
  • 9 vehicle bus
  • 10 display device
  • 11 gesture detection device
  • 12 detection space
  • 13 analysis apparatus

Claims (19)

What is claimed is:
1. A method for providing a user interface in a vehicle, the method comprising:
simultaneously displaying a driver side display view and a passenger side display view on a same display surface arranged in the vehicle, wherein the driver side display view is visible to a driver but shielded from a passenger by an optical barrier, and the passenger side display view is visible to the passenger but shielded from the driver by the optical barrier, and wherein the driver side display view and the passenger side display view include different display contents, such that different information is simultaneously displayed to the driver and the passenger, the display contents comprising at least one of operating objects and display objects;
detecting an object-related gesture in a detection space located in front of the display surface without using a contact sensor, the detected object-related gesture being related to at least one of an operating object and a display object, and
identifying and executing a control command corresponding to the detected object-related gesture,
detecting a view-sharing gesture in the detection space without using a contact sensor,
detecting whether the driver or the passenger carries out the view-sharing gesture,
identifying and executing a view-sharing command corresponding to the detected view-sharing gesture, wherein executing the view-sharing command comprises (a) when the view-sharing gesture is carried out by the passenger and comprises a horizontal movement in the direction of the passenger side, causing the current display contents of the driver side display view to also be displayed in the passenger side display view, such that the current display contents in the passenger side display view are replaced by the current display contents in the driver side display view, and (b) when the view-sharing gesture is carried out by the driver and comprises a horizontal movement in the direction of the driver side, causing the current display contents of the passenger side display view to also be displayed in the driver side display view, such that the current display contents in the driver side display view are replaced by the current display contents in the passenger side display view.
2. The method according to claim 1, wherein each of the object-related gesture and the view-sharing gesture comprises a movement carried out by the hand of a user.
3. The method according to claim 2, wherein the operating object comprises a motion element controllable by the object-related gesture and the movement carried out during the object-related gesture moves the motion element on the display surface.
4. The method according to claim 1, wherein an object-related gesture comprising a horizontal movement is assigned to a control command that causes an animated display object for explaining vehicle functions to be shown in at least one of an optical and an acoustic manner.
5. The method according to claim 4, comprising detecting whether the object-related gesture is performed by a driver or a passenger, displaying an animated display in response to a waving closer gesture, and removing the animated display in response to a waving away gesture.
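A hedged illustration of claims 4 and 5 (hypothetical names; not part of the claims): waving closer shows the animated explanatory object, waving away removes it again:

```python
# Hypothetical sketch for claims 4 and 5: toggle the animated display
# object that explains vehicle functions optically and/or acoustically.

def on_wave_gesture(kind: str, animation_visible: bool) -> bool:
    """Return whether the animated display object should be visible."""
    if kind == "wave_closer":
        return True    # show the animated explanation
    if kind == "wave_away":
        return False   # remove it again
    return animation_visible  # any other gesture leaves the state unchanged
```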
6. The method according to claim 1, wherein a part of a listing is displayed on the display surface and wherein the method comprises detecting a scrolling gesture and scrolling the displayed listing in response to detecting the scrolling gesture.
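An illustrative sketch of claim 6, with hypothetical names: only a window of the listing is displayed, and a detected scrolling gesture shifts that window:

```python
# Hypothetical sketch for claim 6: a scrolling gesture moves the visible
# window over the listing; the offset is clamped to the listing bounds.

def scroll_listing(listing: list, offset: int, step: int, window: int = 4) -> int:
    """Return the new offset after a scrolling gesture.

    step is positive or negative depending on the gesture direction.
    """
    return max(0, min(offset + step, max(0, len(listing) - window)))

# Example: a downward scrolling gesture (step=+1) on a station list.
stations = ["Radio 1", "Radio 2", "Radio 3", "Radio 4", "Radio 5", "Radio 6"]
offset = scroll_listing(stations, offset=0, step=+1)
visible = stations[offset:offset + 4]  # the entries currently displayed
```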
7. The method according to claim 1, wherein an operating object for adjusting the loudness of an acoustic output is displayed, a control command that causes the loudness to be increased is assigned to a gesture comprising a movement in a defined direction, and a control command that causes the loudness to be decreased is assigned to a gesture comprising a movement in the opposite direction.
8. The method according to claim 1, wherein an operating object for muting an acoustic output is displayed, a control command that causes muting is assigned to a gesture comprising a movement in a defined direction, and a control command that neutralizes the muting is assigned to a gesture comprising a movement in the opposite direction.
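A hedged sketch of claims 7 and 8 (the direction names are hypothetical): opposite movement directions are bound to opposite commands, louder/softer and mute/unmute:

```python
# Hypothetical sketch for claims 7 and 8: a movement in a defined
# direction and a movement in the opposite direction are assigned to
# opposing control commands.

def on_volume_gesture(direction: str, volume: int) -> int:
    """Increase the loudness for one direction, decrease it for the opposite."""
    if direction == "up":
        return min(volume + 1, 10)
    if direction == "down":
        return max(volume - 1, 0)
    return volume

def on_mute_gesture(direction: str, muted: bool) -> bool:
    """One direction mutes the acoustic output, the opposite neutralizes it."""
    if direction == "toward_display":
        return True       # mute
    if direction == "away_from_display":
        return False      # neutralize the muting
    return muted
```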
9. The method according to claim 1, wherein an operating object for signaling an incoming telephone call is displayed, and a control command that causes the telephone call to be accepted is assigned to a gesture in which a movement is carried out that corresponds to taking off a handset.
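An illustrative, heavily simplified sketch of claim 9 (hypothetical names; a crude upward-arc check stands in for real gesture classification):

```python
# Hypothetical sketch for claim 9: a gesture whose trajectory resembles
# lifting a handset accepts the signaled incoming call.

def on_phone_gesture(trajectory: list, call_waiting: bool) -> bool:
    """Return True if the incoming call should be accepted.

    trajectory: sampled (x, y) hand positions from the detection space.
    """
    if not call_waiting or len(trajectory) < 2:
        return False
    # The hand ends higher than it started, as when taking off a handset.
    return trajectory[-1][1] > trajectory[0][1]

# Example: an upward movement while a call is signaled accepts the call.
accept = on_phone_gesture([(0.5, 0.2), (0.5, 0.6)], call_waiting=True)
```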
10. The method according to claim 1, wherein the steps of detecting a view-sharing gesture in the detection space and identifying and executing a view-sharing command comprise:
identifying a driver side view-sharing gesture comprising a horizontal movement in the direction of the passenger and, in response to the detected driver side view-sharing gesture, replacing the current display contents in the passenger side display view with the current display contents in the driver side display view, or
identifying a passenger side view-sharing gesture comprising a horizontal movement in the direction of the driver and, in response to the detected passenger side view-sharing gesture, replacing the current display contents in the driver side display view with the current display contents in the passenger side display view.
11. A device for providing a user interface in a vehicle, comprising:
a display surface configured to simultaneously display a driver side display view and a passenger side display view, wherein the driver side display view is visible to a driver but shielded from a passenger by an optical barrier, and the passenger side display view is visible to the passenger but shielded from the driver by the optical barrier, and wherein the driver side display view and the passenger side display view include different display contents, such that different information is simultaneously displayed to the driver and the passenger, the display contents comprising at least one of operating objects and display objects, wherein the display surface is arranged in the vehicle,
a gesture detection device configured to detect object-related gestures in a detection space that is arranged in front of the display surface without using a contact sensor, each object-related gesture being related to at least one of an operating object and a display object,
an analysis apparatus configured to identify a control command corresponding to each detected object-related gesture, and
a control device configured to execute each control command corresponding to each detected object-related gesture,
the gesture detection device further configured to detect a view-sharing gesture in the detection space and to detect whether the driver or the passenger has carried out the view-sharing gesture,
the analysis apparatus further configured to identify a view-sharing command corresponding to the detected view-sharing gesture, and
the control device further configured to execute the identified view-sharing command, wherein executing the view-sharing command comprises (a) when the view-sharing gesture is carried out by the passenger and comprises a horizontal movement in the direction of the passenger side, causing the current display contents of the driver side display view to also be displayed in the passenger side display view, such that the current display contents in the passenger side display view are replaced by the current display contents in the driver side display view, and (b) when the view-sharing gesture is carried out by the driver and comprises a horizontal movement in the direction of the driver side, causing the current display contents of the passenger side display view to also be displayed in the driver side display view, such that the current display contents in the driver side display view are replaced by the current display contents in the passenger side display view.
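By way of illustration only, a structural sketch of the device of claim 11, with the gesture detection device, analysis apparatus, and control device as cooperating components; all class and field names are hypothetical:

```python
# Hypothetical structural sketch for claim 11 (illustrative only).

class DisplaySurface:
    """Holds the two simultaneously displayed views with different contents."""
    def __init__(self) -> None:
        self.views = {"driver": "navigation map", "passenger": "movie"}

class AnalysisApparatus:
    """Identifies the command corresponding to a detected gesture."""
    def identify(self, gesture: dict):
        # A view-sharing gesture is a horizontal movement toward the
        # gesturer's own side; it copies the other side's content over.
        if (gesture["kind"] == "view_sharing"
                and gesture["direction"] == "toward_" + gesture["by"]):
            other = "driver" if gesture["by"] == "passenger" else "passenger"
            return ("copy_view", other, gesture["by"])
        return None

class ControlDevice:
    """Executes identified commands against the display surface."""
    def __init__(self, display: DisplaySurface) -> None:
        self.display = display
    def execute(self, command: tuple) -> None:
        if command[0] == "copy_view":
            _, src, dst = command
            self.display.views[dst] = self.display.views[src]

display = DisplaySurface()
control = ControlDevice(display)
cmd = AnalysisApparatus().identify(
    {"kind": "view_sharing", "by": "passenger", "direction": "toward_passenger"})
if cmd is not None:
    control.execute(cmd)   # the passenger view now also shows the driver's map
```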
12. The device according to claim 11, wherein each of the object-related gesture and the view-sharing gesture comprises a movement carried out by the hand of a user.
13. The device according to claim 11, wherein the operating object comprises a motion element controllable by an object-related gesture and the movement carried out during the object-related gesture moves the motion element on the display surface.
14. The device according to claim 11, wherein an object-related gesture comprising a horizontal movement is assigned to a control command that causes an animated display object for explaining vehicle functions to be shown in at least one of an optical and an acoustic manner.
15. The device according to claim 14, wherein the device is further configured to detect whether the object-related gesture is performed by a driver or a passenger, to display an animated display in response to a waving closer gesture, and to remove the animated display in response to a waving away gesture.
16. The device according to claim 11, wherein the device is further configured to display a part of a listing on the display surface, to detect a scrolling gesture, and to scroll the displayed listing in response to detecting the scrolling gesture.
17. The device according to claim 11, wherein the device is further configured to display an operating object for adjusting the loudness of an acoustic output, a control command that causes the loudness to be increased is assigned to a gesture comprising a movement in a defined direction, and a control command that causes the loudness to be decreased is assigned to a gesture comprising a movement in the opposite direction.
18. The device according to claim 11, wherein the device is further configured to display an operating object for muting an acoustic output, a control command that causes muting is assigned to a gesture comprising a movement in a defined direction, and a control command that neutralizes the muting is assigned to a gesture comprising a movement in the opposite direction.
19. The device according to claim 11, wherein the analysis apparatus and control device are configured to identify and execute a view-sharing command corresponding to the detected view-sharing gesture by:
identifying a driver side view-sharing gesture comprising a horizontal movement in the direction of the passenger and, in response to the detected driver side view-sharing gesture, replacing the current display contents in the passenger side display view with the current display contents in the driver side display view, or
identifying a passenger side view-sharing gesture comprising a horizontal movement in the direction of the driver and, in response to the detected passenger side view-sharing gesture, replacing the current display contents in the driver side display view with the current display contents in the passenger side display view.
US13/383,185 2009-07-07 2010-07-07 Method and device for providing a user interface in a vehicle Active 2031-07-22 US9475390B2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
DE102009032069.5 2009-07-07
DE102009032069A DE102009032069A1 (en) 2009-07-07 2009-07-07 Method and device for providing a user interface in a vehicle
DE102009032069 2009-07-07
PCT/EP2010/059744 WO2011003947A1 (en) 2009-07-07 2010-07-07 Method and device for providing a user interface in a vehicle

Publications (2)

Publication Number Publication Date
US20120274549A1 US20120274549A1 (en) 2012-11-01
US9475390B2 true US9475390B2 (en) 2016-10-25

Family

ID=42938504

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/383,185 Active 2031-07-22 US9475390B2 (en) 2009-07-07 2010-07-07 Method and device for providing a user interface in a vehicle

Country Status (6)

Country Link
US (1) US9475390B2 (en)
EP (1) EP2451672B1 (en)
KR (1) KR101460866B1 (en)
CN (1) CN102470757B (en)
DE (1) DE102009032069A1 (en)
WO (1) WO2011003947A1 (en)

Families Citing this family (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011010594A1 (en) * 2011-02-08 2012-08-09 Daimler Ag Method, apparatus and computer program product for driving a functional unit of a vehicle
DE102011089195A1 (en) * 2011-06-30 2013-01-03 Johnson Controls Gmbh Apparatus and method for the contactless detection of objects and / or persons and of gestures and / or operating processes carried out by them
DE102011112447A1 (en) 2011-09-03 2013-03-07 Volkswagen Aktiengesellschaft Method and arrangement for providing a graphical user interface, in particular in a vehicle
DE102011084345A1 (en) 2011-10-12 2013-04-18 Robert Bosch Gmbh Operating system and method for displaying a control surface
DE102011116122A1 (en) 2011-10-15 2013-04-18 Volkswagen Aktiengesellschaft Method for providing an operating device in a vehicle and operating device
DE102012000201A1 (en) 2012-01-09 2013-07-11 Daimler Ag Method and device for operating functions displayed on a display unit of a vehicle using gestures executed in three-dimensional space as well as related computer program product
EP2802924A1 (en) * 2012-01-09 2014-11-19 Audi AG Method for displacing at least one display content
JP5626259B2 (en) * 2012-05-22 2014-11-19 株式会社デンソー Image display device
DE102012216184A1 (en) 2012-09-12 2014-06-12 Bayerische Motoren Werke Aktiengesellschaft System for controlling air-conditioner and infotainment device of i.e. car by user, has control unit for receiving signal indicating detected gesture from recognition unit and controlling recognized function of air conditioner and device
KR101459445B1 (en) * 2012-12-18 2014-11-07 현대자동차 주식회사 System and method for providing a user interface using wrist angle in a vehicle
KR101916042B1 (en) * 2012-12-18 2018-11-07 현대자동차 주식회사 System and method for controlling electric equipment linked with transmission mode in a vehicle
US10029700B2 (en) * 2012-12-21 2018-07-24 Harman Becker Automotive Systems Gmbh Infotainment system with head-up display for symbol projection
DE102013001771B4 (en) 2013-01-31 2014-09-25 Audi Ag Method for operating a functional unit and functional unit
DE102013001868B4 (en) 2013-02-02 2021-03-25 Audi Ag Method for operating a motor vehicle using gesture control and motor vehicles with a gesture detection device
DE102013003033A1 (en) 2013-02-22 2014-08-28 Audi Ag Method for operating playback unit of device, particularly of motor vehicle, involves moving finger by user splayed out from his hand such that finger partially overlaps ear of user in predetermined gesture
US20140375543A1 (en) * 2013-06-25 2014-12-25 Honda Motor Co., Ltd. Shared cognition
DE102013013326B3 (en) * 2013-08-09 2014-12-24 Audi Ag Movable representation of information in the motor vehicle
CN105683901A (en) * 2013-09-27 2016-06-15 大众汽车有限公司 User interface and method for assisting a user when operating an operating unit
KR101777074B1 (en) * 2013-09-27 2017-09-19 폭스바겐 악티엔 게젤샤프트 User interface and method for assisting a user in the operation of an operator control unit
WO2015043652A1 (en) * 2013-09-27 2015-04-02 Volkswagen Aktiengesellschaft User interface and method for assisting a user with the operation of an operating unit
KR20150087544A (en) * 2014-01-22 2015-07-30 엘지이노텍 주식회사 Gesture device, operating method thereof and vehicle having the same
DE102014204800A1 (en) * 2014-03-14 2015-09-17 Volkswagen Aktiengesellschaft Method and apparatus for providing a graphical user interface in a vehicle
CN104216514A (en) * 2014-07-08 2014-12-17 深圳市华宝电子科技有限公司 Method and device for controlling vehicle-mounted device, and vehicle
US10146317B2 (en) 2014-12-12 2018-12-04 Ford Global Technologies, Llc Vehicle accessory operation based on motion tracking
DE102014225927A1 (en) 2014-12-15 2016-06-30 Volkswagen Aktiengesellschaft System and method for detecting the position of a touch on a surface in the interior of a vehicle
US9547373B2 (en) 2015-03-16 2017-01-17 Thunder Power Hong Kong Ltd. Vehicle operating system using motion capture
US9550406B2 (en) 2015-03-16 2017-01-24 Thunder Power Hong Kong Ltd. Thermal dissipation system of an electric vehicle
US9539988B2 (en) 2015-03-16 2017-01-10 Thunder Power Hong Kong Ltd. Vehicle camera cleaning system
DE102016108885A1 (en) 2016-05-13 2017-11-16 Visteon Global Technologies, Inc. Method for contactless moving of visual information
GB2550845B (en) * 2016-05-23 2020-03-18 Jaguar Land Rover Ltd User input system
DE102016211209A1 (en) 2016-06-23 2017-12-28 Zf Friedrichshafen Ag Control of a motor vehicle
US10920049B2 (en) 2016-09-09 2021-02-16 Leoni Kabel Gmbh Polymer composition with high flexibility and flame retardancy
EP3510096B1 (en) 2016-09-09 2023-11-01 LEONI Kabel GmbH Strand-shaped elements and polymer composition for preparing same
WO2018046098A1 (en) 2016-09-09 2018-03-15 Leoni Kabel Gmbh Elongated article with good flexibility and high flame retardancy
US11248111B2 (en) 2016-09-09 2022-02-15 Leoni Kabel Gmbh Conjunction device such as a cable and polymer composition for preparing same
DE102017203173B4 (en) 2017-02-27 2019-03-07 Audi Ag Motor vehicle with a display device and method for operating a display device of a motor vehicle

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030095154A1 (en) * 2001-11-19 2003-05-22 Koninklijke Philips Electronics N.V. Method and apparatus for a gesture-based user interface
DE102004048956A1 (en) 2004-10-07 2006-04-27 Ident Technology Ag Signal transferring method using human body, integrates condenser device that is used as signal interface in oscillating circuit, where circuit is operated as parallel or serial oscillating circuit in receive or transmit mode, respectively
DE102005017313A1 (en) * 2005-04-14 2006-10-19 Volkswagen Ag Method for displaying information in a means of transport and instrument cluster for a motor vehicle

Patent Citations (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5270689A (en) 1988-10-27 1993-12-14 Baverische Motoren Werke Ag Multi-function operating device
EP0366132A2 (en) 1988-10-27 1990-05-02 Bayerische Motoren Werke Aktiengesellschaft Multifunction operating apparatus
DE19944324A1 (en) 1999-09-15 2001-03-22 Audi Ag Multi-function control device
US6769320B1 (en) 1999-09-15 2004-08-03 Audi Ag Multifunctional operating device
DE10022321A1 (en) 2000-05-09 2001-11-15 Bayerische Motoren Werke Ag Apparatus in a vehicle for identifying or recognizing a hand position and translating this to a control command
US20030007227A1 (en) * 2001-07-03 2003-01-09 Takayuki Ogino Display device
DE10147940A1 (en) 2001-09-28 2003-05-22 Siemens Ag Operator panel for controlling motor vehicle systems, such as radio, navigation, etc., comprises a virtual display panel within the field of view of a camera, with detected finger positions used to activate a function
KR20040063156A (en) 2001-11-30 2004-07-12 와코 쥰야꾸 고교 가부시키가이샤 Bisimide compound, acid generator and resist composition each containing the same, and method of forming pattern from the composition
US20060279528A1 (en) * 2003-03-10 2006-12-14 Schobben Daniel W E Multi-view display
US20070182718A1 (en) 2003-05-30 2007-08-09 Hans-Peter Schoener Operator control device
DE10324579A1 (en) 2003-05-30 2004-12-16 Daimlerchrysler Ag operating device
US20040266460A1 (en) * 2003-06-25 2004-12-30 Nokia Corporation System and method for interacting with a shared electronic display
US20090040196A1 (en) * 2003-08-27 2009-02-12 Bernd Duckstein Method of controlling the display of various data in a vehicle and Opto-acoustic data unit
US20060066507A1 (en) * 2004-09-27 2006-03-30 Tetsuya Yanagisawa Display apparatus, and method for controlling the same
DE102005020155A1 (en) 2005-04-29 2006-11-02 Volkswagen Ag Vehicle`s information e.g. time, displaying method, for use with navigation system, involves displaying information and/or data with geo position in section of geographical map, where relationship exists to geo position covered by section
US20070013624A1 (en) * 2005-07-13 2007-01-18 Grant Bourhill Display
US20090080099A1 (en) * 2005-07-25 2009-03-26 Sharp Kabushiki Kaisha Parallax barrier, multiple display device and parallax barrier manufacturing method
WO2007107368A1 (en) * 2006-03-22 2007-09-27 Volkswagen Ag Interactive operating device and method for operating the interactive operating device
DE102006037156A1 (en) 2006-03-22 2007-09-27 Volkswagen Ag Interactive operating device and method for operating the interactive operating device
US20090327977A1 (en) * 2006-03-22 2009-12-31 Bachfischer Katharina Interactive control device and method for operating the interactive control device
US20070229472A1 (en) * 2006-03-30 2007-10-04 Bytheway Jared G Circular scrolling touchpad functionality determined by starting position of pointing object on touchpad surface
JP2007283968A (en) 2006-04-19 2007-11-01 Toyota Motor Corp Vehicle control device
US20070262953A1 (en) * 2006-05-15 2007-11-15 Zackschewski Shawn R Multiple-view display system having user manipulation control and method
US20080129684A1 (en) * 2006-11-30 2008-06-05 Adams Jay J Display system having viewer distraction disable and method
US20120028701A1 (en) * 2007-02-02 2012-02-02 Gomez Benjamin T Gaming systems having multi-output displays
EP2018992A1 (en) 2007-07-27 2009-01-28 Continental Automotive GmbH Motor vehicle cockpit
US20090027332A1 (en) 2007-07-27 2009-01-29 Continental Automotive Gmbh Motor vehicle cockpit
DE102007039442A1 (en) 2007-08-21 2009-02-26 Volkswagen Ag Method for displaying information in a vehicle and display device for a vehicle
US20110205162A1 (en) 2007-08-21 2011-08-25 Waeller Christoph Method for displaying information in a vehicle and display device for a vehicle
DE102007048599A1 (en) 2007-10-10 2009-04-16 Volkswagen Ag Method for operating display device of vehicle, involves displaying different information for different viewing angles by display, where viewing angles are assigned to different users
US20100005427A1 (en) * 2008-07-01 2010-01-07 Rui Zhang Systems and Methods of Touchless Interaction

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
International PCT Search Report and Written Opinion, PCT/EP2010/059744, 12 pages, Nov. 2, 2010.
Zumo 660 series Owners Manual (Published May 2009). *

Also Published As

Publication number Publication date
EP2451672B1 (en) 2013-09-11
EP2451672A1 (en) 2012-05-16
CN102470757B (en) 2015-08-26
US20120274549A1 (en) 2012-11-01
CN102470757A (en) 2012-05-23
KR20120049249A (en) 2012-05-16
KR101460866B1 (en) 2014-11-11
WO2011003947A1 (en) 2011-01-13
DE102009032069A1 (en) 2011-01-13

Similar Documents

Publication Publication Date Title
US9475390B2 (en) Method and device for providing a user interface in a vehicle
KR101367593B1 (en) Interactive operating device and method for operating the interactive operating device
US9649938B2 (en) Method for synchronizing display devices in a motor vehicle
US8907778B2 (en) Multi-function display and operating system and method for controlling such a system having optimized graphical operating display
EP2544078B1 (en) Display device with adaptive capacitive touch panel
JP2018150043A (en) System for information transmission in motor vehicle
US10144285B2 (en) Method for operating vehicle devices and operating device for such devices
US20150367859A1 (en) Input device for a motor vehicle
US20140365928A1 (en) Vehicle's interactive system
US10627913B2 (en) Method for the contactless shifting of visual information
CN109643219B (en) Method for interacting with image content presented on a display device in a vehicle
US20140210795A1 (en) Control Assembly for a Motor Vehicle and Method for Operating the Control Assembly for a Motor Vehicle
MX2011004124A (en) Method and device for displaying information sorted into lists.
US11188211B2 (en) Transportation vehicle with an image capturing unit and an operating system for operating devices of the transportation vehicle and method for operating the operating system
US20220244789A1 (en) Method for operating a mobile terminal using a gesture recognition and control device, gesture recognition and control device, motor vehicle, and an output apparatus that can be worn on the head
US20180157324A1 (en) Method and Device for Interacting with a Graphical User Interface
JP4858206B2 (en) In-vehicle device operation support device and operation support method
KR101946746B1 (en) Positioning of non-vehicle objects in the vehicle
US20130201126A1 (en) Input device
CN103958255A (en) Method for operating a mobile device in a vehicle
US10052955B2 (en) Method for providing an operating device in a vehicle and operating device
KR20170010066A (en) Method and device for providing a user interface in a vehicle
GB2539329A (en) Method for operating a vehicle, in particular a passenger vehicle
JP2017187919A (en) In-vehicle information processing system
JP2017199203A (en) Vehicle-mounted information processing system

Legal Events

Date Code Title Description
AS Assignment

Owner name: VOLKSWAGEN AG, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WEHLING, ULRIKE;FABIAN, THOMAS;REEL/FRAME:028860/0205

Effective date: 20120207

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4