US20160077671A1 - Rapid prototyping and machine vision for reconfigurable interfaces

Info

Publication number
US20160077671A1
US20160077671A1 (application US14/949,271)
Authority
US
United States
Prior art keywords
user
reconfigurable
human
input devices
machine interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/949,271
Inventor
Christopher R. Wagner
Amanda Christiana
Douglas Haanpaa
Charles J. Jacobus
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
JOLLY SEVEN, SERIES 70 OF ALLIED SECURITY TRUST I
Original Assignee
Cybernet Systems Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cybernet Systems Corp
Priority to US14/949,271
Publication of US20160077671A1
Assigned to NORTHERN LIGHTS, SERIES 74 OF ALLIED SECURITY TRUST I. Assignors: CYBERNET SYSTEMS CORPORATION (assignment of assignors interest; see document for details)
Assigned to JOLLY SEVEN, SERIES 70 OF ALLIED SECURITY TRUST I. Assignors: NORTHERN LIGHTS, SERIES 74 OF ALLIED SECURITY TRUST I (assignment of assignors interest; see document for details)
Current legal status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B6/00 Tactile signalling systems, e.g. personal calling systems
    • H ELECTRICITY
    • H03 ELECTRONIC CIRCUITRY
    • H03K PULSE TECHNIQUE
    • H03K17/00 Electronic switching or gating, i.e. not by contact-making and -breaking
    • H03K17/94 Electronic switching or gating, i.e. not by contact-making and -breaking characterised by the way in which the control signals are generated
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B33 ADDITIVE MANUFACTURING TECHNOLOGY
    • B33Y ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y80/00 Products made by additive manufacturing

Abstract

A system and method, including software to aid in the generation of panels and control instruments, rapidly generates a station that can support a variety of control interfaces. Rapid-Prototyped Panels, or RP-Panels, replicate existing systems (for simulation, training, gaming, etc.) or realize new designs (for human factors testing, as a functional product, etc.). The controls have tactile and visual characteristics similar or identical to their functional component counterparts, such as buttons, knobs, switches, pedals, joysticks, steering wheels, and touch panels, but are modular and use alternative data transfer modes (potentiometers, fiber optics, RFID, machine vision, etc.) to track and analyze the response of the controls. The response is then transmitted to the host programs. With this method a user can design and fabricate a reconfigurable interface to interact with virtual environments for various applications such as simulation, training, virtual instrumentation, gaming, and human factors testing.

Description

    REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 12/351,332, filed Jan. 9, 2009, which claims priority from U.S. Provisional Patent Application Ser. No. 61/020,013, filed Jan. 9, 2008, the entire content of which is incorporated herein by reference.
  • GOVERNMENT SUPPORT
  • This invention was made with Government support under Contract No. N61339-07-C-0085 awarded by the United States Navy. The Government has certain rights in the invention.
  • FIELD OF THE INVENTION
  • This invention relates generally to human-machine interfaces and, in particular, to interfaces that use rapid prototyping technology to fabricate interfaces with desired functional and haptic responses.
  • BACKGROUND OF THE INVENTION
  • In many situations, a complex simulation interface is useful for training. For example, pilots would like to train in a flight simulator using mission specific data. These simulators often take up a large amount of space, particularly if one simulator exists for each type of interface (e.g. each aircraft cockpit).
  • What is needed is a reconfigurable simulator that can faithfully replicate the look and feel (sense of touch) of a number of different interfaces. Simulating the sense of touch, or haptic, aspect of a range of interfaces is a particularly difficult task due to the sensitivity of the human tactile system. Current haptic interface technologies are unable to faithfully recreate the large range of tactile stimuli encountered when interacting with interface components such as buttons, knobs, etc. common to interfaces.
  • SUMMARY OF THE INVENTION
  • This invention relates generally to human-machine interfaces and, in particular, to interfaces that use rapid prototyping technology to fabricate interfaces with desired functional and haptic responses. The embodiments include software that aids in the generation of panels and control instruments to rapidly construct panels that can support a variety of control interfaces. These panels can replicate existing systems (for simulation, training, gaming, etc.) or create new designs (for human factors testing, as a functional product, etc.).
  • The controls may have tactile and/or visual characteristics similar or identical to their functional component counterparts such as buttons, knobs, switches, pedals, joysticks, steering wheels, and touch panels but are modular and use alternative data transfer modes (potentiometers, fiber optics, RFID, machine vision, etc.) to track and analyze the response of the controls. The response is then transmitted to the host programs. By virtue of the invention, a user can design and fabricate a reconfigurable interface to interact with virtual environments for various applications such as simulation, training, virtual instrumentation, gaming, human factors testing, etc.
  • A method of forming a human-machine interface according to the invention includes the step of providing a scaffold enabling different interface components to be positioned at different locations on the scaffold. One or more of the interface components are fabricated with a rapid prototyping technology. Such components feature a physical structure that provides a specified functional response. The components are mounted on the scaffold in accordance with a given application.
  • The functional response may include a haptic response. For example, the interface component may be a pushbutton, in which case the haptic response may include the depth of travel or the stiffness of the pushbutton. In a specific embodiment, the interface component may be a pushbutton with a frame, a button portion, and compliant tabs coupling the button portion to the frame, and the haptic response may include the depth of travel or the stiffness of the pushbutton determined by the size, shape or composition of the compliant tabs.
  • Alternatively, the interface component may include a location to be touched by a user, in which case the specified functional response is determined through machine vision extraction of the touch location. A human-machine interface constructed in accordance with the invention broadly includes a plurality of user input devices, at least certain of which are fabricated with a rapid prototyping technology, wherein at least some of the user input devices fabricated with a rapid prototyping technology include physical features that are sized, shaped or composed to yield a desired functional response.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1A shows scaffolding for a multifunction display;
  • FIG. 1B shows a rapidly prototyped assembled button mechanism, where compliant tabs provide a specified haptic response (depth of travel, stiffness) as well as an optical change enabling machine vision extraction;
  • FIG. 1C shows the button mechanism of FIG. 1B in an exploded form;
  • FIG. 1D shows a rapidly prototyped touch panel; and
  • FIG. 2 depicts a machine vision interface state extraction embodiment.
  • DETAILED DESCRIPTION OF THE INVENTION
  • This invention takes advantage of progress made in rapid prototyping technology. Rapid prototyping machines are increasingly prevalent due to gains in resolution, speed, and value. These machines are essentially 3D printers: given a software description of a solid component, a rapid prototyping machine can build up a physical solid to those specifications in a matter of minutes to hours. Applicable technologies include Selective Laser Sintering (SLS); Fused Deposition Modeling (FDM); Stereolithography (SLA); Photopolymer Laminated Object Manufacturing (LOM); Electron Beam Melting (EBM); and "3D Printing" (3DP). The invention is not limited in this regard, however, as it may take advantage of these as well as any yet-to-be-developed alternatives.
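  • For illustration only (not part of the original disclosure), the "software description of a solid component" mentioned above is typically a triangulated surface such as an STL file. The sketch below writes an ASCII STL for a hypothetical 20 x 20 x 8 mm button-cap blank; the part, its dimensions, and the file name are assumptions.

```python
# Minimal sketch of a "software description of a solid component": an ASCII
# STL file for a hypothetical 20 x 20 x 8 mm button-cap blank.

def box_facets(w, d, h):
    """Return the 12 triangular facets (normal, 3 vertices) of a w x d x h box."""
    v = [(x, y, z) for z in (0, h) for y in (0, d) for x in (0, w)]
    quads = [  # (outward normal, corner indices, counter-clockwise from outside)
        ((0, 0, -1), (0, 2, 3, 1)),  # bottom
        ((0, 0, 1),  (4, 5, 7, 6)),  # top
        ((0, -1, 0), (0, 1, 5, 4)),  # front
        ((0, 1, 0),  (2, 6, 7, 3)),  # back
        ((-1, 0, 0), (0, 4, 6, 2)),  # left
        ((1, 0, 0),  (1, 3, 7, 5)),  # right
    ]
    tris = []
    for n, (a, b, c, e) in quads:    # split each quad into two triangles
        tris += [(n, (v[a], v[b], v[c])), (n, (v[a], v[c], v[e]))]
    return tris

def write_stl(path, name, facets):
    with open(path, "w") as f:
        f.write(f"solid {name}\n")
        for n, verts in facets:
            f.write(f"  facet normal {n[0]} {n[1]} {n[2]}\n    outer loop\n")
            for x, y, z in verts:
                f.write(f"      vertex {x} {y} {z}\n")
            f.write("    endloop\n  endfacet\n")
        f.write(f"endsolid {name}\n")

write_stl("button_cap.stl", "button_cap", box_facets(20.0, 20.0, 8.0))
```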
  • The inventive approach to reconfigurable displays takes advantage of these rapid prototyping machines: new interface structures and components can be printed and then easily configured into a working simulation. Because haptic fidelity is a priority for those components that need to be accessed "by touch," this invention uses a combination of technologies enabling an actual button or knob component to give perfect haptic fidelity in conjunction with rapidly prototyped interface components. This allows for tradeoffs between haptic fidelity and storage space: actual components are used for high-priority buttons and knobs that must be found "by touch," while rapidly prototyped components serve lower-priority buttons that do not need to be physically stored until time of use.
  • According to the invention, certain human interface components are fabricated with rapid prototyping technology. FIG. 1A shows scaffolding for a multifunction display, where actual components can be positioned precisely. The scaffolding of FIG. 1A includes a frame 100 with various holes 102 and an aperture 104 enabling user controls and a central display to be 'snapped' into position. These controls may either be stored physical components or rapidly prototyped elements, depending upon operational criticality or other factors.
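  • As a hedged illustration of how such a scaffold might be represented in software (the slot names, coordinates, and camera regions below are hypothetical, not taken from the disclosure), each snap-in location can record the mounted component and the camera region of interest used later for machine vision tracking:

```python
# Hypothetical software model of the FIG. 1A scaffold: each snap-in location
# records what is mounted there and which camera region of interest (ROI)
# observes it.

from dataclasses import dataclass

@dataclass
class Slot:
    component: str        # e.g. "pushbutton", "knob", or "display"
    position_mm: tuple    # (x, y) position on the frame
    roi_px: tuple         # (x, y, w, h) in camera pixels

PANEL = {
    "hole_1":   Slot("pushbutton", (15.0, 20.0), (120, 80, 40, 40)),
    "hole_2":   Slot("knob",       (15.0, 60.0), (120, 240, 60, 60)),
    "aperture": Slot("display",    (70.0, 40.0), (300, 60, 320, 240)),
}
```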
  • FIG. 1B shows a rapidly prototyped button mechanism in an assembled form, where compliant tabs 110, 112 provide a specified haptic response (depth of travel, stiffness) as well as an optical change enabling machine vision extraction. FIG. 1C shows the button mechanism of FIG. 1B in an exploded form. The tabs 110, 112 interact with a frame 120. By providing inputs associated with the desired haptic response, the components may be ‘written’ with rapid prototyping technology to provide the dimensions, thicknesses and other physical parameters to achieve a target ‘look and feel’ and functional response.
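  • One first-order sizing sketch of that idea (an assumed model, not the patent's stated method): treating each compliant tab as an end-loaded cantilever gives a per-tab stiffness k = 3EI/L^3 with rectangular second moment I = wt^3/12, so a target button stiffness can be converted into a printable tab thickness. All material and geometry values below are illustrative.

```python
# Assumed first-order model: each compliant tab is an end-loaded cantilever.
#   k = 3*E*I/L^3,  I = w*t^3/12  =>  t = (4*k*L^3 / (E*w))^(1/3)

def tab_thickness(k_target, n_tabs, E, L, w):
    """Thickness t (m) so that n_tabs cantilever tabs give stiffness k_target (N/m)."""
    k_per_tab = k_target / n_tabs
    return (4.0 * k_per_tab * L**3 / (E * w)) ** (1.0 / 3.0)

E_POLYMER = 2.0e9   # Young's modulus of an ABS-like printable polymer, Pa (assumed)
t = tab_thickness(k_target=600.0,   # ~3 N actuation force over 5 mm of travel
                  n_tabs=2, E=E_POLYMER, L=8e-3, w=4e-3)
print(f"tab thickness to print: {t * 1e3:.2f} mm")   # ~0.43 mm with these values
```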
  • Another advantage of rapidly prototyping interface components is that monolithic, functional interfaces can be printed by rapid prototyping machines in a single pass, requiring no assembly before use. FIG. 1D shows a rapidly prototyped touch panel, where an interference grating is used to transform a small applied force into a large optical change, again enabling machine vision extraction of the touch location. The structure of FIG. 1D uses only rapidly prototyped components. This design demonstrates how rapid prototyping can combine structural and functional interface components.
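  • A minimal machine vision sketch of that touch extraction (the camera setup and change threshold are assumptions): because the grating converts a light press into a large local brightness change, the touch point can be recovered by differencing the live image against a reference frame of the untouched panel.

```python
# Sketch of touch-location extraction for a panel like FIG. 1D. The change
# threshold would be calibrated per panel; both frames are grayscale images.

import cv2

def touch_location(reference_gray, frame_gray, min_change=40):
    """Return the (x, y) pixel of the strongest optical change, or None."""
    diff = cv2.absdiff(reference_gray, frame_gray)  # where the panel changed
    diff = cv2.GaussianBlur(diff, (9, 9), 0)        # suppress pixel noise
    _, max_val, _, max_loc = cv2.minMaxLoc(diff)
    return max_loc if max_val >= min_change else None
```

  Mapping the winning pixel back to panel coordinates would then use the known scaffold geometry, e.g. a one-time homography calibration.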
  • A machine vision interface state extraction embodiment is shown in FIG. 2. Cameras are used to extract the positions of actual buttons and switches, maintaining haptic feel, without having to reconnect every button and switch upon reconfiguration. When used in conjunction with rapidly prototyped interface components, these components can be optimized to undergo a large optical change during a state change (e.g., a button press).
  • This invention also incorporates machine vision technologies that enable this optimally "mixed fidelity" approach to haptically simulating cockpit interfaces. The main difficulty with using either real components or rapidly prototyped components is that electrically wiring each interface component would be too time-consuming to be practical. Instead, we have developed machine vision technologies that can extract button positions (or knob orientations, etc.) from a camera view of the interface. This way, no wiring is needed, and different cockpit interface panels can be interchanged quickly with no loss in functionality.
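  • A sketch of that wiring-free extraction follows (the camera index, ROI coordinates, brightness threshold, and darkening-on-press behavior are all assumptions): each registered component is classified from the mean brightness of its camera region of interest, and only state transitions are reported to the host program.

```python
# Wiring-free button-state extraction sketch: classify each component from
# its camera ROI and report only state changes to the host simulation.

import cv2

ROIS = {"btn_power": (120, 80, 40, 40), "btn_mode": (180, 80, 40, 40)}  # hypothetical
THRESH = 90   # mean gray level separating pressed from released (calibrated)

def read_states(gray):
    states = {}
    for name, (x, y, w, h) in ROIS.items():
        patch = gray[y:y + h, x:x + w]
        states[name] = patch.mean() < THRESH   # a pressed button darkens its ROI
    return states

cap = cv2.VideoCapture(0)                      # camera viewing the panel
last = {}
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    states = read_states(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    for name, pressed in states.items():
        if last.get(name) != pressed:          # emit only transitions
            print(name, "pressed" if pressed else "released")
    last = states
```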
  • A primary advantage of using physical structures for interface components is that the haptic realism is near perfect, and cannot be matched by today's haptic simulation technology. These components can also be reused and quickly reconfigured simply by placing rapidly prototyped panels into specified locations.

Claims (9)

We claim:
1. A reconfigurable human-machine interface, comprising:
a plurality of user-input devices fabricated with a rapid prototyping technology;
a scaffold having a plurality of locations configured to receive the user-input devices;
at least one video camera having a field of view including the scaffold and the user-input devices;
wherein the camera optically determines the interaction between a user and the user-input devices, thereby controlling a machine without the need for wiring between the user-input devices and the machine.
2. The reconfigurable human-machine interface of claim 1, wherein:
the scaffold and user-input devices form a touch panel; and
one or more cameras are used to determine where a user touches the panel.
3. The reconfigurable human-machine interface of claim 1, wherein at least some of the user input devices include physical features that are sized, shaped or composed to simulate the controls of a particular machine.
4. The reconfigurable human-machine interface of claim 3, wherein:
the particular machine is an aircraft; and
the controls simulate a cockpit.
5. The reconfigurable human-machine interface of claim 1, wherein the user-input devices snap into the scaffold at different locations.
6. The reconfigurable human-machine interface of claim 1, wherein the user-input devices are buttons, knobs, switches, pedals, joysticks, or steering wheels.
7. The reconfigurable human-machine interface of claim 1, wherein at least some of the user input devices are fabricated to provide a desired haptic response.
8. The reconfigurable human-machine interface of claim 1, wherein:
the user-input device is a pushbutton; and
the desired haptic response includes the depth of travel or the stiffness of the pushbutton.
9. The reconfigurable human-machine interface of claim 1, wherein the rapid prototyping technology includes Stereolithography (SLA), Photopolymer Laminated Object Manufacturing (LOM), or 3D Printing (3DP).
US14/949,271 2008-01-09 2015-11-23 Rapid prototyping and machine vision for reconfigurable interfaces Abandoned US20160077671A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/949,271 US20160077671A1 (en) 2008-01-09 2015-11-23 Rapid prototyping and machine vision for reconfigurable interfaces

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US2001308P 2008-01-09 2008-01-09
US12/351,332 US9195886B2 (en) 2008-01-09 2009-01-09 Rapid prototyping and machine vision for reconfigurable interfaces
US14/949,271 US20160077671A1 (en) 2008-01-09 2015-11-23 Rapid prototyping and machine vision for reconfigurable interfaces

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/351,332 Continuation US9195886B2 (en) 2008-01-09 2009-01-09 Rapid prototyping and machine vision for reconfigurable interfaces

Publications (1)

Publication Number Publication Date
US20160077671A1 2016-03-17

Family

ID=40876023

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/351,332 Expired - Fee Related US9195886B2 (en) 2008-01-09 2009-01-09 Rapid prototyping and machine vision for reconfigurable interfaces
US14/949,271 Abandoned US20160077671A1 (en) 2008-01-09 2015-11-23 Rapid prototyping and machine vision for reconfigurable interfaces

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/351,332 Expired - Fee Related US9195886B2 (en) 2008-01-09 2009-01-09 Rapid prototyping and machine vision for reconfigurable interfaces

Country Status (1)

Country Link
US (2) US9195886B2 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9417754B2 (en) 2011-08-05 2016-08-16 P4tents1, LLC User interface system, method, and computer program product
US10579144B2 (en) * 2017-03-03 2020-03-03 Arizona Board Of Regents On Behalf Of Arizona State University Resonant vibration haptic display
CN110689400B (en) * 2019-08-29 2022-02-25 苏宁云计算有限公司 Man-machine similar track detection method and device based on screen segmentation

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040221258A1 (en) * 2003-05-01 2004-11-04 Lockheed Martin Corporation Method and apparatus for generating custom status display
US9024874B2 (en) * 2007-03-12 2015-05-05 University of Pittsburgh—of the Commonwealth System of Higher Education Fingertip visual haptic sensor controller

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5544842A (en) * 1993-09-21 1996-08-13 Smith; Edward Apparatus and method for the conversion of a three crew member aircraft cockpit to a two crew member aircraft cockpit
US20080211779A1 (en) * 1994-08-15 2008-09-04 Pryor Timothy R Control systems employing novel physical controls and touch screens
US6414672B2 (en) * 1997-07-07 2002-07-02 Sony Corporation Information input apparatus
US20030183497A1 (en) * 2002-03-27 2003-10-02 Johnston Raymond P. Apparatus exhibiting tactile feel
US20060092131A1 (en) * 2004-10-28 2006-05-04 Canon Kabushiki Kaisha Image processing method and apparatus
US20070152974A1 (en) * 2006-01-03 2007-07-05 Samsung Electronics Co., Ltd. Haptic button and haptic device using the same
US20090189749A1 (en) * 2006-11-17 2009-07-30 Salada Mark A Haptic Interface Device and Method for Using Such

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10124252B2 (en) * 2013-03-15 2018-11-13 Immersion Corporation Programmable haptic peripheral
US10279251B2 (en) 2013-03-15 2019-05-07 Immersion Corporation Programmable haptic peripheral
KR20200082831A (en) 2018-12-31 2020-07-08 효성티앤에스 주식회사 Quantification method of damage state of soiled banknotes

Also Published As

Publication number Publication date
US9195886B2 (en) 2015-11-24
US20090184809A1 (en) 2009-07-23

Legal Events

Date Code Title Description
AS Assignment

Owner name: NORTHERN LIGHTS, SERIES 74 OF ALLIED SECURITY TRUST I

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CYBERNET SYSTEMS CORPORATION;REEL/FRAME:042369/0414

Effective date: 20170505

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE

AS Assignment

Owner name: JOLLY SEVEN, SERIES 70 OF ALLIED SECURITY TRUST I

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NORTHERN LIGHTS, SERIES 74 OF ALLIED SECURITY TRUST I;REEL/FRAME:049416/0337

Effective date: 20190606