Publication number: US20050282141 A1
Publication type: Application
Application number: US 10/870,879
Publication date: Dec. 22, 2005
Filing date: Jun. 17, 2004
Priority date: Jun. 17, 2004
Also published as: US7991729, US20080026348
Inventors: Mark Falash, Kenneth Donovan, Peter Bilazarian
Original assignee: Falash Mark D, Donovan Kenneth B, Peter Bilazarian
External links: USPTO, USPTO Assignment, Espacenet
Scenario workflow based assessment system and method
US 20050282141 A1
Abstract
A system and a method for automating performance assessments of an exercise or a training activity provide event assessment information in real-time to one or more evaluators in conjunction with unfolding events. The evaluators can wirelessly communicate assessment information to a database for after action review (AAR).
Images (5)
Claims (29)
1. A method comprising:
defining a scenario workflow relative to a selected situation;
storing the workflow;
establishing a set of assessment criteria relative to a plurality of workflow related events;
retrieving assessment criteria for at least one active event as the scenario proceeds; and
providing the retrieved assessment criteria to an evaluator.
2. A method as in claim 1 where the assessment criteria are provided to the evaluator at or about the time the active event occurs.
3. A method as in claim 2 where the criteria are communicated, at least in part, wirelessly.
4. A method as in claim 2 which includes storing real-time evaluation information relative to the active event.
5. A method as in claim 4 which includes providing supplemental information relative to evaluating the event, in response to previously stored observations.
6. A method as in claim 5 where the criteria are communicated, at least in part, wirelessly.
7. A method as in claim 5 where the supplemental information is provided in real-time as the event is taking place.
8. A method as in claim 7 which includes storing evaluation information during the event grouped at least by predetermined objectives.
9. A method as in claim 8 which includes replaying stored information relative to the event, in combination with temporally related, pre-stored, evaluation information.
10. A method as in claim 9 which includes preparing reports pertaining to scenario implementation.
11. A method as in claim 7 where the supplemental information is provided, in part, wirelessly to at least one evaluator.
12. A method as in claim 7 which includes carrying out a plurality of events related to the scenario workflow.
13. A method as in claim 12 which includes wirelessly receiving assessment criteria related to a plurality of workflow related events.
14. A method as in claim 13 which includes providing wireless receivers to the evaluators.
15. A system comprising:
a stored workflow event list;
a stored event assessment criteria list;
circuitry to provide information to an evaluator for assessing a plurality of events; and
circuitry for providing supplemental information to the evaluator.
16. A system as in claim 15 which includes circuitry to provide assessment criteria to a plurality of evaluators.
17. A system as in claim 16 which includes storage for a plurality of event assessments.
18. A system as in claim 17 which includes a plurality of evaluator units which provide context-sensitive assessment criteria.
19. A system as in claim 18 where at least some of the units communicate wirelessly.
20. A system as in claim 19 where the units include software enabling an evaluator to enter exercise related observations.
21. A system as in claim 20 where the units include software to receive and present feedback information to the evaluator relative to the exercise.
22. A system for evaluating performance during a scenario comprising:
first software for creating a plurality of scenario workflow events;
second software for creating assessment criteria for each of the workflow events;
software for determining at least one active event of an on-going scenario;
software for retrieving assessment criteria for the active event;
software for receiving event related assessment information from the evaluator; and
software providing for supplemental assessment information to an evaluator.
23. A system as in claim 22 where the first software creates and stores at least one workflow event list.
24. A system as in claim 23 where the second software creates and stores a workflow event assessment criteria list.
25. A system as in claim 22 which includes software for displaying assessment criteria for the active event.
26. A system as in claim 22 which includes software for displaying supplemental information concerning an on-going event to the evaluator.
27. A system as in claim 22 which includes wireless I/O devices for entering assessment information.
28. A system as in claim 19 which includes software for storing assessment information.
29. A system as in claim 28 for retrieving stored assessment information.
Description
    FIELD OF THE INVENTION
  • [0001]
    The invention pertains to systems and methods for assessing performance of participants during training exercises or mission rehearsals. More particularly, the invention pertains to automated systems and methods to facilitate performance evaluation by providing real-time feedback to evaluators as an activity proceeds.
  • BACKGROUND OF THE INVENTION
  • [0002]
    The importance of training personnel to respond to events such as fires, violent domestic events, accidents or natural disasters (earthquakes, tornadoes, floods or the like) is well recognized. Similar comments apply to military training/mission rehearsal.
  • [0003]
    Training/rehearsal activities can last hours or days and can involve a large number of geographically dispersed participants. The value of collecting information as to how the exercise was carried out to facilitate an accurate and meaningful after-action review is also well known. One such system and method are disclosed in U.S. Pat. No. 6,106,297 issued Aug. 22, 2000, assigned to the assignee hereof and entitled “Distributed Interactive Simulation Exercise Manager System and Method”. The '297 patent is hereby incorporated by reference.
  • [0004]
    While the primary value of conducting a performance session, such as a training or exercise session, is an effective and accurate assessment (the basis of measurable and verifiable feedback to the session audience or participants in the after-action review (AAR)), obtaining such assessments during such sessions can be difficult. A problem in efficiently assessing, or evaluating, performance during complex tasks is defining what is important to assess at any given time, and what assessment criteria should be used.
  • [0005]
    It has been known in the prior art to define assessment criteria and guidance prior to the assessment session. The assessor is then required to monitor performance activities to determine what types of events are taking place, recall and apply the applicable assessment criteria and assessment guidance, and record the applicable assessment. This approach is labor intensive, particularly for complex tasks involving teams of several individuals, and teams in different locations.
  • [0006]
    There continues to be a need for improved, preferably real-time evaluation systems and methods. Preferably such systems and methods will be flexible and cost effective to implement so as to be usable to provide assessment information for a wide range of civilian and military exercises.
  • SUMMARY OF THE INVENTION
  • [0007]
    A method which embodies the invention includes defining a scenario workflow relative to a selected situation; establishing a set of assessment criteria relative to a plurality of workflow related events; carrying out the scenario; retrieving, as the scenario proceeds, assessment criteria for at least one active event; and providing the retrieved assessment criteria to an evaluator.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0008]
    FIG. 1 is an overview of a method in accordance with the invention;
  • [0009]
    FIG. 2 illustrates a system in accordance with the invention;
  • [0010]
    FIG. 3 illustrates an exemplary scenario event;
  • [0011]
    FIG. 4 illustrates an exemplary workflow event list for the event of FIG. 3;
  • [0012]
    FIG. 5 illustrates an exemplary event assessment criteria list;
  • [0013]
    FIG. 6 illustrates a displayed event list for activating an event;
  • [0014]
    FIG. 7A illustrates an exemplary assessment entering screen; and
  • [0015]
    FIG. 7B illustrates an exemplary assessment prompting display for an active event.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • [0016]
    While embodiments of this invention can take many different forms, specific embodiments thereof are shown in the drawings and will be described herein in detail with the understanding that the present disclosure is to be considered as an exemplification of the principles of the invention, and as a disclosure of the best mode of practicing the invention. It is not intended to limit the invention to the specific embodiment illustrated.
  • [0017]
    Systems and methods that embody the invention improve the efficiency of assessing performance during complex tasks, such as for distributed teams cooperating to achieve a common goal. Assessment accuracy is increased while reducing the work associated with recording behavior observations, preparing material for briefings and debriefings, and presenting feedback messages or comments during and after the assessment session.
  • [0018]
    Efficiency is increased because:
    1) the assessor is prompted during the assessment session with the applicable assessment criteria and guidance, so an assessor can more quickly determine what assessment is needed;
    2) the assessor is prompted with a tailored form for recording the assessment, so the assessor can record the assessment more quickly and accurately;
    3) assessment responsibilities can be efficiently distributed and allocated across several assessors during the assessment session, so fewer assessors are needed; and
    4) expert subjective judgment guidance that is applicable to the assessment event can be captured from experts prior to the assessment session and communicated to all assessors as required during the assessment session, so the assessors are not required to be experts.
  • [0019]
    Further, a method that embodies the invention automates performance assessment of an activity such as a training or a rehearsal exercise (the assessment session). The automation results in measurable or observable assessments of session events. In one embodiment, the method includes:
  • [0020]
    1) generating a scenario workflow which defines an event list or events that are expected to occur during an assessment session;
  • [0021]
    2) generating a workflow event assessment list which defines assessment criteria or guidance to be used during a respective event;
  • [0022]
    3) defining active events from the event list;
  • [0023]
    4) prompting the assessor regarding the applicable assessment criteria relative to a currently active event; and
  • [0024]
    5) recording the assessors' observations.
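The five numbered steps above can be pictured as a small data structure plus a prompt-and-record loop. The following is a minimal sketch, not the patent's implementation; all class, field, and sample-event names (Event, ScenarioWorkflow, "911 call received") are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    name: str
    criteria: list        # step 2: assessment criteria / guidance for the event
    active: bool = False

@dataclass
class ScenarioWorkflow:
    events: list = field(default_factory=list)        # step 1: expected events
    observations: dict = field(default_factory=dict)

    def activate(self, name):
        # Step 3: mark an event from the event list as currently active.
        for e in self.events:
            if e.name == name:
                e.active = True

    def prompt(self):
        # Step 4: criteria to show the assessor for the currently active events.
        return {e.name: e.criteria for e in self.events if e.active}

    def record(self, event_name, note):
        # Step 5: record the assessor's observation against the event.
        self.observations.setdefault(event_name, []).append(note)

wf = ScenarioWorkflow(events=[
    Event("911 call received", ["Dispatch units within 2 minutes"]),
    Event("Units on scene", ["Establish incident command"]),
])
wf.activate("911 call received")
wf.record("911 call received", "Dispatched at T+1 min")
```

Note that `prompt()` returns criteria only for active events, mirroring step 4's idea of prompting the assessor with only what is currently applicable.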
  • [0025]
    Alternative embodiments include:
  • [0026]
    a) Predefining the scenario workflow (e.g., a predefined script that always occurs in the same sequence).
  • [0027]
    b) The scenario workflow which defines an event list can be dynamically adjusted in response to the actions taken by the participants during the assessment session (e.g., a portion of the scenario workflow only occurs if the participants behave in a certain way referred to as a triggering behavior).
  • [0028]
    c) The scenario workflow which defines an event list can be dynamically adjusted by the evaluator based on accomplishment of assessment objectives during the assessment session (e.g., the evaluator may add scenario workflow to increase the workload during an assessment session—referred to as an “inject”).
  • [0029]
    d) The workflow event list criteria can be predefined (e.g., the criteria to be used for an event is always the same whenever that event occurs in the scenario workflow).
  • [0030]
    e) The workflow event list assessment criteria can be dynamically adjusted in response to the actions taken by the participants during the assessment session (e.g., if the participants or audience respond to a scenario situation).
  • [0031]
    f) The workflow event list assessment criteria can be dynamically adjusted by the evaluator based on accomplishment of assessment objectives during the assessment session (e.g., the evaluator may determine that a report preparer has demonstrated mastery of the primary criteria of submitting a properly formatted report in a timely fashion; the assessment criteria for future reporting could then be adjusted to record whether the report preparer consults all relevant information sources needed for a quality report).
  • [0032]
    g) Defining what events from the event list are currently applicable can be based on human observation.
  • [0033]
    h) Defining what events from the event list are currently applicable can be based on observation by a computer-based agent.
  • [0034]
    i) Prompting the assessor regarding the applicable assessment criteria can be predefined (e.g., the assessor is always provided the same cue or input screen each time the applicable event occurs).
  • [0035]
    j) Prompting the assessor regarding the applicable assessment criteria can be dynamically adjusted in response to the actions taken by the audience during the assessment session (e.g., if the participants are frequently performing one type of task (Task A) and infrequently performing another type of task (Task B), then the prompting can be adjusted accordingly).
  • [0036]
    k) Prompting the assessor regarding the applicable assessment criteria can be dynamically adjusted by the evaluator based on accomplishment of assessment objectives during the assessment session (e.g., the evaluator may determine that a report preparer has demonstrated mastery of the primary criteria of submitting a properly formatted report in a timely fashion; the assessment criteria for future reporting could then be adjusted to record whether the report preparer consults all relevant information sources needed for a quality report).
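Embodiments (f) and (k) both describe swapping in secondary criteria once a primary criterion is mastered. A hedged sketch of one such adjustment rule follows; the mastery threshold, the field names, and the sample event are all assumptions, not details from the patent.

```python
def adjust_criteria(event, history, mastery_threshold=3):
    """Replace primary with secondary criteria after repeated success.

    `history` is a list of past assessments, each a dict with an "event"
    name and a boolean "passed" flag (an invented record format).
    """
    passes = sum(1 for h in history
                 if h["event"] == event["name"] and h["passed"])
    if passes >= mastery_threshold and event.get("secondary_criteria"):
        # Mastery demonstrated: assess the finer-grained criteria instead.
        event["criteria"] = event["secondary_criteria"]
    return event

event = {
    "name": "Submit situation report",
    "criteria": ["Properly formatted", "Submitted on time"],
    "secondary_criteria": ["Consults all relevant information sources"],
}
history = [{"event": "Submit situation report", "passed": True}] * 3
adjust_criteria(event, history)
```

After three recorded passes the event's criteria list is replaced by the secondary list, matching the report-preparer example in (f) and (k).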
  • [0037]
    In one system, that embodies this invention, the scenario workflow is defined to detail sequences or series of events that would result for a particular situation or course of action. The scenario workflow is then stored in a scenario workflow event database. The scenario workflow is analyzed to define what assessment criteria, if any, are applicable for each of the workflow events. These assessment criteria are added to the scenario workflow event database.
  • [0038]
    During the assessment session, the assessor or evaluator is provided an automated assessment device, or AAD. The AAD enables the evaluation team, using hand-carried Tablet PCs, to coordinate evaluation responsibilities and to display pre-brief material and exercise status at any time during the exercise. As the scenario session proceeds, the scenario is monitored to determine what events are currently active.
  • [0039]
    By accessing the workflow event database, the system retrieves the details for the scenario event, including the assessment criteria list for that event. The evaluator is prompted with context-sensitive assessment criteria as key and critical events occur. As the evaluator records observations against each exercise goal, AAD provides immediate feedback of exercise goals that require additional attention. This significantly reduces the need for the evaluator to closely monitor actual scenario event execution while recording key observations. The AAD provides assistance to the evaluator to prepare the After Action Review (AAR).
  • [0040]
    Following the exercise, the evaluation team can quickly review evaluator comments grouped by exercise objectives and select candidates for discussion. During the review, the evaluator can use the AAD to identify and replay critical periods of audience actions that temporally relate to the evaluator's notes, while referencing the guidance provided during the exercise planning process. This automation and recall capability are not available using current manual data collection and synthesis techniques. The resultant After Action Reports provide invaluable lessons for the exercise participants, exercise coordinators and stakeholder agencies, including best practices that can be shared, thus, enhancing emergency preparedness.
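The post-exercise review described above groups evaluator comments by exercise objective before selecting discussion candidates. A minimal sketch of that grouping step, with invented sample data:

```python
from collections import defaultdict

def group_by_objective(comments):
    """Group recorded evaluator notes by their exercise objective."""
    grouped = defaultdict(list)
    for c in comments:
        grouped[c["objective"]].append(c["note"])
    return dict(grouped)

comments = [
    {"objective": "Communication", "note": "Radio check delayed"},
    {"objective": "Triage",        "note": "Tags applied correctly"},
    {"objective": "Communication", "note": "Clear handoff to EOC"},
]
grouped = group_by_objective(comments)
# grouped["Communication"] -> ['Radio check delayed', 'Clear handoff to EOC']
```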
  • [0041]
    FIG. 1 illustrates an overall view of a method 100 in accordance with the invention. In a step 102 a, a scenario workflow is generated. In a step 102 b, a workflow event list is produced from the scenario workflow. The event list produced in step 102 b defines those events which are expected to occur during a scenario assessment session.
  • [0042]
    In step 104 a, event assessment criteria are assigned. In step 104 b, an event assessment criteria list is generated. Those of skill in the art will understand that the workflow event list, step 102 b, and the event assessment criteria list, step 104 b, could be stored for subsequent use.
  • [0043]
    In step 106 a the scenario is initiated and the participants or audience participate in the ongoing scenario. Scenario workflow events unfold, step 106 b, and the assessor 110 is available to observe the participants' or audience's performance in response to the events 106 b.
  • [0044]
    As the events unfold, step 106 b, active events are recognized, step 108, either automatically or by personnel associated with implementing the session.
  • [0045]
    Active events trigger a retrieval of evaluation or assessment information, step 116. This information is forwarded to the assessor 110, step 118.
  • [0046]
    The assessors' remarks, comments and evaluations are received, step 120, and stored for subsequent use, step 122, during the after action review. Step 118 can be repeated as appropriate to provide supplemental assessment information to assessor 110 in view of previously recorded assessments, step 122.
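Steps 116 through 122, retrieving criteria for an active event, storing the assessor's remarks, and repeating step 118 with supplemental information, could be sketched as follows. The rule for deciding when supplemental information is needed, and all names and sample data, are assumptions for illustration only.

```python
criteria_db = {  # step 104 b output, keyed by event name (invented sample)
    "Hazmat spill reported": ["Notify EOC", "Establish perimeter"],
}
recorded = {}    # step 122: stored assessments, keyed by event name

def on_active_event(name):
    # Step 116: retrieve the assessment criteria for an active event.
    return criteria_db.get(name, [])

def record_assessment(name, note):
    # Steps 120-122: receive the assessor's remark and store it.
    recorded.setdefault(name, []).append(note)

def supplemental_needed(name):
    # Step 118 repeated: criteria not yet addressed by any stored note
    # (a naive substring match, purely for illustration).
    return [c for c in on_active_event(name)
            if not any(c in n for n in recorded.get(name, []))]

record_assessment("Hazmat spill reported", "Notify EOC: done at T+3 min")
# supplemental_needed("Hazmat spill reported") -> ['Establish perimeter']
```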
  • [0047]
    FIG. 2 is a block diagram of a system 10 in accordance with the invention. System 10 represents one embodiment for implementing the methodology 100 of FIG. 1. As illustrated in FIG. 2, the training audience or participants 12 can communicate in a system 10 using, for example, personal computers which communicate via an intranet 12 a and server 12 b with the Internet 14. It will be understood that the participants 12 might participate in the subject scenario as multiple separate teams or as one organization working together to respond to the scenario. The participants or audience 12 can receive information via their personal computers and intranet 12 a, such as by e-mail, telephone or any other form of information providing system, such as video, a crisis information management system (CIMS) or the like.
  • [0048]
    The evaluators 110 can wirelessly communicate with a server associated with web 14 using wireless personal communication devices indicated generally at 20. This wireless communication capability makes it possible for the evaluators to readily move among the audience or participants during the exercise.
  • [0049]
    A plurality of web based servers 24 provides useful information for the exercise. An emergency scene simulator system 26 provides a realistic representation of the subject scenario which requires the attention and action of the training audience or participants 12. The e-mail, phone, crisis information management system 28 provides realistic communication and coordination capabilities for the audience/participants 12. The scenario log/replay system 30 records audience communications, scenario events, and emergency scene related simulation activity. System 32 stores and makes available scenario events, procedures, standards, including the workflow event list, the event assessment criteria lists and related assessment information.
  • [0050]
    Role players 34 interact with the ongoing scenario and participants in the session via one or more web based servers. They can communicate with the audience or participants and/or control the sequence of events, the emergency scene simulator 26 and any other event related information.
  • [0051]
    As noted above, the scenario workflow is defined to detail sequences or series of events that would result for a particular situation or course of action, step 102 a. A sample scenario is illustrated in FIG. 3 (a single event is shown for simplicity). The scenario events can then be stored in a database, such as database 32. It will be understood that the scenario events might be stored at a plurality of locations without departing from the spirit and scope of the invention.
  • [0052]
    For any given scenario event, there is a workflow procedure or event sequence, the workflow event list step 102 b, that the audience is expected to follow in response to the occurrence of the scenario events. The expected workflow event list 102 b for the sample scenario event in FIG. 3 is illustrated in FIG. 4. The scenario workflow event list 102 b can then be stored in a database such as database 32, FIG. 2.
  • [0053]
    The scenario workflow is analyzed to define what assessment criteria, if any, are applicable for each of the workflow events, step 104 a. The assessment criteria list, step 104 b, is created and stored in a database such as database 32. These criteria include a description of the objective behavior that is expected of the audience, an event priority used to designate relative priority of events, an Evaluator Observation column to cue the assessor as to what should be observed, and a Performance Outcome note to define the criteria for successful performance by the participants or audience. These criteria, which can be added to the event list 102 b from FIG. 4, are illustrated in FIG. 5 as the assessment criteria list, step 104 b. The assessment criteria list can be stored in a database such as 32.
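One row of the assessment criteria list of FIG. 5 might be modeled as a record with the four columns named above. The field names paraphrase those columns and the sample values are invented; this is a sketch, not the patent's schema.

```python
from dataclasses import dataclass

@dataclass
class AssessmentCriterion:
    objective_behavior: str      # behavior expected of the audience
    priority: int                # relative priority of the event
    evaluator_observation: str   # cue: what the assessor should watch for
    performance_outcome: str     # definition of successful performance

row = AssessmentCriterion(
    objective_behavior="Incident commander establishes unified command",
    priority=1,
    evaluator_observation="Note time from arrival to command post setup",
    performance_outcome="Command established within 10 minutes",
)
```

A list of such records, keyed by workflow event, would correspond to the event assessment criteria list stored in database 32.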
  • [0054]
    Event determining software can be used to define what events from the event list are currently active or applicable, step 108. As the scenario session proceeds, the scenario is monitored to determine what events are currently active. This can be accomplished by monitoring the scenario activity and looking for a match of the expected “Trigger” condition for those scenario events that are expected but have not yet occurred (status=“Pending”).
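The trigger-matching monitor described here, which scans scenario activity for the expected "Trigger" condition of events still marked Pending, might be sketched as follows. The "Pending"/"Active" status values follow the text above, but the event records, the activity string, and the substring match are invented.

```python
def find_newly_active(events, observed_activity):
    """Mark pending events whose trigger condition appears in the activity."""
    activated = []
    for ev in events:
        if ev["status"] == "Pending" and ev["trigger"] in observed_activity:
            ev["status"] = "Active"     # step 108: event recognized as active
            activated.append(ev["name"])
    return activated

events = [
    {"name": "Hazmat spill reported", "trigger": "spill",    "status": "Pending"},
    {"name": "Evacuation ordered",    "trigger": "evacuate", "status": "Pending"},
]
newly_active = find_newly_active(events, "email: chemical spill at dock 4")
# newly_active -> ['Hazmat spill reported']
```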
  • [0055]
    In one embodiment, active event identification can be accomplished by a human monitoring the event status screen, see FIG. 6, including all e-mail communication as well as all voice traffic between the role players and the training audience. When the monitor determines that an event has started, the monitor can designate that event as active (status=“Active”), using a PC-based workstation. The status of the events that are currently applicable would be recorded, for example in database 32, and transmitted over the network to all role players 34 and evaluators 110.
  • [0056]
    By accessing the workflow event database, the details for the active scenario event, including the assessment criteria list, step 104 b, for that event, can be retrieved, for example from database 32. The evaluator 110 is prompted with context-sensitive assessment criteria as key and critical events occur via an automated assessment device, or AAD 20.
  • [0057]
    The AAD 20, for example a hand-carried Tablet PC, enables the evaluation team 110 to coordinate evaluation responsibilities and to display lists and other exercise material and exercise status at any time during the exercise. FIGS. 7A and 7B are sample displays as might be presented to the evaluation team 110.
  • [0058]
    FIG. 7A illustrates an exemplary normal AAD display where the scenario event monitor is shown in the bottom half of the display. The evaluator records assessments in the top half of the display. As the evaluator records observations against each exercise goal, the AAD provides immediate feedback of exercise goals that require additional attention step 118. This significantly reduces the need for the evaluator to closely monitor actual scenario event execution while recording key observations.
  • [0059]
    FIG. 7B illustrates AAD operation when a critical event has become active. The evaluator is cued to this activity by the “alert” button in the lower left corner. Upon selecting the alert button, an Alert Display appropriate for assessing the current active event is presented. This event-appropriate cue enables extremely efficient assessment.
  • [0060]
    It will be understood that none of the details pertaining to communication via intranet 12 a or Internet 14 are limitations of the invention. Similarly, those of skill in the art will recognize that neither the size nor the location of the participants or audience 12, the evaluator or assessment team(s) 110, or the role players 34 is a limitation of the invention.
  • [0061]
    From the foregoing, it will be observed that numerous variations and modifications may be effected without departing from the spirit and scope of the invention. It is to be understood that no limitation with respect to the specific apparatus illustrated herein is intended or should be inferred. It is, of course, intended to cover by the appended claims all such modifications as fall within the scope of the claims.
Patent Citations
Cited patent | Filing date | Publication date | Applicant | Title
US5597312 * | May 4, 1994 | Jan. 28, 1997 | U S West Technologies, Inc. | Intelligent tutoring method and system
US5815152 * | Jun. 19, 1997 | Sep. 29, 1998 | Logical Software Solutions Corporation | Method and apparatus for defining and evaluating a graphic rule
US5863208 * | Jul. 2, 1996 | Jan. 26, 1999 | Ho; Chi Fai | Learning system and method based on review
US6053737 * | Nov. 4, 1997 | Apr. 25, 2000 | Northrop Grumman Corporation | Intelligent flight tutoring system
US6101101 * | May 28, 1998 | Aug. 8, 2000 | Sampo Semiconductor Corporation | Universal leadframe for semiconductor devices
US6106297 * | Nov. 12, 1996 | Aug. 22, 2000 | Lockheed Martin Corporation | Distributed interactive simulation exercise manager system and method
US6161101 * | Apr. 7, 1998 | Dec. 12, 2000 | Tech-Metrics International, Inc. | Computer-aided methods and apparatus for assessing an organization process or system
US6428323 * | Aug. 29, 2000 | Aug. 6, 2002 | Carla M. Pugh | Medical examination teaching system
US6652283 * | Dec. 30, 1999 | Nov. 25, 2003 | Cerego, Llc | System, apparatus and method for maximizing effectiveness and efficiency of learning, retaining and retrieving knowledge and skills
US20030091968 * | Nov. 11, 2002 | May 15, 2003 | Gaumard Scientific, Inc. | Interactive education system for teaching patient care
US20040249678 * | Oct. 28, 2003 | Dec. 9, 2004 | Henderson E. Devere | Systems and methods for qualifying expected risk due to contingent destructive human activities
Classifications
U.S. Classification: 434/365, 434/219
International Classification: G09B19/00, G09B7/00
Cooperative Classification: G09B19/00, G09B7/00
European Classification: G09B7/00, G09B19/00
Legal Events
Date: Oct. 1, 2004; Code: AS; Event: Assignment
Owner name: LOCKHEED MARTIN CORPORATION, FLORIDA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FALASH, MARK D.;DONOVAN, KENNETH B.;BILAZARIAN, PETER;REEL/FRAME:015853/0575;SIGNING DATES FROM 20040813 TO 20040831