WO2002052528A1 - Method and system for assessment of user performance - Google Patents

Method and system for assessment of user performance

Info

Publication number
WO2002052528A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
devices
information
assessing
regarding
Prior art date
Application number
PCT/US2001/043056
Other languages
French (fr)
Inventor
Harry Kennedy Clark
James L. Boney
Kasempath Meesuk
Original Assignee
Element K Online Llc.
Priority date
Filing date
Publication date
Application filed by Element K Online Llc. filed Critical Element K Online Llc.
Priority to EP01988135A priority Critical patent/EP1344202A1/en
Priority to US10/415,465 priority patent/US20040110118A1/en
Publication of WO2002052528A1 publication Critical patent/WO2002052528A1/en


Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B7/00 - Electrically-operated teaching apparatus or devices working with questions and answers


Abstract

A method and system for assessing a user (115) regarding control of one or more devices that include comparing information regarding a configuration (105) of at least one of the devices against at least one evaluation criteria (103), comparing information regarding state information (106) for the device against at least one evaluation criteria, and assessing the user using the above comparisons.

Description

METHOD AND SYSTEM FOR ASSESSMENT OF USER PERFORMANCE
BACKGROUND OF THE INVENTION
The following describes systems and methods for assessing a user's proficiency regarding a device (or set of devices) by evaluating the user/student's control over the device(s). "Controlling the device(s)" includes their ability to correctly configure, troubleshoot, test, diagnose, initialize, set up, build, arrange, and analyze these devices.
Traditionally, students are assessed based on taking a test where they are asked multiple choice and/or true-false questions regarding the device or control of the device, thus testing their knowledge regarding control of the device. In an embodiment of the present invention, rather than simply asking a student questions regarding the device, the student is presented with a real world task regarding the control of the device (or set of devices). The student then exercises control over the one or more devices to perform the task. In an embodiment, the student may exercise control over one or more devices remotely over a network such as the Internet or a LAN. For example, the student may exercise control over the one or more devices using Mentor Technologies™ vLab™ system. For a more detailed description of a system for remote training on devices, see "Methods and Apparatus for Computer Based Training Relating to Devices," of T. C.
Slattery, et al., U.S. Patent Application No. 09/365,243, filed July 30, 1999, which is hereby incorporated by reference. After completing the task, the student is assessed on his/her performance or skills in controlling the device(s).
SUMMARY OF THE INVENTION
Methods and systems consistent with the present invention include systems and methods for assessing a user regarding control of one or more devices that include comparing information regarding a configuration of at least one of the devices against at least one evaluation criteria, comparing information regarding state information for the device against at least one evaluation criteria, and assessing the user using the above comparisons.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 illustrates an assessment system in accordance with methods and systems consistent with the invention;
Figure 2 illustrates a screen in accordance with methods and systems consistent with the invention;
Figure 3 illustrates an example output in accordance with methods and systems consistent with the invention;
Figure 4 illustrates an example results for report in accordance with methods and systems consistent with the invention;
Figure 5 illustrates an example analysis matrix report in accordance with methods and systems consistent with the invention;
Figure 6 illustrates an example configuration report in accordance with methods and systems consistent with the invention;
Figure 7 illustrates an example non-configuration report in accordance with methods and systems consistent with the invention;
Figure 8 illustrates an example screen in accordance with methods and systems consistent with the invention; and
Figure 9 illustrates an example user administration screen in accordance with methods and systems consistent with the invention.
DESCRIPTION OF THE EMBODIMENTS
Figure 1 provides an illustration of an assessment system, in accordance with methods and systems consistent with the invention. In an embodiment, either during an assignment or once an assignment regarding control of a pod including one or more device(s) 101 is completed, the student can request that they be assessed based on their performance of the training exercise. For example, Figure 2 illustrates a screen that may be presented to a user performing an assignment using the vLab™ system. As shown, the user may be permitted to select an "Assess Me" button 210 to provide feedback in the middle of an assignment or exercise. By selecting this button 210, assessment may be initiated. In other embodiments, assessment may be initiated automatically upon completion of an exercise. This may occur, for example, in a classroom-type setting where the assignment is performed as a test of the students' abilities. In addition, assessment may be performed in other types of systems where the student exercises actual control over devices on which they are being evaluated. Once assessment is initiated, various types of information may be gathered and transferred to a Grader Engine 107. This information may include device configurations 105, state information 106 regarding the device(s), SNMP results 104 from the devices in the pod 101 and other devices connected to the pod's devices, and/or other information. These various types of data will be collectively referred to as "Device Information". In addition to the Device Information, information is gathered regarding grading and/or Evaluation Criteria 103 ("Evaluation Criteria").
After collecting, analyzing and comparing the Device Information to the Evaluation Criteria 103, the Grader Engine 107 generates an output that may include a variety of reports with information regarding student performance. Figure 3 illustrates an example output 300 that may be presented to a vLab™ system user. As shown, this output may include a Results For report 310, an Analysis Matrix report 320, a Configurations report 330, and a Non-Configuration Information report 340. These various reports will be discussed in more detail below. As will be obvious to one of skill in the art, this is only one example of an output that may be presented to a user, and there are numerous other types of outputs containing various reports that may be presented to the user. This will depend in part on the type of system and types of devices using the present invention, in addition to other criteria.
In evaluating student performance, the Grader Engine 107 may execute a series of diagnostic commands that capture the actual state of the network to which the device is connected, thus allowing the Grader Engine 107 to analyze real-time information such as ping and traceroute results, adjacencies, routing tables, and the output of other diagnostic commands regarding that device and/or network. The diagnostic commands can be issued either during the lab, for "real time" evaluation, or at the end of the lab, with the results stored in a database for future reporting.
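By way of illustration only, such a capture step might be sketched as follows; the command list and the send_command( ) helper are hypothetical placeholders for the pod's console access rather than part of the described system.

```python
# Minimal sketch of capturing "Device Information" from a pod; all names are
# illustrative assumptions, not the vLab implementation.
from datetime import datetime, timezone

DIAGNOSTIC_COMMANDS = [
    "show running-config",
    "show cdp neighbors detail",
    "show ip route",
    "ping 10.32.0.1",
]

def send_command(device: str, command: str) -> str:
    """Stub standing in for a console/Telnet session to a lab device."""
    return f"<output of '{command}' captured from {device}>"

def capture_device_information(devices):
    """Collect raw command output per device, timestamped for later grading."""
    snapshot = {"captured_at": datetime.now(timezone.utc).isoformat(), "devices": {}}
    for device in devices:
        snapshot["devices"][device] = {
            cmd: send_command(device, cmd) for cmd in DIAGNOSTIC_COMMANDS
        }
    return snapshot

pod_snapshot = capture_device_information(["WashingtonDC", "Minot", "Leesville"])
print(sorted(pod_snapshot["devices"]))
```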
Further, the Grader Engine 107 may use pattern matching and parsing technology to evaluate the Device Information (104, 105, and 106) against the Evaluation Criteria 103. The pattern matching and parsing technology may be presented in a hierarchy of "functions" for purposes of authoring the assessment. These provide a range of flexibility and power. For example, there can be "general-purpose" functions where the author of the assessment specifies the raw pattern match or parser, "wrapper" functions that are easier to use but less flexible, and "canned" functions that hide the parsing details but are specific in their use.
General-purpose functions involve the use of regular expressions, a pattern-matching language commonly used in UNIX and programming environments. Consequently, these functions are extremely flexible, but more difficult to use because they require the author to understand the regular expression language or other forms of pattern matching logic.
Wrapper functions take a regular expression and other forms of pattern matching logic supplied by the author and automatically "wrap" it inside of a larger regular expression, pattern matcher, or programming logic. Adding this "wrapper" makes the author's job considerably easier because it saves them from having to write complex expressions that only match in the desired context. For example, writing an expression that only matches an IP address on a given interface can be fairly tricky (it is easy for the IP address to inadvertently match on a different interface earlier or later in the config). The interface( ) wrapper function automatically limits the expression to the specified interface (or list of interfaces), allowing the author to concentrate on the much simpler process of matching on something inside that interface (for example, "ip address 1\.1\.1\.1 255\.255\.0\.0" to ensure that the interface has an address of 1.1.1.1 and a /16 mask). Given that many interface (and related) matching tasks only require very basic (or no) wildcard characters, writing a criterion using a wrapper function is normally extremely simple. However, for more complex requirements, the author can always resort to the full power of regular expressions and other forms of pattern matching and parsing logic.
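A minimal sketch of such a wrapper, assuming an IOS-style configuration layout, is shown below; the function name interface_match( ) and its parsing details are illustrative assumptions rather than the actual interface( ) implementation.

```python
import re

def interface_match(config: str, interface: str, pattern: str) -> bool:
    """Return True if `pattern` matches inside the stanza for `interface`.

    Sketch of a "wrapper" function: the author supplies only the inner
    expression and the wrapper limits the search to the named interface block,
    so the match cannot accidentally land under a different interface.
    """
    # An IOS-style stanza starts with "interface <name>" and continues over the
    # following indented lines.
    stanza_re = re.compile(
        r"^interface\s+" + re.escape(interface) + r"\s*\n((?:[ \t].*\n?)*)",
        re.MULTILINE,
    )
    match = stanza_re.search(config)
    return bool(match) and re.search(pattern, match.group(1)) is not None

config = """\
interface Ethernet0
 ip address 10.28.0.1 255.255.0.0
!
interface Serial0
 ip address 1.1.1.1 255.255.0.0
 no shutdown
"""
print(interface_match(config, "Serial0", r"ip address 1\.1\.1\.1 255\.255\.0\.0"))  # True
print(interface_match(config, "Ethernet0", r"ip address 1\.1\.1\.1"))               # False
```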
Canned functions are tailor-made to solve specific assessment requirements. Because they totally insulate the author from having to write complex expressions, they are extremely easy to use. However, their use is also considerably more limited than that of the "general-purpose" and "wrapper" functions. For example, the shCdpNeigh( ) function is only designed to process the output of the "show cdp neighbors" command. Although it is flexible enough to automatically determine if the command was issued with the "detail" option and automatically adjust its logic, it will never be useful for looking at other types of router information (for example, the routing tables). On the other hand, shCdpNeigh( ) is very easy to use: simply tell the function which devices you want to process CDP output from and a list consisting of: (i) a neighbor's name, (ii) the local interface used to reach the neighbor, (iii) the neighbor's interface used to reach the local router, and (iv) a list of Layer 3 protocols and addresses. This, and other, functions can allow "wildcards" to be specified by omitting various parameters.
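A canned checker along these lines might be sketched as follows; the column parsing and the name check_cdp_neighbor( ) are illustrative assumptions rather than the actual shCdpNeigh( ) implementation. Omitting an argument acts as a wildcard, as described above.

```python
import re

def check_cdp_neighbor(show_cdp_output, neighbor=None, local_intf=None, remote_intf=None):
    """Sketch of a "canned" criterion for `show cdp neighbors` output."""
    for line in show_cdp_output.splitlines():
        cols = re.split(r"\s{2,}", line.strip())
        # Expected columns: Device ID, Local Intrfce, Holdtme, Capability, Platform, Port ID
        if len(cols) != 6 or not cols[2].isdigit():
            continue  # skip headers, banners, and blank lines
        device, local, _hold, _caps, _platform, remote = cols
        if ((neighbor is None or device.startswith(neighbor)) and
                (local_intf is None or local == local_intf) and
                (remote_intf is None or remote == remote_intf)):
            return True
    return False

output = """\
Device ID        Local Intrfce   Holdtme  Capability  Platform  Port ID
Leesville        Ser 0           165      R           2520      Ser 1
Minot            Ser 1           164      R           2520      Ser 0
"""
print(check_cdp_neighbor(output, neighbor="Minot", local_intf="Ser 1"))       # True
print(check_cdp_neighbor(output, neighbor="Leesville", remote_intf="Ser 0"))  # False
```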
The Evaluation Criteria 103 may be based on a set of desired learning objectives, which are allocated differing amounts of grading points based on the relative importance of the specific learning objective. By comparing the Device Information (104, 105, and 106) to the Evaluation Criteria 103, the Grader Engine 107 may determine whether the student has met the relevant learning objectives, award full or partial credit, deny credit altogether, and then generate an overall score.
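A minimal sketch of such weighted objectives follows; the Objective structure and grade( ) function are illustrative assumptions rather than the described system's data model.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Objective:
    description: str
    max_points: int
    criterion: Callable[[Dict], float]   # returns a credit fraction between 0.0 and 1.0

def grade(device_info: Dict, objectives: List[Objective]) -> Dict:
    """Score each weighted objective and roll the results up into a raw score."""
    rows = []
    for obj in objectives:
        fraction = max(0.0, min(1.0, obj.criterion(device_info)))
        rows.append({"description": obj.description,
                     "max": obj.max_points,
                     "score": round(obj.max_points * fraction, 1)})
    max_raw = sum(r["max"] for r in rows)
    raw_points = sum(r["score"] for r in rows)
    return {"rows": rows,
            "raw_points": raw_points,
            "raw_score": round(100.0 * raw_points / max_raw, 1) if max_raw else 0.0}

# Example: a heavily weighted objective checked against a captured configuration.
objectives = [
    Objective("'router igrp' present", 45,
              lambda info: 1.0 if "router igrp" in info["configs"]["WashingtonDC"] else 0.0),
]
print(grade({"configs": {"WashingtonDC": "router igrp 2\n network 10.0.0.0\n"}}, objectives))
```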
In addition, the Grader Engine 107 may include a "land mine" feature that deducts points from a student's score when the student enters certain commands into, or takes certain actions with respect to, the device, e.g., enters commands to try to circumvent the learning exercise. That is, the Grader Engine 107 may include the ability to look for certain types of actions that indicate that a student attempted to "cheat" the exercise.
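By way of illustration only, such a check might scan the user's command transcript as sketched below; the flagged commands and point penalties are invented examples.

```python
# Hypothetical "land mine" scan of the user's command transcript; the flagged
# commands and penalties below are examples only.
LAND_MINES = {
    "copy tftp": 50,          # pasting in a prepared configuration
    "configure replace": 50,  # wholesale config replacement
}

def apply_land_mines(transcript, score):
    """Deduct points for each transcript command that trips a land mine."""
    for command in transcript:
        for prefix, penalty in LAND_MINES.items():
            if command.strip().lower().startswith(prefix):
                score -= penalty
    return max(score, 0.0)

print(apply_land_mines(["show ip route", "copy tftp running-config"], 88.5))  # 38.5
```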
Further, the Grader Engine 107 may include the capability to grant partial credit. The granting of partial credit may be based either on pre-established criteria or on new criteria established by the Grader Engine 107 based on specific Device Information (104, 105, and 106). This may be accomplished by the Grader Engine 107 using the above-described pattern matching and parsing technology, as well as by establishing a logical hierarchy between multiple criteria. This feature allows the Grader Engine 107 to assess a multitude of possible solutions a student may arrive at in trying to perform the designated tasks. Furthermore, use of pattern matching and parsing technology to permit an automated grading approach does not require that the author specifically address every possible solution to the learning exercise.
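For illustration, a logical hierarchy of criteria could be composed from small helper functions such as the any_of( )/all_of( ) sketch below; these names and scoring rules are assumptions rather than the described implementation. Full credit is granted when any acceptable alternative is found, and partial credit is scaled by how many required sub-criteria are satisfied.

```python
import re

def any_of(*patterns):
    """Full credit if any one of several acceptable solutions appears in the config."""
    return lambda config: 1.0 if any(re.search(p, config, re.MULTILINE) for p in patterns) else 0.0

def all_of(*patterns):
    """Partial credit proportional to how many required sub-criteria are satisfied."""
    return lambda config: sum(bool(re.search(p, config, re.MULTILINE)) for p in patterns) / len(patterns)

# Accept either routing protocol, but require both networks to be advertised.
routing_present = any_of(r"^router igrp \d+", r"^router ospf \d+")
networks_advertised = all_of(r"^ network 10\.0\.0\.0", r"^ network 172\.16\.0\.0")

config = "router igrp 2\n network 10.0.0.0\n"
print(routing_present(config))       # 1.0
print(networks_advertised(config))   # 0.5, i.e. half credit
```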
In addition, the system may include a Help Engine 108 that permits the student to link to other information related to a specific learning objective. These links may include technical notes, reference materials, and listings of classes or seminars that address that objective, among others. The Help Engine 108 is a software module that is triggered when the user selects a help link or function from one of the various types of feedback reporting produced by the Grader Engine 107 and its associated output modules. In generating the help information, the Help Engine 108 will access information in the Evaluation Criteria 103 and other possible sources such as a meta-database of remedial information and feedback.
The results generated by the Grader Engine 107 may be used to feed a variety of other outputs, such as an HTML Rendering Engine 109, an XML Engine 111, or other forms of output 110, which in turn can, among other things, generate a variety of reports, including one that lists the learning objectives, the maximum number of grading points allocated to each learning objective, and the actual number of points awarded to the student based on his or her performance. The HTML Engine 109 is a software process that generates information to be sent to a web browser via a network such as the Internet. The XML and other output engines are similar software processes, but they can output the results of the assessment information in a wide variety of report and data transfer formats.
In addition, there may be sections of the report that a user may click on to link to information regarding specific learning objectives, the corresponding configurations, and/or state(s) resulting from the student's performance in the learning exercise. This may be useful in highlighting what the student did correctly or incorrectly. These sections that the user may click on may be identified, for example, by shading certain words a particular color, underlining certain words, or by particular icons.
The system may also include a variety of security and administrative features 112 as well as other applications 113. Examples of these features include allowing the system administrator to prohibit a student from accessing the help function, viewing details of the lab before a testing situation, or taking a test more than once, as well as disabling various "mentoring" features in a testing situation and disabling certain levels of detail in the output report.
Figure 4 provides an illustration of the Results For Report 310 that was previously discussed in reference to Figure 3. This report may include overhead-type information. For example, as illustrated, this report may include the user's name 410, the title for the assignment 420, the time the assignment was purchased or selected by the user 430, the time it was started by the user 440, the time it was completed 450, the user's IP address 460, a title or identification for the pod used during the assignment 470, and the number of times the user attempted this particular assignment 480.
Figure 5 illustrates an example of the Analysis Matrix Report 320 that was previously discussed in regard to Figure 3. As illustrated, this report lists various learning objectives 510 that the user is assessed on. Each learning objective may include a key 520 that may include the words "Show Me" or a similar icon. For learning objectives where the key includes the words "Show Me," the user may click on these words to jump to relevant sections of the configuration code created during the assignment that enable the user to see what they did right and what they did wrong during the assignment. Further, these keys (e.g., Show Me) may be color-coded or shaded a particular color. This color or shading may then be used as described below in reference to the configuration reports and non-configuration reports.
In addition, a description 530 may be presented for each learning objective. Further, a maximum score field 540 may be listed for each learning objective. This maximum score field shows the total points that may be awarded for this learning objective if it is completed successfully. In addition, a score field 550 may be listed for each learning objective. This score field 550 lists the score that the user was awarded for the learning objective. As shown, partial credit may be awarded to a user who is not completely successful in completing the learning objective. Also, a help link 560 may be presented for each learning objective. A user may click on this help link to view additional information regarding this learning objective, such as technical notes, reference materials, classes, other distance learning components, etc. In addition, this report may include information regarding the maximum possible raw points 572, the user's raw points 574, the user's raw score 576, any administrative adjustment 578, and the user's final score 580.
Figure 6 illustrates an example of the Configuration Report 330 that was previously discussed in reference to Figure 3. As discussed above, with reference to Figure 5, a user may click on the text "Show Me" in the Analysis Matrix Report to jump to relevant sections of the configuration code. For example, by clicking on the "Show Me" text for a learning objective, the user may be presented with a Configuration Report 330 regarding the learning objective, such as illustrated in Figure 6. Further, various information in the Configuration Report may be identified by a color or shading corresponding to the learning objective for which the "Show Me" text was selected.
As shown, the Configuration Report 330 may include information regarding each of the devices in the pod 101. In the example illustrated, these devices include a router for Washington, DC 610, a router for Minot 620, and a router for Leesville 630. For each of these devices, the Configuration Report may include information regarding the configuration for the device.

Figure 7 illustrates an example of the Non-Configuration Report 340 that was previously discussed in reference to Figure 3. As previously discussed, the Grader Engine 107 may execute a series of diagnostic commands that capture the actual state of the network. This therefore allows the engine to analyze real-time traffic, such as ping, traceroute, adjacencies, routing tables, and other show commands. Further, as discussed above with regard to the Configuration Report 330, information in the Non-Configuration Report 340 may be identified by a particular color or shading. This shading or color preferably corresponds to the shading or color of the "Show Me" key for a particular learning objective. This helps a user to quickly identify the information in these reports that corresponds to the particular learning objective.
Figure 8 illustrates an example of a screen that may be presented to a user that clicks on one of the help links 560 illustrated in Figure 5.
In addition to the above, the user may access a User Administration screen. Figure 9 illustrates an example of a User Administration Screen 900 that may be presented to a user, teacher, or system administrator. As shown, this screen may list the various users that performed particular assignments by last name 902, first name 904, login ID 906, and group 908. Further, this screen may list the descriptions 910 for the assignments performed, along with their score 912 and the attempt number 914 for the score. For users with more than one attempt, the score for each attempt may be listed by clicking on the attempt number and then selecting the attempt number for which the user desires to view the score. In addition, buttons may be presented that allow the user to view the report 916 and the user's options 918. The user's options may include, for example, setting the administrative group the user belongs to, as well as certain administrative flags that control behaviors such as multiple attempts at a single exercise and removal of invalid test results.
Further, a data export button 920 may be presented to allow the data to be exported to a printer, floppy drive, or some other storage device, or in a variety of formats that can be read by other systems, software packages, and databases. For example, this feature can be used to export the data to spreadsheet software. Further, scroll-downs or filters may be provided that allow a user to view the performances by individuals in a particular group 922, by the lab or assignment taken 924, or by the time or day during which the assignment was performed 926. Also, a Hidden function 928 is illustrated that, if selected, hides or removes invalid test results from reports and export screens by default.
The above-defined methods may be performed by one or more computers or servers that is/are capable of obtaining the above-described information. In addition, the above-described method may be embodied in software that one or more processors in one or more computers are capable of executing.
Also, although the above-described methods and systems were discussed with reference to routers, they may also be used for any other types of devices, such as switches, computers, servers, PLCs, etc. Further, the above-described methods and systems also may be applied to assess a user with regard to software, such as NT, MSWord, UNIX, etc.
Appendix A presents various figures concerning an application of the above-described methods and systems as used in vLab™ systems with routers. Appendix B presents text corresponding to these figures.
While it has been illustrated and described what is at present considered to be the preferred embodiment and methods of the present invention, it will be understood by those skilled in the art that various changes and modifications may be made, and equivalents may be substituted for elements thereof without departing from the true scope of the invention.
In addition, many modifications may be made to adapt a particular element, technique, or implementation to the teachings of the present invention without departing from the central scope of the invention.
If a user "purchases" and takes one of these assessment labs then they might see the "Assess Me!" button appear while they are in the console.
The "Assess Me!" button will only appear if the super-admin has selected the "Allow Assessment During Lab" flag. This feature can be enabled in situations where real-time feedback and guidance is appropriate - it's the ultimate form of "hints and tips" for the user! Once the user has completed the lab, a process automatically runs that evaluates the user's results and writes a grade to the database and grade log. When the user returns to their locker, they will now be presented with an "Assessment" option for the completed assessment labs.
[Figure: locker entry listing "0303. BSCN Assessment", marked Completed, with an "Assessment" option selected]
If they select this and hit the "Go!" button, the Assessment Analysis Report will open.
[Appendix figures: an example Assessment Analysis Report. The header section lists Time Added To Locker (Oct 11 12:49:54 2000), Time Started (Oct 11 12:59:45 2000), Time Completed (Oct 11 13:25:57 2000), User IP Address (1.1.1.1), Pod Used (1), and Attempt Number (3rd). The Analysis Matrix section lists each learning objective with its key, description, maximum points, awarded score, and help link, for example "'router rip' removed" (30 max, 10 awarded), "'router igrp' present" (45, 45), "'network 10.0.0.0' present under IGRP" (30, 30), "Bandwidth parameter present" (30, 10), serial-interface subnet checks (15, 15 each), "All 6 links use different subnets" (20, 20), "All 9 addresses are unique" (65, 65), "All 3 use same IGRP AS #" (45, 30), "All addrs fall btw. 10.28.0.0 & 10.33.0.0" (45, 45), "All interfaces use /16 subnet masks" (90, 90), and ping/CDP reachability checks between the DC, Minot, and Leesville routers. The totals show Raw Points of 500 out of 565, a Raw Score of 88.5, an Admin Adjustment of 2.6, and a Final Score of 91.1. The Configurations section shows the captured running configurations of the WashingtonDC, Minot, and Leesville routers (IOS version 11.3, hostnames, enable passwords, interface IP addresses with /16 masks, IGRP routing with network 10.0.0.0, and MOTD banners), with the graded lines shaded. The Non-Configuration Information section shows "show cdp neighbors", extended ping, "show ip route summary", and traceroute output captured from the routers.]
[Figure: Mentor Technologies vLab Assessment help screen]
This test verifies that you correctly assigned IP addresses within your network.
IP addresses have two parts:
• A network portion
• A host portion
In most Layer 3 addressing schemes, the boundary between these parts is fixed. (For example, in IPX, the network portion is always 32 bits, while the host portion is always 48 bits.)
In IP, the boundary between the network and host portions of an address is variable. The subnet mask is used to determine where the boundary has been placed!
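As a brief worked example (using an address from the sample lab configurations and Python's standard ipaddress module), applying a /16 mask to 10.28.0.1 places the boundary after the first two octets:

```python
import ipaddress

# 10.28.0.1 with mask 255.255.0.0 (/16): the first 16 bits are the network
# portion and the last 16 bits are the host portion.
iface = ipaddress.ip_interface("10.28.0.1/255.255.0.0")
print(iface.network)                # 10.28.0.0/16  (network portion)
print(int(iface.ip) & 0x0000FFFF)   # 1             (host portion)
```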
For details on the command to assign IP addresses, you may wish to refer to the following documentation link:
• ip address ip-address mask
For more information on IP, please refer to the following links:
Mentor Technologies Whiteboard Animation: IP Subnet Planning
Cisco Technology Brief on IP
Click below for a description of the instructor led training course which covers this command:
ICND - Interconnecting Cisco Network Devices
[Figure: vLab Assessment Admin summary screen, with filters for Group, Lab, Period, and Hidden results, listing each user's last name, first name, login ID, group, assessment, score, and attempt number, along with links to view the report and the user's options.]
Cell 1 (Introduction)
Narration: Mentor Technologies is pleased to announce its vLab Assessment Technology - the world's first real-time, skills-based assessment that gauges a Cisco learner's true readiness to perform. It's not what you "know." It's what you "do" that counts.
Visual: Keep the theme of gears and knowledge from the introduction. Highlight "VLAB ASSESSMENT TECHNOLOGY." Words on screen: "It's not what you know. It's what you do that counts."

Cell 2
Narration: Through our new vLab Assessment Technology, we bring two new dynamics to Cisco learning - performance-based testing and mentoring.
Visual: Words on screen: Performance-based testing; Mentoring.

Cell 6
Narration: This opens the Assessment Analysis Report containing 4 primary sections. The heart of the report is the "Analysis Matrix."
Visual: Show the report with all 4 sections together (Screen 2). Blow out the "Analysis Matrix" portion of the report (2nd area) (Screen 3).

Cell 7
Narration: Each row contains a description of the learning objective, and the learner's actual performance against that objective.
Visual: Burst out on an entire row.

Cell 8
Narration: The "Max" field shows the total possible points awarded if the objective is met successfully.
Visual: Highlight the "Max" column.

Cell 9
Narration: And the "Score" field reflects the learner's score in meeting that objective. Learners quickly learn to gauge their progress, as partial credit is reflected in the "Score" field if the objective is not totally met.
Visual: Highlight the "Score" column. Burst out on a red item where the score is lower than the Max column.

Claims

WHAT IS CLAIMED IS:
1. A method for assessing a user regarding control of one or more devices, comprising:
    comparing information regarding a configuration of at least one of the devices against at least one evaluation configuration criteria;
    comparing information regarding state information for the device against at least one evaluation state criteria; and
    assessing the user using the above comparisons.

2. The method of claim 1, further comprising the step of:
    obtaining information using the Simple Network Management Protocol (SNMP); and
    comparing the information obtained using the Simple Network Management Protocol (SNMP) against at least one evaluation criteria,
    wherein the step of assessing the user includes using the comparison using the information obtained using the Simple Network Management Protocol (SNMP).

3. A method for assessing a user regarding control of one or more devices, comprising:
    comparing information regarding at least one of the devices against at least one evaluation criteria;
    assigning one or more weights to one or more of the evaluation criteria;
    generating at least one partial credit value in regard to the comparison; and
    assessing the user using the above comparisons and the one or more devices.

4. A method for assessing a user regarding control of one or more devices, comprising:
    comparing information regarding at least one of the devices against at least one evaluation criteria;
    generating at least one partial credit value based on the comparison; and
    assessing the user using the above comparisons and the at least one partial credit value.

5. The method of claim 1, further comprising the step of:
    providing a report regarding the assessment.

6. The method of claim 5, wherein the report provides one or more of the following capabilities:
    linking to help information;
    linking to information regarding the configuration of the one or more devices; and
    linking to information regarding the states of the one or more devices.

7. The method of claim 1, further comprising:
    remotely accessing the one or more devices; and
    exercising control over the device by the user to perform a training exercise, wherein the user is assessed based on their performance of the training exercise.
PCT/US2001/043056 2000-11-13 2001-11-13 Method and system for assessment of user performance WO2002052528A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP01988135A EP1344202A1 (en) 2000-11-13 2001-11-13 Method and system for assessment of user performance
US10/415,465 US20040110118A1 (en) 2001-11-13 2001-11-13 Method and system for assessment of user performance

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US24703500P 2000-11-13 2000-11-13
US60/247,035 2000-11-13

Publications (1)

Publication Number Publication Date
WO2002052528A1 true WO2002052528A1 (en) 2002-07-04

Family

ID=22933271

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/043056 WO2002052528A1 (en) 2000-11-13 2001-11-13 Method and system for assessment of user performance

Country Status (2)

Country Link
EP (1) EP1344202A1 (en)
WO (1) WO2002052528A1 (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4295831A (en) * 1979-04-16 1981-10-20 Matt Frederick C Computer programming training device
US4652240A (en) * 1984-11-13 1987-03-24 Wackym Phillip A Interactive training system
US5616876A (en) * 1995-04-19 1997-04-01 Microsoft Corporation System and methods for selecting music on the basis of subjective content
US5823781A (en) * 1996-07-29 1998-10-20 Electronic Data Systems Coporation Electronic mentor training system and method
US5991693A (en) * 1996-02-23 1999-11-23 Mindcraft Technologies, Inc. Wireless I/O apparatus and method of computer-assisted instruction
US6033226A (en) * 1997-05-15 2000-03-07 Northrop Grumman Corporation Machining tool operator training system
US6099317A (en) * 1998-10-16 2000-08-08 Mississippi State University Device that interacts with target applications
US6193519B1 (en) * 1996-05-08 2001-02-27 Gaumard Scientific, Inc. Computerized education system for teaching patient care
US6196846B1 (en) * 1998-06-02 2001-03-06 Virtual Village, Inc. System and method for establishing a data session and a voice session for training a user on a computer program
US6308042B1 (en) * 1994-06-07 2001-10-23 Cbt (Technology) Limited Computer based training system
US6341212B1 (en) * 1999-12-17 2002-01-22 Virginia Foundation For Independent Colleges System and method for certifying information technology skill through internet distribution examination
US6371765B1 (en) * 1999-11-09 2002-04-16 Mciworldcom, Inc. Interactive computer-based training system and method


Also Published As

Publication number Publication date
EP1344202A1 (en) 2003-09-17

Similar Documents

Publication Publication Date Title
US7016949B1 (en) Network training system with a remote, shared classroom laboratory
Lesgold SHERLOCK: A coached practice environment for an electronics troubleshooting job.
JP2021073486A (en) Mission-based, game-implemented cyber training system and method
US8554536B2 (en) Information operations support system, method, and computer program product
US7171155B2 (en) Learning support method and learning support program
US20040110118A1 (en) Method and system for assessment of user performance
Lahoud et al. Information security labs in IDS/IPS for distance education
CN105719208A (en) Examination system of trouble shooting of marine affairs
WO2002052528A1 (en) Method and system for assessment of user performance
DE60009935T2 (en) METHOD AND DEVICE FOR COMPUTER-ASSISTED TRAINING IN THE USE OF APPLIANCES CONTROLLED
KR20010097917A (en) Method of education using internet
US20100151433A1 (en) Test and answer key generation system and method
Atwater et al. Live Lesson: Netsim: Network simulation and hacking for high schoolers
Uramová et al. Best practise for creating Packet Tracer activities for distance learning and assessment of practical skills
US11412016B2 (en) Gamified virtual conference with network security training of network security products
Montero et al. Design and deployment of hands-on network lab experiments for computer science engineers
Lobo et al. Teaching E-learning for Students with Visual Impairments
Bauer et al. Using evidence-centered design to develop advanced simulation-based assessment and training
Csengody et al. Automated Evaluation of a Network Device Configuration
Papadimitriou et al. FSP Creator: A novel web service API creator of fuzzy students progress profile
Press Cisco CCNA Exam# 640-507 Certification Guide
McDonald et al. Practical experiences for undergraduate computer networking students
KR20030000458A (en) Apparatus for and method of network operation practice based on the internet
Jakab et al. Virtual lab in a distributed international environment-svc edinet
KR20020004392A (en) method for repeating practice using the internet

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2001988135

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 10415465

Country of ref document: US

WWP Wipo information: published in national office

Ref document number: 2001988135

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP

WWW Wipo information: withdrawn in national office

Ref document number: 2001988135

Country of ref document: EP