|Publication number||US20050026130 A1|
|Publication type||Application|
|Application number||US 10/870,550|
|Publication date||Feb. 3, 2005|
|Filing date||Oct. 18, 2004|
|Priority date||Jun. 20, 2003|
|Also published as||CN1836268A, CN100585662C, US8798520, US20110287400, US20140377734|
|Inventors||Christopher Crowhurst, Doug Boone, Roger Kershaw|
|Original Assignee||Christopher Crowhurst, Doug Boone, Kershaw Roger C.|
This application is related to and claims priority from U.S. Provisional Patent Application No. 60/479,952, filed Jun. 20, 2003, the disclosure of which is incorporated herein by reference in its entirety, and is further related to U.S. Patent Publication No. 20030203342, published on Oct. 30, 2003 and entitled “METHOD AND SYSTEM FOR COMPUTER BASED TESTING USING CUSTOMIZABLE TEMPLATES”, U.S. Patent Publication No. 20030196170, published on Oct. 16, 2003 and entitled “METHOD AND SYSTEM FOR COMPUTER BASED TESTING USING A NON-DETERMINISTIC EXAM EXTENSIBLE LANGUAGE (XXL) PROTOCOL”, U.S. Patent Publication No. 20030182602, published on Sep. 25, 2003 and entitled “METHOD AND SYSTEM FOR COMPUTER BASED TESTING USING PLUGINS TO EXPAND FUNCTIONALITY OF A TEST DRIVER”, and U.S. Patent Publication No. 20030138765, published on Jul. 24, 2003 and entitled “METHOD AND SYSTEM FOR COMPUTER BASED TESTING USING AN AMALGAMATED RESOURCE FILE”, and U.S. Patent Publication No. 20030129573, published on Jul. 10, 2003 and entitled “EXTENSIBLE EXAM LANGUAGE (XXL) PROTOCOL FOR COMPUTER BASED TESTING”, all of which were filed concurrently and all of which are incorporated herein by reference in their entirety.
1. Field of the Invention
The present invention is related to systems and methods used to facilitate computer based testing, such as those that utilize network systems. More particularly, the present invention uses cacheable objects to expand the functionality of a test driver application. Even more particularly, such cacheable objects include cacheable data objects and cacheable program and application objects that may be used by a test driver application in facilitating test taking and administration.
2. Description of the Related Art
For many years, standardized testing has been a common method of assessing examinees as regards educational placement, skill evaluation, etc. Due to the prevalence and mass distribution of standardized tests, computer-based testing has emerged as a superior method for providing standardized tests, guaranteeing accurate scoring, and ensuring prompt return of test results to examinees.
Tests are developed based on the requirements and particulars of test developers. Typically, test developers employ psychometricians or statisticians and psychologists to determine the specific requirements of human assessment. These experts often have their own, unique ideas regarding how a test should be presented and regarding the necessary contents of that test, including the visual format of the test as well as the data content of the test. Therefore, a particular computer-based test has to be customized to fulfill the client's requirements.
An item presenter is then written to present the new item, for example, to the test driver, step 18. Presenting the new item to the test driver requires a modification of the test driver's executable code. The test driver must be modified so that it is aware of the new item and can communicate with the new item presenter, step 20. The test packager must then also be modified, step 22. The test packager, which may also be a compiler, takes what the test publisher has created and writes the result as new object codes for the new syntax. Subsequently, the scoring engine must also be modified to be able to score the new item type, step 24. Finally, the results processor must be modified to be able to accept the new results from the new item, step 26. This process requires no less than seven software creations or modifications to existing software.
U.S. Pat. No. 5,827,070 (Kershaw et al.) and U.S. Pat. No. 5,565,316 (Kershaw et al.) are incorporated herein by reference. The '070 and '316 patents, which have similar specifications, disclose a computer-based testing system comprising a test development system and a test delivery system. The test development system comprises a test document creation system for specifying the test contents, an item preparation system for computerizing each of the items in the test, a test preparation system for preparing a computerized test, and a test packaging system for combining all of the items and test components into a computerized test package. The computerized test package is then delivered to authorized examinees on a workstation by the test delivery system.
U.S. Pat. No. 5,513,994 (Kershaw et al.), which is incorporated herein by reference, discloses a centralized administrative system and method of administering standardized tests to a plurality of examinees. The administrative system is implemented on a central administration workstation and at least one test workstation located in different rooms at a test center. The software that provides substantially administrative functions is executed from the central administration workstation, while the software that provides functions carried out in connection with a test session is executed from the testing workstations.
However, computer-based testing has expanded from standalone distribution administered at a local test center to wide area network distribution administered via clustered servers at multiple locations. Thus, a distributed computer-based testing system requires scalability to support continuous exam administration and a high volume of concurrent test candidates who may be located at many remote locations.
Additionally, computer-based tests have evolved from mere display of simple text-based content to include streaming of audio and video content. Thus, a distributed computer-based testing system demands sufficient system resources and storage capacity as well as efficient data communication management to serve bandwidth-intensive multimedia content in a consistent manner.
Moreover, computer-based test models have advanced to include adaptive and simulation test models. Thus, a distributed computer-based testing system must support a variety of complex test models.
Further, a distributed computer-based testing system must facilitate a fair testing environment within a dynamic networked environment to test candidates who may have varying workstation capabilities or network connectivity. A number of factors affect the creation and maintenance of a fair testing environment, including bandwidth mismatches and network latency between a test candidate workstation and a test distribution server as well as between a test distribution server and a test source server, the available system resources of the test source server, the test distribution servers and the test candidate workstations, and test component characteristics (e.g., whether the object is text, audio or video). Thus, it is necessary to monitor candidate progress, candidate performance, network bandwidth, network latency, and server response, among other testing environment variables, during computer-based testing and cache test components in response to changes in the testing environment in order to ensure timely and consistent delivery of the computer-based test. In other words, a distributed computer-based testing system must be adjustable to emulate a suitable testing environment on test candidate workstations concurrently executing the same computer-based test.
The present invention discloses a computer-based testing system that controls delivery of a computer-based test to a high volume of concurrent test candidates and that adapts delivery of the computer-based test in response to changes in the testing environment.
It is one feature and advantage of the present invention to deliver computer-based tests to a high volume of concurrent test candidates located at multiple locations.
It is another feature and advantage of the present invention to securely administer computer-based tests among concurrent test candidates located at multiple locations.
It is another feature and advantage of the present invention to monitor candidate progress during computer-based testing for ensuring that test components are timely available for delivery to a test candidate during testing.
It is another feature and advantage of the present invention to monitor candidate performance during computer-based testing for ensuring that suitable test components are available for delivery to a test candidate during testing, for example, to support Computer Adaptive Testing (CAT).
It is another feature and advantage of the present invention to monitor network bandwidth during computer-based testing for adapting delivery of a computer-based test to a test candidate in accordance with the network bandwidth.
It is another feature and advantage of the present invention to monitor network latency during computer-based testing for adapting delivery of a computer-based test to a test candidate in accordance with the network latency.
It is another feature and advantage of the present invention to monitor server response during computer-based testing for adapting delivery of a computer-based test to a test candidate in accordance with the server response.
It is another feature and advantage of the present invention to enable a test candidate to launch a computer-based test on the candidate's workstation prior to all the test components having been delivered to the candidate workstation.
It is another feature and advantage of the present invention to enable a test candidate to continue computer-based testing when network connectivity fails during computer-based testing.
These and other features and advantages of the present invention are achieved in systems and methods for computer-based testing including a test driver application that controls delivery of a computer-based test to a test candidate. Particularly, the system includes cacheable objects, including cacheable data objects (test items) and cacheable program and application objects (plugins), collectively, test components, to expand the functionality of the test driver application, enabling the test driver application to control caching of test components in response to changes in the testing environment during delivery of a computer-based test to a candidate workstation. The systems and methods include monitoring candidate progress, candidate performance, network bandwidth, network latency and server response during delivery of the computer-based test and adjusting either the source of test components or the volume of test components being cached for delivery of the test. Based upon such monitoring, for example, if network communication failure is detected, the test candidate is able to continue computer-based testing while connectivity is being reestablished in the background.
There has thus been outlined, rather broadly, the more important features of the invention and the preferred embodiments in order that the detailed description thereof that follows may be better understood, and in order that the present contribution to the art may be better appreciated. There are, of course, additional features of the invention that will be described hereinafter and which will form the subject matter of the claims appended hereto.
In this respect, before explaining the preferred embodiments of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction and to the arrangements of the components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments and of being practiced and carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting.
As such, those skilled in the art will appreciate that the conception, upon which this disclosure is based, may readily be utilized as a basis for the designing of other structures, methods and systems for carrying out the several purposes of the present invention. It is important, therefore, that the claims be regarded as including such equivalent constructions insofar as they do not depart from the spirit and scope of the present invention.
Further, the purpose of the foregoing abstract is to enable the U.S. Patent and Trademark Office and the public generally, and especially the scientists, engineers and practitioners in the art who are not familiar with patent or legal terms or phraseology, to determine quickly from a cursory inspection the nature and essence of the technical disclosure of the application. The abstract is neither intended to define the invention of the application, which is measured by the claims, nor is it intended to be limiting as to the scope of the invention in any way.
These, together with other objects of the invention, along with the various features of novelty, which characterize the invention, are pointed out with particularity in the claims annexed to and forming a part of this disclosure. For a better understanding of the invention, its operating advantages and the specific objects attained by its uses, reference should be had to the accompanying drawings and descriptive matter in which there is illustrated preferred embodiments of the invention.
Reference now will be made in detail to the presently preferred embodiments of the invention. Such embodiments are provided by way of explanation of the invention, which is not intended to be limited thereto. In fact, those of ordinary skill in the art may appreciate upon reading the present specification and viewing the present drawings that various modifications and variations can be made.
For example, features illustrated or described as part of one embodiment can be used on other embodiments to yield a still further embodiment. Additionally, certain features may be interchanged with similar devices or features not mentioned yet which perform the same or similar functions. It is therefore intended that such modifications and variations are included within the totality of the present invention.
The present invention discloses a system and method of computer-based testing using a test driver that is, for example, object-oriented and is architected to dynamically add functionality through, for example, the use of an expansion module, and preferably through the use of plugins. The test driver preferably references component object model servers using standard interfaces, and uses, for example, class names (that can be an Active Document) defined in a custom test definition language entitled Extensible Exam Language (“XXL”), based on the Extensible Markup Language (“XML”) format, to interact with existing applications while offering the flexibility of allowing development of new plugins. These new plugins can be customized to a client's needs without changing the core test driver. The specific format and protocol of XXL is also described in U.S. Patent Publication No. 20030129573, published Jul. 10, 2003 and entitled “EXTENSIBLE EXAM LANGUAGE (XXL) PROTOCOL FOR COMPUTER BASED TESTING”, incorporated herein by reference.
The plugins advantageously enable the test driver to support, for example, new item types, navigation algorithms, information displays, scoring algorithms, timing algorithms, test unit selection algorithms, results persistence reporting, printed score reporting, and/or helm types without change to the test driver's executable. Plugins also allow expansion of the test driver's functionality without requiring the test driver to be recompiled or re-linked, and without requiring the test publisher to learn to program. Since plugins are written independently of the test driver, plugins can be written long after the test driver is built. The client and the software developer can design and test the plugins and distribute the plugins to each test site. By using this method, large-scale regression testing of other examinations will not usually be necessary unless changes are made to the plugins that may be used by many examinations. The specific use of plugins is described in U.S. Patent Publication No. 20030182602, published Sep. 25, 2003 and entitled “METHOD AND SYSTEM FOR COMPUTER BASED TESTING USING PLUGINS TO EXPAND FUNCTIONALITY OF A TEST DRIVER”, incorporated herein by reference.
The test driver of the present invention controls delivery of a computer-based test to a test candidate, including controlling caching of test components during delivery of the test. In accordance with monitoring of the testing environment, including monitoring candidate progress, candidate performance, network bandwidth, network latency and server response during delivery of the test, the test driver adjusts either the source of test components or the volume of test components being cached for delivery of the test. Based on such monitoring, for example, if network communication failure is detected, the test candidate is able to continue computer-based testing while connectivity is being reestablished in the background. By using this system, a uniform testing environment can be established and maintained during computer-based testing in a distributed computer-based testing environment.
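As an illustration only, the following C++ sketch shows one way such an adjustment policy could be expressed. The type names, fields, and thresholds (EnvironmentSample, CachePolicy, the 128 kbps and 500 ms limits, and so on) are hypothetical assumptions, not values taken from the disclosed test driver.

#include <cstddef>

// Hypothetical snapshot of the testing-environment variables described above.
struct EnvironmentSample {
    double bandwidthKbps;     // measured candidate-to-server bandwidth
    double latencyMs;         // measured round-trip latency
    double serverResponseMs;  // measured source/distribution server response time
    bool   connected;         // false if network communication failure is detected
};

// Hypothetical caching decision: how many upcoming test components to prefetch
// and whether to serve them from the local cache or from the distribution server.
struct CachePolicy {
    std::size_t prefetchCount;
    bool        useLocalCache;
};

CachePolicy adjustCaching(const EnvironmentSample& env) {
    CachePolicy policy{4, false};                     // assumed defaults
    if (!env.connected) {
        // Continue testing from already-cached components while connectivity
        // is re-established in the background.
        policy.useLocalCache = true;
        policy.prefetchCount = 0;
    } else if (env.bandwidthKbps < 128.0 || env.latencyMs > 500.0) {
        // Poor link: cache more components ahead of the candidate's progress.
        policy.prefetchCount = 16;
    } else if (env.serverResponseMs > 1000.0) {
        // Slow source server: prefer components already replicated locally.
        policy.useLocalCache = true;
    }
    return policy;
}

Under this sketch, a drop in measured bandwidth widens the prefetch window, while a detected disconnection switches delivery entirely to components already cached locally.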
I. Overview of Computer-Based Test Delivery System
A test specification is authored by a test publisher according to the specifications of the client and stored in exam source files 130. Exam source files 130 include data files 132, XXL files 134, multimedia files 136, and hypertext markup language (“HTML”) files 138. XXL files 134 include the test specification, which contains the client's requirements for the test, a bank of test items or questions, templates that determine the physical appearance of the test, plugins, and any additional data necessary to implement the test. Additional data is also stored in data files 132. For example, an adaptive selection plugin may need a, b, and c theta values. These values are stored in a binary file created by a statistical package.
HTML files 138 include, for example, any visual components of the test, such as the appearance of test items or questions, the appearance of presentations on the display device, the appearance of any client specified customizations, and/or the appearance of score reports. HTML files 138 preferably also include script, for example, VBScript, JScript, or JavaScript. HTML files 138 are preferably authored using Microsoft's FrontPage 2000. FrontPage 2000 is preferably also used to manage the source files in a hierarchy that is chosen by the test publisher. Multimedia files 136 include, for example, any images (.jpg, .gif, etc.) and/or sound files (.mp3, .wav, .au, etc.) that are used during the test.
XXL compiler 140 retrieves XXL files 134 from exam source files 130 using interface 190 and compiles the XXL test content stored in XXL files 134. XXL compiler 140 stores the compiled test files in exam resource file 120. In another embodiment, exam source files 130 do not contain XXL files 134 and contain, for example, only multimedia files. In this embodiment, XXL compiler 140 is merely a test packager that writes the data directly to exam resource file 120 without modification or validation. The data appears in a stream under the “data” branch of exam resource file 120. The name of the stream is specified by the test author.
In a preferred embodiment, XXL files 134 also include XXL language that defines plugins 150, in which case, plugins 150 assist XXL compiler 140 in compiling XXL files 134. Test driver 110 preferably supports, for example, nine different types of plugins 150, including, for example: display plugin 152; helm plugin 154; item plugin 156; timer plugin 158; selection plugin 160; navigation plugin 162; scoring plugin 164; results plugin 166; and report plugin 168. Plugins 150, which are also included in XXL files 134, are the first XML files compiled into exam resource file 120.
Plugins 150 allow a test designer to customize the behavior of test driver 110 and are divided into two types, for example: visible plugins and invisible plugins, as shown in the accompanying figures.
An application or component that uses objects provided by another component is called a client. Components are characterized by their location relative to clients. An out-of-process component is an .exe file that runs in its own process, with its own thread of execution. Communication between a client and an out-of-process component is therefore called cross-process or out-of-process communication.
An in-process component, such as a .dll or .ocx file, runs in the same process as the client. It provides the fastest way of accessing objects, because property and method calls don't have to be marshaled across process boundaries. However, an in-process component must use the client's thread of execution.
Exam resource file 120 receives the compiled test content from XXL compiler 140 and plugins 150, if applicable, and stores the compiled test content in an object-linking and embedding (“OLE”) structured storage format, called POLESS, which is described in greater detail below. Other storage formats may optionally be used. OLE allows different objects to write information into the same file, for example, embedding an Excel spreadsheet inside a Word document. OLE supports two types of structures, embedding and linking. In OLE embedding, the Word document of the example is a container application and the Excel spreadsheet is an embedded object. The container application contains a copy of the embedded object, and changes made to the embedded object affect only the container application. In OLE linking, the Word document of the example is the container application and the Excel spreadsheet is a linked object. The container application contains a pointer to the linked object and any changes made to the linked object change the original linked object. Any other applications that link to the linked object are also updated. POLESS supports structured storage such that only one change made to an object stored in exam resource file 120 is globally effective. Test driver 110 comprises Active Document container application 112 for the visible plugins, display plugin 152, helm plugin 154, and item plugin 156, which function as embedded objects, preferably COM objects.
Both XXL compiler 140 and plugins 150 are involved in storing the compiled test content into exam resource file 120, if any of plugins 150 are being used. Exam resource file 120 comprises, for example, a hierarchical storage structure, as will be described in further detail below. Other storage structures may optionally be used. XXL compiler 140 determines to which storage location a specific segment of the compiled test content is to be stored. However, if any of plugins 150 are used to validate any portion of the data from exam source files 130, then the plugins 150 store the data directly to the exam resource file, based upon directions from XXL compiler 140. XXL compiler 140 uses IPersistResource interface 192, which is co-located with IPlugin interface 167 in the accompanying figures.
Referring again to the accompanying figures, there are, for example, ten COM interfaces utilized in computer-based test delivery system 100. IPlugin interface 167, which is also a COM interface, is supported by all of plugins 150. COM interfaces 169, therefore, include the IPlugin interface. The IPlugin interface contains generic operations, such as loading and unloading, required of all plugins 150. In addition to the global IPlugin interface, each plugin 150 also uses, for example, a second, individual COM interface 169 to communicate with test driver 110. Alternative structures of the IPlugin interface may also be used. Table 1 shows the relationship between each plugin 150 and the COM interface 169 used with that particular plugin 150.
TABLE 1. COM INTERFACES FOR PLUGINS

|Plugin|COM Interface|Description|
|All plugins 150|IPlugin|Passes data between the test driver and all plugins regarding generic operations, e.g., loading and unloading.|
|Display 152|IDisplay|Passes data between the test driver and the visible plugins that handle title bars, displays, non-answered items, and summaries.|
|Helm 154|IHelm|Passes data between the test driver and the visible plugins that display navigation controls or reviews. Communicates with a navigation plugin to perform the actual navigation. Also functions as a user interface connection to the test driver.|
|Item 156|IItem|Passes data between the test driver and the visible plugins that govern test items or simulations.|
|Timer 158|IUnitTimer|Passes data between the test driver and the invisible plugins used to perform timing across examination sections.|
|Selection 160|ISelection|Passes data between the test driver and the invisible plugins used to select forms, sections, groups, or items for delivery to the candidate.|
|Navigation 162|INavigate|Passes data between the test driver and the invisible plugins used to control section navigation and define rules for traversing through the test.|
|Scoring 164|IScore|Passes data between the test driver and the invisible plugins used to control scoring of delivered testing units.|
|Results 166|IResults|Passes data between the test driver and the invisible plugins that control writing of candidate results, for example, to candidate exam results file 180.|
|Report 168|IReport|Passes data between the test driver and the invisible plugins that control printing of score reports and other material, for example, printed reference material and post-exam instructions, to printer 182.|
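The division of labor in Table 1 can be pictured with a simplified C++ sketch. Real COM interfaces derive from IUnknown and are defined in IDL; the plain abstract classes and method names below (Load, Present, Score, and so on) are assumptions used only to illustrate how a new item type can be supplied without recompiling the driver.

#include <string>

// Structural sketch only: generic operations required of all plugins (cf. IPlugin).
class IPluginSketch {
public:
    virtual ~IPluginSketch() = default;
    virtual void Load(const std::string& resourceData) = 0;   // assumed operation
    virtual void Unload() = 0;                                 // assumed operation
};

// Structural sketch of an item-type interface (cf. IItem for item plugins).
class IItemSketch : public IPluginSketch {
public:
    virtual void Present() = 0;            // show the item content to the examinee
    virtual bool IsAnswered() const = 0;   // report whether a response was captured
    virtual double Score() const = 0;      // item-specific scoring
};

// A new item type is added by supplying another implementation of the item
// interface; the test driver itself is not recompiled or re-linked.
class MultipleChoiceItemSketch : public IItemSketch {
public:
    void Load(const std::string&) override {}
    void Unload() override {}
    void Present() override {}
    bool IsAnswered() const override { return answered_; }
    double Score() const override { return answered_ ? 1.0 : 0.0; }
private:
    bool answered_ = false;
};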
Exam instance file 170 is used to restart a test if the test has been interrupted, for example, because of a power failure. During delivery of the test, exam instance file 170 receives examination state information from test driver 110 and plugins 150 regarding the state of all running objects being used to deliver the test. The examination state information includes the presentation that was being delivered on the display device before the interruption, the responses the candidate had entered in that presentation, etc. When the test is restarted, exam instance file 170 loads the state information back to test driver 110 and plugins 150, allowing the test to return to operation at the point where the test had been interrupted. Preferably, the running state of all objects is saved to exam instance file 170, rather than the state of only some of the objects. Saving the state of only some of the objects to exam instance file 170 creates the potential problem of only a portion of the test information being restored after a test interruption. Exam instance file 170 may also store additional information relating to the test, including, for example: the timing utilized and time remaining on units of the exam, the current unit of delivery, candidate score, etc. Test driver 110 and plugins 150 communicate with exam instance file 170 using POLESS interfaces 195. Test driver 110 controls communications between test driver 110 and plugins 150 using IPersistInstance interface 196, which is collocated with COM interfaces 169 in the accompanying figures.
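A minimal sketch of that save-and-restore flow, assuming hypothetical member names (the actual IPersistInstance layout is described later and is not reproduced here):

#include <map>
#include <string>

// Hypothetical in-memory picture of the examination state written to exam
// instance file 170: current presentation, responses entered, time remaining.
struct ExamInstanceState {
    std::string currentPresentation;
    std::map<std::string, std::string> responsesByItem;
    int secondsRemaining = 0;
};

// Sketch of the restart flow: state is saved as the candidate works and is
// loaded back into the driver and plugins if the test is interrupted.
class ExamInstanceFileSketch {
public:
    void SaveState(const ExamInstanceState& s) { saved_ = s; }   // e.g., after each presentation
    ExamInstanceState RestoreState() const { return saved_; }    // on restart after a power failure
private:
    ExamInstanceState saved_;
};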
Several administrative environments perform the administrative functions of computer-based test delivery system 100, for example: Test Center Manager (“TCM”) Bridge 172; Educational Testing Service (“ETS”) Bridge 174; and Unified Administration System (“UAS”) 174. Administrative functions include, for example: checking in a candidate, starting the test, aborting the test, pausing the test, resuming the test, and transmitting results.
There are preferably two ways to run test driver 110. The first is through a series of command line options and the second is using COM interfaces describing appointment information. The command line option exists for backwards compatibility in a standard ETS environment and a TCM environment. Table 2 shows a list of command line options test driver 110 supports. There are, for example, four programs which launch the test through the COM interface, for example: 1) LaunchTest.exe (for test production and client review); 2) UAS; 3) UTD2ETS.dll (an internal compatibility module for use with the ETS administration environment); and 4) UTD2TCM (for the Test Center Manager environment). Other numbers of environments and/or programs may optionally be used.
TABLE 2. COMMAND LINE OPTIONS SUPPORTED BY TEST DRIVER

|Switch(es)|Option(s)|Purpose|
|/? /help|n/a|Displays command line switches in a dialog box.|
|/UnregServer|n/a|Unregisters the test driver core COM server.|
|/RegServer|n/a|Registers the test driver core COM server.|
|/T|Form name|Name of the form or form group to run in the exam.|
|/F|Resource file|The exam resource file to use.|
|/S|n/a|Suppresses any printing.|
|/W|n/a|Runs in close-of-day mode.|
|/TI|n/a|Sets tracing level to information (very large instance file).|
|/TW|n/a|Sets tracing level to warning (large instance file).|
|/TE|n/a|Sets tracing level to error (average sized instance file).|
|/K|Resource, SKSID, and candidate directories|Used to point to the resource, SKSID, and candidate directories. A space separates each of the three options.|
The administration environments use several interfaces to communicate with test driver 110. IAppointment interface 176 is part of UAS 174 and allows access by test driver 110 to candidate information for the candidate taking the test, such as demographics. The candidate information is included in candidate exam results file 180, which is created by the test driver. ILaunch2 interface 177 functions as the primary control interface for UAS 174 and allows UAS 174 to control various components such as test driver 110, screen resolution change, accommodations for disabled candidates, candidate check-in, etc., in a test center, which is the physical location where the candidate is taking the test. ITransfer interface 199 transfers candidate exam results file 180 and other files back to UAS 174. IPrint interface 198 sends information regarding any reports to printer 182.
The test driver is described in greater detail in U.S. Patent Publication No. 20030182602, entitled “METHOD AND SYSTEM FOR COMPUTER BASED TESTING USING PLUGINS TO EXPAND FUNCTIONALITY OF A TEST DRIVER”, incorporated herein by reference.
II. XXL Compiler Interfaces and Classes
The main interface to XXL compiler 140 is ICompile interface 2002. ICompile interface 2002 is implemented by cCompiler class 2000. All control and initiation of compilation of exam source files 130 into exam resource file 120 occurs by way of this single public interface. The core, non-plugin related elements of the XXL test definition language, as stored in XXL files 134, are compiled by classes in XXL compiler 140. For example, cSection class 2018 compiles the section element, and cGroup class 2016 compiles the group element.
ICompile interface 2002 supports the following operations, for example: createResource( ); addSource( ); addData( ); closeResource( ); about( ); linkResource( ); openResource( ); and getCryptoObject( ). CreateResource( ) creates a resource file, for example, an XXL based resource file such as exam resource file 120. AddSource( ) compiles an XXL file into the resource file. AddData( ) adds a file directly to a data branch of the resource file. CloseResource( ) closes the resource file. LinkResource( ) links a resource in the resource file and is performed after all compiling of the source files is completed. GetCryptoObject( ) returns an ICrypto object containing the current encryption setting of POLESS, as described below.
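A minimal sketch of the compile sequence implied by these operations, assuming hypothetical C++ signatures (the actual ICompile interface is a COM interface whose exact signatures are not reproduced here):

#include <string>
#include <vector>

// Structural sketch of the ICompile operations named above (signatures assumed).
class CompilerSketch {
public:
    void createResource(const std::string& resourcePath) {}   // create the exam resource file
    void addSource(const std::string& xxlFile) {}              // compile an XXL file into it
    void addData(const std::string& dataFile) {}               // copy a file to the data branch
    void linkResource() {}                                      // resolve references after compiling
    void closeResource() {}                                     // close the resource file
};

// Assumed order of calls when building exam resource file 120 from exam source files 130.
void buildResource(CompilerSketch& compiler,
                   const std::vector<std::string>& xxlFiles,
                   const std::vector<std::string>& dataFiles) {
    compiler.createResource("exam.rsc");                      // hypothetical file name
    for (const std::string& f : xxlFiles) compiler.addSource(f);
    for (const std::string& f : dataFiles) compiler.addData(f);
    compiler.linkResource();                                   // performed after all compiling is completed
    compiler.closeResource();
}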
The classes of XXL compiler 140, e.g., cForm 2020 and cItem 2012, handle individual XXL core language elements. All of these classes compile the specific XXL source element into exam resource file 120. All of these class language elements are also symbols used in later references. Therefore, the classes all derive from cSymbol class 2040. cSymbol class 2040 allows the classes of XXL compiler 140 to reside in a symbol table.
For example, the XXL element plugin 150 appears as follows in XXL files 134:
<plugin name="helmNextPrevious" progid="UTDP.cNextPrevious" />
This XXL element causes an instance of cPlugin class 2036 to be created, which compiles the source and writes the compiled result to exam resource file 120. The name and ID of plugin 150 are also added to the symbol table for later reference.
XXL compiler 140 also contains the following token classes, for example: cToken 2042; cTokenCreatorNoRef 2044; cTokenCreator 2046; cTokenCreatorRef 2048; cTokenCreatorBase 2050; and cTokenFactory 2054. These token classes are involved in the identification of tokens. Tokens turn into symbols after identification. Symbols are any class derived from cSymbol, e.g., cTemplate, cSection, etc.
XXL compiler 140 also contains the following symbol table classes, for example: cPluginSymbolTable 2058; cTemplateSymbolTable 2060; cSymbolTable 2062; cFFGSymbolTable 2064; cSGPSymbolTable 2066; and cSymbolTableBase 2068. These classes are varieties of symbol tables. There are different symbol tables for different groups of symbols. A group of symbols define a name space for the symbol. Common symbol table functions are located in the base symbol table classes and templates.
All content and specification destined for a plugin 150 appears in the data element in XXL. For example, below is an item definition in XXL:
<item name="wantABreak1" skipAllowed="false">
  <data>
    <multiChoice correctAnswer="A" maxResponses="1" minResponses="1" autoPrompt="false" URI="itembank/info_item.htm#wantABreak"/>
  </data>
</item>
The item element is handled by a cItem class 2012 object. The data element in the XXL definition is handled by a cData class 2004 object. Item plugin 156 will receive the source to compile from the cData class 2004 object, in this example a multiChoice element.
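A structural sketch of that hand-off follows, with hypothetical class and method names standing in for cData and the item plugin's compile-time entry point:

#include <string>
#include <utility>

// Hypothetical stand-in for an item plugin's compile-time entry point.
class ItemPluginCompilerSketch {
public:
    virtual ~ItemPluginCompilerSketch() = default;
    virtual void CompileData(const std::string& dataXml) = 0;   // e.g., a multiChoice element
};

// Hypothetical stand-in for the cData object: it holds the raw content of the
// <data> element and hands it to the plugin named by the enclosing element.
class DataElementSketch {
public:
    explicit DataElementSketch(std::string xml) : xml_(std::move(xml)) {}
    void HandOffTo(ItemPluginCompilerSketch& plugin) const { plugin.CompileData(xml_); }
private:
    std::string xml_;
};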
III. Test Driver Interfaces and Classes
Test driver 110 defines various interfaces to allow test driver 110 to communicate with different parts of computer-based test delivery system 100. Test driver 110 includes, for example, ten COM interfaces 169 to communicate and transfer data with plugins 150 (see Table 1 above). The COM interfaces 169 are denoted in the accompanying figures.
Test driver 110 and plugins 150 communicate and transfer data with exam resource file 120 using, for example, three IPersistResource interfaces 192: IPersistResourceStream interface 192 a; IPersistResourceSet interface 192 b; and IPersistResourceStore interface 192 c. IPersistResource interfaces 192 are used by plugins 150 during compilation of exam source files 130 and are used by both test driver 110 and plugins 150 during delivery of the test. During compilation of exam source files 130, XXL compiler 140 directs plugins 150 in which storage location of exam resource file 120 to store any information that plugins 150 have validated. Plugins 150 can then retrieve the stored information from exam resource file 120 during delivery of the test. Other numbers of interfaces and different combinations of functionality may alternatively be used.
Information is saved from plugins 150, or from XXL compiler 140 in general, to exam resource file 120, for example, as either a stream of data, as a set of data, or as a storage structure, depending on which of the three IPersistResource interfaces 192 is implemented to save the information. IPersistResourceStream interface 192 a saves the information, for example, as a stream of data or other data storage format. A stream of data is simply a stream of bytes stored as a linear sequence. IPersistResourceSet interface 192 b saves the information, for example, as a set of data. A set of data is preferably a name-value property pair. For example, the name of a particular property for an item is distractors and the value is the number of distractors required for that item. IPersistResourceSet interface 192 b allows the name-value property pair to be saved together in exam resource file 120. IPersistResourceStore interface 192 c saves the information, for example, in a directory format with storage areas. The directory format allows other streams of data to be saved within the storage area, other property sets to be stored within the storage area, and for sub-storages to be saved under the storage area.
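One way to picture the three persistence granularities is the following C++ sketch; the types below are simplified stand-ins for the stream, set, and store forms, not the actual IPersistResource definitions, and the element names are hypothetical.

#include <cstdint>
#include <map>
#include <string>
#include <vector>

// Simplified stand-ins for the three storage granularities described above.
using ByteStream  = std::vector<std::uint8_t>;            // cf. IPersistResourceStream: linear bytes
using PropertySet = std::map<std::string, std::string>;   // cf. IPersistResourceSet: name-value pairs
struct Store {                                             // cf. IPersistResourceStore: directory-like
    std::map<std::string, ByteStream>  streams;
    std::map<std::string, PropertySet> propertySets;
    std::map<std::string, Store>       subStores;
};

// Example: a plugin persisting validated data at each granularity.
void persistExample(Store& resourceRoot) {
    resourceRoot.streams["itemContent"] = ByteStream{0x3C, 0x69, 0x3E};      // raw bytes
    resourceRoot.propertySets["itemProps"]["distractors"] = "4";             // name-value property pair
    resourceRoot.subStores["items"].streams["wantABreak1"] = ByteStream{};   // nested storage area
}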
IPersistInstance interface 196, likewise, comprises, for example, three different interfaces: IPersistInstanceStream interface 196 a; IPersistInstanceSet interface 196 b; and IPersistInstanceStore interface 196 c. Examination state information is saved to exam instance file 170 as, for example, a stream of data, as a set of data, or as a storage element, depending on which of the three IPersistInstance interfaces 196 is implemented.
Two of the interfaces, IContainerNotify interface 200 and IContainerNotifyHelm interface 206, function as callback interfaces from plugins 150 to test driver 110. IContainerNotify interface 200 allows a visible plugin to inform test driver 110, for example, that the plugin is displayed and ready for examinee interaction. IContainerNotifyHelm interface 206 allows helm plugin 154 to request navigation from test driver 110 after receiving an input from the examinee to move to another section of the test. IMore interface 202 is used to convey whether the examinee has seen all content in a presentation. For example, a “more” button appears in place of the next button when the content exceeds the window length. When the examinee scrolls to the bottom, the “more” button disappears and is replaced with the “next” button. Collection interface 204 is used by test driver 110 to hold any group entities, for example, categories and sections of the test.
The remaining interfaces are, for example, Microsoft defined Active Document interfaces, used to implement OLE linking functions of test driver 110 and the visible plugins, display plugin 152, helm plugin 154, and item plugin 156. IOleInPlaceFrame interface 210 controls the container's top-level frame window, which involves allowing the container to insert its menu group into the composite menu, install the composite menu into the appropriate window frame, and remove the container's menu elements from the composite menu. IOleInPlaceFrame interface 210 sets and displays status text relevant to the in-place object. IOleInPlaceFrame interface 210 also enables or disables the frame's modeless dialog boxes, and translates accelerator keystrokes intended for the container's frame. IOleInPlaceUIWindow interface 211 is implemented by container applications and used by object applications to negotiate border space on the document or frame window. The container provides a RECT structure in which the object can place toolbars and other similar controls, determines if tools can in fact be installed around the object's window frame, allocates space for the border, and establishes a communication channel between the object and each frame and document window. IAdviseSink interface 212 enables containers and other objects to receive notifications of data changes, view changes, and compound-document changes occurring in objects of interest. Container applications, for example, require such notifications to keep cached presentations of their linked and embedded objects up-to-date.
Calls to IAdviseSink interface 212 methods are asynchronous, so the call is sent and then the next instruction is executed without waiting for the call to return. IOleWindow interface 213 provides methods that allow an application to obtain the handle to the various windows that participate in in-place activation, and also to enter and exit context-sensitive help mode. IOleInPlaceSite interface 214 manages interaction between the container and the object's in-place client site. The client site is the display site for embedded objects, and provides position and conceptual information about the object. IOleClientSite interface 215 is the primary means by which an embedded object obtains information about the location and extent of its display site, its moniker, its user interface, and other resources provided by its container. Test driver 110 calls IOleClientSite interface 215 to request services from the container. A container must provide one instance of IOleClientSite interface 215 for every compound-document object it contains. IOleDocumentSite interface 216 enables a document that has been implemented as a document object to bypass the normal activation sequence for in-place-active objects and to directly instruct its client site to activate it as a document object. A client site with this ability is called a “document site”.
B. Core Classes
Inheritance, or generalization, relates to a generalized relationship between classes that shows that the subclass shares the structure or behavior defined in one or more superclasses. A generalized relationship is a solid line with an arrowhead pointing to the superclass. Instantiation, or dependency, represents a relationship between two classes, or between a class and an interface, to show that the client class depends on the supplier class/interface to provide certain services. The arrowhead points to the supplier class/interface. Some services from a supplier class to a client class include, for example: the client class accesses a value (constant or variable) defined in the supplier class/interface; methods of the client class invoke methods of the supplier class/interface; and methods of the client class have signatures whose return class or arguments are instances of the supplier class/interface. For instantiation, the cardinality of the relationship is illustrated in the accompanying figures.
Test driver 110 also has several interfaces and implementing classes. Test driver 110 interfaces include, for example: IExam interface 222; IMsgBox interface 224; ICategory interface 232; IForm interface 238; IcResults interface 240; IcReport interface 242; IScript interface 246; ISection interface 250; IPresentation interface 248; and/or IcItem interface 256. The classes that implement the main interfaces include, for example: cScreenMinimum class 226; cFormGroup class 228; cPlugin class 230; cArea class 234; cTemplate class 236; cActivePlugin class 250; and cEvent class 252. The interfaces that are prefaced by “Ic” have names that already exist for plugins 150 to enact; for example, item plugin 156 implements IItem interface 169 c. IcItem interface 256, however, is the interface implemented by test driver 110 class cItem (not shown). Of course, any number of interfaces may be used, depending on the necessary functionality.
The core class cExam (not shown) implements ILaunch2 interface 177 so that UAS 174 can control test driver 110. The appointment object, which implements IAppointment interface 176, is the main object UAS 174 supplies to test driver 110. The appointment object is available to plugins 150 by way of IPlugin interface 169 j. Furthermore, all plugins 150 also access the IExam interface by way of IPlugin interface 169.
The cExam class selects and delivers the form, using cFormGroup class 228 and IForm interface 238. The form delivers results using IcResults interface 240, reports using IcReport interface 242, and sections contained within the test using ISection interface 250. Classes that are in the test delivery chain preferably derive from cEvent class 252.
The cResults class (not shown) delivers a results plugin 166 that implements IResult interface 169 i. The cReport class (not shown) delivers a report plugin 168 that implements IReport interface 169 h. The cSection, cGroup, and cForm classes (not shown) use several invisible plugins 150 to control the delivery of the test. These plugins 150 are timer plugins 158, which implement IUnitTimer interface 169 d, selection plugins 160, which implement ISelection interface 169 e, scoring plugins 164, which implement IScore interface 169 g, and navigation plugins 162, which implement INavigate interface 169 f. The cPresentation class (not shown) supplies data to its template for the display of the presentation. The three visible plugins 150 are created and controlled through cTemplate class 236 and child objects of cArea class 234. Item plugins 156 have an extension class in the cItem class (not shown) that wraps the item plugin 156 and provides generic extended services that all item plugins 156 share. The cItem class in test driver 110 is a wrapper class. The cItem class provides two base services, for example: generic item functionality and access to item plugin 156, which is the wrapping function. Item generic functionality includes, for example: having an item name, having an item title, determining if the item is scored or unscored, determining whether the item has been presented to the examinee, etc. These services are generic to all items and are provided by test driver 110. Item plugins 156 perform the actual scoring of the item, which is unique to each item type. Item plugins 156 present the content of the item and allow the examinee to interact with the item. These services are unique to each item type.
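A structural sketch of that wrapping follows, with hypothetical names and signatures standing in for cItem and the item plugin interface; it is only an illustration of the generic-plus-delegated split described above.

#include <memory>
#include <string>
#include <utility>

// Hypothetical stand-in for the item plugin interface (type-specific behavior).
class ItemTypePluginSketch {
public:
    virtual ~ItemTypePluginSketch() = default;
    virtual void Present() = 0;     // unique per item type
    virtual double Score() = 0;     // unique per item type
};

// Driver-side wrapper: supplies generic item behavior and delegates the rest.
class ItemWrapperSketch {
public:
    ItemWrapperSketch(std::string name, bool scored, std::unique_ptr<ItemTypePluginSketch> plugin)
        : name_(std::move(name)), scored_(scored), plugin_(std::move(plugin)) {}

    // Generic services provided by the test driver for every item type.
    const std::string& Name() const { return name_; }
    bool IsScored() const { return scored_; }
    bool WasPresented() const { return presented_; }

    // Type-specific work is delegated to the wrapped plugin.
    void Deliver() { presented_ = true; plugin_->Present(); }
    double Score() { return scored_ ? plugin_->Score() : 0.0; }

private:
    std::string name_;
    bool scored_;
    bool presented_ = false;
    std::unique_ptr<ItemTypePluginSketch> plugin_;
};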
In addition to the interfaces described previously, test driver 110 implements IRegistry interface 220, which allows VB code to access the Windows registry. Test driver 110 also implements ILegacyItem interface 258 and ILegacyScore interface 260, which are defined by test driver 110 and are implemented by certain item plugins 156 and scoring plugins 164. ILegacyItem interface 258 and ILegacyScore interface 260 allow old item types that existed in previous test drivers to report results like the previous test drivers. For some tests, test driver 110 must report results for old item types, which had very specific ways of reporting results. ILegacyItem interface 258 and ILegacyScore interface 260 allow the new item plugins 156 that represent old item types to report this legacy format of information to results plugins 166 trying to imitate previous test drivers.
A complete description of test driver 110 classes and interfaces is included in Appendix A.
All persistent storages, exam resource file 120 and exam instance file 170, preferably utilize POLESS. POLESS allows data to be embedded, linked, or referenced as external files from the persistent storage to test driver 110 and Active Document container application 112.
POLESS is an extension of the OLE structured storage compound document implementation. A compound document is a single document that contains a combination of data structures such as text, graphics, spreadsheets, sound and video clips. The document may embed the additional data types or reference external files by pointers of some kind. There are several benefits to structured storage. Structured storage provides file and data persistence by treating a single file as a structured collection of objects known as storage elements and streams. Another benefit is incremental access. If test driver 110 or plugins 150 need access to an object within a compound file, only that particular object need be loaded and saved, rather than the entire file. Additionally, structured storage supports transaction processing. Test driver 110 or plugins 150 can read or write to compound files in transacted mode, where changes made can subsequently be committed or reverted.
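For readers unfamiliar with the underlying model, the following fragment uses the standard Win32 compound-file API (plain OLE structured storage, not POLESS itself) to create a root storage, a nested storage, and a stream; the file and element names are hypothetical.

// Plain OLE structured storage via the Win32 compound-file API; link with ole32.lib.
#include <windows.h>
#include <objbase.h>

int main() {
    IStorage* pRoot = nullptr;
    IStorage* pSub  = nullptr;
    IStream*  pStm  = nullptr;
    const DWORD mode = STGM_CREATE | STGM_READWRITE | STGM_SHARE_EXCLUSIVE;

    // Root storage: a single compound file treated as a collection of elements.
    if (FAILED(StgCreateDocfile(L"example.rsc", mode, 0, &pRoot)))   // hypothetical file name
        return 1;

    // A nested storage (directory-like element) and a stream (linear bytes) inside it.
    if (SUCCEEDED(pRoot->CreateStorage(L"items", mode, 0, 0, &pSub)) &&
        SUCCEEDED(pSub->CreateStream(L"item1", mode, 0, 0, &pStm))) {
        const char data[] = "<item/>";
        ULONG written = 0;
        pStm->Write(data, sizeof(data) - 1, &written);   // only this stream is touched (incremental access)
    }

    if (pStm)  pStm->Release();
    if (pSub)  pSub->Release();
    if (pRoot) pRoot->Release();
    return 0;
}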
A. POLESS Components
OLE2SS component 310 contains all the interface definitions that make up structured storage. These interfaces can be realized by any structured storage implementation, such as compound document implementation OLE2 320 and POLESS 300. The interfaces include, for example: IStream interface 340; ISequentialStream interface 342; IStorage interface 344; and IRootStorage interface 346. POLESS 300 additionally implements IStreamVB interface 348 and IStorageVB interface 350.
IStreamVB interface 348 supports several functions, for example: ReadVB( ); WriteVB( ); Clear( ); Reset( ); get_sName( ); get_oStream( ); and CopyTo( ). ReadVB( ) reads a specified number of bytes to a data array. WriteVB( ) writes the byte data to the stream. Clear( ) clears the stream of all data. Reset( ) sets position to the beginning of the stream. get_sName( ) is a read-only function that returns the name of the stream. get_oStream( ) is a read-only function that returns the IStream interface 348. CopyTo( ) copies a source stream to a destination stream.
IStorageVB interface 350 supports several functions, for example: Clear( ); CommitVB( ); RevertVB( ); sElementName( ); bStorage( ); oElement( ); CreateStream( ); OpenStream( ); CreateStorage( ); OpenStorage( ); get_sName( ); get_oStorage( ); get_nCount( ); GetCompression( ); GetEncryption( ); GetCRC( ); CreateStreamLinked( ); CreatePropertyStg( ); OpenPropertyStg( ); SetClass( ); RegisterAlias( ); Destroy( ); and get_ElementType( ). Clear( ) clears the storage of all elements. CommitVB( ) causes transacted mode changes to be reflected in the parent. RevertVB( ) discards changes made since the last commit. sElementName( ) returns the name of the element. bStorage( ) returns TRUE if the element is a sub-storage. oElement( ) returns IStreamVB interface 348 or IStorageVB interface 350 for the element. CreateStream( ) creates and opens a stream and returns IStreamVB interface 348.
OpenStream( ) opens a stream and returns IStreamVB interface 348. CreateStorage( ) creates and opens a nested storage and returns IStorageVB interface 350. OpenStorage( ) opens an existing storage and returns IStorageVB interface 350. get_sName( ) is a read-only function that returns the name of the storage. get_oStorage( ) is a read-only function that returns IStorage interface 350. get_nCount( ) is a read-only function that returns a count of the elements. GetCompression( ) returns the status of file compression. GetEncryption( ) returns the status of file encryption. GetCRC( ) returns the status of file CRC checking. CreateStreamLinked( ) creates and opens a linked stream and returns IStreamVB interface 348. CreatePropertyStg( ) creates and opens a property storage and returns IPropertyStorageVB interface 414. OpenPropertyStg( ) opens a property storage and returns IPropertyStorageVB interface 414. SetClass( ) sets the CLSID for the storage. RegisterAlias( ) registers a pluggable protocol. Destroy( ) destroys the specified elements. get_ElementType( ) is a read-only function that returns the type of the element.
B. POLESS Classes
1) cFileRoot Class
StorageFileCreate( ) creates a new storage file, returns the root storage interface, and marks the new structured storage file as a POLESS file by storing the class ID (“CLSID”) of this class in a stream in the root storage. StorageFileOpen( ) opens an existing storage file and returns the root storage interface. CryptoGet( ) gets a default configured crypto class, which should be set and used on the open or create of the storage file. bStorageFile( ) returns true if the file provided is an OLE structured storage file and not a POLESS storage file. StorageAmalgamatedGet( ) gets an empty cStorageAmalgamated class 404 object. DeltaFileCreate( ) creates a POLESS difference file by comparing the original POLESS file to the updated POLESS file. DeltaFileApply( ) applies a POLESS delta file to the original POLESS file to create an updated POLESS file. GetObjectFromPath( ) uses monikers to retrieve the object named by the path and returns a pointer to the object retrieved. CreateStreamFromFile( ) creates a structured storage stream and populates it with the contents of the file. CreateStreamFromBSTR( ) creates a structured storage stream and fills it with the specified string. MemoryStreamFromStream( ) is used to copy a stream to a newly created memory stream object. GetPicture( ) loads a picture from stream object 424. SavePicture( ) saves the picture into the stream 426.
2) cCrypto Class
cCrypto class 402 implements ICrypto interface 401, and together they support the following properties and methods, for example: ProviderName; Password; FileType; Algorithm; EnumProviders( ); and EnumAlgorithms( ). Get_ProviderName( ) returns the name of the Crypto provider. Put_ProviderName( ) sets the name of the Crypto provider. Get_Password( ) and Put_Password( ) are only used for sponsor resource files. Get_FileType( ) gets the file type and put_FileType( ) sets the file type. Get_Algorithm( ) gets the encryption algorithm and put_Algorithm( ) sets the encryption algorithm. EnumProviders( ) returns an enumerator for the list of installed providers. EnumAlgorithms( ) enumerates a list of algorithms for the current provider.
4) cStorageRoot Class
5) cStream Class
cStream class 408 is the POLESS implementation of IStream interface 340. cStream class 408 handles any stream object 424 that is POLESS specific and then delegates work to compound document implementation OLE2 320. The specific work includes compression/decompression and encryption/decryption of stream object 424.
IStream interface 340 supports the following operations, for example: Seek( ); SetSize( ); CopyTo( ); Commit( ); Revert( ); LockRegion( ); UnlockRegion( ); Stat( ); and Clone( ). Seek( ) changes the seek pointer to a new location relative to the beginning of stream object 424, the end of stream object 424, or the current seek pointer. SetSize( ) changes the size of stream object 424. CopyTo( ) Copies a specified number of bytes from the current seek pointer in stream object 424 to the current seek pointer in another stream object 424. Commit( ) ensures that any changes made to a stream object 424 open in transacted mode are reflected in the parent storage object. Revert( ) discards all changes that have been made to a transacted stream since the last call to IStream::Commit. LockRegion( ) restricts access to a specified range of bytes in stream object 424. Supporting this functionality is optional since some file systems do not provide this operation. UnlockRegion( ) removes the access restriction on a range of bytes previously restricted with IStream::LockRegion. Stat( ) retrieves the STATSTG structure for the stream object 424. Clone( ) creates a new stream object that references the same bytes as the original stream but provides a separate seek pointer to those bytes.
IStreamVB interface 348 is an automation friendly version of IStream interface 340. IStreamVB interface 348 supports the following operations, for example: Read( ); Write( ); Clear( ); Reset( ); get_sName( ); get_oStream; and CopyTo( ). Read( ) reads data from stream object 424. Write( ) writes data, including the entire byte array, to stream object 424. Clear( ) clears stream object 424 of all data. Reset( ) resets the position in stream object 424 to the beginning of stream object 424. Get_sName( ) returns the name of the stream. Get_Stream( ) returns the IDispatch interface. CopyTo( ) copies the contents of a source stream to a destination stream.
6) cStorage Class
cStorage class 410 is the POLESS implementation of IStorage interface 344 and IcStorage interface 411. cStorage class 410 handles any storage object 426 that is POLESS specific and then delegates work to compound document implementation OLE2 320.
IStorage interface 344 supports the following operations, for example: CreateStream( ); OpenStream( ); CreateStorage( ); OpenStorage( ); CopyTo( ); MoveElementTo( ); Commit( ); Revert( ); EnumElements( ); DestroyElement( ); RenameElement( ); SetElementTimes( ); SetClass( ); SetStateBits( ); and Stat( ). CreateStream( ) creates and opens a stream object 424 with the specified name contained in a storage object. OpenStream( ) opens an existing stream object 424 within a storage object using specified access permissions. CreateStorage( ) creates and opens a new storage object 426 within a storage object. OpenStorage( ) opens an existing storage object 426 with the specified name according to the specified access mode. CopyTo( ) copies the entire contents of an open storage object 426 into another storage object. The layout of the destination storage object may differ from the layout of the source storage object. MoveElementTo( ) copies or moves a sub-storage or stream object 424 from one storage object 426 to another storage object.
Commit( ) reflects changes for a transacted storage object 426 to the parent level. Revert( ) discards all changes that have been made to the storage object 426 since the last IStorage::Commit operation. EnumElements( ) returns an enumerator object that can be used to enumerate storage objects 426 and stream objects 424 contained within a storage object. DestroyElement( ) removes the specified storage object 426 or stream object 424 from a storage object. RenameElement( ) renames the specified storage object 426 or stream object 424 in a storage object. SetElementTimes( ) sets the modification, access, and creation times of the indicated storage element, if supported by the underlying file system. SetClass( ) assigns the specified CLSID to a storage object. SetStateBits( ) stores state information in a storage object, for example up to 32 bits. Stat( ) returns the STATSTG structure for an open storage object.
IStorageVB interface 350 is an automation friendly version of IStorage interface 344. IStorageVB interface 350 supports the following operations, for example: Clear( ); Commit( ); Revert( ); sElementName( ); bStorage( ); bElement( ); CreateStream( ); OpenStream( ); CreateStorage( ); OpenStorage( ); get_sName( ); get_oStorage( ); get_nCount( ); GetCompression( ); GetEncryption( ); GetCRC( ); CreateStreamLinked( ); CreatePropertyStg( ); OpenPropertyStg( ); SetClass( ); RegisterAlias( ); Destroy( ); and get_ElementType( ). Clear( ) clears the storage of all elements, e.g., sub-storages and streams. Commit( ) ensures that any changes made to a storage object opened in transacted mode are reflected in the parent storage. For non-root storage objects in direct mode, this method has no effect. For a root storage, it reflects the changes in the actual device, for example, a file on disk. For a root storage object open in direct mode, the Commit( ) method is always called prior to releasing the object. Commit( ) flushes all memory buffers to the disk for a root storage in direct mode and will return an error code upon failure. Although releasing the object also flushes memory buffers to disk, it has no capacity to return any error codes upon failure. Therefore, releasing the object without first calling Commit( ) causes indeterminate results. Revert( ) discards all changes that have been made to the storage object since the last Commit( ) operation.
Get_nCount( ) returns the count of elements in the storage. GetCompression( ) determines whether streams in the file may be compressed; if compression is enabled, streams may optionally be compressed when they are created. GetCRC( ) indicates whether a cyclic-redundancy-check (“CRC”), or digital signature, check is to be performed on the file. CreateStreamLinked( ) creates a link to a stream in another POLESS file. CreatePropertyStg( ) creates a property storage. OpenPropertyStg( ) opens a property storage. SetClass( ) assigns the specified CLSID to a storage object. RegisterAlias( ) registers an alias to a storage in the POLESS file for access by the pluggable protocol. Destroy( ) destroys the specified element. Get_ElementType( ) is a read-only command that returns the type of the element.
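The storage behaviors just described (CreateStorage( ), CreateStream( ), transacted Commit( ) and Revert( )) map onto the stock OLE structured storage API to which cStorage delegates. The sketch below is illustrative only and is not the patent's cStorage class; the element names are assumptions.

```cpp
// Minimal sketch (not the patent's cStorage class): the stock OLE structured
// storage calls behind CreateStorage( ), CreateStream( ), and transacted
// Commit( )/Revert( ).  Names are illustrative.  Link against ole32.lib.
#include <windows.h>
#include <objbase.h>

int main() {
    CoInitialize(NULL);
    IStorage* pRoot = NULL;
    // Root storage opened in transacted mode, so changes are buffered until Commit( ).
    if (SUCCEEDED(StgCreateDocfile(L"storage_demo.stg",
            STGM_CREATE | STGM_READWRITE | STGM_SHARE_EXCLUSIVE | STGM_TRANSACTED,
            0, &pRoot))) {
        IStorage* pSub = NULL;
        // CreateStorage( ): a nested storage, analogous to a branch such as "forms".
        if (SUCCEEDED(pRoot->CreateStorage(L"forms",
                STGM_CREATE | STGM_READWRITE | STGM_SHARE_EXCLUSIVE, 0, 0, &pSub))) {
            IStream* pStm = NULL;
            // CreateStream( ): a stream element inside the nested storage.
            if (SUCCEEDED(pSub->CreateStream(L"attributes",
                    STGM_CREATE | STGM_READWRITE | STGM_SHARE_EXCLUSIVE, 0, 0, &pStm))) {
                ULONG cb = 0;
                pStm->Write("form data", 9, &cb);
                pStm->Release();
            }
            pSub->Release();
        }
        // Commit( ) publishes the buffered changes; calling Revert( ) instead
        // would discard everything written since the last commit.
        pRoot->Commit(STGC_DEFAULT);
        pRoot->Release();
    }
    CoUninitialize();
    return 0;
}
```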
7) cPropertyStorage Class
cPropertyStorage class 412 implements IPropertyStorage interface 413, which supports the following operations, for example: ReadMultiple( ); WriteMultiple( ); DeleteMultiple( ); ReadPropertyNames( ); WritePropertyNames( ); DeletePropertyNames( ); SetClass( ); Commit( ); Revert( ); Enum( ); Stat( ); and SetTimes( ). ReadMultiple( ) reads property values in a property set. WriteMultiple( ) writes property values in a property set. DeleteMultiple( ) deletes property values in a property set. ReadPropertyNames( ) gets corresponding string names for given property identifiers. WritePropertyNames( ) creates or changes string names corresponding to given property identifiers. DeletePropertyNames( ) deletes string names for given property identifiers. SetClass( ) assigns a CLSID to a property set. Commit( ) flushes or commits changes to a property storage object, as is done with the command IStorage::Commit, described previously. Revert( ) discards all changes made since the last commit call when a property storage is opened in transacted mode. Enum( ) creates and gets a pointer to an enumerator for properties within a property set. Stat( ) receives statistics about a property set. SetTimes( ) sets the modification, creation, and access times for a property set.
IPropertyStorageVB interface 414 is an automation friendly version of IPropertyStorage interface 413 that manages the persistent properties of a single property set. IPropertyStorageVB interface 414 supports the following operations, for example: ReadVB( ); WriteVB( ); Delete( ); CommitVB( ); RevertVB( ); SetClass( ); get_nCount( ); CopyTo( ); GetName( ); WriteMultiple( ); and ReadMultiple( ). ReadVB( ) reads the value of a specified property from the property set. WriteVB( ) writes a value for a specified property to the property set. If the property does not exist the property/value pair will be created. If the property already exists, the value will be updated if opened in eAccess_Write mode. Delete( ) removes a property from the property set. CommitVB( ) flushes or commits changes to a property storage object, as is done with the command IStorage::Commit, described previously. RevertVB( ) discards all changes made since the last commit call when a property storage is opened in transacted mode. SetClass( ) assigns the specified CLSID to a property storage object. Get_nCount( ) returns the count of properties in the property set. CopyTo( ) copies the contents of the source property set to a destination property set. GetName( ) returns the name of the specified property. WriteMultiple( ) writes property values in a property set. ReadMultiple( ) reads property values in a property set.
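For the property-set operations just described (WriteMultiple( ), ReadMultiple( ), Commit( )), the stock OLE IPropertySetStorage/IPropertyStorage API behaves analogously. The sketch below is illustrative only and is not the patent's cPropertyStorage class; the format identifier and the property name "Title" are arbitrary assumptions for the example.

```cpp
// Minimal sketch (not the patent's cPropertyStorage class): writing and
// reading one property with the stock OLE IPropertySetStorage/IPropertyStorage
// API.  The format identifier and property name are arbitrary for the example.
#include <windows.h>
#include <objbase.h>
#include <propidl.h>
#include <cstdio>

// Arbitrary format identifier for the demonstration property set.
static const FMTID FMTID_Demo =
    { 0x12345678, 0x1234, 0x1234, { 0x12, 0x34, 0x12, 0x34, 0x56, 0x78, 0x9a, 0xbc } };

int main() {
    CoInitialize(NULL);
    IStorage* pStg = NULL;
    if (SUCCEEDED(StgCreateDocfile(L"props_demo.stg",
            STGM_CREATE | STGM_READWRITE | STGM_SHARE_EXCLUSIVE, 0, &pStg))) {
        IPropertySetStorage* pSetStg = NULL;
        pStg->QueryInterface(IID_IPropertySetStorage, (void**)&pSetStg);
        IPropertyStorage* pPropStg = NULL;
        if (pSetStg && SUCCEEDED(pSetStg->Create(FMTID_Demo, NULL, PROPSETFLAG_DEFAULT,
                STGM_CREATE | STGM_READWRITE | STGM_SHARE_EXCLUSIVE, &pPropStg))) {
            // WriteMultiple( ): store one named property.
            PROPSPEC spec = {};
            spec.ulKind = PRSPEC_LPWSTR;
            spec.lpwstr = (LPOLESTR)L"Title";
            PROPVARIANT var = {};
            var.vt = VT_LPWSTR;
            var.pwszVal = (LPWSTR)L"Sample exam";
            pPropStg->WriteMultiple(1, &spec, &var, PID_FIRST_USABLE);

            // ReadMultiple( ): read the same property back.
            PROPVARIANT out = {};
            if (SUCCEEDED(pPropStg->ReadMultiple(1, &spec, &out)) && out.vt == VT_LPWSTR)
                wprintf(L"Title = %s\n", out.pwszVal);
            PropVariantClear(&out);
            pPropStg->Commit(STGC_DEFAULT);
            pPropStg->Release();
        }
        if (pSetStg) pSetStg->Release();
        pStg->Release();
    }
    CoUninitialize();
    return 0;
}
```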
8) cPropertyStorageAmalgamated Class
cPropertyStorageAmalgamated class 416 implements IPropertyStorageAmalgamated interface 417, which supports the following operations, for example: PropertyStorageAdd( ) and ClearStorage( ). PropertyStorageAdd( ) adds a property set to the collection of property sets. ClearStorage( ) clears the collection of property sets.
C. POLESS Exam Resource File
Exam branch 550, as seen in
Forms branch 600, as seen in
Event information 607 indicates, for example, the order of events of the test for that form. Each event has a name and is prefixed with an event type and a colon. Other formats are optional. The event type includes “section”, “report”, and “results”. Version information 608 and title information 609 indicate the version and title of the form, respectively. Skip allowed information 610 indicates, for example, whether or not skipping of sections is allowed by default. Restartable information 611 indicates, for example, whether the form can be restarted. Any optional, customized information regarding the form is stored in custom storage 616 as a property set or other data storage format. Timer storage 628 stores, for example, information relating to how the form is to be timed as a storage element. Attributes storage 630 stores, for example, the name of timer plugin 158 to be used with the form. Plugin data storage 632 and plugin data storage 633 store any data necessary for timer plugin 158 as a storage element and a stream of data, respectively. Plugin data storage 632 and plugin data storage 633 are optional. Scoring storage 634 stores, for example, information relating to the scoring of the form. Attributes storage 636 stores, for example, the name of scoring plugin 164 to be used with the form. Plugin data 638 and plugin data 639 optionally store any data needed for scoring plugin 164 as a storage element and a stream of data, respectively.
Items Branch 650, as seen in
Start information 658 indicates, for example, script execution at the beginning of the item and finish information 659 indicates, for example, script execution at the end of the item. Condition information 660 indicates, for example, whether or not there is a condition on the item being delivered to the examinee. The information stored in attributes storage 654 is stored as a stream of data or other data storage format. Data storage 662 and data stream 664 store any information regarding the properties of the item. For example, data storage 662 or data stream 664 can store the correct answer of a multiple-choice item. Data storage 662 and data stream 664 store the information as a storage element and a stream of data, respectively.
Any optional, customized information regarding the item is stored in custom storage 666 as a stream of data or other data storage format. Category storage 668 stores, for example, information relating to each category to which the item belongs. The information stored in category storage 668 preferably and optionally is redundant, as category branch 700 stores, for example, all the items within the specific categories. The reason for the optional redundancy is so that test driver 110 can quickly look up the category of any item.
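To make the branch structure concrete, the sketch below shows how a consumer might walk a nested storage layout of this kind with the standard OLE API: an items storage, one sub-storage per item, and a data stream inside it. The storage and stream names ("items", "item_001", "data") are assumptions for illustration, not the exact names used in the exam resource file.

```cpp
// Illustrative sketch only: walking a nested storage layout like the items
// branch described above.  Branch and stream names are placeholders.
#include <windows.h>
#include <objbase.h>
#include <cstdio>

int main() {
    CoInitialize(NULL);
    IStorage* pRoot = NULL;
    if (SUCCEEDED(StgOpenStorage(L"exam_resource_demo.stg", NULL,
            STGM_READ | STGM_SHARE_EXCLUSIVE, NULL, 0, &pRoot))) {
        IStorage* pItems = NULL;   // the items branch
        IStorage* pItem  = NULL;   // one item, looked up by name
        IStream*  pData  = NULL;   // that item's data stream
        if (SUCCEEDED(pRoot->OpenStorage(L"items", NULL,
                STGM_READ | STGM_SHARE_EXCLUSIVE, NULL, 0, &pItems)) &&
            SUCCEEDED(pItems->OpenStorage(L"item_001", NULL,
                STGM_READ | STGM_SHARE_EXCLUSIVE, NULL, 0, &pItem)) &&
            SUCCEEDED(pItem->OpenStream(L"data", NULL,
                STGM_READ | STGM_SHARE_EXCLUSIVE, 0, &pData))) {
            char buf[128] = {};
            ULONG read = 0;
            pData->Read(buf, sizeof(buf) - 1, &read);
            printf("item data (%lu bytes): %s\n", read, buf);
        }
        if (pData)  pData->Release();
        if (pItem)  pItem->Release();
        if (pItems) pItems->Release();
        pRoot->Release();
    }
    CoUninitialize();
    return 0;
}
```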
Category branch 700, as seen in
Description information 708 is used within the category to contain a description of the category's contents. Category storage 710 stores, for example, information relating to any subcategories under the category identified in name attribute storage 702. Items storage 712 indicates, for example, any items that exist within the category. Sections storage 714 contains information indicating any sections that exist within the category. Scoring storage 716 contains information relating to the scoring of the items within the category. Attributes storage 718 stores, for example, the name of the scoring plugin to be used with the item. Data storage 720 and data stream 722 contain the information needed to initialize scoring plugin 164. Data storage 720 and data stream 722 store the information as a storage element and a stream of data, respectively.
Templates branch 750, as seen in
Areas storage 764 indicates, for example, information relating to the areas used within the template denoted by the information in name attributes storage 752. Many areas may exist within a template, as denoted by the three vertical ellipses. Each area is identified by the information stored in name attribute storage 766. Attribute storage 768 stores, for example, visible plugin name information 760, size information 770, and allow more information 771. Plugin name information 760 indicates, for example, the name of the visible plugin to be used with the area. Size information 770 indicates, for example, the size of the area, for example, a pixel value, a percentage value, or HTML syntax. Plugin data storage 772 and plugin data stream 774 store information relating to the visible plugin to be used in the area. The data stored in either plugin data storage 772 or plugin data stream 774 is executed by the visible plugin when the template is loaded. Plugin data storage 772 and plugin data stream 774 store the information as either a storage element or a stream of data, respectively. Other information may optionally be stored.
Section branch 800, as seen in
Timer storage 814 stores information regarding, for example, the timing of the section. Attribute storage 816 stores, for example, information identifying timer plugin 158, which is to be used with the section. Plugin data storage 818 and plugin data storage 820 store, for example, data needed for timer plugin 158. Plugin data storage 818 and plugin data storage 820 store, for example, the information as a storage element and a stream of data, or other acceptable format, respectively. Navigation storage 822 stores, for example, information relating to the delivery of presentations and groups within the section. Attributes storage 824 stores, for example, information indicating which navigation plugin 162 is to be used with this section. Plugin data storage 826 and plugin data stream 828 store information needed for navigation plugin 162. Plugin data storage 826 and plugin data stream 828 store the information as a storage element and a stream of data, respectively. Groups branch 850, as seen in
Event information 856 indicates, for example, the order of events within the test. Review name information 858 indicates, for example, whether or not a presentation within the group is to be used as a review screen. Any optional, customized information regarding the group is stored in custom storage 860 as a stream of data or other data storage format. Events storage 862 stores event information, for example, as is described in further detail in
Start information 886, finish information 887, and condition information 888 indicate, for example, start, finish, and conditional scripts, respectively. Any optional, customized information regarding the event is stored in custom storage 889. The “key” for each custom attribute will be a string. Referring again to
Plugins branch 900, as seen in
Data branch 950, as indicated in
FormGroups branch 1000, as seen in
Scripts branch 1100 stores, for example, information relating to scripts used within the test. Attributes storage 1102 stores, for example, type information that specifies the language in which the script is written, for example, VB script or J script. Scripts storage 1104 stores, for example, global scripts used within the test that may be referenced by the test driver. MsgBox branch 1150 stores, for example, information relating to the size and content of any message boxes that may be delivered to the examinee during the test. Message boxes may be triggered by plugins 150 during the exam.
D. POLESS Exam Instance File
Running branch 1202 stores, for example, the state information of all running objects in test driver 110 and plugins 150. Plugins 150 use one of IPersistInstanceStream interface 196 a, IPersistInstanceSet interface 196 b, or IPersistInstanceStore interface 196 c to store information to exam instance file 170 as a stream of data, a set of data, or a store of data, respectively. Any of plugins 150, except display plugin 152, results plugin 166, report plugin 168, and helm plugin 154, which do not contain examination state information, store examination state information to exam instance file 170. Test driver 110 determines the storage location in exam instance file 170 that stores a particular piece of examination state information.
Exam sub-branch 1204 contains examination state information relating to the exam. Contents storage 1206 stores, for example, exam status information 1207 and version information 1208. Exam status information 1207 indicates, for example, the status of the exam, for example, initializing or terminating. Template storage branch 1210 stores, for example, examination state information relating to templates running in the exam. Name attribute storage 1212 stores, for example, count information 1214 and observed ever information 1215. Observed ever information 1215 indicates, for example, whether or not the template's content has ever been fully seen by the examinee.
Form storage branch 1216 contains information relating to the forms used within the exam. Contents storage branch 1218 stores, for example, seconds information 1219, date start information 1220, date finish information 1221, current section information 1222, and version information 1223. Current section information 1222 indicates, for example, the current section being delivered to the examinee in the form. Version information 1223 indicates, for example, the identification of the form.
Sections chosen storage branch 1224, as illustrated in
Items chosen sub-branch storage 1240 stores, for example, information relating to items that have been or will be delivered to the examinee. Contents storage branch 1242 stores, for example, the names and order of all the items that have been or will be delivered to the examinee. Name attributes storage 1244 indicates, for example, the identification of a particular item. Contents storage branch 1246 stores, for example, presented information 1247, completed information 1248, skipped information 1249, seconds information 1250, dehydrated information 1251, and observed ever information 1252. Presented information 1247 indicates, for example, whether the item has ever been delivered to the examinee. Completed information 1248 indicates, for example, whether or not the item has been completed. Skipped information 1249 indicates, for example, whether the item has been skipped. Item plugin storage 1254 and item plugin storage 1255 store, for example, examination state information from item plugin 156. Item plugin storage 1254 is used if item plugin 156 uses IPersistInstanceSet interface 196 b or IPersistInstanceStore interface 196 c. Item plugin storage 1255 is used if item plugin 156 uses IPersistInstanceStream interface 196 a.
Presentations chosen storage sub-branch 1286 indicates, for example, any presentations that have been or will be delivered to the examinee. Contents storage 1288 stores, for example, the names of the presentations. Names storage sub-branch 1290 stores, for example, the name of the presentation. Names storage 1290 also stores, for example, comment information 1291, marked information 1292, count information 1293, name information 1294, observed ever information 1295, name information 1296, and observed ever information 1297. Name information 1294 and observed ever information 1295 relate to the name of the first presentation area stored under presentations chosen sub-branch 1286 and whether or not that presentation has ever been observed, and name information 1296 and observed ever information 1297 indicate, for example, the last presentation area that was delivered to the examinee and whether or not that presentation was ever observed. Contents storage 1298 stores, for example, information relating to events. Contents storage 1298 stores, for example, ready information 1299, ever checked information 1300, ever started information 1301, and ever finished information 1302. Ready information 1299 indicates, for example, whether the event is ready to be delivered to the examinee. Ever checked information 1300 indicates, for example, whether an event's conditional delivery script has ever been checked. Preferably, the conditional delivery script is only checked once. Ever started information 1301 indicates, for example, whether the event was ever started by the examinee. Ever finished information 1302 indicates, for example, whether the event was completed by the examinee.
Referring again to
History branch 1320 is a single stream of chronological text messages that logs the history of the test. These text messages are used by staff at system headquarters to diagnose problems that occurred in the field. Each text message is prefixed with the date, time, and a level of severity, for example: information, warning, or error. Test driver 110 will filter the text messages to the level of diagnostics desired for test driver 110, such as determining errors in test driver 110 or detailed history tracking, including general information.
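A small sketch of the kind of prefixed history message described above follows; the exact formatting is an assumption, since the text only states that each message carries a date, a time, and a severity level.

```cpp
// Hypothetical sketch of a history-stream text message: date, time, severity,
// then the message body.  The layout is an assumption for illustration.
#include <cstdio>
#include <ctime>
#include <string>

enum Severity { INFO, WARNING, ERROR };

std::string FormatHistoryLine(Severity sev, const std::string& msg) {
    static const char* names[] = { "INFO", "WARNING", "ERROR" };
    char stamp[32];
    std::time_t now = std::time(nullptr);
    std::strftime(stamp, sizeof(stamp), "%Y-%m-%d %H:%M:%S", std::localtime(&now));
    return std::string(stamp) + " [" + names[sev] + "] " + msg;
}

int main() {
    std::puts(FormatHistoryLine(WARNING, "section timer restarted").c_str());
    return 0;
}
```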
V. Expansion of Test Driver Using Plugins
A detailed description of the XXL schema is given in U.S. Patent Publication No. 20030129573, entitled “EXTENSIBLE EXAM LANGUAGE (XXL) PROTOCOL FOR COMPUTER BASED TESTING,” incorporated herein by reference.
The test developer next writes the appropriate plugin 150, in this example, item plugin 156. The test developer also implements the IPlugin interface 167 and the IItem interface 169. Additionally, the test developer implements IPersistResource interface 192 (
A. Test Production and Test Delivery
The validation of the test specification and content is illustrated in greater detail in
B. Plugin Life Cycle
Exam source files 130, of which data files 132 and XXL files 134 are shown, contain every aspect of the test as written by the test publisher. In step I, XXL compiler 140 reads from XXL files 134 and interprets instructions that call for the use of a plugin 150. Plugin 150 is identified in the XXL test definition language by both a name and a program identification (“prog ID”). When XXL compiler 140 receives the prog ID from XXL files 134, XXL compiler 140 knows that a plugin 150 is required to complete the compilation of exam source files 130.
Not all of the possible types of plugins 150 are required to build any one test. Also, more than one plugin 150 may be implemented for a specific type. In the above example, two navigation plugins 162 and two item plugins 156 are defined. XXL compiler 140 reads information from exam source files 130 using IStream interface 340, iNode interface 1424, which is the Microsoft interface used to access a node of an XML document in the document object model (“DOM”), and IStreamVB interface 348. XXL compiler 140 instantiates the requested plugin 150 using, for example, the call CoCreateInstance( ). CoCreateInstance( ) creates a single, uninitialized object of the class associated with a specified CLSID, using a prog ID that has been converted into the CLSID.
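The instantiation step described above maps directly onto standard COM calls: the prog ID from the XXL source is converted to a CLSID, and the object is created with CoCreateInstance( ). The sketch below shows that sequence with a placeholder prog ID; a real plugin would be registered under its own published identifiers.

```cpp
// Minimal sketch of prog-ID-to-CLSID conversion and object creation.
// The prog ID "Vendor.ItemPlugin.1" is an illustrative placeholder.
#include <windows.h>
#include <objbase.h>
#include <cstdio>

int main() {
    CoInitialize(NULL);
    CLSID clsid;
    // Prog ID as it would appear in the test definition (illustrative).
    HRESULT hr = CLSIDFromProgID(L"Vendor.ItemPlugin.1", &clsid);
    if (SUCCEEDED(hr)) {
        IUnknown* pUnk = NULL;
        // Create a single, uninitialized instance of the registered class.
        hr = CoCreateInstance(clsid, NULL, CLSCTX_INPROC_SERVER,
                              IID_IUnknown, (void**)&pUnk);
        if (SUCCEEDED(hr)) {
            // A compiler would now QueryInterface for the plugin's interfaces
            // and hand the plugin its data to validate.
            pUnk->Release();
        } else {
            printf("CoCreateInstance failed: 0x%08lX\n", (unsigned long)hr);
        }
    } else {
        printf("prog ID not registered: 0x%08lX\n", (unsigned long)hr);
    }
    CoUninitialize();
    return 0;
}
```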
If the data referring to plugin 150 has been customized by the test developer, XXL compiler 140 may not recognize the new data. Therefore, XXL compiler 140 passes the data directly to plugin 150 and plugin 150 loads the data into a private memory (not shown). In one embodiment, the private memory is internal to plugin 150, and in another embodiment, the private memory is external to plugin 150. Plugin 150 can then validate the data using the XXL schema. If the data is invalid, plugin 150 reports the error. In an alternative embodiment, plugin 150 can validate the data using an XML document type definition (“DTD”). A DTD is a formal description in XML Declaration Syntax of a particular type of document. Similar to a schema, a DTD sets out what names are to be used to the different types of elements, where they may occur, and how they all fit together. However, the XXL schema is preferred for validation since schemas are easier to read than a DTD and are very flexible.
If plugin 150 declares that the data is valid, XXL compiler 140 prepares a POLESS storage object 300 in exam resource file 120 to which plugin 150 saves the data at a command from XXL compiler 140, in step II. As described previously, XXL compiler 140 determines where the data from plugin 150 is to be saved in exam resource file 120 and creates the appropriate storage location. The name, CLSID, and data associated with plugin 150 are stored in plugins branch 900 in exam resource file 120 (
The compile sequence of a plugin 150, as shown in steps I and II in
Step II contains two steps, indicated as steps IIa and IIb. In step IIa, XXL compiler 140 creates the appropriate storage element in exam resource file 120 using POLESS object 300. The storage element type is determined based on the type of IPersistResource interface 192 that plugin 150 implements, for example: IPersistResourceStream interface 192 a; IPersistResourceSet interface 192 b; or IPersistResourceStore interface 192 c. XXL compiler 140 then calls IPersistResource*::Save( ) call 1434 for the appropriate IPersistResource interface. Plugin 150 saves the compiled information from exam source files 130 to exam resource file 120 through the POLESS object 300 passed by XXL compiler 140. In step IIb, XXL compiler 140 instructs plugin 150 to unload, or flush, its content using Unload( ) call 1436. As stated previously, steps I, IIa, and IIb are repeated until all of exam source files 130 are compiled.
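The Save( )/Unload( ) handshake of steps IIa and IIb can be pictured with the hypothetical stand-ins below. These are not the patent's actual interface definitions; they simply show the order of calls: the compiler creates a storage element, asks the plugin to persist its compiled data into it, then tells the plugin to flush its content.

```cpp
// Hypothetical sketch of the step IIa/IIb sequence; the types here are
// stand-ins for illustration, not the patent's IPersistResource interfaces.
#include <map>
#include <string>
#include <iostream>

// Stand-in for a POLESS storage element handed to the plugin.
struct ResourceStore {
    std::map<std::string, std::string> values;
};

// Stand-in for the IPersistResource-style contract a plugin implements.
class IPersistResourceLike {
public:
    virtual ~IPersistResourceLike() = default;
    virtual void Save(ResourceStore& store) = 0;   // step IIa
    virtual void Unload() = 0;                     // step IIb
};

class ItemPluginLike : public IPersistResourceLike {
    std::string pendingData_;  // data validated from the exam source
public:
    void LoadSource(const std::string& data) { pendingData_ = data; }
    void Save(ResourceStore& store) override { store.values["data"] = pendingData_; }
    void Unload() override { pendingData_.clear(); }
};

int main() {
    ResourceStore itemStorage;              // element created in the resource file
    ItemPluginLike plugin;
    plugin.LoadSource("<item>...</item>");  // step I: plugin receives its source data
    plugin.Save(itemStorage);               // step IIa: plugin persists to the element
    plugin.Unload();                        // step IIb: plugin flushes its content
    std::cout << "saved bytes: " << itemStorage.values["data"].size() << "\n";
    return 0;
}
```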
Step VI, which is shown as steps VIa and VIb, concerns amalgamation of exam resource file 120. Amalgamation enables data for a specific plugin to exist virtually as one storage location even if the data appears at different locations within the storage hierarchy. Amalgamation can be performed on exam resource file 120 if plugin 150 has implemented either IPersistResourceSet interface 192 b or IPersistResourceStore interface 192 c when storing data to exam resource file 120. In step VIa, XXL compiler 140 amalgamates one to three storage elements in exam resource file 120 and passes the amalgamated POLESS object to plugin 150 using IPersistResource*::ValidateResource( ) call 1438. Plugin 150 determines whether or not the amalgamated POLESS object creates a complete and valid set. Plugin 150 throws a structured COM error if the amalgamated POLESS object does not create a complete and valid set. In step VIb, XXL compiler 140 instructs plugin 150 to unload, or flush, its content using Unload( ) call 1440. Steps VIa and VIb are interspersed among the step I, IIa, and IIb cycles and can also occur multiple times during the compilation of exam source files 130. Amalgamation is described in greater detail in U.S. Patent Publication No. 20030129573, entitled “EXTENSIBLE EXAM LANGUAGE (XXL) PROTOCOL FOR COMPUTER BASED TESTING,” incorporated herein by reference.
Referring again to
Periodically, based on a request either from test driver 110 or from plugin 150, the state of all running objects will be saved to exam instance file 170, which is a unique file for each examinee, indicating the progress and the status of the test for that examinee. Test driver 110 asks plugin 150 if plugin 150 is “dirty,” meaning that plugin 150 is storing some updated examination state information. For example, when the examinee selects distractor A on a multiple-choice item, item plugin 156, in this case, becomes dirty. If plugin 150 is dirty, test driver 110 provides plugin 150 a POLESS object 300 in exam instance file 170 and plugin 150 saves the examination state information to exam instance file 170 using IPersistInstance interface 196, in step IV. For example, item plugin 156 saves the examinee's answer to item plugin storage 1254 or to item plugin storage 1255 (
Step V occurs if the test is interrupted, for example, because of a power failure, and the test needs to restart. When test driver 110 is required to return to a particular operation state, test driver 110 reads the examination state information from exam instance file 170. Plugin 150 is provided the storage object containing the state of plugin 150 as saved in step IV using IPersistInstance interface 196. Using the previous example, item plugin 156 retrieves its state information from item plugin storage 1254 or from item plugin storage 1255. Plugin 150 is able to become operational from the retrieved state information, enabling a restart of the test from the point at which the test was interrupted.
The delivery sequence of a plugin 150, as shown in steps III, IV, and V in
In step IIIb, cTemplate class 236 in test driver 110 uses IPlugin::Load( ) call 1526 to set the core object references from test driver 110 into the plugin 150 being delivered. The core object references include IContainerNotify interface 200, the cExam class (not shown), and the IAppointment interface 176, which passes information regarding the examinee and appointment to plugin 150.
Step V, which is interspersed with step III, occurs only if the test is interrupted and plugin 150 loses state. cTemplate class 236 in test driver 110 uses IPersistInstance*::Reload( ) call 1528 to call on the reload method of exam instance file 170. Exam instance file 170 reloads plugin 150, through IPersistInstance interface 196, for example, IPersistInstanceSet interface 196 b, with the state saved to the appropriate storage location in exam instance file 170 (see
Step IIIc is performed for both initial delivery of plugin 150 and during restart of the test, in conjunction with step V. cTemplate class 236 in test driver 110 uses IPersistResource*::Load( ) call 1530 to call on the load method of exam resource file 120. Exam resource file 120 loads plugin 150, through IPersistResource interface 192, for example IPersistResourceSet interface 192 b, with the test specification and content from the appropriate storage location in exam resource file 120. Plugin 150 is loaded with test specification and content from exam resource file 120 when being initially delivered to the examinee. Plugin 150 is also loaded with test specification and content from exam resource file 120 and with examination state information from exam instance file 170, as described above, when the test has been interrupted and plugin 150 must recover state.
After plugin 150 is properly loaded, cTemplate class 236 in test driver 110 uses I*::PresentationStarting( ) call 1532 (continued in
The deactivation of the presentation begins with a request from the helm for navigation. For example, if the examinee has finished a question and wishes to move on to the next question on the next presentation, the examinee can choose the “NEXT” button on the helm. The navigation request is sent from IHelm interface 169 b, which receives the request from the examinee, to test driver 110 using IContainerNotifyHelm interface 206. As seen in
The active presentation then instructs the template to deactivate using cTemplate::Deactivate( ) call 1546, step IIIk (continued in
Step IV, which contains sub-steps IVa-c, is the process to save plugin state data to exam instance file 170. Test driver 110 requests the “dirty” state of plugin 150 to determine whether plugin 150 is storing any state information that would be necessary if the test were to be interrupted. Test driver 110 uses IPersistInstance*::IsDirty( ) call 1552 to make the request, step IVa. For example, test driver 110 uses IPersistInstanceSet::IsDirty call 1552 if the state data is a property set. If plugin 150 is storing state data that is not already stored in exam instance file 170, IPersistInstance*::IsDirty( ) call 1552 returns true. If plugin 150 is “dirty”, test driver 110 instructs plugin 150 to save the state data to exam instance file 170 in the POLESS object provided (
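The step IV handshake can be summarized with the hypothetical stand-ins below, which are not the patent's IPersistInstance interfaces: the driver polls IsDirty( ) and, only if the plugin reports unsaved state, hands it a storage object and calls Save( ).

```cpp
// Hypothetical sketch of the step IV handshake (IsDirty( ) then Save( )).
// Types and names are stand-ins for illustration only.
#include <map>
#include <string>
#include <iostream>

struct InstanceStore { std::map<std::string, std::string> values; };

class IPersistInstanceLike {
public:
    virtual ~IPersistInstanceLike() = default;
    virtual bool IsDirty() const = 0;              // step IVa
    virtual void Save(InstanceStore& store) = 0;   // steps IVb and IVc
};

class ItemPluginState : public IPersistInstanceLike {
    std::string answer_;
    bool dirty_ = false;
public:
    void RecordAnswer(const std::string& a) { answer_ = a; dirty_ = true; }
    bool IsDirty() const override { return dirty_; }
    void Save(InstanceStore& store) override { store.values["answer"] = answer_; dirty_ = false; }
};

int main() {
    InstanceStore examInstance;
    ItemPluginState item;
    item.RecordAnswer("A");          // examinee selects distractor A
    if (item.IsDirty())              // driver asks whether state has changed
        item.Save(examInstance);     // plugin saves its state to the instance store
    std::cout << "persisted answer: " << examInstance.values["answer"] << "\n";
    return 0;
}
```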
VI. Network Environment for a Computer-Based Testing System
A description is provided of a network environment for a computer-based testing system according to the present invention, including a test driver that controls delivery of a computer-based test over a networked environment by caching test components for delivery to a test candidate in order to facilitate a uniform testing environment for one or more concurrent test candidates.
With reference to
The network environment as shown in
From application deployment servers 3040, the test driver application for controlling delivery of the computer-based test may be downloaded for setup on candidate workstation 3000. The test driver application 3070 may be stored on a computer readable medium, such as a hard drive or other magnetic medium, connected to candidate workstation 3000. However, test components of the computer-based test, including the exam resource file, test items, and plugins (all of which were previously described), are stored in random access memory (RAM) to prevent unauthorized copying or manipulation of the computer-based test, thereby assuring the integrity of the test.
Test driver application 3070 includes session manager 3080 for managing the computer-based testing session, having authentication layer 3090 for authenticating proctors of the computer-based test, candidate credential interface 3100 for certifying test candidate eligibility to use the computer-based test, and Unified Test Driver (UTD) core 3110 for controlling delivery of the computer-based test to candidate workstation 3000. UTD core 3110 controls item cache controller 3120, plugin cache controller 3130 and browser presentation layer 3140. Item cache controller 3120 stores test items retrieved from item cache servers 3040 in item cache 4360 (
VII. Caching Architecture for the Computer-Based Testing System
In computer-based testing, a test may be delivered to a candidate workstation using two delivery modes: a disconnected mode (e.g., non-networked) or a connected mode (e.g., networked). A disconnected mode is a traditional delivery mode in which every element required to administer the computer-based test is stored on the test candidate workstation before the computer-based test is initiated. Thus, there is no need to monitor the testing environment to adjust delivery of the computer-based test over a network environment and no need to implement a caching architecture for a network environment. In contrast, a connected mode is a non-traditional delivery mode in which core elements of the computer-based test are stored on the test candidate workstation and additional test components are retrieved during computer-based testing from distribution servers over a network environment. While using the connected mode provides many advantages, such as flexibility in administering computer-based tests to test candidates located in multiple locations, the network environment introduces environment variables that are not controllable by a test administrator without great cost. Thus, in order to provide a uniform testing environment to a varied volume of one or more concurrent test candidates located in multiple locations, a computer-based testing system must adjust delivery of a computer-based test in order to compensate for variance in the network environment.
A description is provided of a caching architecture for a computer-based testing system of the present invention, including a test driver that controls delivery of a computer-based test over a networked environment by caching test components for delivery to a test candidate in order to facilitate a uniform testing environment for one or more concurrent test candidates.
As shown in
Test driver application 3070 further includes item request interface 4340, plugin request interface 4370, request processor 4310, decryption module 4320, decompression module 4330, item request module 4350, plugin request module 4380, cache controller 4400, item cache 4360 and plugin cache 4390. Test driver application 3070 sends requests to retrieve test components via item request interface 4340 and plugin request interface 4370. Request processor 4310 processes requests for test components initiated by test driver application 3070 and delivery of test components retrieved from the distribution servers to test driver application 3070. Item request module 4350 facilitates retrieval of test items from item cache servers 3040. Plugin request module 4380 facilitates retrieval of plugins from the plugin cache servers 3060. Cache controller 4400 manages the storing of test components in item cache 4360 and plugin cache 4390. Item cache 4360 stores test items retrieved from item cache servers 3040. Plugin cache 4390 stores test plugins retrieved from plugin cache servers 3060. Decryption module 4320 decrypts test components that have been encrypted for preserving the integrity of the computer-based test. Decompression module 4330 decompresses test components that have been compressed for transmitting over the network environment.
Descriptions of example caching operations of a computer-based testing system of the present invention, including a test driver that controls delivery of a computer-based test over a networked environment having the caching architecture, are provided. Because the computer-based test comprises cacheable objects, it is possible to download only selected test components from the distribution servers to the candidate workstation for delivering a current test section in accordance with the test specifications. Thus, it is possible for a test candidate to initiate a computer-based test prior to all test components being downloaded to the candidate workstation.
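A minimal sketch of this cache-or-fetch behavior follows; the fetch function, server address, and in-memory map are assumptions for illustration, since the patent does not prescribe a particular transport or cache data structure.

```cpp
// Hypothetical sketch: serve a test item from the local cache if present,
// otherwise fetch it from a distribution server and cache it.
#include <map>
#include <string>
#include <optional>
#include <iostream>

using Bytes = std::string;

// Stand-in for a request to an item cache server over the network.
std::optional<Bytes> FetchFromServer(const std::string& itemId) {
    return Bytes("<item id='" + itemId + "'>...</item>");  // simulated response
}

class ItemCache {
    std::map<std::string, Bytes> cache_;  // held in RAM, mirroring the description above
public:
    std::optional<Bytes> Get(const std::string& itemId) {
        auto it = cache_.find(itemId);
        if (it != cache_.end()) return it->second;      // cache hit
        if (auto fetched = FetchFromServer(itemId)) {    // cache miss: download
            cache_[itemId] = *fetched;
            return fetched;
        }
        return std::nullopt;  // server unreachable and item not cached
    }
};

int main() {
    ItemCache cache;
    cache.Get("item_001");                   // first access: fetched and cached
    auto again = cache.Get("item_001");      // second access: served from cache
    std::cout << (again ? "hit\n" : "miss\n");
    return 0;
}
```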
A similar process occurs when test driver application 3070 requires a plugin.
Caching of a computer-based test for delivery to candidate workstation 3000 is facilitated in accordance with the changing testing environment during computer-based testing, as reflected by monitoring of candidate progress, candidate performance, network bandwidth, network latency, and server response, among other environmental variables. With reference to
Descriptions of caching operations of a computer-based testing system according to the present invention are provided. Generally, stimuli processor 4410 periodically initiates an inquiry to candidate progress monitor 4430, candidate performance monitor 4420, network bandwidth monitor 4450, network latency monitor 4440 and server response monitor 4460 during computer-based testing. The results of each monitor are returned to stimuli processor 4410. Based on these results, stimuli processor 4410 adjusts either the source of test components or the volume of test components being cached for delivery of the computer-based test, and instructs cache controller 4400 accordingly. Examples of the operations of the testing environment monitors (candidate progress monitor 4430, candidate performance monitor 4420, network bandwidth monitor 4450, network latency monitor 4440 and server response monitor 4460) are now described.
For example, candidate progress monitor 4430 measures the test candidate's rate of progress in answering test items during computer-based testing for maintaining availability of test items.
In another example, candidate performance monitor 4420 measures the test candidate's competency in answering test items during computer-based testing for maintaining availability of suitable test items.
A further example is network bandwidth monitor 4450 which measures the speed of data transfer between candidate workstation 3000 and a distribution server for maintaining timely availability of test items.
In an additional example, network latency monitor 4440 measures the delay in data transmission time between candidate workstation 3000 and the distribution servers caused by the network, for re-establishing connectivity during computer-based testing.
In another example, server response monitor 4460 measures the delay in data transmission time between candidate workstation 3000 and the distribution servers caused by the distribution servers for maintaining accessibility to a source for test components during computer-based testing.
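The periodic inquiry loop described above can be sketched as follows; the monitor readings, thresholds, and prefetch heuristic are assumptions chosen for illustration, since the patent does not prescribe specific values.

```cpp
// Hypothetical sketch of the stimuli processor polling its environment
// monitors and adjusting how many items to cache ahead.  All numbers are
// illustrative assumptions.
#include <vector>
#include <memory>
#include <algorithm>
#include <iostream>

class IEnvironmentMonitor {
public:
    virtual ~IEnvironmentMonitor() = default;
    virtual double Sample() = 0;  // normalized reading, 0 (poor) .. 1 (good)
};

class BandwidthMonitor : public IEnvironmentMonitor {
public:
    double Sample() override { return 0.4; }  // e.g. measured transfer rate, normalized
};

class CandidateProgressMonitor : public IEnvironmentMonitor {
public:
    double Sample() override { return 0.8; }  // e.g. rate of answering items, normalized
};

class StimuliProcessor {
    std::vector<std::unique_ptr<IEnvironmentMonitor>> monitors_;
public:
    void AddMonitor(std::unique_ptr<IEnvironmentMonitor> m) { monitors_.push_back(std::move(m)); }
    // Combine readings into a prefetch depth for the cache controller.
    int DecidePrefetchDepth() {
        double worst = 1.0;
        for (auto& m : monitors_) worst = std::min(worst, m->Sample());
        // Poorer conditions -> cache more items ahead to smooth out delays.
        return worst < 0.5 ? 10 : 3;
    }
};

int main() {
    StimuliProcessor sp;
    sp.AddMonitor(std::make_unique<BandwidthMonitor>());
    sp.AddMonitor(std::make_unique<CandidateProgressMonitor>());
    std::cout << "prefetch depth: " << sp.DecidePrefetchDepth() << "\n";
    return 0;
}
```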
The present invention is not limited to the embodiment described herein. For example, the test driver application and cacheable test components may be stored on the same distribution servers for delivery to test candidates. In addition, the distribution servers, in groups or singly, can be located in any number of remote locations. Furthermore, the testing environment monitors can include other monitors not specifically described herein. Thus, cacheable objects are used to expand functionality of a test driver application that controls delivery of a computer-based test to one or more test candidates over a dynamic distributed network environment by adapting delivery of the computer-based test in accordance with monitoring of testing environment variables.
The many features and advantages of the invention are apparent from the detailed specification, and thus, it is intended by the appended claims to cover all such features and advantages of the invention, which fall within the true spirit and scope of the invention. Further, since numerous modifications and variations will readily occur to those skilled in the art, it is not desired to limit the invention to the exact construction illustrated and described, and accordingly, all suitable modifications and equivalents may be resorted to, falling within the scope of the invention.