WO2004107087A2 - Generating test cases - Google Patents

Generating test cases

Info

Publication number
WO2004107087A2
Authority
WO
WIPO (PCT)
Prior art keywords
test
input
cases
flow graph
user
Prior art date
Application number
PCT/IN2004/000145
Other languages
French (fr)
Other versions
WO2004107087A3 (en)
Inventor
Venkata Regunathan
Original Assignee
Flextronics Design Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Flextronics Design Limited
Publication of WO2004107087A2
Publication of WO2004107087A3

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3684 Test management for test design, e.g. generating new test cases

Abstract

A user may provide the specification of an application in the form of flow-graphs (230) to a computer system. The computer system examines (traverses) (250) the flow graph to generate test paths. The user may then specify different input value combinations associated with each test path to generate a corresponding number of test cases (270). By using such an approach, the application may potentially be tested exhaustively.

Description

GENERATING TEST CASES
Background of the Invention
Field of the Invention
The present invention relates to testing of applications, and more specifically to a method and apparatus for generating test cases used to test an application.
Related Art
Testing generally refers to evaluation of an application (designed to perform a specific task) to determine its conformance to requirements, performance and/or reliability. As used in the present application, an application may be implemented in the form of a combination of one or more of hardware, software and firmware.
A typical testing transaction entails providing a set of input values to an application and determining whether the application behaves in a desired fashion. Often, in applications such as graphical user interfaces implemented using software, the input values of a set are provided in sequence, and the corresponding output/behavior is verified against the expected output/behavior. The set of input/output values for a single transaction is referred to as a test case.
One challenge often encountered in application development is the generation of test cases, which are generally intended to test the application exhaustively. In one approach, a test designer examines the specifications and/or uses the application to manually and intuitively identify the test cases. The identified test cases are then used to test the application (or various versions thereof as the application evolves with fewer problems/bugs and/or enhanced features).
One problem with such an approach is that a test designer may unintentionally omit test cases which could uncover possible problems/bugs in the application implementation. Often such problems are uncovered later, when the application is deployed on a wide scale and users exercise portions of the application that were not tested exhaustively. At least to avoid such unintentional omission of useful test cases, an improved approach to generating test cases is desirable. Systematically generating exhaustive test cases to achieve the required test coverage remains a problem faced by test designers.
Brief Description of the Drawings
The present invention will be described with reference to the following accompanying drawings:
Figure (Fig.) 1 is a block diagram illustrating an example system in which the present invention can be implemented.
Figure 2 is a flowchart illustrating an example method using which test cases may be generated according to an aspect of the present invention.
Figure 3 is a display screen illustrating an example application using use cases for which test cases can be generated according to various aspects of the present invention.
Figure 4A illustrates a use case diagram of the 'Edit Project' use case implemented according to an aspect of the present invention.
Figure 4B is a display screen illustrating the manner in which the 'Edit Project' use case is represented using a flow-graph in an embodiment of the present invention.
Figure 5 is a display screen which depicts the test paths generated for 'Edit Project' in an embodiment of the present invention.
Figure 6A is a display screen illustrating the manner in which variables (used later for defining test cases) of all the nodes in the 'Edit Project' use case flow-graph may be specified in an embodiment of the present invention.
Figure 6B is a pop-up screen illustrating the manner in which variables and types may be specified for each node in an embodiment of the present invention.
Figure 7A is a display screen illustrating the manner in which input/output (I/O) variables for each node may be specified in an embodiment of the present invention.
Figure 7B is a window 780 illustrating the manner in which input/output variables, types and values for each node may be specified in an embodiment of the present invention.
Figure 8A is a display screen illustrating the manner in which test cases may be generated in an embodiment of the present invention.
Figure 8B is a display screen illustrating the manner in which a user may be provided a convenient interface for specifying various input variables corresponding to a test case according to an aspect of the present invention.
Figure 9 is a network-centric architecture illustrating implementation of several aspects of the present invention.
In the drawings, like reference numbers generally indicate identical, functionally similar, and/or structurally similar elements. The drawing in which an element first appears is indicated by the leftmost digit(s) in the corresponding reference number.
Detailed Description of the Preferred Embodiments
1. Overview
An aspect of the present invention examines a flow-graph representing the specification of an application, and generates test cases by traversing the flow graph. As the flow-graph permits flexibility such as the representation of loops in an application flow/use, application flow/use can be effectively specified by flow-graphs. By traversing the flow-graph, test paths covering potentially all the paths (and all input combinations) of the application can be systematically generated.
Several aspects of the invention are described below with reference to examples for illustration. It should be understood that numerous specific details, relationships, and methods are set forth to provide a full understanding of the invention. One skilled in the relevant art, however, will readily recognize that the invention can be practiced without one or more of the specific details, or with other methods, etc. In other instances, well-known structures or operations are not shown in detail to avoid obscuring the invention.
2. Example System
Figure 1 is a block diagram of computer system 100 illustrating an example system in which various aspects of the present invention can be implemented. Computer system 100 may contain one or more processors such as central processing unit (CPU) 110, random access memory (RAM) 120, secondary memory 130, graphics controller 160, display unit 170, network interface 180, and input interface 190. All the components except display unit 170 may communicate with each other over communication path 150, which may contain several buses as is well known in the relevant arts. The components of Figure 1 are described below in further detail.
CPU 110 may execute instructions stored in RAM 120 to provide several features of the present invention. CPU 110 may contain multiple processing units, with each processing unit potentially being designed for a specific task. Alternatively, CPU 110 may contain only a single processing unit. RAM 120 may receive instructions from secondary memory 130 using communication path 150. Data representing test cases of various paths of an application may be stored and retrieved from secondary memory 130 (and/or RAM 120) during the execution of the instructions.
Graphics controller 160 generates display signals (e.g., in RGB format) to display unit 170 based on data/instructions received from CPU 110. Display unit 170 contains a display screen to display the images defined by the display signals. Input interface 190 may correspond to a key-board and/or mouse, and generally enables a user to provide inputs. Network interface 180 enables some of the inputs (and outputs) to be provided on a network. Display unit 170, input interface 190 and network interface 180 may be used to enable a user to generate test cases for applications. Display unit 170, input interface 190 and network interface 180 may be implemented in a known way.
Secondary memory 130 may contain hard drive 135, flash memory 136 and removable storage drive 137. Secondary memory 130 may store the data (e.g., use cases, flow-graphs, test cases, described in detail in sections below) and software instructions (e.g., to generate test cases), which enable computer system 100 to provide several features in accordance with the present invention. Some or all of the data and instructions may be provided on removable storage unit 140, and the data and instructions may be read and provided by removable storage drive 137 to CPU 110. A floppy drive, magnetic tape drive, CD-ROM drive, DVD drive, flash memory, or removable memory chip (PCMCIA card, EPROM) are examples of such a removable storage drive 137.
Removable storage unit 140 may be implemented using a medium and storage format compatible with removable storage drive 137 such that removable storage drive 137 can read the data and instructions. Thus, removable storage unit 140 includes a computer readable storage medium having stored therein computer software and/or data. An embodiment of the present invention is implemented using software running (that is, executing) in computer system 100.
In this document, the term "computer program product" is used to generally refer to removable storage unit 140 or hard disk installed in hard drive 135. These computer program products are means for providing software to computer system 100. CPU 110 may retrieve the software instructions, and execute the instructions to provide various features of the present invention as described below in detail.
3. Method
Figure 2 is a flowchart illustrating an example method using which test cases may be generated according to an aspect of the present invention. The description is provided with reference to Figure 1 merely for illustration. Various aspects of the present invention can be implemented in other systems as well. The flowchart starts in step 201, in which control immediately passes to step 210.
In step 210, computer system 100 may enable a user to specify use cases associated with high level functionalities of an application sought to be implemented. In step 220, computer system 100 enables the user to expand higher level use cases into one or more lower level use cases. Use cases are described in further detail in the books entitled "Applying Use Cases" (ISBN 0-201-30981-5) by Geri Schneider and Jason P. Winters, and "Object-Oriented Software Engineering: A Use Case Driven Approach" (ISBN 0-201-54435-0) by Ivar Jacobson et al. Example high level functionalities and lower level use cases are illustrated in further detail with an example in sections below.
In step 230, computer system 100 may enable the user to generate a flow graph representing each use case. A flow-graph generally contains several nodes connected by edges. A transition from one node to another generally occurs on an event (including providing an appropriate input(s) of a test case, generating a corresponding output(s), etc.). Each node may be viewed as realizing a behavior or functionality, as in the specification, for a corresponding portion of the application, and then control is transferred to a next node as specified in the flow graph. An example approach to represent a flow-graph is described below in further detail.
Each node may represent an atomic operation, a use case, a branch or a junction of multiple paths. A use case at a node can in turn be defined in further detail with lower level nodes and edges. The recursive definition of nodes representing use cases can continue until none of the nodes in a flow-graph at that level represents a use case. In addition, loops in a flow graph enable repeated invocation of use cases and atomic operations. In conjunction with branches, loops generally provide a mechanism to implement complex functionalities.
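As an illustration of such a hierarchical flow-graph, the following sketch (in Python) models nodes that are atomic operations, decisions/junctions, or nested use cases carrying their own sub-graph, with edges labelled by the event that triggers the transition. The class and field names are illustrative only and are not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class Node:
    """A flow-graph node: an atomic operation, a decision, a junction, or a nested use case."""
    name: str
    kind: str = "atomic"                      # "atomic", "decision", "junction" or "use_case"
    sub_graph: Optional["FlowGraph"] = None   # set only when kind == "use_case"

@dataclass
class FlowGraph:
    """Nodes plus event-labelled edges: edges[src][event] gives the target node name."""
    nodes: Dict[str, Node] = field(default_factory=dict)
    edges: Dict[str, Dict[str, str]] = field(default_factory=dict)
    start: str = ""
    end: str = ""

    def add_node(self, node: Node) -> None:
        self.nodes[node.name] = node

    def add_edge(self, src: str, event: str, dst: str) -> None:
        self.edges.setdefault(src, {})[event] = dst
```

Because a "use_case" node simply carries another FlowGraph, the recursive expansion described above bottoms out once no node of a graph is of that kind.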
In step 250, computer system 100 may identify various paths in the flow graphs by traversing the flow-graphs. The paths can potentially traverse a single use case or multiple use cases, potentially at different levels/granularity of definition. In an embodiment, a set of test paths is generated to thoroughly test all paths of a use case, and the tested use case is then used as a single node ('black box') while testing the application at a higher level of granularity. Such an approach enables the overall number of test cases to be minimized while still potentially exhaustively testing all paths of an application in a structured and bottom-up manner.
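One plausible way to perform the traversal of step 250 is a depth-first search that bounds how often a node may be revisited, so that loops contribute only a finite number of repeated iterations. The sketch below is written under that assumption and is not the patent's actual algorithm; the function name and the default loop bound are illustrative.

```python
from typing import Dict, List

def enumerate_test_paths(edges: Dict[str, Dict[str, str]],
                         start: str, end: str,
                         max_visits: int = 2) -> List[List[str]]:
    """Enumerate event sequences leading from start to end by depth-first search.

    max_visits bounds how often any node may be entered, so a loop in the flow
    graph contributes only a bounded number of repeated iterations to the paths.
    """
    paths: List[List[str]] = []

    def dfs(node: str, events: List[str], visits: Dict[str, int]) -> None:
        if node == end:
            paths.append(list(events))
            return
        for event, target in edges.get(node, {}).items():
            if visits.get(target, 0) >= max_visits:
                continue                      # stop unrolling the loop any further
            visits[target] = visits.get(target, 0) + 1
            events.append(event)
            dfs(target, events, visits)
            events.pop()
            visits[target] -= 1

    dfs(start, [], {start: 1})
    return paths
```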
In step 270, computer system 100 may enable a user to provide inputs and outputs to define the test cases for each path. Various types of inputs and outputs may be provided according to their data types, and various approaches may be employed in generating test cases based on the user inputs. In general, the test cases need to be designed to ensure that an application operates as desired for correct input, and yet handles unexpected inputs gracefully.
Thus, in an embodiment, a user specifies the type of input(s)/output(s) (e.g., integer, text, character) and the range of permissible/correct input(s)/output(s), and computer system 100 may automatically generate test cases according to formal testing techniques (boundary-value analysis, equivalence partitioning, etc.). Designing such test cases will be apparent to one skilled in the relevant arts by reading the disclosure provided herein. In step 290, the test cases thus generated may be used to test the application. Such testing may be performed manually or using an appropriate automation technique (such as generating test scripts which automatically feed the correct sequence of inputs to an application and verify the actual outputs against the expected outputs while testing). Such testing techniques are not described in the present application, as they are not necessary for an understanding of the present invention. Control then passes to step 299, in which the method ends.
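As a minimal illustration of the boundary-value analysis and equivalence partitioning mentioned for step 270, candidate values might be derived from a declared type and permissible range as sketched below; the concrete rules, names and ranges are assumptions for illustration and are not taken from the patent.

```python
from typing import List, Optional, Union

def candidate_values(var_type: str,
                     lo: Optional[int] = None, hi: Optional[int] = None,
                     max_len: int = 8) -> List[Union[int, str]]:
    """Boundary-value / equivalence-class candidates for one declared input variable.

    For an integer with permissible range [lo, hi]: values just outside, on and
    inside each boundary plus a nominal mid value.  For text: empty, typical and
    over-length strings.  Real tools would cover many more classes.
    """
    if var_type == "integer":
        return [lo - 1, lo, lo + 1, (lo + hi) // 2, hi - 1, hi, hi + 1]
    if var_type == "text":
        return ["", "a", "a" * max_len, "a" * (max_len + 1)]
    raise ValueError(f"unsupported type: {var_type}")

# Example: a 'Count' variable documented as permitting 1..100
print(candidate_values("integer", lo=1, hi=100))   # [0, 1, 2, 50, 99, 100, 101]
```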
Thus, using the approaches described above, test cases can be generated for various applications. It may be appreciated that the flow-graphs are described as being associated with use cases merely to integrate the approach into some common design methodology; however, various aspects of the present invention can be implemented without the use of such use cases. The manner in which the approach(es) above can be used is described in further detail with reference to an example application below.
4. Example Application
Figure 3 is a display screen illustrating an example application using use cases for which test cases can be generated according to various aspects of the present invention. The display screen is shown containing two windows 305 and 315. Window 305 is shown containing a root directory 'C-TestTool', which in turn is shown containing the sub-classes 'class diagram', 'use case diagram', Tester (user) and the use cases shown in window 315.
Window 315 is shown containing user 310 specifying four use cases: Authenticate User 320, Open Project 330, Edit Project 340, and Close Project 350. Each use case is assumed to represent a high level functionality. Each use case may be defined in further detail as described with reference to Edit Project 340.
Figure 4A depicts the further detailed definition of Edit Project 340. As may be appreciated, the user is shown specifying that Edit Project 340 uses the use cases Save 410 and Edit Window 415. Edit Window 415 is in turn shown using the use cases Project Settings 420, Set Compile Options 425 and Environment Settings 430. Figure 4B is a display screen containing two windows 440 and 445. Window 440 is shown containing a directory 'C-TestTool', which in turn is shown containing two use case flow graphs, 'uc1EditProjectFlowGraph' (representing Edit Project) and 'uc2OpenProjectFlowGraph'. User 310 may use the 'View' option in pop-up menu 495 to view the flow-graph in window 445. The user may continue to edit the flow-graph to represent the specification of the application sought to be tested.
Window 445 depicts the flow-graph for Edit Project 340. The flow graph is shown containing starting node 435, decision boxes 450 and 480, junctions 460 and 470, Edit Window 415, Project Settings 420, Set Compile Options 425, Environment Settings 430, Save 410, and end junction 490. The nodes are shown connected by various edges. The manner in which the nodes and edges together specify application flow is described below in further detail.
Transition to decision box 450 from starting node 435 is shown to occur on event e2.
From decision box 450, control transfers to Edit Window 415 on event e3, and to end junction 490 on event e4. From Edit Window 415, control is shown transferring to junction 460 on event e5. From junction 460, control is transferred to Project Settings 420, Set Compile Options 425 and Environment Settings 430 respectively upon events e7, e6, and e8.
Control transfers to junction 470 from Project Settings 420, Set Compile Options 425 and Environment Settings 430 upon the occurrence of respective events e14, e9 and e15. From junction 470, control transfers to Save 410 upon occurrence of event e10. From Save 410, control transfers to decision box 480 on event e11. From decision box 480, control transfers to end junction 490 on event e13, and to Edit Window 415 on event e12. Thus, the flow graph of Figure 4B is shown containing a loop between nodes 415, 460, 470, 410 and 480.
The description is continued with reference to the manner in which test paths may be generated from flow-graphs. While a simple example is described for illustration, it should be understood that the approaches are applicable to complex applications as well.
5. Test Paths from Flow Graphs
Continuing with reference to Figure 4B, user 310 may use the 'Create Paths' option in pop-up menu 495 to create test paths from the flow-graph in window 445. Figure 5 depicts the test paths generated for Edit Project 340. As may be appreciated, the nodes in the flow-graph (of Figure 4B) are traversed to determine the various possible paths, which form the test paths.
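For illustration, the flow-graph of Figure 4B can be written down directly as an event-labelled adjacency and fed to the enumeration sketch of Section 3 (assuming the enumerate_test_paths helper is in scope). The node identifiers below are the figure's reference numerals; the loop bound is an assumption, so the number of paths produced may differ from the seven shown in Figure 5.

```python
# Event-labelled adjacency for the Edit Project flow graph of Figure 4B;
# the keys are the reference numerals of the nodes in the figure.
edit_project_edges = {
    "435": {"e2": "450"},                            # start -> decision box 450
    "450": {"e3": "415", "e4": "490"},               # edit, or quit immediately
    "415": {"e5": "460"},                            # Edit Window -> junction 460
    "460": {"e7": "420", "e6": "425", "e8": "430"},  # pick a settings node
    "420": {"e14": "470"},
    "425": {"e9": "470"},
    "430": {"e15": "470"},
    "470": {"e10": "410"},                           # junction 470 -> Save 410
    "410": {"e11": "480"},
    "480": {"e13": "490", "e12": "415"},             # quit, or loop back to editing
}

# Assuming the enumerate_test_paths sketch from Section 3 is in scope:
paths = enumerate_test_paths(edit_project_edges, start="435", end="490")
print(["e2", "e4"] in paths)                                          # True (path P1)
print(["e2", "e3", "e5", "e6", "e9", "e10", "e11", "e13"] in paths)   # True (path P2)
```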
Window 505 is shown containing seven paths (P1 - P7), with each path indicating a test path. Some of the paths are described below for illustration. As may be appreciated, each path is conveniently identified by the event symbols associated with transitions. However, each path may also be identified by the corresponding nodes.
Path 510 is shown depicting test path P1 = (e2, e4). The path corresponds to a situation in which event e2 causes a transition to decision node 450, and event e4 causes control to be transferred to end node 490. As may be appreciated, the transfer of control from Edit Project 340 to end node 490 indicates that user 310 opted to quit editing.
Path 520 is shown depicting P2 = (e2, e3, e5, e6, e9, e10, e11, e13), corresponding to a situation in which user 310 opts to set compile options, save the changes and quit editing. Test path P3 in path 530 is shown equaling (e2, e3, e5, e6, e9, e10, e11, e12), which corresponds to a situation in which a user returns to editing after saving changes related to the environment. The remaining test paths P4 - P7 are described similarly. The description is continued with reference to generating test cases from the test paths.
6. Generating Test Cases from Test Paths
It may be appreciated that a user may need to provide different combinations of input/output values to test each identified path exhaustively. Various interfaces may be provided to simplify the task of providing such input/output values, as will be apparent to one skilled in the relevant arts familiar with functional testing techniques (such as boundary value analysis, equivalence class partitioning, etc.). Such interfaces are intended to be covered by various aspects of the present invention. According to one approach, a convenient interface is provided enabling a user to specify the input/output variables associated with each test path. A number of test cases can then be generated by providing combinations of input/output values for the specified input/output variables. Various combinations of input/output values can be provided to test the path exhaustively. The selection of the combinations of input/output values can be based on formal functional testing techniques. Thus, the description is continued with reference to an example interface which facilitates the user in specifying the various input/output variables that may be relevant to testing each test path.
7. Identifying Input/Output Variables for Each Test Path
Figure 6A is a display screen illustrating the manner in which variables of all the nodes in the Edit Project use case flow-graph may be specified in an embodiment of the present invention.
When a flow-graph represents the detail of a use case represented at higher levels of granularity, the information on variables may be received from displays/interfaces corresponding to such higher levels.
In window 610, the user is shown highlighting the use case 'EditProjectFlowGraph' and causing pop-up menu 495 to be activated/generated by an appropriate action (e.g., by clicking the appropriate button of a mouse). The user is then shown selecting the 'Define Variables' option to cause pop-up window 650 of Figure 6B to be displayed. The user is shown specifying the input/output variables (File Name, Path Name, Drive Name, Compiler, Java Version, Count) in column 654 and the variable types in column 658. Similar to type, other attributes for variables can be added. This attribute information can be used to ensure that the application can handle these variables gracefully.
The user indicates completion of specifying the input/output variables by clicking on button 651. The input/output data may be caused to be ignored by clicking on button 652. The information thus provided can be used in identifying the input/output variables for each path as described below in further detail with reference to Figure 7A.
Figure 7A is a display screen illustrating the manner in which input/output (I/O) variables for each node (in the context of a test path) may be specified in an embodiment of the present invention. The input/output variables thus specified may be consolidated to generate the input/output variables associated with each test path.
In window 760, the flow-graph corresponding to Edit Project 340 is shown. The user is shown selecting (by underlining) test path P2 in window 710, and the corresponding path in window 760 is shown highlighted (bold lines). Pop-up menu 775 is shown when the user clicks on Edit Window 415. Window 780 may be displayed when the user selects the 'Edit I/O' option.
Window 780 of Figure 7B is shown containing tables 785 and 789, respectively, for indicating the input variables and output variables associated with the selected path for node 415. Table 785 is shown containing three columns 781, 782, and 783, respectively representing the variable, type, and default value (Val1). The input and output variables in columns 781 and 786 need to be contained in variables column 654 of Figure 6B, and appropriate checks may be performed before accepting the variables specified in columns 781 and 786.
The variables specified in columns 654 and 658 may be displayed, and a user may be permitted to 'drag' each displayed variable as an input variable or output variable into columns 781 and 786. The implementation of several such interfaces will be apparent to one skilled in the relevant arts.
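A sketch of the check mentioned above, namely that per-node variables must come from the flow-graph-level table of Figure 6B, might look as follows; the variable names are reused from that figure, while the function name and return convention are assumptions.

```python
from typing import Dict, List

def undeclared_variables(declared: Dict[str, str],
                         node_inputs: List[str],
                         node_outputs: List[str]) -> List[str]:
    """Variables a node uses that were never declared at the flow-graph level
    (columns 654/658 of Figure 6B); an empty result means the node passes the check."""
    return [v for v in node_inputs + node_outputs if v not in declared]

declared = {"File Name": "text", "Path Name": "text", "Drive Name": "text",
            "Compiler": "text", "Java Version": "text", "Count": "integer"}
print(undeclared_variables(declared, ["File Name", "Pathname"], []))   # ['Pathname']
```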
It should be understood that the input/output (I/O) variables for a node may be different in each test path. For example, when a node represents a use case, different paths internal to the use case may be tested in different test paths, and accordingly the I/O variables for a node may be different in different test paths. Thus, an interface such as that described in Figures 7A and 7B may be used to define the I/O variables and values for all the nodes in each test path.
The set of input/output variables for a test path may be determined by combining the input/output variables specified for the various nodes in the context of the test path. The determined set may be used to provide a convenient user interface to define test cases, as described below.
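A minimal sketch of this consolidation step, assuming the per-node declarations are kept per test path, is shown below; the data and the helper name are hypothetical, loosely modelled on Figures 6B and 7B.

```python
from typing import Dict, List, Tuple

# node_io[path_id][node] = (input variable names, output variable names)
NodeIO = Dict[str, Dict[str, Tuple[List[str], List[str]]]]

def path_variables(node_io: NodeIO, path_id: str) -> Tuple[List[str], List[str]]:
    """Union of the I/O variables declared for each node of a test path,
    preserving first-seen order so a test-case form can list them predictably."""
    inputs: List[str] = []
    outputs: List[str] = []
    for ins, outs in node_io.get(path_id, {}).values():
        inputs += [v for v in ins if v not in inputs]
        outputs += [v for v in outs if v not in outputs]
    return inputs, outputs

# Hypothetical per-node declarations in the spirit of Figures 6B and 7B:
node_io = {"P2": {"415": (["File Name", "Path Name", "Drive Name"], []),
                  "425": (["Compiler", "Java Version"], []),
                  "410": (["File Name"], ["Count"])}}
print(path_variables(node_io, "P2"))
# (['File Name', 'Path Name', 'Drive Name', 'Compiler', 'Java Version'], ['Count'])
```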
8. Defining Input/Output Values for Input/Output Variables
Broadly, several test cases may be generated from each test path by using a corresponding number of combinations of input/output values for the input/output variables identified for each path. In general, it is desirable to test a test path using different values based on formal testing techniques, some of which are contemplated as expected values and others which may be due to user errors. It is typically desirable to ensure that the application conforms to a desired behavior in the case of expected inputs, in addition to gracefully handling (e.g., ignoring, with an appropriate comment) any unexpected input values (e.g., a number provided as an input when the expectation is a character).
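One straightforward realisation of "a corresponding number of combinations" is a Cartesian product over the candidate values of each input variable of the path, as sketched below. The candidate values and function name are illustrative assumptions, and real tools would typically prune this product using the optimization criteria discussed later.

```python
from itertools import product
from typing import Any, Dict, List

def combine_into_test_cases(path_id: str,
                            values_per_input: Dict[str, List[Any]]) -> List[Dict[str, Any]]:
    """One test case per combination of candidate values (Cartesian product).

    values_per_input maps each input variable of the test path to its candidate
    values: expected ones plus deliberately invalid ones for graceful-handling checks.
    """
    names = list(values_per_input)
    return [{"path": path_id, "inputs": dict(zip(names, combo))}
            for combo in product(*(values_per_input[n] for n in names))]

# Hypothetical candidates for two of the Figure 6B variables:
cases = combine_into_test_cases("P2", {"Compiler": ["gcc", "javac", ""],
                                       "Count": [0, 1, 100, 101]})
print(len(cases))        # 12 combinations, i.e. 12 candidate test cases
```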
Keeping such considerations in mind, various test cases can be generated from the test paths identified above, as described below with reference to Figures 8A and 8B with examples. It should be further understood that not all input/output variables of a test path may need to be provided for a test case. A user (defining test cases) may determine such a subset based on an understanding of the application flow. The description is continued with reference to an example user interface enabling a user to define test cases for test paths.
Figure 8A is a display screen illustrating the manner in which test cases may be generated in an embodiment of the present invention. Window 810 is shown displaying various test paths, and the user is shown selecting one of the test paths (underlined). Pop-up menu 830 is shown generated upon an appropriate user action (such as right-clicking on the selected test path). The user may generate test cases by selecting the option entitled 'Define TC', as described below with reference to Figure 8B.
Figure 8B is a display screen (generated in response to selection of the 'Define TC' option in Figure 8A) illustrating the manner in which a user may be provided a convenient interface for specifying various input/output variables corresponding to a test case. As may be readily observed, columns 861, 862, and 863 respectively note the node name in a test path, the input/output variable name and the variable type. Based on this information, the input/output values provided are shown in columns 864 and 865.
Assuming only three of the nodes in the path need to be provided input/output variables, and that the three nodes respectively have 3 (File Name, Path Name and Drive Name, as defined above with reference to Figure 6B), 3, and 1 input variables, the corresponding 7 input variables may be provided. Once the user clicks on button 871, the test case may be accepted and saved. The saving of the test case may be reflected by an additional entry below the already present test case 812. A user may edit a test case displayed in area 810 by clicking on the test case.
Using the interface of Figures 8A and 8B, a user may specify as many test cases as desired for each test path. The input variables of a test case may be provided to an application by any convenient user interface (including just manually). Thus, the approach(es) and user interfaces described above can be used to generate test cases for an application.
For a test path, using formal testing techniques on the nodes, test cases can be generated automatically. In such automatically generated test cases, a user may edit or modify the input/output values of variables by selecting the test case in area 810 and interacting with pop-up menu 830.
In general, the approach(es) described above may be implemented in various environments as will be apparent to one skilled in the relevant arts by reading the disclosure provided herein. Some general considerations in an example embodiment are noted below briefly.
9. Test Case Optimization and Test Completion Criteria
Using the test path generation from flow graphs (Section 5) and the generation of test cases from test paths (Section 6), an appropriate level of test case optimization and test stop criteria can be defined. The optimization of test cases can be derived based on the hierarchy of the flow graph, the loop counts, traversal of all nodes, traversal of all decision blocks in the flow graph, traversal of unique paths in the flow graphs, and the usage of formal techniques (equivalence partitioning, boundary value analysis, etc.). Similarly, the test coverage can be computed based on the paths actually traversed, and test completion criteria can be achieved. The implementation of the generation of optimal test cases, the computation of such test coverage and the test completion criteria will be apparent to one skilled in the relevant arts by reading the disclosure provided herein.
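A simple sketch of one such completion criterion is path coverage computed as the fraction of enumerated test paths actually exercised; the function names and the target threshold are assumptions, and other criteria (node or decision coverage, loop counts) could be computed in the same spirit.

```python
from typing import List

def path_coverage(all_paths: List[List[str]], executed: List[List[str]]) -> float:
    """Fraction of the enumerated test paths exercised at least once."""
    wanted = {tuple(p) for p in all_paths}
    done = {tuple(p) for p in executed} & wanted
    return len(done) / len(wanted) if wanted else 1.0

def testing_complete(all_paths: List[List[str]],
                     executed: List[List[str]],
                     target: float = 1.0) -> bool:
    """One possible completion criterion: stop once path coverage reaches the target."""
    return path_coverage(all_paths, executed) >= target
```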
9. Implementation Considerations
In one embodiment, software instructions are implemented in computer system 100 to provide an editor (or editors) using which a user generates use cases (e.g., as in Figure 3) and flow-graphs (Figure 4B). The software instructions may keep track of the various use cases specified and provide a suitable interface by which the user can specify flow-graphs. For example, the use cases for which test cases are already generated may be shown in one color
(e.g., green), and the remaining use cases in another color (e.g., red) in area 440 of Figure 4B.
As the user continues to edit/define a flow-graph for a use case, the software instructions need to store appropriate data representing the nodes and connections, which together define the topology of the flow-graph for the use case. Data structures such as linked lists and/or arrays (with appropriate pointers) may be used to represent the topology. The data structures may need to further enable storing of the various I/O variable names associated with each node for each path.
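As an illustrative (non-normative) example, an adjacency-list dictionary is one such data structure; the node names and variables below are invented for the sketch and only loosely mirror the File Name/Path Name/Drive Name example of Figure 6B.

```python
# A minimal sketch of how the flow-graph topology and per-node I/O variable
# names might be stored; the names are illustrative, not part of the disclosure.
flow_graph = {
    # node name -> list of successor node names (adjacency list = topology)
    "start":       ["open_dialog"],
    "open_dialog": ["validate", "cancel"],   # a decision node has multiple successors
    "validate":    ["open_dialog", "end"],   # the edge back to "open_dialog" forms a loop
    "cancel":      ["end"],
    "end":         [],
}

node_variables = {
    # node name -> I/O variable names associated with that node
    "open_dialog": ["File Name", "Path Name", "Drive Name"],
    "validate":    ["Status"],
}
```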
The software instructions need to be designed to traverse the flow-graph to determine the various test paths. The implementation of storing the topology, traversing it, and determining test paths will be apparent to one skilled in the relevant arts by reading the disclosure provided herein. The test paths may also be stored in appropriate data structures.
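A minimal sketch of one possible traversal, assuming the adjacency-list representation shown above, is a depth-first search that bounds how many times any node may be revisited, so that loops yield a finite set of test paths. The disclosure leaves the exact traversal algorithm open; the names below are illustrative.

```python
def enumerate_test_paths(graph, start, end, max_visits=2):
    """Enumerate paths from start to end, revisiting any node at most
    max_visits times so that loops contribute a bounded number of paths."""
    paths = []

    def dfs(node, path, visits):
        if node == end:
            paths.append(path)
            return
        for succ in graph.get(node, []):
            if visits.get(succ, 0) < max_visits:
                new_visits = dict(visits)
                new_visits[succ] = new_visits.get(succ, 0) + 1
                dfs(succ, path + [succ], new_visits)

    dfs(start, [start], {start: 1})
    return paths

# Example with the sketch graph above: yields the straight path, the cancel
# path, and the paths that take the validate -> open_dialog loop once.
# for p in enumerate_test_paths(flow_graph, "start", "end"):
#     print(" -> ".join(p))
```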
The software instructions may further be designed to enable a user to specify various input variables associated with each test path to generate a test case. The test cases thus generated may be used for testing the application. While the above description is provided with reference to computer system 100 operating as a stand-alone system, it should be understood that test generation tools may be implemented using multiple systems. For example, a network-centric architecture depicted in Figure 9 may be used. In such an environment, server system 950 (which may be implemented using a web server and/or an application server, widely available in the market place) provides the various screens of Figures 3, 4A, 4B, 5, 6A, 6B, 7A, 7B, 8A and 8B in the form of web pages on network 940 (e.g., implemented using TCP/IP) to user system 910 (or 920), and the user provides the appropriate inputs, as described above, to generate test cases. In general, data representing the various screens and the data exchanged is transferred between server system 950 and user system 910. Thus, various features of the present invention can be implemented on standalone systems as well as networked systems.
10. Conclusion
While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation.
Thus, the breadth and scope of the present invention should not be limited by any of the above described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.

Claims

What Is Claimed Is: 1. A method of generating a plurality of test cases for an application, wherein said plurality of test cases are designed to test said application, said method comprising: enabling, using a computer system, a user to provide a specification of one of a plurality of portions of said application in the form of a flow graph, wherein said flow graph contains at least one loop; and generating, in said computer system, said plurality of test cases based on said flow graph.
2. The method of claim 1, further comprising: traversing, in said computer system, said flow graph to generate a plurality of test paths, wherein said plurality of test cases are generated based on said plurality of test paths.
3. The method of claim 2, further comprising: enabling said user to specify a plurality of input/output variables associated with each of said plurality of test paths; and displaying said plurality of input/output variables associated with each of said plurality of test paths, wherein each of said plurality of combinations of input/output values is provided for a corresponding one of said plurality of input/output variables.
4. The method of claim 3, further comprising: enabling said user to provide a plurality of combinations of input/output values associated with each of said plurality of test paths, wherein said generating generates, for each of said plurality of combinations of input/output values, a corresponding one of said plurality of test cases.
5. The method of claim 4, wherein said enabling said user to specify comprises enabling said user to input variables associated with each node for each of said plurality of test paths.
6. The method of claim 2, wherein said flow graph comprises a plurality of nodes and a plurality of edges, each pair of said plurality of nodes being connected by a corresponding one of said plurality of edges to define a topology, further comprising storing data representing said topology, wherein said traversing comprises examining said data representing said topology.
7. The method of claim 6, further comprising: enabling, using said computer system, said user to indicate a plurality of use cases, each of said plurality of use cases corresponding to a corresponding one of said portions of said application; and enabling, using said computer system, said user to provide a plurality of flow-graphs, wherein each of said plurality of flow-graphs corresponds to a corresponding one of said plurality of use cases, wherein a corresponding plurality of test cases are generated for each of said plurality of use cases.
8. The method of claim 7, wherein a first node, a second node, and a third node respectively comprise a use case, a decision box, and a junction node, wherein said first node, said second node and said third node are comprised in said plurality of nodes, wherein another flow graph is defined for said use case represented by said first node, wherein said junction node receives multiple edges, and said decision box indicates multiple edges on which control is transferred depending on the occurrence of corresponding events.
9. A method of generating a plurality of test cases for an application, wherein said plurality of test cases are designed to test said application, said method comprising: providing a specification of a portion of said application in the form of a flow graph, wherein said flow graph contains at least one loop; and receiving data representing a plurality of test paths in said flow graph, wherein said plurality of test paths are used to generate said plurality of test cases.
10. The method of claim 9, further comprising indicating a use case and specifying said flow-graph as corresponding to said use case such that said generating of said plurality of test cases is integrated with a design phase of said application.
11. The method of claim 9, further comprising: displaying said plurality of test paths; enabling a user to specify a plurality of input variables associated with each of said plurality of test paths; and sending data representing said plurality of input variables.
12. The method of claim 9, further comprising: receiving data representing a plurality of input variables associated with each of said plurality of test paths; and sending a combination of input values corresponding to said plurality of input variables to generate a test.
13. The method of claim 9, wherein said providing and said receiving are performed from a user system and said data representing said plurality of test paths is received from a server system.
14. A method for generating a plurality of test cases for an application/system, wherein said plurality of test cases are designed to test said application/system, said method comprising: enabling, using a computer system, a user to provide a specification of one of a plurality of portions of said application in the form of Use cases; and generating, in said computer system, a flow graph from said Use cases.
15. The method of claim 14, further comprising: extending the hierarchy of said Use cases to said flow graph generated.
16. The method of claim 15, wherein the optimization of generation of said plurality of test cases is based on said hierarchy of said Use cases/said flow graph.
17. The method of claim 14, generating, in said computer system, said plurality of test cases based on said flow graph.
18. The method of claim 17, traversing, in said computer system, said flow graph to generate a plurality of test paths, wherein said plurality of test cases are generated based on said plurality of test paths.
19. The method of claim 18, enabling said user to provide a plurality of combinations of input/output values associated with each of said plurality of test paths, wherein said generating generates, for each of said plurality of combinations of input/output values, a corresponding one of said plurality of test cases.
20. The method of claim 19, further comprising: enabling said user to specify a plurality of input/output variables associated with each of said plurality of test paths; and displaying said plurality of input/output variables associated with each of said plurality of test paths, wherein each of said plurality of combinations of input/output values is provided for a corresponding one of said plurality of input/output variables.
21. The method of claim 18, wherein said flow graph comprises a plurality of nodes and a plurality of edges, each pair of said plurality of nodes being connected by a corresponding one of said plurality of edges to define a topology, further comprising storing data representing said topology, wherein said traversing comprises examining said data representing said topology.
22. The method of claim 17, further comprising automatic generation of test scripts and executables for the plurality of said test cases.
23. The method of claim 15, wherein said enabling said user to specify comprises enabling said user to specify said plurality of input/output variables and attributes of variables associated with each node for each of said flow graph.
24. The method of claim 23, wherein computer generation of said plurality of test cases is based on functional testing techniques.
25. The method of claim 24, further comprising optimizing said plurality of test cases by analyzing said plurality of test cases while generating a plurality of new test cases based on said functional testing techniques.
26. The method of claim 24, wherein test completion criteria is defined based on said plurality of test cases generated based on said functional testing techniques based on a boundary value analysis.
27. The method of claim 24, wherein said test completion criteria is defined based on said plurality of test cases generated based on said functional testing techniques based on an equivalence partitioning.
28. The method of claim 21, wherein said test completion criteria is defined based on said hierarchy of said flow graph.
29. The method of claim 21, wherein said test completion criteria is defined based on traversal of unique paths, branches, nodes, loops, decision boxes and junctions in said flow graph.
30. The method of claim 17, wherein generating a plurality of test cases for said application, wherein said plurality of test cases are designed to test said application, said method comprising: providing a specification of a portion of said application in the form of said flow graph, wherein said flow graph contains at least one loop; and receiving data representing a plurality of test paths in said flow graph, wherein said plurality of test paths are used to generate said plurality of test cases.
31. The method of claim 30, further comprising indicating said use case and specifying said flow-graph as corresponding to said use case such that said generating of said plurality of test cases is integrated with a design phase of said application.
32. The method of claim 30, further comprising: displaying said plurality of test paths; enabling said user to specify said plurality of input/output variables associated with each of said plurality of test paths; and sending data representing said plurality of input/output variables.
33. The method of claim 30, further comprising: receiving data representing said plurality of input/output variables associated with each of said plurality of test paths; and sending a combination of input values corresponding to said plurality of input/output variables to generate a test.
34. The method of claim 30, wherein said providing and said receiving are performed from a user system and said data representing said plurality of test paths is received from a server system.
35. A computer readable medium carrying one or more sequences of instructions for causing a computer system to generate a plurality of test cases for an application, wherein said plurality of test cases are designed to test said application, wherein execution of said one or more sequences of instructions by one or more processors contained in said computer system causes said one or more processors to perform the actions of: enabling, using said computer system, a user to provide a specification of one of a plurality of portions of said application in the form of a flow graph, wherein said flow graph contains at least one loop; and generating, in said computer system, said plurality of test cases based on said flow graph.
36. The computer readable medium of claim 35, further comprising: traversing, in said computer system, said flow graph to generate a plurality of test paths, wherein said plurality of test cases are generated based on said plurality of test paths.
37. The computer readable medium of claim 36, further comprising: enabling said user to specify a plurality of input/output variables associated with each of said plurality of test paths; and displaying said plurality of input/output variables associated with each of said plurality of test paths, wherein each of said plurality of combinations of input/output values is provided for a corresponding one of said plurality of input/output variables.
38. The computer readable medium of claim 37, further comprising: enabling said user to provide a plurality of combinations of input/output values associated with each of said plurality of test paths, wherein said generating generates, for each of said plurality of combinations of input/output values, a corresponding one of said plurality of test cases.
39. A computer readable medium carrying one or more sequences of instructions for causing a computer system to generate a plurality of test cases for an application system, wherein said plurality of test cases are designed to test said application system, wherein execution of said one or more sequences of instructions by one or more processors contained in said computer system causes said one or more processors to perform the actions of: enabling, using said computer system, a user to provide a specification of one of a plurality of portions of said application in the form of Use cases; and generating, in said computer system, a flow graph from said Use cases.
40. The computer readable medium of claim 39, further comprising: extending the hierarchy of said Use cases to said flow graph generated.
41. The computer readable medium of claim 40, wherein the optimization of generation of said plurality of test cases is based on said hierarchy of said Use cases/said flow graph.
42. The computer readable medium of claim 39, generating, in said computer system, said plurality of test cases based on said flow graph.
PCT/IN2004/000145 2003-05-29 2004-05-28 Generating test cases WO2004107087A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US47388303P 2003-05-29 2003-05-29
US60/473,883 2003-05-29

Publications (2)

Publication Number Publication Date
WO2004107087A2 true WO2004107087A2 (en) 2004-12-09
WO2004107087A3 WO2004107087A3 (en) 2005-03-31

Family

ID=33490666

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2004/000145 WO2004107087A2 (en) 2003-05-29 2004-05-28 Generating test cases

Country Status (1)

Country Link
WO (1) WO2004107087A2 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5513118A (en) * 1993-08-25 1996-04-30 Nec Usa, Inc. High level synthesis for partial scan testing
US5808920A (en) * 1996-03-19 1998-09-15 Digital Lightwave, Inc. Communications line test apparatus with an improved graphical user interface
US6292915B1 (en) * 1997-01-22 2001-09-18 Matsushita Electric Industrial Co., Ltd. Method of design for testability and method of test sequence generation
US6330692B1 (en) * 1998-02-18 2001-12-11 Fujitsu Limited Method of determining the route to be tested in a load module test
US6460068B1 (en) * 1998-05-01 2002-10-01 International Business Machines Corporation Fractal process scheduler for testing applications in a distributed processing system
US6401220B1 (en) * 1998-08-21 2002-06-04 National Instruments Corporation Test executive system and method including step types for improved configurability

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9304509B2 (en) 2004-05-06 2016-04-05 Smp Logic Systems Llc Monitoring liquid mixing systems and water based systems in pharmaceutical manufacturing
USRE43527E1 (en) 2004-05-06 2012-07-17 Smp Logic Systems Llc Methods, systems, and software program for validation and monitoring of pharmaceutical manufacturing processes
US9008815B2 (en) 2004-05-06 2015-04-14 Smp Logic Systems Apparatus for monitoring pharmaceutical manufacturing processes
US9092028B2 (en) 2004-05-06 2015-07-28 Smp Logic Systems Llc Monitoring tablet press systems and powder blending systems in pharmaceutical manufacturing
US8491839B2 (en) 2004-05-06 2013-07-23 SMP Logic Systems, LLC Manufacturing execution systems (MES)
US8591811B2 (en) 2004-05-06 2013-11-26 Smp Logic Systems Llc Monitoring acceptance criteria of pharmaceutical manufacturing processes
US8660680B2 (en) 2004-05-06 2014-02-25 SMR Logic Systems LLC Methods of monitoring acceptance criteria of pharmaceutical manufacturing processes
US7471991B2 (en) 2004-05-06 2008-12-30 Smp Logic Systems Llc Methods, systems, and software program for validation and monitoring of pharmaceutical manufacturing processes
US9195228B2 (en) 2004-05-06 2015-11-24 Smp Logic Systems Monitoring pharmaceutical manufacturing processes
US7673261B2 (en) 2007-02-07 2010-03-02 International Business Machines Corporation Systematic compliance checking of a process
US20110016451A1 (en) * 2009-01-15 2011-01-20 Infosys Technologies Limited Method and system for generating test cases for a software application
US8869111B2 (en) * 2009-01-15 2014-10-21 Infosys Limited Method and system for generating test cases for a software application
WO2014088144A1 (en) * 2012-12-05 2014-06-12 경북대학교 산학협력단 Function test device based on unit test case reuse and function test method therefor
CN111124928A (en) * 2019-12-27 2020-05-08 成都康赛信息技术有限公司 Test case design method based on data
CN111158656A (en) * 2019-12-31 2020-05-15 中国银行股份有限公司 Method and device for generating test codes based on fruit tree method
CN111158656B (en) * 2019-12-31 2023-05-02 中国银行股份有限公司 Test code generation method and device based on fruit tree method
CN114721932B (en) * 2021-01-06 2024-04-09 腾讯科技(深圳)有限公司 Data processing method, device, equipment and storage medium
CN114721932A (en) * 2021-01-06 2022-07-08 腾讯科技(深圳)有限公司 Data processing method, device, equipment and storage medium

Also Published As

Publication number Publication date
WO2004107087A3 (en) 2005-03-31

Similar Documents

Publication Publication Date Title
US9983852B2 (en) Graphical specification and constraint language for developing programs for hardware implementation and use
US6321369B1 (en) Interface for compiling project variations in electronic design environments
US7316000B2 (en) Interactive agent for a topological multi-tier business application composer
US9424398B1 (en) Workflows for defining a sequence for an analytical instrument
US8225288B2 (en) Model-based testing using branches, decisions, and options
US7134081B2 (en) Method and apparatus for controlling an instrumentation system
US7516438B1 (en) Methods and apparatus for tracking problems using a problem tracking system
NL2010546C2 (en) Method and apparatus for automatically generating a test script for a graphical user interface.
US8151244B2 (en) Merging graphical programs based on an ancestor graphical program
US7913225B2 (en) Error handling using declarative constraints in a graphical modeling tool
CA2297901C (en) System and method for generating year 2000 test cases
US5907698A (en) Method and apparatus for characterizing static and dynamic operation of an architectural system
US6212666B1 (en) Graphic representation of circuit analysis for circuit design and timing performance evaluation
US20150301806A1 (en) Tentative program code in an editor
US7146572B2 (en) System and method for configuring database result logging for a test executive sequence
US9524366B1 (en) Annotations to identify objects in design generated by high level synthesis (HLS)
US7286953B1 (en) Device testing automation utility and method
WO2004107087A2 (en) Generating test cases
CN116627418A (en) Multi-level form interface visual generation method and device based on recursion algorithm
Kucukcakar et al. Matisse: an architectural design tool for commodity ICs
US20030041311A1 (en) Topological multi-tier business application composer
US10222944B1 (en) Embedding user interface elements in documents containing code
CN109683883A (en) A kind of Flow Chart Design method and device
Kent Test automation: From record/playback to frameworks
Stadie et al. Closing gaps between capture and replay: Model-based gui testing

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DPEN Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101)
122 Ep: pct application non-entry in european phase