WO2014174362A1 - Feature model based testing - Google Patents

Feature model based testing

Info

Publication number: WO2014174362A1
Authority: WO (WIPO, PCT)
Prior art keywords: features, test cases, MPFM, perspectives, variation
Application number: PCT/IB2014/000605
Other languages: French (fr)
Inventors: Sachin Patel, Priya Gupta, Vipul Arvind SHAM, Sampatkumar N. DIXIT
Original Assignee: Tata Consultancy Services Limited
Application filed by Tata Consultancy Services Limited
Publication of WO2014174362A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3684 - Test management for test design, e.g. generating new test cases


Abstract

The present subject matter discloses a method for performing software testing based on a Multiple Perspective Feature Model (MPFM). The method includes generating a Feature Model (FM) associated with a Software Product Line (SPL), where the FM includes a plurality of features. Further, separation of concerns (SoC) is achieved in the FM based on identifying at least one source of variation with a common cause of variation. Furthermore, the common cause of variation is determined as a perspective of the FM to generate the MPFM, where each perspective from amongst a plurality of perspectives of the MPFM includes at least one feature from amongst the plurality of features. Further, the method includes identification of test cases based on the plurality of perspectives and the plurality of features of the MPFM, where the plurality of perspectives are parameters and the plurality of features are values.

Description

FEATURE MODEL BASED TESTING
TECHNICAL FIELD
[0001] The present subject matter relates, in general, to testing of software product(s) and particularly to testing of software product(s) based on a Feature Model (FM).
BACKGROUND
[0002] Software systems are continuously modified during the software development process for reasons such as correction of errors, addition of new features, porting to new environments, and improvement of performance. The development of Software Product Lines (SPL) is emerging as a viable and important development paradigm, allowing companies to realize order-of-magnitude improvements in time to market, cost, productivity, quality, reliability, and other business drivers. Further, a SPL allows software customization based on a user's or an organization's preferences.
[0003] Within a SPL, features play an important role in specifying the fixed and variable parts of the architectures of product families and configurable systems. Typically, a Feature Model (FM), which is a representation of the features of the products of the SPL, is generated to determine the valid feature combinations for testing.
[0004] Changes made to the software systems are tested to ensure that the software systems behave as defined and that the modifications have not had an adverse impact on the performance of the software. Software testing is an important part of the life cycle of any software system. During software testing, the developed software is tested to determine if it meets the technical requirements which define its development and design.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and components.
[0006] Fig. 1a illustrates a feature map of a Multi-Perspective Feature Model (MPFM), in accordance with an embodiment of the present subject matter.
[0007] Fig. 1b illustrates components of a software testing system, in accordance with an embodiment of the present subject matter.
[0008] Fig. 2 illustrates a method to perform software testing based on a Multiple Perspective Feature Model (MPFM), in accordance with an embodiment of the present subject matter.
[0009] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
[0010] Method(s) and system(s) to perform software testing based on a Multiple Perspective Feature Model (MPFM) are described. The methods can be implemented in systems that include, but are not limited to, computing devices, such as desktop computers, hand-held devices, laptops or other portable computers, advanced cellular phones, tablets, notebooks, and the like, capable of testing software applications. Although the description herein is with reference to computers, the methods and systems may be implemented in other computing devices and systems as well, albeit with a few variations, as will be understood by a person skilled in the art.
[0011] The increasingly competitive business environment of today has made it imperative to run businesses in a cost-effective and customer-centric manner. The drive to cut costs and to adapt to customer requirements has led to customization of software. The target of reducing costs and resource dependency while increasing reliability has led to the development of the Software Product Line (SPL), the ultimate goal being to develop software that is in line with the requisites of a user or an organization. For example, Enterprise Resource Planning (ERP) packages that integrate information across an entire organization are customized for every organization. Further, internationalized software is customized for every country to suit the requisites of the organization in the respective countries. Furthermore, consumer software products are personalized as per user preferences. Although the basic functionality of the customized packages is similar to that of a core package, the number of variations and the dimensions of variation in customized packages tend to be high.
[0012] Since the software product is customizable, there is a possibility that multiple variations in the working of the SPL may occur due to associations among features of the software. Managing variability in the development of the SPL contributes to the success of the developed software. Feature Modeling is one popular technique to model variability in the SPL. Feature Modeling is a tool to develop Feature Models (FMs), where the developed FMs represent a product in a SPL by means of features. Generally, inclusion of feature(s) in the SPL results in interactions among the features, thus varying the behavior of other features. Also, in a real world system, developing a FM for complex systems, such as the entire SPL, is a tedious and cumbersome task.
[0013] More often than not, SPLs once developed are tested to validate their correctness and to ensure that a desired output is generated. This implies that when a new feature is included or changed, it has to be ensured that the new feature works in all configurations in conjunction with all the other variations that each configuration might have. In an illustrative example, inclusion of a feature like currency in an internationalized website can affect several other features, like Stock Quote and Gold Rate Chart, which are affected by the change in currency. Also, the new feature has to be tested in combination with other features in different browsers, like Internet Explorer, Google Chrome, and Firefox; different Operating Systems (OS), such as Windows, Linux, and Android; and different devices, such as Desktop, Tablet, and Smartphone. The feature interactions in a FM are highly convoluted in a real world SPL, and testing them can prove to be a formidable task where the number of variations and configurations is large.
[0014] Also, since the number of feature interactions is large, it may not be possible to test all the feature interactions in a limited time period. The manual process of identifying feature interactions and testing the identified feature interactions poses multiple issues, such as inability to achieve coverage, difficulty in prioritizing, dependency on experienced testers, and difficulty in measuring test coverage. Further, the distributed nature of teams in the software service industry poses additional challenges, such as lack of access to software artifacts, difficulty in analyzing change impact, and insufficient domain understanding. As variability is geared more towards software, and as more features are being included within a single product line, testing based on traditional feature modeling is difficult to implement.
[0015] According to an implementation of the present subject matter, methods and systems to perform software testing based on a Multiple Perspective Feature Model (MPFM) are described. The present subject matter relates to modularization of the FM by performing Separation of Concerns (SoC) in the FM to obtain the MPFM. In operation, according to an implementation of the present subject matter, the Feature Model (FM) associated with a SPL is generated, wherein the FM is representative of the features of the SPL. In said implementation, the generation of a FM may be achieved by means of tools existing in the state of the art, for example, a feature modeler.
[0016] The generated FM associated with the SPL may have variability information scattered all over the FM. Therefore, Separation of Concerns (SoC) in the FM associated with the SPL is performed in order to modularize the FM. In one implementation, the SoC in the FM is based on identifying a source of variation with a common cause of variation. In said implementation, the features that are sources of variations may have a common cause of variation. For example, a FM for an internationalized website may include Stock Quote and Gold Rate Chart as features under a domain perspective. The FM may also include Indian Rupee, Euro, and Pound as sub-features of the features Stock Quote and Gold Rate Chart. In such a scenario, the features Stock Quote and Gold Rate Chart vary due to the sub-features Indian Rupee, Euro, and Pound in the website. Therefore, in order to modularize the FM, SoC is performed, where "currency" may be identified as a common cause of variation for the features Stock Quote and Gold Rate Chart, and Indian Rupee, Euro, and Pound may be identified as sources of variation.
[0017] In one implementation of the present subject matter, the determined common cause of variation in the FM may be identified as a perspective. For instance, in the above described example, the common cause of variation 'currency' may be identified as a new perspective of the FM. Since the SPL may implement a plurality of such perspectives, where multiple new perspectives may be identified based on SoC, the generated FM is referred to as a MPFM. In one implementation, each identified perspective is further modeled to one or more features.
[0018] Furthermore, test cases may also be identified based on the perspectives and the features of the MPFM. In one implementation, the identification of test cases is based on perspective selection parameters. For example, let us consider that a MPFM includes perspectives, such as domain, internationalization, architecture, and operating environment, and features, such as Weather Update, Celsius, Blogs, Articles, 3-column page layout, 2-column page layout, Internet Explorer, Windows, Linux, and Firefox. If a test case for a user planning on testing Weather Update in Celsius, in a 3-column page layout, on an Internet Explorer browser, on a device running a Windows Operating System has to be generated, the test case can be identified as [Weather Update, Celsius, 3-column, Internet Explorer, Windows]. Such a selection of test cases based on perspectives and features may allow effective testing of the SPL.
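By way of illustration only, the SoC of paragraphs [0016] and [0017] may be sketched as follows. The dictionary-based FM representation and the helper name `extract_perspective` are assumptions made for this sketch; they are not prescribed by the present subject matter.

```python
# A minimal sketch of Separation of Concerns (SoC): sub-features shared by
# every source feature of a perspective form a common cause of variation;
# they are removed from the source features and promoted to a perspective
# of their own. `extract_perspective` is a hypothetical helper.
fm = {
    "domain": {
        "Stock Quote": {"Indian Rupee", "Euro", "Pound"},
        "Gold Rate Chart": {"Indian Rupee", "Euro", "Pound"},
    },
}

def extract_perspective(fm, perspective, new_name):
    """Promote the common cause of variation of `perspective` to a new
    perspective called `new_name`."""
    features = fm[perspective]
    shared = set.intersection(*features.values())
    if shared:
        for name in features:
            features[name] -= shared  # variability moves out of the features
        fm[new_name] = sorted(shared)  # shared sub-features become features
    return fm

mpfm = extract_perspective(fm, "domain", "currency")
# 'currency' is now a perspective of its own, mirroring [0016]-[0017]:
print(mpfm["currency"])  # ['Euro', 'Indian Rupee', 'Pound']
```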
[0019] The implementation of the described systems and methods of the present subject matter may provide an alternative way to model variations and measure test coverage. The present subject matter leverages the modularization of redundant features by creating a MPFM, thus creating a model that is modular and easy to maintain. Further, since the described method uses the MPFM in testing by generating pair-wise combinations, exhaustive coverage of possible contexts and a notion of complete testing are available. Furthermore, prioritization of the test cases helps software testers sort the features based on their importance and arrive at a minimum set of tests that provides maximum coverage of variability, thereby reducing the testing effort.
[0020] It should be noted that the description merely illustrates the implementation of principles of the present subject matter. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described herein, embody the principles of the present subject matter and are included within its spirit and scope. Furthermore, all examples recited herein are principally intended expressly to be only for pedagogical purposes to aid the reader in understanding the principles of the present subject matter and the concepts contributed by the inventor(s) to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the present subject matter, as well as specific examples thereof, are intended to encompass equivalents thereof.
[0021] The manner in which the systems and methods shall be implemented has been explained in detail with respect to Figs. 1a, 1b and 2. While aspects of the described systems and methods can be implemented in any number of different computing systems, transmission environments, and/or configurations, the embodiments are described in the context of the following system(s).
[0022] Fig. 1a illustrates a feature map of a Multi-Perspective Feature Model (MPFM) 100, in accordance with an embodiment of the present subject matter. The MPFM 100 is representative of a plurality of perspectives 102-1, 102-2, 102-3 and 102-4, collectively referred to as perspectives 102. Each of these perspectives 102 may be further associated with features 104-1, 104-2, ..., 104-12, collectively referred to as features 104. The features 104 may further include sub-features 106-1, 106-2, 106-3, ..., 106-22, collectively referred to as sub-features 106. In said implementation, the generation of a FM may be achieved by means of tools existing in the state of the art, for example, a feature modeler. The feature map mentioned herein shall be used for the purposes of explanation of Fig. 1b.
[0023] Fig. 1b illustrates components of a software testing system 152, in accordance with an embodiment of the present subject matter. In one implementation, the software testing system 152 includes processor(s) 154. The processor 154 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor(s) 154 may fetch and execute computer-readable instructions stored in the memory.
[0024] In another embodiment of the present subject matter, the software testing system 152 may also include a memory 158. The memory 158 may be coupled to the processor 154. The memory 158 can include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
[0025] Further, the software testing system 152 may include module(s) 160 and data 162. The modules 160 and the data 162 may be coupled to the processors 154. The modules 160, amongst other things, include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. The modules 160 may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions.
[0026] In an implementation, the module(s) 160 include a modeling module 164, a modularizing module 166, a validating module 168, a testing module 170, a ranking module 172 and other module(s) 174. The other module(s) 174 may include programs or coded instructions that supplement applications or functions performed by the software testing system 152. In said implementation, the data 162 includes modular data 176, test data 178, and other data 180. The other data 180, amongst other things, may serve as a repository for storing data that is processed, received, or generated as a result of the execution of one or more modules in the module(s) 160. Although the data 162 is shown internal to the software testing system 152, it may be understood that the data 162 can reside in an external repository (not shown in the figure), which may be coupled to the software testing system 152. The software testing system 152 may communicate with the external repository through the interface(s) 156 to obtain information from the data 162.
[0027]" The software- testing system lT52~facilitates software application testing based on Multiple Perspective Feature Model (MPFM). In an implementation, the modeling module 164 of the software testing system 152 is to generate a Feature Model (FM) associated with a Software product Line (SPL), where the generated FM is representative of a plurality of perspectives 102, features 104 and sub-features 106. In said implementation, the generation of the FM may be performed by existing tools available in the state of the art, such as a feature modeler. In an illustrative example, the FM for an internationalized website (Site X) may contain a domain perspective 102-1. The domain 102-1 can further include features like Stock Quote 104-2 and Gold Rate chart 104-4 and each of these features 104 can include further sub-features 106 like Indian Rupee 106 -3, Dollar 106-4 and Pound 106-5. In another example, the FM for a website (Site Z) may contain Operating Environment 102-4 as a perspective. The Operating Environment 102-4 may further include features 106 like browser 104-10, Operating System (OS) 104- 1 1 and device 104-12. Each of these features may include further sub- features 106, such as the browser 104-10 may include sub-features 106 like Internet Explorer (IE) 106-14, Opera 106-16, Firefox 106-15 and different configurations of each the listed sub-features like IE 7, IE 8 and so on. The OS may further include sub-features like Windows 106-17, Linux 106- 18 and Android 106-19.
[0028] According to an implementation of the present subject matter, the modularizing module 166 of the software testing system 152 may modularize the generated FM by performing Separation of Concerns (SoC) in the FM. In said implementation, the SoC may be performed based on identifying a common cause of variation for a source of variation. In said implementation, the source of variation is considered as a relationship between a source feature and a target feature, such that the source feature is the feature of the perspective that is implemented with many variations and the target features are the variations of the source feature.
[0029] In continuation with the previous example on the FM for the internationalized website (Site X), where the Rupee 106-3, Dollar 106-4 and Pound 106-5 are representative sub-features 106 of the features Stock Quote 104-2 and Gold Rate Chart 104-4, the features 104 may be referred to as the source features, while the sub-features 106 may be referred to as the target features. One source of variation here is the relationship between the source feature, i.e., Stock Quote 104-2, and the target features, i.e., the Rupee 106-3, Dollar 106-4 and Pound 106-5. Similarly, another source of variation is the relationship between the source feature, i.e., Gold Rate Chart 104-4, and the target features, i.e., the Rupee 106-3, Dollar 106-4 and Pound 106-5. In the described scenario, the software testing system 152 may further modularize the generated FM by identifying 'currency' 104-6 as the common cause of variation.
[0030] In another implementation, the modularizing module 166 of the software testing system 152 may determine the common cause of variation as a perspective to generate a Multiple Perspective Feature Model (MPFM). In said implementation, each perspective 102 of the MPFM 100 includes one or more features 104. In one implementation, the MPFM thus generated is stored in the modular data 176. In continuation with the previous example, 'currency' 104-6, which is identified as the common cause of variation, is determined as a perspective of the FM for Site X by the modularizing module 166. The perspective currency 104-6 may further be linked to the sub-features the Rupee 106-3, Dollar 106-4 and Pound 106-5.
[0031] Further, test cases are identified based on utilizing perspectives 102 as parameters and one or more features 104 from the perspectives 102 as values. In one implementation, the validating module 168 may identify test cases, validate the identified test cases and store the validated test cases in the test data 178. In said implementation, the identification of the test cases is based on perspective selection parameters. Consider an illustrative example, where a MPFM of a SPL consists of perspectives like Operating System, Browser, Protocol, Central Processing Unit (CPU) and Data Base Management Systems (DBMS), and each of these perspectives includes one or more features as listed below. The validating module 168 may identify test cases based on utilizing perspectives as parameters and features as values. The various perspectives and the features under each of the perspectives may include an operating system (Windows, iOS, Android), a browser (Internet Explorer, Firefox), a protocol stack (IPv4, IPv6), a processor (Intel, AMD), and a database (MySQL, Sybase, Oracle). 10 test cases that may cover all pair-wise combinations of these features may be identified, and the cases are listed below in Table 1.
[Table 1 image not reproduced: the 10 identified test cases over the Operating System, Browser, Protocol, CPU and DBMS perspectives.]
Table 1: Test cases
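For illustration, the identification of test cases with perspectives as parameters and features as values may be sketched as below. The MPFM restates the example of paragraph [0031]; the helper `identify_test_case` is hypothetical and not an API defined by the present subject matter.

```python
# A sketch of test-case identification ([0031]): each perspective of the
# MPFM is a parameter and its features are the admissible values.
mpfm = {
    "Operating System": ["Windows", "iOS", "Android"],
    "Browser": ["Internet Explorer", "Firefox"],
    "Protocol": ["IPv4", "IPv6"],
    "CPU": ["Intel", "AMD"],
    "DBMS": ["MySQL", "Sybase", "Oracle"],
}

def identify_test_case(mpfm, selection):
    """Build a test case by picking one feature (value) for every
    perspective (parameter) of the MPFM."""
    case = []
    for perspective, features in mpfm.items():
        feature = selection[perspective]
        if feature not in features:
            raise ValueError(f"{feature!r} is not a feature of {perspective!r}")
        case.append(feature)
    return case

# Test case 1 of Table 1:
print(identify_test_case(mpfm, {
    "Operating System": "Windows",
    "Browser": "Internet Explorer",
    "Protocol": "IPv4",
    "CPU": "Intel",
    "DBMS": "MySQL",
}))  # ['Windows', 'Internet Explorer', 'IPv4', 'Intel', 'MySQL']
```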
[0032] Further, the test cases once identified may be validated by the validating module 168 of the software testing system 152, where every feature of an identified test case may be compared to the features of a product configuration to generate constraints such that only valid test cases are retained. An exact matching of the features of a test case to the features of the product configuration is indicative of a valid test case. As an example, let us consider test case 1 as listed in Table 1. The test case is identified to be [Windows, Internet Explorer, IPv4, Intel, MySQL]. Now, each feature of the test case, namely Windows, Internet Explorer, IPv4, Intel and MySQL, is mapped to the features of the product configuration. The existence of the features in the product configuration is indicative of a valid test case, and further steps can be taken to ensure that all possible pair-wise combinations of features in the validated test case are tested. However, if the features of the identified test case do not exist in the product configuration, the test case is determined to be invalid and further steps to perform testing of such an invalid test case are aborted. For example, in a situation where the Android Operating System and iOS in the product configuration do not support the Internet Explorer browser, test cases 5 and 7 may be discarded as these test cases are deemed to be invalid.
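A minimal sketch of this validation step follows. Encoding the product configuration as a feature set plus excluded feature pairs is an assumption of the sketch; the text only requires that the features of a test case be matched against the product configuration.

```python
# A sketch of validation against a product configuration ([0032]).
product_features = {
    "Windows", "iOS", "Android", "Internet Explorer", "Firefox",
    "IPv4", "IPv6", "Intel", "AMD", "MySQL", "Sybase", "Oracle",
}
# e.g. Android and iOS do not support the Internet Explorer browser:
excluded_pairs = [{"Android", "Internet Explorer"}, {"iOS", "Internet Explorer"}]

def is_valid(test_case):
    """A test case is valid only if all of its features exist in the
    product configuration and no excluded pair occurs together."""
    features = set(test_case)
    if not features <= product_features:
        return False
    return not any(pair <= features for pair in excluded_pairs)

print(is_valid(["Windows", "Internet Explorer", "IPv4", "Intel", "MySQL"]))  # True
print(is_valid(["Android", "Internet Explorer", "IPv6", "AMD", "Oracle"]))   # False
```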
[0033] Further, a combinatorial test of the validated test cases may be performed, wherein the combinatorial test includes generation of pair-wise combinations of features of the validated test cases to ensure that all possible combinations in the validated test cases are tested. In one implementation, the testing module 170 of the software testing system 152 may test feature interactions by generating combinations of the features of the MPFM. In said implementation, the generated combinations are pair-wise combinations, and the generated combinations are tested by passing each perspective of the MPFM as a parameter and each feature of the perspective of the MPFM as a value. In continuation with the previous example, if all the 10 test cases listed in Table 1 are tested, pair-wise coverage of the feature interactions can be ensured without exhaustively testing all 3 × 2 × 2 × 2 × 3 = 72 possible combinations.
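The pair-wise coverage criterion behind this paragraph may be checked as sketched below: every pair of features drawn from two different perspectives should appear together in at least one validated test case. How the 10 cases of Table 1 are chosen is not detailed in the text; this sketch only verifies a given set.

```python
# A sketch of a pair-wise coverage check ([0033]).
from itertools import combinations, product

mpfm = {
    "Operating System": ["Windows", "iOS", "Android"],
    "Browser": ["Internet Explorer", "Firefox"],
    "Protocol": ["IPv4", "IPv6"],
    "CPU": ["Intel", "AMD"],
    "DBMS": ["MySQL", "Sybase", "Oracle"],
}

def uncovered_pairs(mpfm, test_cases):
    """Return the cross-perspective feature pairs not exercised by any
    test case; an empty result means full pair-wise coverage."""
    required = set()
    for p, q in combinations(mpfm, 2):
        required |= {frozenset(t) for t in product(mpfm[p], mpfm[q])}
    covered = set()
    for case in test_cases:
        covered |= {frozenset(t) for t in combinations(case, 2)}
    return required - covered

cases = [
    ["Windows", "Internet Explorer", "IPv4", "Intel", "MySQL"],
    ["iOS", "Firefox", "IPv6", "AMD", "Sybase"],
]
print(len(uncovered_pairs(mpfm, cases)))  # 37: two cases leave pairs untested
```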
[0034] Furthermore, the validated test cases may be prioritized based on statistical parameters. In one implementation, the ranking module 172 of the software testing system 152 may prioritize test cases based on statistical parameters. In said implementation, the statistical parameters include one of a probability of error and a probability of usage. In an illustrative example, a social networking application might have to be tested in different browsers, like Internet Explorer, Firefox and Google Chrome, and different configurations of each of the browsers; on devices like tablets, desktops, and the like; on different screens with different resolutions; and with different page layouts, portals, languages, time zones and so forth. It may not be practically possible to perform testing under all conditions. Under such scenarios, it may be helpful for a software tester to prioritize the test cases.
[0035] In said implementation, testing based on the probability of usage suggests prioritizing the testing of those features that are widely used by people of a particular geographic location over the testing of less used features. For example, consider a scenario where a particular social networking application has to be tested on different Operating Systems, like the Windows Operating System, the Blackberry Operating System, the Android Operating System and iOS. If this particular application is to be launched in a country where the usage of the Windows Operating System and the Android Operating System is wide, the features of the application are first tested on the Windows Operating System and the Android Operating System prior to testing the features of the application on the Blackberry Operating System and iOS.
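As a non-limiting sketch, prioritization by the statistical parameters of paragraphs [0034]-[0036] may look as follows. The per-case probabilities and the weighted-sum scoring rule are illustrative assumptions; the error-based criterion is discussed in the paragraph that follows.

```python
# A sketch of ranking validated test cases by probability of usage and
# probability of error; the numbers and the scoring rule are assumptions.
validated_cases = [
    # (features, probability_of_usage, probability_of_error)
    (["Windows", "Internet Explorer", "IPv4", "Intel", "MySQL"], 0.60, 0.10),
    (["Android", "Firefox", "IPv6", "AMD", "Oracle"], 0.25, 0.40),
    (["iOS", "Firefox", "IPv4", "Intel", "Sybase"], 0.15, 0.05),
]

def prioritize(cases, usage_weight=0.5, error_weight=0.5):
    """Rank test cases so that widely used and historically error-prone
    feature combinations are tested first."""
    return sorted(
        cases,
        key=lambda c: usage_weight * c[1] + error_weight * c[2],
        reverse=True,
    )

for features, p_use, p_err in prioritize(validated_cases):
    print(f"{p_use:.2f} {p_err:.2f} {features}")
```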
[0036] In another implementation, prioritizing based on the probability of error suggests testing those features that have failed to generate the desired output prior to testing those features that have generated an expected output, based on the performance history of the features in past test cases. For example, consider a feature like the ability to open multiple tabs in a browser. If this particular feature has to be tested in different browsers like Internet Explorer, Firefox, Chrome, etc., it is first tested on those browsers where the ability to open multiple tabs has failed repeatedly in past test cases.
[0037] Further, the prioritized test cases are tested to determine if the developed SPL generates the desired output. In one implementation, the testing module 170 may test the prioritized test cases based on a combinatorial testing method to determine the success of the developed SPL.
[0038] Fig. 2 illustrates a method 200 to generate a Multiple Perspective Feature Model (MPFM) and utilize the MPFM in software application testing, according to an embodiment of the present subject matter. The order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method, or any alternative methods. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware.
[0039] The method may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. The method may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0040] A person skilled in the art will readily recognize that steps of the method can be performed by programmed computers. Herein, some embodiments are also intended to cover program storage devices, for example, digital data storage media, which are machine or computer readable and encode machine-executable or computer-executable programs of instructions, where said instructions perform some or all of the steps of the described method. The program storage devices may be, for example, digital memories, magnetic storage media, such as magnetic disks and magnetic tapes, hard drives, or optically readable digital data storage media. The embodiments are also intended to cover both communication networks and communication devices to perform said steps of the described method.
[0041] Referring to Fig. 2, at block 202, a Feature Model (FM) associated with a Software Product Line (SPL) is generated, wherein the FM is representative of the features of the SPL. In one implementation of the present subject matter, the FM is generated by means of tools existing in the state of the art. In another implementation, the modeling module 164 may generate the FM based on the features of the SPL.
[0042] At block 204, Separation of Concerns (SoC) is performed in the FM, wherein the SoC is performed by identifying at least one source of variation with a common cause of variation. In one implementation, the sources of variation are identified as those features that may be implemented with several variations. In said implementation, the sources of variation that have a common cause of variation are separated out as concerns. In another implementation, the modularizing module 166 may perform SoC in the FM to generate a model that is modular and easy to maintain.
[0043] At block 206, the common cause of variation that was identified as a concern is determined to be a perspective. Since several such concerns are identified in a real world system for a SPL, several perspectives are determined, and the model thus generated is a Multiple Perspective Feature Model (MPFM). In one implementation, each perspective of the MPFM further includes at least one feature. In said implementation, the modularizing module 166 may generate the MPFM from the FM associated with the SPL.
[0044] At block 208, test cases are identified based on the perspectives and one or more features of the MPFM, where the identification of the test cases is based on utilizing the perspectives as parameters and the features from the perspectives as values. In said implementation, the test cases may be identified based on perspective selection parameters, and the identified test cases are further validated by comparing the features of the identified test cases to the features of the product configuration. Furthermore, the existence of the features of the identified test cases may be verified based on a comparison to the features of the product configuration to determine the validity of the identified test cases.
[0045] Although embodiments for methods and systems to perform software testing based on the MPFM are described, it is to be understood that the present subject matter is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as embodiments to perform software testing based on the MPFM.

Claims

I/We claim:
1. A method for testing a Software Product Line (SPL), the method comprising:
generating a Feature Model (FM) associated with the SPL, wherein the FM includes a plurality of features;
achieving a separation of concerns (SoC) in the FM based on identifying at least one source of variation with a common cause of variation, wherein the achieving is performed by a processor (154);
determining the common cause of variation as a perspective of the FM to generate a Multiple Perspective Feature Model (MPFM), wherein each perspective from amongst a plurality of perspectives of the MPFM includes at least one feature from amongst the plurality of features, and wherein the MPFM is stored in modular data (176); and
identifying test cases based on the plurality of perspectives and the plurality of features of the MPFM, wherein the plurality of perspectives are utilized as parameters and the plurality of features are utilized as values, and wherein the identified test cases are stored in test data (178).
2. The method as claimed in claim 1, further comprising validating the identified test cases to generate validated test cases, wherein the validated test cases represent a valid set of features.
3. The method as claimed in claim 2, wherein the validating comprises:
comparing features of the identified test cases to features of a product configuration; and
verifying existence of the features of the identified test cases in the features of the product configuration to validate the identified test cases, wherein an existence of the features is indicative of a valid test case.
4. The method as claimed in claim 2, wherein the method further comprises prioritizing the validated test cases based on statistical parameters.
5. The method as claimed in claim 4, wherein the statistical parameters include one of probability of error and probability of usage.
6. The method as claimed in claim 4, further comprising testing the prioritized test cases based on combinatorial testing method.
7. A Software Testing System (152) for testing a SPL, comprising:
a processor (154);
a modeling module (164) coupled to the processor (154), to generate a FM associated with the SPL, wherein the FM includes a plurality of features;
a modularizing module (166) coupled to the processor (154) to: achieve separation of concerns (SoC) in the FM based on identifying at least one source of variation with a common cause of variation; and determine the common cause of variation as a perspective of the FM to generate a Multiple Perspective Feature Model (MPFM), wherein each perspective from amongst a plurality of perspectives of the MPFM includes at least one feature from amongst the plurality of features; and
a validating module (168) coupled to the processor (154), to identify test cases based on the plurality of perspectives and the plurality of features of the MPFM, wherein the plurality of perspectives are utilized as parameters and the plurality of features are utilized as values.
8. The Software Testing System (152) as claimed in claim 7, wherein the validating module (168) further validates the identified test cases to generate validated test cases, and wherein the validated test cases represent a valid set of features.
9. The Software Testing System (152) as claimed in claim 8, wherein the validating module (168) validates the identified test cases based on:
comparing features of the identified test cases to features of a product configuration; and
verifying existence of the features of the identified test cases in the features of the product configuration to validate the test cases, wherein an existence of the features is indicative of a valid test case.
10. The Software Testing System (152) as claimed in claim 7, further comprising a ranking module (172) coupled to the processor (154), to prioritize the validated test cases based on statistical parameters.
11. The Software Testing System (152) as claimed in claim 10, wherein the statistical parameters include one of probability of error and probability of usage.
12. The Software Testing System (152) as claimed in claim 10, further comprising a testing module (170) coupled to the processor (154) to test the prioritized test cases based on a combinatorial testing method.
13. A non-transitory computer readable medium having a set of computer readable instructions that, when executed, cause a computing system to:
generate a Feature Model (FM) associated with a Software Product Line (SPL), wherein the FM includes a plurality of features;
achieve separation of concerns (SoC) in the FM based on identifying at least one source of variation with a common cause of variation;
determine the common cause of variation as a perspective of the FM to generate a Multiple Perspective Feature Model (MPFM), wherein each perspective from amongst a plurality of perspectives of the MPFM includes at least one feature from amongst the plurality of features; and
identify test cases based on the plurality of perspectives and the plurality of features of the MPFM, wherein the plurality of perspectives are parameters and the plurality of features are values.
PCT/IB2014/000605 2013-04-25 2014-04-24 Feature model based testing WO2014174362A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN1526/MUM/2013 2013-04-25
IN1526MU2013 IN2013MU01526A (en) 2013-04-25 2014-04-24

Publications (1)

Publication Number Publication Date
WO2014174362A1 (en) 2014-10-30

Family

ID=51791129

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/000605 WO2014174362A1 (en) 2013-04-25 2014-04-24 Feature model based testing

Country Status (2)

Country Link
IN (1) IN2013MU01526A (en)
WO (1) WO2014174362A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5652835A (en) * 1992-12-23 1997-07-29 Object Technology Licensing Corp. Method and apparatus for generating test data for an automated software testing system
US20080313501A1 (en) * 2007-06-14 2008-12-18 National Tsing Hua University Method and system for assessing and analyzing software reliability
US20090018811A1 (en) * 2007-07-09 2009-01-15 International Business Machines Corporation Generation of test cases for functional testing of applications

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107002662A (en) * 2014-11-05 2017-08-01 Avl里斯脱有限公司 Method and apparatus for running pump
CN107002662B (en) * 2014-11-05 2019-02-19 Avl里斯脱有限公司 Method and apparatus for running pump
CN106021107A (en) * 2016-05-19 2016-10-12 浪潮电子信息产业股份有限公司 Modularized test case distribution method
EP3299963A1 (en) * 2016-09-26 2018-03-28 Wipro Limited Methods and systems for generating test scenarios to test applications using human senses
US10255169B2 (en) 2016-09-26 2019-04-09 Wipro Limited Testing applications using application features grouped into categories of human senses
WO2019104917A1 (en) * 2017-11-29 2019-06-06 平安科技(深圳)有限公司 Fund system test case testing method, device and equipment, and storage medium

Also Published As

Publication number Publication date
IN2013MU01526A (en) 2015-04-10

Similar Documents

Publication Publication Date Title
US10565097B2 (en) Orchestrating and providing a regression test
US10353913B2 (en) Automating extract, transform, and load job testing
US10713690B2 (en) Configurable relevance service test platform
JP6518794B2 (en) Plug-in packaging method, apparatus and terminal
US10366112B2 (en) Compiling extract, transform, and load job test data cases
US11816190B2 (en) Systems and methods to analyze open source components in software products
US9612946B2 (en) Using linked data to determine package quality
US20160253172A1 (en) Indicating a trait of a continuous delivery pipeline
CN109656917A (en) Data detection method, device, equipment and the readable storage medium storing program for executing of multi-data source
EP3616066A1 (en) Human-readable, language-independent stack trace summary generation
WO2014174362A1 (en) Feature model based testing
CN106873960A (en) The update method and equipment of a kind of application software
JP7155626B2 (en) Field device commissioning system and field device commissioning method
CN110019660A (en) A kind of Similar Text detection method and device
CN111797312A (en) Model training method and device
JP2017174418A (en) Data structure abstraction for model checking
CN111813739A (en) Data migration method and device, computer equipment and storage medium
US20150287059A1 (en) Forecasting device return rate
CN112541688B (en) Service data verification method and device, electronic equipment and computer storage medium
US20160019564A1 (en) Evaluating device readiness
Markiegi et al. Dynamic test prioritization of product lines: An application on configurable simulation models
US10176276B2 (en) Determining an optimal global quantum for an event-driven simulation
CN109376285A (en) Data sorting verification method, electronic equipment and medium based on json format
CN114238129A (en) Method, device and equipment for generating interface data and storage medium
CN109697141B (en) Method and device for visual testing

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14788748

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14788748

Country of ref document: EP

Kind code of ref document: A1