PhD Defense: Model-Based Testing of Off-Nominal Behaviors
Christoph Schulze
Tuesday, November 21, 2017, 11:00 am-12:00 pm
Abstract
Off-nominal behaviors (ONBs) are unexpected or unintended behaviors that may be exhibited by a system. They can be caused by implementation and documentation errors and are often triggered by unanticipated external stimuli, such as unforeseen sequences of events, out-of-range data values, or environmental issues. System specifications typically focus on nominal behavior and do not refer to ONBs, their causes, or how the system should respond to them. In addition, untested occurrences of ONBs can compromise the safety and reliability of a system. This can be very dangerous in mission- and safety-critical systems, like spacecraft, where software issues can lead to expensive mission failures, injuries, or even loss of life. In order to ensure the safety of the system, potential causes of ONBs need to be identified, and their handling in the implementation has to be verified and documented.

This thesis describes the development and evaluation of model-based techniques for the identification and documentation of ONBs. Model-Based Testing (MBT) techniques have been used to provide automated support for thorough evaluation of software behavior. In MBT, models are used to describe the system under test (SUT) and to derive test cases for that SUT. The thesis is divided into two parts. The first part develops and evaluates an approach for the automated generation of MBT models and their associated test infrastructure, where the test infrastructure is responsible for executing the test cases generated from the models. The models and the test infrastructure are generated from manual test cases for web-based systems, using a set of heuristic transformation rules and leveraging the structured nature of the SUT. This improvement to the MBT process was motivated by three case studies we conducted that evaluated the effectiveness and efficiency of MBT for identifying ONBs. That experience led us to develop automated approaches to model and test-infrastructure creation, since these were among the most time-consuming tasks associated with MBT.
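
To make the idea of heuristic transformation rules concrete, the sketch below shows one way ordered manual test steps for a web UI could be mapped to a small state-transition model that an MBT tool might traverse. This is an illustrative assumption, not the thesis's actual tooling: the step format, the single page-based rule, and all names are hypothetical.

```python
from collections import defaultdict

# Hypothetical manual test case: an ordered list of (action, target) steps.
manual_test = [
    ("open",   "login_page"),
    ("enter",  "username_field"),
    ("enter",  "password_field"),
    ("click",  "login_button"),
    ("verify", "dashboard_page"),
]

def build_model(steps):
    """Illustrative heuristic rule: 'open'/'verify' targets are pages and
    become model states; the actions observed between two pages are collapsed
    into a single labeled transition between those states."""
    transitions = defaultdict(set)
    state, pending = "START", []
    for action, target in steps:
        if action in ("open", "verify"):      # target is a page -> new state
            label = ", ".join(f"{a} {t}" for a, t in pending) or action
            transitions[state].add((label, target))
            state, pending = target, []
        else:                                 # accumulate into one transition
            pending.append((action, target))
    return transitions

# Print the derived state-transition model.
for src, edges in build_model(manual_test).items():
    for label, dst in sorted(edges):
        print(f"{src} --[{label}]--> {dst}")
```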

The second part of the thesis presents a framework and associated tooling for the extraction and analysis of specifications for identifying and documenting ONBs. The framework infers behavioral specifications, in the form of system invariants, from automatically generated test data using data-mining techniques (e.g., association-rule mining). The framework follows an iterative test -> infer -> instrument -> retest paradigm, in which the initial invariants are refined with additional test data. This work shows how the scalability and accuracy of the resulting invariants can be improved with the help of static data- and control-flow analysis. Other improvements include an algorithm that leverages the iterative process to accurately infer invariants over variables with continuous values. Our evaluations of the framework have shown the utility of such automatically generated invariants as a means of updating and completing system specifications; they are also useful for understanding system behavior, including ONBs.
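
As a rough illustration of the test -> infer -> instrument -> retest loop, the sketch below mines simple association-rule-style invariants ("if var = a then var2 = b") from one round of test observations and then prunes any rule contradicted by data from a later retest round. The data format, variable names, and rule form are assumptions for illustration only, not the framework's actual implementation.

```python
def mine_rules(observations):
    """Candidate invariants: rules (lhs_var, lhs_val) -> (rhs_var, rhs_val)
    that hold on every observation where lhs_var == lhs_val."""
    rules = {}
    variables = sorted({v for obs in observations for v in obs})
    for lhs in variables:
        for rhs in variables:
            if lhs == rhs:
                continue
            for lhs_val in {obs[lhs] for obs in observations if lhs in obs}:
                matching = [obs[rhs] for obs in observations
                            if obs.get(lhs) == lhs_val and rhs in obs]
                if matching and len(set(matching)) == 1:
                    rules[(lhs, lhs_val, rhs)] = matching[0]
    return rules

def refine(rules, new_observations):
    """Drop rules contradicted by data from an instrumented retest round."""
    return {(l, lv, r): rv for (l, lv, r), rv in rules.items()
            if all(obs.get(r) == rv for obs in new_observations
                   if obs.get(l) == lv and r in obs)}

# Round 1: observations from automatically generated tests.
round1 = [{"mode": "safe", "valve": "closed"},
          {"mode": "safe", "valve": "closed"},
          {"mode": "run",  "valve": "open"}]
rules = mine_rules(round1)      # includes mode=safe -> valve=closed

# Round 2: retest after instrumenting the variables in the candidate rules.
round2 = [{"mode": "safe", "valve": "open"}]   # counterexample observed
rules = refine(rules, round2)   # mode=safe -> valve=closed is dropped
print(rules)
```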
 
Examining Committee: 
 
  Chair:        Dr. Rance Cleaveland
  Dean's Rep:   Dr. Shuvra Bhattacharyya
  Members:      Dr. Mikael Lindvall
                Dr. Adam Porter
                Dr. Don Perlis
This talk is organized by Tom Hurst