A paper accepted at ICSE 2018

We are glad to announce that our ongoing collaboration with the Software Institute at Nanjing University has resulted in a paper accepted at ICSE 2018, a premier conference on software engineering.

Title: Synthesizing Qualitative Research in Software Engineering: A Critical Review

Abstract:

Context: Synthesizing data extracted from primary studies is an integral part of the methodologies in support of Evidence-Based Software Engineering (EBSE), such as Systematic Literature Review (SLR). Since a large and increasing number of studies in Software Engineering (SE) incorporate qualitative data, it is important to systematically review and understand different aspects of the Qualitative Research Synthesis (QRS) methods being used in SE.

Method: We have reviewed the use of QRS methods in 328 SLRs published between 2005 and 2015. We also contacted the authors of 274 SLRs to confirm whether or not any QRS methods had been used in their respective reviews; 116 of them provided responses, which were included in our analysis.

Results: We found eight QRS methods applied in SE research, two of which, narrative synthesis and thematic synthesis, have been predominantly adopted by SE researchers for synthesizing qualitative data.

Conclusion: Our study reveals a significant amount of missing knowledge and incomplete understanding of the defined QRS methods in the community. Our effort also identifies an initial set of factors that may influence the selection and use of appropriate QRS methods in SE.

A paper published in the Journal of Network and Computer Applications

CREST researchers (in collaboration with Security Lancaster) published a paper in the Journal of Network and Computer Applications.

Paper title: Data Exfiltration: A Review of External Attack Vectors and Countermeasures

Authors: Faheem Ullah, Matthew Edwards, Rajiv Ramdhany, Ruzanna Chitchyan, M. Ali Babar, Awais Rashid

Abstract

Context: One of the main targets of cyber-attacks is data exfiltration, which is the leakage of sensitive or private data to an unauthorized entity. Data exfiltration can be perpetrated by an outsider or an insider of an organization. Given the increasing number of data exfiltration incidents, a large number of data exfiltration countermeasures have been developed. These countermeasures aim to detect, prevent, or investigate exfiltration of sensitive or private data. With the growing interest in data exfiltration, it is important to review data exfiltration attack vectors and countermeasures to support future research in this field.

Objective: This paper is aimed at identifying and critically analysing data exfiltration attack vectors and countermeasures in order to report the state of the art and determine gaps for future research.

Method: We have followed a structured process for selecting 108 papers from seven publication databases. The thematic analysis method has been applied to analyse the data extracted from the reviewed papers.

Results: We have developed a classification of (1) the data exfiltration attack vectors used by external attackers and (2) the countermeasures in the face of external attacks. We have mapped the countermeasures to attack vectors. Furthermore, we have explored the applicability of various countermeasures for different states of data (i.e., in use, in transit, or at rest).

Conclusion: This review has revealed that (a) most of the state of the art is focussed on preventive and detective countermeasures, and significant research is required on developing investigative countermeasures, which are equally important; (b) several data exfiltration countermeasures are not able to respond in real time, which indicates that research efforts need to be invested to enable them to respond in real time; (c) a number of data exfiltration countermeasures do not take privacy and ethical concerns into consideration, which may become an obstacle to their full adoption; (d) existing research is primarily focussed on protecting data in the ‘in use’ state; therefore, future research needs to be directed towards securing data in the ‘at rest’ and ‘in transit’ states; and (e) there is no standard or framework for the evaluation of data exfiltration countermeasures. We assert the need for developing such an evaluation framework.

A paper published in the Journal of Systems and Software (JSS)

Paper Title
An empirical investigation of the influence of persona with personality traits on conceptual design

Authors

Farshid Anvari, Deborah Richards, Michael Hitchens, Muhammad Ali Babar, Hien Minh Thi Tran, Peter Busch

Journal of Systems and Software, Volume 134, December 2017, Pages 324–339
Abstract

Persona, an archetypical user, is increasingly becoming a popular tool for Software Engineers to design and communicate with stakeholders. A persona is a representative of a class of end users of a product or service. However, the majority of personas presented in the literature do not take into consideration that the personality of users affects the way they interact with a product or service. This study empirically explores variations in conceptual design based on the personality of a persona. We carried out two studies in Australia and one study in Denmark. We presented four personas with different personalities to 91 participants who collectively completed 218 design artifacts. The results from the studies indicate that the participants’ views and prioritization of the needs and system requirements were influenced by the personality traits of the provided personas. For an introverted and emotionally unstable personality, inclusion of confidence building and socializer design features had a higher priority compared with the identified requirements for an extravert and emotionally stable personality. The findings support the proposition that personas with personality traits can aid software engineers to produce conceptual designs tailored to the needs of specific personalities.

A paper accepted at WISE 2017 Conference

Title: A Kernel-based Approach to Developing Adaptable and Reusable Sensor Retrieval Systems for the Web of Things

Authors: Nguyen Khoi Tran, Quan Z. Sheng, M. Ali Babar, Lina Yao

Abstract: In the era of the Web of Things, a vast number of sensors and data streams are accessible to client applications as Web resources. Web Sensor Retrieval (WSR) systems help client applications dynamically access, in an ad hoc manner, the Web-enabled sensors needed for their operation. Due to the diversity of sensors and query types, a functional WSR instance must be adaptable to different usage and deployment scenarios to ensure its utility. In this paper, we focus on the systematic reuse of components to enable adaptable WSR. In particular, we propose a modular architecture for WSR and develop a kernel to support the development and composition of WSR modules. We demonstrate our solution with a reference WSR instance deployed on a Raspberry Pi 3. This instance provides five types of queries on eight types of sensors deployed across two sensor platforms. We provide our kernel and reference WSR instance as open source under the MIT license.
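
For readers unfamiliar with the kernel-and-modules idea, the sketch below illustrates one way such a design could look: a small kernel that registers pluggable query-handler modules and dispatches incoming queries to them. This is a hypothetical illustration only; the class and method names (WSRKernel, QueryModule, and so on) are our own assumptions and do not reproduce the paper's actual API.

```python
# Minimal, hypothetical sketch of a plugin-style kernel that composes
# sensor-retrieval modules. Names and interfaces are illustrative only;
# they are NOT the API described in the paper.
from abc import ABC, abstractmethod
from typing import Any, Dict, List


class QueryModule(ABC):
    """A pluggable module that resolves one type of sensor query."""

    query_type: str  # e.g. "by-sensor-type", "by-location"

    @abstractmethod
    def resolve(self, query: Dict[str, Any]) -> List[Dict[str, Any]]:
        """Return descriptions of the sensors that match the given query."""


class WSRKernel:
    """Registers modules and dispatches incoming queries to the right one."""

    def __init__(self) -> None:
        self._modules: Dict[str, QueryModule] = {}

    def register(self, module: QueryModule) -> None:
        self._modules[module.query_type] = module

    def query(self, query_type: str, query: Dict[str, Any]) -> List[Dict[str, Any]]:
        if query_type not in self._modules:
            raise ValueError(f"No module registered for query type '{query_type}'")
        return self._modules[query_type].resolve(query)


class ByTypeModule(QueryModule):
    """Toy module: filters an in-memory sensor registry by sensor type."""

    query_type = "by-sensor-type"

    def __init__(self, registry: List[Dict[str, Any]]) -> None:
        self._registry = registry

    def resolve(self, query: Dict[str, Any]) -> List[Dict[str, Any]]:
        return [s for s in self._registry if s["type"] == query.get("type")]


if __name__ == "__main__":
    sensors = [
        {"id": "s1", "type": "temperature", "platform": "pi-sense"},
        {"id": "s2", "type": "humidity", "platform": "pi-sense"},
    ]
    kernel = WSRKernel()
    kernel.register(ByTypeModule(sensors))
    print(kernel.query("by-sensor-type", {"type": "temperature"}))
```

New query types can then be supported by registering additional modules with the same kernel, which is the kind of reuse-through-composition the abstract describes.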

Publication Venue:
The 18th International Conference on Web Information Systems Engineering (WISE) – Moscow, Russia

A paper accepted in ACM Computing Surveys

The paper titled “Searching the Web of Things: State of the Art, Challenges, and Solutions” has been accepted in ACM Computing Surveys.

Authors
Nguyen Khoi Tran, Quan Z. Sheng, Muhammad Ali Babar, Lina Yao

Abstract
Technological advances allow more physical objects to connect to the Internet and provide their services on the Web as resources. Search engines are the key to fully utilizing this emerging Web of Things, as they bridge users and applications with the resources needed for their operation. Developing these systems is a challenging and diverse endeavor due to the diversity of Web of Things resources that they work with. Each combination of resources in the query resolution process requires a different type of search engine with its own technical challenges and usage scenarios. This diversity complicates both the development of new systems and the assessment of the state of the art. In this article, we present a systematic survey on Web of Things Search Engines (WoTSE), focusing on the diversity in forms of these systems. We collect and analyze over 200 related academic works to build a flexible conceptual model for WoTSE. We develop an analytical framework on this model to review the development of the field and its current status, reflected by 30 representative works in the area. We conclude our survey with a discussion on open issues to bridge the gap between the existing progress and an ideal WoTSE.

A paper accepted at ESEM 2017 conference

The paper titled “Beyond Continuous Delivery: An Empirical Investigation of Continuous Deployment Challenges” has been accepted at the 11th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM), Toronto, Canada.

Abstract

Context: A growing number of software organizations have been adopting Continuous DElivery (CDE) and Continuous Deployment (CD) practices. Researchers have started investing significant effort in studying different aspects of CDE and CD. Many studies treat CDE (i.e., where an application is potentially capable of being deployed) and CD (i.e., where an application is automatically deployed to production on every update) as synonyms and do not distinguish them from each other. Despite CDE being successfully adopted by a large number of organizations, it is not empirically known why organizations are still unable or demotivated to deploy automatically and continuously (i.e., the CD practice).

Goal: This study aims at empirically investigating and classifying the factors that may impact the adoption and implementation of the CD practice.

Method: We conducted a mixed-method empirical study consisting of interviews with 21 software practitioners, followed by a survey with 98 respondents.

Results: Our study reveals 11 confounding factors that limit or demotivate software organizations from pushing changes automatically and continuously to production. The most important ones are “lack of automated (user) acceptance test”, “manual quality check”, “deployment as business decision”, “insufficient level of automated test coverage”, and “highly bureaucratic deployment process”.

Conclusion: Our findings highlight several areas for future research and provide suggestions for practitioners to streamline the deployment process.

Matthew Thyer presented the LASAGNE framework

On Friday, the 26th of May 2017, Matthew Thyer spoke to a group of Software Engineering and Industry students about the structure of the Defence Science and Technology Group (DSTg), his work with DSTg, and DSTg’s LASAGNE framework. He also showed a presentation on the SERP2016 project that was made using the LASAGNE framework.

Journal Paper accepted in IEEE Access

CREST researchers published a paper in the IEEE Access journal.

Paper Title
Continuous Integration, Delivery and Deployment: A Systematic Review on Approaches, Tools, Challenges and Practices

Authors

Mojtaba Shahin, Muhammad Ali Babar, Liming Zhu

Abstract

Context: Continuous practices, i.e., continuous integration, delivery, and deployment, are the software development industry practices that enable organizations to frequently and reliably release new features and products. With the increasing interest in and literature on continuous practices, it is important to systematically review and synthesize the approaches, tools, challenges, and practices reported for adopting and implementing continuous practices.

Objective: This research aimed at systematically reviewing the state of the art of continuous practices to classify approaches and tools, identify challenges and practices in this regard, and identify the gaps for future research.

Method: We used the systematic literature review (SLR) method for reviewing the peer-reviewed papers on continuous practices published between 2004 and 1st June 2016. We applied the thematic analysis method for analysing the data extracted from the 69 papers selected using predefined criteria.

Results: We have identified thirty approaches and associated tools, which facilitate the implementation of continuous practices in the following ways: (1) “reducing build and test time in continuous integration (CI)”; (2) “increasing visibility and awareness on build and test results in CI”; (3) “supporting (semi-) automated continuous testing”; (4) “detecting violations, flaws and faults in CI”; (5) “addressing security and scalability issues in deployment pipeline”, and (6) “improving dependability and reliability of deployment process”. We have also determined a list of critical factors such as “testing (effort and time)”, “team awareness and transparency”, “good design principles”, “customer”, “highly skilled and motivated team”, “application domain”, and “appropriate infrastructure” that should be carefully considered when introducing continuous practices in a given organization. The majority of the reviewed papers were validation (34.7%) and evaluation (36.2%) research types. This review also reveals that continuous practices have been successfully applied to both greenfield and maintenance projects.

Conclusion: Continuous practices have become an important area of software engineering research and practice. Whilst the reported approaches, tools, and practices address a wide range of challenges, there are several challenges and gaps which require future research work: improving the capturing and reporting of contextual information in the studies reporting different aspects of continuous practices; gaining a deep understanding of how software-intensive systems should be (re-)architected to support continuous practices; and addressing the lack of knowledge and tools for engineering the processes of designing and running secure deployment pipelines.

Journal Paper accepted in Future Generation Computer Systems

CREST researchers published a paper in Future Generation Computer Systems (FGCS)

Paper Title
A Reference Architecture for Provisioning of Tools as a Service: Meta-Model, Ontologies and Design Elements

Authors

Muhammad Aufeef Chauhan, Muhammad Ali Babar, Quan Z. Sheng

Abstract

Software Architecture (SA) plays a critical role in designing, developing and evolving cloud-based platforms that can be used to provision different types of services to consumers on demand. In this paper, we present a Reference Architecture (RA) for designing a cloud-based Tools as a Service SPACE (TSPACE) for provisioning a bundled suite of tools following the Software as a Service (SaaS) model. The RA has been designed by leveraging information structuring approaches and by using well-known architecture design principles and patterns. The RA has been documented using a view-based approach and is presented in terms of its context, goals, meta-model, information structuring and relationship models (using ontologies), and components. We have demonstrated the feasibility and applicability of the RA with the help of a prototype and have used the prototype to provision tools for software architecting. We have also evaluated the RA in terms of the effectiveness of its design decisions, completeness, and feasibility using a scenario-based architecture evaluation method. The proposed TSPACE RA can provide valuable insights into information structuring approaches and guidelines for designing and implementing TSPACE for various domains.
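
As a rough illustration of the “bundled suite of tools provisioned on demand” idea described in the abstract, the sketch below models a catalogue of tool bundles and a provisioning request for a tenant. All names (ToolBundle, ProvisioningService, and so on) are hypothetical assumptions for illustration and are not the RA’s actual design elements or meta-model.

```python
# Hypothetical sketch of a Tools-as-a-Service provisioning interface, loosely
# inspired by the idea of bundling tools and provisioning them on demand.
# Names are illustrative only; they do NOT reproduce the TSPACE RA.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ToolDescriptor:
    """Metadata describing a tool that can be provisioned (e.g. an architecting tool)."""
    name: str
    version: str
    capabilities: List[str] = field(default_factory=list)


@dataclass
class ToolBundle:
    """A named suite of tools offered together, following a SaaS model."""
    name: str
    tools: List[ToolDescriptor]


class ProvisioningService:
    """Keeps a catalogue of tool bundles and records which tenant provisioned what."""

    def __init__(self) -> None:
        self._catalogue: Dict[str, ToolBundle] = {}
        self._tenants: Dict[str, List[str]] = {}

    def publish(self, bundle: ToolBundle) -> None:
        self._catalogue[bundle.name] = bundle

    def provision(self, tenant_id: str, bundle_name: str) -> ToolBundle:
        bundle = self._catalogue[bundle_name]  # raises KeyError if unknown
        self._tenants.setdefault(tenant_id, []).append(bundle_name)
        return bundle


if __name__ == "__main__":
    service = ProvisioningService()
    service.publish(ToolBundle(
        name="architecting-suite",
        tools=[ToolDescriptor("model-editor", "1.0", ["ADL editing"]),
               ToolDescriptor("review-tool", "2.1", ["scenario-based evaluation"])],
    ))
    bundle = service.provision(tenant_id="team-42", bundle_name="architecting-suite")
    print([tool.name for tool in bundle.tools])
```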

Journal Paper Published in Information and Software Technology

CREST researchers published a paper in Information and Software Technology, a leading software engineering journal.

Paper Title

Why does site visit matter in global software development: A knowledge-based perspective

Authors

Mansooreh Zahedi and Muhammad Ali Babar

Paper Abstract

Context: Face-to-Face (F2F) interaction is a strong means to foster social relationships and effective knowledge sharing within a team. However, communication in Global Software Development (GSD) teams is usually restricted to computer-mediated conversation that is perceived to be less effective and interpersonal. Temporary collocation of dispersed members of a software development team is a well-known practice in GSD. Despite broad realization of the benefits of visits, there is a lack of empirical evidence that explores how temporary F2F interactions are organized in practice and how they can impact knowledge sharing between sites.

Objective: This study aimed at empirically investigating activities that take place during temporary collocation of dispersed members and analyzing the outcomes of the visit for supporting and improving knowledge sharing.

Method: We report a longitudinal case study of a GSD team distributed between Denmark and Pakistan. We have explored a particular visit organized for a group of offshore team members visiting the onshore site for two weeks. Our findings are based on a systematic and rigorous analysis of the calendar entries of the visitors during the studied visit, several observations of a selected set of the team members’ activities during the visit, and 13 semi-structured interviews.

Results: Looking through the lens of the knowledge-based theory of the firm, we have found that the social and professional activities organized during the visit facilitated knowledge sharing between team members from both sites. The findings are expected to contribute to building a common knowledge and understanding about the role and usefulness of site visits for supporting and improving knowledge sharing in GSD teams by establishing and sustaining social and professional ties.