Juan Pablo Sotomayor Paez
Knight Foundation School of Computing and Information Sciences
Juan P. Sotomayor is currently a Ph.D. candidate in Computer Science at Florida International University’s Knight Foundation School of Computing and Information Sciences, with research interests in software testing and an emphasis on microservices testing. He is an active member of MENSA International and the Association for Computing Machinery (ACM).
The popularity of the microservices architecture for creating loosely coupled, large-scale systems, along with the increased deployment frequency of individual microservices, has led to the need for tools capable of testing microservices at runtime in a production environment. In response, various tools have emerged to test microservice-based applications (MSBAs) at different testing levels in a development environment. Few, if any, of these tools are currently used to test MSBAs in a production environment.
The proposed research will investigate how to validate microservices at runtime while minimizing the disruption of services of MSBAs. To achieve this goal, we focus on three objectives: (1) evaluating existing testing tools for MSBAs, (2) designing requirements and models to support self-testing of a microservice, and (3) implementing a strategy that uses self-testing microservices to validate MSBAs at runtime. To evaluate existing testing tools for MSBAs, a semi-systematic literature review was performed that identified the features of twenty-five tools using various testing strategies (levels) for microservices. A cross-section of these tools was then used to test an MSBA prototype (Rideshare) to gain insight into the infrastructure and overhead needed to test MSBAs. Using these results, we identified potential candidates to support testing of MSBAs at runtime.
A key aspect of testing MSBAs at runtime is the ability of a microservice to test itself. Although there has been work in the area of self-testing components, specifically in autonomic systems, it has focused on monolithic systems, which inherently have a different structure and present different challenges compared to MSBAs. Since microservices are self-contained, there is an opportunity to extend previous work on self-testing. Our approach to developing self-testing microservices includes identifying a list of requirements that a design should meet at runtime and the models needed to implement self-testing. These models include a UML component diagram comprising the traditional service and the self-testing framework, and a UML profile that guides the development of self-testing microservices.
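The core idea of a self-testing microservice can be sketched as a service that bundles its own test cases and can replay them against itself at runtime. The class and method names below are hypothetical illustrations, not the proposal's actual framework:

```python
# Illustrative sketch only: a microservice that carries its own
# embedded test cases and can execute them on demand at runtime.
# All names here are assumptions for illustration.

class SelfTestingMicroservice:
    """A service whose deployment includes its own runtime tests."""

    def __init__(self):
        # Built-in test cases: (input arguments, expected output) pairs
        # that the service can replay against its own logic at runtime.
        self.test_cases = [((2, 3), 5), ((0, 0), 0), ((-1, 1), 0)]

    def handle(self, a, b):
        # Stand-in for the service's business logic.
        return a + b

    def run_self_tests(self):
        """Execute the embedded tests and return a pass/fail report."""
        report = {"passed": 0, "failed": 0, "failures": []}
        for args, expected in self.test_cases:
            actual = self.handle(*args)
            if actual == expected:
                report["passed"] += 1
            else:
                report["failed"] += 1
                report["failures"].append((args, expected, actual))
        return report


service = SelfTestingMicroservice()
print(service.run_self_tests())
```

In a deployed setting, `run_self_tests` would likely be exposed as an internal endpoint so the self-testing framework can trigger validation without interrupting normal request handling.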
Using the design for self-testing microservices, we develop a strategy for the dynamic validation of MSBAs at runtime. This strategy validates microservices at runtime using two approaches: (1) replication with validation and (2) safe-adaptation with validation. To evaluate this strategy, two MSBAs (Rideshare and Train Ticket) will be re-engineered to implement the self-testing microservice design. Experiments will be performed to determine the effectiveness of validating MSBAs in a production environment.
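One way to picture replication with validation: a candidate replica of an updated microservice runs its embedded self-tests while the current version keeps serving traffic, and the candidate is promoted only if every test passes. The names and control flow below are illustrative assumptions, not the proposal's design:

```python
# Hypothetical sketch of "replication with validation": the class,
# function, and version names are assumptions for illustration.

class ServiceVersion:
    def __init__(self, version, impl, test_cases):
        self.version = version
        self.impl = impl              # business logic under test
        self.test_cases = test_cases  # (input arguments, expected) pairs

    def run_self_tests(self):
        # Replay the embedded tests against this version's logic.
        return all(self.impl(*args) == expected
                   for args, expected in self.test_cases)


def validate_and_promote(current, candidate):
    """Promote the candidate replica only if its self-tests pass;
    otherwise keep routing traffic to the current version."""
    return candidate if candidate.run_self_tests() else current


tests = [((2, 3), 5), ((0, 0), 0)]
v1 = ServiceVersion("v1", lambda a, b: a + b, tests)
v2_bad = ServiceVersion("v2", lambda a, b: a - b, tests)  # faulty update
active = validate_and_promote(v1, v2_bad)
print(active.version)  # the faulty replica is rejected, v1 keeps serving
```

Because validation happens on the replica, the running version never stops serving requests, which matches the goal of minimizing service disruption.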