21 research outputs found
Applying Design Patterns to Remove Software Performance Antipatterns: A Preliminary Approach
Abstract: Patterns and antipatterns are powerful instruments in the hands of software designers for improving the quality of software systems. A large variety of design patterns has emerged over the decades, and several performance antipatterns have been defined as well. In this paper we propose a preliminary approach for antipattern-based refactoring of software systems, driven by the application of design patterns. The approach focuses on refactoring software artifacts (i.e., models, code) by applying design patterns, with the aim of removing performance antipatterns that occur in such artifacts. In our approach, design patterns are ranked in order to drive the refactoring choice. We also provide an illustrative example as a preliminary validation of the approach, showing how the ranking method works over three design patterns for removing the Empty Semi-Trucks performance antipattern, and we finally identify future research directions for our work.
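The ranking method itself is not detailed in the abstract; the sketch below is a minimal, hypothetical illustration of scoring candidate design patterns against a detected antipattern. The pattern names, the score weights, and the rank_patterns helper are assumptions made purely for illustration, not the paper's actual criteria.

```python
# Hypothetical sketch: rank candidate design patterns for removing a detected
# performance antipattern. Weights and scores are illustrative only.
from dataclasses import dataclass

@dataclass
class Candidate:
    pattern: str             # design pattern name
    removal_fit: float       # how well the pattern addresses the antipattern (0..1)
    refactoring_cost: float  # estimated effort to apply the pattern (0..1)

def rank_patterns(candidates, w_fit=0.7, w_cost=0.3):
    """Return candidates sorted by a weighted score (higher is better)."""
    def score(c):
        return w_fit * c.removal_fit - w_cost * c.refactoring_cost
    return sorted(candidates, key=score, reverse=True)

# Example: three hypothetical candidate patterns for one antipattern occurrence.
candidates = [
    Candidate("Facade", removal_fit=0.8, refactoring_cost=0.4),
    Candidate("Session Facade", removal_fit=0.9, refactoring_cost=0.5),
    Candidate("Batching/Aggregator", removal_fit=0.7, refactoring_cost=0.2),
]
for c in rank_patterns(candidates):
    print(c.pattern)
```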
Many-Objective Optimization of Non-Functional Attributes based on Refactoring of Software Models
Software quality estimation is a challenging and time-consuming activity, and
models are crucial for coping with the complexity of this activity in modern
software applications. In this context, software refactoring is a key activity
within development life-cycles, where requirements and functionalities rapidly
evolve. One main challenge is that improving distinct quality attributes may
require contrasting refactoring actions on the software, as in the trade-off
between performance and reliability (or other non-functional attributes). In
such cases, multi-objective optimization can provide the designer with a wider
view of these trade-offs and, consequently, can help identify suitable
refactoring actions that take into account independent or even competing
objectives. In this paper, we present an approach that exploits
NSGA-II as the genetic algorithm to search optimal Pareto frontiers for
software refactoring while considering many objectives. We consider performance
and reliability variations of a model alternative with respect to an initial
model, the number of performance antipatterns detected in the model
alternative, and the architectural distance, which quantifies the effort to
obtain a model alternative from the initial one. We applied our approach on two
case studies: a Train Ticket Booking Service and CoCoME. We observed that our
approach is able to improve performance (by up to 42%) while preserving or
even improving the reliability (by up to 32%) of the generated model alternatives.
We also observed that there exists an order of preference of refactoring
actions among model alternatives. We can state that performance antipatterns
confirmed their ability to improve performance of a subject model in the
context of many-objective optimization. In addition, the metric that we adopted
for the architectural distance seems to be suitable for estimating the
refactoring effort. (Accepted for publication in Information and Software Technology.)
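For readers unfamiliar with this kind of search, the sketch below shows a four-objective NSGA-II setup using the pymoo library. The solution encoding, objective functions, population size, and bounds are placeholder assumptions for illustration; they do not reproduce the paper's actual model evaluation or refactoring actions.

```python
# Minimal sketch (assumed encoding): a solution is a fixed-length vector whose
# entries select refactoring actions; the four objectives mimic those named in
# the abstract but are computed by placeholders, not by real model analysis.
import numpy as np
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

class RefactoringProblem(ElementwiseProblem):
    def __init__(self, n_actions=4):
        # each variable encodes one refactoring action applied to the model
        super().__init__(n_var=n_actions, n_obj=4, xl=0.0, xu=1.0)

    def _evaluate(self, x, out, *args, **kwargs):
        perf_variation = -np.sum(x)                # placeholder, minimized
        rel_variation = -np.prod(1.0 - 0.1 * x)    # placeholder, minimized
        n_antipatterns = float(np.sum(x > 0.5))    # placeholder detection count
        arch_distance = float(np.linalg.norm(x))   # placeholder effort metric
        out["F"] = [perf_variation, rel_variation, n_antipatterns, arch_distance]

algorithm = NSGA2(pop_size=50)
result = minimize(RefactoringProblem(), algorithm, ("n_gen", 100), seed=1)
print(result.F[:5])  # a few points of the Pareto front approximation
```

In practice, the placeholder objectives would be replaced by model-based performance and reliability analyses and by an antipattern detector run on each model alternative, as described in the abstract.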
Architectural Support for Software Performance in Continuous Software Engineering: A Systematic Mapping Study
The continuous software engineering paradigm is gaining popularity in modern
development practices, where the interleaving of design and runtime activities
is induced by the continuous evolution of software systems. In this context,
performance assessment is not easy, but recent studies have shown that
architectural models evolving with the software can support this goal. In this
paper, we present a mapping study aimed at classifying existing scientific
contributions that deal with the architectural support for performance-targeted
continuous software engineering. We have applied the systematic mapping
methodology to an initial set of 215 potentially relevant papers and selected
66 primary studies that we have analyzed to characterize and classify the
current state of research. This classification helps to focus on the main
aspects that are being considered in this domain and, most importantly, on the
emerging findings and implications for future research.
Introducing Interactions in Multi-Objective Optimization of Software Architectures
Software architecture optimization aims to enhance non-functional attributes
like performance and reliability while meeting functional requirements.
Multi-objective optimization employs metaheuristic search techniques, such as
genetic algorithms, to explore feasible architectural changes and propose
alternatives to designers. However, the resource-intensive process may not
always align with practical constraints. This study investigates the impact of
designer interactions on multi-objective software architecture optimization.
Designers can intervene at intermediate points in the fully automated
optimization process, making choices that guide exploration towards more
desirable solutions. We compare this interactive approach with the fully
automated optimization process, which serves as the baseline. The findings
demonstrate that designer interactions lead to a more focused solution space,
resulting in improved architectural quality. By directing the search towards
regions of interest, the interaction uncovers architectures that remain
unexplored in the fully automated process.
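The abstract describes the interaction mechanism only at a high level; below is a small, assumed sketch of how designer interventions could be interleaved with an automated search loop: every few generations the candidates pass through a designer-supplied preference filter, and only the kept candidates seed the subsequent generations. The designer_keeps predicate and the evolve step are placeholders, not the study's tooling.

```python
# Assumed sketch of an interactive loop: periodically filter the population
# with a designer-supplied predicate before continuing the automated search.
import random

def evolve(population):
    """Placeholder for one generation of an automated search step."""
    return [p + random.uniform(-0.05, 0.05) for p in population]

def designer_keeps(candidate):
    """Placeholder preference: the designer keeps candidates in a region of interest."""
    return candidate > 0.5

population = [random.random() for _ in range(20)]
for generation in range(1, 31):
    population = evolve(population)
    if generation % 10 == 0:  # interaction point every 10 generations
        kept = [c for c in population if designer_keeps(c)] or population
        # re-seed the population from the kept candidates only
        population = [random.choice(kept) for _ in range(20)]
print(sorted(population)[-5:])
```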
IL-10, IL-13, Eotaxin and IL-10/IL-6 ratio distinguish breast implant-associated anaplastic large-cell lymphoma from all types of benign late seromas
Breast implant-associated anaplastic large-cell lymphoma (BI-ALCL) is an uncommon peripheral T-cell lymphoma usually presenting as a delayed peri-implant effusion. Chronic inflammation elicited by the implant has been implicated in its pathogenesis. Infection or implant rupture may also be responsible for late seromas. Cytomorphological examination, coupled with CD30 immunostaining and, where needed, T-cell clonality assessment, is essential for BI-ALCL diagnosis. However, some benign effusions may also contain an oligo/monoclonal expansion of CD30+ cells that can make the diagnosis challenging. Since cytokines are key mediators of inflammation, we applied a multiplexed immuno-based assay to BI-ALCL seromas and to different types of reactive seromas to look for a potential diagnostic BI-ALCL-associated cytokine profile. We found that BI-ALCL is characterized by a Th2-type cytokine milieu with significantly high levels of IL-10, IL-13 and Eotaxin, which discriminate BI-ALCL from all types of reactive seroma. Moreover, we found that an IL-10/IL-6 ratio cutoff of 0.104 is associated with a specificity of 100% and a sensitivity of 83% in recognizing BI-ALCL effusions. This study identifies promising biomarkers for initial screening of late seromas that can facilitate early diagnosis of BI-ALCL.
Analyzing the sensitivity of multi-objective software architecture refactoring to configuration characteristics
Abstract: Context: Software architecture refactoring can be induced by multiple reasons, such as satisfying new functional requirements or improving non-functional properties. Multi-objective optimization approaches have been widely used in the last few years to introduce automation in the refactoring process, and they have revealed their potential especially when quantifiable attributes are targeted. However, the effectiveness of such approaches can be heavily affected by configuration characteristics of the optimization algorithm, such as the composition of solutions. Objective: In this paper, we analyze the behavior of EASIER, an Evolutionary Approach for Software archItecturE Refactoring, while varying its configuration characteristics, with the objective of studying its potential to find near-optimal solutions under different configurations. Method: In particular, we use two different solution space inspection algorithms (i.e., NSGA-II and SPEA2) while varying the genome length and the solution composition. Results: We have conducted our experiments on a specific case study modeled in the AEmilia ADL, on which we have shown the ability of EASIER to identify performance-critical elements of the software architecture where refactoring is worth applying. Besides this, from the comparison of multi-objective algorithms, NSGA-II outperforms SPEA2 in most cases, although the latter is able to induce more diversity in the proposed solutions. Conclusion: Our results show that the EASIER fully automated process for software architecture refactoring makes it possible to identify configuration contexts of the evolutionary algorithm in which multi-objective optimization more effectively finds near-optimal Pareto solutions.
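To make the configuration characteristics above concrete, the sketch below encodes a solution genome as a fixed-length sequence of refactoring actions and shows how varying the genome length changes the size of the search space. The action catalogue and the helper functions are illustrative assumptions, not EASIER's actual encoding.

```python
# Assumed sketch: a genome is a fixed-length sequence of refactoring actions.
# Growing the genome length enlarges the search space the optimizer explores.
import random

ACTION_CATALOGUE = ["clone_component", "move_operation",
                    "redeploy_component", "split_interface"]  # illustrative names

def random_genome(length):
    """Sample one candidate solution of the given genome length."""
    return [random.choice(ACTION_CATALOGUE) for _ in range(length)]

def search_space_size(length):
    """Number of distinct genomes of a given length (actions may repeat)."""
    return len(ACTION_CATALOGUE) ** length

for length in (2, 4, 8):
    print(length, search_space_size(length), random_genome(length))
```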
Free flap failure in a patient with a long-standing, infected squamous cell carcinoma.
A 40-year-old patient presented with a long history of a pilonidal sinus, which had been operated on several times during the last 20 years. On clinical examination the patient had a large tumour in the sacral and perineal region, with involvement of the rectal wall. General surgeons first attempted to excise the tumour with wide healthy margins and close the wound with local flaps. After partial flap necrosis and wound dehiscence, the patient underwent reconstruction with a free latissimus dorsi myocutaneous flap. During the anastomosis it was noted that the recipient vessel walls were brittle, mainly on the arterial side, so the arterial anastomosis had to be done three times. Despite this, the artery thrombosed again 12 hours later. Biopsy specimens were taken from the anastomotic sites and studied under light microscopy. There were signs of acute intramural inflammation, with many polymorphonuclear leukocytes present in microabscesses, and spots of necrosis in the elastic layer at the site of the recipient artery. In conclusion, the long-lasting infection was considered to be the main factor that caused the anastomosis to fail, leading to thrombosis through alteration of the vessel walls. The anomalies in the vessel walls were found at some distance from the clinically diseased area, further than is usually found in acute infection. The use of primary arteriovenous vein graft loops or long vein grafts, so that the anastomosis can be made on undamaged vessels, and possibly a less traumatic anastomosis such as the "sleeve" type, should be considered in similar cases.
An Approach Using Performance Models for Supporting Energy Analysis of Software Systems
Measurement-based experiments are a common solution for assessing the energy consumption of complex software systems. Since energy consumption is a metric that is sensitive to several factors, data collection must be repeated to reduce variability. Moreover, additional rounds of measurements are required to evaluate the energy consumption of the system under different experimental conditions. Hence, accurate measurements are often unaffordable because they are time-consuming. In this study, we propose a model-based approach to simplify the energy profiling process and reduce the time spent performing it. The approach uses Layered Queuing Networks (LQN) to model the scenario under test and to examine the system behavior when subject to different workloads. The model produces performance estimates that are used to derive energy consumption values in other scenarios. We have considered two systems serving workloads of different sizes: we provided 2K, 4K, and 8K images to a Digital Camera system, and we supplied bursts of 75 to 500 customers to a Train Ticket Booking System. We parameterized the LQN with the data obtained from short experiments and estimated the performance and energy in the cases of heavier workloads. Thereafter, we compared the estimates with the measured data. In both cases, we achieved good accuracy and saved measurement time. In the case of the Train Ticket Booking System, we reduced the measurement time from 5 h to 35 min by exploiting our model; this resulted in a Mean Absolute Percentage Error of 9.24% in the estimates of CPU utilization and of 8.72% in the energy consumption predictions.
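The abstract does not state how performance estimates are turned into energy values; a common assumption is a linear utilization-to-power model. The sketch below, purely illustrative and not the authors' model, derives energy from estimated CPU utilization and computes the Mean Absolute Percentage Error against measurements; the power coefficients and the sample numbers are made up.

```python
# Illustrative sketch (assumed linear power model): derive energy from
# LQN-estimated CPU utilization and score predictions against measurements.
P_IDLE_W = 20.0   # assumed idle power draw (watts)
P_MAX_W = 95.0    # assumed power draw at full CPU utilization (watts)

def energy_joules(cpu_utilization, duration_s):
    """Energy under a linear utilization-to-power model."""
    power_w = P_IDLE_W + (P_MAX_W - P_IDLE_W) * cpu_utilization
    return power_w * duration_s

def mape(predicted, measured):
    """Mean Absolute Percentage Error between two equal-length sequences."""
    errors = [abs(p - m) / m for p, m in zip(predicted, measured)]
    return 100.0 * sum(errors) / len(errors)

# Made-up utilizations estimated by a performance model vs. measured energy.
estimated_utilization = [0.35, 0.55, 0.80]   # one value per workload burst
duration_s = 60.0
predicted = [energy_joules(u, duration_s) for u in estimated_utilization]
measured = [2800.0, 3900.0, 5200.0]          # hypothetical measurements (J)
print(f"MAPE: {mape(predicted, measured):.2f}%")
```

With real data, the estimated utilizations would come from solving the LQN for each workload, as described in the abstract.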