Identifying reusable functions in code using specification driven techniques
The work described in this thesis addresses the field of software reuse. Software reuse is widely considered a way to increase productivity and to improve the quality and reliability of new software systems. Identifying, extracting and reengineering software components that implement abstractions within existing systems is a promising, cost-effective way to create reusable assets. Such a process is referred to as reuse reengineering. A reference paradigm defined within the RE(^2) project decomposes a reuse reengineering process into five sequential phases. In particular, the first phase of the reference paradigm, called the Candidature phase, is concerned with the analysis of source code to identify software components that implement abstractions and are therefore candidates for reuse. Different candidature criteria exist for the identification of reuse-candidate software components. They can be classified into structural methods (based on structural properties of the software) and specification driven methods (which search for software components implementing a given specification). In this thesis a new specification driven candidature criterion for the identification and extraction of code fragments implementing functional abstractions is presented. The method is driven by a formal specification of the function to be isolated (given in terms of a precondition and a postcondition) and is based on the theoretical frameworks of program slicing and symbolic execution. Symbolic execution and theorem proving techniques are used to map the specification of the functional abstraction onto a slicing criterion. Once the slicing criterion has been identified, the slice is isolated using algorithms based on dependence graphs. The method has been specialised for programs written in the C language.
Both symbolic execution and program slicing are performed by exploiting the Combined C Graph (CCG), a fine-grained dependence-based program representation that can be used for several software maintenance tasks.
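The slice isolation step works on a dependence graph. As a toy illustration of the underlying idea (not the thesis's CCG-based algorithm for C, and with hypothetical statement ids), a backward slice can be computed as the transitive closure of data/control dependences from the slicing criterion:

```python
# Toy backward program slicing over a dependence graph.
# Statement ids and edges below are made up for illustration;
# the thesis operates on the Combined C Graph (CCG) for C programs.

def backward_slice(deps, criterion):
    """Return every statement the slicing criterion transitively depends on.

    deps maps a statement id to the ids it data- or control-depends on.
    """
    slice_set, worklist = set(), [criterion]
    while worklist:
        node = worklist.pop()
        if node in slice_set:
            continue
        slice_set.add(node)
        worklist.extend(deps.get(node, ()))
    return slice_set

# s1: x = 0;  s2: y = 1;  s3: x = x + 1;  s4: print(x)
deps = {"s3": ["s1"], "s4": ["s3"]}
print(sorted(backward_slice(deps, "s4")))  # only s1, s3, s4 affect s4; s2 is sliced away
```

The fixed-point worklist shown here is the standard reachability formulation of slicing; real slicers differ mainly in how the dependence edges are computed.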
Multiscale Model Approach for Magnetization Dynamics Simulations
Simulations of magnetization dynamics in a multiscale environment enable
rapid evaluation of the Landau-Lifshitz-Gilbert equation in a mesoscopic sample
with nanoscopic accuracy in areas where such accuracy is required. We have
developed a multiscale magnetization dynamics simulation approach that can be
applied to large systems with spin structures that vary locally on small length
scales. To implement this, the conventional micromagnetic simulation framework
has been expanded to include a multiscale solving routine. The software
selectively simulates different regions of a ferromagnetic sample according to
the spin structures located within them, in order to employ a suitable discretization
and use either a micromagnetic or an atomistic model. To demonstrate the
validity of the multiscale approach, we simulate the spin wave transmission
across the regions simulated with the two different models and different
discretizations. We find that the interface between the regions is fully
transparent for spin waves with frequency lower than a certain threshold set by
the coarse-scale micromagnetic model, with no noticeable attenuation due to the
interface between the models. As a comparison to exact analytical theory, we
show that in a system with Dzyaloshinskii-Moriya interaction leading to a spin
spiral, the simulated multiscale result is in good quantitative agreement with
the analytical calculation.
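The time evolution both models integrate is the Landau-Lifshitz-Gilbert equation. A minimal single-macrospin sketch, using an explicit Euler step with renormalization and purely illustrative reduced units and parameter values (none of which come from the paper), looks like:

```python
# Minimal sketch of single-spin Landau-Lifshitz-Gilbert dynamics:
# dm/dt = -gamma/(1+alpha^2) * [ m x H + alpha * m x (m x H) ]
# Gamma, alpha, field, time step and step count are illustrative assumptions.
import math

GAMMA, ALPHA = 1.0, 0.1  # reduced gyromagnetic ratio and Gilbert damping

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def llg_step(m, h, dt):
    """One explicit Euler step of the LLG equation, followed by renormalization."""
    pre = -GAMMA / (1.0 + ALPHA ** 2)
    mxh = cross(m, h)
    mxmxh = cross(m, mxh)
    m_new = tuple(m[i] + dt * pre * (mxh[i] + ALPHA * mxmxh[i]) for i in range(3))
    norm = math.sqrt(sum(c * c for c in m_new))
    return tuple(c / norm for c in m_new)  # keep |m| = 1

# A spin tilted away from a field along z precesses and relaxes toward z.
m = (1.0, 0.0, 0.1)
norm = math.sqrt(sum(c * c for c in m))
m = tuple(c / norm for c in m)
for _ in range(5000):
    m = llg_step(m, (0.0, 0.0, 1.0), 0.01)
print(m)  # m_z approaches 1 (damped precession)
```

Production codes use higher-order integrators and effective fields that include exchange, anisotropy and demagnetization; only the damped-precession structure of the equation is shown here.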
The historical evolution of school integration in Italy: Some witnesses and considerations
In Italy, the policy of 'integration' implemented since the 1970s is based on a welcoming culture in the common school context, and represents a particular phase, both politically and socially, of Italian history. It is based on a system of relations around the person with a disability and on the reciprocal enrichment that allows the other students to understand a different way of learning which is concerned with living together. School integration allows the students to share a new understanding of education which is underpinned by the principle that, by living together, all students can acquire new ways of learning and new kinds of knowledge. The purpose of this article is not to describe the model and the process of integration in Italy, which can be the subject of further and more specific works, but to focus on the historical evolution and on the reference points represented by some authors who discuss the principles, and most meaningful aspects, on which the idea of integrazione scolastica (the Italian notion of school integration, focused on interaction and reciprocal change, on context organisation and on integrated didactic strategies) is based. The historical analysis is developed through references to some of the key scholars and witnesses who have worked to develop the organizational framework for the development of the inclusive school in Italy.
Multiscale simulations of topological transformations in magnetic Skyrmions
Magnetic Skyrmions are among the most interesting spin structures for the
development of future information technology as they have been predicted to be
topologically protected. To quantify their stability, we use an innovative
multiscale approach to simulating spin dynamics based on the
Landau-Lifshitz-Gilbert equation. The multiscale approach overcomes the
micromagnetic limitations that have hindered realistic studies using
conventional techniques. We first demonstrate how the stability of a Skyrmion
is influenced by the refinement of the computational mesh and reveal that
conventionally employed micromagnetic simulations are inadequate
for this task. Furthermore, we determine the stability quantitatively using our
multiscale approach. As a key operation for devices, the process of
annihilating a Skyrmion by exciting it with a spin polarized current pulse is
analyzed, showing that Skyrmions can be reliably deleted by designing the pulse
shape.
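The "topological protection" at stake is quantified by the topological charge Q = (1/4π) ∫ m · (∂x m × ∂y m) dA, which counts how many times the spin texture wraps the unit sphere. A small numerical sketch on a toy Skyrmion profile (grid size, radius and the compact-support ansatz are illustrative assumptions, not the paper's setup) evaluates it with central finite differences:

```python
# Numerical topological charge Q = (1/4pi) * sum over cells of m . (dm/dx x dm/dy)
# on a toy Neel-Skyrmion profile: core pointing down, background pointing up.
# Grid size, radius and the linear theta(r) profile are illustrative assumptions.
import math

N, R = 64, 20.0  # grid points per side, Skyrmion radius in cells

def spin(ix, iy):
    """Unit magnetization at grid cell (ix, iy)."""
    x, y = ix - N / 2.0, iy - N / 2.0
    r = math.hypot(x, y)
    theta = math.pi * max(0.0, 1.0 - r / R)  # pi at the core, 0 for r >= R
    phi = math.atan2(y, x)
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))

def charge():
    q = 0.0
    for ix in range(1, N - 1):
        for iy in range(1, N - 1):
            m = spin(ix, iy)
            dx = [(a - b) / 2.0 for a, b in zip(spin(ix + 1, iy), spin(ix - 1, iy))]
            dy = [(a - b) / 2.0 for a, b in zip(spin(ix, iy + 1), spin(ix, iy - 1))]
            cx = (dx[1] * dy[2] - dx[2] * dy[1],
                  dx[2] * dy[0] - dx[0] * dy[2],
                  dx[0] * dy[1] - dx[1] * dy[0])
            q += m[0] * cx[0] + m[1] * cx[1] + m[2] * cx[2]
    return q / (4.0 * math.pi)

print(f"topological charge Q = {charge():.3f}")  # |Q| close to 1 for a single Skyrmion
```

Annihilating the Skyrmion drives |Q| from 1 to 0, which is why mesh refinement matters: on a discrete lattice the "protected" transition proceeds through a single ill-resolved cell.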
Latest-generation instrument for agriculture multispectral data collection
In recent years, the acquisition and analysis of multispectral data have been gaining growing interest and importance in agriculture. At the same time, new technologies are opening up the possibility of developing and implementing sensors of relatively small size featuring high technical performance. Thanks to low weight and high signal-to-noise ratios, such sensors can be carried by different types of vehicles (terrestrial as well as aerial), giving new opportunities for the assessment and monitoring of several crops at different growing stages or health conditions. The choice and specialization of individual bands within the electromagnetic spectrum, ranging from the ultraviolet to the infrared, play a fundamental role in the definition of the so-called vegetation indices (e.g. NDVI, GNDVI, SAVI, and dozens of others), posing new questions and challenges for their effective implementation. The present paper first discusses the need for short-range sensors for index calculation, then focuses on the development of a new multispectral instrument specially designed for agricultural multispectral analysis. The instrument features high-frequency and high-resolution imaging through nine different sensors (1 RGB and 8 monochrome with corresponding band-pass filters, covering the 390 to 950 nm range). It allows synchronized multiband imaging thanks to integrated global-shutter technology, with a frame rate of up to 5 Hz; exposure time can be as short as 1/5000 s. An applicative case study on an area featuring different materials (organic and non-organic) is finally reported to show the potential of the new instrument.
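The vegetation indices the abstract names are simple per-pixel band arithmetic. A sketch of the three cited indices, with made-up reflectance values rather than data from the instrument:

```python
# Standard vegetation indices computed from co-registered band reflectances.
# The input values below are illustrative, not measurements from the instrument.

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - R) / (NIR + R)."""
    return (nir - red) / (nir + red)

def gndvi(nir, green):
    """Green NDVI: (NIR - G) / (NIR + G)."""
    return (nir - green) / (nir + green)

def savi(nir, red, L=0.5):
    """Soil-Adjusted Vegetation Index with soil-brightness factor L."""
    return (1 + L) * (nir - red) / (nir + red + L)

# Healthy vegetation reflects strongly in NIR and absorbs red light,
# so a dense canopy yields a much higher NDVI than bare soil.
print(round(ndvi(0.50, 0.08), 3))  # dense canopy: high NDVI
print(round(ndvi(0.30, 0.25), 3))  # bare soil: NDVI near zero
```

Because all of these indices are ratios of synchronized band images, the instrument's global shutter matters: bands exposed at different instants over a moving platform would not be comparable pixel by pixel.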
On the dependence of galaxy morphologies on galaxy mergers
The distribution of galaxy morphological types is a key test for models of
galaxy formation and evolution, providing strong constraints on the relative
contribution of different physical processes responsible for the growth of the
spheroidal components. In this paper, we make use of a suite of semi-analytic
models to study the efficiency of galaxy mergers in disrupting galaxy discs and
building galaxy bulges. In particular, we compare standard prescriptions
usually adopted in semi-analytic models, with new prescriptions proposed by
Kannan et al., based on results from high-resolution hydrodynamical
simulations, and we show that these new implementations reduce the efficiency
of bulge formation through mergers. In addition, we compare our model results
with a variety of observational measurements of the fraction of
spheroid-dominated galaxies as a function of stellar and halo mass, showing
that the present uncertainties in the data represent an important limitation to
our understanding of spheroid formation. Our results indicate that the main
tension between theoretical models and observations does not stem from the
survival of purely disc structures (i.e. bulgeless galaxies), but rather from the
distribution of galaxies of different morphological types, as a function of
their stellar mass.
Comment: MNRAS in press, 11 pages, 5 figures
A methodological comparison between energy and environmental performance evaluation
The European Union is working on strategies to increase the energy efficiency of buildings. A useful tool is the Energy Performance Certificate (EPC), which identifies the energy performance of a building and provides information for comparing buildings of different architectural typology, shape, design technology and geographic location. However, this tool does not assess the real energy consumption of the building and does not always take its environmental impact into account. In this work, two different types of analysis were carried out: one based only on energy efficiency and the other on environmental impact. These analyses were applied to a standard building, set in three different Italian locations, with the purpose of obtaining cross-related information. After evaluating the results, interventions on some parameters (wall insulation, window frames, filler gas in the insulated glazing) were identified in order to improve the energy behavior of the building with an acceptable environmental impact. The aim of this paper is to propose a methodology that integrates the EPC with green building rating systems, leading to a more conscious choice of retrofit interventions as a compromise between energy performance and environmental impact.
Mining Version Histories for Detecting Code Smells
Code smells are symptoms of poor design and implementation choices that may hinder code comprehension, and possibly increase change- and fault-proneness. While most of the detection techniques just rely on structural information, many code smells are intrinsically characterized by how code elements change over time. In this paper, we propose Historical Information for Smell deTection (HIST), an approach exploiting change history information to detect instances of five different code smells, namely Divergent Change, Shotgun Surgery, Parallel Inheritance, Blob, and Feature Envy. We evaluate HIST in two empirical studies. The first, conducted on 20 open source projects, aimed at assessing the accuracy of HIST in detecting instances of the code smells mentioned above. The results indicate that the precision of HIST ranges between 72 and 86 percent, and its recall ranges between 58 and 100 percent. Also, results of the first study indicate that HIST is able to identify code smells that cannot be identified by competitive approaches solely based on code analysis of a single system's snapshot. Then, we conducted a second study aimed at investigating to what extent the code smells detected by HIST (and by competitive code analysis techniques) reflect developers' perception of poor design and implementation choices. We involved 12 developers of four open source projects, who recognized more than 75 percent of the code smell instances identified by HIST as actual design/implementation problems.
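The core signal HIST exploits is co-change: artifacts that repeatedly change in the same commits. A toy sketch of that idea (the commit data and the support threshold are made up, and HIST's actual detection rules per smell are more elaborate):

```python
# Toy co-change mining in the spirit of history-based smell detection:
# files that frequently change in the same commits are smell suspects
# (e.g. Shotgun Surgery couples one change to edits scattered elsewhere).
# The commit sets and min_support threshold are illustrative assumptions.
from collections import Counter
from itertools import combinations

def co_change_pairs(commits, min_support=3):
    """Return file pairs changed together in at least min_support commits."""
    counts = Counter()
    for changed_files in commits:
        for pair in combinations(sorted(changed_files), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

commits = [
    {"Order.java", "Invoice.java", "Mailer.java"},
    {"Order.java", "Invoice.java"},
    {"Order.java", "Invoice.java", "Report.java"},
    {"Report.java"},
]
print(co_change_pairs(commits))  # {('Invoice.java', 'Order.java'): 3}
```

Pure structural detectors would never see this coupling if Order.java and Invoice.java share no references, which is exactly the gap the paper's historical approach targets.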