266,765 research outputs found

    Assessing architectural evolution: A case study

    This paper proposes to use a historical perspective on generic laws, principles, and guidelines, such as Lehman’s software evolution laws and Martin’s design principles, to achieve a multi-faceted process and structural assessment of a system’s architectural evolution. We present a simple structural model with associated historical metrics and visualizations that could form part of an architect’s dashboard. We perform such an assessment for the Eclipse SDK, as a case study of a large, complex, and long-lived system for which sustained, effective architectural evolution is paramount. The twofold aim of checking generic principles on a well-known system is, on the one hand, to see whether there are lessons to be learned for best practice in architectural evolution, and on the other hand, to gain more insight into the applicability of such principles. We find that while the Eclipse SDK does follow several of the laws and principles, there are some deviations, and we discuss areas of architectural improvement and limitations of the assessment approach.
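
    As a rough illustration of the kind of principle-based historical metric such an architect’s dashboard might track, the sketch below computes Martin’s instability metric I = Ce / (Ca + Ce) for one module across releases; the module and its dependency counts are hypothetical and not taken from the Eclipse SDK study.

```python
def instability(afferent_ca, efferent_ce):
    """Martin's instability metric I = Ce / (Ca + Ce).
    0.0 = maximally stable (only depended upon), 1.0 = maximally unstable."""
    total = afferent_ca + efferent_ce
    return efferent_ce / total if total else 0.0

# Hypothetical (Ca, Ce) dependency counts for one module across releases.
history = {"3.0": (40, 5), "3.1": (42, 9), "3.2": (45, 14)}
trend = {release: round(instability(ca, ce), 2) for release, (ca, ce) in history.items()}
print(trend)  # a rising trend would flag the module for architectural review
```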

    The NN2 Flux Difference Method for Constructing Variable Object Light Curves

    We present a new method for optimally extracting point-source time variability information from a series of images. Differential photometry is generally best accomplished by subtracting two images separated in time, since this removes all constant objects in the field. By removing background sources such as the host galaxies of supernovae, such subtractions make possible the measurement of the proper flux of point-source objects superimposed on extended sources. In traditional difference photometry, a single image is designated as the "template" image and subtracted from all other observations. This procedure does not take all the available information into account and, for sub-optimal template images, may produce poor results. Given N total observations of an object, we show how to obtain an estimate of the vector of fluxes from the individual images using the antisymmetric matrix of flux differences formed from the N(N-1)/2 distinct possible subtractions, and we provide a prescription for estimating the associated uncertainties. We then demonstrate how this method improves results over the standard procedure of designating one image as a "template" and differencing against only that image. Comment: Accepted to AJ. To be published in the November 2005 issue. 16 pages, 2 figures, 2 tables. Source code available at http://www.ctio.noao.edu/essence/nn2
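
    The paper gives its own prescription for weighting the differences and estimating uncertainties; as a minimal sketch of the underlying idea only, the snippet below recovers relative fluxes from an antisymmetric matrix of pairwise differences D[i, j] ≈ f[i] − f[j] by least squares, with a gauge constraint fixing the overall additive constant. Function and variable names are illustrative, not the paper’s.

```python
import numpy as np

def fluxes_from_differences(diff):
    """Recover per-epoch fluxes (up to an additive constant) from an
    antisymmetric N x N matrix of pairwise flux differences,
    diff[i, j] ~ f[i] - f[j], via unweighted least squares."""
    n = diff.shape[0]
    rows, cols = np.triu_indices(n, k=1)    # the N(N-1)/2 distinct pairs
    m = len(rows)
    design = np.zeros((m + 1, n))
    design[np.arange(m), rows] = 1.0
    design[np.arange(m), cols] = -1.0
    design[m, :] = 1.0                       # gauge constraint: fluxes sum to zero
    target = np.concatenate([diff[rows, cols], [0.0]])
    flux, *_ = np.linalg.lstsq(design, target, rcond=None)
    return flux

# Toy check: noisy differences of a known flux vector.
true_flux = np.array([10.0, 12.5, 9.0, 15.0])
d = true_flux[:, None] - true_flux[None, :] + 0.05 * np.random.randn(4, 4)
d = (d - d.T) / 2.0                          # enforce antisymmetry
print(np.round(fluxes_from_differences(d), 2))
print(np.round(true_flux - true_flux.mean(), 2))   # should be close
```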

    Software Measurement Activities in Small and Medium Enterprises: an Empirical Assessment

    An empirical study evaluating the proper implementation of measurement/metric programs in software companies in one region of Turkey is presented. The research questions are discussed and validated with the help of senior software managers (more than 15 years’ experience) and then used for interviewing a variety of small and medium-scale software companies in Ankara. Observations show a common reluctance and lack of interest in utilizing measurements/metrics, despite the fact that they are well known in the industry. A side product of this research is the finding that internationally recognized standards such as ISO and CMMI are pursued only if they are part of project/job requirements; without such requirements, introducing those standards remains a long-term target for increasing quality.

    Recommendations for the Determination of Nutrients in Seawater to High Levels of Precision and Inter-Comparability using Continuous Flow Analysers

    The Global Ocean Ship-based Hydrographic Investigations Program (GO-SHIP) brings together scientists with interests in physical oceanography, the carbon cycle, marine biogeochemistry and ecosystems, and other users and collectors of ocean interior data to develop a sustained global network of hydrographic sections as part of the Global Ocean Climate Observing System. A series of manuals and guidelines is being produced by GO-SHIP to update those developed by the World Ocean Circulation Experiment (WOCE) in the early 1990s. Analysis of the data collected in WOCE suggests that improvements are needed in the collection of nutrient data if they are to be used for determining change within the ocean interior. Production of this manual is timely, as it coincides with the development of reference materials for nutrients in seawater (RMNS). These RMNS solutions will be produced in sufficient quantities and be of sufficient quality to provide a basis for improving the consistency of nutrient measurements both within and between cruises. This manual is a guide to suggested best practice in performing nutrient measurements at sea. It provides detailed advice on laboratory practice for all the procedures surrounding the use of gas-segmented continuous flow analysers (CFA) for the determination of dissolved nutrients (usually ammonium, nitrate, nitrite, phosphate and silicate) at sea. It does not prescribe the use of a particular instrument or related chemical method, as these are well described in other publications. The manual provides a brief introduction to the CFA method, the collection and storage of samples, considerations in the preparation of reagents, and the calibration of the system. It discusses how RMNS solutions can be used to track the performance of a system during a cruise and between cruises. It provides a format for the meta-data that need to be reported alongside the sample data at the end of a cruise so that the quality of the reported data can be evaluated and set in context relative to other data sets. Most importantly, the central manual is accompanied by a set of nutrient standard operating procedures (NSOPs) that provide detailed information on key procedures that are necessary if best-quality data are to be achieved consistently. These cover sample collection and storage, an example NSOP for the use of a CFA system at sea, high-precision preparation of calibration solutions, assessment of the true calibration blank, checking the linearity of a calibration, and the use of internally and externally prepared reference solutions for controlling the precision of data during a cruise and between cruises. An example meta-data report and advice on the assembly of the quality control and statistical data that should form part of the meta-data report are also given.
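
    As a small, hypothetical illustration of one NSOP task mentioned above (checking the linearity of a calibration), the sketch below fits a straight line to a set of nutrient standards and inspects the residuals for curvature; the concentrations and instrument responses are invented for the example and do not come from the manual.

```python
import numpy as np

# Hypothetical nitrate standards (umol/L) and analyser peak heights (arbitrary units).
conc = np.array([0.0, 5.0, 10.0, 20.0, 30.0, 40.0])
peak = np.array([0.02, 1.01, 2.03, 4.01, 5.92, 7.75])

slope, intercept = np.polyfit(conc, peak, 1)
residuals = peak - (slope * conc + intercept)

print(f"slope = {slope:.4f}, intercept (calibration blank) = {intercept:.4f}")
print("residuals:", np.round(residuals, 3))  # a systematic bow in the residuals suggests non-linearity
```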

    Transfer Learning for Improving Model Predictions in Highly Configurable Software

    Modern software systems are built to be used in dynamic environments, using configuration capabilities to adapt to changes and external uncertainties. In a self-adaptation context, we are often interested in reasoning about the performance of the systems under different configurations. Usually, we learn a black-box model based on real measurements to predict the performance of the system given a specific configuration. However, as modern systems become more complex, there are many configuration parameters that may interact, and we end up learning over an exponentially large configuration space. Naturally, this does not scale when relying on real measurements in the actual changing environment. We propose a different solution: instead of taking the measurements from the real system, we learn the model using samples from other sources, such as simulators that approximate the performance of the real system at low cost. We define a cost model that transforms the traditional view of model learning into a multi-objective problem that takes into account not only model accuracy but also measurement effort. We evaluate our cost-aware transfer learning solution using real-world configurable software including (i) a robotic system, (ii) three different stream processing applications, and (iii) a NoSQL database system. The experimental results demonstrate that our approach can achieve (a) high prediction accuracy as well as (b) high model reliability. Comment: To be published in the proceedings of the 12th International Symposium on Software Engineering for Adaptive and Self-Managing Systems (SEAMS'17).
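
    The paper’s cost model and experimental setup are its own; as a minimal sketch of one common flavour of transfer learning for configuration performance prediction (fit a regressor on cheap simulator samples, then learn a linear correction from a few expensive real measurements), the snippet below is illustrative only, and the function names are invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

def fit_transfer_predictor(X_sim, y_sim, X_real, y_real):
    """Learn a performance model on plentiful, cheap simulator samples,
    then correct it with a linear map fitted on a few real measurements."""
    source = RandomForestRegressor(n_estimators=100, random_state=0)
    source.fit(X_sim, y_sim)                     # cheap source-domain model

    correction = LinearRegression()
    correction.fit(source.predict(X_real).reshape(-1, 1), y_real)  # few real samples

    def predict(X_config):
        # Predict real-system performance for new configurations.
        return correction.predict(source.predict(X_config).reshape(-1, 1))
    return predict
```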