    Ground truth deficiencies in software engineering: when codifying the past can be counterproductive

    Many software engineering tools build and evaluate their models based on historical data to support development and process decisions. These models help us answer numerous interesting questions, but have their own caveats. In a real-life setting, the objective function of human decision-makers for a given task might be influenced by a whole host of factors that stem from their cognitive biases, subverting the ideal objective function required for an optimally functioning system. Relying on this data as ground truth may give rise to systems that end up automating software engineering decisions by mimicking past sub-optimal behaviour. We illustrate this phenomenon and suggest mitigation strategies to raise awareness.
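
    As a hedged illustration of the point above (not an example from the paper), the sketch below trains a classifier on hypothetical historical review decisions in which an irrelevant attribute influenced approvals; the model dutifully learns and reproduces that bias. All names and data here are invented.

```python
# Minimal sketch (not from the paper): a classifier trained on historical,
# biased review decisions reproduces the bias encoded in those decisions.
# All data and attribute names are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
defect_prone = rng.integers(0, 2, n)       # "true" quality signal
author_is_senior = rng.integers(0, 2, n)   # irrelevant to code quality

# Historical reviewers approved risky changes from senior authors more often:
# the recorded label mixes the real quality signal with a cognitive bias.
historical_approval = ((defect_prone == 0) | (author_is_senior == 1)).astype(int)

X = np.column_stack([defect_prone, author_is_senior])
model = LogisticRegression().fit(X, historical_approval)

# The learned model assigns substantial weight to seniority, mimicking the
# past sub-optimal behaviour rather than the ideal objective (quality alone).
print(dict(zip(["defect_prone", "author_is_senior"], model.coef_[0].round(2))))
```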

    Simulation of masonry arch bridges using 3D discrete element modeling

    The analysis of masonry arch bridges is still a challenge for engineers due to their complex and nonlinear behavior. In practice, the structural behavior of masonry arch bridges is studied with relatively simple methods, e.g. limit analysis, which do not require a significant number of parameters. Two-dimensional nonlinear finite element models are also common in the literature; however, these do not reflect the full structural response, since they neglect out-of-plane actions. Such models neglect spandrel walls, the 3D point-load effect and skew arches, among other effects. The objective of this study is to present a methodology that can simulate three-dimensional masonry arch bridge behavior comprehensively and can include various possible failure mechanisms. The discrete element method (DEM), a discontinuum approach, is used to understand the influence of essential structural components, such as the arch barrel, spandrel wall and back-fill material, on several masonry arch structures. The masonry units are modeled as discrete blocks, and the back-fill material is generated as a continuum mesh based on plasticity theory. Load carrying capacity and related collapse mechanisms are investigated through a set of parametric studies on the mechanical properties of the back-fill material. Out-of-plane spandrel wall failures were further explored by taking advantage of the discontinuous approach. The results indicated that soil characteristics (elastic modulus, internal friction angle and cohesion) have a remarkable influence on the behavior and load carrying capacity of masonry arch bridges. Further, the analyses are validated against previously published experimental work as well as an existing historical bridge.
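
    To make the role of the swept back-fill parameters concrete, here is a hedged sketch (not the study's DEM model, and with illustrative values only) of how cohesion and internal friction angle enter a Mohr-Coulomb shear strength, the kind of plasticity-theory input varied in such parametric studies.

```python
# Minimal sketch (not the paper's DEM model): how the back-fill parameters
# varied in a parametric study (cohesion c, friction angle phi) enter a
# Mohr-Coulomb shear strength, tau = c + sigma_n * tan(phi).
# The parameter values below are illustrative, not taken from the study.
import math

def mohr_coulomb_strength(sigma_n_kpa: float, c_kpa: float, phi_deg: float) -> float:
    """Shear strength (kPa) of a cohesive-frictional back-fill."""
    return c_kpa + sigma_n_kpa * math.tan(math.radians(phi_deg))

sigma_n = 50.0  # assumed normal stress under the load path, kPa
for c in (1.0, 10.0, 20.0):            # cohesion sweep, kPa
    for phi in (25.0, 35.0, 45.0):     # friction angle sweep, degrees
        tau = mohr_coulomb_strength(sigma_n, c, phi)
        print(f"c={c:5.1f} kPa  phi={phi:4.1f} deg  tau={tau:6.1f} kPa")
```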

    Characterization of Historic Mortar Samples and Period Analysis: A Case Study

    The Imperial Temple in Antiochia Ad Cragum is estimated to have been first constructed at the end of the 2nd or the start of the 3rd century, the time of the Severan dynasty. However, archaeological evidence also suggests that there were interventions during the Byzantine era, with burials over the temple platform, a wine press on the northern side, and walls constructed perpendicular to the temple on the southern side, whose use is unidentified. There is also a retaining wall at the back of the temple that holds the earth against erosion from the hill behind it, but it is curiously close to the temple if it was built as part of the original construction. The goal of this study is to investigate the authors’ hypotheses of a multi-phase use and to identify which elements found on the site may be contemporary to each other by comparing the composition of mortar samples collected from different areas, supplemented by a geoarchaeological investigation. Five mortar samples from various areas around the temple were collected and tested using X-ray diffraction (XRD), scanning electron microscopy (SEM), X-ray fluorescence (XRF) and thin-section petrographic analyses. While all mortar samples include similar locally sourced hydrated lime and sand mixtures, three distinct construction styles are identified in the visual analysis of the building elements, the mortar analyses, and the geoarchaeological investigations. One sample, from the walls of the wine press pool, includes fibers. This unique interdisciplinary work, utilizing both material analyses and geoarchaeology, strengthens the conclusions that can be drawn from the individual fields of study and provides more support for the hypotheses of the phased destruction and changing use of the monument.
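
    As a hedged sketch of the comparative step only (the data below are invented, not the paper's measurements), normalized XRF-style oxide compositions can be clustered to suggest which sampled elements may be contemporary with each other.

```python
# Minimal sketch (hypothetical data, not the paper's measurements): grouping
# mortar samples by similarity of normalized XRF oxide compositions, the kind
# of comparison used to argue that building elements belong to the same phase.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

samples = ["temple_platform", "byzantine_wall", "wine_press", "retaining_wall", "burial_area"]
# Columns: CaO, SiO2, Al2O3, Fe2O3 (wt%), illustrative values only.
xrf = np.array([
    [52.0, 30.0, 8.0, 3.0],
    [48.0, 34.0, 9.0, 3.5],
    [40.0, 42.0, 10.0, 4.0],
    [51.0, 31.0, 8.5, 3.1],
    [41.0, 41.0, 9.5, 4.2],
])
xrf_norm = xrf / xrf.sum(axis=1, keepdims=True)  # compare ratios, not totals

groups = fcluster(linkage(xrf_norm, method="average"), t=3, criterion="maxclust")
for name, g in zip(samples, groups):
    print(f"{name:16s} -> phase group {g}")
```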

    A hybrid generative/discriminative method for EEG evoked potential detection

    Generative and discriminative learning approaches are two prevailing and powerful, yet different, paradigms in machine learning. Generative learning models, such as Bayesian inference [1], attempt to model the underlying distributions of the variables in order to compute classification and regression functions. These methods provide a rich framework for learning from prior knowledge. Discriminative learning models, such as support vector machines (SVMs) [2], avoid generative modeling by directly optimizing a mapping from the inputs to the desired outputs by adjusting the resulting classification boundary. These latter methods commonly demonstrate superior performance in classification. Recently, researchers have investigated the relationship between these two learning paradigms and have attempted to combine their complementary strengths.
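
    A hedged sketch of one simple hybrid of the two paradigms (not necessarily the authors' method): class posteriors from a generative Gaussian naive Bayes model are appended to the features given to a discriminative SVM. The synthetic data stand in for EEG-derived features.

```python
# Minimal sketch (not the paper's method): combining a generative model with a
# discriminative one by feeding Gaussian naive Bayes posteriors into an SVM
# alongside the raw features. The synthetic data below are purely illustrative.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 16))                 # stand-in for EEG features
y = (X[:, :4].sum(axis=1) + rng.normal(scale=0.5, size=400) > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

gen = GaussianNB().fit(X_tr, y_tr)             # generative stage

def augment(X_part):
    # Append generative posteriors as extra features for the discriminative stage.
    return np.hstack([X_part, gen.predict_proba(X_part)])

disc = SVC(kernel="rbf").fit(augment(X_tr), y_tr)   # discriminative stage
print("hybrid accuracy:", disc.score(augment(X_te), y_te))
```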

    A BRUTE-FORCE ANALYTICAL FORMULATION OF THE INDEPENDENT COMPONENTS ANALYSIS SOLUTION

    Many algorithms based on information-theoretic measures and/or temporal statistics of the signals have been proposed for ICA in the literature. Analytical solutions based on predictive modeling of the signals have also been suggested. In this paper, we show that finding an analytical solution for the ICA problem by solving a system of nonlinear equations is possible. We demonstrate that this solution is robust to decreasing sample size and measurement SNR. Nevertheless, finding the root of the nonlinear function proves to be a challenge. Besides the analytical solution approach, we also attempt to find the solution using a least-squares approach applied to the derived analytical equations. Monte Carlo simulations using the least-squares approach are performed to investigate the effect of sample size and measurement noise on the performance.
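
    A hedged sketch of the general idea (not the paper's derivation): for a two-channel mixture, lagged cross-covariances of the separated signals form a small system of nonlinear equations in the unmixing rotation, which can be solved in a least-squares sense.

```python
# Minimal sketch (not the paper's equations): recover a 2x2 ICA rotation by
# driving lagged cross-covariances of the separated signals to zero with
# scipy.optimize.least_squares, i.e. a least-squares solution of a small
# system of nonlinear moment equations based on temporal statistics.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
t = np.arange(5000)
s = np.vstack([
    np.sin(0.05 * t),                                               # source 1: deterministic
    np.convolve(rng.normal(size=t.size), [1.0, 0.8, 0.4], "same"),  # source 2: colored noise
])
x = np.array([[1.0, 0.6], [0.4, 1.0]]) @ s                          # unknown mixing

# Whiten the mixtures so the remaining unmixing step is a pure rotation.
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = np.diag(d ** -0.5) @ E.T @ x

def lagged_cross(theta, lags=(1, 2)):
    # Symmetrized cross-covariances of the rotated signals at each lag.
    c, s_ = np.cos(theta[0]), np.sin(theta[0])
    y = np.array([[c, -s_], [s_, c]]) @ z
    return [np.mean(y[0, L:] * y[1, :-L]) + np.mean(y[1, L:] * y[0, :-L]) for L in lags]

theta = least_squares(lagged_cross, x0=[0.2]).x[0]
print("estimated unmixing rotation angle:", round(theta, 3))
```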

    Cloud computing in industrial SMEs: Identification of the barriers to its adoption and effects of its application

    Cloud computing is a new technological paradigm that may revolutionize how organizations use IT by facilitating the delivery of all technology as a service. In the literature, the Cloud is treated mainly through a technological approach focused on the definition of the concept, service models, infrastructures for its development, and security problems. However, there is a notable lack of work analyzing the adoption of this paradigm in SMEs and its results, leaving a gap between the technological development and its adoption by organizations. This paper uses a qualitative methodology (group meetings with managers) and a quantitative one (a survey) to identify which factors act as barriers to Cloud adoption and which positive effects its application generates in 94 industrial SMEs. The conclusion is that the main barriers are of a cultural type and that the positive effects go well beyond reducing costs.

    A review of rapid serial visual presentation-based brain-computer interfaces

    Rapid serial visual presentation (RSVP) combined with the detection of event-related brain responses facilitates the selection of relevant information contained in a stream of images presented rapidly to a human. Event-related potentials (ERPs) measured non-invasively with electroencephalography (EEG) can be associated with infrequent targets amongst a stream of images. Human-machine symbiosis may be augmented by enabling human interaction with a computer without overt movement, and/or by enabling optimization of image/information sorting processes involving humans. Features of the human visual system affect the success of the RSVP paradigm, but pre-attentive processing supports the identification of target information after its presentation by assessing co-occurring, time-locked EEG potentials. This paper presents a comprehensive review and evaluation of the limited but significant literature on research in RSVP-based brain-computer interfaces (BCIs). Applications that use RSVP-based BCIs are categorized based on display mode and protocol design, whilst a range of factors influencing ERP evocation and detection are analyzed. Guidelines for using RSVP-based BCI paradigms are recommended, with a view to further standardizing methods and enhancing the inter-relatability of experimental design to support future research and the use of RSVP-based BCIs in practice.
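
    A hedged sketch of the core RSVP-BCI pipeline described above (synthetic data, not a specific study): EEG epochs time-locked to image onsets are classified as target vs. non-target from the evoked response.

```python
# Minimal sketch (synthetic data): epoch EEG around RSVP image onsets and
# classify target vs. non-target epochs from the evoked response with LDA.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs, n_ch, n_trials = 250, 8, 300                 # sampling rate, channels, images
window = int(0.8 * fs)                           # 0-800 ms epoch after each onset
is_target = rng.random(n_trials) < 0.1           # infrequent targets in the stream

# Simulated epochs: noise everywhere, plus a P300-like bump on target trials.
t = np.arange(window) / fs
p300 = np.exp(-((t - 0.35) ** 2) / (2 * 0.05 ** 2))      # peak near 350 ms
epochs = rng.normal(scale=2.0, size=(n_trials, n_ch, window))
epochs[is_target] += p300                                # add the ERP on targets

# Feature extraction: mean amplitude in coarse time bins, per channel.
features = epochs.reshape(n_trials, n_ch, 10, -1).mean(axis=3).reshape(n_trials, -1)

scores = cross_val_score(LinearDiscriminantAnalysis(), features, is_target, cv=5)
print("target vs. non-target accuracy:", scores.mean().round(3))
```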

    Challenges for automatically extracting molecular interactions from full-text articles

    Background: The increasing availability of full-text biomedical articles will allow more biomedical knowledge to be extracted automatically with greater reliability. However, most Information Retrieval (IR) and Extraction (IE) tools currently process only abstracts. The lack of corpora has limited the development of tools that are capable of exploiting the knowledge in full-text articles. As a result, there has been little investigation into the advantages of full-text document structure, and the challenges developers will face in processing full-text articles.
    Results: We manually annotated passages from full-text articles that describe interactions summarised in a Molecular Interaction Map (MIM). Our corpus tracks the process of identifying facts to form the MIM summaries and captures any factual dependencies that must be resolved to extract the fact completely. For example, a fact in the results section may require a synonym defined in the introduction. The passages are also annotated with negated and coreference expressions that must be resolved. We describe the guidelines for identifying relevant passages and possible dependencies. The corpus includes 2162 sentences from 78 full-text articles. Our corpus analysis demonstrates the necessity of full-text processing; identifies the article sections where interactions are most commonly stated; and quantifies the proportion of interaction statements requiring coherent dependencies. Further, it allows us to report on the relative importance of identifying synonyms and resolving negated expressions. We also experiment with an oracle sentence retrieval system using the corpus as a gold-standard evaluation set.
    Conclusion: We introduce the MIM corpus, a unique resource that maps interaction facts in a MIM to annotated passages within full-text articles. It is an invaluable case study providing guidance to developers of biomedical IR and IE systems, and can be used as a gold-standard evaluation set for full-text IR tasks.
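
    A hedged sketch (hypothetical data and structure, not the MIM corpus format) of how a corpus of gold-annotated interaction sentences can serve as an evaluation set for sentence retrieval, scored here by recall.

```python
# Minimal sketch (hypothetical data, not the MIM corpus format): score a
# sentence retrieval system against gold-standard passage annotations by
# computing recall of the annotated interaction sentences.
from typing import Dict, Set

def retrieval_recall(gold: Dict[str, Set[int]], retrieved: Dict[str, Set[int]]) -> float:
    """Fraction of gold-annotated sentences that the system retrieved, over all articles."""
    hits = sum(len(gold[a] & retrieved.get(a, set())) for a in gold)
    total = sum(len(gold[a]) for a in gold)
    return hits / total if total else 0.0

# Sentence indices annotated as stating an interaction (illustrative only).
gold = {"PMC001": {3, 17, 42}, "PMC002": {8, 9}}
retrieved = {"PMC001": {3, 17, 20}, "PMC002": {9}}
print("recall:", round(retrieval_recall(gold, retrieved), 2))
```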