
    Image Compression and its Effect on Data

    This thesis defines and studies different image compression techniques, software programs, and image formats (from early ones such as GIF to more recent ones such as JPEG 2000), the effect of compression on the compressed images, and its effectiveness in reducing file size and, as a result, transmission time. In many GeoBioPhysical applications, the information inside an image may be the key to solving problems and classifying features; such data must be handled with care, i.e. it must not be lost during compression. In contrast, images used for ordinary purposes, such as pictures attached to e-mails, can be handled more flexibly. An uncompressed aerial image (DOQQ) of Huntington, WV (in TIFF format) was taken as the original file and compressed using different techniques and software programs. The results were studied and attached to each image, and the resulting file size of each image was used to compare the software programs and to assess the effectiveness of each technique and program in terms of the quality-to-file-size ratio. Previous work and research from different references was also studied and discussed to show the differences and similarities between this work and earlier ones. One goal of this study is to identify the software program(s) and compression types that give the best quality-to-file-size ratio and that work best for GeoBioPhysical studies. The results show that handling different types of imagery is sensitive and depends strongly on the application: users must know what they are doing, use the proper input imagery, and compress it within proper limits to obtain the best results. The results of this study show that JPEG2000 software programs (such as LuraWave) are very good and effective choices, and JPEG2000 and ECW are likely to be used extensively in the near future for imagery and internet usage.
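    The comparison described above comes down to re-encoding a reference image at different settings and recording file size against an image-quality measure. The sketch below illustrates that workflow in Python with Pillow and NumPy; the file name is a placeholder, plain JPEG stands in for the JPEG2000/ECW codecs tested in the thesis, and PSNR is used as a simple stand-in quality metric.

```python
# Minimal sketch of a quality-to-file-size comparison, assuming an 8-bit RGB
# source image. The file name is a placeholder, not the actual DOQQ used.
import io
import numpy as np
from PIL import Image

def psnr(original: np.ndarray, compressed: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    mse = np.mean((original.astype(np.float64) - compressed.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

source = Image.open("huntington_doqq.tif").convert("RGB")  # placeholder path
reference = np.asarray(source)

for quality in (90, 70, 50, 30):
    buffer = io.BytesIO()
    source.save(buffer, format="JPEG", quality=quality)    # lossy re-encoding
    size_kb = buffer.tell() / 1024
    buffer.seek(0)
    decoded = np.asarray(Image.open(buffer).convert("RGB"))
    print(f"JPEG q={quality}: {size_kb:8.1f} kB, PSNR={psnr(reference, decoded):5.2f} dB")
```

    Any other codec or software package from the study would slot into the loop in place of the JPEG re-encoding step.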

    Approach to attributed feature modeling for requirements elicitation in Scrum agile development

    Requirements elicitation is a core activity of requirements engineering for the product to be developed, and the knowledge gained during requirements engineering about that product forms its basis. The agile approach is becoming increasingly well known as the most widely used innovative process in the domain of requirements engineering. Requirements elicitation in agile development faces several challenges. Requirements must be gathered thoroughly enough to reflect stakeholders' needs. Furthermore, requirements evolve during development and must be treated adequately to keep up with changing market demands and the passage of time. Another challenge in agile implementation is handling non-functional requirements in software development; addressing non-functional requirements remains a critical factor in the success of any product. Requirements prioritization is also one of the most challenging tasks, and it is uncommon for requirements engineers to be able to specify and document all the requirements at once. This paper presents an approach for requirements elicitation in Scrum-based agile development. The approach uses the feature modeling technique, originally employed in Software Product Lines (SPLs). One of the most important proposed extensions to Feature Models (FMs) is the introduction of feature attributes, and our method uses attributed FMs to address both functional and non-functional requirements as well as requirements prioritization. For evaluation purposes, we demonstrate the approach through two case studies in different domains of software product development: the first in education and the second in health care. The results reveal that our approach fits the requirements elicitation process in Scrum agile development. Bourns College of Engineering, University of California, Riverside.
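    As a rough illustration of how an attributed feature model can carry priorities and non-functional attributes into a Scrum backlog, the Python sketch below builds a toy model. The feature names, the `priority` and `response_ms` attributes, and the flattening rule are invented for illustration and are not the paper's notation.

```python
# Illustrative sketch only: a toy attributed feature model, not the authors'
# tooling. Feature names, attributes and the priority scale are invented.
from dataclasses import dataclass, field

@dataclass
class Feature:
    name: str
    mandatory: bool = True
    attributes: dict = field(default_factory=dict)   # e.g. priority, NFR targets
    children: list = field(default_factory=list)

root = Feature("PatientPortal", attributes={"priority": 1}, children=[
    Feature("Appointments", attributes={"priority": 1, "response_ms": 300}),
    Feature("Reminders", mandatory=False, attributes={"priority": 2}),
    Feature("Reports", mandatory=False, attributes={"priority": 3}),
])

def backlog(feature, acc=None):
    """Flatten the model into a priority-ordered product backlog."""
    acc = [] if acc is None else acc
    acc.append((feature.attributes.get("priority", 99), feature.name))
    for child in feature.children:
        backlog(child, acc)
    return sorted(acc)

for priority, name in backlog(root):
    print(priority, name)
```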

    A concrete product derivation in software product line engineering: a practical approach

    Software Product Lines enable the development of a family of products by systematically reusing shared assets. Product derivation is a critical activity in software product line engineering and one of the most pressing issues a software product line must address. This work introduces an approach for automating the derivation of a product from a software product line. The software product line is part of a product family that evolved from a non-structured approach to managing variability. The automated derivation approach relies on product configurations and the refactoring of feature models. The approach was deployed and evaluated in the automotive domain using a real-world software product line. The outcome demonstrates that the approach generates a product in an automated and successful manner. This work has been supported by FCT – Fundação para a Ciência e Tecnologia within the R&D Units Project Scope: UIDB/00319/2020.
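    A configuration-driven derivation step of the kind described can be pictured as validating a feature selection against the model's constraints and then collecting the assets mapped to the selected features. The sketch below is a hedged illustration only: the feature names, the cross-tree constraint and the asset mapping are invented and do not describe the automotive product line studied in the paper.

```python
# Hedged sketch of configuration-based product derivation with placeholder
# features, constraints and assets.
FEATURES = {"Engine", "ParkAssist", "CruiseControl", "AdaptiveCruise"}
MANDATORY = {"Engine"}
REQUIRES = {"AdaptiveCruise": {"CruiseControl"}}          # cross-tree constraint
ASSETS = {
    "Engine": ["engine_ctrl.c"],
    "ParkAssist": ["park_assist.c"],
    "CruiseControl": ["cruise.c"],
    "AdaptiveCruise": ["acc.c"],
}

def derive(configuration: set) -> list:
    """Return the build assets for a valid configuration, or raise."""
    unknown = configuration - FEATURES
    if unknown:
        raise ValueError(f"unknown features: {unknown}")
    if not MANDATORY <= configuration:
        raise ValueError("mandatory features missing")
    for feature, needed in REQUIRES.items():
        if feature in configuration and not needed <= configuration:
            raise ValueError(f"{feature} requires {needed}")
    return [asset for feature in sorted(configuration) for asset in ASSETS[feature]]

print(derive({"Engine", "CruiseControl", "AdaptiveCruise"}))
```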

    FedCSD: A Federated Learning Based Approach for Code-Smell Detection

    This paper proposes a Federated Learning Code Smell Detection (FedCSD) approach that allows organizations to collaboratively train federated ML models while preserving their data privacy. These assertions are supported by three experiments that leverage three manually validated datasets aimed at detecting and examining different code smell scenarios. In experiment 1, a centralized training experiment, dataset two achieved the lowest accuracy (92.30%) with fewer smells, while datasets one and three achieved the highest accuracies with a slight difference (98.90% and 99.5%, respectively). Experiment 2 performed cross-evaluation: each ML model was trained on one dataset and then evaluated on the other two. Results from this experiment show a significant drop in model accuracy (lowest accuracy: 63.80%) where fewer smells exist in the training dataset, which is noticeably reflected, as technical debt, in the model's performance. Finally, the third experiment evaluates our approach by splitting the dataset across 10 companies: the ML model was trained at each company's site, and all updated model weights were transferred to the server. Ultimately, the global model trained across the 10 companies for 100 training rounds achieved an accuracy of 98.34%. The results reveal a slight difference between the global model's accuracy and the highest accuracy of the centralized model, which can be ignored in favour of the global model's comprehensive knowledge, lower training cost, preservation of data privacy, and avoidance of the technical debt problem. Comment: 17 pages, 7 figures, journal paper.
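    The third experiment follows the usual federated-averaging pattern: each site refines the current global weights on its own data, and the server averages the returned weights each round. The sketch below reproduces that loop with a synthetic logistic-regression task as a stand-in for the paper's code-smell classifier; the data, model and hyperparameters are all illustrative assumptions.

```python
# Minimal federated-averaging sketch: 10 sites, weights averaged at a server.
# The linear model and synthetic data replace the paper's smell datasets.
import numpy as np

rng = np.random.default_rng(0)
n_features, n_clients, rounds, lr = 8, 10, 100, 0.1

# Synthetic per-company datasets (placeholder for the validated smell datasets).
true_w = rng.normal(size=n_features)
clients = []
for _ in range(n_clients):
    X = rng.normal(size=(200, n_features))
    y = (X @ true_w + 0.1 * rng.normal(size=200) > 0).astype(float)
    clients.append((X, y))

def local_update(w, X, y, epochs=5):
    """A few epochs of logistic-regression gradient descent on one site."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        w = w - lr * X.T @ (p - y) / len(y)
    return w

global_w = np.zeros(n_features)
for _ in range(rounds):
    updates = [local_update(global_w, X, y) for X, y in clients]
    global_w = np.mean(updates, axis=0)          # FedAvg: server-side averaging

correct = np.concatenate([(X @ global_w > 0).astype(float) == y for X, y in clients])
print(f"global model accuracy on pooled local data: {correct.mean():.3f}")
```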

    Creating a common priority vector in intuitionistic fuzzy AHP: a comparison of entropy-based and distance-based models

    In the case of conflicting individuals or evaluator groups, finding the common preferences of the participants is a challenging task. This also applies to Intuitionistic Fuzzy Analytic Hierarchy Process models, in which the uncertainty of individual scoring is well handled, yet the aggregation of the modified scores is generally conducted in the conventional way of multi-criteria decision-making. This paper offers two options for this aggregation: the relatively well-known entropy-based aggregation and the recently emerged distance-based aggregation. The manuscript can be considered pioneering work in analyzing the nature of distance-based aggregation in a fuzzy environment. In the proposed model, three clearly separable conflicting groups are examined, and the objective is to find a common priority vector that is satisfactory to all participant clusters. We tested the model on a real-world case study, a public transport development decision-making problem, by conducting a large-scale survey involving three different stakeholder groups of transportation. The comparison of the approaches shows that both the entropy-based and the distance-based techniques can provide a feasible solution, given their high similarity in the final ordinal and cardinal outcomes.
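    To make the two aggregation options concrete, the sketch below aggregates three crisp group priority vectors in two ways: an entropy-based rule that weights each group by how decisive (low-entropy) its vector is, and a distance-based rule computed as the geometric median via Weiszfeld iteration. Both are simplified stand-ins; the paper's formulation operates on intuitionistic fuzzy values and may differ in detail.

```python
# Illustrative numeric sketch: crisp priority vectors stand in for the
# intuitionistic fuzzy values, and both aggregation rules are simplified.
import numpy as np

groups = np.array([
    [0.40, 0.30, 0.20, 0.10],   # group 1 priority vector over four criteria
    [0.10, 0.20, 0.30, 0.40],   # group 2
    [0.25, 0.25, 0.25, 0.25],   # group 3
])

def entropy_based(vectors):
    """Weight each group by how decisive (low-entropy) its vector is."""
    entropy = -np.sum(vectors * np.log(vectors), axis=1) / np.log(vectors.shape[1])
    weights = 1.0 - entropy
    weights = np.full(len(vectors), 1.0 / len(vectors)) if weights.sum() == 0 else weights / weights.sum()
    common = weights @ vectors
    return common / common.sum()

def distance_based(vectors, iters=200):
    """Geometric median (Weiszfeld iteration): minimises total distance to the groups."""
    common = vectors.mean(axis=0)
    for _ in range(iters):
        d = np.linalg.norm(vectors - common, axis=1)
        d = np.where(d < 1e-12, 1e-12, d)
        common = (vectors / d[:, None]).sum(axis=0) / (1.0 / d).sum()
    return common / common.sum()

print("entropy-based :", np.round(entropy_based(groups), 3))
print("distance-based:", np.round(distance_based(groups), 3))
```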

    New trends on photoelectrocatalysis (PEC): nanomaterials, wastewater treatment and hydrogen generation

    The need for novel water treatment technologies has recently been recognised, as contaminants of concern (organics and pathogens) are resilient to standard technologies. Advanced oxidation processes degrade organics and inactivate microorganisms via generated reactive oxygen species (ROS). Among them, heterogeneous photocatalysis may suffer from reduced efficiency due to fast electron-hole pair recombination in the photoexcited semiconductor and the reduced effective surface area of immobilised photocatalysts. To overcome these limitations, the process can be electrically assisted by applying an external bias to an electrically conductive support for the photocatalyst connected to a counter electrode; this is known as photoelectrocatalysis (PEC). Compared to photocatalysis, PEC increases the efficiency of ROS generation because the applied electrical bias prevents charge recombination between photogenerated electron-hole pairs. This review presents recent trends, challenges, nanomaterials and different water applications of PEC (degradation of organic pollutants, disinfection, and generation of hydrogen from wastewater).

    Bench-scale photoelectrocatalytic reactor utilizing rGO-TiO2 photoanodes for the degradation of contaminants of emerging concern in water

    Pharmaceuticals and personal care products are contaminants of emerging concern (CECs) in water. Photocatalysis (PC) and photoelectrocatalysis (PEC) are potential advanced oxidation processes for the effective degradation of these contaminants. In this work, a bench-scale photoelectrocatalytic reactor utilizing a UVA-LED array was designed and tested for the degradation of diclofenac as a model CEC. A reduced graphene oxide-titanium dioxide (rGO-TiO2) composite, prepared by the photocatalytic reduction of rGO on TiO2, was immobilised on fluorine-doped tin oxide (FTO) glass and evaluated as a photoanode. The influence of UVA intensity and the rGO:TiO2 ratio on the degradation rate was studied. Surface modification of the TiO2 with 1% rGO gave the highest photocurrent and the best diclofenac degradation rate compared to unmodified TiO2. However, following repeated cycles of photoelectrocatalytic treatment, the photocurrent of the rGO-TiO2 anodes dropped and the rate of diclofenac degradation decreased. Raman and XPS analysis indicated re-oxidation of the rGO. Attempts to regenerate the rGO in situ by electrochemical reduction were unsuccessful, suggesting that the site of photoelectrocatalytic oxidation of the rGO differed from the reduction site targeted during the photocatalytic reduction used to form the rGO-TiO2 composites.

    Exploring the design space of nonlinear shallow arches with generalised path-following

    The classic snap-through problem of shallow arches is revisited using the so-called generalised path-following technique. Classical buckling theory is a popular tool for designing structures prone to instabilities, albeit with limited applicability as it assumes a linear pre-buckling state. While incremental-iterative nonlinear finite element methods are more accurate, they are considered too complex and costly for parametric studies. In this regard, a powerful approach for exploring the entire design space of nonlinear structures is the generalised path-following technique. Within this framework, a nonlinear finite element model is coupled with a numerical continuation solver to provide an accurate and robust way of evaluating multi-parametric structural problems. The capabilities of this technique are exemplified here by studying the effects of four different parameters on the structural behaviour of shallow arches, namely mid-span transverse loading, arch rise height, distribution of cross-sectional area along the span, and total volume of the arch. In particular, the distribution of area has a pronounced effect on the nonlinear load-displacement response and can therefore be used effectively for elastic tailoring. Most importantly, we illustrate the risks entailed in optimising the shape of arches using linear assumptions, which arise because the design drivers influencing linear and nonlinear designs are in fact topologically opposed.
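    The idea of path-following can be illustrated on a single degree of freedom: a predictor step along the path tangent followed by a Newton corrector constrained to a fixed arc-length increment traces the equilibrium path through both limit points of a snap-through response. The sketch below uses a toy cubic load-displacement law in place of the paper's nonlinear finite element model; all quantities are illustrative assumptions.

```python
# Hedged 1-DOF sketch of pseudo-arclength continuation through snap-through.
# The cubic force law is a toy stand-in for a nonlinear FE model.
import numpy as np

def internal_force(u):          # toy load-displacement law with two limit points
    return u**3 - 1.5 * u**2 + 0.5 * u

def stiffness(u):               # its derivative (tangent "stiffness")
    return 3 * u**2 - 3.0 * u + 0.5

def continuation(steps=200, ds=0.02):
    u, lam = 0.0, 0.0
    tangent = np.array([1.0, stiffness(0.0)])
    tangent /= np.linalg.norm(tangent)
    path = [(u, lam)]
    for _ in range(steps):
        # predictor: step along the previous unit tangent
        u_new, lam_new = u + ds * tangent[0], lam + ds * tangent[1]
        for _ in range(20):                      # Newton corrector
            residual = np.array([
                internal_force(u_new) - lam_new,
                tangent[0] * (u_new - u) + tangent[1] * (lam_new - lam) - ds,
            ])
            if np.linalg.norm(residual) < 1e-10:
                break
            jacobian = np.array([[stiffness(u_new), -1.0], tangent])
            u_new, lam_new = np.array([u_new, lam_new]) - np.linalg.solve(jacobian, residual)
        # new tangent, oriented consistently so the path does not double back
        new_tangent = np.array([1.0, stiffness(u_new)])
        new_tangent /= np.linalg.norm(new_tangent)
        if new_tangent @ tangent < 0:
            new_tangent = -new_tangent
        u, lam, tangent = u_new, lam_new, new_tangent
        path.append((u, lam))
    return np.array(path)

path = continuation()
d_lambda = np.sign(np.diff(path[:, 1]))
turning = np.where(d_lambda[:-1] != d_lambda[1:])[0] + 1   # limit (snap) points
print("approximate limit points (u, lambda):")
print(np.round(path[turning], 3))
```

    In a multi-parameter setting, the same predictor-corrector machinery is re-used with extra constraint equations, which is what allows whole design spaces to be swept rather than single load-displacement curves.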

    Hospital Admissions Due to Ischemic Heart Diseases and Prescriptions of Cardiovascular Diseases Medications in England and Wales in the Past Two Decades

    Objectives: The aim of this study was to explore the trends in ischemic heart disease (IHD) admissions and in prescriptions of IHD medications in England and Wales. Methods: A secular trends study was conducted covering the period 1999 to 2019. We extracted hospital admission data for patients of all age groups from the Hospital Episode Statistics database in England and the Patient Episode Database for Wales. Prescriptions of IHD medications were extracted from the Prescription Cost Analysis database from 2004 to 2019. The chi-squared test was used to assess the difference between admission rates and between IHD medication prescription rates. Trends in IHD-related hospital admissions and IHD-related medication prescriptions were assessed using a Poisson model. The correlation between hospital admissions for IHD and IHD medication prescriptions was assessed using the Pearson correlation coefficient. Results: Our study detected a significant increase in the rate of cardiovascular disease (CVD) medication prescriptions in England and Wales, a rise of 41.8% (from 539,334.95 (95% CI = 539,286.30–539,383.59) prescriptions per 100,000 persons in 2004 to 764,584.55 (95% CI = 764,545.55–764,623.56) in 2019), with a mean increase of 2.8% per year over the past 15 years. This increase was accompanied by a reduction in the IHD hospital admission rate of 15.4% (from 838.50 (95% CI = 836.05–840.94) per 100,000 persons in 2004 to 709.78 (95% CI = 707.65–711.92) in 2019, trend test, p < 0.01), a mean decrease of 1.02% per year over the past 15 years, and of 5% (from 747.43 (95% CI = 745.09–749.77) per 100,000 persons in 1999 to 709.78 (95% CI = 707.65–711.92) in 2019, trend test, p < 0.01), a mean decrease of 0.25% per year over the past two decades, in England and Wales. Conclusion: The rate of hospitalisation due to IHD has decreased in England and Wales during the past two decades. Hospitalisation due to IHD was strongly and negatively correlated with the increase in the rate of dispensing of IHD-related medications. Other factors contributing to this decline could include improved control of IHD risk factors during the past few years. Future studies exploring other risk factors associated with IHD hospitalisation are warranted.
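    The headline figures quoted above reduce to simple rate arithmetic, and the reported association is a Pearson correlation between two yearly series. The sketch below reproduces that arithmetic using only the 2004 and 2019 endpoint rates given in the abstract; the yearly series used for the correlation is synthetic and serves only to illustrate the calculation, not to reproduce the study's data.

```python
# Descriptive statistics sketch: endpoint rates are from the abstract; the
# yearly series for the correlation is synthetic and purely illustrative.
import numpy as np

def percent_change(first, last):
    return 100.0 * (last - first) / first

rx_2004, rx_2019 = 539_334.95, 764_584.55      # CVD prescriptions per 100,000
adm_2004, adm_2019 = 838.50, 709.78            # IHD admissions per 100,000
years = 2019 - 2004

print(f"prescription rate change: {percent_change(rx_2004, rx_2019):5.1f}% "
      f"({percent_change(rx_2004, rx_2019) / years:4.2f}% per year)")
print(f"admission rate change:    {percent_change(adm_2004, adm_2019):5.1f}% "
      f"({percent_change(adm_2004, adm_2019) / years:4.2f}% per year)")

# Pearson correlation between the two (synthetic) yearly series.
rx_series = np.linspace(rx_2004, rx_2019, years + 1) + np.random.default_rng(1).normal(0, 2_000, years + 1)
adm_series = np.linspace(adm_2004, adm_2019, years + 1) + np.random.default_rng(2).normal(0, 5, years + 1)
r = np.corrcoef(rx_series, adm_series)[0, 1]
print(f"Pearson r between prescribing and admissions: {r:.2f}")
```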