    A guide to pre-processing high-throughput animal tracking data

    1. Modern, high-throughput animal tracking studies collect increasingly large volumes of data at very fine temporal scales. At these scales, location error can exceed the animal’s step size, leading to mis-estimation of key movement metrics such as speed. ‘Cleaning’ the data to reduce location errors prior to analysis is one of the main ways movement ecologists deal with noisy data, and has the advantage of being more scalable to massive datasets than more complex methods. Though data cleaning is widely recommended, and ecologists routinely treat cleaned data as the ground truth, inclusive, uniform guidance on this crucial step, and on how to organise the cleaning of massive datasets, is still rather scarce. 2. A pipeline for cleaning massive high-throughput datasets must balance ease of use with computationally efficient signal-versus-noise screening, in which location errors are rejected without discarding valid animal movements. Another useful feature of a pre-processing pipeline is efficient segmentation and clustering of location data for statistical methods, while remaining scalable to large datasets and robust to imperfect sampling. Because manual methods are prohibitively time consuming, and to boost reproducibility, a robust pre-processing pipeline must be automated. 3. In this article we provide guidance on building pipelines for pre-processing high-throughput animal tracking data in order to prepare it for subsequent analysis. Our recommended pipeline, consisting of removing outliers, smoothing the filtered result, and thinning it to a uniform sampling interval, is applicable to many massive tracking datasets. We apply this pipeline to simulated movement data with location errors, and also present a case study of how large volumes of cleaned data can be transformed into biologically meaningful ‘residence patches’ for quick biological inference on animal space use.
We use calibration data to illustrate how pre-processing improves data quality, and to verify that the residence patch synthesis accurately captures animal space use. Finally, turning to tracking data from Egyptian fruit bats (Rousettus aegyptiacus), we demonstrate the pre-processing pipeline and residence patch method in a fully worked-out example. 4. To support fast implementation of standardised methods, we developed the R package atlastools, which we also introduce here. Our pre-processing pipeline and atlastools can be used with any high-throughput animal movement data in which the high data volume, combined with knowledge of the tracked individuals’ movement capacity, can be used to reduce location errors. The atlastools functions are easy to use for beginners, while providing a template for further development. The use of common pre-processing steps that are simple yet robust promotes standardised methods in the field of movement ecology and leads to better inferences from data.
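The three pipeline steps named above (filter outliers, smooth, thin to a uniform interval) can be sketched generically. The function below is an illustrative Python sketch, not the atlastools API; the speed threshold, smoothing window, and thinning interval are parameters the analyst must choose from knowledge of the tracked species.

```python
import math
import statistics

def clean_track(points, max_speed, window=5, interval=10):
    """Illustrative filter -> smooth -> thin pipeline.
    `points` is a time-ordered list of (t, x, y) fixes."""
    # 1. Filter: drop fixes whose incoming speed exceeds the
    #    animal's plausible movement capacity.
    kept = [points[0]]
    for t, x, y in points[1:]:
        pt, px, py = kept[-1]
        dt = t - pt
        speed = math.hypot(x - px, y - py) / dt if dt > 0 else float("inf")
        if speed <= max_speed:
            kept.append((t, x, y))

    # 2. Smooth: rolling median of x and y to damp residual noise
    #    (medians resist any outliers the filter missed).
    half = window // 2
    smoothed = []
    for i, (t, _, _) in enumerate(kept):
        lo, hi = max(0, i - half), min(len(kept), i + half + 1)
        smoothed.append((t,
                         statistics.median(p[1] for p in kept[lo:hi]),
                         statistics.median(p[2] for p in kept[lo:hi])))

    # 3. Thin: aggregate to a uniform sampling interval by
    #    averaging the fixes that fall in each time bin.
    bins = {}
    for t, x, y in smoothed:
        bins.setdefault(int(t // interval), []).append((x, y))
    return [(b * interval,
             statistics.fmean(p[0] for p in pts),
             statistics.fmean(p[1] for p in pts))
            for b, pts in sorted(bins.items())]
```

On a track sampled once per second, a gross location error appears as a single fix implying an impossible speed; step 1 removes it before the median smoother and the thinning average ever see it, which is why the ordering of the three steps matters.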

    Integration of Virtual Programming Lab in a process of teaching programming EduScrum based

    Programming teaching is a key factor in technological evolution. The most efficient way to learn to program is by programming and hard training, and thus feedback is a crucial factor in the success and flow of the process. This work aims to analyse the potential use of the Virtual Programming Lab (VPL) in the teaching of programming in higher education. It also intends to verify whether, with VPL, it is possible to make students’ learning more effective and autonomous, with a reduction in the volume of assessment work for teachers. Experiments were carried out with the VPL in the practical-laboratory classes of an introductory programming course at a higher education institution. The results, supported by the responses to surveys, point to the validity of the model.

    Effect of retail service quality on switching intentions among hypermarket customers

    Retail service quality is a vital driver of customer satisfaction, which in turn promotes customer loyalty and reduces switching intentions. Based on disconfirmation theory, the difference between expectations and delivered service quality determines the level of customer satisfaction. Service quality is thus a means to build customer satisfaction, which can lead to customer loyalty and hence reduce switching intentions. The concept of switching intentions has received significant attention in the field of marketing; however, little is known about its application in the context of retail business. Consumer research has neither verified the relationships among constructs such as retail service quality, customer satisfaction, customer loyalty and switching intentions in a single framework, nor explored the possible influence of store ethnicity and price discounts on the links from customer satisfaction and customer loyalty to switching intentions. The current study investigated the interrelationships among service quality, customer satisfaction, customer loyalty and switching intentions, and the moderating roles of price discounts and store ethnicity, in a single framework. Random sampling was used, administering standardized questionnaires personally to 450 hypermarket customers in the Eastern Province of Saudi Arabia. The quantitative data were analyzed with the structural equation modeling technique using AMOS 20 software. The study extended the existing body of knowledge by introducing price discounts and store ethnicity as new moderators of the relationships between satisfaction and switching intentions, and between loyalty and switching intentions. The results confirmed that retail service quality has a significant positive influence on customer satisfaction, and that customer satisfaction has a positive effect on customer loyalty.
Besides that, the study verified that store ethnicity and price discounts act as moderating mechanisms explaining the switching intentions of satisfied and loyal customers. The results of the study may serve as a guideline for top managers of hypermarkets in designing appropriate policies and strategies in terms of retail service quality, price discounts and the needs of ethnic groups in a particular region. This will help to enhance customer satisfaction and customer loyalty, hence reducing customers’ switching intentions.

    DESIGN AND EXPLORATION OF NEW MODELS FOR SECURITY AND PRIVACY-SENSITIVE COLLABORATION SYSTEMS

    Collaboration has been an area of interest in many domains, including education, research, the healthcare supply chain, the Internet of Things, and music. It enhances problem solving through expertise sharing, idea sharing, learning, resource sharing, and improved decision making. To address the limitations in the existing literature, this dissertation presents a design science artifact and a conceptual model for collaborative environments. The first artifact is a blockchain-based collaborative information exchange system that utilizes blockchain technology and semi-automated ontology mappings to enable secure and interoperable health information exchange among different health care institutions. The conceptual model proposed in this dissertation explores the factors that influence professionals’ continued use of video-conferencing (VC) applications. It investigates the roles that perceived risks and benefits play in influencing professionals’ attitudes towards VC apps and, consequently, their active and automatic use.

    Electrochemical and mass spectrometry methods for identification of gunshot residues (GSR) in forensic investigations

    Gun violence continues to be one of the most significant challenges straining US society, causing thousands of lost human lives every year. In 2020 alone, firearm-related incidents, including homicides, accidents, and suicides, reached a staggering number of over 43,000 [1,2]. With the increase in these types of incidents, several service areas in crime laboratories are heavily impacted by the number of cases run on a yearly basis. These include firearm examinations, gunshot residue (GSR) analysis, bullet hole identification, and shooting distance determination, which are crucial to supporting criminal investigations and, overall, the justice system in our country. These areas are valuable for reconstructing firearm-related incidents and evaluating the evidence under source-level (GSR present or absent) or activity-level (fired a gun, or was in the vicinity of the firing) propositions. GSR particles are evaluated by single-particle morphological and elemental analysis (e.g., lead, barium, and antimony) using Scanning Electron Microscopy with Energy Dispersive Spectroscopy (SEM-EDS), following the ASTM E1588-20 method [3-6]. In addition to SEM-EDS, color tests are currently used for distance determination as per the recommendations of the Scientific Working Group for Firearms and Toolmarks (SWGGUN) for nitrites, lead, barium, and copper [7-9]. Our research group has focused its attention on the development of emerging analytical tools that facilitate the detection of both inorganic (IGSR) and organic (OGSR) gunshot residues using electrochemistry (EC), along with data mining tools to support more objective data interpretation. This research aims to fill some of the gaps in existing technologies, such as color tests, by offering faster, complementary methods that decrease subjectivity, cost, and analysis time, and that aid triage and more cost-effective workflows at the crime scene and in the laboratory.
The complementary OGSR information is anticipated to cause a breakthrough in the GSR analysis paradigm and respond to the current OSAC recommendations for this specialized area of work [10-14]. To this end, innovative sampling methods for distance determination and bullet hole identification were investigated to simultaneously gain spatial and chemical information via electrochemical detection. In the case of distance determination, a set of 30 calibration samples and 45 unknown-distance clothing samples on various light, dark, patterned, and bloodstained fabrics was assessed to compare electrochemical performance against current techniques. Discriminant analysis was applied for the classification of the 45 unknowns, resulting in an electrochemical method accuracy of 74%, compared to 58% for color tests. Bullet hole identification was investigated on 59 fabrics and other alternative substrates commonly found at crime scenes, such as wood and drywall, to assess potential interferences and electrochemical performance at unknown shooting distances. The electrochemical methods successfully provided simultaneous detection of IGSR and OGSR with 98% overall accuracy, using calibration thresholds for positive identification. OGSR results were confirmed using our research group's previously validated OGSR solvent extraction and LC-MS/MS method. The transition toward portable technology prompted an investigation comparing the performance of portable and benchtop instrumentation for GSR analysis. A comparison of figures of merit and performance metrics found comparable limits of detection, precision, linear dynamic ranges, and error rates, with accuracies of 95.7% and 96.5% for identifying GSR using critical threshold analysis on the benchtop and portable potentiostats, respectively. Quick sample collection and screening allowed for fast electrochemical detection within 15 minutes for the bullet hole and distance applications.
The advantage of this methodology is that the analytical scheme can be easily incorporated within the current workflow (i.e., physical measurements, color tests, or SEM-EDS) to enhance reliability, owing to its non-destructive nature and its highly selective and sensitive characteristics. The conclusions of this work demonstrate the fitness-for-purpose of electrochemical detection, expanding from GSR analysis to distance determination and bullet hole identification, with fast detection on a low-cost platform for simultaneous IGSR and OGSR detection.
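The abstract above mentions "calibration thresholds for positive identification" and "critical threshold analysis" without stating the decision rule. One common convention sets the threshold from blank (negative-control) calibration signals as the mean plus three standard deviations; the sketch below illustrates that convention only. The function names and the choice k = 3 are assumptions for the example, not the study's actual method.

```python
import statistics

def decision_threshold(blank_signals, k=3.0):
    """Critical threshold from negative-control (blank) calibration
    measurements: mean + k * sample standard deviation. k = 3 is a
    common convention; the study's exact rule is not specified."""
    return statistics.mean(blank_signals) + k * statistics.stdev(blank_signals)

def classify(signal, threshold):
    """Binary call: GSR-positive if the measured electrochemical
    signal exceeds the calibration threshold."""
    return "GSR positive" if signal > threshold else "GSR negative"
```

With blank signals of roughly 1.0 (arbitrary units) and a small spread, the threshold lands a few standard deviations above the blank mean, so only samples with clearly elevated signals are called positive; raising k trades sensitivity for a lower false-positive rate.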

    Feasibility study of an Integrated Program for Aerospace-vehicle Design (IPAD) system. Volume 6: Implementation schedule, development costs, operational costs, benefit assessment, impact on company organization, spin-off assessment, phase 1, tasks 3 to 8

    A baseline implementation plan, including alternative implementation approaches for critical software elements and variants to the plan, was developed. The basic philosophy was aimed at: (1) a progressive release of capability for three major computing systems, (2) an end product that was a working tool, (3) participation by industry, government agencies, and universities, and (4) emphasis on the development of critical elements of the IPAD framework software. The results of these tasks indicate an IPAD first-release capability 45 months after go-ahead, a five-year total implementation schedule, and a total development cost of 2027 man-months and 1074 computer hours. Several areas of operational cost increase were identified, mainly due to the impact of additional equipment needed and additional computer overhead. The benefits of an IPAD system were related mainly to potential savings in engineering man-hours, reduction of design-cycle calendar time, and indirect upgrading of product quality and performance.

    Technological Advances in the Diagnosis and Management of Pigmented Fundus Tumours

    Choroidal naevi are the most common intraocular tumours. They can be pigmented or non-pigmented and have a predilection for the posterior uvea. The majority remain undetected and cause no harm, but they are increasingly found on routine community optometry examinations. Rarely does a naevus demonstrate growth or the onset of suspicious features fulfilling the criteria for malignant melanoma. Because of this very small risk, optometrists commonly refer these patients to hospital eye units for a second opinion, triggering specialist examination and investigation, causing significant anxiety to patients and stretching medical resources. This PhD thesis introduces the MOLES acronym and scoring system, devised to categorise the risk of malignancy in choroidal melanocytic tumours according to Mushroom tumour shape, Orange pigment, Large tumour size, Enlarging tumour and Subretinal fluid. This is a simplified system that can be used without sophisticated imaging, and hence its main utility lies in the screening of patients with pigmented choroidal lesions in the community and in general ophthalmology clinics. Under this system, lesions are categorised by the scoring system as ‘common naevus’, ‘low-risk naevus’, ‘high-risk naevus’ or ‘probable melanoma’. According to the sum total of the scores, the MOLES system correlates well with ocular oncologists’ final diagnoses. The thesis also describes a model of managing such lesions in a virtual pathway, showing that remote evaluation of choroidal naevus images, using a decision-making algorithm, by masked non-medical graders or masked ophthalmologists is safe. This work prospectively validates a virtual naevus clinic model with patient safety as the primary consideration.
The idea of a virtual naevus clinic as a fast, one-stop, streamlined and comprehensive service is attractive for patients and healthcare systems, offering an optimised patient experience with reduced delays and less inconvenience from repeated visits. A safe, standardised model ensures homogeneous management of cases and appropriate, prompt return of care closer to home, to community-based optometrists. This research work and its strategies, such as the MOLES scoring system for triage, could empower community-based providers to manage benign choroidal naevi without referral to specialist units. Based on the positive outcomes of this prospective study and the MOLES studies, a ‘Virtual Naevus Clinic’ has been designed and adapted at Moorfields Eye Hospital (MEH) to prove its feasibility, as a response to the COVID-19 pandemic and with the purpose of reducing in-hospital patient journey times and increasing the capacity of the naevus clinics, while providing safe and efficient clinical care for patients. This PhD chapter describes the design, pathways, and operating procedures for the digitally enabled naevus clinics at Moorfields Eye Hospital, including what the service provides and how it will be delivered and supported. The author will share the current experience and future plans. Finally, the PhD thesis will cover a chapter that discusses the potential role of artificial intelligence (AI) in differentiating benign choroidal naevus from choroidal melanoma. The published clinical and imaging risk factors for malignant transformation of choroidal naevus will be reviewed in the context of how AI, applied to existing ophthalmic imaging systems, might determine features on medical images in an automated way. The thesis will include current knowledge to date and describe the potential benefits, limitations and key issues that could arise with this technology in the ophthalmic field.
Regulatory concerns will be addressed, with possible solutions on how AI could be implemented in clinical practice and embedded into existing imaging technology, with the potential to improve patient care and the diagnostic process. The PhD will also explore the feasibility of developing automated deep learning models and investigate the performance of these models in diagnosing choroidal naevomelanocytic lesions based on medical imaging, including colour fundus and fundus autofluorescence photographs. This research aimed to determine the sensitivity and specificity of an automated deep learning algorithm used for binary classification to differentiate choroidal melanomas from choroidal naevi, and to prove that a differentiation concept utilising a machine learning algorithm is feasible.
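The MOLES system described in this abstract scores five features and categorises lesions by the sum. The sketch below assumes the published convention that each feature scores 0 (absent), 1 (uncertain) or 2 (definite), and that totals of 0, 1, 2 and 3 or more map to the four named categories; these cut-offs should be checked against the thesis itself.

```python
def moles_category(mushroom, orange_pigment, large_size, enlarging, subretinal_fluid):
    """MOLES risk category from the five feature scores.
    Each feature is scored 0 (absent), 1 (uncertain) or 2 (definite);
    the category cut-offs on the total are assumed here."""
    scores = (mushroom, orange_pigment, large_size, enlarging, subretinal_fluid)
    if any(s not in (0, 1, 2) for s in scores):
        raise ValueError("each feature must be scored 0, 1 or 2")
    total = sum(scores)
    if total == 0:
        return "common naevus"
    if total == 1:
        return "low-risk naevus"
    if total == 2:
        return "high-risk naevus"
    return "probable melanoma"
```

For example, a flat lesion with definite orange pigment and subretinal fluid but no other features would total 4 and be categorised as probable melanoma, triggering referral, whereas a lesion with a single uncertain feature would remain a low-risk naevus suitable for community monitoring.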

    A model for enhancing customer satisfaction for quality services and awareness through community participation

    Customer satisfaction as a means of measuring government performance has been at the top of global debates on developmental programs in local government. The background of this study highlighted the position of customer satisfaction in many municipal councils in the world, with special focus on Batu Pahat Municipal Council (BPMC). The research problem was that municipal awareness and community participation have not been harnessed for customer satisfaction with municipal councils’ service delivery. The aim and objectives of the study were to investigate the effects of Community Participation (CP) on customer satisfaction. Four research questions and five hypotheses were formulated to guide the study. The study population of 401,902 was defined as the group of service users directly involved in receiving the services provided by the BPMC. The sample for the study was 400 survey respondents from among the citizens living within BPMC. A structured questionnaire was presented to the respondents, selected through simple random sampling. Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA) were used to test the validity and reliability of the conceptual research model. The survey results supported three hypotheses (H1, H2, H3), confirming that community participation has a mediating effect on customer satisfaction in BPMC. Hypothesis five (H5) was supported in the main research, indicating that CP has a mediating effect on customer satisfaction, and also confirming the pilot study finding that municipal awareness has an effect on customer satisfaction. The unique finding of the study is that it has espoused the importance of CP as a mediator towards achieving customer satisfaction. This research has only been applied to BPMC, and further testing across different MCs in Malaysia is needed to generalise the findings.
The researcher concludes that community participation can enhance customer satisfaction through a comprehensive model at the grassroots level.