2,896 research outputs found

    EpiCollect+: linking smartphones to web applications for complex data collection projects.

    © 2014 Aanensen DM et al. Previously, we have described the development of the generic mobile phone data-gathering tool, EpiCollect, and an associated web application, providing two-way communication between multiple data gatherers and a project database. This software only allows data collection on the phone using a single questionnaire form that is tailored to the needs of the user (including a single GPS point and photo per entry), whereas many applications require a more complex structure, allowing users to link a series of forms in a linear or branching hierarchy, along with the addition of any number of media types accessible from smartphones and/or tablet devices (e.g., GPS, photos, videos, sound clips and barcode scanning). A much-enhanced version of EpiCollect has been developed (EpiCollect+). The individual data collection forms in EpiCollect+ provide more design complexity than the single form used in EpiCollect, and the software allows the generation of complex data collection projects through the ability to link many forms together in a linear (or branching) hierarchy. Furthermore, EpiCollect+ allows the collection of multiple media types as well as standard text fields, increased data validation and form logic. The entire process of setting up a complex mobile phone data collection project to the specification of a user (project and form definitions) can be undertaken at the EpiCollect+ website using a simple drag-and-drop procedure, with visualisation of the data gathered using Google Maps and charts at the project website. EpiCollect+ is suitable for situations where multiple users transmit complex data by mobile phone (or other Android devices) to a single project web database and is already being used for a range of field projects, particularly public health projects in sub-Saharan Africa. However, many other uses can be envisaged, from education, ecology and epidemiology to citizen science.
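    The linked-form idea can be sketched in a few lines (a hypothetical data model for illustration only; the class, method and field names below are invented and are not the actual EpiCollect+ project format):

```python
# Hypothetical sketch of a branching form hierarchy for a data collection
# project: each form can link to child forms, so a project is a tree of
# forms rather than a single questionnaire.

class Form:
    def __init__(self, name, fields):
        self.name = name        # form title shown to the data gatherer
        self.fields = fields    # field types: text, GPS, photo, barcode, ...
        self.children = []      # forms linked below this one

    def link(self, child):
        """Attach a child form and return it, so chains read naturally."""
        self.children.append(child)
        return child

def depth(form):
    """Number of form levels in the hierarchy rooted at `form`."""
    if not form.children:
        return 1
    return 1 + max(depth(c) for c in form.children)

# A linear chain: one household record, then people, then samples per person.
household = Form("household", ["gps", "photo"])
person = household.link(Form("person", ["text", "text"]))
sample = person.link(Form("sample", ["barcode", "photo"]))

print(depth(household))  # → 3
```

    Branching hierarchies fall out of the same structure: linking two child forms to `household` instead of one gives a tree rather than a chain.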

    Biological reference points for Atlantic surfclam (Spisula solidissima) in warming seas

    Atlantic surfclam (Spisula solidissima) are a large, commercially important shellfish in the United States faced with several important management challenges. Compared to many harvested fish and shellfish, their life history is relatively unknown. They are undergoing contraction in the southern and inshore parts of their range, as well as expansion into deeper water. Atlantic surfclam are thermally sensitive, and the changes in their distribution track changes in maximum bottom temperature. Sessile species cannot emigrate and are limited to recruitment and mortality as mechanisms for redistribution in response to changing climate. Management of Atlantic surfclam should account for these challenges. We describe a simulation designed to calculate biological reference points that will work well for Atlantic surfclam relative to biological and fishery goals, over a range of life history parameters, assessment uncertainties, and increases in temperature. Simulations of the trade-off between somatic growth and mortality under increased temperature led to target fishing mortality rates higher than the status quo, but also to increased variability in yield. Results suggest that increasing temperature may adversely affect the Atlantic surfclam industry, which prefers stable catches to short-term increases in yield, due to market limitations. The results of this analysis are specific to Atlantic surfclam, but the methods described here could be used to enhance management for other harvested species facing similar challenges.
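    The growth-mortality trade-off behind such reference points can be illustrated with a textbook yield-per-recruit calculation (a generic sketch, not the paper's simulation; the growth curve and mortality rates below are invented): raising natural mortality M, here a crude stand-in for thermal stress, moves the yield-maximizing fishing mortality F upward, consistent with the abstract's finding that warming led to target fishing mortality rates above the status quo.

```python
import numpy as np

# Toy equilibrium yield-per-recruit (YPR) curve with hypothetical parameters.
# Numbers-at-age decay at total mortality Z = F + M; catch-at-age follows the
# standard Baranov catch equation C = N * (F/Z) * (1 - exp(-Z)).

weight_at_age = 0.05 * np.arange(1, 21) ** 1.5   # invented growth curve (kg)

def ypr(F, M):
    """Equilibrium yield per recruit at fishing mortality F, natural mortality M."""
    ages = np.arange(20)
    numbers = np.exp(-(F + M) * ages)                     # survivors per recruit
    catch = numbers * (F / (F + M)) * (1 - np.exp(-(F + M)))
    return float(np.sum(catch * weight_at_age))

f_grid = np.linspace(0.01, 1.0, 100)
f_max_cool = f_grid[np.argmax([ypr(F, M=0.15) for F in f_grid])]
f_max_warm = f_grid[np.argmax([ypr(F, M=0.30) for F in f_grid])]
print(f_max_cool < f_max_warm)  # → True: higher M raises the YPR-maximizing F
```

    Intuitively, when natural mortality is high, fish left in the water are likely to die anyway, so harvesting earlier and harder wastes less growth potential.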

    Hidden Markov Models and their Application for Predicting Failure Events

    We show how Markov mixed membership models (MMMM) can be used to predict the degradation of assets. We model the degradation path of individual assets to predict overall failure rates. Instead of a separate distribution for each hidden state, we use hierarchical mixtures of distributions in the exponential family. In our approach the observation distribution of the states is a finite mixture of a small set of (simpler) distributions shared across all states. Using tied-mixture observation distributions offers several advantages: the mixtures act as a regularizer for typically very sparse problems, and they reduce the computational effort of the learning algorithm, since there are fewer distributions to be estimated. Using shared mixtures enables sharing of statistical strength between the Markov states, and thus transfer learning. We determine for individual assets the trade-off between the risk of failure and extended operating hours by combining an MMMM with a partially observable Markov decision process (POMDP) to dynamically optimize the policy for when and how to maintain the asset. Comment: published in the proceedings of ICCS 2020 as EasyChair Preprint no. 3183 (Paul Hofmann and Zaid Tashman, 2020).
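    The tied-mixture emission idea can be sketched in a few lines (an illustrative toy, not the authors' implementation; the states, shared components and weights below are invented): every hidden state draws from the same small pool of component distributions, and states differ only in their mixture weights over that pool.

```python
import numpy as np

# Tied-mixture emissions: three hidden degradation states share one pool of
# three Gaussian components; each state has its own weights over the pool.

means = np.array([0.0, 5.0, 10.0])   # shared component means (sensor units)
stds = np.array([1.0, 1.0, 2.0])     # shared component standard deviations

# Per-state mixture weights over the shared components (rows sum to 1).
W = np.array([
    [0.80, 0.15, 0.05],   # "healthy" state: mostly low readings
    [0.10, 0.70, 0.20],   # "degraded" state
    [0.05, 0.15, 0.80],   # "near-failure" state
])

def emission_prob(x, state):
    """p(x | state) under a tied mixture of the shared Gaussian components."""
    dens = np.exp(-0.5 * ((x - means) / stds) ** 2) / (stds * np.sqrt(2 * np.pi))
    return float(W[state] @ dens)

# A low sensor reading is most likely under the healthy state.
probs = [emission_prob(0.5, s) for s in range(3)]
print(int(np.argmax(probs)))  # → 0
```

    Because only the weight matrix `W` differs per state, adding states does not add component distributions, which is the source of the regularization and computational savings the abstract describes.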

    Survival of death certificate initiated registrations: selection bias, incomplete trace-back or higher mortality?

    Cases first notified to a Registry and successfully followed back have an apparently worse prognosis than cases registered in life. A simple approach can be used to assess whether this is due to selection bias, incomplete follow-back or intrinsically higher mortality. For the colorectal, breast and stomach cancers studied, and for comparable registries, the main explanations are likely to be selection bias and higher mortality.

    Quality determination and the repair of poor quality spots in array experiments.

    BACKGROUND: A common feature of microarray experiments is the occurrence of missing gene expression data. These missing values occur for a variety of reasons, in particular, because of the filtering of poor quality spots and the removal of undefined values when a logarithmic transformation is applied to negative background-corrected intensities. The efficiency and power of an analysis performed can be substantially reduced by having an incomplete matrix of gene intensities. Additionally, most statistical methods require a complete intensity matrix. Furthermore, biases may be introduced into analyses through missing information on some genes. Thus methods for appropriately replacing (imputing) missing data and/or weighting poor quality spots are required. RESULTS: We present a likelihood-based method for imputing missing data or weighting poor quality spots that requires a number of biological or technical replicates. This likelihood-based approach assumes that the data for a given spot arising from each channel of a two-dye (two-channel) cDNA microarray comparison experiment independently come from a three-component mixture distribution, the parameters of which are estimated through use of a constrained E-M algorithm. Posterior probabilities of belonging to each component of the mixture distributions are calculated and used to decide whether imputation is required. These posterior probabilities may also be used to construct quality weights that can down-weight poor quality spots in any analysis performed afterwards. The approach is illustrated using data obtained from an experiment to observe gene expression changes with 24 hr paclitaxel (Taxol) treatment on a human cervical cancer derived cell line (HeLa). CONCLUSION: As the quality of microarray experiments affects downstream processes, it is important to have a reliable and automatic method of identifying poor quality spots and arrays. We propose a method of identifying poor quality spots, and suggest a method of repairing the arrays by either imputation or assigning quality weights to the spots. This repaired data set would be less biased and can be analysed using any of the appropriate statistical methods found in the microarray literature.
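    The posterior-probability step can be sketched as follows (a minimal illustration with assumed, fixed mixture parameters rather than the paper's constrained E-M estimates): given a three-component mixture for log-intensities, the posterior of the "good signal" component acts as a spot quality weight.

```python
import numpy as np

# Three-component Gaussian mixture for one channel's log2 intensities:
# component 0 ~ background, 1 ~ poor-quality signal, 2 ~ good signal.
# All parameters below are invented for illustration.

weights = np.array([0.2, 0.1, 0.7])   # mixing proportions
means = np.array([4.0, 6.0, 10.0])    # component means (log2 scale)
stds = np.array([0.5, 1.5, 1.0])

def posteriors(x):
    """Posterior probability of each component given log-intensity x."""
    dens = np.exp(-0.5 * ((x - means) / stds) ** 2) / (stds * np.sqrt(2 * np.pi))
    joint = weights * dens
    return joint / joint.sum()

# The third entry is the quality weight: near 1 for a bright spot,
# near 0 for a spot indistinguishable from background (flag for imputation).
good_spot = posteriors(10.2)
bad_spot = posteriors(4.1)
print(good_spot[2] > 0.95, bad_spot[2] < 0.05)  # → True True
```

    In the actual method the parameters are fitted per array by constrained E-M across replicates; here they are fixed only to show how the posteriors separate usable from unusable spots.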

    An Overview Of Factors Affecting Distribution Of The Atlantic Surfclam (Spisula solidissima), A Continental Shelf Biomass Dominant, During A Period Of Climate Change

    The Atlantic surfclam (Spisula solidissima) is a dominant member of the biological community of the Middle Atlantic Bight continental shelf and a commercially harvested species. Climate warming is affecting the biology and distribution of this species, which provides an opportunity to investigate the processes and conditions that are restructuring this fishery and the implications for ecological and socioeconomic systems. A Management Strategy Evaluation (MSE) developed for the surfclam fishery provides a mechanistic description of the surfclam's response to climate change and understanding of the cascade of effects initiated by changes in oceanographic conditions that ultimately appear as social and economic effects. This understanding in turn informs development of management policies for the resource. This overview considers the components of the surfclam MSE, relevant results, and implications for management and policy. The lessons learned from the surfclam MSE provide a basis for applying similar approaches to other ecologically important species that are also commercially exploitable resources.

    The Exposed Surface Area To Volume Ratio: Is Shell More Efficient Than Limestone In Promoting Oyster Recruitment?

    Planting oyster cultch is a common management approach used to enhance recruitment. The two most popular cultch materials are shell and limestone. Both are sold by volume or weight; however, once deposited on oyster grounds, only a small portion of the total surface area of each particle is available for recruitment. Shell and limestone have different surface area to volume properties, and thus provide differential settlement opportunities. Exposed surface area to volume (expSA/V) ratios of oyster shell and limestone fragments were compared, as an indicator of their recruitment potential and cost-effectiveness for cultch planting. Samples were collected from the Primary Public Oyster Seed Grounds in Louisiana by vibracore, and from the Pass Christian Tonging Grounds in Mississippi by dredge. Shell (including whole shells and fragments) and limestone particles greater than or equal to 8 mm were classified by geometric shape and their expSA/V ratios were calculated. Mean expSA/V ratios of shell were approximately three to nine times higher than limestone. For limestone of similar particle size to provide an equivalent recruitment benefit for the same cost would require that the cost of purchase, transport, and planting be three to nine times lower than shell. Thus, shell is likely to be a more efficient material than limestone for recruitment enhancement. Nevertheless, the higher variability in expSA/V of shell and other factors, such as the expected lifetime and the relative performance of small and large particles of materials, should also be considered. Analysis of a Louisiana limestone plant and associated oyster cultch showed that the proportion of small and large limestone particles and the relative proportion of whole shells and fragments can greatly alter expSA/V. In this case, the a priori expectation that oyster shell would outperform limestone did not materialize because of the quantity of small limestone particles of favorable shapes in the deployed material. Even so, as yet unknown is the possible reduction in performance in situ of smaller particles that might occur if they increase the one-dimensionality of the plant.
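    The surface-area-to-volume contrast can be illustrated with idealized particle geometries (hypothetical dimensions chosen for illustration; the study measured and classified real particles, and used exposed rather than total surface area): a thin plate stands in for a shell fragment and a sphere for a rounded limestone particle.

```python
import math

# Total SA/V for two idealized cultch particle shapes (mm^-1).
# The study's expSA/V counts only the exposed (unburied) portion; total SA/V
# is used here purely to show why flat shell beats rounded limestone.

def plate_sa_over_v(length, width, thickness):
    """SA/V of a rectangular plate, approximating a flat shell fragment."""
    sa = 2 * (length * width + length * thickness + width * thickness)
    return sa / (length * width * thickness)

def sphere_sa_over_v(diameter):
    """SA/V of a sphere, approximating a rounded limestone particle (= 6/d)."""
    r = diameter / 2
    return (4 * math.pi * r ** 2) / ((4 / 3) * math.pi * r ** 3)

shell = plate_sa_over_v(40, 30, 2)   # a 40 x 30 x 2 mm shell fragment
limestone = sphere_sa_over_v(20)     # a 20 mm limestone pebble
print(round(shell / limestone, 1))   # → 3.7
```

    A ratio of roughly 3.7 for these invented dimensions sits inside the three-to-nine-fold range the study reports; thinner fragments push the ratio toward the high end.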

    Perspectives and Practices of Athletic Trainers and Team Physicians Implementing the 2010 NCAA Sickle Cell Trait Screening Policy

    Sickle cell trait (SCT) is usually benign. However, there are some conditions that may lead to SCT-related problems and put athletes with the trait at particular risk. In 2010 the National Collegiate Athletic Association (NCAA) issued a policy that required all Division I (DI) student-athletes to confirm their SCT status or sign a liability waiver to opt out of testing. Athletic trainers and team physicians play key roles in the policy's implementation, and we examined their perceptions and practices. Between December 2013 and March 2014 we interviewed 13 head athletic trainers and team physicians at NCAA Division I colleges and universities in North Carolina. We used an interview guide with open-ended questions covering knowledge of SCT, historical screening and education practices, current implementation, and policy benefits and challenges. Participants were knowledgeable about SCT and thought the policy was beneficial in providing SCT health information to and for student-athletes. Schools varied in provision of genetic counseling, offering the waiver, SCT tests administered, and other aspects. Challenges included insufficient guidance from the NCAA, financial considerations, and misunderstanding of the relationships of race and ancestry to SCT risk. Athletic staff found the policy valuable, but felt it needs clarity and standardization.

    Recognition memory, self-other source memory, and theory-of-mind in children with autism spectrum disorder.

    This study investigated semantic and episodic memory in autism spectrum disorder (ASD), using a task which assessed recognition and self-other source memory. Children with ASD showed undiminished recognition memory but significantly diminished source memory, relative to age- and verbal ability-matched comparison children. Both children with and without ASD showed an “enactment effect”, demonstrating significantly better recognition and source memory for self-performed actions than other-person-performed actions. Within the comparison group, theory-of-mind (ToM) task performance was significantly correlated with source memory, specifically for other-person-performed actions (after statistically controlling for verbal ability). Within the ASD group, ToM task performance was not significantly correlated with source memory (after controlling for verbal ability). Possible explanations for these relations between source memory and ToM are considered.