96 research outputs found

    The role of oxygen in the vertical distribution of nematodes: an experimental approach

    The role of oxygen in the vertical distribution of nematodes was investigated in an experiment in which different oxygen conditions were imposed on sediment from an intertidal area of the Oosterschelde (The Netherlands). To test our hypothesis that the vertical distribution of the nematode assemblages was not influenced by changing oxygen conditions (i.e. that nematodes do not migrate towards favourable oxygen conditions), 5 cm of sediment was inverted and incubated for 5 days in the lab. In a first treatment, food (diatoms) was added to the bottom; in a second treatment, oxygen and food were added to the bottom. For both treatments and a control, fresh, well-aerated Oosterschelde water was added on top of the sediment. Analysis of the field situation showed that nematodes were the most abundant taxon. The highest densities were observed in the subsurface sediment layer (1-2 cm). The lower abundance in the oxygen- and algae-rich superficial layer (0-0.5 cm) could be due to the time of sampling relative to the tides or to biotic factors (e.g. macrofaunal activity). The vertical distributions of the nematode assemblages in the experimental and control treatments proved to be significantly different. An obvious segregation existed between the nematode species assemblage of the superficial layer (0-0.2 cm) and that of the deeper layers (0.2-1 cm and 4-5 cm). Characteristic genera for the superficial sediment layers were Daptonema, Ptycholaimellus, Prochromadorella and Microlaimus; for the deeper layers, Terschellingia and Microlaimus. The occurrence of the first species assemblage is determined by the presence of free oxygen. The second species assemblage is adapted to the reduced sediment; nevertheless, artificial addition of limited amounts of oxygen to the deeper sediment layers favoured this assemblage, as higher abundances were recorded there. In general, oxygen appears to be important in determining the vertical distribution of nematodes in this experiment.

    Subcutaneous vitamin B12 administration using a portable infusion pump in cobalamin-related remethylation disorders: a gentle and easy to use alternative to intramuscular injections

    BACKGROUND Cobalamin (cbl)-related remethylation disorders are a heterogeneous group of inherited disorders affecting the remethylation of homocysteine to methionine and involving multiple organ systems, most prominently the nervous system and the bone marrow. To date, parenteral, generally intramuscular, lifelong administration of hydroxycobalamin (OHCbl) is the mainstay of therapy in these disorders. The dosage and frequency of OHCbl are titrated in each patient to the minimum effective dose in order to limit the burden of painful injections. This may result in undertreatment, a possible risk factor for disease progression and disease-related complications. RESULTS We describe parenteral administration of OHCbl using a subcutaneous catheter together with a portable infusion pump in a home therapy setting in four pediatric patients with remethylation disorders (two patients with cblC, one with cblG, and one with cblE deficiency) in whom intramuscular injections were not or no longer feasible. The placement of the subcutaneous catheters and the handling of the infusion pump were readily accomplished and well accepted by the patients and their families. No adverse events occurred. The use of a small, portable syringe driver pump allowed highly flexible administration of OHCbl in everyday life. Total homocysteine concentrations were determined at regular patient visits and remained within the therapeutic target range. This approach allowed the continuation of OHCbl therapy, or the adjustment of therapy required to improve metabolic control, in our patients. CONCLUSIONS Subcutaneous infusion of OHCbl via a subcutaneous catheter system and a portable pump in combined and isolated remethylation disorders is safe, acceptable, and effective. It decreases disease burden by avoiding frequent single injections and increasing patient independence. Thus, it may promote long-term adherence to therapy in patients and parents.

    Time-to-birth prediction models and the influence of expert opinions

    Preterm birth is the leading cause of death among children under five years old. The pathophysiology and etiology of preterm labor are not yet fully understood. This leads to a large number of unnecessary hospitalizations due to high-sensitivity clinical policies, which has a significant psychological and economic impact. In this study, we present a predictive model, based on a new dataset containing information on 1,243 admissions, that predicts whether a patient will give birth within a given time after admission. Such a model could provide support in the clinical decision-making process. Predictions for birth within 48 h or 7 days after admission yield an Area Under the Receiver Operating Characteristic Curve (AUC) of 0.72 for both tasks. Furthermore, we show that incorporating predictions made by experts at admission, which introduces a potential bias, increases the prediction effectiveness to AUC scores of 0.83 and 0.81 for these respective tasks.
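
    The abstract does not state which classifier was used, so the following is only a hedged sketch of the evaluation setup it describes: the feature matrix, the outcome distribution and the logistic-regression baseline are all assumptions for illustration. The task is framed as a binary label "birth within the horizon" and scored with the AUC for the 48-hour and 7-day horizons.

        # Hypothetical sketch of the evaluation: binary "birth within horizon" labels scored by AUC.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 1243                                               # admissions, as in the abstract
        X = rng.normal(size=(n, 10))                           # placeholder admission features
        hours_to_birth = rng.exponential(scale=200.0, size=n)  # synthetic outcome

        for horizon_h, label in [(48, "48 h"), (168, "7 days")]:
            y = (hours_to_birth <= horizon_h).astype(int)      # did birth occur within the horizon?
            X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
            clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
            auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
            print(f"AUC for birth within {label}: {auc:.2f}")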

    Linear Protection Schemes Analysis in Scattered Placement Fiber-To-The Home-Passive Optical Network Using Customer Access Protection Unit Solution

    Problem statement: This study highlights a restoration scheme proposed against failures in the working line at the drop region for Fiber-To-The-Home (FTTH) with a Passive Optical Network (PON), where a PON is a system that brings optical fiber cable and signals all or most of the way to the end user. Approach: The survivability scheme against failure focuses on scattered residence architectures and is applied to the ring and tree topologies, respectively, by means of a Customer Access Protection Unit (CAPU). The CAPU is installed before the ONU and ensures that the signal finds an alternative path when a failure occurs on a specific line. The proposed scheme is low cost and applicable to any residence architecture. Its advantage is that a failure in the fiber line can be recovered at up to three levels, so that the optical signal keeps flowing and application disturbance is avoided. Two types of restoration scheme are proposed: linear protection (tree) and migrated protection (ring). The FTTH-based network design is simulated using OptiSystem 7.0 in order to investigate the output power and BER performance at each node of the tree and ring protection schemes in scattered placement. In this study we analyse the linear protection scheme, which consists of two models: a) Line-to-Line (L2L) protection and b) CAPU-to-CAPU (C2C), or shared, protection. The migration from tree to ring topology, which enables the signal to keep flowing when a failure occurs, specifically in a random or scattered placement topology, was highlighted in our previous publication. Results: The signal is divided into two parts, drop and pass-through, and their ratio is significant in determining the number of users allowed and the achievable distance. The output power at optical nodes can be slightly improved by varying the pass-through to drop signal ratio. Conclusion: To our knowledge, our proposal is the first reported to date in which the upstream signal flows anticlockwise in the ring topology when the restoration scheme is activated.
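
    As a rough, hypothetical illustration of the drop/pass-through trade-off (the launch power, split ratio and excess loss below are assumed values, not taken from the OptiSystem 7.0 simulation), the sketch cascades idealised asymmetric couplers and prints the drop and pass-through power at each node.

        # Hypothetical sketch (not the simulated FTTH-PON design): cascading ideal 1x2
        # couplers to see how the drop/pass-through ratio shapes the power budget per node.
        import math

        def coupler(p_in_dbm, drop_ratio, excess_loss_db=0.5):
            """Return (drop, pass-through) power in dBm for an ideal asymmetric coupler."""
            drop = p_in_dbm + 10 * math.log10(drop_ratio) - excess_loss_db
            thru = p_in_dbm + 10 * math.log10(1 - drop_ratio) - excess_loss_db
            return drop, thru

        p = 3.0           # assumed launch power in dBm
        drop_ratio = 0.2  # assumed 20/80 drop/pass-through split
        for node in range(1, 6):
            drop, p = coupler(p, drop_ratio)      # pass-through feeds the next node
            print(f"node {node}: drop {drop:.2f} dBm, pass-through {p:.2f} dBm")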

    Measurement of the Transverse Beam Spin Asymmetry in Elastic Electron Proton Scattering and the Inelastic Contribution to the Imaginary Part of the Two-Photon Exchange Amplitude

    We report on a measurement of the asymmetry in the scattering of transversely polarized electrons off unpolarized protons, $A_\perp$, at two $Q^2$ values of \qsquaredaveragedlow $(\mathrm{GeV}/c)^2$ and \qsquaredaveragedhighII $(\mathrm{GeV}/c)^2$ and a scattering angle of $30^\circ < \theta_e < 40^\circ$. The measured transverse asymmetries are $A_\perp(Q^2 = \qsquaredaveragedlow\,(\mathrm{GeV}/c)^2) = (\experimentalasymmetryalulowcorr \pm \statisticalerrorlow_{\rm stat} \pm \combinedsyspolerrorlowalucor_{\rm sys}) \times 10^{-6}$ and $A_\perp(Q^2 = \qsquaredaveragedhighII\,(\mathrm{GeV}/c)^2) = (\experimentalasymmetryaluhighcorr \pm \statisticalerrorhigh_{\rm stat} \pm \combinedsyspolerrorhighalucor_{\rm sys}) \times 10^{-6}$, where the first error denotes the statistical and the second the systematic uncertainty. $A_\perp$ arises from the imaginary part of the two-photon exchange amplitude and is zero in the one-photon exchange approximation. From comparison with theoretical estimates of $A_\perp$ we conclude that $\pi N$ intermediate states give a substantial contribution to the imaginary part of the two-photon amplitude, while the contribution from the ground-state proton to the imaginary part of the two-photon exchange can be neglected. There is no obvious reason why this should be different for the real part of the two-photon amplitude, which enters into the radiative corrections for Rosenbluth separation measurements of the electric form factor of the proton. Comment: 4 figures, submitted to PRL on Oct.
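
    For orientation, a hedged, schematic reminder of the textbook relation behind this statement (general notation, not the paper's own formulas): the beam-normal single-spin asymmetry compares the cross sections for the two transverse beam-spin orientations and, at leading order, is generated by the interference of the one- and two-photon exchange amplitudes,

        A_\perp \;=\; \frac{\sigma_\uparrow - \sigma_\downarrow}{\sigma_\uparrow + \sigma_\downarrow}
                \;\simeq\; \frac{2\,\mathrm{Im}\!\left(\mathcal{M}_{1\gamma}^{\ast}\,\mathcal{M}_{2\gamma}\right)}
                                {\left|\mathcal{M}_{1\gamma}\right|^{2}},

    so it vanishes when only one-photon exchange is kept and is sensitive only to the absorptive (imaginary) part of $\mathcal{M}_{2\gamma}$.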

    Algorithmic Complexity for Short Binary Strings Applied to Psychology: A Primer

    Since human randomness production has been studied and widely used to assess executive functions (especially inhibition), many measures have been suggested to assess the degree to which a sequence is random-like. However, each of them focuses on a single feature of randomness, forcing authors to use multiple measures. Here we describe and advocate the use of the accepted universal measure of randomness based on algorithmic complexity, by means of a previously presented technique using the definition of algorithmic probability. A re-analysis of the classical Radio Zenith data in the light of the proposed measure and methodology is provided as a case study of an application. Comment: To appear in Behavior Research Methods
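
    As a hedged illustration of the coding-theorem idea behind this measure, the sketch below estimates the complexity of a short string as -log2 of an algorithmic-probability value; the tiny probability table is an invented stand-in for the published distributions obtained from small Turing machines, not the authors' data.

        # Hypothetical sketch of the coding-theorem method: K(s) is approximated by
        # -log2(m(s)), where m(s) is the algorithmic probability of the string s.
        import math

        ctm_probability = {          # assumed, illustrative values only
            "0000": 0.035, "0101": 0.020, "0110": 0.015, "0010": 0.012,
        }

        def ctm_complexity(s):
            """Coding-theorem estimate of algorithmic complexity in bits."""
            return -math.log2(ctm_probability[s])

        for s in ("0000", "0101", "0110"):
            print(s, round(ctm_complexity(s), 2), "bits")   # lower probability -> higher complexity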

    REFORMS: Reporting Standards for Machine Learning Based Science

    Machine learning (ML) methods are proliferating in scientific research. However, the adoption of these methods has been accompanied by failures of validity, reproducibility, and generalizability. These failures can hinder scientific progress, lead to false consensus around invalid claims, and undermine the credibility of ML-based science. ML methods are often applied and fail in similar ways across disciplines. Motivated by this observation, our goal is to provide clear reporting standards for ML-based science. Drawing from an extensive review of past literature, we present the REFORMS checklist (Reporting Standards For Machine Learning Based Science). It consists of 32 questions and a paired set of guidelines. REFORMS was developed based on a consensus of 19 researchers across computer science, data science, mathematics, social sciences, and biomedical sciences. REFORMS can serve as a resource for researchers when designing and implementing a study, for referees when reviewing papers, and for journals when enforcing standards for transparency and reproducibility.

    Deep learning models for predicting RNA degradation via dual crowdsourcing

    Messenger RNA-based medicines hold immense potential, as evidenced by their rapid deployment as COVID-19 vaccines. However, worldwide distribution of mRNA molecules has been limited by their thermostability, which is fundamentally limited by the intrinsic instability of RNA molecules to a chemical degradation reaction called in-line hydrolysis. Predicting the degradation of an RNA molecule is a key task in designing more stable RNA-based therapeutics. Here, we describe a crowdsourced machine learning competition ("Stanford OpenVaccine") on Kaggle, involving single-nucleotide resolution measurements on 6,043 diverse 102-130-nucleotide RNA constructs that were themselves solicited through crowdsourcing on the RNA design platform Eterna. The entire experiment was completed in less than 6 months, and 41% of nucleotide-level predictions from the winning model were within experimental error of the ground truth measurement. Furthermore, these models generalized to blindly predicting orthogonal degradation data on much longer mRNA molecules (504-1,588 nucleotides) with improved accuracy compared to previously published models. Top teams integrated natural language processing architectures and data augmentation techniques with predictions from previous dynamic programming models for RNA secondary structure. These results indicate that such models are capable of representing in-line hydrolysis with excellent accuracy, supporting their use for designing stabilized messenger RNAs. The integration of two crowdsourcing platforms, one for data set creation and another for machine learning, may be fruitful for other urgent problems that demand scientific discovery on rapid timescales.
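
    The abstract says that top teams combined language-model-style architectures and data augmentation with secondary-structure predictions; as a much simpler, hypothetical illustration of the underlying prediction task (not any competitor's model, and with made-up layer sizes), the sketch below maps a one-hot RNA sequence to a per-nucleotide degradation rate with a small 1D-convolutional network.

        # Hypothetical sketch: a minimal per-nucleotide degradation regressor over one-hot RNA.
        import torch
        import torch.nn as nn

        class DegradationRegressor(nn.Module):
            def __init__(self, channels=64):
                super().__init__()
                self.net = nn.Sequential(
                    nn.Conv1d(4, channels, kernel_size=5, padding=2),
                    nn.ReLU(),
                    nn.Conv1d(channels, channels, kernel_size=5, padding=2),
                    nn.ReLU(),
                    nn.Conv1d(channels, 1, kernel_size=1),    # one degradation rate per position
                )

            def forward(self, x):                 # x: (batch, 4, seq_len), one-hot A/C/G/U
                return self.net(x).squeeze(1)     # (batch, seq_len) predicted rates

        def one_hot(seq):
            index = {"A": 0, "C": 1, "G": 2, "U": 3}
            x = torch.zeros(4, len(seq))
            for i, base in enumerate(seq):
                x[index[base], i] = 1.0
            return x

        model = DegradationRegressor()
        batch = torch.stack([one_hot("GGAAACUUCGGAAACUUCGG")])
        print(model(batch).shape)                 # torch.Size([1, 20])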

    Number preferences in lotteries

    Get PDF
    We explore people's preferences for numbers in large proprietary data sets from two different lottery games. We find that choice is far from uniform, and exhibits some familiar and some new tendencies and biases. Players favor personally meaningful and situationally available numbers, and are attracted towards numbers in the center of the choice form. Frequent players avoid winning numbers from recent draws, whereas infrequent players chase these. Combinations of numbers are formed with an eye for aesthetics, and players tend to spread their numbers relatively evenly across the possible range.
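
    As a hedged illustration of how such deviations from uniform choice can be quantified (the lottery range, the bias pattern and the counts below are synthetic assumptions, not the proprietary data), a chi-square goodness-of-fit test against the uniform distribution:

        # Hypothetical sketch: do chosen numbers deviate from a uniform distribution?
        import numpy as np
        from scipy.stats import chisquare

        rng = np.random.default_rng(1)
        numbers = np.arange(1, 46)                        # assume a 1-45 lottery
        popularity = 1.0 + 0.5 * (numbers <= 31)          # toy bias toward "date" numbers
        picks = rng.choice(numbers, size=100_000, p=popularity / popularity.sum())

        observed = np.bincount(picks, minlength=46)[1:]   # counts for numbers 1..45
        stat, p_value = chisquare(observed)               # H0: all numbers equally popular
        print(f"chi-square = {stat:.1f}, p = {p_value:.3g}")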