
    Oct-2, although not required for early B-cell development, is critical for later B-cell maturation and for postnatal survival

    Oct-2, a POU homeodomain transcription factor, is believed to stimulate B-cell-restricted expression of immunoglobulin genes through binding sites in immunoglobulin gene promoters and enhancers. To determine whether Oct-2 is required for B-cell development or function, or has other developmental roles, the gene was disrupted by homologous recombination. Oct-2^(-/-) mice develop normally but die within hours of birth for undetermined reasons. Mutants contain normal numbers of B-cell precursors but are somewhat deficient in IgM+ B cells. These B cells have a marked defect in their capacity to secrete immunoglobulin upon mitogenic stimulation in vitro. Thus, Oct-2 is not required for the generation of immunoglobulin-bearing B cells but is crucial for their maturation to immunoglobulin-secreting cells and for another undetermined organismal function.

    Clinical profile and treatment of infantile spasms using vigabatrin and ACTH - a developing country perspective

    Background: Infantile spasms represent a serious epileptic syndrome that occurs in early infancy. ACTH and vigabatrin are actively investigated drugs in its treatment. This study compares their efficacy in a large series of patients with infantile spasms from Pakistan. Methods: All patients with infantile spasms who presented to Aga Khan University Hospital, Karachi, Pakistan from January 2006 to April 2008 were included in this study. Inclusion criteria were clinical symptoms of infantile spasms, hypsarrhythmia or modified hypsarrhythmia on electroencephalography, at least six months of follow-up, and receipt of either of the two drugs mentioned above. Drug allocation varied according to availability, cost and ease of administration. Results: Fifty-six cases fulfilled the inclusion criteria; 62.5% were males. Mean age at onset of seizures was 5 ± 1.4 months. Fifty-two (92.8%) patients demonstrated hypsarrhythmia on electroencephalography. 64.3% of cases were identified as symptomatic, while 19.6% were cryptogenic and 16.1% were idiopathic. Eighteen patients received ACTH and 38 patients received vigabatrin as first-line therapy. Initial response to first-line therapy was similar (50% for ACTH and 55.3% for vigabatrin). Overall, the symptomatic and idiopathic groups responded better to vigabatrin. The relapse rate was higher for ACTH than for vigabatrin (55.5% vs. 33.3%) when considering first-line therapy. Four patients evolved to a Lennox-Gastaut variant; all of these patients had initially received vigabatrin and then ACTH. Conclusion: Vigabatrin and ACTH showed no significant difference in the initial treatment of infantile spasms. However, patients receiving ACTH were 1.2 times more likely to relapse than patients receiving vigabatrin when considering monotherapy. We suggest that vigabatrin should be the initial drug of choice in patients presenting with infantile spasms. However, larger studies from developing countries are required to validate the therapeutic trends observed in this study.

    Proofs of Space: When Space Is of the Essence

    Proofs of computational effort were devised to control denial-of-service attacks. Dwork and Naor (CRYPTO ’92), for example, proposed to use such proofs to discourage spam. The idea is to couple each email message with a proof of work that demonstrates the sender performed some computational task. A proof of work can be either CPU-bound or memory-bound. In a CPU-bound proof, the prover must compute a CPU-intensive function that is easy to check by the verifier. A memory-bound proof, instead, forces the prover to access the main memory several times, effectively replacing CPU cycles with memory accesses. In this paper we put forward a new concept dubbed proof of space. To compute such a proof, the prover must use a specified amount of space, i.e., we are not interested in the number of accesses to the main memory (as in a memory-bound proof of work) but rather in the amount of actual memory the prover must employ to compute the proof. We give a complete and detailed algorithmic description of our model. We develop a comprehensive theoretical analysis that uses combinatorial tools from complexity theory (such as pebbling games), which are essential in studying space lower bounds.
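    The CPU-bound variant described above can be illustrated with a minimal hashcash-style sketch (not the paper's construction): the prover searches for a nonce whose hash has a prescribed number of leading zero bits, and the verifier checks it with a single hash. The function names and the 8-byte nonce encoding are illustrative assumptions.

```python
import hashlib

def prove(challenge: bytes, difficulty: int) -> int:
    """Find a nonce such that SHA-256(challenge || nonce) has
    `difficulty` leading zero bits. Expensive for the prover."""
    target = 1 << (256 - difficulty)
    nonce = 0
    while True:
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(challenge: bytes, nonce: int, difficulty: int) -> bool:
    """One hash evaluation: cheap for the verifier."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty))
```

    Doubling `difficulty` roughly doubles the prover's expected work while the verifier's cost stays constant; this asymmetry is what the proof-of-space notion replaces with a memory requirement.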

    Secure Code Updates for Smart Embedded Devices based on PUFs

    Code update is a very useful tool, commonly used in low-end embedded devices to improve existing functionality or to patch discovered bugs or vulnerabilities. If the update protocol itself is not secure, it will only bring new threats to embedded systems; thus, a secure code update mechanism is required. However, existing solutions either rely on strong security assumptions or incur considerable storage and computation costs, which are not practical for resource-constrained embedded devices (e.g., in the context of the Internet of Things). In this work, we propose to use intrinsic device characteristics (i.e., Physically Unclonable Functions, or PUFs) to design a practical and lightweight secure code update scheme. Our scheme can not only ensure the freshness, integrity, confidentiality and authenticity of the code update, but also verify that the update is installed correctly on a specific device without any malicious software. Cloned or counterfeit devices can be excluded because the code update is bound to the unpredictable physical properties of the underlying hardware. Legitimate devices in an untrustworthy software state can be restored by filling suspect memory with PUF-derived random numbers. After update installation, the initiator of the code update is able to obtain a verifiable software state from the device, and the device can maintain a sustainable post-update security check by enforcing a secure call sequence. To demonstrate practicality and feasibility, we also implement the proposed scheme on a low-end MCU platform (TI MSP430) using onboard SRAM and Flash resources.
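    A toy sketch of the idea, not the paper's protocol: a per-device PUF response (simulated here by a fixed random secret) is fed to a key-derivation step, and the derived key authenticates a versioned firmware package, giving integrity, authenticity, and replay protection via a monotonic version counter. All names, the HMAC-based KDF, and the package format are illustrative assumptions.

```python
import hashlib
import hmac
import os

# Software stand-in for an SRAM-PUF response: in this sketch a fixed
# per-device random secret plays the role of the hardware fingerprint.
DEVICE_PUF_RESPONSE = os.urandom(32)

def derive_key(puf_response: bytes, context: bytes) -> bytes:
    # Idealised fuzzy-extractor/KDF step (a real PUF needs error correction).
    return hmac.new(puf_response, context, hashlib.sha256).digest()

def package_update(key: bytes, version: int, code: bytes) -> dict:
    """Update initiator: MAC the version header together with the code."""
    header = version.to_bytes(4, "big")
    tag = hmac.new(key, header + code, hashlib.sha256).digest()
    return {"version": version, "code": code, "tag": tag}

def install_update(key: bytes, current_version: int, pkg: dict) -> bool:
    """Device side: accept only authentic, strictly newer packages."""
    header = pkg["version"].to_bytes(4, "big")
    expected = hmac.new(key, header + pkg["code"], hashlib.sha256).digest()
    if not hmac.compare_digest(expected, pkg["tag"]):
        return False  # integrity/authenticity failure
    if pkg["version"] <= current_version:
        return False  # replayed or stale package: freshness violated
    return True
```

    Because the key only exists on a device that can reproduce the PUF response, a cloned device without the physical fingerprint cannot produce or accept valid packages.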

    Proof of Space from Stacked Expanders

    Recently, proof of space (PoS) has been suggested as a more egalitarian alternative to the traditional hash-based proof of work. In PoS, a prover proves to a verifier that it has dedicated some specified amount of space. A closely related notion is memory-hard functions (MHF), functions that require a lot of memory/space to compute. While making promising progress, existing PoS and MHF constructions have several problems. First, there are large gaps between the desired space-hardness and what can be proven. Second, it has been pointed out that PoS and MHF should require a lot of space not just at some point, but throughout the entire computation/protocol; few proposals have considered this issue. Third, the two existing PoS constructions are both based on a class of graphs called superconcentrators, which are either hard to construct or add a logarithmic factor overhead to efficiency. In this paper, we construct PoS from stacked expander graphs. Our constructions are simpler, more efficient and have tighter provable space-hardness than prior works. Our results also apply to a recent MHF called Balloon hash. We show Balloon hash has tighter space-hardness than previously believed, and consistent space-hardness throughout its computation.
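    The memory-hard flavour of computation described above can be sketched in a few lines. This is a simplified Balloon-hash-like mixing loop, not the paper's construction: a buffer of hash blocks is filled, then each block is repeatedly rehashed together with its predecessor and a pseudo-randomly chosen earlier block, the pseudo-random back-edges playing the role of expander edges. The function names and parameters are illustrative assumptions.

```python
import hashlib

def H(*parts: bytes) -> bytes:
    h = hashlib.sha256()
    for p in parts:
        h.update(p)
    return h.digest()

def balloon_like(password: bytes, salt: bytes, space: int, rounds: int = 3) -> bytes:
    """Simplified memory-hard mixing over `space` 32-byte blocks."""
    # Expand: fill the whole buffer so `space` blocks are live at once.
    buf = [H(password, salt, i.to_bytes(4, "big")) for i in range(space)]
    # Mix: each block depends on its predecessor and a pseudo-random block,
    # so recomputing a discarded block forces recursive recomputation.
    for r in range(rounds):
        for i in range(space):
            prev = buf[(i - 1) % space]
            j = int.from_bytes(
                H(salt, r.to_bytes(4, "big"), i.to_bytes(4, "big")), "big"
            ) % space
            buf[i] = H(prev, buf[i], buf[j])
    return buf[-1]
```

    The pebbling-style intuition is that a prover who keeps fewer than `space` blocks must repeatedly recompute evicted blocks, trading memory for a blow-up in hash evaluations; the paper's stacked-expander analysis makes such space lower bounds precise and shows they hold throughout the computation.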

    Search for Neutral Heavy Leptons Produced in Z Decays

    Weak isosinglet Neutral Heavy Leptons ($\nu_m$) have been searched for using data collected by the DELPHI detector corresponding to $3.3\times 10^{6}$ hadronic $Z^0$ decays at LEP1. Four separate searches have been performed: for short-lived $\nu_m$ production giving monojet or acollinear jet topologies, and for long-lived $\nu_m$ giving detectable secondary vertices or calorimeter clusters. No indication of the existence of these particles has been found, leading to an upper limit for the branching ratio $BR(Z^0 \rightarrow \nu_m \overline{\nu})$ of about $1.3\times 10^{-6}$ at 95% confidence level for $\nu_m$ masses between 3.5 and 50 GeV/$c^2$. Outside this range the limit weakens rapidly with the $\nu_m$ mass. The results are also interpreted in terms of limits for the single production of excited neutrinos.

    Updated precision measurement of the average lifetime of B hadrons

    The measurement of the average lifetime of B hadrons using inclusively reconstructed secondary vertices has been updated using both an improved processing of previous data and additional statistics from new data. This has reduced the statistical and systematic uncertainties and gives $\tau_{\mathrm{B}} = 1.582 \pm 0.011\ \mathrm{(stat.)} \pm 0.027\ \mathrm{(syst.)}$ ps. Combining this result with the previous result based on charged particle impact parameter distributions yields $\tau_{\mathrm{B}} = 1.575 \pm 0.010\ \mathrm{(stat.)} \pm 0.026\ \mathrm{(syst.)}$ ps.

    Methanogens, sulphate and heavy metals: a complex system

    Anaerobic digestion (AD) is a well-established technology used for the treatment of wastes and wastewaters with high organic content. During AD, organic matter is converted stepwise to methane-containing biogas, a renewable energy carrier. Methane production occurs in the last AD step and relies on methanogens, which are rather sensitive to some contaminants commonly found in wastewaters (e.g. heavy metals), or easily outcompeted by other groups of microorganisms (e.g. sulphate-reducing bacteria, SRB). This review gives an overview of previous research and pilot-scale studies that shed some light on the effects of sulphate and heavy metals on methanogenesis. Despite the numerous studies on this subject, comparison is not always possible due to differences in the experimental conditions used and the parameters evaluated. An overview of the possible benefits of methanogen and SRB co-habitation is also covered. Small amounts of sulphide produced by SRB can precipitate with metals, neutralising the negative effects of sulphide accumulation and free heavy metals on methanogenesis. Knowledge of how to untangle and balance sulphate reduction and methanogenesis is crucial to take advantage of the potential of biogenic sulphide as a metal detoxification agent with minimal loss in methane production in anaerobic digesters. The research was financially supported by the People Program (Marie Curie Actions) of the European Union's Seventh Framework Programme FP7/2007-2013 under REA agreement 289193.

    Measurement of inclusive $\pi^0$ production in hadronic $Z^0$ decays

    An analysis is presented of inclusive $\pi^0$ production in $Z^0$ decays measured with the DELPHI detector. At low energies, $\pi^0$ decays are reconstructed using pairs of converted photons and combinations of converted photons and photons reconstructed in the barrel electromagnetic calorimeter (HPC). At high energies (up to $x_p = 2 \cdot p_{\pi}/\sqrt{s} = 0.75$) the excellent granularity of the HPC is exploited to search for two-photon substructures in single showers. The inclusive differential cross section is measured as a function of energy for $q\overline{q}$ and $b\overline{b}$ events. The number of $\pi^0$'s per hadronic $Z^0$ event is $N(\pi^0)/Z^0_{had} = 9.2 \pm 0.2\ \mathrm{(stat)} \pm 1.0\ \mathrm{(syst)}$, and for $b\overline{b}$ events the number of $\pi^0$'s is $N(\pi^0)/b\overline{b} = 10.1 \pm 0.4\ \mathrm{(stat)} \pm 1.1\ \mathrm{(syst)}$. The ratio of the number of $\pi^0$'s in $b\overline{b}$ events to hadronic $Z^0$ events is less affected by the systematic errors and is found to be $1.09 \pm 0.05 \pm 0.01$. The measured $\pi^0$ cross sections are compared with the predictions of different parton shower models. For hadronic events, the peak position in the $\xi_p = \ln(1/x_p)$ distribution is $\xi_p^{\star} = 3.90^{+0.24}_{-0.14}$. The average number of $\pi^0$'s from the decay of primary B hadrons is found to be $N(B \rightarrow \pi^0\,X)/\mathrm{B\ hadron} = 2.78 \pm 0.15\ \mathrm{(stat)} \pm 0.60\ \mathrm{(syst)}$.