
    CRLF2 cytokine receptor signaling in acute leukemia

    Cytokine Receptor Like Factor 2 (CRLF2) is the receptor for the cytokine Thymic Stromal Lymphopoietin (TSLP) and plays a role in lymphocyte development, differentiation, and homeostasis. Recently, CRLF2 has been implicated in a subset of precursor B cell acute lymphoblastic leukemia (pre-B ALL), the most common pediatric malignancy. Patients with this disease are stratified by long-term survival into risk groups that dictate treatment plans. Among pre-B ALL risk groups, a much worse prognosis is observed in patients with CRLF2 gene rearrangements, as it is in patients with the BCR-ABL1 fusion gene. Previous transcriptomic analyses indicated similar gene expression profiles for CRLF2 and BCR/ABL1 signaling in pre-B ALL. While both CRLF2 and BCR/ABL1 have potent effects on cell signaling pathways, their total protein and global phosphorylation profiles might differ. To interrogate this hypothesis, we leveraged the Ba/F3 cell culture system to carry out a series of quantitative proteomics analyses comparing BCR/ABL1 signaling with multiple forms of aberrant CRLF2 signaling. Our study identified major differences between CRLF2 and BCR/ABL1 signaling, including EIF2/EIF4 signaling related to the regulation of translation initiation, DNA methylation/transcriptional repression pathways, the IKZF2-INPP5D signaling axis, and JAK2-mediated H3Y42 phosphorylation, among others. In addition, we took advantage of a kinase and small-molecule inhibitor screen to identify PLK1 as a potential therapeutic target downstream of leukemic CRLF2 signaling. PLK1 dysregulation resulting from aberrant CRLF2-JAK2 signaling might provide an opportunity to develop CRLF2-targeted therapy in the future. In conclusion, CRLF2 and BCR/ABL1 signaling differ in many aspects that might affect therapy development for the different types of pre-B ALL.

    On the Compressed-Oracle Technique, and Post-Quantum Security of Proofs of Sequential Work

    We revisit the so-called compressed oracle technique, introduced by Zhandry for analyzing quantum algorithms in the quantum random oracle model (QROM). To start off with, we offer a concise exposition of the technique, which easily extends to the parallel-query QROM, where in each query-round the considered algorithm may make several queries to the QROM in parallel. This variant of the QROM allows for a more fine-grained query-complexity analysis. Our main technical contribution is a framework that simplifies the use of (the parallel-query generalization of) the compressed oracle technique for proving query complexity results. With our framework in place, whenever applicable, it is possible to prove quantum query complexity lower bounds by means of purely classical reasoning. More than that, for typical examples the crucial classical observations that give rise to the classical bounds are sufficient to conclude the corresponding quantum bounds. We demonstrate this on a few examples, recovering known results (like the optimality of parallel Grover), but also obtaining new results (like the optimality of parallel BHT collision search). Our main target is the hardness of finding a $q$-chain with fewer than $q$ parallel queries, i.e., a sequence $x_0, x_1, \ldots, x_q$ with $x_i = H(x_{i-1})$ for all $1 \leq i \leq q$. The above problem of finding a hash chain is of fundamental importance in the context of proofs of sequential work. Indeed, as a concrete cryptographic application of our techniques, we prove that the "Simple Proofs of Sequential Work" proposed by Cohen and Pietrzak remain secure against quantum attacks. Such an analysis is not simply a matter of plugging in our new bound; the entire protocol needs to be analyzed in the light of a quantum attack. Thanks to our framework, this can now be done with purely classical reasoning.
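
    The $q$-chain problem has a very direct classical reading: each link of the chain consumes the previous oracle output, so an honest evaluator needs exactly $q$ strictly sequential calls to $H$. A minimal sketch in Python, using SHA-256 as an illustrative stand-in for the random oracle $H$ (the paper treats $H$ abstractly):

```python
import hashlib

def H(x: bytes) -> bytes:
    """Illustrative stand-in for the random oracle H of the paper."""
    return hashlib.sha256(x).digest()

def q_chain(x0: bytes, q: int) -> list[bytes]:
    """Compute the chain x_0, x_1, ..., x_q with x_i = H(x_{i-1}).

    Each iteration depends on the previous digest, so the q calls to H
    cannot be batched into fewer query rounds; this sequentiality is
    what the paper's lower bound formalizes against quantum,
    parallel-query attackers.
    """
    chain = [x0]
    for _ in range(q):
        chain.append(H(chain[-1]))
    return chain

# Usage: a 10-link chain from an arbitrary seed.
chain = q_chain(b"seed", 10)
assert all(chain[i] == H(chain[i - 1]) for i in range(1, len(chain)))
```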

    Efficacy of Sanfujiu to Treat Allergies: Patient Outcomes at 1 Year after Treatment

    Sanfujiu is a treatment in which herbal paste is applied to the acupoints Fengmen and Feishu during the three hottest days of summer to treat patients with allergies. The objectives of this study were to determine treatment efficacy at 1 year after Sanfujiu treatment and to examine variations in the perceived efficacy of Sanfujiu among subgroups defined by the patients' ages, diagnoses, and number of reactive symptoms immediately after treatment. We enrolled as subjects 105 patients who had completed Sanfujiu treatment at a medical university hospital in Taipei. One year after treatment, trained interviewers conducted telephone interviews with the patients. Approximately 60% of them perceived the treatment as effective at 1 year, which was higher than the proportion at 1 week after treatment (45.7%). Younger subjects (<19 years of age) and patients with asthma were more likely to report the treatment as effective, as were patients who had more reactive symptoms after the third Sanfujiu treatment. The results demonstrated that Sanfujiu, as perceived by patients in Taiwan, was moderately effective in treating their allergic symptoms.

    DEXON: A Highly Scalable, Decentralized DAG-Based Consensus Algorithm

    A blockchain system is a replicated state machine that must be fault tolerant. When designing a blockchain system, there is usually a trade-off between decentralization, scalability, and security. In this paper, we propose a novel blockchain system, DEXON, which achieves high scalability while remaining decentralized and robust in a real-world environment. We make two main contributions. First, we present a highly scalable sharding framework for blockchains. This framework takes an arbitrary number of single chains and transforms them into the blocklattice data structure, enabling high scalability and low transaction confirmation latency with asymptotically optimal communication overhead. Second, we propose a single-chain protocol based on our novel verifiable random function and a new Byzantine agreement protocol that achieves high decentralization and low latency.
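
    The abstract leaves the blocklattice construction at a high level. As a rough, hypothetical sketch of the general idea only (not DEXON's actual implementation, which additionally relies on a verifiable random function and Byzantine agreement), one can picture several single chains whose blocks acknowledge the current tips of the other chains, so that a single partial order over all blocks emerges:

```python
from dataclasses import dataclass, field

@dataclass
class Block:
    chain_id: int    # which single chain this block extends
    height: int      # position within its own chain
    acks: dict[int, int] = field(default_factory=dict)  # chain_id -> acked height

class Blocklattice:
    """Toy model of parallel chains cross-referencing each other."""

    def __init__(self, num_chains: int):
        self.chains: list[list[Block]] = [[] for _ in range(num_chains)]

    def append(self, chain_id: int) -> Block:
        # A new block acknowledges the tip of every non-empty chain,
        # weaving the single chains into one partially ordered lattice
        # from which a total ordering of blocks could then be derived.
        acks = {cid: len(c) - 1 for cid, c in enumerate(self.chains) if c}
        block = Block(chain_id, len(self.chains[chain_id]), acks)
        self.chains[chain_id].append(block)
        return block

# Usage: five blocks spread over three chains.
lattice = Blocklattice(num_chains=3)
for cid in (0, 1, 2, 0, 1):
    lattice.append(cid)
```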

    Relatively preserved functional immune capacity with standard COVID-19 vaccine regimen in people living with HIV

    Introduction: People living with HIV (PLWH) are at a higher risk of severe disease upon SARS-CoV-2 infection. COVID-19 vaccines are effective in most PLWH. However, suboptimal immune responses to the standard two-shot regimen are a concern, especially for those with moderate to severe immunodeficiency. An additional dose is recommended as part of the extended primary series in Taiwan. Herein, we study the efficacy of this additional shot in PLWH with mild immunodeficiency compared to that in healthy people without HIV.

    Methods: In total, 72 PLWH who were asymptomatic or had mild immunodeficiency (CD4 counts ≥200/mm3) with virological suppression, and 362 healthcare workers at our hospital, were enrolled. None of the participants had a history of SARS-CoV-2 infection. They received the mRNA-1273 and ChAdOx1 vaccines. Anti-SARS-CoV-2 neutralizing and anti-Spike IgG antibodies and SARS-CoV-2-specific T cell responses were evaluated.

    Results: The standard two-shot regimen elicited lower responses in PLWH than in the healthcare workers without HIV infection, although the difference was not statistically significant: the two groups had comparable levels of neutralizing and anti-Spike antibodies and comparable effector CD4+ and CD8+ T cell responses. The third shot boosted SARS-CoV-2 immunity significantly more in PLWH than in those without HIV, with better antibody responses and higher IFN-γ and IL-2 responses of the CD4+ and CD8+ T cells. Upon in vitro stimulation with extracted Wuhan-strain SARS-CoV-2 proteins, CD8+ T cells from PLWH after three shots showed more durable effector responses than those of the non-HIV controls over extended stimulation times.

    Conclusion: This subtle difference between PLWH and people without HIV implies immune exhaustion after two shots in people without HIV, whereas the slightly compromised immunity in PLWH preserved the functional capacity to respond further to the third shot or to natural infection.

    Eleven generations of selection for the duration of fertility in the intergeneric crossbreeding of ducks

    A 12-generation selection experiment involving a selected line (S) and a control line (C) has been conducted since 1992 with the aim of increasing the number of fertile eggs laid by the Brown Tsaiya duck after a single artificial insemination (AI) with pooled Muscovy semen. On average, 28.9% of the females and 17.05% of the males were selected. The observed and predicted selection responses showed similar trends. The average predicted genetic responses per generation, in genetic standard deviation units, were 0.40 for the number of fertile eggs, 0.45 for the maximum duration of fertility, and 0.32 for the number of hatched mule ducklings. The fertility rates for days 2–8 after AI were 89.14% in the S line and 61.46% in the C line. Embryo viability was not impaired by this selection. The largest increase in fertility rate per day after a single AI was observed from day 5 to day 11. In G12, the fertility rate in the selected line was 91% at day 2, 94% at day 3, 92% at days 3 and 4, and then decreased to 81% at day 8, 75% at day 9, 58% at day 10, and 42% at day 11. In contrast, the fertility rate in the control line decreased abruptly from day 4 (74%). The same tendencies were observed for the evolution of the hatchability of eggs set. It was concluded that selection for the number of fertile eggs after a single AI with pooled Muscovy semen can effectively increase the duration of the fertile period in ducks, and that research should now focus on ways to improve the viability of the hybrid mule duck embryo.

    Security Enhancement on Reconfiguring Coded Wavelength with Tunable Wavelength Filter Array Triggered Chaotic Sequences

    In this study, a reconfigurable optical code-division multiple-access (OCDMA) scheme is implemented in which a chaotic sequence is generated as a non-linear, time-variant secret key and then triggers a tunable wavelength filter array to perform random wavelength allocation. In the encryption stage, the distribution of the light carrier is designed and implemented using the tunable wavelength filter array triggered by the chaotic sequence. In addition, the arrayed waveguide grating (AWG) router is rewritten with a maximal-length code (M-sequence) to act as the encoder. In the decryption stage, a symmetric scheme with a balanced photo-detector is presented, and the reconfiguration mechanism follows the encryption synchronously over either a public or a private channel. Hence, multiple-access interference (MAI) is cancelled completely while the chaotic sequence is varied synchronously at the transmitter and receiver. Compared with a previous reconfigurable scheme triggered by registers and switches after the AWG router, the simulation results show that the number of secret keys in the proposed cryptographic scheme is significantly increased, preventing eavesdropping attacks at the physical layer.
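
    Two ingredients named above are standard and easy to sketch: an M-sequence is produced by a linear-feedback shift register (LFSR) with a primitive feedback polynomial, and the chaotic secret-key sequence can be modeled, for illustration only, by a logistic map (the abstract does not specify which chaotic generator is used). A minimal Python sketch, in which the chaotic values pick the wavelength slot that the tunable filter array passes:

```python
def m_sequence(taps=(4, 3), nbits=4):
    """Maximal-length sequence (M-sequence) from an LFSR.

    taps=(4, 3) encodes the primitive polynomial x^4 + x^3 + 1,
    giving the maximal period 2^4 - 1 = 15.
    """
    state = [1] * nbits
    out = []
    for _ in range(2 ** nbits - 1):
        out.append(state[-1])
        feedback = state[taps[0] - 1] ^ state[taps[1] - 1]
        state = [feedback] + state[:-1]
    return out

def chaotic_sequence(x0=0.37, r=3.99, n=16):
    """Logistic-map iterates: a non-linear, time-variant sequence used
    here as an assumed stand-in for the paper's chaotic key generator."""
    values, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        values.append(x)
    return values

# Hypothetical wavelength allocation: each chaotic value selects which
# wavelength the filter array passes, re-randomizing the code per step.
wavelengths = ["w1", "w2", "w3", "w4"]
allocation = [wavelengths[int(x * len(wavelengths))] for x in chaotic_sequence()]
code = m_sequence()  # spectral code written onto the AWG router
```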

    The Case ∣ Acute heart failure with elevated cardiac enzymes


    Analyzing Coronary Artery Disease in Patients with Low CAC Scores by 64-Slice MDCT

    Purpose. Coronary artery calcification (CAC) scores are widely used to assess risk for coronary artery disease (CAD), but the CAC score alone lacks the diagnostic accuracy needed for CAD. This work uses a novel, efficient approach to predict CAD in patients with low CAC scores. Materials and Methods. The study group comprised 86 subjects who underwent a screening health examination, including laboratory testing, CAC scanning, and cardiac angiography by 64-slice multidetector computed tomography (MDCT). Eleven physiological variables and three personal parameters were investigated in the proposed model. Logistic regression was applied to assess the sensitivity, specificity, and accuracy obtained when using the individual variables and the CAC score, and a meta-analytic step combined the physiological and personal parameters by logistic regression. Results. The diagnostic sensitivity of the CAC score was 14.3% when the CAC score was ≤30; sensitivity increased to 57.13% with the proposed model. The statistically significant variables, based on beta values and P values, were family history, LDL-c, blood pressure, HDL-c, age, triglyceride, and cholesterol. Conclusions. The CAC score alone has a low negative predictive value for CAD. This work applied a novel prediction method that combines patient information, including physiological and personal parameters, to increase the accuracy of the CAC score for predicting CAD.
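
    The modeling step described here is a standard logistic regression over the combined predictors. A minimal sketch with scikit-learn on synthetic placeholder data (the study's actual 86-subject measurements are not reproduced in the abstract, so every value below is made up for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)

# Synthetic placeholder data: 86 subjects, a low CAC score (<=30) plus
# the predictors the abstract reports as significant (family history,
# LDL-c, blood pressure, HDL-c, age, triglyceride, cholesterol).
n = 86
X = np.column_stack([
    rng.integers(0, 31, n),    # CAC score
    rng.integers(0, 2, n),     # family history (0/1)
    rng.normal(130, 35, n),    # LDL-c (mg/dL)
    rng.normal(125, 15, n),    # systolic blood pressure (mmHg)
    rng.normal(50, 12, n),     # HDL-c (mg/dL)
    rng.normal(55, 10, n),     # age (years)
    rng.normal(150, 60, n),    # triglyceride (mg/dL)
    rng.normal(200, 35, n),    # total cholesterol (mg/dL)
])
y = rng.integers(0, 2, n)      # CAD on 64-slice MDCT angiography (0/1)

# Fit the combined model and report the metrics the abstract quotes.
model = LogisticRegression(max_iter=1000).fit(X, y)
tn, fp, fn, tp = confusion_matrix(y, model.predict(X)).ravel()
print(f"sensitivity={tp / (tp + fn):.2f}, specificity={tn / (tn + fp):.2f}")
```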

    On the Compressed-Oracle Technique, and Post-Quantum Security of Proofs of Sequential Work

    We revisit the so-called compressed oracle technique, introduced by Zhandry for analyzing quantum algorithms in the quantum random oracle model (QROM). This technique has proven to be very powerful for reproving known lower bound results, but also for proving new results that seemed to be out of reach before. Despite being very useful, it is however still quite cumbersome to actually employ the compressed oracle technique. To start off with, we offer a concise yet mathematically rigorous exposition of the compressed oracle technique. We adopt a more abstract view than other descriptions found in the literature, which allows us to keep the focus on the relevant aspects. Our exposition easily extends to the parallel-query QROM, where in each query-round the considered quantum oracle algorithm may make several queries to the QROM in parallel. This variant of the QROM allows for a more fine-grained query-complexity analysis of quantum oracle algorithms. Our main technical contribution is a framework that simplifies the use of (the parallel-query generalization of) the compressed oracle technique for proving query complexity results. With our framework in place, whenever applicable, it is possible to prove quantum query complexity lower bounds by means of purely classical reasoning. More than that, we show that, for typical examples, the crucial classical observations that give rise to the classical bounds are sufficient to conclude the corresponding quantum bounds. We demonstrate this on a few examples, recovering known results (like the optimality of parallel Grover), but also obtaining new results (like the optimality of parallel BHT collision search). Our main application is to prove hardness of finding a $q$-chain, i.e., a sequence $x_0, x_1, \ldots, x_q$ with the property that $x_i = H(x_{i-1})$ for all $1 \leq i \leq q$, with fewer than $q$ parallel queries. The above problem of producing a hash chain is of fundamental importance in the context of proofs of sequential work. Indeed, as a concrete application of our new bound, we prove that the "Simple Proofs of Sequential Work" proposed by Cohen and Pietrzak remain secure against quantum attacks. Such a proof is not simply a matter of plugging in our new bound; the entire protocol needs to be analyzed in the light of a quantum attack, and substantial additional work is necessary. Thanks to our framework, this can now be done with purely classical reasoning.