
    Memory-Sample Lower Bounds for Learning Parity with Noise

    In this work, we show, for the well-studied problem of learning parity under noise, where a learner tries to learn $x=(x_1,\ldots,x_n) \in \{0,1\}^n$ from a stream of random linear equations over $\mathbb{F}_2$ that are correct with probability $\frac{1}{2}+\varepsilon$ and flipped with probability $\frac{1}{2}-\varepsilon$, that any learning algorithm requires either a memory of size $\Omega(n^2/\varepsilon)$ or an exponential number of samples. In fact, we study memory-sample lower bounds for a large class of learning problems, as characterized by [GRT'18], when the samples are noisy. A matrix $M: A \times X \rightarrow \{-1,1\}$ corresponds to the following learning problem with error parameter $\varepsilon$: an unknown element $x \in X$ is chosen uniformly at random. A learner tries to learn $x$ from a stream of samples, $(a_1, b_1), (a_2, b_2), \ldots$, where for every $i$, $a_i \in A$ is chosen uniformly at random, and $b_i = M(a_i,x)$ with probability $1/2+\varepsilon$ and $b_i = -M(a_i,x)$ with probability $1/2-\varepsilon$ (where $0<\varepsilon< \frac{1}{2}$). Assume that $k, \ell, r$ are such that any submatrix of $M$ with at least $2^{-k} \cdot |A|$ rows and at least $2^{-\ell} \cdot |X|$ columns has a bias of at most $2^{-r}$. We show that any learning algorithm for the learning problem corresponding to $M$, with error, requires either a memory of size at least $\Omega\left(\frac{k \cdot \ell}{\varepsilon}\right)$ or at least $2^{\Omega(r)}$ samples. In particular, this shows that for a large class of learning problems, the same as those in [GRT'18], any learning algorithm requires either a memory of size at least $\Omega\left(\frac{(\log |X|) \cdot (\log |A|)}{\varepsilon}\right)$ or an exponential number of noisy samples. Our proof is based on adapting the arguments in [Raz'17, GRT'18] to the noisy case.
    Comment: 19 pages. To appear in RANDOM 2021. arXiv admin note: substantial text overlap with arXiv:1708.0263
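    The sampling model in this abstract is easy to make concrete. Below is a minimal Python sketch (the name noisy_parity_stream and the parameters n and eps are illustrative, not from the paper) that generates the stream of noisy linear equations over F_2: each sample is a uniformly random vector a_i together with a label that equals the inner product with the hidden x with probability 1/2 + eps and is flipped otherwise.

```python
import random

def noisy_parity_stream(x, eps, num_samples, rng=random):
    """Yield (a, b) samples for learning parity with noise.

    x           : hidden vector in {0,1}^n
    eps         : bias parameter, 0 < eps < 1/2
    num_samples : number of noisy equations to emit
    """
    n = len(x)
    for _ in range(num_samples):
        # a is a uniformly random linear equation over F_2
        a = [rng.randint(0, 1) for _ in range(n)]
        # correct label: inner product <a, x> mod 2
        b = sum(ai * xi for ai, xi in zip(a, x)) % 2
        # flip the label with probability 1/2 - eps
        if rng.random() < 0.5 - eps:
            b ^= 1
        yield a, b

# Example: a hidden secret of length 8, observed through 5 noisy equations.
secret = [random.randint(0, 1) for _ in range(8)]
for a, b in noisy_parity_stream(secret, eps=0.1, num_samples=5):
    print(a, b)
```

    The paper's lower bound says that any streaming algorithm consuming such samples needs either about $n^2/\varepsilon$ bits of memory or exponentially many samples to recover the secret.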

    Multi-Task Self-Supervised Learning for Disfluency Detection

    Most existing approaches to disfluency detection rely heavily on human-annotated data, which is expensive to obtain in practice. To tackle the training-data bottleneck, we investigate methods for combining multiple self-supervised tasks, i.e., supervised tasks where data can be collected without manual labeling. First, we construct large-scale pseudo training data by randomly adding or deleting words from unlabeled news data, and propose two self-supervised pre-training tasks: (i) a tagging task to detect the added noisy words, and (ii) a sentence-classification task to distinguish original sentences from grammatically incorrect ones. We then combine these two tasks to jointly train a network; a sketch of the pseudo-data construction follows below. The pre-trained network is then fine-tuned using human-annotated disfluency detection training data. Experimental results on the commonly used English Switchboard test set show that our approach achieves competitive performance compared to previous systems (trained on the full dataset) while using less than 1% (1,000 sentences) of the training data. Our method trained on the full dataset significantly outperforms previous methods, reducing the error by 21% on English Switchboard.
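    A rough illustration of the pseudo-data construction described above, assuming simple word-level corruption (the function make_pseudo_example and the probabilities p_add and p_del are hypothetical, not the paper's exact procedure): randomly insert words drawn from the sentence itself or delete words, and tag each token of the corrupted sentence as added or original.

```python
import random

def make_pseudo_example(tokens, p_add=0.1, p_del=0.1, rng=random):
    """Corrupt a clean sentence into a pseudo disfluency-detection example.

    Returns (corrupted_tokens, tags) where tag 1 marks an added (noisy)
    word and tag 0 marks a word kept from the original sentence.
    """
    corrupted, tags = [], []
    for tok in tokens:
        # randomly drop the original word
        if rng.random() < p_del:
            continue
        # randomly insert a noisy word copied from elsewhere in the sentence
        if rng.random() < p_add:
            corrupted.append(rng.choice(tokens))
            tags.append(1)
        corrupted.append(tok)
        tags.append(0)
    return corrupted, tags

sentence = "the cat sat on the mat".split()
noisy, labels = make_pseudo_example(sentence)
print(list(zip(noisy, labels)))   # tagging task: predict which words were added
print(noisy != sentence)          # sentence-level task: original vs. corrupted
```

    The two labelings produced here correspond to the two pre-training tasks: token-level tags for the tagging task and a sentence-level original/corrupted label for the classification task.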

    Endothelium Aging and Vascular Diseases


    Mcl-1 Ubiquitination and Destruction

    Loss of the Fbw7 tumor suppressor is common in diverse human cancer types, including T-Cell Acute Lymphoblastic Leukemia (T-ALL), although the mechanistic basis of its anti-oncogenic activity remains largely unclear. We recently reported that SCF^{Fbw7} regulates cellular apoptosis by controlling the ubiquitination and destruction of the pro-survival protein, Mcl-1, in a GSK3 phosphorylation-dependent manner. We found that human T-ALL cell lines displayed a close relationship between Fbw7 loss and Mcl-1 overexpression. More interestingly, T-ALL cell lines that are deficient in Fbw7 are particularly sensitive to sorafenib, a multi-kinase inhibitor that has been demonstrated to reduce Mcl-1 expression through an unknown mechanism. On the other hand, Fbw7-deficient T-ALL cell lines are much more resistant to the Bcl-2 antagonist, ABT-737. Furthermore, reconstitution of Fbw7 or depletion of Mcl-1 in Fbw7-deficient cells restores ABT-737 sensitivity, suggesting that elevated Mcl-1 expression is important for Fbw7-deficient cells to evade apoptosis. Therefore, our work provides a novel molecular mechanism for the tumor suppression function of Fbw7. Furthermore, it provides the rationale for targeted usage of Mcl-1 antagonists to treat Fbw7-deficient T-ALL patients.

    Regulation of EWSR1-FLI1 Function by Post-Transcriptional and Post-Translational Modifications

    Ewing sarcoma is the second most common bone tumor in childhood and adolescence. Currently, first-line therapy includes multidrug chemotherapy with surgery and/or radiation. Although most patients initially respond to chemotherapy, recurrent tumors become treatment refractory. Pathologically, Ewing sarcoma consists of small round basophilic cells with prominent nuclei, marked by expression of the surface protein CD99. Genetically, Ewing sarcoma is driven by a fusion oncoprotein that results from one of a small number of chromosomal translocations joining a FET gene and a gene encoding an ETS family transcription factor, with ~85% of tumors expressing the EWSR1::FLI1 fusion. EWSR1::FLI1 regulates transcription, splicing, genome instability and other cellular functions. Although EWSR1::FLI1 is a tumor-specific target, therapy directed against it has yet to be developed, largely due to insufficient understanding of EWSR1::FLI1 upstream and downstream signaling, and the challenges of targeting transcription factors with small molecules. In this review, we summarize the contemporary molecular understanding of Ewing sarcoma, and the post-transcriptional and post-translational regulatory mechanisms that control EWSR1::FLI1 function.