
    Subunit composition of minK potassium channels.

    Expression of minK protein in Xenopus oocytes induces a slowly activating, voltage-dependent, potassium-selective current. Point mutations in minK that alter current gating kinetics, ion selectivity, pharmacology, and response to protein kinase C all support the notion that minK is a structural protein of a channel-type transporter. Yet minK has just 130 amino acids and a single transmembrane domain. Although larger cloned potassium channels form functional channels through tetrameric subunit association, the subunit composition of minK channels is unknown. Subunit stoichiometry was determined by coexpression of wild-type minK and a dominant lethal point mutant of minK, which reaches the plasma membrane but passes no current. The results support a model for complete minK potassium channels in which just two minK monomers are present, along with other, as yet unidentified, non-minK subunits.
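The dominant-negative coexpression logic summarized above can be sketched numerically. This is our own illustration, not code or data from the paper: under random assembly from a pool that is a fraction f wild type, a channel with n minK subunits conducts only when all n are wild type, so comparing the predicted current suppression for candidate stoichiometries against measurement reads out n.

```python
def functional_fraction(f_wt: float, n_subunits: int) -> float:
    """Expected current relative to pure wild type, assuming a channel
    conducts only if all n of its randomly drawn minK subunits are wild type."""
    return f_wt ** n_subunits

# At a 1:1 wild-type/mutant mix, a monomer would keep 50% of the current,
# a dimer 25%, and a tetramer only 6.25% -- distinguishable predictions.
for n in (1, 2, 4):
    print(n, functional_fraction(0.5, n))
```

The numbers here are illustrative; the abstract reports that the dimer model fit the data.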

    Energy and centrality dependences of charged multiplicity density in relativistic nuclear collisions

    Using a hadron and string cascade model, JPCIAE, the energy and centrality dependences of charged-particle pseudorapidity density in relativistic nuclear collisions were studied. Within the framework of this model, both the relativistic $p+\bar p$ experimental data and the PHOBOS and PHENIX Au+Au data at $\sqrt{s_{NN}}=130$ GeV could be reproduced fairly well without retuning the model parameters. Predictions for full RHIC energy Au+Au collisions and for Pb+Pb collisions at the ALICE energy were given. Participant nucleon distributions were calculated based on different methods. It was found that the number of participant nucleons, $N_{part}$, is not a well-defined variable either experimentally or theoretically. Therefore, it is inappropriate to use charged-particle pseudorapidity density per participant pair as a function of $N_{part}$ for distinguishing various theoretical models. Comment: 10 pages, 4 figures, submitted to Phys. Lett.
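The quantity the abstract argues against is a simple normalization, sketched below with made-up numbers (ours, not the paper's): the measured pseudorapidity density is divided by half the number of participant nucleons, so any method dependence in $N_{part}$ propagates directly into the normalized observable.

```python
def dnch_deta_per_pair(dnch_deta: float, n_part: float) -> float:
    """Charged-particle pseudorapidity density per participant pair:
    dN_ch/deta divided by N_part/2."""
    return dnch_deta / (n_part / 2.0)

# Illustrative only: the same measured density gives different normalized
# values if two methods disagree on N_part for the same centrality class.
print(dnch_deta_per_pair(600.0, 350.0))
print(dnch_deta_per_pair(600.0, 330.0))
```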

    Rethinking Skip-thought: A Neighborhood based Approach

    We study the skip-thought model with neighborhood information as weak supervision. More specifically, we propose a skip-thought neighbor model that treats the adjacent sentences as a neighborhood. We train our skip-thought neighbor model on a large corpus of continuous sentences, and then evaluate the trained model on 7 tasks, including semantic relatedness, paraphrase detection, and classification benchmarks. Both quantitative comparison and qualitative investigation are conducted. We empirically show that our skip-thought neighbor model performs as well as the skip-thought model on the evaluation tasks. In addition, we found that incorporating an autoencoder path into our model did not help it perform better, while it hurt the performance of the skip-thought model.
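The neighborhood supervision described above can be illustrated with a small helper (our own sketch; names and the window parameter are not from the paper): each sentence in a run of continuous text is paired with its adjacent sentences, and those pairs serve as the weak training signal.

```python
def neighbor_pairs(sentences, window=1):
    """Pair each sentence with its neighbors within the given window,
    yielding (sentence, neighbor) training examples."""
    pairs = []
    for i, s in enumerate(sentences):
        lo, hi = max(0, i - window), min(len(sentences), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((s, sentences[j]))
    return pairs

corpus = ["A", "B", "C"]
print(neighbor_pairs(corpus))
# -> [('A', 'B'), ('B', 'A'), ('B', 'C'), ('C', 'B')]
```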

    Speeding up Context-based Sentence Representation Learning with Non-autoregressive Convolutional Decoding

    Context plays an important role in human language understanding, so it may also be useful for machines learning vector representations of language. In this paper, we explore an asymmetric encoder-decoder structure for unsupervised context-based sentence representation learning. We carefully designed experiments to show that neither an autoregressive decoder nor an RNN decoder is required. We then designed a model that keeps an RNN as the encoder while using a non-autoregressive convolutional decoder. We further combine a suite of effective designs to significantly improve model efficiency while also achieving better performance. Our model is trained on two different large unlabelled corpora, and in both cases transferability is evaluated on a set of downstream NLP tasks. We empirically show that our model is simple and fast while producing rich sentence representations that excel in downstream tasks.
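The key architectural difference can be shown at the shape level. This is our own toy illustration, not the paper's code, and the position-wise linear layer stands in for the convolutional decoder: a non-autoregressive decoder predicts all context positions in one parallel pass from the encoder's sentence vector, rather than one token at a time.

```python
import numpy as np

rng = np.random.default_rng(0)
d, T, V = 8, 5, 20          # hidden size, context length, toy vocab size

z = rng.standard_normal(d)  # sentence vector from the (RNN) encoder

# Non-autoregressive decoding: broadcast z to all T target positions and
# score the vocabulary for every position simultaneously, with no
# step-by-step dependence on previously generated tokens.
W_out = rng.standard_normal((d, V))
logits = np.tile(z, (T, 1)) @ W_out   # shape (T, V) in a single pass
print(logits.shape)
```

An autoregressive RNN decoder would instead need T sequential steps, each conditioned on the previous output, which is the cost the paper's design avoids.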