
    Independent EEG Sources Are Dipolar

    Independent component analysis (ICA) and blind source separation (BSS) methods are increasingly used to separate individual brain and non-brain source signals mixed by volume conduction in electroencephalographic (EEG) and other electrophysiological recordings. We compared results of decomposing thirteen 71-channel human scalp EEG datasets by 22 ICA and BSS algorithms, assessing the pairwise mutual information (PMI) in scalp channel pairs, the remaining PMI in component pairs, the overall mutual information reduction (MIR) effected by each decomposition, and decomposition ‘dipolarity’, defined as the number of component scalp maps matching the projection of a single equivalent dipole with less than a given residual variance. The least well-performing algorithm was principal component analysis (PCA); best performing were AMICA and other likelihood/mutual-information-based ICA methods. Though these and other commonly used decomposition methods returned many similar components, across 18 ICA/BSS algorithms mean dipolarity varied linearly both with MIR and with the PMI remaining between the resulting component time courses, a result compatible with an interpretation of many maximally independent EEG components as volume-conducted projections of partially synchronous local cortical field activity within single compact cortical domains. To encourage further method comparisons, the data and software used to prepare the results have been made available (http://sccn.ucsd.edu/wiki/BSSComparison).
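    As a rough illustration of the comparison described above (not the authors' AMICA pipeline), the sketch below decomposes simulated multichannel data with scikit-learn's FastICA and estimates how much the mean pairwise mutual information between channel pairs drops in the component time courses. The channel counts, histogram binning, and the FastICA stand-in are assumptions made for this example only.

```python
# Minimal sketch: estimate the pairwise-MI reduction achieved by an ICA
# decomposition of simulated, volume-conduction-like mixed signals.
# FastICA is used here as an illustrative stand-in for the algorithms
# compared in the paper; all settings below are arbitrary example choices.
import numpy as np
from sklearn.decomposition import FastICA

def pairwise_mi(data, bins=32):
    """Mean histogram-based mutual information (in nats) over all row pairs."""
    n = data.shape[0]
    mis = []
    for i in range(n):
        for j in range(i + 1, n):
            hist, _, _ = np.histogram2d(data[i], data[j], bins=bins)
            pxy = hist / hist.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            mis.append(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
    return np.mean(mis)

rng = np.random.default_rng(0)
n_sources, n_channels, n_samples = 8, 16, 20000
sources = rng.laplace(size=(n_sources, n_samples))   # super-Gaussian "source" signals
mixing = rng.normal(size=(n_channels, n_sources))    # volume-conduction-like mixing
eeg = mixing @ sources

ica = FastICA(n_components=n_sources, random_state=0)
components = ica.fit_transform(eeg.T).T              # component time courses

mi_channels = pairwise_mi(eeg)
mi_components = pairwise_mi(components)
print(f"mean PMI, mixed channels:  {mi_channels:.3f} nats")
print(f"mean PMI, ICA components:  {mi_components:.3f} nats")
print(f"approx. MI reduction:      {mi_channels - mi_components:.3f} nats")
```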

    3C Protease of Enterovirus 68: Structure-Based Design of Michael Acceptor Inhibitors and Their Broad-Spectrum Antiviral Effects Against Picornaviruses.

    We have determined the cleavage specificity and the crystal structure of the 3C protease of enterovirus 68 (EV68 3C(pro)). The protease exhibits a typical chymotrypsin fold with a Cys...His...Glu catalytic triad; its three-dimensional structure is closely related to that of the 3C(pro) of rhinovirus 2, as well as to that of poliovirus. The phylogenetic position of the EV68 3C(pro) between the corresponding enzymes of rhinoviruses on the one hand and classical enteroviruses on the other prompted us to use the crystal structure for the design of irreversible inhibitors, with the goal of discovering broad-spectrum antiviral compounds. We synthesized a series of peptidic α,β-unsaturated ethyl esters of increasing length, and for each inhibitor candidate we determined a crystal structure of its complex with the EV68 3C(pro), which served as the basis for the next design round. To exhibit inhibitory activity, compounds must span at least P3 to P1′; the most potent inhibitors comprise P4 to P1′. Inhibitory activities were found against the purified 3C protease of EV68, as well as with replicons for poliovirus and EV71 (50% effective concentration [EC(50)] = 0.5 μM for the best compound). Antiviral activities were determined using cell cultures infected with EV71, poliovirus, echovirus 11, and various rhinovirus serotypes. The most potent inhibitor, SG85, exhibited activity with EC(50)s of ≈180 nM against EV71 and ≈60 nM against human rhinovirus 14 in a live virus–cell-based assay. Even the shorter SG75, spanning only P3 to P1′, displayed significant activity (EC(50) = 2 to 5 μM) against various rhinoviruses.
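    EC(50) values like those quoted above are typically obtained by fitting a dose-response curve to inhibition measurements. The sketch below fits a simple Hill (log-logistic) model with SciPy; the concentrations and responses are synthetic placeholders for illustration only, not the paper's measurements.

```python
# Minimal sketch of an EC50 estimate: fit a Hill dose-response curve to
# fractional-inhibition data. All data points below are made up for the example.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, ec50, hill_slope):
    """Fractional response (0..1) as a function of inhibitor concentration."""
    return conc**hill_slope / (ec50**hill_slope + conc**hill_slope)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])          # µM, synthetic
response = np.array([0.05, 0.12, 0.35, 0.62, 0.84, 0.95, 0.99])  # fraction inhibited, synthetic

params, _ = curve_fit(hill, conc, response, p0=[0.2, 1.0])
ec50, slope = params
print(f"fitted EC50 ≈ {ec50:.2f} µM, Hill slope ≈ {slope:.2f}")
```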

    A Hilbert space embedding for distributions

    We describe a technique for comparing distributions without the need for density estimation as an intermediate step. Our approach relies on mapping the distributions into a reproducing kernel Hilbert space. Applications of this technique can be found in two-sample tests, which are used for determining whether two sets of observations arise from the same distribution, covariate shift correction, local learning, measures of independence, and density estimation. Kernel methods are widely used in supervised learning [1, 2, 3, 4]; however, they are much less established in the areas of testing, estimation, and analysis of probability distributions, where information-theoretic approaches [5, 6] have long been dominant. Recent examples include [7] in the context of construction of graphical models, [8] in the context of feature extraction, and [9] in the context of independent component analysis. These methods have by and large a common issue: to compute quantities such as the mutual information, entropy, or Kullback-Leibler divergence, we require sophisticated space partitioning and/or bias correction strategies.
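    A minimal sketch of the embedding idea: represent each sample set by its mean embedding under a Gaussian RBF kernel and compare distributions through the distance between those embeddings (a biased maximum mean discrepancy statistic), one of the two-sample quantities this line of work motivates. The bandwidth and toy data below are illustrative choices, not from the paper.

```python
# Minimal sketch: squared MMD between two samples via Gaussian-RBF mean
# embeddings in a reproducing kernel Hilbert space. Bandwidth and data are
# arbitrary example choices.
import numpy as np

def rbf_kernel(x, y, bandwidth=1.0):
    """Gaussian RBF kernel matrix between rows of x and rows of y."""
    sq_dists = np.sum(x**2, axis=1)[:, None] + np.sum(y**2, axis=1)[None, :] - 2 * x @ y.T
    return np.exp(-sq_dists / (2 * bandwidth**2))

def mmd2(x, y, bandwidth=1.0):
    """Biased estimate of squared MMD: ||mean_k(x) - mean_k(y)||^2 in the RKHS."""
    kxx = rbf_kernel(x, x, bandwidth).mean()
    kyy = rbf_kernel(y, y, bandwidth).mean()
    kxy = rbf_kernel(x, y, bandwidth).mean()
    return kxx + kyy - 2 * kxy

rng = np.random.default_rng(0)
sample_p = rng.normal(0.0, 1.0, size=(500, 2))
sample_q_same = rng.normal(0.0, 1.0, size=(500, 2))
sample_q_shift = rng.normal(0.5, 1.0, size=(500, 2))

print("MMD^2, same distribution:    ", mmd2(sample_p, sample_q_same))
print("MMD^2, shifted distribution: ", mmd2(sample_p, sample_q_shift))
```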