
    Overlearning in marginal distribution-based ICA: analysis and solutions

    The present paper is written as a word of caution to users of independent component analysis (ICA) about overlearning phenomena that are often observed. We consider two types of overlearning, typical of high-order-statistics-based ICA. These algorithms can be seen to maximise the negentropy of the source estimates. The first kind of overlearning results in the generation of spike-like signals when there are not enough samples in the data or a considerable amount of noise is present. It is argued that, if the data have a power spectrum characterised by a 1/f curve, we face a more severe problem, which cannot be solved inside the strict ICA model. This overlearning is better characterised by bumps than by spikes. Both overlearning types are demonstrated on artificial signals as well as on magnetoencephalograms (MEG). Several methods are suggested to circumvent both types, either by making the estimation of the ICA model more robust or by including further modelling of the data.
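
    The spike-type overlearning can be illustrated directly from the negentropy contrast these algorithms maximise. Below is a minimal sketch (our illustration, not code from the paper) of the standard approximation J(y) ~ (E[G(y)] - E[G(nu)])^2 with G(u) = log cosh(u), used in FastICA-type algorithms, evaluated on a spike and on a smooth oscillation of the same length; the spike scores higher, which is why sample-starved negentropy maximisation drifts towards spike-like source estimates.

        import numpy as np

        def negentropy_proxy(y, rng=np.random.default_rng(0)):
            """Approximation J(y) ~ (E[G(y)] - E[G(nu)])^2,
            with G(u) = log cosh(u) and nu a standard Gaussian sample."""
            y = (y - y.mean()) / y.std()        # contrast assumes zero mean, unit variance
            nu = rng.standard_normal(y.size)    # Gaussian reference
            G = lambda u: np.log(np.cosh(u))
            return (G(y).mean() - G(nu).mean()) ** 2

        spike = np.zeros(200); spike[0] = 10.0
        smooth = np.sin(np.linspace(0, 16 * np.pi, 200))
        print(negentropy_proxy(spike), negentropy_proxy(smooth))  # spike scores higher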

    Denoising source separation

    A new algorithmic framework called denoising source separation (DSS) is introduced. The main benefit of this framework is that it allows for easy development of new source separation algorithms optimised for specific problems. In this framework, source separation algorithms are constructed around denoising procedures. The resulting algorithms can range from almost blind to highly specialised source separation algorithms. Both simple linear and more complex nonlinear or adaptive denoising schemes are considered. Some existing independent component analysis algorithms are reinterpreted within the DSS framework, and new, robust blind source separation algorithms are suggested. Although DSS algorithms need not be explicitly based on objective functions, there is often an implicit objective function that is optimised. The exact relation between the denoising procedure and the objective function is derived, and a useful approximation of the objective function is presented. In the experimental section, various DSS schemes are applied extensively to artificial data, to real magnetoencephalograms and to simulated CDMA mobile network signals. Finally, various extensions to the proposed DSS algorithms are considered. These include nonlinear observation mappings, hierarchical models and overcomplete, nonorthogonal feature spaces. With these extensions, DSS appears to have relevance to many existing models of neural information processing.
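
    The core iteration is compact enough to sketch. The following is a minimal one-unit DSS loop in Python (our illustration, not the paper's code); it assumes the data matrix X (channels x samples) has already been sphered, and the function denoise is the task-specific denoising procedure around which the algorithm is built.

        import numpy as np

        def dss_one_unit(X, denoise, n_iter=50, seed=0):
            """One-unit DSS on sphered data X (channels x samples):
            estimate the source, denoise it, re-estimate the filter."""
            w = np.random.default_rng(seed).standard_normal(X.shape[0])
            w /= np.linalg.norm(w)
            for _ in range(n_iter):
                s = w @ X                 # current source estimate
                s_plus = denoise(s)       # denoising step defines the algorithm
                w = X @ s_plus            # re-estimation of the demixing vector
                w /= np.linalg.norm(w)    # renormalisation
            return w, w @ X

        # A simple linear denoiser: low-pass filtering favours slow sources.
        lowpass = lambda s: np.convolve(s, np.ones(5) / 5, mode="same")

    Swapping in a different denoise function turns the same loop into a different, possibly highly specialised, separation algorithm.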

    Independent component approach to the analysis of EEG and MEG recordings

    Multichannel recordings of the electromagnetic fields emerging from neural currents in the brain generate large amounts of data. Suitable feature extraction methods are, therefore, useful to facilitate the representation and interpretation of the data. Recently developed independent component analysis (ICA) has been shown to be an efficient tool for artifact identification and extraction from electroencephalographic (EEG) and magnetoencephalographic (MEG) recordings. In addition, ICA has been applied to the analysis of brain signals evoked by sensory stimuli. This paper reviews our recent results in this field.
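
    As an illustration of the kind of analysis reviewed here, the sketch below (hypothetical data, and scikit-learn's FastICA rather than the authors' implementation) separates a blink-like artifact from an oscillatory component in simulated multichannel recordings.

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        t = np.linspace(0, 10, 2000)

        # Hypothetical sources: a 10 Hz rhythm, a blink-like artifact, sensor noise.
        s1 = np.sin(2 * np.pi * 10 * t)
        s2 = (np.abs(t % 3 - 1.5) < 0.1).astype(float)
        s3 = 0.3 * rng.standard_normal(t.size)
        S = np.c_[s1, s2, s3]

        A = rng.standard_normal((3, 3))     # unknown mixing matrix
        X = S @ A.T                         # simulated channel recordings

        ica = FastICA(n_components=3, random_state=0)
        S_hat = ica.fit_transform(X)        # columns: estimated components;
                                            # the artifact appears in one of them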

    Uncoated paper surface for coldset web offset printing: set-off studies

    This study examines set-off in newspaper printing and its relation to newsprint properties and other printing parameters. It consists of a review of the literature on the subject, a theoretical discussion and an analysis of the set-off process. Laboratory printing tests with laboratory-made paper and ink samples and with commercial newsprint and news ink samples were carried out. In addition, the surface of the newsprint, the structure of the offset blanket, the penetration of ink and its location on the paper surface, and set-off prints were studied with microscopic methods. Ink transfer, ink setting and the situations in which set-off is created were also examined. The main targets of this work were to define the optimum surface characteristics of uncoated newsprint for newspaper printing with the coldset web offset (CSWO) method, and to gain better insight into the set-off phenomenon. In CSWO printing, the compressibility and conformability of the blanket and the compressibility of the newsprint surface improve ink coverage. After the printing nip, the ink film splits only in the area where it has been in contact with the newsprint surface. In single-colour printing, the ink penetration caused by the pressure in the printing nip is insignificant. In multi-colour printing, by contrast, an ink layer printed later pushes the previously printed ink deeper into the voids of the paper surface. Ink setting, the result of solvent separation from the ink layer, decreases set-off; it is quantified by the change of set-off during the delay time. Filler loading improves ink setting. Set-off can be reduced by using less ink, higher printing-nip pressure, lower pressure in the folder and in other set-off situations, and a less compressible DIP-based newsprint containing chemical pulp. The most important properties of uncoated newsprint with respect to set-off are an optimised roughness in relation to print quality and set-off, low surface compressibility, a high specific surface area, good absorption ability of the fines area and uniform formation (no high-density calendering spots).
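
    In symbols (our notation, not the thesis's): if S(t_d) denotes the set-off measured after a delay time t_d, the ink setting quantified above is

        \mathrm{setting}(t_d) = S(0) - S(t_d),

    so the more solvent separates from the ink layer during the delay, the smaller the set-off at the later contact.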

    BloomCasting for publish/subscribe networks

    Publish/subscribe has been proposed as a way of addressing information as the primary named entity in the network. In this thesis, we develop and explore a network architecture based on publish/subscribe primitives, building on our work in the PSIRP project. Our work is divided into two areas: rendezvous, and Bloomcasting, i.e. a fast Bloom filter-based forwarding architecture for source-specific multicast. Taken together, these form a publish/subscribe architecture in which publisher and subscriber matching is done by the rendezvous system and the Bloom filter-based forwarding fabric is used for multicasting the published content. Our work on inter-domain rendezvous shows that a combination of policy routing at the edges and an overlay based on hierarchical distributed hash tables can overcome problems related to incremental deployment while keeping the stretch of queries small, and that it can solve some policy-related problems that arise from using distributed hash tables in an inter-domain setting. Bloom filters can cause false positives. We show that false positives can cause network anomalies when Bloom filters are used for packet forwarding. We found three such anomalies: packet storms, packet loops, and flow duplication. They can severely disrupt the network infrastructure and be used for denial-of-service attacks against the network or target services. These security and reliability problems can be solved by combining three techniques. Cryptographically computed edge-pair labels ensure that an attacker cannot construct Bloom filter-based path identifiers for a chosen path. Varying the Bloom filter parameters locally at each router prevents packet storms, and applying bit permutations to the Bloom filter locally at each router prevents accidental and malicious loops and flow duplication.
    One shortcoming of the Internet is that there is no way of naming information that is common to all applications. The publish/subscribe model is one proposal for changing the Internet architecture to remedy this shortcoming. In this thesis, I develop a publish/subscribe network architecture that builds on my work in the PSIRP project. The architecture consists of a rendezvous system, which matches publishers with subscribers, and a Bloom filter-based multicast channel over which published content is delivered to subscribers. An Internet-wide rendezvous system faces demanding requirements. I study two different approaches: one based on local routing policies and another based on distributed hash tables. The challenge for the former is scalability, especially when not all networks in the Internet participate in maintaining the system. The latter is problematic because systems built on it cannot guarantee which route publish and subscribe messages take through the system; a message may thus also travel through the network of a competitor of the publisher or the subscriber. I propose a method that uses policy-based publish/subscribe routing at the edges and, in the core of the network, connects these separate islands using a hierarchical distributed hash table. For delivering publications to subscribers I use a Bloom filter-based system. I show that using Bloom filters for packet forwarding can cause significant failures in the network, such as packet storms, loops, and duplication of packets belonging to the same flow. These failures create security and reliability problems for the network, which can be solved by combining three techniques. First, the labels denoting path segments that are inserted into the Bloom filters are computed cryptographically, so that an attacker cannot compute a Bloom filter for a path of their choosing without the network's help. Second, routers set the Bloom filter parameters locally so that packet storms do not occur. Third, each router permutes the bits of the Bloom filter, ensuring that the filter is no longer the same if a packet, for example, traverses a loop and returns to the same router.
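
    A minimal sketch of the forwarding idea in Python (illustrative names and parameters, not the thesis implementation): link identifiers are OR-ed into an in-packet Bloom filter, each router forwards on links whose identifiers match, and a per-router bit permutation makes a looping packet's filter stop matching. In the real scheme the path filters are of course computed with these per-hop permutations taken into account.

        import hashlib

        M = 64                                     # filter width in bits
        K = 3                                      # bits set per link identifier

        def link_id(name: str) -> int:
            """Hash a link name to a sparse M-bit identifier."""
            bits = 0
            for i in range(K):
                h = int(hashlib.sha256(f"{name}:{i}".encode()).hexdigest(), 16)
                bits |= 1 << (h % M)
            return bits

        def build_filter(path):
            f = 0
            for link in path:                      # OR together the path's link IDs
                f |= link_id(link)
            return f

        def matches(f, link):
            lid = link_id(link)
            return (f & lid) == lid                # false positives are possible

        def permute(f):
            """Per-router bit rotation: after a loop the filter no longer matches."""
            return ((f << 1) | (f >> (M - 1))) & ((1 << M) - 1)

        f = build_filter(["A-B", "B-C"])
        print(matches(f, "A-B"), matches(f, "C-D"))   # True, almost surely False
        print(matches(permute(f), "A-B"))             # almost surely False after one hop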

    Knotworking at the Viikki Science Library


    Exploratory source separation in biomedical systems

    Contemporary science produces vast amounts of data. The analysis of these data plays a central role in all empirical sciences, as well as in the humanities and arts whenever quantitative methods are used. One central role of an information scientist is to provide such research with sophisticated, computationally tractable data analysis tools. When the information scientist confronts a new target field of research producing data for her to analyse, she has two options: she may make specific hypotheses, or guesses, about the contents of the data and test these using statistical analysis, or she may use general-purpose statistical models to gain better insight into the data before making detailed hypotheses. Latent variable models are one such class of general models. In particular, we discuss latent variable models in which the measured data are generated by hidden sources through some mapping (formalised below). The task of source separation is to recover the sources; additionally, one may be interested in the details of the generation process itself. We argue that when little is known of the target field, independent component analysis (ICA) serves as a valuable tool for solving a problem called blind source separation (BSS). BSS means solving a source separation problem with no, or at least very little, prior information. When more is known of the target field, it is natural to incorporate that knowledge in the separation process, and we introduce methods for doing so. Finally, we suggest the general framework of denoising source separation (DSS), which can serve as a basis for algorithms ranging from almost blind to highly specialised, problem-tuned source separation algorithms. We show that certain ICA methods can be constructed in the DSS framework, which leads to new, more robust algorithms. It is natural to use the knowledge accumulated from applying BSS in a target field to devise more detailed source separation algorithms; we call this process exploratory source separation (ESS), and we show that DSS serves as a practical and flexible framework for performing ESS, too. Biomedical systems, such as the nervous system and the heart, are arguably among the most complex systems that human beings have ever studied, and contemporary physics and technology have made it possible to study them while they operate in near-natural conditions. The use of these sophisticated instruments has resulted in a massive explosion of available data. In this thesis, we apply the developed source separation algorithms to the analysis of the human brain, using mainly magnetoencephalograms (MEG). The methods are directly usable for electroencephalograms (EEG) and, with small adjustments, for other imaging modalities such as (functional) magnetic resonance imaging (fMRI), too.
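
    In the usual notation (ours, not necessarily the thesis's), the linear latent variable model underlying BSS is

        \mathbf{x}(t) = \mathbf{A}\,\mathbf{s}(t) + \mathbf{n}(t),

    where x(t) is the observation vector, A the unknown mixing matrix, s(t) the hidden sources to be recovered and n(t) noise; blind separation recovers s(t) from x(t) alone, and ICA does so under the sole assumption that the sources are statistically independent.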

    From neural PCA to deep unsupervised learning

    A network supporting deep unsupervised learning is presented. The network is an autoencoder with lateral shortcut connections from the encoder to the decoder at each level of the hierarchy. The lateral shortcut connections allow the higher levels of the hierarchy to focus on abstract invariant features. While standard autoencoders are analogous to latent variable models with a single layer of stochastic variables, the proposed network is analogous to hierarchical latent variable models. Learning combines the denoising autoencoder and denoising source separation frameworks. Each layer of the network contributes to the cost function a term that measures the distance between the representations produced by the encoder and the decoder. Since training signals originate from all levels of the network, all layers can learn efficiently even in deep networks. The speedup offered by cost terms from higher levels of the hierarchy and the ability to learn invariant features are demonstrated in experiments.
    Comment: A revised version of an article that has been accepted for publication in Advances in Independent Component Analysis and Learning Machines (2015), edited by Ella Bingham, Samuel Kaski, Jorma Laaksonen and Jouko Lampinen.
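
    A loose PyTorch sketch of the architecture described above (our simplification: plain addition stands in for the paper's combinator function, and normalisations are omitted). Lateral shortcuts carry corrupted encoder activations into the decoder, and every layer contributes a denoising cost comparing the decoder output with the clean encoder activations.

        import torch
        import torch.nn as nn

        class LateralAutoencoder(nn.Module):
            def __init__(self, dims=(784, 256, 64)):
                super().__init__()
                self.enc = nn.ModuleList(nn.Linear(a, b) for a, b in zip(dims, dims[1:]))
                self.dec = nn.ModuleList(nn.Linear(b, a) for a, b in zip(dims, dims[1:]))

            def forward(self, x, noise=0.1):
                clean = [x]                              # clean pass, kept as targets
                for f in self.enc:
                    clean.append(torch.relu(f(clean[-1])))
                h = x + noise * torch.randn_like(x)      # corrupted pass
                corrupted = [h]
                for f in self.enc:
                    h = torch.relu(f(h))
                    corrupted.append(h)
                cost, top = 0.0, corrupted[-1]
                for i in reversed(range(len(self.dec))):
                    top = self.dec[i](top) + corrupted[i]         # lateral shortcut
                    cost = cost + ((top - clean[i]) ** 2).mean()  # per-layer cost
                return cost

        loss = LateralAutoencoder()(torch.randn(32, 784))  # train with loss.backward()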

    Outbreak of delta variant SARS-CoV-2 virus on a psychogeriatric ward in Helsinki, Finland, August 2021: two-dose vaccination reduces mortality and disease severity amongst the elderly

    We describe an outbreak of delta variant SARS-CoV-2 on a psychogeriatric ward of elderly patients. Retrospectively collected data were analysed using Fisher's exact test to assess the association between patients' vaccination status and infection rates, severity of disease and mortality. Vaccination with two doses was shown to reduce severity of disease (5% vs. 75%, p < 0.001) and mortality (5% vs. 50%, p < 0.018) amongst an elderly inpatient population during an outbreak of delta variant SARS-CoV-2. Vaccination should be encouraged in elderly care institutions. Furthermore, adequate vaccination in elderly care institutions is an important consideration in current booster (third/fourth) dose schedules.
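
    The statistical test is straightforward to reproduce. A minimal sketch with SciPy follows; the 2x2 counts below are hypothetical, chosen only to mirror the reported 5% vs. 75% severity rates, and are not the study's data.

        from scipy.stats import fisher_exact

        # Rows: two-dose vaccinated / unvaccinated.
        # Columns: severe disease / not severe. Counts are hypothetical.
        table = [[1, 19],    # 1 of 20 severe  (5%)
                 [9, 3]]     # 9 of 12 severe (75%)
        odds_ratio, p_value = fisher_exact(table)
        print(odds_ratio, p_value)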