3,883 research outputs found

    Sample preparation and EFTEM of meat samples for nanoparticle analysis in food

    Nanoparticles are used industrially in personal care products and in food preparation. In the latter application, their functions include preventing microbial growth and increasing the food's nutritional value and sensory quality. EU regulations require a risk assessment of the nanoparticles used in foods and food contact materials before the products can reach the market. However, the limited availability of validated analytical methodologies for detecting and characterising nanoparticles in food hampers appropriate risk assessment. As part of research on evaluating methods for screening and quantifying Ag nanoparticles in meat, we have tested a new TEM sample preparation approach as an alternative to resin embedding and cryo-sectioning. Energy-filtered TEM (EFTEM) analysis was applied to evaluate the thickness and uniformity of thin meat layers prepared at increasing sample input, demonstrating that the protocols used ensure good stability under the electron beam, reliable sample concentration and reproducibility.
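
    For context, EFTEM thickness mapping is commonly based on the log-ratio method (the abstract does not state the exact procedure, so this is the general relation rather than the specific protocol used here): the relative thickness of a region follows from the ratio of the unfiltered intensity $I_t$ to the zero-loss intensity $I_0$,

        t/\lambda = \ln\left(I_t / I_0\right),

    where $t$ is the local specimen thickness and $\lambda$ the inelastic mean free path of the electrons in the material; mapping $t/\lambda$ across a layer gives a direct check of thickness and uniformity.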

    Improving service scalability in IoT platform business

    Abstract. This thesis aims to improve the scalability of several case companies' businesses, which offer services through their own IoT platforms. The case companies are still in the early stages of their lifecycle, and they aim to grow their businesses significantly in the future; enabling high scalability in service production is therefore important for them. A literature review was conducted to find the most critical factors affecting the scalability of services provided through an IoT platform. Interviews with open-ended questions were used to determine the current state of the case companies with regard to the factors identified in the literature review. Based on the literature review and the current-state analysis, two productization models were created, comprising commercial and technical portfolios; resource drivers were also included in the models. The created productization models for IoT service offerings are suggested to ease sales-item management and to clarify the service offerings for both the provider and the buyer. Further, linking the resource drivers to the processes needed to offer the services illustrates the resources required in different service production processes. The presented productized service models are one step the case companies can take to improve their service scalability, but they are not a solution to all scalability problems. However, similar models could be used to improve service scalability in other companies, regardless of industry, that provide their service offerings through an IoT platform.

    Unit testing methods for Internet of Things Mbed OS operating system

    Abstract. Embedded operating systems for the Internet of Things (IoT) are responsible for managing hardware and software in these systems. Among the many available IoT operating system projects, some are backed by large companies or institutes and some are developed entirely by the open source community. IoT operating system testing focuses on the key characteristics of IoT, such as networking and limited resources. In this thesis, problems in Mbed OS testing methods are identified and a unit testing solution is implemented. The implemented unit testing framework allows developers to write and run unit tests, and it is integrated into Mbed OS continuous integration to increase test coverage. The thesis shows that functional testing and unit testing are the most common types of testing in open source embedded operating system projects. Results from the Mbed OS unit testing framework show that running tests on PC platforms is faster than running them on IoT devices. The framework also lets developers write unit tests more freely and improves the Mbed OS development process. The implemented unit testing framework solved issues in Mbed OS testing, but more in-depth research is needed to improve the testing methods further.
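
    As a concrete illustration of the kind of test such a framework enables, the sketch below is a GoogleTest-style unit test that compiles and runs on the host PC rather than on an IoT device. The RingBuffer class is a hypothetical stand-in for an Mbed OS component, and the layout is a generic sketch rather than the exact structure of the Mbed OS unit test framework.

    // Minimal GoogleTest-style host unit test (a sketch; RingBuffer is a
    // hypothetical stand-in for an Mbed OS component, not thesis code).
    #include <gtest/gtest.h>
    #include <cstddef>

    // Component under test: a tiny fixed-capacity ring buffer.
    class RingBuffer {
    public:
        bool push(int value) {
            if (count_ == kCapacity) {
                return false;              // buffer full, value rejected
            }
            data_[(head_ + count_) % kCapacity] = value;
            ++count_;
            return true;
        }

        bool pop(int &value) {
            if (count_ == 0) {
                return false;              // nothing to pop
            }
            value = data_[head_];
            head_ = (head_ + 1) % kCapacity;
            --count_;
            return true;
        }

        std::size_t size() const { return count_; }

    private:
        static constexpr std::size_t kCapacity = 4;
        int data_[kCapacity] = {};
        std::size_t head_ = 0;
        std::size_t count_ = 0;
    };

    // The tests run on the host; linking gtest_main supplies main().
    TEST(RingBufferTest, PushThenPopReturnsSameValue) {
        RingBuffer buf;
        ASSERT_TRUE(buf.push(42));
        int out = 0;
        ASSERT_TRUE(buf.pop(out));
        EXPECT_EQ(42, out);
        EXPECT_EQ(0u, buf.size());
    }

    TEST(RingBufferTest, PopFromEmptyBufferFails) {
        RingBuffer buf;
        int out = 0;
        EXPECT_FALSE(buf.pop(out));
        EXPECT_EQ(0u, buf.size());
    }

    Building such tests against the GoogleTest library keeps the whole edit-compile-run cycle on the PC, which is what makes host-side unit testing faster than running the same checks on target hardware.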

    Scheduling the installation of the LHC injection lines

    The installation of the two Large Hadron Collider (LHC) injection lines has to fit within tight milestones of the LHC project and of CERN's accelerator activity in general. For instance, the transfer line from the Super Proton Synchrotron (SPS) to LHC point 8 (to fill the anti-clockwise LHC ring) should be tested with beam before the end of 2004, since the SPS will not run in 2005; it will first serve during the LHC sector test in 2006. Time constraints are also very tight for the installation of the transfer line from the SPS to LHC point 2 (for the clockwise LHC ring): its tunnel is the sole access route for the LHC cryo-magnets, and a large part of the beam line can only be installed once practically all LHC cryo-magnets are in place. Of course, the line must be operational when the LHC starts. This paper presents the various constraints and how they are taken into account in the logistics and installation planning of the LHC injection lines.

    Use of independent component analysis for head movement artifact detection in EEG

    Abstract. EEG data is often contaminated with artifacts and noise from various sources, such as eye blinks, jaw movement and AC outlets. VR opens up many interesting new opportunities for EEG research; however, VR usually involves head movement, which can also cause notable artifacts in EEG data. An effective method for removing head-movement-related artifacts is therefore needed so that VR can be used effectively in EEG experiments. In this study, we apply independent component analysis (ICA) to remove head movement artifacts from EEG data. We constructed an experiment in which a subject is placed into a VR environment that incentivizes them to move their head. At the same time, the subject performs a conventional auditory oddball experiment, which is known to elicit an ERP containing a measurable P300 component. We attempt to remove head-movement-related artifacts from the data without removing ERP components of interest, such as the P300, as a side effect. Our data processing pipeline was implemented in Matlab with the EEGLAB and ERPLAB toolboxes, using AMICA as the ICA implementation. We explain the design and implementation of both the experiment and the subsequent data processing, then discuss the results and how they could help future research. We were able to distinguish head-movement-related components with ICA, but the impact of their removal ended up being fairly limited. We also found that head movements seem to affect the shape and amplitude of the P300 component.
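
    For reference, the generic linear model behind this kind of ICA-based artifact rejection (independent of the particular AMICA implementation used in the thesis) is

        x(t) = A\,s(t), \qquad \hat{s}(t) = W\,x(t), \qquad W \approx A^{-1},

    where $x(t)$ holds the recorded channel signals, $s(t)$ the estimated source activations, $A$ the mixing matrix and $W$ the learned unmixing matrix. Components judged to reflect head movement are zeroed with a diagonal selection matrix $M$ ($M_{ii} = 0$ for rejected components, 1 otherwise) before back-projecting the remaining components to the channels:

        x_{\mathrm{clean}}(t) = W^{-1}\,M\,W\,x(t).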

    A New Method for ISOCAM Data Reduction - I. Application to the European Large Area ISO Survey Southern Field: Method and Results

    We have developed a new data reduction technique for ISOCAM LW data and have applied it to the European Large Area ISO Survey (ELAIS) LW3 (15 micron) observations in the southern hemisphere (S1). This method, known as the LARI technique and based on the assumption that two different time scales exist in ISOCAM transients (accounting for either the fast or the slow detector response), was particularly designed for the detection of faint sources. In the ELAIS S1 field we obtained a catalogue of 462 15-micron sources with signal-to-noise ratio >= 5 and flux densities in the range 0.45-150 mJy (filling the whole flux range between the Deep ISOCAM Surveys and the IRAS Faint Source Survey). The completeness at different flux levels and the photometric accuracy of this catalogue have been tested with simulations. Here we present a detailed description of the method and discuss the results obtained by applying it to the S1 LW3 data. Comment: 20 pages, LaTeX, MNRAS style, 20 postscript figures; full catalogue not yet available at http://boas5.bo.astro.it/~elais/catalogues/. Accepted for publication in MNRAS.
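
    As a purely generic illustration of what two time scales in a detector transient mean (an illustrative form only; the actual LARI model is specified in the paper, not in this abstract), the response to a step in incident flux can be written as the sum of a fast and a slow exponential term,

        S(t) = S_{\infty}\left[1 - a_{\mathrm{f}}\,e^{-t/\tau_{\mathrm{f}}} - a_{\mathrm{s}}\,e^{-t/\tau_{\mathrm{s}}}\right], \qquad \tau_{\mathrm{f}} \ll \tau_{\mathrm{s}},

    so the signal approaches its stabilised level $S_{\infty}$ quickly through the fast term and only gradually through the slow one; fitting such behaviour in the pixel time histories is, broadly, how transient detector effects can be separated from genuine faint-source signal.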

    Towards characterizing LNAPL remediation endpoints

    Remediating sites contaminated with light non-aqueous phase liquids (LNAPLs) is a demanding and often prolonged task. It is vital to determine when it is appropriate to cease engineered remedial efforts, based on the long-term effectiveness of remediation technology options. For the first time, the long-term effectiveness of a range of LNAPL remediation approaches, including skimming and vacuum-enhanced skimming, each with and without water table drawdown, was simulated through a multi-phase and multi-component approach. LNAPL components of gasoline were simulated to show how component changes affect the LNAPL's multi-phase behaviour and to inform the risk profile of the LNAPL. The four remediation approaches, along with five soil types, two states of the LNAPL specific mass and finite and infinite LNAPL plumes, resulted in 80 simulation scenarios. Effective conservative mass-removal endpoints were determined for all simulations. As a key driver of risk, the persistence and mass removal of benzene was investigated across the scenarios. The time to effectively achieve a technology endpoint varied from 2 to 6 years. The recovered LNAPL in the liquid phase varied from 5% to 53% of the initial mass. The recovered LNAPL mass as extracted vapour was also quantified. Additional mass loss through induced biodegradation was not determined. Across numerous field conditions and release incidents, the graphical outcomes provide conservative (i.e. more prolonged or greater mass recovery potential) LNAPL remediation endpoints for use in discussing the halting or continuance of engineered remedial efforts.
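
    The scenario count quoted above follows directly from the full factorial combination of the simulated variables:

        4\ \text{approaches} \times 5\ \text{soil types} \times 2\ \text{specific-mass states} \times 2\ \text{plume extents} = 80\ \text{scenarios}.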

    Towards a digital twin for characterising natural source zone depletion: A feasibility study based on the Bemidji site

    Natural source zone depletion (NSZD) of light non-aqueous phase liquids (LNAPLs) may be a valid long-term management option at petroleum-impacted sites. However, its long-term reliability needs to be established. NSZD includes partitioning, biotic and abiotic degradation of LNAPL components, plus multiphase fluid dynamics in the subsurface. Over time, LNAPL components are depleted, and the components partitioning to the various phases change, as do those available for biodegradation. To accommodate these processes and predict NSZD and its trends over decades to centuries, we incorporated, for the first time, a multi-phase, multi-component, multi-microbe, non-isothermal approach to representatively simulate NSZD at field scale. To validate the approach, we successfully mimic data from the LNAPL release at the Bemidji site, simulating the entire depth of the saturated and unsaturated zones over the 27 years of post-release measurements. The study progresses the idea of creating a generic digital twin of NSZD processes and future trends. The outcomes show the feasibility and affordability of such detailed computational approaches for improving decision-making for site management and restoration strategies, and the study provides a basis for progressing a computational digital twin for complex subsurface systems.

    The obscured X-ray source population in the HELLAS2XMM survey: the Spitzer view

    Recent X-ray surveys have provided a large number of high-luminosity, obscured Active Galactic Nuclei (AGN), the so-called Type 2 quasars. Despite the large amount of supporting multi-wavelength data, the main parameters related to the black holes harbored in such AGN are still poorly known. Here we present the results obtained for a sample of eight Type 2 quasars in the redshift range 0.9-2.1 selected from the HELLAS2XMM survey, for which we used Ks-band, Spitzer IRAC and MIPS 24 micron data to estimate bolometric corrections, black hole masses and Eddington ratios. Comment: 6 pages, to appear in "The Multicoloured Landscape of Compact Objects and their Explosive Progenitors: Theory vs Observations" (Cefalu, Sicily, June 2006), eds. L. Burderi et al. (New York: AIP).
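
    For orientation, the estimated quantities are related by the standard definitions below (a sketch of the general relations only; the specific calibrations adopted are in the paper, not in this abstract):

        L_{\mathrm{bol}} = k_{\mathrm{bol}}\,L_{\mathrm{X}}, \qquad L_{\mathrm{Edd}} \simeq 1.26\times10^{38}\,\left(M_{\mathrm{BH}}/M_{\odot}\right)\ \mathrm{erg\,s^{-1}}, \qquad \lambda_{\mathrm{Edd}} = L_{\mathrm{bol}}/L_{\mathrm{Edd}},

    where $k_{\mathrm{bol}}$ is the bolometric correction applied to the X-ray luminosity, $M_{\mathrm{BH}}$ is the black hole mass (estimated in the paper from the Ks-band and Spitzer photometry), and $\lambda_{\mathrm{Edd}}$ is the Eddington ratio.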