
    Security and Privacy Problems in Voice Assistant Applications: A Survey

    Voice assistant applications have become ubiquitous nowadays. Two model types underpin the most important functions of real-life applications (e.g., Google Home, Amazon Alexa, Siri): Automatic Speech Recognition (ASR) models and Speaker Identification (SI) models. According to recent studies, security and privacy threats have emerged alongside the rapid development of the Internet of Things (IoT). The security issues researched include attack techniques against the machine learning models and other hardware components widely used in voice assistant applications. The privacy issues include technical information stealing and policy-level privacy breaches. Voice assistant applications take a steadily growing market share every year, yet their privacy and security issues continue to cause huge economic losses and to endanger users' sensitive personal information. It is therefore important to have a comprehensive survey that categorizes the current research on the security and privacy problems of voice assistant applications. This paper summarizes and assesses five kinds of security attacks and three types of privacy threats covered in papers published at top-tier conferences in the cyber security and voice domains.

    A Decision Support System for Economic Viability and Environmental Impact Assessment of Vertical Farms

    Vertical farming (VF) is the practice of growing crops or animals along the vertical dimension using multi-tier racks or vertically inclined surfaces. In this thesis, I focus on the emerging industry of plant-specific VF. Vertical plant farming (VPF) is a promising and relatively novel practice that can be conducted in buildings with environmental control and artificial lighting. However, the nascent sector has experienced challenges in economic viability, standardisation, and environmental sustainability. Practitioners and academics call for a comprehensive financial analysis of VPF, but efforts are stifled by a lack of valid and available data. A review of economic estimation and horticultural software identifies a need for a decision support system (DSS) that facilitates risk-empowered business planning for vertical farmers. This thesis proposes an open-source DSS framework to evaluate business sustainability through financial risk and environmental impact assessments. Data from the literature, alongside lessons learned from industry practitioners, would be centralised in the proposed DSS using imprecise-data techniques. These techniques have been applied in engineering but are seldom used in financial forecasting, and they could benefit complex sectors that have only scarce data with which to predict business viability. To begin executing the DSS framework, VPF practitioners were interviewed using a mixed-methods approach. Learnings from over 19 shuttered and operational VPF projects provide insights into the barriers inhibiting scalability and identify risks, which were organised into a risk taxonomy. Labour was the most commonly reported top challenge; research was therefore conducted into lean principles for improving productivity. A probabilistic model representing a spectrum of variables and their associated uncertainty was built according to the DSS framework to evaluate the financial risk of VF projects.
This enabled flexible computation without precise production or financial data, improving the accuracy of economic estimation. The model assessed two VPF cases (one in the UK and another in Japan), providing the first risk and uncertainty quantification of VPF business models in the literature. The results highlighted measures to improve economic viability and assessed the viability of the UK and Japan cases. An environmental impact assessment model was also developed, allowing VPF operators to evaluate their carbon footprint relative to traditional agriculture using life-cycle assessment, and I explore strategies for net-zero-carbon production through sensitivity analysis. Renewable energies, especially solar, geothermal, and tidal power, show promise for reducing the carbon emissions of indoor VPF. Results show that renewably powered VPF can reduce carbon emissions compared with field-based agriculture when land-use change is considered. The drivers for DSS adoption have been researched, showing a pathway of compliance and design thinking to overcome the ‘problem of implementation’ and enable commercialisation. Further work is suggested to standardise VF equipment, collect benchmarking data, and characterise risks. This work will reduce risk and uncertainty and accelerate the sector’s emergence.
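The probabilistic approach described above can be illustrated with a minimal Monte Carlo sketch in Python. All distributions and figures below are hypothetical placeholders, not data from the thesis; the point is only how uncertain inputs (yield, price, labour, energy) propagate into a distribution of annual profit and a probability of loss.

```python
import random
import statistics

def simulate_annual_profit(n_trials=10_000, seed=42):
    """Monte Carlo sketch: propagate uncertain inputs into a profit distribution.
    All ranges are illustrative placeholders, not real VPF data."""
    rng = random.Random(seed)
    profits = []
    for _ in range(n_trials):
        yield_kg = rng.triangular(40_000, 80_000, 60_000)   # annual crop yield (kg)
        price = rng.triangular(4.0, 9.0, 6.0)               # sale price per kg
        labour = rng.triangular(150_000, 350_000, 250_000)  # labour, the top-reported risk
        energy = rng.triangular(80_000, 200_000, 120_000)   # lighting/HVAC energy cost
        profits.append(yield_kg * price - labour - energy)
    return profits

profits = simulate_annual_profit()
mean = statistics.mean(profits)
p_loss = sum(p < 0 for p in profits) / len(profits)  # share of trials ending in a loss
print(f"mean profit: {mean:,.0f}  P(loss): {p_loss:.2%}")
```

Reporting a loss probability rather than a single point estimate is the kind of risk-empowered output the DSS framework argues for when only imprecise data are available.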

    Decoding spatial location of attended audio-visual stimulus with EEG and fNIRS

    When analyzing complex scenes, humans often focus their attention on an object at a particular spatial location in the presence of background noise and irrelevant visual objects. The ability to decode the attended spatial location would facilitate brain-computer interfaces (BCI) for complex scene analysis. Here, we tested two neuroimaging technologies and investigated their capability to decode audio-visual spatial attention in the presence of competing stimuli from multiple locations. For functional near-infrared spectroscopy (fNIRS), we targeted the dorsal frontoparietal network, including the frontal eye field (FEF) and the intraparietal sulcus (IPS), as well as the superior temporal gyrus/planum temporale (STG/PT); all of these regions were shown in previous functional magnetic resonance imaging (fMRI) studies to be activated by auditory, visual, or audio-visual spatial tasks. We found that fNIRS provides robust decoding of attended spatial locations for most participants and correlates with behavioral performance. Moreover, we found that FEF makes a large contribution to decoding performance. Surprisingly, the performance was significantly above chance level 1 s after cue onset, which is well before the peak of the fNIRS response. For electroencephalography (EEG), while there are several successful EEG-based algorithms, to date all of them have focused exclusively on the auditory modality, where eye-related artifacts are minimized or controlled. Successful integration into more ecologically typical usage requires careful consideration of eye-related artifacts, which are inevitable. We showed that fast and reliable decoding can be done with or without an ocular-artifact removal algorithm. Our results show that EEG and fNIRS are promising platforms for compact, wearable technologies that could be applied to decode attended spatial locations and to reveal the contributions of specific brain regions during complex scene analysis.
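As a rough illustration of how an attended location might be decoded from multichannel signals, the sketch below trains a nearest-centroid classifier on simulated two-class data. The channel count, effect size, and noise model are invented for illustration and are not the study's actual fNIRS/EEG pipeline.

```python
import random

def nearest_centroid_decode(train, test):
    """Classify each test feature vector by its closest class centroid.
    `train` maps class label -> list of feature vectors."""
    centroids = {
        label: [sum(col) / len(vecs) for col in zip(*vecs)]
        for label, vecs in train.items()
    }
    def dist2(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    return [min(centroids, key=lambda c: dist2(x, centroids[c])) for x in test]

# Simulated two-location experiment: attending 'left' vs 'right' shifts the
# mean response of a few hypothetical channels (illustrative numbers only).
rng = random.Random(0)
def trial(shift):
    return [rng.gauss(shift, 1.0) for _ in range(8)]  # 8 channels per trial

train = {"left": [trial(-0.8) for _ in range(40)],
         "right": [trial(+0.8) for _ in range(40)]}
test_x = [trial(-0.8) for _ in range(25)] + [trial(+0.8) for _ in range(25)]
test_y = ["left"] * 25 + ["right"] * 25

pred = nearest_centroid_decode(train, test_x)
accuracy = sum(p == t for p, t in zip(pred, test_y)) / len(test_y)
print(f"decoding accuracy: {accuracy:.0%}")  # well above the 50% chance level
```

A real pipeline would add per-trial feature extraction (e.g. hemodynamic amplitudes or spectral power) and cross-validation, but the centroid comparison captures the core decoding step.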

    Modelling uncertainties for measurements of the H → γγ Channel with the ATLAS Detector at the LHC

    The Higgs boson to diphoton (H → γγ) branching ratio is only 0.227 %, yet this final state has yielded some of the most precise measurements of the particle. As measurements of the Higgs boson become increasingly precise, greater importance is placed on the factors that constitute the uncertainty. Reducing the effects of these uncertainties requires an understanding of their causes. The research presented in this thesis aims to illuminate how uncertainties on simulation modelling are determined and proffers novel techniques for deriving them. An upgrade is described of the FastCaloSim tool, which simulates events in the ATLAS calorimeter at a rate far exceeding that of the nominal detector simulation, Geant4; the integration of a method that allows the toolbox to emulate the accordion geometry of the liquid argon calorimeters is detailed. This tool allows larger samples to be produced while using significantly fewer computing resources. A measurement of the total Higgs boson production cross-section multiplied by the diphoton branching ratio (σ × Bγγ) is presented, where this value was determined to be (σ × Bγγ)obs = 127 ± 7 (stat.) ± 7 (syst.) fb, in agreement with the Standard Model prediction. The signal and background shape modelling is described; the contribution of the background modelling uncertainty to the total uncertainty ranges from 2.4 % to 18 %, depending on the Higgs boson production mechanism. A method for estimating the number of events required in a Monte Carlo background sample to model the shape is detailed. It was found that the nominal γγ background sample needed to be enlarged by a factor of 3.60 to adequately model the background at a confidence level of 68 %, or by a factor of 7.20 at a confidence level of 95 %. Based on this estimate, 0.5 billion additional simulated events were produced, substantially reducing the background modelling uncertainty.
A technique is detailed for emulating the effects of Monte Carlo event generator differences using multivariate reweighting. The technique is used to estimate the event-generator uncertainty on the signal modelling of tHqb events, improving the reliability of estimating the tHqb production cross-section. This multivariate reweighting technique is then used to estimate the generator modelling uncertainties on background V γγ samples for the first time. The estimated uncertainties were found to be covered by the currently assumed background modelling uncertainty.
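The reweighting idea can be illustrated in one dimension: derive per-bin weights that map one generator's spectrum onto another's. The exponential spectra and binning below are invented for illustration; the thesis uses a multivariate, BDT-based generalisation of this per-bin ratio.

```python
import random

def histogram_weights(source, target, edges):
    """Per-bin weights w_i = target_density_i / source_density_i, so that the
    weighted source histogram reproduces the target shape (1D sketch only)."""
    def hist(xs):
        counts = [0] * (len(edges) - 1)
        for x in xs:
            for i in range(len(counts)):
                if edges[i] <= x < edges[i + 1]:
                    counts[i] += 1
                    break
        return counts
    hs, ht = hist(source), hist(target)
    ns, nt = sum(hs), sum(ht)
    # Weight per bin; bins empty in the source sample get weight 0.
    return [(c_t / nt) / (c_s / ns) if c_s else 0.0 for c_s, c_t in zip(hs, ht)]

rng = random.Random(1)
# Two hypothetical "generators" predicting slightly different falling spectra.
source = [rng.expovariate(1 / 50) for _ in range(20_000)]
target = [rng.expovariate(1 / 60) for _ in range(20_000)]
edges = [0, 25, 50, 75, 100, 150, 250]
weights = histogram_weights(source, target, edges)
print([round(w, 2) for w in weights])  # weights rise toward the harder tail
```

Applying these weights event-by-event to the source sample emulates the alternative generator, and the shift in any downstream distribution gives an estimate of the generator modelling uncertainty.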

    Annals [...].

    Pedometrics: innovation in the tropics; Legacy data: how to make it useful?; Advances in soil sensing; Pedometric guidelines for systematic soil surveys. Online event. Coordinated by: Waldir de Carvalho Junior, Helena Saraiva Koenow Pinheiro, Ricardo Simão Diniz Dalmolin.

    Search for third generation vector-like leptons with the ATLAS detector

    The Standard Model of particle physics provides a concise description of the building blocks of our universe in terms of fundamental particles and their interactions. It is an extremely successful theory, providing a plethora of predictions that precisely match experimental observation. In 2012, the Higgs boson was observed at CERN; it was the last particle predicted by the Standard Model that had yet to be discovered. While this added further credibility to the theory, the Standard Model appears incomplete. Notably, it accounts for only 5% of the energy density of the universe (the rest being ``dark matter'' and ``dark energy''), it cannot reconcile the gravitational force with quantum theory, it does not explain the origin of neutrino masses, and it cannot account for the matter/anti-matter asymmetry. The most plausible explanation is that the theory is an approximation and new physics remains to be found. Vector-like leptons are well motivated by a number of theories that seek to complete the Standard Model. They are a simple addition to the Standard Model and can help to resolve a number of discrepancies without disturbing precisely measured observables. This thesis presents a search for vector-like leptons that preferentially couple to tau leptons. The search was performed using proton-proton collision data from the Large Hadron Collider collected by the ATLAS experiment from 2015 to 2018 at a center-of-mass energy of 13 TeV, corresponding to an integrated luminosity of 139 inverse femtobarns. Final states of various lepton multiplicities were considered to isolate the vector-like lepton signal from Standard Model and instrumental backgrounds. The major backgrounds mimicking the signal are WZ, ZZ, and tt+Z production, and mis-identified leptons. A number of boosted decision trees were used to improve rejection power against background, and the signal was measured using a binned-likelihood estimator. No excess relative to the Standard Model was observed. Exclusion limits were placed on vector-like leptons in the mass range of 130 to 898 GeV.

    Development of attention to social interactions in naturalistic scenes


    Optimizing transcriptomics to study the evolutionary effect of FOXP2

    The field of genomics was established with the sequencing of the human genome, a pivotal achievement that has allowed us to address various questions in biology from a unique perspective. One question in particular, that of the evolution of human speech, has gripped philosophers, evolutionary biologists, and now genomicists. However, little is known of the genetic basis that allowed humans to evolve the ability to speak. Of the few genes implicated in human speech, one of the most studied is FOXP2, which encodes the transcription factor Forkhead box protein P2 (FOXP2). FOXP2 is essential for proper speech development, and two mutations in the human lineage are believed to have contributed to the evolution of human speech. To address the effect of FOXP2 and investigate its evolutionary contribution to human speech, one can utilize the power of genomics, specifically gene expression analysis via ribonucleic acid sequencing (RNA-seq). To this end, I first contributed to developing mcSCRB-seq, a highly sensitive, powerful, and efficient single-cell RNA-seq (scRNA-seq) protocol. Having emerged as a central method for studying cellular heterogeneity and identifying cellular processes, scRNA-seq was a powerful genomic tool, but it lacked the sensitivity and cost-efficiency of more established protocols. By systematically evaluating each step of the process, I helped find that the addition of polyethylene glycol increased sensitivity by enhancing the cDNA synthesis reaction. This, along with other optimizations, resulted in a sensitive and flexible protocol that is cost-efficient and ideal in many research settings. A primary motivation driving the extensive optimizations surrounding single-cell transcriptomics has been the generation of cellular atlases, which aim to identify and characterize all of the cells in an organism.
As such efforts are carried out by a variety of research groups using a number of different RNA-seq protocols, I contributed to an effort to benchmark and standardize scRNA-seq methods. This not only identified methods that may be ideal for the purpose of cell atlas creation, but also highlighted optimizations that could be integrated into existing protocols. Using mcSCRB-seq as a foundation, as well as the findings from the scRNA-seq benchmarking, I helped develop prime-seq, a sensitive, robust, and, most importantly, affordable bulk RNA-seq protocol. Bulk RNA-seq was frequently overlooked during the efforts to optimize and establish single-cell techniques, even though the method is still extensively used for analyzing gene expression. Introducing early barcoding and reducing library generation costs kept prime-seq cost-efficient, while basing it on single-cell methods ensured that it would be a sensitive and powerful technique. I helped verify this by benchmarking it against TruSeq-generated data and then helped test its robustness by generating prime-seq libraries from over seventeen species. These optimizations resulted in a final protocol that is well suited for investigating gene expression in comprehensive and high-throughput studies. Finally, I utilized prime-seq to develop a comprehensive gene expression atlas to study the function of FOXP2 and its role in speech evolution. I used previously generated mouse models: a knockout model containing one non-functional Foxp2 allele and a humanized model, which has a variant Foxp2 allele with two human-specific mutations. To study the effect globally across the mouse, I helped harvest eighteen tissues previously identified to express FOXP2. By comparing the mouse models to wild-type mice, I helped highlight the importance of FOXP2 in lung development and the importance of the human variant allele in the brain.
Both mcSCRB-seq and prime-seq have already been used and published in numerous studies addressing a variety of biological and biomedical questions. Additionally, my work on FOXP2 not only provides a thorough expression atlas, but also provides a detailed and cost-efficient plan for undertaking a similar study on other genes of interest. Lastly, the studies on FOXP2 done within this work lay the foundation for future studies investigating the role of FOXP2 in modulating learning behavior, and thereby affecting human speech.
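The knockout-versus-wild-type comparison can be sketched as a per-gene log2 fold-change computation on normalised counts. The gene names (other than Foxp2) and all counts below are hypothetical, chosen only to show the shape of such an analysis, not results from the thesis.

```python
import math
import statistics

def log2_fold_changes(wildtype, model):
    """Per-gene log2 fold change of a mutant model vs wild-type replicates,
    computed from normalised expression counts (toy numbers)."""
    lfc = {}
    for gene in wildtype:
        wt = statistics.mean(wildtype[gene])
        mut = statistics.mean(model[gene])
        lfc[gene] = math.log2((mut + 1) / (wt + 1))  # +1 pseudocount avoids log(0)
    return lfc

# Hypothetical normalised counts for three genes across lung-tissue replicates.
wildtype = {"Foxp2": [120, 132, 118], "GeneA": [40, 36, 44], "GeneB": [500, 480, 510]}
knockout = {"Foxp2": [58, 66, 61], "GeneA": [39, 42, 37], "GeneB": [505, 470, 515]}

lfc = log2_fold_changes(wildtype, knockout)
for gene, v in sorted(lfc.items()):
    print(f"{gene}: log2FC = {v:+.2f}")
```

A full analysis would add dispersion modelling and multiple-testing correction (as dedicated differential-expression tools do), but the fold change on normalised counts is the core quantity behind an expression atlas comparison.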

    Response of saline reservoir to different-phase CO₂-brine: experimental tests and image-based modelling

    Geological CO₂ storage in saline rocks is a promising method for meeting the target of net-zero emissions and minimizing the anthropogenic CO₂ emitted into the earth’s atmosphere. Storage of CO₂ in saline rocks triggers CO₂-brine-rock interactions that alter the properties of the rock. These properties are crucial for the integrity and efficiency of the storage process, so changes due to CO₂-brine-rock interaction must be well predicted, as some changes can reduce the storage integrity of the reservoir. Considering the thermodynamics, phase behavior, and solubility of CO₂ in brine, and the variable pressure-temperature conditions of the reservoir, undissolved CO₂ will exist in a storage reservoir alongside the brine for a long time, and there is a potential for phase evolution of this undissolved CO₂. The phase of the CO₂ influences the CO₂-brine-rock interaction, and each phase CO₂-brine combination has a unique effect on the properties of the reservoir rocks. Therefore, this study evaluates the effect of four different-phase CO₂-brine reservoir states on the properties of reservoir rocks using experimental and image-based approaches. Samples were saturated with the different-phase CO₂-brine fluids and then subjected to reservoir conditions in a triaxial compression test. The representative element volume (REV)/representative element area (REA) for the rock samples was determined from processed digital images, and rock properties were evaluated using digital rock physics and rock image analysis techniques. This research has evaluated the effect of different-phase CO₂-brine on deformation rate and deformation behavior, bulk modulus, compressibility, strength, and stiffness, as well as on the porosity and permeability of sample reservoir rocks. Changes in pore geometry, porosity, and permeability of the rocks under CO₂ storage conditions with different-phase CO₂-brine have been evaluated using digital rock physics techniques.
Microscopic rock image analysis has been applied to provide evidence of changes in micro-fabric, mineral topology, and the elemental composition of minerals in saline rocks resulting from the different-phase CO₂-brine fluids that can exist in a saline CO₂ storage reservoir. The properties most affected by the scCO₂-brine (supercritical) state of the reservoir include secondary fatigue rate, bulk modulus, shear strength, change in mineral topology after saturation, and change in the shape and flatness of pore surfaces. The properties most affected by the gCO₂-brine (gaseous) state include primary fatigue rate, change in permeability due to stress, change in porosity due to stress, and change in mineral topology due to stress. For all samples, the roundness and smoothness of grains as well as the smoothness of pores increased after compression, while the roundness of pores decreased. The change in the elemental composition of rock minerals during CO₂-brine-rock interaction was seen to depend on the reactivity of the mineral with CO₂ and/or brine, and the presence of brine accelerates such change. Carbon, oxygen, and silicon can be used as index elements for elemental changes in a CO₂-brine-rock system. The results of this work can be applied to predicting the effect that the different possible phases of CO₂ will have on the deformation, geomechanical indices, and storage integrity of giant CO₂ storage fields such as Sleipner, In Salah, etc.