    Quantum Holographic Encoding in a Two-dimensional Electron Gas

    Full text link
    The advent of bottom-up atomic manipulation heralded a new horizon for attainable information density, as it allowed a bit of information to be represented by a single atom. The discrete spacing between atoms in condensed matter has thus set a rigid limit on the maximum possible information density. While modern technologies are still far from this scale, all theoretical downscaling of devices terminates at this spatial limit. Here, however, we break this barrier with electronic quantum encoding scaled to subatomic densities. We use atomic manipulation to first construct open nanostructures--"molecular holograms"--which in turn concentrate information into a medium free of lattice constraints: the quantum states of a two-dimensional degenerate Fermi gas of electrons. The information embedded in the holograms is transcoded at even smaller length scales into an atomically uniform area of a copper surface, where it is densely projected into both two spatial degrees of freedom and a third holographic dimension mapped to energy. In analogy to optical volume holography, this requires precise amplitude and phase engineering of electron wavefunctions to assemble pages of information volumetrically. This data is read out by mapping the energy-resolved electron density of states with a scanning tunnelling microscope. As the projection and readout are both extremely near-field, and because we use native quantum states rather than an external beam, we are not limited by lensing or collimation and can create electronically projected objects with features as small as ~0.3 nm. These techniques reach unprecedented densities exceeding 20 bits/nm² and place tens of bits into a single fermionic state. Comment: Published online 25 January 2009 in Nature Nanotechnology; 12-page manuscript (including 4 figures) + 2-page supplement (including 1 figure); supplementary movie available at http://mota.stanford.ed
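
    The quoted density can be sanity-checked with simple arithmetic. Below is a rough back-of-envelope sketch, assuming one bit per feature-sized cell and two holographic pages stacked in energy; neither assumption is stated in the abstract, so the numbers are illustrative only.

```python
# Back-of-envelope consistency check of the reported information density.
# The one-bit-per-cell packing and the two-page count are assumptions for illustration.
feature_size_nm = 0.3                          # smallest projected feature, ~0.3 nm (from the abstract)
bits_per_page = (1.0 / feature_size_nm) ** 2   # ~11 bits/nm^2 if each feature-sized cell carries one bit
pages = 2                                      # assumed number of holographic pages mapped to energy
print(bits_per_page * pages)                   # ~22 bits/nm^2, consistent with "exceeding 20 bits/nm^2"
```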

    New physics searches with heavy-ion collisions at the CERN Large Hadron Collider

    Get PDF
    This document summarises proposed searches for new physics accessible in the heavy-ion mode at the CERN Large Hadron Collider (LHC), both through hadronic and ultraperipheral γγ interactions, and that have a competitive or even unique discovery potential compared to standard proton-proton collision studies. Illustrative examples include searches for new particles - such as axion-like pseudoscalars, radions, magnetic monopoles, new long-lived particles, dark photons, and sexaquarks as dark matter candidates - as well as new interactions, such as nonlinear or non-commutative QED extensions. We argue that such interesting possibilities constitute a well-justified scientific motivation, complementing standard quark-gluon-plasma physics studies, to continue running with ions at the LHC after Run 4, i.e. beyond 2030, including light and intermediate-mass ion species, accumulating nucleon-nucleon integrated luminosities in the accessible fb⁻¹ range per month.

    Sharing health-related data: A privacy test?

    Get PDF
    Greater sharing of potentially sensitive data raises important ethical, legal and social issues (ELSI), which risk hindering and even preventing useful data sharing if not properly addressed. One such important issue is respecting the privacy-related interests of individuals whose data are used in genomic research and clinical care. As part of the Global Alliance for Genomics and Health (GA4GH), we examined the ELSI status of health-related data that are typically considered ‘sensitive’ in international policy and data protection laws. We propose that ‘tiered protection’ of such data could be implemented in contexts such as that of the GA4GH Beacon Project to facilitate responsible data sharing. To this end, we discuss a Data Sharing Privacy Test developed to distinguish degrees of sensitivity within categories of data recognised as ‘sensitive’. Based on this, we propose guidance for determining the level of protection when sharing genomic and health-related data for the Beacon Project and in other international data sharing initiatives.

    Transdimensional inversion of receiver functions and surface wave dispersion

    No full text
    We present a novel method for joint inversion of receiver functions and surface wave dispersion data, using a transdimensional Bayesian formulation. This class of algorithm treats the number of model parameters (e.g. number of layers) as an unknown in the problem. The dimension of the model space is variable and a Markov chain Monte Carlo (McMC) scheme is used to provide a parsimonious solution that fully quantifies the degree of knowledge one has about seismic structure (i.e. constraints on the model, resolution, and trade-offs). The level of data noise (i.e. the covariance matrix of data errors) effectively controls the information recoverable from the data and here it naturally determines the complexity of the model (i.e. the number of model parameters). However, it is often difficult to quantify the data noise appropriately, particularly in the case of seismic waveform inversion where data errors are correlated. Here we address the issue of noise estimation using an extended Hierarchical Bayesian formulation, which allows both the variance and covariance of data noise to be treated as unknowns in the inversion. In this way it is possible to let the data infer the appropriate level of data fit. In the context of joint inversions, assessment of uncertainty for different data types becomes crucial in the evaluation of the misfit function. We show that the Hierarchical Bayes procedure is a powerful tool in this situation, because it is able to evaluate the level of information brought by different data types in the misfit, thus removing the arbitrary choice of weighting factors. After illustrating the method with synthetic tests, a real data application is shown where teleseismic receiver functions and ambient noise surface wave dispersion measurements from the WOMBAT array (South-East Australia) are jointly inverted to provide a probabilistic 1D model of shear-wave velocity beneath a given station.
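
    To make the Hierarchical Bayes idea concrete, the following is a minimal toy sketch. It is not the receiver-function/dispersion forward problem and it omits the transdimensional birth/death moves; it only shows how sampling a separate noise sigma for each data type lets the data determine their relative weight in the joint misfit. All names and numbers are illustrative.

```python
# Toy hierarchical-Bayes joint inversion: the per-dataset noise sigmas are sampled
# along with the model, so dataset weighting is inferred rather than fixed by hand.
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic "data types" measuring the same scalar m_true, with different
# noise levels that the sampler does not know in advance.
m_true = 2.0
d1 = m_true + rng.normal(0.0, 0.1, size=50)   # stands in for, say, receiver functions
d2 = m_true + rng.normal(0.0, 0.5, size=50)   # stands in for, say, dispersion data

def log_posterior(m, s1, s2):
    """Gaussian likelihood with one noise sigma per dataset; flat prior on m,
    Jeffreys-style 1/sigma priors on the noise parameters."""
    if s1 <= 0.0 or s2 <= 0.0:
        return -np.inf
    ll1 = -0.5 * np.sum((d1 - m) ** 2) / s1 ** 2 - d1.size * np.log(s1)
    ll2 = -0.5 * np.sum((d2 - m) ** 2) / s2 ** 2 - d2.size * np.log(s2)
    return ll1 + ll2 - np.log(s1) - np.log(s2)

# Plain Metropolis-Hastings over (m, sigma1, sigma2); a real transdimensional run
# would additionally propose changes to the number of model parameters.
x = np.array([0.0, 1.0, 1.0])
lp = log_posterior(*x)
samples = []
for _ in range(20000):
    proposal = x + rng.normal(0.0, 0.05, size=3)
    lp_prop = log_posterior(*proposal)
    if np.log(rng.uniform()) < lp_prop - lp:
        x, lp = proposal, lp_prop
    samples.append(x.copy())

samples = np.array(samples[5000:])
print("posterior mean m      :", samples[:, 0].mean())   # close to 2.0
print("posterior mean sigma1 :", samples[:, 1].mean())   # ~0.1 -> dataset 1 fits tightly
print("posterior mean sigma2 :", samples[:, 2].mean())   # ~0.5 -> dataset 2 is down-weighted
```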

    Moho depths beneath the European Alps: a homogeneously processed map and receiver functions database

    Get PDF
    We use seismic waveform data from the AlpArray Seismic Network and three other temporary seismic networks to perform receiver function (RF) calculations and time-to-depth migration to update the knowledge of the Moho discontinuity beneath the broader European Alps. In particular, we set up a homogeneous processing scheme to compute RFs using the time-domain iterative deconvolution method and apply consistent quality control to yield 112 205 high-quality RFs. We then perform time-to-depth migration in a newly implemented 3D spherical coordinate system using a European-scale reference P and S wave velocity model. This approach, together with the dense data coverage, provides us with a 3D migrated volume, from which we present migrated profiles that reflect the first-order crustal thickness structure. We create a detailed Moho map by manually picking the discontinuity in a set of orthogonal profiles covering the entire area. We make the RF dataset, the software for the entire processing workflow, and the Moho map openly available; these open-access datasets and results will allow other researchers to build on the current study.
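
    The time-domain iterative deconvolution step named above can be sketched compactly. The following is a minimal, illustrative version in the spirit of that method: no Gaussian low-pass filter, no tapering, toy synthetic data, and function and variable names of my own choosing rather than those of the released workflow.

```python
# Minimal time-domain iterative deconvolution sketch: build the receiver function as a
# sparse spike train that, convolved with the vertical component, reproduces the radial.
import numpy as np

def iterative_deconvolution(radial, vertical, n_iters=50):
    nt = len(radial)
    rf = np.zeros(nt)                      # receiver-function estimate (spike train)
    residual = radial.copy()
    denom = np.sum(vertical * vertical)    # energy of the "source" trace
    for _ in range(n_iters):
        # Cross-correlate the residual with the vertical trace to locate the next spike.
        xcorr = np.correlate(residual, vertical, mode="full")[nt - 1:]   # lags >= 0
        lag = int(np.argmax(np.abs(xcorr)))
        rf[lag] += xcorr[lag] / denom      # least-squares amplitude for that lag
        # Remove the current prediction from the data and iterate.
        residual = radial - np.convolve(rf, vertical)[:nt]
    return rf

# Tiny synthetic check: radial = source wavelet convolved with two spikes (at 0 s and 4 s).
dt = 0.1
t = np.arange(0.0, 30.0, dt)
vertical = np.exp(-((t - 5.0) ** 2))            # crude source wavelet
true_rf = np.zeros_like(t)
true_rf[0], true_rf[int(4.0 / dt)] = 1.0, 0.4
radial = np.convolve(true_rf, vertical)[:len(t)]
rf = iterative_deconvolution(radial, vertical)
print(np.flatnonzero(np.abs(rf) > 0.1))         # spike indices near 0 and 40 samples
```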

    Intravenous alteplase for stroke with unknown time of onset guided by advanced imaging: systematic review and meta-analysis of individual patient data

    Get PDF
    Background: Patients who have had a stroke with unknown time of onset have previously been excluded from thrombolysis. We aimed to establish whether intravenous alteplase is safe and effective in such patients when salvageable tissue has been identified with imaging biomarkers.
    Methods: We did a systematic review and meta-analysis of individual patient data for trials published before Sept 21, 2020. Randomised trials of intravenous alteplase versus standard of care or placebo in adults with stroke with unknown time of onset with perfusion-diffusion MRI, perfusion CT, or MRI with diffusion weighted imaging-fluid attenuated inversion recovery (DWI-FLAIR) mismatch were eligible. The primary outcome was favourable functional outcome (score of 0–1 on the modified Rankin Scale [mRS], indicating no disability) at 90 days; an unconditional mixed-effect logistic-regression model was fitted to estimate the treatment effect. Secondary outcomes were mRS shift towards a better functional outcome and independent outcome (mRS 0–2) at 90 days. Safety outcomes included death, severe disability or death (mRS score 4–6), and symptomatic intracranial haemorrhage. This study is registered with PROSPERO, CRD42020166903.
    Findings: Of 249 identified abstracts, four trials met our eligibility criteria for inclusion: WAKE-UP, EXTEND, THAWS, and ECASS-4. The four trials provided individual patient data for 843 individuals, of whom 429 (51%) were assigned to alteplase and 414 (49%) to placebo or standard care. A favourable outcome occurred in 199 (47%) of 420 patients with alteplase and in 160 (39%) of 409 patients among controls (adjusted odds ratio [OR] 1·49 [95% CI 1·10–2·03]; p=0·011), with low heterogeneity across studies (I²=27%). Alteplase was associated with a significant shift towards better functional outcome (adjusted common OR 1·38 [95% CI 1·05–1·80]; p=0·019), and a higher odds of independent outcome (adjusted OR 1·50 [1·06–2·12]; p=0·022). In the alteplase group, 90 (21%) patients were severely disabled or died (mRS score 4–6), compared with 102 (25%) patients in the control group (adjusted OR 0·76 [0·52–1·11]; p=0·15). 27 (6%) patients died in the alteplase group and 14 (3%) patients died among controls (adjusted OR 2·06 [1·03–4·09]; p=0·040). The prevalence of symptomatic intracranial haemorrhage was higher in the alteplase group than among controls (11 [3%] vs two [<1%], adjusted OR 5·58 [1·22–25·50]; p=0·024).
    Interpretation: In patients who have had a stroke with unknown time of onset with a DWI-FLAIR or perfusion mismatch, intravenous alteplase resulted in better functional outcome at 90 days than placebo or standard care. A net benefit was observed for all functional outcomes despite an increased risk of symptomatic intracranial haemorrhage. Although there were more deaths with alteplase than placebo, there were fewer cases of severe disability or death.
    Funding: None.
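
    The headline treatment effect can be cross-checked from the counts given in the abstract. The quick arithmetic sketch below reproduces only the crude, unadjusted odds ratio; the reported 1·49 comes from an adjusted mixed-effect logistic-regression model, so the two figures need not coincide exactly.

```python
# Crude odds ratio for favourable outcome (mRS 0-1) from the counts in the abstract.
fav_alteplase, n_alteplase = 199, 420   # favourable outcomes / patients analysed, alteplase arm
fav_control, n_control = 160, 409       # favourable outcomes / patients analysed, control arm
odds_alteplase = fav_alteplase / (n_alteplase - fav_alteplase)
odds_control = fav_control / (n_control - fav_control)
print(round(odds_alteplase / odds_control, 2))   # ~1.40, in the same range as the adjusted OR of 1.49
```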