From the arrow of time in Badiali's quantum approach to the dynamic meaning of Riemann's hypothesis
The novelty of Jean-Pierre Badiali's last scientific works stems from a quantum approach based on both (i) a return to the notion of trajectories (Feynman paths) and (ii) the irreversibility of quantum transitions. These iconoclastic choices recover the Hilbertian and the von Neumann algebraic points of view by dealing with statistics over loops. This approach confers an external, thermodynamic origin on the notion of a quantum unit of time (Rovelli and Connes' thermal time). This notion, the basis for quantization, appears herein as a mere criterion separating the quantum regime from the thermodynamic regime. The purpose of this note is to unfold the content of the last five years of scientific exchanges aiming to link, in a coherent scheme, Jean-Pierre's choices and works with the works of the authors of this note, based on hyperbolic geodesics and the associated role of Riemann zeta functions. While these options do not reveal any contradiction, they nevertheless give birth to an intrinsic arrow of time different from the thermal time. The question of the physical meaning of the Riemann hypothesis as a basis of quantum mechanics, which was at the heart of our last exchanges, is the backbone of this note.
Comment: 13 pages, 2 figures
Arrows of times, non-integer operators, self-similar structures, zeta functions and Riemann hypothesis: A synthetic categorical approach
© 2017 L & H Scientific Publishing, LLC. The authors have previously reported the existence of a morphism between the Riemann zeta function and the "Cole and Cole" canonical transfer functions observed in dielectric relaxation, electrochemistry, mechanics and electromagnetism. The link with self-similar structures has long been addressed, as has the discovery of the incompleteness that may be attached to any dynamics controlled by non-integer derivative operators. Furthermore, it was already shown that the Riemann hypothesis can be associated with a transition of an order parameter given by the geometric phase attached to the fractional operators. The aim of this note is to show that all these properties have a generic basis in category theory. The highlighting of the incompleteness of non-integer operators, considered critical by some authors, is relevant, but the use of the morphism with the zeta function reduces the operational impact of this issue without limiting its epistemological consequences
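For reference, the canonical "Cole and Cole" transfer function mentioned above is commonly written (in our notation, with a characteristic relaxation time τ₀ assumed here) as:

```latex
H(i\omega) = \frac{1}{1 + (i\omega\tau_0)^{\alpha}}, \qquad 0 < \alpha < 1,
```

where the non-integer exponent α carries the fractional dynamics; the limit α = 1 recovers simple Debye relaxation.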
Single Particle Fluorescence & Mass Spectrometry for the Detection of Biological Aerosols
Biological Aerosol Mass Spectrometry (BAMS) is an emerging technique for the detection of biological aerosols, which is being developed at Lawrence Livermore National Laboratory. The current system uses several orthogonal analytical methods to improve system selectivity, sensitivity and speed in order to maximize its utility as a biological aerosol detection system with extremely low probability of false alarm and high probability of detection. Our approach is to pre-select particles of interest by size and fluorescence prior to mass spectral analysis. The ability to distinguish biological aerosols from background and to discriminate bacterial spores, vegetative cells, viruses and toxins from one another will be shown. Data from particle standards of known chemical composition will be discussed. Analysis of ambient particles will also be presented
DSP-Based dual-polarity mass spectrum pattern recognition for bio-detection
The Bio-Aerosol Mass Spectrometry (BAMS) instrument analyzes single aerosol particles using a dual-polarity time-of-flight mass spectrometer, simultaneously recording spectra of thirty to a hundred thousand points in each polarity. We describe here a real-time pattern recognition algorithm developed at Lawrence Livermore National Laboratory that has been implemented on a nine-Digital-Signal-Processor (DSP) system from Signatec Incorporated. The algorithm first preprocesses the raw time-of-flight data independently through an adaptive baseline removal routine. The next step is a polarity-dependent calibration to a mass-to-charge representation, reducing the data to about five hundred to a thousand channels per polarity. The last step is identification using a pattern recognition algorithm based on a library of known particle signatures, including threat agents and background particles. The identification step integrates the two polarities into a final determination using a score-based rule tree. This algorithm, operating on multiple channels per polarity and multiple polarities, is well suited for parallel real-time processing. It has been implemented on the PMP8A from Signatec Incorporated, a computer-based board that can interface directly to the two one-Giga-Sample digitizers (PDA1000 from Signatec Incorporated) used to record the two polarities of time-of-flight data. By using optimized data separation, pipelining, and parallel processing across the nine DSPs, it is possible to achieve a processing speed of up to a thousand particles per second while maintaining the recognition rate observed in a non-real-time implementation. This embedded system has allowed the BAMS technology to improve its throughput and therefore its sensitivity while maintaining a large dynamic range (number of channels and two polarities), thus preserving the system's specificity for bio-detection
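The three processing stages described above (adaptive baseline removal, polarity-dependent mass-to-charge calibration, library scoring) can be sketched in outline. This is a minimal illustration, not the LLNL implementation: the function names, the running-median baseline estimator, and the normalized-dot-product score are assumptions standing in for the actual adaptive routines and the score-based rule tree.

```python
import numpy as np

def remove_baseline(trace, window=101):
    """Subtract a running-median estimate of the slowly varying
    baseline from a raw time-of-flight trace (illustrative stand-in
    for the adaptive baseline removal routine)."""
    pad = window // 2
    padded = np.pad(trace, pad, mode="edge")
    baseline = np.array([np.median(padded[i:i + window])
                         for i in range(len(trace))])
    return trace - baseline

def tof_to_mz(t, a, t0):
    """Polarity-dependent calibration: in a time-of-flight analyzer,
    m/z is proportional to (t - t0)^2; a and t0 are per-polarity
    calibration constants."""
    return a * (t - t0) ** 2

def score_against_library(spectrum, library):
    """Score a binned spectrum against each library signature with a
    normalized dot product; return the best-matching label and score."""
    best_label, best_score = None, -1.0
    v = spectrum / (np.linalg.norm(spectrum) + 1e-12)
    for label, sig in library.items():
        s = sig / (np.linalg.norm(sig) + 1e-12)
        score = float(np.dot(v, s))
        if score > best_score:
            best_label, best_score = label, score
    return best_label, best_score
```

In a real-time setting, each of these stages operates independently per polarity, which is what makes the pipeline amenable to the parallel DSP mapping described in the abstract.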
Characterization of ambient aerosols at the San Francisco International Airport using BioAerosol Mass Spectrometry
The BioAerosol Mass Spectrometry (BAMS) system is a rapidly fieldable, fully autonomous instrument that can perform correlated measurements of multiple orthogonal properties of individual aerosol particles. The BAMS front end uses optical techniques to nondestructively measure a particle's aerodynamic diameter and fluorescence properties. Fluorescence can be excited at 266 nm or 355 nm and is detected in two broad wavelength bands. Individual particles with appropriate size and fluorescence properties can then be analyzed more thoroughly in a dual-polarity time-of-flight mass spectrometer. Over the course of two deployments to the San Francisco International Airport, more than 6.5 million individual aerosol particles were fully analyzed by the system. Analysis of the resulting data has provided a number of important insights relevant to rapid bioaerosol detection, which are described here
Initial test results of an ionization chamber shower detector for a LHC luminosity monitor
A novel, segmented, multi-gap, pressurized gas ionization chamber is being developed for optimization of the luminosity of the LHC. The ionization chambers are to be installed in the front quadrupole and zero-degree neutral particle absorbers in the high-luminosity IRs and sample the energy deposited near the maxima of the hadronic/electromagnetic showers in these absorbers. The ionization chambers are instrumented with low-noise, fast, pulse-shaping electronics capable of resolving individual bunch crossings at 40 MHz. In this paper we report the initial results of our second test of this instrumentation in an SPS external proton beam. Single 300 GeV protons are used to simulate the hadronic/electromagnetic shower produced by the forward collision products from the interaction regions of the LHC. The capability of the instrumentation to measure the luminosity of individual bunches in a 40 MHz bunch train is demonstrated
LSST: from Science Drivers to Reference Design and Anticipated Data Products
(Abridged) We describe here the most ambitious survey currently planned in
the optical, the Large Synoptic Survey Telescope (LSST). A vast array of
science will be enabled by a single wide-deep-fast sky survey, and LSST will
have unique survey capability in the faint time domain. The LSST design is
driven by four main science themes: probing dark energy and dark matter, taking
an inventory of the Solar System, exploring the transient optical sky, and
mapping the Milky Way. LSST will be a wide-field ground-based system sited at
Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m
effective) primary mirror, a 9.6 deg² field of view, and a 3.2 Gigapixel
camera. The standard observing sequence will consist of pairs of 15-second
exposures in a given field, with two such visits in each pointing in a given
night. With these repeats, the LSST system is capable of imaging about 10,000
square degrees of sky in a single filter in three nights. The typical 5σ
point-source depth in a single visit in r will be ~24.5 (AB). The
project is in the construction phase and will begin regular survey operations
by 2022. The survey area will be contained within 30,000 deg² with
δ < +34.5°, and will be imaged multiple times in six bands, ugrizy,
covering the wavelength range 320-1050 nm. About 90% of the observing time
will be devoted to a deep-wide-fast survey mode which will uniformly observe an
18,000 deg² region about 800 times (summed over all six bands) during the
anticipated 10 years of operations, and yield a coadded map to r ~ 27.5. The
remaining 10% of the observing time will be allocated to projects such as a
Very Deep and Fast time domain survey. The goal is to make LSST data products,
including a relational database of about 32 trillion observations of 40 billion
objects, available to the public and scientists around the world.
Comment: 57 pages, 32 color figures, version with high-resolution figures
available from https://www.lsst.org/overvie
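The relation between the single-visit and coadded depths quoted above follows from standard photon statistics: stacking N background-limited visits grows the signal-to-noise as √N, deepening the 5σ limiting magnitude by 1.25 log₁₀(N). A minimal sketch (the function name and the assumption of purely background-limited, systematics-free stacking are ours):

```python
import math

def coadded_depth(single_visit_depth, n_visits):
    """Limiting magnitude after stacking n_visits exposures, assuming
    background-limited noise: S/N scales as sqrt(N), so the 5-sigma
    depth gains 2.5*log10(sqrt(N)) = 1.25*log10(N) magnitudes."""
    return single_visit_depth + 1.25 * math.log10(n_visits)
```

For example, an assumed single-visit depth of 24.5 with 100 visits in one band yields a coadded depth of 27.0; with the larger per-band visit counts of the full survey, depths near r ~ 27.5 follow from the same scaling.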
An integrated online radioassay data storage and analytics tool for nEXO
Large-scale low-background detectors are increasingly used in rare-event
searches as experimental collaborations push for enhanced sensitivity. However,
building such detectors, in practice, creates an abundance of radioassay data,
especially during the conceptual phase of an experiment when hundreds of
materials are screened for radiopurity. A tool is needed to manage and make use
of the radioassay screening data to quantitatively assess detector design
options. We have developed a Materials Database Application for the nEXO
experiment to serve this purpose. This paper describes this database, explains
how it functions, and discusses how it streamlines the design of the
experiment
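To illustrate how such a database can support quantitative assessment of detector design options, a screening result can be reduced to a decay rate per material and summed per isotope. This is a generic sketch; the `Assay` record, its field names, and the unit choices are hypothetical and not the nEXO Materials Database schema:

```python
from dataclasses import dataclass

@dataclass
class Assay:
    """One radioassay screening result (hypothetical record layout)."""
    material: str
    isotope: str              # e.g. "U-238", "Th-232"
    activity_mbq_per_kg: float  # measured specific activity, mBq/kg
    mass_kg: float              # mass of this material in the design

def background_budget(assays):
    """Aggregate screening results into a total decay rate per isotope
    (in Bq), the kind of figure a design comparison would start from."""
    budget = {}
    for a in assays:
        rate_bq = a.activity_mbq_per_kg * 1e-3 * a.mass_kg
        budget[a.isotope] = budget.get(a.isotope, 0.0) + rate_bq
    return budget
```

Swapping one candidate material for another then amounts to re-running the aggregation with a different set of assay records and comparing the per-isotope totals.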
Performance of novel VUV-sensitive Silicon Photo-Multipliers for nEXO
Liquid xenon time projection chambers are promising detectors to search for
neutrinoless double beta decay (0νββ), due to their response
uniformity, monolithic sensitive volume, scalability to large target masses,
and suitability for extremely low background operations. The nEXO collaboration
has designed a tonne-scale time projection chamber that aims to search for
0νββ of ¹³⁶Xe with projected half-life sensitivity of
approximately 10²⁸ yr. To reach this sensitivity, the design goal for nEXO is
1% energy resolution at the decay Q-value (~2458 keV).
Reaching this resolution requires the efficient collection of both the
ionization and scintillation produced in the detector. The nEXO design employs
Silicon Photo-Multipliers (SiPMs) to detect the vacuum ultra-violet, 175 nm
scintillation light of liquid xenon. This paper reports on the characterization
of the newest vacuum ultra-violet sensitive Fondazione Bruno Kessler VUVHD3
SiPMs specifically designed for nEXO, as well as new measurements on new test
samples of previously characterised Hamamatsu VUV4 Multi Pixel Photon Counters
(MPPCs). Various SiPM and MPPC parameters, such as dark noise, gain, direct
crosstalk, correlated avalanches and photon detection efficiency were measured
as a function of the applied over voltage and wavelength at liquid xenon
temperature (163~K). The results from this study are used to provide updated
estimates of the achievable energy resolution at the decay Q-value for the
nEXO design
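One way the measured photon detection efficiency and correlated-avalanche rates feed into such resolution estimates is through photon-counting statistics: correlated avalanches and crosstalk inflate the variance of the detected-photon count by an excess noise factor. A minimal sketch under that simplification (ignoring electronic noise and the ionization channel; the function name is ours):

```python
import math

def photon_statistics_resolution(n_detected, enf=1.0):
    """Fractional energy resolution (sigma/E) from photon counting
    alone: sqrt(ENF / N), where N is the mean number of detected
    scintillation photons and ENF >= 1 is the excess noise factor
    contributed by correlated avalanches and crosstalk."""
    return math.sqrt(enf / n_detected)
```

For instance, 10,000 detected photons with no excess noise would already give a 1% statistical contribution, which is why both the photon detection efficiency and the correlated-avalanche parameters measured in this study matter for the 1% design goal.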
Application of Spherical Harmonics to the Modeling of Anatomical Shapes
3D shape modeling is a key issue in the resolution of major medical imaging problems. In this paper we address the modeling of closed free-form anatomical shapes with spherical harmonics. We define the basis of an ongoing project by illustrating, through two preliminary applications, the interest of such modeling. After presenting spherical harmonics, both for static modeling and for time-dependent modeling, applications to the modeling and deformation analysis of vertebra shape from CT data, and to the modeling of the endocardial surface from SPECT data, are presented in turn
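The static modeling step can be illustrated by a least-squares fit of a star-shaped surface r(θ, φ) against a truncated real spherical-harmonic basis. This is a generic sketch (degree l ≤ 1, with the closed-form real basis written out), not the authors' pipeline:

```python
import numpy as np

def real_sph_basis(theta, phi):
    """Real spherical harmonics up to degree l = 1, evaluated at polar
    angle theta and azimuth phi, stacked as basis columns."""
    Y00 = 0.5 / np.sqrt(np.pi) * np.ones_like(theta)
    Y1m1 = np.sqrt(3.0 / (4.0 * np.pi)) * np.sin(theta) * np.sin(phi)
    Y10 = np.sqrt(3.0 / (4.0 * np.pi)) * np.cos(theta)
    Y11 = np.sqrt(3.0 / (4.0 * np.pi)) * np.sin(theta) * np.cos(phi)
    return np.stack([Y00, Y1m1, Y10, Y11], axis=-1)

def fit_radial_shape(theta, phi, r):
    """Least-squares fit of the sampled radii r(theta, phi) with the
    truncated expansion; returns the harmonic coefficients."""
    B = real_sph_basis(theta, phi)
    coeffs, *_ = np.linalg.lstsq(B, r, rcond=None)
    return coeffs

def eval_radial_shape(theta, phi, coeffs):
    """Reconstruct the surface from its harmonic coefficients."""
    return real_sph_basis(theta, phi) @ coeffs
```

In practice the expansion is carried to much higher degree, and the coefficients then serve directly as a compact shape descriptor for comparison or deformation analysis, as in the vertebra and endocardial applications described above.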