The Atacama Cosmology Telescope: Data Characterization and Map Making
We present a description of the data reduction and mapmaking pipeline used
for the 2008 observing season of the Atacama Cosmology Telescope (ACT). The
data presented here at 148 GHz represent 12% of the 90 TB collected by ACT from
2007 to 2010. In 2008 we observed for 136 days, producing a total of 1423 hours
of data (11 TB for the 148 GHz band only), with a daily average of 10.5 hours
of observation. From these, 1085 hours were devoted to an 850 deg^2 stripe (11.2
hours by 9.1 deg) centered on a declination of -52.7 deg, while 175 hours were
devoted to a 280 deg^2 stripe (4.5 hours by 4.8 deg) centered at the celestial
equator. We discuss sources of statistical and systematic noise, calibration,
telescope pointing, and data selection. Out of 1260 survey hours and 1024
detectors per array, 816 hours and 593 effective detectors remain after data
selection for this frequency band, yielding a 38% survey efficiency. The total
sensitivity in 2008, determined from the noise level between 5 Hz and 20 Hz in
the time-ordered data stream (TOD), is 32 micro-Kelvin sqrt{s} in CMB units.
Atmospheric brightness fluctuations constitute the main contaminant in the data
and dominate the detector noise covariance at low frequencies in the TOD. The
maps were made by solving the least-squares problem using the Preconditioned
Conjugate Gradient method, incorporating the details of the detector and noise
correlations. Cross-correlation with WMAP sky maps, as well as analysis of
simulations, reveals that our maps are unbiased at multipoles ell > 300. This
paper accompanies the public release of the 148 GHz southern stripe maps from
2008. The techniques described here will be applied to future maps and data
releases.
Comment: 20 pages, 18 figures, 6 tables, an ACT Collaboration paper
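To make the map-making step concrete, below is a minimal sketch of a Preconditioned Conjugate Gradient solver for the normal equations (P^T N^-1 P) m = P^T N^-1 d that such a pipeline solves, where P is the pointing matrix, N the noise covariance, and d the TOD. The toy pointing and white-noise matrices are illustrative stand-ins, not ACT's actual detector or noise model.

```python
# A minimal PCG sketch for map-making, assuming a toy pointing matrix and
# white noise rather than ACT's correlated noise model.
import numpy as np

def pcg(apply_A, b, apply_Minv, tol=1e-8, max_iter=500):
    """Solve A x = b for symmetric positive-definite A with preconditioner M^-1."""
    x = np.zeros_like(b)
    r = b - apply_A(x)            # initial residual
    z = apply_Minv(r)             # preconditioned residual
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = apply_A(p)
        alpha = rz / (p @ Ap)
        x += alpha * p             # update the map estimate
        r -= alpha * Ap            # update the residual
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = apply_Minv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p  # new conjugate search direction
        rz = rz_new
    return x

# Toy example: 3 map pixels observed by a short TOD with white noise.
rng = np.random.default_rng(0)
P = rng.integers(0, 2, size=(50, 3)).astype(float)   # toy pointing matrix
Ninv = np.eye(50)                                     # white-noise inverse covariance
d = P @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.standard_normal(50)
A = P.T @ Ninv @ P
b = P.T @ Ninv @ d
Minv = np.diag(1.0 / np.diag(A))                      # Jacobi preconditioner
m = pcg(lambda v: A @ v, b, lambda v: Minv @ v)
print(m)  # recovers the input map values up to noise
```

In practice the preconditioner and the treatment of correlated atmospheric noise dominate the convergence behavior; the Jacobi choice above is only the simplest option.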
The Atacama Cosmology Telescope: Cosmological parameters from three seasons of data
We present constraints on cosmological and astrophysical parameters from
high-resolution microwave background maps at 148 GHz and 218 GHz made by the
Atacama Cosmology Telescope (ACT) in three seasons of observations from 2008 to
2010. A model of primary cosmological and secondary foreground parameters is
fit to the map power spectra and lensing deflection power spectrum, including
contributions from both the thermal Sunyaev-Zeldovich (tSZ) effect and the
kinematic Sunyaev-Zeldovich (kSZ) effect, Poisson and correlated anisotropy
from unresolved infrared sources, radio sources, and the correlation between
the tSZ effect and infrared sources. The power ell^2 C_ell/2pi of the thermal
SZ power spectrum at 148 GHz is measured to be 3.4 +/- 1.4 muK^2 at ell=3000,
while the corresponding amplitude of the kinematic SZ power spectrum has a 95%
confidence level upper limit of 8.6 muK^2. Combining ACT power spectra with the
WMAP 7-year temperature and polarization power spectra, we find excellent
consistency with the LCDM model. We constrain the number of effective
relativistic degrees of freedom in the early universe to be Neff=2.79 +/- 0.56,
in agreement with the canonical value of Neff=3.046 for three massless
neutrinos. We constrain the sum of the neutrino masses to be Sigma m_nu < 0.39
eV at 95% confidence when combining ACT and WMAP 7-year data with BAO and
Hubble constant measurements. We constrain the amount of primordial helium to
be Yp = 0.225 +/- 0.034, and measure no variation in the fine structure
constant alpha since recombination, with alpha/alpha0 = 1.004 +/- 0.005. We
also find no evidence for any running of the scalar spectral index, dns/dlnk =
-0.004 +/- 0.012.
Comment: 26 pages, 22 figures. This paper is a companion to Das et al. (2013) and Dunkley et al. (2013). Matches published JCAP version.
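The fitted model has a simple additive structure: the lensed CMB spectrum plus amplitude-scaled templates for each secondary component, with amplitudes quoted at the pivot ell=3000. The sketch below shows that parameterization with placeholder template shapes and amplitudes; the actual likelihood uses tabulated tSZ, kSZ, and infrared-source templates, so this is a schematic, not the ACT likelihood.

```python
# Schematic secondary/foreground model: total power (ell^2 C_ell / 2pi
# convention, muK^2) as lensed CMB plus amplitude x template terms.
# Template shapes and amplitudes here are illustrative placeholders.
import numpy as np

ell = np.arange(500, 10000).astype(float)
ell0 = 3000.0  # pivot multipole at which amplitudes are quoted

def tsz_template(ell):
    return np.ones_like(ell)   # placeholder; real analyses use simulated shapes

def ksz_template(ell):
    return np.ones_like(ell)   # placeholder

def d_ell_total(d_ell_cmb, a_tsz=3.4, a_ksz=4.0, a_poisson=10.0, a_clustered=5.0):
    """Total model power in muK^2; each amplitude is the value at ell0."""
    tsz = a_tsz * tsz_template(ell)                 # thermal SZ (3.4 muK^2 at 148 GHz)
    ksz = a_ksz * ksz_template(ell)                 # kinematic SZ (< 8.6 muK^2, 95% CL)
    poisson = a_poisson * (ell / ell0) ** 2         # Poisson sources: constant C_ell
    clustered = a_clustered * (ell / ell0) ** 0.8   # clustered infrared background
    return d_ell_cmb + tsz + ksz + poisson + clustered

d_cmb = 1000.0 * np.exp(-ell / 1500.0)  # toy stand-in for the lensed CMB spectrum
print(d_ell_total(d_cmb)[ell == 3000.0])
```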
A Search for Technosignatures Around 11,680 Stars with the Green Bank Telescope at 1.15-1.73 GHz
We conducted a search for narrowband radio signals over four observing
sessions in 2020-2023 with the L-band receiver (1.15-1.73 GHz) of the 100 m
diameter Green Bank Telescope. We pointed the telescope in the directions of 62
TESS Objects of Interest, capturing radio emissions from a total of ~11,680
stars and planetary systems in the ~9 arcminute beam of the telescope. All
detections were either automatically rejected or visually inspected and
confirmed to be of anthropogenic nature. In this work, we also quantified the
end-to-end efficiency of radio SETI pipelines with a signal injection and
recovery analysis. The UCLA SETI pipeline recovers 94.0% of the injected
signals over the usable frequency range of the receiver and 98.7% of the
injections when regions of dense RFI are excluded. In another pipeline that
uses incoherent sums of 51 consecutive spectra, the recovery rate is ~15 times
smaller at ~6%. The pipeline efficiency affects calculations of transmitter
prevalence and SETI search volume. Accordingly, we developed an improved Drake
Figure of Merit and a formalism to place upper limits on transmitter prevalence
that take the pipeline efficiency and transmitter duty cycle into account.
Based on our observations, we can state at the 95% confidence level that fewer
than 6.6% of stars within 100 pc host a transmitter that is detectable in our
search (EIRP > 1e13 W). For stars within 20,000 ly, the fraction of stars with
detectable transmitters (EIRP > 5e16 W) is at most 3e-4. Finally, we showed
that the UCLA SETI pipeline natively detects the signals detected with AI
techniques by Ma et al. (2023).
Comment: 22 pages, 9 figures, submitted to AJ, revised
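To illustrate how pipeline efficiency enters such limits, consider a binomial upper limit on transmitter prevalence with zero confirmed detections: a transmitter hosted by an observed star is recovered with probability eta x delta (pipeline efficiency times duty cycle), so the exclusion condition is (1 - f * eta * delta)^N <= 0.05. This is a simplified stand-in for the paper's improved Drake Figure of Merit formalism, with illustrative numbers.

```python
# Simplified 95% CL upper limit on transmitter prevalence, assuming zero
# detections among n_stars and a recovery probability eta * duty_cycle.
def prevalence_upper_limit(n_stars, eta=1.0, duty_cycle=1.0, confidence=0.95):
    """Upper limit on the fraction of observed stars hosting a detectable transmitter."""
    p_eff = eta * duty_cycle  # probability a hosted transmitter is actually recovered
    return (1.0 - (1.0 - confidence) ** (1.0 / n_stars)) / p_eff

print(prevalence_upper_limit(11_680, eta=0.940))  # ~2.7e-4 with 94% recovery
print(prevalence_upper_limit(11_680, eta=0.06))   # ~16x weaker with a ~6% pipeline
```

The two calls show directly why a low-efficiency pipeline weakens the prevalence limit in proportion to its recovery rate.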
Mimicking human neuronal pathways in silico: an emergent model on the effective connectivity
We present a novel computational model that detects temporal configurations of a given human neuronal pathway and constructs its artificial replication. This poses a great challenge since direct recordings from individual neurons are impossible in the human central nervous system, and the underlying neuronal pathway must therefore be treated as a black box. To tackle this challenge, we used a branch of complex-systems modeling called artificial self-organization, in which large sets of software entities interacting locally give rise to bottom-up collective behaviors. The result is an emergent model in which each software entity represents an integrate-and-fire neuron. We then applied the model to the reflex responses of single motor units obtained from conscious human subjects. Comparison against appropriate surrogate data shows that the model recovers the functionality of real human neuronal pathways. What makes the model promising is that, to the best of our knowledge, it is the first realistic model to self-wire an artificial neuronal network by efficiently combining neuroscience with artificial self-organization. Although there is no evidence yet that the model's connectivity maps onto human connectivity, we anticipate that this model will help neuroscientists learn much more about human neuronal networks, and it could also be used to generate hypotheses that guide future experiments.
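For concreteness, here is a minimal leaky integrate-and-fire neuron of the kind each software entity represents. Parameter values are generic textbook choices, not those of the paper's model.

```python
# A minimal leaky integrate-and-fire (LIF) neuron: integrate the membrane
# potential and emit a spike on threshold crossing. Units: ms, mV, nA, MOhm.
import numpy as np

def simulate_lif(i_input, dt=0.1, tau=10.0, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-70.0, r_m=10.0):
    """Integrate dV/dt = (-(V - v_rest) + R*I) / tau; spike and reset at threshold."""
    v = v_rest
    spikes, trace = [], []
    for t, i_t in enumerate(i_input):
        v += (-(v - v_rest) + r_m * i_t) * dt / tau
        if v >= v_thresh:          # threshold crossing: fire and reset
            spikes.append(t * dt)
            v = v_reset
        trace.append(v)
    return np.array(trace), spikes

# Constant 2.0 nA drive for 100 ms produces a regular spike train.
current = np.full(1000, 2.0)
voltage, spike_times = simulate_lif(current)
print(f"{len(spike_times)} spikes, first at t = {spike_times[0]:.1f} ms")
```

In the emergent model, many such units would be wired to one another through local interaction rules; the single-neuron dynamics above are only the building block.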
The North American tree-ring fire-scar network
Fire regimes in North American forests are diverse, and modern fire records are often too short to capture important patterns, trends, feedbacks, and drivers of variability. Tree-ring fire scars provide valuable perspectives on fire regimes, including centuries-long records of fire year, season, frequency, severity, and size. Here, we introduce the newly compiled North American tree-ring fire-scar network (NAFSN), which contains 2562 sites and >37,000 fire-scarred trees and covers large parts of North America. We investigate the NAFSN in terms of geography, sample depth, vegetation, topography, climate, and human land use. Fire scars are found in most ecoregions, from boreal forests in northern Alaska and Canada to subtropical forests in southern Florida and Mexico. The network includes 91 tree species, but is dominated by gymnosperms in the genus Pinus. Fire scars are found from sea level to >4000-m elevation and across a range of topographic settings that vary by ecoregion. Multiple regions are densely sampled (e.g., >1000 fire-scarred trees), enabling new spatial analyses such as reconstructions of area burned. To demonstrate the potential of the network, we compared the climate space of the NAFSN to those of modern fires and forests; the NAFSN spans a climate space largely representative of the forested areas in North America, with notable gaps in warmer tropical climates. Modern fires are burning in similar climate spaces as historical fires, but disproportionately in warmer regions compared to the historical record, possibly related to under-sampling of warm subtropical forests or supporting observations of changing fire regimes. The historical influence of Indigenous and non-Indigenous human land use on fire regimes varies in space and time. A 20th century fire deficit associated with human activities is evident in many regions, yet fire regimes characterized by frequent surface fires are still active in some areas (e.g., Mexico and the southeastern United States). These analyses provide a foundation and framework for future studies using the hundreds of thousands of annually- to sub-annually-resolved tree-ring records of fire spanning centuries, which will further advance our understanding of the interactions among fire, climate, topography, vegetation, and humans across North America.
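As a small example of the analyses such a network supports, a site-level mean fire return interval follows directly from the recorded fire years. The fire dates below are invented for illustration, not drawn from the NAFSN.

```python
# Mean fire return interval (MFRI) from a fire-scar chronology: the average
# number of years between successive recorded fires at a site.
import numpy as np

fire_years = np.array([1684, 1701, 1715, 1729, 1748, 1763, 1785, 1804, 1822, 1851, 1880])
intervals = np.diff(fire_years)                 # years between successive fires
print(f"mean fire return interval: {intervals.mean():.1f} yr")
print(f"interval range: {intervals.min()}-{intervals.max()} yr "
      f"over {fire_years[0]}-{fire_years[-1]}")
```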
Naturalizing Institutions: Evolutionary Principles and Application on the Case of Money
In recent extensions of the Darwinian paradigm into economics, the replicator-interactor duality looms large. I propose a strictly naturalistic approach to this duality in the context of the theory of institutions, which means that its use is seen as being always and necessarily dependent on identifying a physical realization. I introduce a general framework for the analysis of institutions, which synthesizes Searle's and Aoki's theories, especially with regard to the role of public representations (signs) in the coordination of actions and the function of the cognitive processes that underlie rule-following as a behavioral disposition. This makes it possible to conceive of institutions as causal circuits that connect the population-level dynamics of interactions with cognitive phenomena on the individual level. Those cognitive phenomena are ultimately rooted in neuronal structures. Drawing on a critical restatement of Aunger's concept of the meme, I propose a new conceptualization of the replicator in the context of institutions: the replicator is a causal conjunction between signs and neuronal structures which undergirds the dispositions that generate rule-following actions. Signs, in turn, are outcomes of population-level interactions. I apply this framework to the case of money, analyzing the emotions that go along with the use of money and presenting a stylized account of the emergence of money in terms of the naturalized Searle-Aoki model. In this view, money is a neuronally anchored metaphor for emotions relating to social exchange and reciprocity. Money as a meme is physically realized in a replicator that is a causal conjunction of money artefacts and money emotions.
Titin-truncating variants affect heart function in disease cohorts and the general population
Titin-truncating variants (TTNtv) commonly cause dilated cardiomyopathy (DCM). TTNtv are also encountered in ~1% of the general population, where they may be silent, perhaps reflecting allelic factors. To better understand TTNtv, we integrated TTN allelic series, cardiac imaging and genomic data in humans and studied rat models with disparate TTNtv. In patients with DCM, TTNtv throughout titin were significantly associated with the disease. Ribosomal profiling in rat showed the translational footprint of premature stop codons in Ttn, TTNtv-position-independent nonsense-mediated degradation of the mutant allele and a signature of perturbed cardiac metabolism. Heart physiology in rats with TTNtv was unremarkable at baseline but became impaired during cardiac stress. In healthy humans, machine-learning-based analysis of high-resolution cardiac imaging showed TTNtv to be associated with eccentric cardiac remodeling. These data show that TTNtv have molecular and physiological effects on the heart across species, with a continuum of expressivity in health and disease.
GA4GH: International policies and standards for data sharing across genomic research and healthcare.
The Global Alliance for Genomics and Health (GA4GH) aims to accelerate biomedical advances by enabling the responsible sharing of clinical and genomic data through both harmonized data aggregation and federated approaches. The decreasing cost of genomic sequencing (along with other genome-wide molecular assays) and increasing evidence of its clinical utility will soon drive the generation of sequence data from tens of millions of humans, with increasing levels of diversity. In this perspective, we present the GA4GH strategies for addressing the major challenges of this data revolution. We describe the GA4GH organization, which is fueled by the development efforts of eight Work Streams and informed by the needs of 24 Driver Projects and other key stakeholders. We present the GA4GH suite of secure, interoperable technical standards and policy frameworks and review the current status of standards, their relevance to key domains of research and clinical care, and future plans of GA4GH. Broad international participation in building, adopting, and deploying GA4GH standards and frameworks will catalyze an unprecedented effort in data sharing that will be critical to advancing genomic medicine and ensuring that all populations can access its benefits.
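As one concrete example of a GA4GH technical standard, the htsget API streams genomic reads by region: the client requests a genomic interval and receives a JSON "ticket" listing the URLs of data blocks to fetch and concatenate. The server URL and read-set identifier below are hypothetical placeholders, and the sketch omits authentication and the spec's inline data: URIs.

```python
# Minimal htsget client sketch (GA4GH reads API), assuming a hypothetical
# endpoint; real servers require appropriate IDs and often authentication.
import requests

BASE = "https://htsget.example.org"  # hypothetical htsget server
READ_ID = "NA12878"                  # hypothetical read-set identifier

# Step 1: request reads overlapping a region; the server replies with a ticket.
resp = requests.get(
    f"{BASE}/reads/{READ_ID}",
    params={"format": "BAM", "referenceName": "chr1", "start": 100000, "end": 200000},
)
ticket = resp.json()["htsget"]

# Step 2: fetch each data block named in the ticket and concatenate the results.
bam_bytes = b""
for block in ticket["urls"]:
    chunk = requests.get(block["url"], headers=block.get("headers", {}))
    bam_bytes += chunk.content  # blocks concatenate into a valid BAM stream
```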
A global action agenda for turning the tide on fatty liver disease
Background and Aims:
Fatty liver disease is a major public health threat due to its very high prevalence and related morbidity and mortality. Focused and dedicated interventions are urgently needed to target disease prevention, treatment, and care.
Approach and Results:
We developed an aligned, prioritized action agenda for the global fatty liver disease community of practice. Following a Delphi methodology over 2 rounds, a large panel (R1 n = 344, R2 n = 288) reviewed the action priorities in Qualtrics XM, indicating agreement on a 4-point Likert scale and providing written feedback. Priorities were revised between rounds, and in R2, panelists also ranked the priorities within 6 domains: epidemiology, treatment and care, models of care, education and awareness, patient and community perspectives, and leadership and public health policy. The consensus fatty liver disease action agenda encompasses 29 priorities. In R2, the mean percentage of “agree” responses was 82.4%, with all individual priorities having at least a super-majority of agreement (> 66.7% “agree”). The highest-ranked action priorities included collaboration between liver specialists and primary care doctors on early diagnosis, action to address the needs of people living with multiple morbidities, and the incorporation of fatty liver disease into relevant non-communicable disease strategies and guidance.
Conclusions:
This consensus-driven, multidisciplinary fatty liver disease action agenda, developed by care providers, clinical researchers, and public health and policy experts, provides a path to reduce the prevalence of fatty liver disease and improve health outcomes. To implement this agenda, concerted efforts will be needed at the global, regional, and national levels.
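For illustration, the R2 consensus tally described above can be reproduced in a few lines: 4-point Likert responses are collapsed to agree/disagree, and each priority is checked against the super-majority threshold. The panel responses below are randomly generated, not the study's data.

```python
# Tallying Delphi-round agreement: 4-point Likert responses (1-4), where
# 3 and 4 count as "agree", checked against a >66.7% super-majority.
import numpy as np

SUPERMAJORITY = 2 / 3  # each priority needs > 66.7% "agree"

def percent_agree(likert):
    """Fraction of 4-point Likert responses counted as 'agree' (3 or 4)."""
    return (np.asarray(likert) >= 3).mean()

rng = np.random.default_rng(1)
panel = rng.integers(1, 5, size=(288, 29))  # fabricated: 288 panelists x 29 priorities
agreement = np.array([percent_agree(panel[:, j]) for j in range(29)])
print(f"mean agreement: {agreement.mean():.1%}")
print(f"priorities clearing the super-majority: {(agreement > SUPERMAJORITY).sum()}/29")
```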
The James Webb Space Telescope Mission
Twenty-six years ago a small committee report, building on earlier studies,
expounded a compelling and poetic vision for the future of astronomy, calling
for an infrared-optimized space telescope with an aperture of at least 4 m.
With the support of their governments in the US, Europe, and Canada, 20,000
people realized that vision as the James Webb Space Telescope. A
generation of astronomers will celebrate their accomplishments for the life of
the mission, potentially as long as 20 years, and beyond. This report and the
scientific discoveries that follow are extended thank-you notes to the 20,000
team members. The telescope is working perfectly, with much better image
quality than expected. In this and accompanying papers, we give a brief
history, describe the observatory, outline its objectives and current observing
program, and discuss the inventions and people who made it possible. We cite
detailed reports on the design and the measured performance on orbit.
Comment: Accepted by PASP for the special issue on The James Webb Space Telescope Overview, 29 pages, 4 figures