Neutrinos and Future Concordance Cosmologies
We review the free parameters in the concordance cosmology, and those which
might be added to this set as the quality of astrophysical data improves. Most
concordance parameters encode information about otherwise unexplored aspects of
high energy physics, up to the GUT scale via the "inflationary sector," and
possibly even the Planck scale in the case of dark energy. We explain how
neutrino properties may be constrained by future astrophysical measurements.
Conversely, future neutrino physics experiments which directly measure these
parameters will remove uncertainty from fits to astrophysical data, and improve
our ability to determine the global properties of our universe. Comment: Proceedings of paper given at the Neutrino 2008 meeting (by RE
Hadrons with Charm and Beauty
By combining potential models and QCD spectral sum rules (QSSR), we discuss
the spectroscopy of the mesons and of the , and baryons ( or ), the decay
constant and the (semi)leptonic decay modes of the meson. For the masses, the
best predictions come from potential models. The decay constant is well
determined from QSSR. The use of vertex sum rules for the semileptonic decays
of the shows that the -dependence of the form factors is much stronger than
predicted by vector meson dominance. It also predicts an almost equal
strength of about 0.30 sec for the semileptonic rates into and J/ψ. Besides
these phenomenological results, we also show explicitly how the Wilson
coefficients of the and gluon condensates already contain the full
heavy-quark () and mixed () condensate contributions in the OPE. Comment: 32
pages, LaTeX, no changes in the 1994 paper, latex errors corrected in 201
Electromagnetic Form Factors in the hypercentral CQM
We report on the recent results of the hypercentral Constituent Quark Model
(hCQM). The model contains a spin independent three-quark interaction which is
inspired by Lattice QCD calculations and reproduces the average energy values
of the SU(6) multiplets. The splittings are obtained with a SU(6)-breaking
interaction, which can also include an isospin-dependent term. Within
constituent quark models, we showed for the first time that the decrease of
the ratio of the elastic form factors of the proton is due to relativistic
effects, using relativistic corrections to the e.m. current and boosts. The
elastic nucleon form factors have now been recalculated using a relativistic
version of the hCQM and a relativistic quark current, yielding a very
detailed reproduction of all four form factors over the complete range of
0-4 of the existing data. Furthermore, the model has been used for
predictions of the electromagnetic transverse and longitudinal transition
form factors, giving a good description of the medium behaviour. We show that
the discrepancies in the reproduction of the helicity amplitudes at low are
due to pion loops. We have calculated the helicity amplitudes for all the 3-
and 4-star resonances, opening the possibility of application to the
evaluation of cross sections. Comment: 5 pages, 7 figures, Invited talk at the ICTP 4th International
Conference on Perspectives in Hadronic Physics, Trieste, Italy, 12-16 May
2003. Accepted by Eur. Phys. J.
Living with interpersonal data: observability and accountability in the age of pervasive ICT
The Internet of Things, alongside existing mobile digital technologies, heralds a world in which pervasive sensing constantly captures data about us. Simultaneous with this technology programme are moves by policymakers to shore up the digital economy by legislating new trust-building models of data management. These moves seek to give individuals control and oversight of their personal data. Within shared settings, the consequence of these changes is the large-scale generation of interpersonal data, generated by and acting on the group rather than the individual. We consider how such systems create new forms of observability, and hence accountability, among members of the home, and draw on the work of Simmel and Goffman to explore how these demands are managed. Such management mitigates the more extreme possibilities for domestic monitoring posited by these systems, yet without careful design there remains a considerable danger of unanticipated negative consequences.
Search for Flavoured Multiquarks in a Simple Bag Model
We use a bag model to study flavoured mesonic and baryonic
states, where one heavy quark is associated with
light quarks or antiquarks, and search for possible stable multiquarks. No
bound state is found. However, some states lie not too high above their
dissociation threshold, suggesting the possibility of resonances, or perhaps
bound states in improved models. Comment: REVTEX, VERSION 3.
Refractive Index of Humid Air in the Infrared: Model Fits
The theory of summation of electromagnetic line transitions is used to
tabulate the Taylor expansion of the refractive index of humid air over the
basic independent parameters (temperature, pressure, humidity, wavelength) in
five separate infrared regions from the H to the Q band at a fixed percentage
of carbon dioxide. These are least-squares fits to raw, highly resolved
spectra for a set of temperatures from 10 to 25 °C, a set of pressures from
500 to 1023 hPa, and a set of relative humidities from 5 to 60%. These
choices reflect the prospective application of characterizing ambient air at
the mountain altitudes of astronomical telescopes. Comment: Corrected exponents of c0ref, c1ref and c1p in Table
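The tabulation described above can be illustrated with a minimal sketch: a first-order Taylor expansion of the refractivity n − 1 about a reference state in temperature, pressure, and relative humidity. All coefficient values and the reference point below are invented for illustration only; the actual fitted coefficients (c0ref, c1ref, c1p, etc.) are those given in the paper's tables.

```python
# Hypothetical illustration of a Taylor-expansion fit of the refractivity
# n - 1 of humid air. The reference state and all coefficients below are
# made-up placeholder values, NOT the paper's fitted numbers.

T0, P0, H0 = 290.0, 750.0, 10.0   # reference: temperature (K), pressure (hPa), rel. humidity (%)

c_ref = 2.0e-4    # placeholder refractivity at the reference state
c_T   = -7.0e-7   # placeholder first-order temperature coefficient, per K
c_P   = 2.7e-7    # placeholder first-order pressure coefficient, per hPa
c_H   = 1.0e-8    # placeholder first-order humidity coefficient, per % RH

def refractivity(T_K, p_hPa, h_pct):
    """First-order Taylor expansion of n - 1 about (T0, P0, H0)."""
    return (c_ref
            + c_T * (T_K - T0)
            + c_P * (p_hPa - P0)
            + c_H * (h_pct - H0))

# refractive index at 288 K, 700 hPa, 20% relative humidity
n = 1.0 + refractivity(288.0, 700.0, 20.0)
```

In the paper's scheme, a separate set of such coefficients is fitted per infrared band and per expansion order; evaluating the polynomial then replaces a lookup in the raw, highly resolved spectra.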
Enabling quantitative data analysis through e-infrastructures
This paper discusses how quantitative data analysis in the social sciences can engage with and exploit an e-Infrastructure. We highlight how a number of activities which are central to quantitative data analysis, referred to as "data management", can benefit from e-infrastructure support. We conclude by discussing how these issues are relevant to the DAMES (Data Management through e-Social Science) research Node, an ongoing project that aims to develop e-Infrastructural resources for quantitative data analysis in the social sciences.
Scalar mesons in the Nambu--Jona-Lasinio model with 't Hooft interaction
We calculate the mass spectra of the pseudoscalar and scalar meson nonets in
the Nambu--Jona-Lasinio model with the 't Hooft interaction. We obtain
satisfactory results for the pseudoscalar mesons. For the scalar mesons, the
't Hooft interaction somewhat increases the values of the masses; however, it
is not sufficient to explain the whole scalar mass spectrum. The situation
could be improved for the and mesons through mixing with the glueball state.
For the description of the masses of the and K* mesons, it is necessary to
invoke other models. The strong decay widths of the scalar mesons are also
described. Comment: LaTeX text, 8 pages
Big data from electronic health records for early and late translational cardiovascular research: challenges and potential.
AIMS: Cohorts of millions of people's health records, whole genome sequencing, imaging, sensor, societal and publicly available data present a rapidly expanding digital trace of health. We aimed to critically review, for the first time, the challenges and potential of big data across early and late stages of translational cardiovascular disease research. METHODS AND RESULTS: We sought exemplars based on literature reviews and expertise across the BigData@Heart Consortium. We identified formidable challenges including: data quality, knowing what data exist, the legal and ethical framework for their use, data sharing, building and maintaining public trust, developing standards for defining disease, developing tools for scalable, replicable science and equipping the clinical and scientific work force with new inter-disciplinary skills. Opportunities claimed for big health record data include: richer profiles of health and disease from birth to death and from the molecular to the societal scale; accelerated understanding of disease causation and progression, discovery of new mechanisms and treatment-relevant disease sub-phenotypes, understanding health and diseases in whole populations and whole health systems and returning actionable feedback loops to improve (and potentially disrupt) existing models of research and care, with greater efficiency. In early translational research we identified exemplars including: discovery of fundamental biological processes e.g. linking exome sequences to lifelong electronic health records (EHR) (e.g. human knockout experiments); drug development: genomic approaches to drug target validation; precision medicine: e.g. DNA integrated into hospital EHR for pre-emptive pharmacogenomics. 
In late translational research we identified exemplars including: learning health systems with outcome trials integrated into clinical care; citizen-driven health, with 24/7 multi-parameter patient monitoring to improve outcomes; and population-based linkages of multiple EHR sources for higher-resolution clinical epidemiology and public health. CONCLUSION: High volumes of inherently diverse ('big') EHR data are beginning to disrupt the nature of cardiovascular research and care. Such big data have the potential to improve our understanding of disease causation and classification relevant for early translation, and to contribute actionable analytics to improve health and healthcare.
New Horizons in the use of routine data for ageing research
The past three decades have seen a steady increase in the availability of routinely collected health and social care data and the processing power to analyse it. These developments represent a major opportunity for ageing research, especially with the integration of different datasets across traditional boundaries of health and social care, for prognostic research and novel evaluations of interventions with representative populations of older people. However, there are considerable challenges in using routine data at the levels of coding, data analysis, and the application of findings to everyday care. New Horizons in applying routine data to investigate novel questions in ageing research require a collaborative approach between clinicians, data scientists, biostatisticians, epidemiologists and trial methodologists. This requires building capacity for the next generation of research leaders in this important area. There is a need to develop consensus code lists and standardised, validated algorithms for common conditions and outcomes that are relevant for older people, to maximise the potential of routine data research in this group. Lastly, we must help drive the application of routine data to improve the care of older people, through the development of novel methods for evaluation of interventions using routine data infrastructure. We believe that harnessing routine data can help address knowledge gaps for older people living with multiple conditions and frailty, and help design interventions and pathways of care to address the complex health issues we face in caring for older people.