Initial GPS scintillation results from CASES receiver at South Pole, Antarctica
Peer Reviewed
http://deepblue.lib.umich.edu/bitstream/2027.42/94940/1/rds6025.pd
Turbulent Times in the Northern Polar Ionosphere?
A model is presented of the growth rate of turbulently generated irregularities in the electron concentration of northern polar cap plasma patches. The turbulence is generated by the short-term fluctuations in the electric field imposed on the polar cap ionosphere by electric field mapping from the magnetosphere. The model uses an ionospheric imaging algorithm to specify the state of the ionosphere throughout. The growth rates are used to estimate mean amplitudes for the irregularities, and these mean amplitudes are compared with observations of the scintillation indices S4 and σφ by calculating the linear correlation coefficients between them. The scintillation data are recorded by GPS L1 band receivers stationed at high northern latitudes. A total of 13 days are analysed, covering four separate magnetic storm periods. These results are compared with those from a similar model of the gradient drift instability (GDI) growth rate. Overall, the results show better correlation between the GDI process and the scintillation indices than for the turbulence process and the scintillation indices. Two storms, however, show approximately equally good correlations for both processes, indicating that there might be times when the turbulence process of irregularity formation on plasma patches may be the controlling one.
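Though not part of the abstract, the two quantities it correlates are straightforward to sketch: the S4 index is the normalised standard deviation of received signal intensity, and the comparison uses a linear (Pearson) correlation coefficient. A minimal Python illustration, with made-up signal values rather than real receiver data:

```python
import numpy as np

def s4_index(intensity):
    """Amplitude scintillation index S4: normalised standard deviation
    of received signal intensity over the sample window."""
    i = np.asarray(intensity, dtype=float)
    return np.sqrt((np.mean(i**2) - np.mean(i)**2) / np.mean(i)**2)

def pearson_r(x, y):
    """Linear correlation coefficient between two series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return np.sum(xm * ym) / np.sqrt(np.sum(xm**2) * np.sum(ym**2))

# Synthetic example: a quiet signal yields a small S4,
# a strongly fluctuating one yields a larger S4.
rng = np.random.default_rng(0)
quiet = 1.0 + 0.01 * rng.standard_normal(1000)
disturbed = 1.0 + 0.5 * np.abs(rng.standard_normal(1000))
```

A study like the one above would correlate modelled irregularity amplitudes against such indices day by day via `pearson_r`.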
Quasiparticle interfacial level alignment of highly hybridized frontier levels: H2O on TiO2(110)
Knowledge of the frontier levels' alignment prior to photo-irradiation is
necessary to achieve a complete quantitative description of H2O
photocatalysis on TiO2(110). Although H2O on rutile TiO2(110) has been
thoroughly studied both experimentally and theoretically, a quantitative value
for the energy of the highest H2O occupied levels is still lacking. For
experiment, this is due to the H2O levels being obscured by hybridization
with TiO2(110) levels in the difference spectra obtained via ultraviolet
photoemission spectroscopy (UPS). For theory, this is due to inherent
difficulties in properly describing many-body effects at the
H2O-TiO2(110) interface. Using the projected density of states (DOS) from
state-of-the-art quasiparticle (QP) calculations, we disentangle the adsorbate and
surface contributions to the complex UPS spectra of H2O on TiO2(110). We
perform this separation as a function of H2O coverage and dissociation on
stoichiometric and reduced surfaces. Due to hybridization with the TiO2(110)
surface, the H2O 3a1 and 1b1 levels are broadened into several peaks
between 5 and 1 eV below the TiO2(110) valence band maximum (VBM). These
peaks have both intermolecular and interfacial bonding and antibonding
character. We find the highest occupied levels of H2O adsorbed intact and
dissociated on stoichiometric TiO2(110) are 1.1 and 0.9 eV below the VBM. We
also find a similar energy of 1.1 eV for the highest occupied levels of H2O
when adsorbed dissociatively on a bridging O vacancy of the reduced surface. In
both cases, these energies are significantly higher (by 0.6 to 2.6 eV) than
those estimated from UPS difference spectra, which are inconclusive in this
energy region. Finally, we apply self-consistent QP calculations (scQP1) to obtain
the ionization potential of the H2O-TiO2(110) interface.
Comment: 12 pages, 12 figures, 1 table
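The projected-DOS separation this abstract describes can be sketched generically: each eigenvalue contributes a Gaussian peak weighted by the squared projection of its state onto the adsorbate (or surface) subspace, so the total DOS splits into adsorbate and surface parts. The levels and weights below are invented for illustration, not data from the paper:

```python
import numpy as np

def projected_dos(energies, weights, grid, sigma=0.1):
    """Gaussian-broadened density of states projected onto a subspace:
    each eigenvalue contributes a normalised Gaussian scaled by the
    squared overlap (weight) of its state with that subspace."""
    g = np.asarray(grid, float)[:, None]
    e = np.asarray(energies, float)[None, :]
    w = np.asarray(weights, float)[None, :]
    gauss = np.exp(-((g - e) ** 2) / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    return (w * gauss).sum(axis=1)

# Hypothetical levels (eV below the VBM) and adsorbate projection weights
levels = np.array([-5.0, -3.2, -1.1])
adsorbate_w = np.array([0.8, 0.3, 0.9])   # invented |<state|adsorbate>|^2
surface_w = 1.0 - adsorbate_w             # remainder assigned to the surface
grid = np.linspace(-6.0, 0.0, 601)
pdos_ads = projected_dos(levels, adsorbate_w, grid)
pdos_surf = projected_dos(levels, surface_w, grid)
```

Because adsorbate and surface weights sum to one per state, the two curves add back up to the total DOS, which is what lets hybridized adsorbate peaks be identified inside a congested spectrum.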
Costs and effects of a 'healthy living' approach to community development in two deprived communities: findings from a mixed methods study
Background: Inequalities in health have proved resistant to 'top down' approaches. It is increasingly recognised that health promotion initiatives are unlikely to succeed without strong local involvement at all stages of the process, and many programmes now use grass-roots approaches. A healthy living approach to community development (HLA) was developed as an innovative response to local concerns about a lack of appropriate services in two deprived communities in Pembrokeshire, West Wales. We sought to assess the feasibility, costs, benefits and working relationships of this HLA.
Methods: The HLA intervention operated through existing community forums and focused on the whole community and its relationship with the statutory and voluntary sectors. Local people were trained as community researchers and gathered views about local needs through resident interviews. Forums used the interview results to write action plans, which were disseminated to commissioning organisations. The process was supported throughout by the project. The evaluation used a multi-method before-and-after study design, including formative and summative evaluation of both process and outcomes; data were gathered through documentary evidence, diaries and reflective accounts, semi-structured interviews, focus groups and costing proformas. Main outcome measures were the processes and timelines of implementation of the HLA; self-reported impact on communities and participants; community-agency processes of liaison; and costs.
Results: Communities were able to produce and disseminate action plans based on locally identified needs. The process was slower than anticipated: few community changes had occurred, but expectations were high. Community participants gained skills and confidence. Cross-sector partnership working developed. The process had credibility within service provider organisations, but mechanisms for refocusing commissioning were patchy. Intervention costs averaged £58,304 per community per annum.
Conclusions: The intervention was feasible and inexpensive, with indications of potential impact at individual, community and policy planning levels. However, it is a long-term process which requires sustained investment and must be embedded in planning and service delivery processes.
Challenges and opportunities for web-shared publication of quality-assured life cycle data: the contributions of the Life Cycle Data Network
A framework for increasing the availability of life cycle inventory data based on the role of multinational companies
Purpose
The aim of the paper is to assess the role and effectiveness of a proposed novel strategy for Life Cycle Inventory (LCI) data collection in the food sector and associated supply chains. The study represents one of the first of its type and provides answers to some of the key questions regarding the data collection process developed, managed and implemented by a multinational food company across the supply chain.
Methods
An integrated LCI data collection process for confectionery products was developed and implemented by Nestlé, a multinational food company. Some of the key features include: (1) management and implementation by a multinational food company; (2) types of roles to manage, provide and facilitate data exchange; (3) procedures to identify key products, suppliers and customers; (4) an LCI questionnaire and cover letter; and (5) data quality management based on the pedigree matrix. Overall, the combined features in an integrated framework provide a new way of thinking about the collection of LCI data from the perspective of a multinational food company.
Results
The integrated LCI collection framework spanned five months and resulted in 87 new LCI datasets for confectionery products, built from raw material, primary resource use, emission and waste release data collected from suppliers across 19 countries. The data collected were found to be of medium-to-high quality compared with secondary data. For retailers and waste service companies, however, only partially completed questionnaires were returned. Some of the key challenges encountered during the collection and creation of data included: lack of experience, identifying key actors, communication and technical language, commercial compromise, confidentiality protection, and the complexity of multi-tiered supplier systems. A range of recommendations is proposed to reconcile these challenges, including: standardisation of environmental data from suppliers, concise and targeted LCI questionnaires, and visualising complexity through drawings.
Conclusions
The integrated LCI data collection process and strategy has demonstrated the potential of a multinational company to quickly engage and act as a strong enabler to unlock latent data for various aspects of the confectionery supply chain. Overall, it is recommended that the research findings serve as the foundation for a transition towards a standardised procedure which can practically guide other multinational companies to considerably increase the availability of LCI data.
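As a sketch of the pedigree-matrix idea mentioned under Methods: each dataset is scored 1 to 5 on five data quality indicators, each score maps to an uncertainty factor, and the factors combine into a single geometric standard deviation. The factor values below are illustrative placeholders, not the ones used in the study:

```python
import math

# Illustrative (not authoritative) uncertainty factors per pedigree score
# for the five indicators used in pedigree-matrix approaches: reliability,
# completeness, and temporal / geographical / technological correlation.
FACTORS = {
    "reliability":   {1: 1.00, 2: 1.05, 3: 1.10, 4: 1.20, 5: 1.50},
    "completeness":  {1: 1.00, 2: 1.02, 3: 1.05, 4: 1.10, 5: 1.20},
    "temporal":      {1: 1.00, 2: 1.03, 3: 1.10, 4: 1.20, 5: 1.50},
    "geographical":  {1: 1.00, 2: 1.01, 3: 1.02, 4: 1.05, 5: 1.10},
    "technological": {1: 1.00, 2: 1.05, 3: 1.20, 4: 1.50, 5: 2.00},
}

def geometric_sd(pedigree):
    """Combine per-indicator factors into one geometric standard
    deviation: sigma_g = exp(sqrt(sum_i ln(U_i)^2))."""
    s = sum(math.log(FACTORS[k][v]) ** 2 for k, v in pedigree.items())
    return math.exp(math.sqrt(s))

# Hypothetical supplier dataset scored against the matrix
supplier_dataset = {"reliability": 2, "completeness": 1,
                    "temporal": 1, "geographical": 2, "technological": 2}
sigma_g = geometric_sd(supplier_dataset)
```

A perfectly scored dataset (all 1s) yields sigma_g = 1 (no added uncertainty); poorer scores inflate it, which is how "medium-to-high quality compared with secondary data" can be made quantitative.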
Stratagems for Effective Function Evaluation in Computational Chemistry
In recent years, the potential benefits of high-throughput virtual screening to the drug discovery community have been recognized, bringing an increase in the number of tools developed for this purpose. These programs have to process large quantities of data, searching for an optimal solution in a vast combinatorial range. This is particularly the case for protein-ligand docking, since proteins are sophisticated structures with complicated interactions, for which either molecule might reshape itself. Even the very limited flexibility model considered here, using ligand conformation ensembles, requires six dimensions of exploration — three translations and three rotations — per rigid conformation. The functions for evaluating pose suitability can also be complex to calculate. Consequently, the programs being written for these biochemical simulations are extremely resource-intensive. This work introduces a pure computer science approach to the field, developing techniques to improve the effectiveness of such tools. Their architecture is generalized to an abstract pattern of nested layers for discussion, covering scoring functions, search …
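The six-dimensional pose space the passage describes (three translations plus three rotations per rigid ligand conformation) can be explored with even a naive random search. Everything below, including the toy scoring function and the binding-site coordinates, is invented for illustration and is not the evaluation scheme the work itself develops:

```python
import numpy as np

def rotation_matrix(alpha, beta, gamma):
    """Rotation from z-y-x Euler angles (radians)."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    rz = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])
    ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    rx = np.array([[1, 0, 0], [0, cg, -sg], [0, sg, cg]])
    return rz @ ry @ rx

def apply_pose(coords, pose):
    """A pose is the 6-vector (tx, ty, tz, alpha, beta, gamma):
    three translations and three rotations of a rigid conformation."""
    t, angles = np.asarray(pose[:3]), pose[3:]
    return coords @ rotation_matrix(*angles).T + t

def random_search(coords, score, n=1000, seed=0):
    """Naive random exploration of the 6-D pose space, keeping the
    lowest-scoring pose found (lower score = better fit here)."""
    rng = np.random.default_rng(seed)
    best_pose, best = None, np.inf
    for _ in range(n):
        pose = np.concatenate([rng.uniform(-5, 5, 3),
                               rng.uniform(0, 2 * np.pi, 3)])
        s = score(apply_pose(coords, pose))
        if s < best:
            best_pose, best = pose, s
    return best_pose, best

# Toy scoring function: distance of the ligand centroid from a
# hypothetical binding site (real docking scores are far costlier).
site = np.array([1.0, 2.0, 3.0])
ligand = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
pose, s = random_search(ligand, lambda c: np.linalg.norm(c.mean(axis=0) - site))
```

The point of the sketch is the cost structure: every candidate pose triggers a full evaluation of the scoring function, which is exactly why the effectiveness of function evaluation is worth engineering in its own right.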