Environmental chemical exposures and breast cancer
As a hormone-sensitive condition with no single identifiable cause, breast cancer is a major health problem. It is characterized by a wide range of contributing factors and exposures occurring in different combinations and strengths across a lifetime, which may be amplified during periods of enhanced developmental susceptibility and influenced by reproductive patterns and behaviours. The vast majority of cases are oestrogen-receptor positive and occur in women with no family history of the disease, suggesting that modifiable risk factors are involved. A substantial body of evidence now links oestrogen-receptor-positive breast cancer with environmental exposures. Synthetic chemicals capable of oestrogen mimicry are characteristic of industrial development and have been individually and extensively assessed as risk factors for oestrogen-sensitive cancers, yet existing breast cancer risk assessment tools do not take such factors into account. In the absence of consensus on causation, and in order to better understand the problem of escalating incidence globally, an expanded, integrated approach broadening the inquiry into individual susceptibility to breast cancer is proposed. Applying systems thinking to existing data on oestrogen-modulating environmental exposures and other oestrogenic factors characteristic of Westernisation, and to their interactions within the exposure landscape encompassing social, behavioural, environmental, hormonal and genetic factors, can assist in understanding cancer risks and in the pursuit of prevention strategies. A new conceptual framework, based on a broader understanding of the "system" that underlies the development of breast cancer over many years and incorporating the factors known to contribute to breast cancer risk, could provide a new platform from which governments and regulators can promulgate enhanced and more effective prevention strategies.
Data-efficient Neuroevolution with Kernel-Based Surrogate Models
Surrogate-assistance approaches have long been used in computationally
expensive domains to improve the data-efficiency of optimization algorithms.
Neuroevolution, however, has so far resisted the application of these
techniques because it requires the surrogate model to make fitness predictions
based on variable topologies, instead of a vector of parameters. Our main
insight is that we can sidestep this problem by using kernel-based surrogate
models, which require only the definition of a distance measure between
individuals. Our second insight is that the well-established Neuroevolution of
Augmenting Topologies (NEAT) algorithm provides a computationally efficient
distance measure between dissimilar networks in the form of "compatibility
distance", initially designed to maintain topological diversity. Combining
these two ideas, we introduce a surrogate-assisted neuroevolution algorithm
that combines NEAT and a surrogate model built using a compatibility distance
kernel. We demonstrate the data-efficiency of this new algorithm on the low
dimensional cart-pole swing-up problem, as well as the higher dimensional
half-cheetah running task. In both tasks the surrogate-assisted variant
achieves the same or better results with several times fewer function
evaluations than the original NEAT.
Comment: In GECCO 201
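The core idea above can be illustrated with a minimal sketch. The genome encoding (a dict from innovation number to connection weight) is a deliberately simplified stand-in for real NEAT genomes, and Nadaraya-Watson kernel regression is used in place of a full Gaussian-process surrogate; coefficient values are illustrative defaults, not the paper's settings.

```python
import numpy as np

def compatibility_distance(g1, g2, c1=1.0, c2=1.0, c3=0.4):
    """NEAT-style compatibility distance between two genomes.
    Here a genome is a dict mapping innovation number -> connection
    weight (a toy encoding; real NEAT genomes carry node genes too)."""
    keys1, keys2 = set(g1), set(g2)
    matching = keys1 & keys2
    # Non-matching genes beyond the other genome's highest innovation
    # number are "excess"; the remaining non-matching genes are "disjoint".
    cutoff = min(max(keys1), max(keys2))
    excess = sum(1 for k in keys1 ^ keys2 if k > cutoff)
    disjoint = len(keys1 ^ keys2) - excess
    # Mean weight difference of matching genes.
    W = (sum(abs(g1[k] - g2[k]) for k in matching) / len(matching)) if matching else 0.0
    N = max(len(keys1), len(keys2))
    return c1 * excess / N + c2 * disjoint / N + c3 * W

def rbf_kernel(d, length_scale=1.0):
    """Turn a distance into a similarity for kernel regression."""
    return np.exp(-(d / length_scale) ** 2)

def predict_fitness(candidate, archive):
    """Kernel-regression fitness estimate from already-evaluated genomes.
    archive: list of (genome, true_fitness) pairs."""
    weights = np.array([rbf_kernel(compatibility_distance(candidate, g))
                        for g, _ in archive])
    fitnesses = np.array([f for _, f in archive])
    return float(weights @ fitnesses / weights.sum())
```

Because the surrogate needs only a distance between individuals, it sidesteps the variable-topology problem: no fixed-length parameter vector is ever required.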
Establishing Social Work Practices in England: The Early Evidence
Social Work Practices (SWPs) were established in England in 2009 to deliver social work services to looked after children and care leavers. The introduction of independent social work-led organisations generated controversy focused on issues such as the privatisation of children's services and social workers' conditions of employment. This paper reports early findings from the evaluation of four of these pilots, drawing on interviews with children and young people, staff, and local authority and national stakeholders. The SWPs assumed a variety of organisational forms. The procurement process was demanding, with protracted negotiations over matters such as budgetary control and the provision of a round-the-clock service. Start-up was facilitated by an established relationship between the SWP provider and the local authority. Once operational, SWPs continued to rely on local authorities for various functions; in most cases, local authorities retained control of placement budgets. Levels of consultation and choice offered to children and young people regarding the move to an SWP varied considerably. Children's understanding of SWPs was generally low, except in the pilot where most children retained their original social worker. These early findings show some dilution of the original SWP model, while the pilots' diversity allows the benefits of particular models to emerge.
A hermeneutic inquiry into user-created personas in different Namibian locales
Personas are a tool broadly used in technology design to support communication between designers and users. Different persona types and methods have evolved mostly in the Global North and have only occasionally been deployed in the Global South, usually within the original User-Centred Design methodology. We postulate that persona conceptualizations differ across cultures. We demonstrate this with an exploratory case study on user-created personas co-designed with four Namibian ethnic groups: ovaHerero, Ovambo, ovaHimba and Khoisan. We follow a hermeneutic inquiry approach to discern cultural nuances from diverse human conduct. Findings reveal diverse self-representations, with results for each ethnic group emerging in unalike fashions, viewpoints, recounts and storylines. This paper ultimately argues for user-created personas as a potentially valid approach for pursuing cross-cultural depictions of personas that communicate cultural features and user experiences paramount to designing acceptable and gratifying technologies in dissimilar locales.
Methods for measuring the citations and productivity of scientists across time and discipline
Publication statistics are ubiquitous in the ratings of scientific
achievement, with citation counts and paper tallies factoring into an
individual's consideration for postdoctoral positions, junior faculty, tenure,
and even visa status for international scientists. Citation statistics are
designed to quantify individual career achievement, both at the level of a
single publication, and over an individual's entire career. While some academic
careers are defined by a few significant papers (possibly out of many), other
academic careers are defined by the cumulative contribution made by the
author's publications to the body of science. Several metrics have been
formulated to quantify an individual's publication career, yet none of these
metrics account for the dependence of citation counts and journal size on time.
In this paper, we normalize publication metrics across both time and discipline
in order to achieve a universal framework for analyzing and comparing
scientific achievement. We study the publication careers of individual authors
over the 50-year period 1958-2008 within six high-impact journals: CELL, the
New England Journal of Medicine (NEJM), Nature, the Proceedings of the National
Academy of Science (PNAS), Physical Review Letters (PRL), and Science. In
comparing the achievement of authors within each journal, we uncover
quantifiable statistical regularity in the probability density function (pdf)
of scientific achievement across both time and discipline. The universal
distribution of career success within these arenas for publication raises the
possibility that a fundamental driving force underlying scientific achievement
is the competitive nature of scientific advancement.
Comment: 25 pages in 1-column preprint format, 7 figures, 4 tables. Version II: changes made in response to referee comments. Note: change in definition of "paper shares".
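The time-and-discipline normalization described above can be sketched as follows. This is a minimal illustration of rescaling raw citation counts by the mean count of each (journal, year) group; the field names are hypothetical and the paper's actual normalization may differ in detail.

```python
from collections import defaultdict

def normalize_citations(records):
    """Rescale each paper's citation count by the mean count of its
    (journal, year) group, so that careers spanning different eras and
    disciplines can be compared on a common footing.
    records: list of dicts with keys 'journal', 'year', 'citations'
    (hypothetical field names)."""
    totals = defaultdict(lambda: [0, 0])  # (journal, year) -> [sum, count]
    for r in records:
        key = (r['journal'], r['year'])
        totals[key][0] += r['citations']
        totals[key][1] += 1
    means = {k: s / n for k, (s, n) in totals.items()}
    # A normalized value of 1.0 means "average for that journal-year".
    return [dict(r, normalized=r['citations'] / means[(r['journal'], r['year'])])
            for r in records]
```

Dividing by the journal-year mean removes the dependence of raw counts on publication date and journal size, which is the precondition for the universal achievement distributions the abstract reports.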
Application of the Principle of Maximum Conformality to Top-Pair Production
A major contribution to the uncertainty of finite-order perturbative QCD predictions is the perceived ambiguity in setting the renormalization scale $\mu_r$. For example, by using the conventional way of setting $\mu_r$, one obtains the total $t\bar{t}$ production cross-section $\sigma_{t\bar{t}}$ with the uncertainty $\Delta\sigma_{t\bar{t}}/\sigma_{t\bar{t}} \sim \left({}^{+3\%}_{-4\%}\right)$ at the Tevatron and LHC even at the present NNLO level. The Principle of Maximum Conformality (PMC) eliminates the renormalization scale ambiguity in precision tests of Abelian QED and non-Abelian QCD theories. In this paper we apply PMC scale-setting to predict the $t\bar{t}$ cross-section $\sigma_{t\bar{t}}$ at the Tevatron and LHC colliders. It is found that $\sigma_{t\bar{t}}$ remains almost unchanged when the initial renormalization scale is varied over a wide region around the top-quark mass. The convergence of the expansion series is greatly improved. For the $q\bar{q}$-channel, which is dominant at the Tevatron, the NLO PMC scale is much smaller than the top-quark mass at small subprocess energies, and thus its NLO cross-section is increased by about a factor of two. In the case of the $gg$-channel, which is dominant at the LHC, the NLO PMC scale slightly increases with the subprocess collision energy $\sqrt{s}$, but it remains smaller than the top-quark mass up to TeV-scale subprocess energies, and the resulting NLO cross-section is also increased. As a result, a larger $\sigma_{t\bar{t}}$ is obtained in comparison to the conventional scale-setting method, which agrees well with the present Tevatron and LHC data. More explicitly, by setting the top-quark mass to its measured value, we predict $\sigma_{t\bar{t}}$ at Tevatron and LHC energies. [Full abstract, including the numerical predictions, can be found in the paper.]
Comment: 15 pages, 11 figures, 5 tables. Fig. (9) is corrected.
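The scale-setting problem this abstract addresses can be written schematically (a generic textbook-style illustration, not the paper's own equations):

```latex
% A fixed-order pQCD prediction depends on the renormalization scale \mu_r:
\[
  \sigma_{t\bar t} = \sum_{n\ge 1} C_n(\mu_r)\, a_s^{\,n}(\mu_r),
  \qquad a_s \equiv \frac{\alpha_s}{\pi}.
\]
% Conventional practice guesses \mu_r \sim m_t and varies it, e.g.
% m_t/2 \le \mu_r \le 2 m_t, to estimate the resulting uncertainty.
% PMC instead absorbs the \beta-dependent (running-coupling) terms of
% each coefficient into the argument of a_s, fixing an effective scale
% Q_n^{*} order by order:
\[
  \sigma_{t\bar t} = \sum_{n\ge 1} \tilde C_n\, a_s^{\,n}(Q_n^{*}),
\]
% where the \tilde C_n are the scale-independent conformal coefficients,
% so the residual dependence on the initial choice of \mu_r largely cancels.
```

This is why, as the abstract states, the PMC prediction is nearly insensitive to the initial scale choice while the conventional prediction is not.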