66 research outputs found
Lie Superalgebras and the Multiplet Structure of the Genetic Code II: Branching Schemes
Continuing our attempt to explain the degeneracy of the genetic code using
basic classical Lie superalgebras, we present the branching schemes for the
typical codon representations (typical 64-dimensional irreducible
representations) of basic classical Lie superalgebras and find three schemes
that do reproduce the degeneracies of the standard code, based on the
orthosymplectic algebra osp(5|2) and differing only in details of the symmetry
breaking pattern during the last step. Comment: 34 pages, 9 tables, LaTeX
Navigating the Future V: Marine Science for a Sustainable Future
Navigating the Future is a publication series produced by the European Marine Board providing future
perspectives on marine science and technology in Europe. Navigating the Future V (NFV) highlights new
knowledge obtained since Navigating the Future IV (2013). It is set within the framework of the 2015
Paris Agreement and builds on the scientific basis and recommendations of the IPCC reports. NFV gives
recommendations on the science required during the next decade to deliver the ocean we need to support
a sustainable future. This will be important for the United Nations Decade of Ocean Science for Sustainable
Development (2021–2030), the implementation of the UN Sustainable Development Goals and the
European Commission’s next framework programme, Horizon Europe (2021–2027). There is a growing need
to strengthen the links between marine science, society and policy since we cannot properly manage what
we do not know.
In recent years, the ocean and seas have received new prominence in international agendas. To secure a
safe planet, a priority is the management of the ocean as a “common good for humanity”, which requires
smarter observations to assess the state of the ocean and predictions about how it may change in the
future. The ocean is a three-dimensional space that needs to be managed over time (thus four-dimensional),
and there is a need for management and conservation practices that integrate the structure and function
of marine ecosystems into these four dimensions (Chapter 2). This includes understanding the dynamic
spatial and temporal interplay between ocean physics, chemistry and biology. Multiple stressors including
climate change, pollution and over-fishing affect the ocean and we need to better understand and predict
their interactions and identify tipping points to decide on management priorities (Chapter 3). This should
integrate our understanding of land-ocean-atmosphere processes and approaches to reducing impacts. An
improved science base is also needed to help predict and minimize the impact of extreme events such as
storm surges, heat waves, dynamic sea-floor processes and tsunamis (Chapter 4). New technologies, data
handling and modelling approaches will help us to observe, understand and manage our use of the four-dimensional
ocean and the effect of multiple stressors (Chapter 5).
Addressing these issues requires a strategic, collective and holistic approach and we need to build a
community of sustainability scientists that are able to provide evidence-based support to policy makers
within the context of major societal challenges (Chapter 6). We outline new frontiers, knowledge gaps and
recommendations needed to manage the ocean as a common good and to develop solutions for a sustainable
future (Chapter 7). The governance of sustainability should be at the core of the marine research agenda
through co-production and collaboration with stakeholders to identify priorities. There is a need for a fully
integrated scientific assessment of resilience strategies, associated trade-offs and underlying ethical concepts
for the ocean, which should be incorporated into decision support frameworks that involve stakeholders from
the outset. To enable the collection and processing of, and access to, all data, a key priority is the development of a
business model that ensures the long-term economic sustainability of ocean observations.
Comparison of adjuvant gemcitabine and capecitabine with gemcitabine monotherapy in patients with resected pancreatic cancer (ESPAC-4): a multicentre, open-label, randomised, phase 3 trial
BACKGROUND: The ESPAC-3 trial showed that adjuvant gemcitabine is the standard of care based on similar survival to, and less toxicity than, adjuvant 5-fluorouracil/folinic acid in patients with resected pancreatic cancer. Other clinical trials have shown better survival and tumour response with gemcitabine and capecitabine than with gemcitabine alone in advanced or metastatic pancreatic cancer. We aimed to determine the efficacy and safety of gemcitabine and capecitabine compared with gemcitabine monotherapy for resected pancreatic cancer. METHODS: We did a phase 3, two-group, open-label, multicentre, randomised clinical trial at 92 hospitals in England, Scotland, Wales, Germany, France, and Sweden. Eligible patients were aged 18 years or older and had undergone complete macroscopic resection for ductal adenocarcinoma of the pancreas (R0 or R1 resection). We randomly assigned patients (1:1) within 12 weeks of surgery to receive six cycles of either 1000 mg/m² gemcitabine alone, administered once a week for three of every 4 weeks (one cycle), or the same gemcitabine regimen with 1660 mg/m² oral capecitabine administered for 21 days followed by 7 days' rest (one cycle). Randomisation was based on a minimisation routine, with country used as a stratification factor. The primary endpoint was overall survival, measured as the time from randomisation until death from any cause, and assessed in the intention-to-treat population. Toxicity was analysed in all patients who received trial treatment. This trial was registered with EudraCT (number 2007-004299-38) and ISRCTN (number ISRCTN96397434). FINDINGS: Of 732 patients enrolled, 730 were included in the final analysis. Of these, 366 were randomly assigned to receive gemcitabine and 364 to gemcitabine plus capecitabine. The Independent Data and Safety Monitoring Committee requested reporting of the results after there were 458 (95%) of a target of 480 deaths.
The median overall survival for patients in the gemcitabine plus capecitabine group was 28·0 months (95% CI 23·5-31·5) compared with 25·5 months (22·7-27·9) in the gemcitabine group (hazard ratio 0·82 [95% CI 0·68-0·98], p=0·032). 608 grade 3-4 adverse events were reported by 226 of 359 patients in the gemcitabine plus capecitabine group compared with 481 grade 3-4 adverse events in 196 of 366 patients in the gemcitabine group. INTERPRETATION: The adjuvant combination of gemcitabine and capecitabine should be the new standard of care following resection for pancreatic ductal adenocarcinoma.
Confidence in uncertainty: Error cost and commitment in early speech hypotheses
© 2018 Loth et al. Interactions with artificial agents often lack immediacy because agents respond more slowly than their users expect. Automatic speech recognisers introduce this delay by analysing a user’s utterance only after it has been completed. Early, uncertain hypotheses from incremental speech recognisers can enable artificial agents to respond in a more timely manner. However, these hypotheses may change significantly with each update, so an already initiated action may turn into an error and incur an error cost. We investigated whether humans would use uncertain hypotheses for planning ahead and/or initiating their response. We designed a Ghost-in-the-Machine study in a bar scenario: a human participant controlled a bartending robot and perceived the scene only through its recognisers. The results showed that participants used uncertain hypotheses to select the best matching action, comparable to computing the utility of dialogue moves. Participants evaluated the available evidence and the error cost of their actions prior to initiating them. If the error cost was low, participants initiated their response on only suggestive evidence; otherwise, they waited for additional, more confident hypotheses if they still had time to do so. If there was time pressure but only little evidence, participants grounded their understanding with echo questions. These findings contribute to a psychologically plausible policy for human-robot interaction that enables artificial agents to respond in a timely and socially appropriate manner under uncertainty.
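The policy described in this abstract (act early when errors are cheap, wait for more confident hypotheses when errors are costly, and fall back to a grounding echo question under time pressure) can be sketched as a simple threshold rule. The function name, thresholds, and inputs below are illustrative assumptions, not the authors' implementation:

```python
# A minimal sketch of an error-cost-sensitive response policy, assuming
# hypothetical confidence/cost scales in [0, 1] and a time budget in seconds.

def choose_action(confidence: float, error_cost: float, time_left: float) -> str:
    """Decide how to react to an uncertain incremental speech hypothesis.

    confidence: recogniser confidence in the current hypothesis (0..1)
    error_cost: cost of acting on a wrong hypothesis (0..1)
    time_left:  seconds before a response is socially overdue
    """
    LOW_COST = 0.3    # below this, errors are cheap to repair (assumed value)
    CONFIDENT = 0.8   # hypotheses above this are treated as reliable (assumed)
    DEADLINE = 1.0    # seconds; under this we must react somehow (assumed)

    if error_cost < LOW_COST and confidence > 0.3:
        return "act"               # cheap errors: act on merely suggestive evidence
    if confidence >= CONFIDENT:
        return "act"               # strong evidence: act even if errors are costly
    if time_left > DEADLINE:
        return "wait"              # costly errors, time to spare: await an update
    return "ask_echo_question"     # time pressure, weak evidence: ground understanding

print(choose_action(confidence=0.4, error_cost=0.1, time_left=2.0))  # act
print(choose_action(confidence=0.5, error_cost=0.9, time_left=3.0))  # wait
print(choose_action(confidence=0.4, error_cost=0.9, time_left=0.5))  # ask_echo_question
```

The point of the sketch is the ordering of the checks: error cost is consulted before confidence, mirroring the finding that participants weighed the cost of a wrong action before deciding how much evidence they needed.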
Creative destruction in science
Drawing on the concept of a gale of creative destruction in a capitalistic economy, we argue that initiatives to assess the robustness of findings in the organizational literature should aim to simultaneously test competing ideas operating in the same theoretical space. In other words, replication efforts should seek not just to support or question the original findings, but also to replace them with revised, stronger theories with greater explanatory power. Achieving this will typically require adding new measures, conditions, and subject populations to research designs, in order to carry out conceptual tests of multiple theories in addition to directly replicating the original findings. To illustrate the value of the creative destruction approach for theory pruning in organizational scholarship, we describe recent replication initiatives re-examining culture and work morality, working parents’ reasoning about day care options, and gender discrimination in hiring decisions.
Significance statement
It is becoming increasingly clear that many, if not most, published research findings across scientific fields are not readily replicable when the same method is repeated. Although extremely valuable, failed replications risk leaving a theoretical void: they reduce confidence that the original theoretical prediction is true, but do not replace it with positive evidence in favor of an alternative theory. We introduce the creative destruction approach to replication, which combines theory pruning methods from the field of management with emerging best practices from the open science movement, with the aim of making replications as generative as possible. In effect, we advocate for a Replication 2.0 movement in which the goal shifts from checking on the reliability of past findings to actively engaging in competitive theory testing and theory building.
Scientific transparency statement
The materials, code, and data for this article are posted publicly on the Open Science Framework, with links provided in the article.
Examining the generalizability of research findings from archival data
This initiative systematically examined the extent to which a large set of archival research findings generalizes across contexts. We repeated the key analyses for 29 original strategic management effects in the same context (direct reproduction) as well as in 52 novel time periods and geographies; 45% of the direct reproductions returned results matching the original reports, as did 55% of tests in different spans of years and 40% of tests in novel geographies. Some original findings were associated with multiple new tests. Reproducibility was the best predictor of generalizability: for the findings that proved directly reproducible, 84% emerged in other available time periods and 57% emerged in other geographies. Overall, only limited empirical evidence emerged for context sensitivity. In a forecasting survey, independent scientists were able to anticipate which effects would find support in tests in new samples.
Evaluation of the pain level from speech: introducing a novel pain database and benchmarks
A paper in Speech Communication: 13. ITG-Fachtagung Sprachkommunikation 201