Asymmetrical Interference Effects Between Two-Dimensional Geometric Shapes and Their Corresponding Shape Words
Nativists have postulated fundamental geometric knowledge that predates linguistic and symbolic thought. Central to these claims is the proposal of an isolated cognitive system dedicated to processing geometric information. Testing such hypotheses is challenging because it is difficult to disentangle geometric from non-geometric information conveyed through language. Using a modified matching interference paradigm, we present evidence that an incongruent shape word interferes with identifying a two-dimensional geometric shape, but an incongruent two-dimensional geometric shape does not interfere with identifying a shape word. This asymmetry in interference effects suggests that shape words activate spatial representations of shapes, but shapes do not activate linguistic representations of shape words. These results are consistent with hypotheses concerning a cognitive system dedicated to processing geometric information in isolation from linguistic processing, and with hypotheses concerning knowledge of geometric properties of space that predates linguistic and symbolic thought.
A randomized, phase II study of afatinib versus cetuximab in metastatic or recurrent squamous cell carcinoma of the head and neck
Background: Afatinib is an oral, irreversible ErbB family blocker that has shown activity in epidermal growth factor receptor (EGFR)-mutated lung cancer. We hypothesized that the agent would have greater antitumor activity than cetuximab in patients with recurrent or metastatic (R/M) head and neck squamous cell carcinoma (HNSCC) whose disease has progressed after platinum-containing therapy.
Patients and methods: An open-label, randomized, phase II trial was conducted in 43 centers; 124 patients were randomized (1:1) to either afatinib (50 mg/day) or cetuximab (250 mg/m²/week) until disease progression or intolerable adverse events (AEs) (stage I), with optional crossover (stage II). The primary end point was tumor shrinkage before crossover, assessed by investigator review (IR) and independent central review (ICR).
Results: A total of 121 patients were treated (61 afatinib, 60 cetuximab) and 68 crossed over to stage II (32 and 36, respectively). In stage I, mean tumor shrinkage by IR/ICR was 10.4%/16.6% with afatinib and 5.4%/10.1% with cetuximab (P = 0.46/0.30). The objective response rate was 16.1%/8.1% with afatinib and 6.5%/9.7% with cetuximab (IR/ICR). Comparable disease control rates were observed with afatinib (50%) and cetuximab (56.5%) by IR; similar results were seen by ICR. The most common grade ≥3 drug-related AEs (DRAEs) were rash/acne (18% versus 8.3%), diarrhea (14.8% versus 0%), and stomatitis/mucositis (11.5% versus 0%) with afatinib and cetuximab, respectively. DRAEs led to treatment discontinuation in 23% of patients on afatinib and 5% on cetuximab. In stage II, the disease control rate (IR/ICR) was 38.9%/33.3% with afatinib and 18.8%/18.8% with cetuximab.
Conclusion: Afatinib showed antitumor activity comparable to cetuximab in R/M HNSCC in this exploratory phase II trial, although more patients on afatinib discontinued treatment due to AEs. Sequential EGFR/ErbB treatment with afatinib and cetuximab provided sustained clinical benefit in patients after crossover, suggesting a lack of cross-resistance.
Your Proof Fails? Testing Helps to Find the Reason
Applying deductive verification to formally prove that a program respects its formal specification is a complex and time-consuming task, in particular because of the lack of feedback in case of proof failures. Besides a non-compliance between the code and its specification (due to an error in at least one of them), possible reasons for a proof failure include a missing or too-weak specification for a called function or a loop, and a timeout or the simple inability of the prover to finish a particular proof. This work proposes a new methodology in which test generation helps to identify the reason for a proof failure and to exhibit a counter-example clearly illustrating the issue. We describe how to transform an annotated C program into C code suitable for testing, and illustrate the benefits of the method on comprehensive examples. The method has been implemented in STADY, a plugin of the software analysis platform FRAMA-C. Initial experiments show that detecting non-compliances and contract weaknesses makes it possible to precisely diagnose most proof failures.
Comment: 11 pages, 10 figures
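The distinction the method draws — a genuine non-compliance, witnessed by a counter-example, versus a prover limitation — can be sketched in a few lines of Python (a toy analogue, not the STADY tool itself; the buggy function and its contract below are hypothetical):

```python
import random

def find_counterexample(func, precondition, postcondition, trials=10000, seed=0):
    """Search for an input that satisfies the precondition but violates the
    postcondition -- evidence of a non-compliance between code and
    specification, as opposed to a mere prover timeout."""
    rng = random.Random(seed)
    for _ in range(trials):
        x = rng.randint(-1000, 1000)
        if precondition(x) and not postcondition(x, func(x)):
            return x  # a counter-example clearly illustrating the issue
    return None  # none found: the proof failure may be a prover limitation

# Hypothetical example: an absolute-value function with a bug.
def buggy_abs(x):
    return x  # bug: forgets to negate negative inputs

# Contract: the result is non-negative and equal to x or -x.
counterexample = find_counterexample(
    buggy_abs,
    precondition=lambda x: True,
    postcondition=lambda x, r: r >= 0 and (r == x or r == -x),
)
```

Run on the buggy function, the search returns a negative input exposing the non-compliance; run on a correct implementation with the same contract, it returns `None`, hinting that a failed proof would be due to the prover rather than the code.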
Binary pattern tile set synthesis is NP-hard
In the field of algorithmic self-assembly, a long-standing unproven conjecture has been the NP-hardness of binary pattern tile set synthesis (2-PATS). The k-PATS problem is that of designing a tile assembly system with the smallest number of tile types that will self-assemble an input pattern of k colors. Of both theoretical and practical significance, k-PATS has been studied in a series of papers that proved NP-hardness for progressively smaller numbers of colors k. In this paper, we settle the fundamental conjecture that 2-PATS is NP-hard, concluding this line of study. While most of our proof relies on standard mathematical proof techniques, one crucial lemma makes use of a computer-assisted proof, a relatively novel but increasingly utilized paradigm for deriving proofs of complex mathematical problems. This tool is especially powerful for attacking combinatorial problems, as exemplified by the proof of the four color theorem by Appel and Haken (simplified later by Robertson, Sanders, Seymour, and Thomas) or the recent important advance on the Erdős discrepancy problem by Konev and Lisitsa using computer programs. We utilize a massively parallel algorithm and thus turn an otherwise intractable portion of our proof into a program requiring approximately a year of computation time, bringing the use of computer-assisted proofs to a new scale. We fully detail the algorithm employed by our code, and make the code freely available online.
Linguistic Structure Guided Context Modeling for Referring Image Segmentation
Referring image segmentation aims to predict the foreground mask of the
object referred by a natural language sentence. Multimodal context of the
sentence is crucial to distinguish the referent from the background. Existing
methods either insufficiently or redundantly model the multimodal context. To
tackle this problem, we propose a "gather-propagate-distribute" scheme to model
multimodal context by cross-modal interaction and implement this scheme as a
novel Linguistic Structure guided Context Modeling (LSCM) module. Our LSCM
module builds a Dependency Parsing Tree suppressed Word Graph (DPT-WG) which
guides all the words to include valid multimodal context of the sentence while
excluding disturbing ones through three steps over the multimodal feature,
i.e., gathering, constrained propagation and distributing. Extensive
experiments on four benchmarks demonstrate that our method outperforms all previous state-of-the-art methods.
Comment: Accepted by ECCV 2020. Code is available at https://github.com/spyflying/LSCM-Refse
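The propagation step can be caricatured in plain Python: features flow between word nodes along dependency edges, and a small edge weight suppresses a disturbing path. This is a toy sketch with made-up weights and scalar features; the actual LSCM module operates on learned multimodal feature vectors:

```python
def propagate(features, edges, steps=1):
    """Weighted message passing over a word graph.

    `features` maps word -> scalar feature; `edges` maps (u, v) -> weight.
    A small weight suppresses an edge -- the DPT-WG idea of keeping
    dependency-tree edges while down-weighting disturbing ones."""
    for _ in range(steps):
        updated = {}
        for node, value in features.items():
            total, norm = value, 1.0  # include the node's own feature
            for (u, v), w in edges.items():
                if u == node:
                    total += w * features[v]
                    norm += w
                elif v == node:
                    total += w * features[u]
                    norm += w
            updated[node] = total / norm  # normalized weighted average
        features = updated
    return features

# Hypothetical sentence "the red ball": both words attach to the head "ball",
# but the determiner edge is suppressed and the modifier edge is kept.
feats = {"the": 0.0, "red": 1.0, "ball": 0.5}
edges = {("the", "ball"): 0.1,   # determiner edge, suppressed
         ("red", "ball"): 1.0}   # modifier edge, kept
out = propagate(feats, edges)
```

After one step, the head "ball" absorbs mostly the modifier's feature while the suppressed determiner edge contributes little, mimicking how valid multimodal context is gathered while disturbing words are excluded.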
Easing Legal News Monitoring with Learning to Rank and BERT
While ranking approaches have made rapid advances in Web search, systems that cater to the complex information needs of professional search tasks are not widely developed; practitioners typically rely instead on dedicated search strategies backed by ad-hoc retrieval models. In this paper we present a legal search problem in which professionals monitor news articles with constant queries on a periodic basis. First, we demonstrate the effectiveness of traditional retrieval models against Boolean search over documents in chronological order. In an attempt to capture the complex information needs of users, a learning-to-rank approach is adopted with user-specified relevance criteria as features. This approach, however, only achieves mediocre results compared with the traditional models. We find that fine-tuning a contextualised language model (e.g. BERT) yields significantly improved retrieval performance, providing a flexible solution for satisfying complex information needs without explicit feature engineering.
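For context, the traditional retrieval models such systems are compared against commonly include BM25. A minimal sketch of the standard BM25 scoring formula, applied to invented toy legal-news documents and a made-up query, looks like this:

```python
import math

def bm25_score(query_terms, doc, corpus, k1=1.5, b=0.75):
    """Score one tokenized document against a query with the standard
    BM25 formula (smoothed IDF, saturating term frequency, length norm)."""
    n = len(corpus)
    avgdl = sum(len(d) for d in corpus) / n
    score = 0.0
    for term in query_terms:
        df = sum(1 for d in corpus if term in d)          # document frequency
        idf = math.log((n - df + 0.5) / (df + 0.5) + 1)   # smoothed IDF
        tf = doc.count(term)                               # term frequency
        score += idf * tf * (k1 + 1) / (tf + k1 * (1 - b + b * len(doc) / avgdl))
    return score

# Toy corpus of legal-news headlines (invented for illustration).
corpus = [
    "new merger regulation enters into force".split(),
    "court rules on data protection".split(),
    "weather forecast for the weekend".split(),
]
query = "merger regulation".split()
ranked = sorted(corpus, key=lambda d: bm25_score(query, d, corpus), reverse=True)
```

A pointwise learning-to-rank or fine-tuned BERT re-ranker would then replace this hand-defined score with a learned relevance function over the same query-document pairs.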
Effects of external nutrient sources and extreme weather events on the nutrient budget of a Southern European coastal lagoon
The seasonal and annual nitrogen (N), phosphorus (P), and carbon (C) budgets of the mesotidal Ria Formosa lagoon, southern Portugal, were estimated to reveal the main inputs and outputs, the seasonal patterns, and how they may influence the ecological functioning of the system. The effects of extreme weather events, such as long-lasting strong winds causing upwelling and heavy rainfall, were assessed. External nutrient inputs were quantified; ocean exchange was assessed in 24-h sampling campaigns, and final calculations were made using a hydrodynamic model of the lagoon. Rain and stream inputs were the main freshwater sources to the lagoon. However, wastewater treatment plant and groundwater discharges dominated nutrient input, together accounting for 98%, 96%, and 88% of total C, N, and P input, respectively. Organic matter and nutrients were continuously exported to the ocean. This pattern was reversed following extreme events, such as strong winds in early summer that caused upwelling, and after a period of heavy rainfall in late autumn. A principal component analysis (PCA) revealed that ammonium and organic N and C exchange were positively associated with temperature, as opposed to pH and nitrate. These variables mostly reflected the benthic metabolism of the lagoon, whereas particulate P exchange correlated with Chl a, indicating that it was more related to phytoplankton dynamics. An increase in stochastic events, as expected under climate change scenarios, may have strong effects on the ecological functioning of coastal lagoons, altering their C and nutrient budgets.
Funding: Portuguese Science and Technology Foundation (FCT) [POCI/MAR/58427/2004, PPCDT/MAR/58427/2004]
Complex-Distance Potential Theory and Hyperbolic Equations
An extension of potential theory in R^n is obtained by continuing the
Euclidean distance function holomorphically to C^n. The resulting Newtonian
potential is generated by an extended source distribution D(z) in C^n whose
restriction to R^n is the delta function. This provides a natural model for
extended particles in physics. In C^n, interpreted as complex spacetime, D(z)
acts as a propagator generating solutions of the wave equation from their
initial values. This gives a new connection between elliptic and hyperbolic
equations that does not assume analyticity of the Cauchy data. Generalized to
Clifford analysis, it induces a similar connection between solutions of
elliptic and hyperbolic Dirac equations. There is a natural application to the
time-dependent, inhomogeneous Dirac and Maxwell equations, and the
`electromagnetic wavelets' introduced previously are an example.
Comment: 25 pages, submitted to Proceedings of 5th Intern. Conf. on Clifford Algebras, Ixtapa, June 24 - July 4, 199
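The holomorphic continuation at the heart of this construction can be written out explicitly (a sketch based on the abstract's description; the coordinate notation is illustrative):

```latex
% Continue the Euclidean distance r(x) = \sqrt{x \cdot x} from R^n to
% z = x + iy in C^n by taking the same square root holomorphically:
r(z) = \sqrt{z \cdot z}
     = \sqrt{\,|x|^2 - |y|^2 + 2\,\mathrm{i}\,(x \cdot y)\,},
\qquad z = x + \mathrm{i}\,y \in \mathbb{C}^n .
```

Setting y = 0 recovers the ordinary Euclidean distance |x|, consistent with the statement that the restriction of the extended source distribution to R^n reproduces the classical (delta-generated) Newtonian potential.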
An Alternative Interpretation of Statistical Mechanics
In this paper I propose an interpretation of classical statistical mechanics that centers on taking seriously the idea that probability measures represent complete states of statistical mechanical systems. I show how this leads naturally to the idea that the stochasticity of statistical mechanics is associated directly with the observables of the theory rather than with the microstates (as traditional accounts would have it). The usual assumption that microstates are representationally significant in the theory is therefore dispensable, a consequence that suggests interesting possibilities for developing non-equilibrium statistical mechanics and for investigating inter-theoretic answers to the foundational questions of statistical mechanics.