250 research outputs found
Parameterizing by the Number of Numbers
The usefulness of parameterized algorithmics has often depended on what
Niedermeier has called "the art of problem parameterization". In this paper we
introduce and explore a novel but general form of parameterization: the number
of numbers. Several classic numerical problems, such as Subset Sum, Partition,
3-Partition, Numerical 3-Dimensional Matching, and Numerical Matching with
Target Sums, have multisets of integers as input. We initiate the study of
parameterizing these problems by the number of distinct integers in the input.
We rely on an FPT result for Integer Linear Programming Feasibility (ILPF) to show that all the above-mentioned problems
are fixed-parameter tractable when parameterized in this way. In various
applied settings, problem inputs often consist in part of multisets of integers
or multisets of weighted objects (such as edges in a graph, or jobs to be
scheduled). Such number-of-numbers parameterized problems often reduce to
subproblems about transition systems of various kinds, parameterized by the
size of the system description. We consider several core problems of this kind
relevant to number-of-numbers parameterization. Our main hardness result
considers the problem: given a non-deterministic Mealy machine M (a finite
state automaton outputting a letter on each transition), an input word x, and a
census requirement c for the output word specifying how many times each letter
of the output alphabet should be written, decide whether there exists a
computation of M reading x that outputs a word y that meets the requirement c.
We show that this problem is hard for W[1]. If the question is whether there
exists an input word x such that a computation of M on x outputs a word that
meets c, the problem becomes fixed-parameter tractable.
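The W[1]-hard census problem is concrete enough to sketch by brute force. The following is our own illustrative search (not an algorithm from the paper; the function name and machine encoding are invented for exposition): it explores configurations of (state, remaining census) while reading x, and the configuration space grows with the census bounds, in line with the hardness result.

```python
def census_run_exists(transitions, start, word, census):
    """Does some computation of the non-deterministic Mealy machine on
    `word` produce an output word meeting `census` exactly?

    transitions: dict (state, input_letter) -> list of (next_state, output_letter)
    census: dict output_letter -> required count; letters absent from the
            census are treated as forbidden (required count 0).
    """
    letters = sorted(census)
    idx = {a: i for i, a in enumerate(letters)}
    # A configuration is (state, tuple of remaining required counts).
    frontier = {(start, tuple(census[a] for a in letters))}
    for ch in word:
        nxt = set()
        for state, remaining in frontier:
            for succ, out in transitions.get((state, ch), []):
                if out not in idx or remaining[idx[out]] == 0:
                    continue  # would exceed the census requirement
                i = idx[out]
                nxt.add((succ, remaining[:i] + (remaining[i] - 1,) + remaining[i + 1:]))
        frontier = nxt
    zero = (0,) * len(letters)
    return any(rem == zero for _, rem in frontier)
```

A run exists iff some configuration ends with every required count exhausted; dually, the FPT variant of the abstract quantifies over input words x rather than fixing one.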
Minimal Synthesis of String To String Functions From Examples
We study the problem of synthesizing string to string transformations from a
set of input/output examples. The transformations we consider are expressed
using deterministic finite automata (DFA) that read pairs of letters, one
letter from the input and one from the output. The DFAs corresponding to these
transformations have additional constraints, ensuring that each input string is
mapped to exactly one output string.
We suggest that, given a set of input/output examples, the smallest DFA
consistent with the examples is a good candidate for the transformation the
user was expecting. We therefore study the problem of, given a set of examples,
finding a minimal DFA consistent with the examples and satisfying the
functionality and totality constraints mentioned above.
We prove that, in general, this problem (the corresponding decision problem)
is NP-complete. This is unlike the standard DFA minimization problem, which can
be solved in polynomial time. We provide several NP-hardness proofs that show
the hardness of multiple (independent) variants of the problem.
Finally, we propose an algorithm for finding the minimal DFA consistent with
input/output examples, that uses a reduction to SMT solvers. We implemented the
algorithm, and used it to evaluate the likelihood that the minimal DFA indeed
corresponds to the DFA expected by the user. Comment: SYNT 201
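To make the consistency requirement concrete, here is a small checker (our own sketch; the names and the equal-length convention are our assumptions, and a real encoding must also enforce the functionality and totality constraints): a candidate DFA over letter pairs is consistent with the examples iff it accepts every paired input/output word.

```python
def accepts_examples(delta, start, accepting, examples):
    """Check that a candidate DFA over letter pairs is consistent with
    a set of input/output examples.

    delta: dict mapping (state, (in_letter, out_letter)) -> state
    examples: list of (input_word, output_word) pairs; this sketch
    assumes equal lengths (real encodings pad shorter words).
    """
    for x, y in examples:
        if len(x) != len(y):
            return False
        state = start
        for pair in zip(x, y):
            if (state, pair) not in delta:
                return False  # missing transition: example rejected
            state = delta[(state, pair)]
        if state not in accepting:
            return False
    return True
```

A naive synthesis loop would enumerate DFAs in order of size and return the first one passing this check together with the functionality and totality constraints; the paper instead delegates that search to an SMT solver.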
Interaction and observation, categorically
This paper proposes to use dialgebras to specify the semantics of interactive
systems in a natural way. Dialgebras are a conservative extension of
coalgebras. In this categorical model, from the point of view we provide,
the notions of observation and interaction are separate features. This is
useful, for example, in the specification of process equivalences, which are
obtained as kernels of the homomorphisms of dialgebras. As an example we
present the asynchronous semantics of CCS. Comment: In Proceedings ICE 2011, arXiv:1108.014
Application of hydrodynamic modelling to justify measures to combat flooding using radial drainage
The development of flooding can substantially complicate the operation of engineering structures. This demands effective measures to prevent the negative consequences of rising groundwater levels. For especially critical structures, the requirements on the reliability of forecasts of changes in hydrogeological conditions under the operation of protective measures increase. The article considers the forecasting of changes in hydrogeological conditions at a high-level radioactive waste storage site under the influence of radial drainage.
Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy
Background
A reliable system for grading operative difficulty of laparoscopic cholecystectomy would standardise description of findings and reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets.
Methods
Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall's tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis.
Results
A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was found to be most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was found to be a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001).
Conclusion
We have shown that an operative difficulty scale can standardise the description of operative findings by surgeons of multiple grades, facilitating audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
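The AUROC values reported above have a simple rank interpretation: the probability that a randomly chosen positive case outscores a randomly chosen negative one. A minimal sketch (ours, with illustrative values only, not the study's analysis):

```python
def auroc(scores_pos, scores_neg):
    """Mann-Whitney formulation of AUROC: the fraction of
    (positive, negative) pairs in which the positive has the higher
    score, counting ties as half. In the study's setting the "score"
    would be the difficulty grade and the two groups a dichotomised
    outcome such as conversion to open surgery."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

An AUROC of 0.5 indicates no discrimination and 1.0 perfect separation, which is why values such as 0.903 indicate strong association between grade and outcome.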
A three-group study, internet-based, face-to-face based and standard management after acute whiplash associated disorders (WAD) – choosing the most efficient and cost-effective treatment: study protocol of a randomized controlled trial
Background
The management of Whiplash Associated Disorders is one of the most complicated challenges, with high expenses for the health care system and society. There are still no general guidelines or scientific documentation that unequivocally support any single treatment for acute care following whiplash injury.
The main purpose of this study is to trial a new behavioural medicine intervention strategy in the acute phase, aimed at reducing the number of patients who have persistent problems after the whiplash injury. A further goal is to identify which of three different interventions is most cost-effective for patients with Whiplash Associated Disorders. In this study we control for two factors. First, the effect of the behavioural medicine approach is compared with standard care. Second, the manner in which the behavioural medicine treatment is administered, via the Internet or face-to-face, is evaluated for its effectiveness and cost-effectiveness.
Methods/Design
The study is a randomized, prospective, experimental three-group study with analyses of cost-effectiveness up to a two-year follow-up. An Internet-based programme and a face-to-face group treatment programme are compared to standard treatment only. Patient follow-ups take place at three, six, twelve and 24 months, so that short-term as well as long-term effects are evaluated. Patients will be enrolled via the emergency ward during the first week after the accident.
Discussion
This new self-help management will concentrate on those psychosocial factors shown to predict long-term problems in Whiplash Associated Disorders, i.e. the importance of self-efficacy, fear of movement, and the significance of catastrophizing as a coping strategy for restoring and sustaining activities of daily life.
Within the framework of this project, we will develop, broaden and evaluate current physical therapy treatment methods for acute Whiplash Associated Disorders. The project will contribute to the creation of a cost-effective behavioural medicine approach to the management of acute Whiplash Associated Disorders. The results of this study will answer an important question: to what extent and how should these patients be treated at the acute stage, and how much does the best management cost?
Trial registration number
Current Controlled Trials ISRCTN61531337
Emergence of responsible sanctions without second order free riders, antisocial punishment or spite
While empirical evidence highlights the importance of punishment for cooperation in collective action, it remains disputed how responsible sanctions, targeted predominantly at uncooperative subjects, can evolve. Punishment is costly; in order to spread, it typically requires local interactions, voluntary participation, or rewards. Moreover, theory and experiments indicate that some subjects abuse sanctioning opportunities by engaging in antisocial punishment (which harms cooperators), spiteful acts (harming everyone) or revenge (as a response to being punished). These arguments have led to the conclusion that punishment is maladaptive. Here, we use evolutionary game theory to show that this conclusion is premature: if interactions are non-anonymous, cooperation and punishment evolve even if initially rare, and sanctions are directed towards non-cooperators only. Thus, our willingness to punish free riders is ultimately a selfish decision rather than an altruistic act; punishment serves as a warning, showing that one is not willing to accept unfair treatment.
Why sustainable, inclusive, and resilient investment makes for efficacious post-COVID medicine
Abstract: The global economy is facing an unprecedented challenge, with the risk of a protracted depression following the response to COVID-19. In 2014, I argued here that macroeconomic conditions made it a relatively favorable time to kick-start investments in a resource-efficient, low-carbon economy. Yet the opportunity was, for the most part, squandered. Failure to utilize active fiscal policy contributed to growing private indebtedness, limited productivity and wage growth, and widened inequality, helping erode trust in institutions. All the while, greenhouse gas emissions continued to rise. This time, there are grounds for optimism that a more coordinated response toward generating an ambitious transition to net zero emissions might contribute to a strong, sustainable, and resilient recovery. This article is categorized under: Climate Economics > Economics of Mitigation
Collaborative International Research in Clinical and Longitudinal Experience Study in NMOSD
OBJECTIVE: To develop a resource of systematically collected, longitudinal clinical data and biospecimens to assist in the investigation of neuromyelitis optica spectrum disorder (NMOSD) epidemiology, pathogenesis, and treatment. METHODS: To illustrate its research-enabling purpose, epidemiologic patterns and disease phenotypes were assessed among enrolled subjects, including age at disease onset, annualized relapse rate (ARR), and time between the first and second attacks. RESULTS: As of December 2017, the Collaborative International Research in Clinical and Longitudinal Experience Study (CIRCLES) had enrolled more than 1,000 participants, of whom 77.5% of the NMOSD cases and 71.7% of the controls continue in active follow-up. Consanguineous relatives of patients with NMOSD represented 43.6% of the control cohort. Of the 599 active cases with complete data, 84% were female, and 76% were anti-AQP4 seropositive. The majority were white/Caucasian (52.6%), whereas blacks/African Americans accounted for 23.5%, Hispanics/Latinos for 12.4%, and Asians for 9.0%. The median age at disease onset was 38.4 years, with a median ARR of 0.5. Seropositive cases were older at disease onset, more likely to be black/African American or Hispanic/Latino, and more likely to be female. CONCLUSION: Collectively, the CIRCLES experience to date demonstrates this study to be a useful and readily accessible resource to facilitate and accelerate solutions for patients with NMOSD.
- …