1,870 research outputs found
Problems of State Regulation of Innovation Activity at the Regional Level
The aim of the article is to develop a model of the state innovation policy for the regional development of Ukraine at the present stage
5-Approximation for ℋ-Treewidth Essentially as Fast as ℋ-Deletion Parameterized by Solution Size
The notion of ℋ-treewidth, where ℋ is a hereditary graph class, was recently introduced as a generalization of the treewidth of an undirected graph. Roughly speaking, a graph of ℋ-treewidth at most k can be decomposed into (arbitrarily large) ℋ-subgraphs which interact only through vertex sets of size O(k) which can be organized in a tree-like fashion. ℋ-treewidth can be used as a hybrid parameterization to develop fixed-parameter tractable algorithms for ℋ-deletion problems, which ask to find a minimum vertex set whose removal from a given graph G turns it into a member of ℋ. The bottleneck in the current parameterized algorithms lies in the computation of suitable tree ℋ-decompositions.
We present FPT approximation algorithms to compute tree ℋ-decompositions for hereditary and union-closed graph classes ℋ. Given a graph of ℋ-treewidth k, we can compute a 5-approximate tree ℋ-decomposition in time f(O(k)) · n^{O(1)} whenever ℋ-deletion parameterized by solution size can be solved in time f(k) · n^{O(1)} for some function f(k) ≥ 2^k. The current-best algorithms either achieve an approximation factor of k^{O(1)} or construct optimal decompositions while suffering from non-uniformity with unknown parameter dependence. Using these decompositions, we obtain algorithms solving Odd Cycle Transversal in time 2^{O(k)} · n^{O(1)} parameterized by bipartite-treewidth and Vertex Planarization in time 2^{O(k log k)} · n^{O(1)} parameterized by planar-treewidth, showing that these can be as fast as the solution-size parameterizations and giving the first ETH-tight algorithms for parameterizations by hybrid width measures.
Comment: Conference version to appear at the European Symposium on Algorithms (ESA 2023)
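The "hybrid" in hybrid parameterization can be made concrete: ℋ-treewidth is never larger than the treewidth or the size of a minimum ℋ-deletion set, so an FPT algorithm in this parameter dominates both classical parameterizations. A brief gloss of the two witnessing decompositions, assuming ℋ is hereditary and contains all edgeless graphs, and ignoring off-by-one details in bag-size conventions:

```latex
% Writing tw for treewidth and X for a minimum H-deletion set of G:
%
% - a single bag holding X, with every component of G - X attached as an
%   (arbitrarily large) base component from H, witnesses tw_H(G) <= |X|;
% - an ordinary tree decomposition, read as a tree H-decomposition with
%   empty base components, witnesses tw_H(G) <= tw(G).
\[
  \operatorname{tw}_{\mathcal{H}}(G) \;\le\; \min\bigl\{\operatorname{tw}(G),\; |X|\bigr\}
\]
```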
Single-Exponential FPT Algorithms for Enumerating Secluded ℱ-Free Subgraphs and Deleting to Scattered Graph Classes
The celebrated notion of important separators bounds the number of small (S, T)-separators in a graph which are 'farthest from S' in a technical sense. In this paper, we introduce a generalization of this powerful algorithmic primitive that is phrased in terms of k-secluded vertex sets: sets with an open neighborhood of size at most k. In this terminology, the bound on important separators says that there are at most 4^k maximal k-secluded connected vertex sets C containing S but disjoint from T. We generalize this statement significantly: even when we demand that G[C] avoids a finite set ℱ of forbidden induced subgraphs, the number of such maximal subgraphs is 2^{O(k)} and they can be enumerated efficiently. This allows us to make significant improvements for two problems from the literature.
Our first application concerns the 'Connected k-Secluded ℱ-free subgraph' problem, where ℱ is a finite set of forbidden induced subgraphs. Given a graph in which each vertex has a positive integer weight, the problem asks to find a maximum-weight connected k-secluded vertex set C ⊆ V(G) such that G[C] does not contain an induced subgraph isomorphic to any F ∈ ℱ. The parameterization by k is known to be solvable in triple-exponential time via the technique of recursive understanding, which we improve to single-exponential.
Our second application concerns the deletion problem to scattered graph classes. Here, the task is to find a vertex set of size at most k whose removal yields a graph whose each connected component belongs to one of the prescribed graph classes Π_1, ..., Π_d. We obtain a single-exponential algorithm whenever each class Π_i is characterized by a finite number of forbidden induced subgraphs. This generalizes and improves upon earlier results in the literature.
Comment: To appear at ISAAC'2
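For contrast with the single-exponential bounds above, the textbook bounded search tree for plain ℱ-deletion (without connectivity, secludedness, or scattered classes) already runs in c^k · n^{O(1)} when ℱ is finite: every solution must hit each induced copy of a forbidden graph, so one can branch on the at most max_{F ∈ ℱ} |V(F)| vertices of any copy found. A minimal sketch of that baseline, assuming networkx for induced-subgraph detection (illustrative, not code from the paper):

```python
import networkx as nx
from networkx.algorithms import isomorphism

def find_forbidden_copy(G, forbidden):
    """Return the vertex set of some induced copy of a graph in `forbidden`, or None."""
    for F in forbidden:
        # GraphMatcher's subgraph isomorphisms are node-induced, matching F-freeness.
        gm = isomorphism.GraphMatcher(G, F)
        for mapping in gm.subgraph_isomorphisms_iter():
            return set(mapping)  # keys are vertices of G
    return None

def f_deletion(G, k, forbidden):
    """Find <= k vertices whose removal leaves no induced copy of any forbidden graph."""
    copy_vertices = find_forbidden_copy(G, forbidden)
    if copy_vertices is None:
        return set()          # already F-free
    if k == 0:
        return None           # budget exhausted but a copy remains
    for v in copy_vertices:   # every solution contains a vertex of this copy
        H = G.copy()
        H.remove_node(v)
        rest = f_deletion(H, k - 1, forbidden)
        if rest is not None:
            return {v} | rest
    return None

# Example: deleting to triangle-free graphs, i.e. F = {K3}.
G = nx.complete_graph(5)
print(f_deletion(G, 3, [nx.complete_graph(3)]))
```

Since the branching factor is bounded by the largest forbidden graph and the depth by k, the running time is single-exponential in k.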
Search-Space Reduction via Essential Vertices
We investigate preprocessing for vertex-subset problems on graphs. While the notion of kernelization, originating in parameterized complexity theory, is a formalization of provably effective preprocessing aimed at reducing the total instance size, our focus is on finding a non-empty vertex set that belongs to an optimal solution. This decreases the size of the remaining part of the solution which still has to be found, and therefore shrinks the search space of fixed-parameter tractable algorithms for parameterizations based on the solution size. We introduce the notion of a c-essential vertex as one that is contained in all c-approximate solutions. For several classic combinatorial problems such as Odd Cycle Transversal and Directed Feedback Vertex Set, we show that under mild conditions a polynomial-time preprocessing algorithm can find a subset of an optimal solution that contains all 2-essential vertices, by exploiting packing/covering duality. This leads to FPT algorithms to solve these problems where the exponential term in the running time depends only on the number of non-essential vertices in the solution
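As a sketch of how such preprocessing plugs into a solver: if the preprocessing returns a set E that is guaranteed to be part of some optimal solution, one may commit to E and recurse on the remainder with a smaller budget. A minimal sketch for a generic vertex-deletion problem on a networkx-style graph; `find_essential` and `fpt_solve` are hypothetical stand-ins, not names from the paper (the former is what the packing/covering-duality preprocessing provides, the latter any solution-size FPT solver):

```python
def solve_with_essential_vertices(G, k, find_essential, fpt_solve):
    E = find_essential(G)            # guaranteed: E is a subset of an optimal solution
    if len(E) > k:
        return None                  # even an optimal solution exceeds the budget
    H = G.copy()
    H.remove_nodes_from(E)           # commit to deleting E
    rest = fpt_solve(H, k - len(E))  # exponential running time only in k - |E|
    return None if rest is None else E | rest
```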
Towards a full-reference, information-theoretic quality assessment method for X-ray images
This work aims at defining an information-theoretic quality assessment technique for cardiovascular X-ray images, using a full-reference scheme (relying on averaging a sequence to obtain a noiseless reference). With the growth of advanced signal processing in medical imaging, such an approach will enable objective comparisons of the quality of processed images. One way to describe the quality of an image is to express it in terms of its information capacity. Shannon derived this capacity for coding over a noisy channel. However, for X-ray images the noise is signal-dependent and non-additive, so Shannon's theorem is not directly applicable. To overcome this complication, we exploit the fact that an invertible mapping of a signal does not change its information content. We show that it is possible to transform the images in such a way that Shannon's theorem can be applied. A general method for calculating such a transformation is used, given a known relation between the signal mean and the noise standard deviation. After making the noise signal-independent, it is possible to assess the information content of an image and to calculate an overall quality metric (e.g. information capacity) that includes the effects of sharpness, contrast and noise. We applied this method to phantom images under different acquisition conditions and computed the information capacity for those images. We aim to show that the results of this assessment are consistent with variations in noise, contrast and sharpness introduced by system settings and image processing
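The transformation step admits a compact numerical sketch. The classical variance-stabilization recipe, given the mean-to-standard-deviation relation sigma(m) that the text assumes known, is T(x) = ∫ dm / sigma(m): after applying T, the noise standard deviation is approximately constant, so Shannon's channel-capacity result becomes applicable. A sketch under that assumption (illustrative, not the paper's implementation):

```python
import numpy as np

def stabilizing_transform(x, sigma, x_min=0.0, n_steps=10_000):
    """Numerically evaluate T(x) = integral of 1/sigma(m) dm from x_min to x."""
    grid = np.linspace(x_min, float(np.max(x)), n_steps)
    integrand = 1.0 / sigma(grid)
    # Cumulative trapezoidal integration of 1/sigma over the grid.
    T = np.concatenate(
        ([0.0], np.cumsum(np.diff(grid) * 0.5 * (integrand[:-1] + integrand[1:])))
    )
    return np.interp(x, grid, T)

# Example: Poisson-like noise has sigma(m) = sqrt(m), so T reduces to an
# Anscombe-style square-root transform; the clamp avoids division by zero.
rng = np.random.default_rng(0)
img = rng.poisson(50.0, size=(64, 64)).astype(float)
stab = stabilizing_transform(img, sigma=lambda m: np.sqrt(np.maximum(m, 1e-6)))
print(round(float(stab.std()), 2))  # close to 1.0 once the noise is stabilized
```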
Contribution of Regional and Global Factors to the Interannual Variability of Hydrometeorological Conditions in the Black Sea Coastal Zone
Factor analysis was performed on time series of annual and five-year mean values of meteorological and hydrological parameters measured at coastal hydrometeorological stations. Quantitative estimates were obtained of the contribution of global and regional factors to the interannual and decadal variability of the hydrometeorological regime indices of the Ukrainian Black Sea coastal zone
What do we not know (yet) about recovery colleges?: A study protocol on their (cost-)effectiveness, mechanisms of action, fidelity and positioning
Background: Recovery Colleges (RCs) have spread across the globe as a new way of supporting people with mental vulnerabilities in their recovery journey. RCs focus on 'learning' rather than 'curing' and in that vein facilitate a transition from being a passive, dependent patient/client to an active, empowered student learning to live life, despite vulnerabilities. Peer support and co-creation are central in RCs, as peers learn from each other by sharing personal experiences with mental vulnerabilities in an accessible, inspiring and stimulating atmosphere. The implementation of RCs is highly encouraged internationally, and as a result RCs and related self-help initiatives increasingly emerge. However, high-quality research on RCs is scarce and there is a call for thorough investigation of the (cost-)effectiveness, mechanisms of action, cross-border fidelity and positioning of RCs. In response, this research project aims to fill these gaps.
Methods: This research project entails (1) a prospective quasi-experimental effectiveness study and economic evaluation, (2) a multifaceted qualitative study to elaborate on the mechanisms of action of RCs for those involved, (3) a study to develop a (Dutch) Fidelity Measure of Recovery Colleges, and (4) an organisational case study to describe the positioning of RCs in relation to other mental health care services and community-based initiatives. Following the ideals of co-creation and empowerment in RCs, we conduct this research project in co-creation with RC students from Enik Recovery College in Utrecht, the Netherlands.
Discussion: This research project will lead to one of the first longitudinal controlled quantitative evaluations of both the cost-effectiveness and effectiveness of RC attendance in a broad sense (beyond attending courses alone). Moreover, we will gather data on a micro level (i.e., impact on RC students), meso level (i.e., organisational fidelity) and macro level (i.e., positioning in the care and support domain), capturing all important perspectives when scrutinizing the impact of complex systems. Finally, we will demonstrate the validity and value of embracing experiential knowledge in science as a complementary source of information, leading to a more profound understanding of what is researched.
Grass-clover mixtures: benefits for arable and livestock farms and biodiversity.
Introduction: Grass-clover mixtures show many benefits for sustainable agriculture. In the Netherlands, organic arable and livestock farmers often work together in a so-called partner farm concept: the arable farms grow one-year grass-clover leys to widen their crop rotation and as fodder for a livestock farm in exchange for manure. The aim of this research was to investigate the effect of different grass-clover mixtures and monocultures in a one-year ley on both aboveground and belowground parameters in light of the benefits of the ley for livestock farms, arable farms and biodiversity