Therapeutic Development in Neurofibromatosis
Although neurofibromatosis (NF) was initially recognized in the nineteenth century, only in the past two decades have we witnessed a paradigm shift in therapeutics. This progress is driven by a growing understanding of the natural history of NF-associated tumors and of the molecular landscape of these disorders. Multiple clinical trials evaluating non-surgical treatment modalities have been launched, and more studies are in the pipeline. Recently, the NF community has adopted standardized endpoints recommended by the Response Evaluation in Neurofibromatosis and Schwannomatosis (REiNS) International Collaboration, established in 2011. Such collaborations among academic, regulatory and supporting communities are crucial for providing the infrastructure needed to advance therapeutic development in the field of NF.
Planning When Goals Change: A Moving Target Search Approach
Devising intelligent robots or agents that interact with humans is a major challenge for artificial intelligence. In such contexts, agents must constantly adapt their decisions to human activities and modify their goals. In this paper, we tackle this problem by introducing a novel planning approach, called Moving Goal Planning (MGP), to adapt plans to goal evolutions. This planning algorithm draws inspiration from Moving Target Search (MTS) algorithms. In order to limit the number of search iterations and to improve its efficiency, MGP delays as much as possible the triggering of new searches when the goal changes over time. For this purpose, MGP uses two strategies: Open Check (OC), which checks whether the new goal is still in the current search tree, and Plan Follow (PF), which estimates whether executing the actions of the current plan brings MGP closer to the new goal. Moreover, MGP uses a parsimonious strategy to incrementally update the search tree at each new search, which reduces the number of calls to the heuristic function and speeds up the search. Finally, we show evaluation results that demonstrate the effectiveness of our approach.
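As an illustration only, the following Python sketch captures the two delaying strategies described above; the function names, the heuristic interface and the weighting constant c are assumptions for the sake of the example, not the authors' implementation.

```python
def open_check(search_tree_states, new_goal):
    """OC: if the new goal is already a node of the current search tree,
    the existing search effort can be reused instead of replanning."""
    return new_goal in search_tree_states

def plan_follow(current_state, next_plan_state, new_goal, h, c=1.0):
    """PF: keep executing the current plan as long as its next state is
    estimated (by heuristic h, weighted by constant c) to be closer to
    the new goal than the current state is."""
    return h(next_plan_state, new_goal) < c * h(current_state, new_goal)
```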
Additive Pattern Database Heuristics
We explore a method for computing admissible heuristic evaluation functions for search problems. It utilizes pattern databases, which are precomputed tables of the exact cost of solving various subproblems of an existing problem. Unlike standard pattern database heuristics, however, we partition our problems into disjoint subproblems, so that the costs of solving the different subproblems can be added together without overestimating the cost of solving the original problem. Previously, we showed how to statically partition the sliding-tile puzzles into disjoint groups of tiles to compute an admissible heuristic, using the same partition for each state and problem instance. Here we extend the method and show that it applies to other domains as well. We also present another method for additive heuristics, which we call dynamically partitioned pattern databases. Here we partition the problem into disjoint subproblems for each state of the search dynamically. We discuss the pros and cons of each of these methods and apply both methods to three different problem domains: the sliding-tile puzzles, the 4-peg Towers of Hanoi problem, and finding an optimal vertex cover of a graph. We find that in some problem domains static partitioning is most effective, while in others dynamic partitioning is a better choice. In each of these problem domains, either statically partitioned or dynamically partitioned pattern database heuristics are the best known heuristics for the problem.
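For concreteness, here is a minimal Python sketch of a statically partitioned additive heuristic for the sliding-tile puzzle; the group layout, lookup-table format and function names are illustrative assumptions, and real pattern databases would be precomputed by searching each abstracted subproblem.

```python
from typing import Dict, Sequence, Tuple

def additive_pdb_heuristic(state: Sequence[int],
                           groups: Sequence[Tuple[int, ...]],
                           pdbs: Sequence[Dict[Tuple[int, ...], int]]) -> int:
    """Sum pattern database lookups over disjoint tile groups.

    Each PDB stores, for every placement of its group's tiles, the minimum
    number of moves of *those tiles only* needed to bring them to their goal
    positions. Because the groups are disjoint and moves of other tiles are
    never counted, the sum of the lookups remains admissible.
    """
    h = 0
    for group, pdb in zip(groups, pdbs):
        # Key the lookup by the current positions of this group's tiles.
        key = tuple(state.index(tile) for tile in group)
        h += pdb[key]
    return h
```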
Phase Transition in the Number Partitioning Problem
Number partitioning is an NP-complete problem of combinatorial optimization. A statistical mechanics analysis reveals the existence of a phase transition that separates the easy from the hard to solve instances and that reflects the pseudo-polynomiality of number partitioning. The phase diagram and the value of the typical ground state energy are calculated.
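For readers unfamiliar with the problem, a minimal brute-force sketch follows: the quantity being minimized (the "ground state energy") is the absolute difference of the two subset sums. The code is purely illustrative and only feasible for tiny instances.

```python
from itertools import product

def best_partition_energy(numbers):
    """Exhaustively assign each number to one of two subsets and return the
    smallest achievable |difference of subset sums|."""
    best = sum(numbers)
    for signs in product((+1, -1), repeat=len(numbers)):
        energy = abs(sum(s * a for s, a in zip(signs, numbers)))
        best = min(best, energy)
    return best

print(best_partition_energy([4, 5, 6, 7, 8]))  # 0, e.g. {4, 5, 6} vs {7, 8}
```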
GC skew is a conserved property of unmethylated CpG island promoters across vertebrates.
GC skew is a measure of the strand asymmetry in the distribution of guanines and cytosines. GC skew favors R-loops, a type of three-stranded nucleic acid structure that forms upon annealing of an RNA strand to one strand of DNA, creating a persistent RNA:DNA hybrid. Previous studies show that GC skew is prevalent at thousands of human CpG island (CGI) promoters and transcription termination regions, which correspond to hotspots of R-loop formation. Here, we investigated the conservation of GC skew patterns in 60 sequenced chordate genomes. We report that GC skew is a conserved sequence characteristic of the CGI promoter class in vertebrates. Furthermore, we reveal that promoter GC skew peaks at the exon 1/intron 1 junction and that it is highly correlated with gene age and CGI promoter strength. Our data also show that GC skew is predictive of unmethylated CGI promoters in a range of vertebrate species and that it imparts significant DNA hypomethylation on promoters with intermediate CpG densities. Finally, we observed that terminal GC skew is conserved for a subset of vertebrate genes that tend to be located significantly closer to their downstream neighbors, consistent with a role for R-loop formation in transcription termination.
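As a point of reference, GC skew over a stretch of sequence is commonly computed as (G - C)/(G + C); the sketch below assumes that definition, and the window and step sizes are arbitrary illustrative choices rather than the values used in the study.

```python
def gc_skew(seq: str, window: int = 100, step: int = 10):
    """Compute (G - C) / (G + C) in sliding windows along a DNA sequence."""
    seq = seq.upper()
    skews = []
    for start in range(0, max(len(seq) - window, 0) + 1, step):
        win = seq[start:start + window]
        g, c = win.count("G"), win.count("C")
        skews.append((g - c) / (g + c) if g + c else 0.0)
    return skews
```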
Random Costs in Combinatorial Optimization
The random cost problem is the problem of finding the minimum in an exponentially long list of random numbers. By definition, this problem cannot be solved faster than by exhaustive search. It is shown that a classical NP-hard optimization problem, number partitioning, is essentially equivalent to the random cost problem. This explains the poor performance of heuristic approaches to the number partitioning problem and allows us to calculate the probability distributions of the optimum and sub-optimum costs.
Embarrassingly Parallel Search
We propose Embarrassingly Parallel Search, a simple and efficient method for solving constraint programming problems in parallel. We split the initial problem into a huge number of independent subproblems and solve them with the available workers (i.e., cores of machines). The decomposition into subproblems is computed by selecting a subset of variables and by enumerating the combinations of values of these variables that are not detected as inconsistent by the propagation mechanism of a CP solver. The experiments on satisfaction and optimization problems suggest that generating between thirty and one hundred subproblems per worker leads to good scalability. We show that our method is quite competitive with the work-stealing approach and able to solve some classical problems at the maximum capacity of multi-core machines. Thanks to it, a user can parallelize the resolution of their problem without modifying the solver or writing any parallel source code, and can easily replay the resolution of a problem.
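A hedged sketch of the decomposition idea described above, assuming a placeholder propagation check and subproblem solver rather than any particular CP solver's API:

```python
from itertools import product
from multiprocessing import Pool

def decompose(domains, chosen_vars, propagate_ok):
    """Enumerate value combinations of the chosen variables and keep only
    those the propagation check does not refute; each survivor becomes an
    independent subproblem."""
    subproblems = []
    for values in product(*(domains[v] for v in chosen_vars)):
        assignment = dict(zip(chosen_vars, values))
        if propagate_ok(assignment):  # stand-in for a CP solver's propagation
            subproblems.append(assignment)
    return subproblems

def solve_in_parallel(subproblems, solve_subproblem, workers=4):
    # The paper suggests roughly 30-100 subproblems per worker for good scalability.
    with Pool(workers) as pool:
        return pool.map(solve_subproblem, subproblems)
```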