Parameterized Study of the Test Cover Problem
We carry out a systematic study of a natural covering problem, used for
identification across several areas, in the realm of parameterized complexity.
In the {\sc Test Cover} problem we are given a set $[n]=\{1,\dots,n\}$ of items
together with a collection, $\mathcal{T}$, of distinct subsets of these items called
tests. We assume that $\mathcal{T}$ is a test cover, i.e., for each pair of items
there is a test in $\mathcal{T}$ containing exactly one of these items. The
objective is to find a minimum-size subcollection of $\mathcal{T}$ which is still a
test cover. The generic parameterized version of {\sc Test Cover} is denoted by
$p(k)$-{\sc Test Cover}. Here, we are given $([n],\mathcal{T})$ and a
positive integer parameter $k$ as input and the objective is to decide whether
there is a test cover of size at most $p(k)$. We study four
parameterizations for {\sc Test Cover} and obtain the following:
(a) $k$-{\sc Test Cover} and $(n-k)$-{\sc Test Cover} are fixed-parameter
tractable (FPT).
(b) $(m-k)$-{\sc Test Cover} and $(\log n + k)$-{\sc Test Cover} are
W[1]-hard. Thus, it is unlikely that these problems are FPT.
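The test-cover condition is easy to state operationally. A minimal sketch (my own illustration, not code from the paper): a subcollection is a test cover if every pair of items is separated by some test containing exactly one of the two; the brute-force minimizer below is exponential and serves only to make the definition concrete.

```python
# Illustrative sketch of the Test Cover definition (not from the paper).
from itertools import combinations

def is_test_cover(items, tests):
    """True if every pair of distinct items is separated by some test,
    i.e. some test contains exactly one item of the pair."""
    for a, b in combinations(items, 2):
        if not any((a in t) != (b in t) for t in tests):
            return False
    return True

def min_test_cover(items, tests):
    """Brute-force smallest subcollection that is still a test cover.
    Exponential in len(tests); any test cover of n items needs at least
    ceil(log2 n) tests, since each item must get a distinct signature."""
    for k in range(1, len(tests) + 1):
        for sub in combinations(tests, k):
            if is_test_cover(items, sub):
                return list(sub)
    return None

items = {1, 2, 3, 4}
tests = [{1}, {2}, {3}, {1, 2}]
assert is_test_cover(items, tests)
print(min_test_cover(items, tests))  # a smallest sub-cover, here of size 3
```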
Monitoring and Occurrence of Heavy PAHs in Pomace Oil Supply Chain Using a Double-Step Solid-Phase Purification and HPLC-FLD Determination
Polycyclic aromatic hydrocarbons (PAHs) are ubiquitous environmental and processing contaminants generated by both spontaneous and anthropogenic incomplete combustion processes of organic matter. Contamination of PAHs in vegetable oils can result from several factors and processes, including environmental contamination, oil processing, and migration from food contact materials. The determination of PAHs in edible oil presents a challenge because of the complexity of the matrix. Since PAHs are present at much lower levels than triglycerides, it is necessary to isolate the compounds of interest from the rest of the matrix. For this purpose, a new purification approach based on a double solid-phase extraction (SPE) step followed by high performance liquid chromatography with fluorometric detection (HPLC-FLD) was developed. The method involves a first purification step using a 5 g silica SPE cartridge, previously washed with dichloromethane (20 mL), dried completely, and then conditioned with n-hexane (20 mL). The triglycerides are retained by the silica, while the PAH-containing fraction is eluted with a mixture of n-hexane/dichloromethane (70/30, v/v). After evaporation, the residue is loaded on a 5 g amino SPE cartridge and eluted with n-hexane/toluene (70/30, v/v) before HPLC-FLD analysis. The focus was the evaluation of the contribution of the various phases of the pomace oil supply chain in terms of the heavy PAHs (PAH8) concentration. Data collected showed that pomace contamination increased (by 15 times) as storage time increased. In addition, the process of pomace drying, which is necessary to reduce its moisture content before solvent extraction of the residual oil, appeared to contribute significantly to the total heavy PAHs content, with increases in value by up to 75 times.
Evaluating synteny for improved comparative studies
Motivation: Comparative genomics aims to understand the structure and function of genomes by translating knowledge gained about some genomes to the object of study. Early approaches used pairwise comparisons, but today researchers are attempting to leverage the larger potential of multi-way comparisons. Comparative genomics relies on the structuring of genomes into syntenic blocks: blocks of sequence that exhibit conserved features across the genomes. Syntenic blocks are required for complex computations to scale to the billions of nucleotides present in many genomes; they enable comparisons across broad ranges of genomes because they filter out much of the individual variability; they highlight candidate regions for in-depth studies; and they facilitate whole-genome comparisons through visualization tools. However, the concept of syntenic block remains loosely defined. Tools for the identification of syntenic blocks yield quite different results, thereby preventing a systematic assessment of the next steps in an analysis. Current tools do not include measurable quality objectives and thus cannot be benchmarked against themselves. Comparisons among tools have also been neglected; what few results are given use superficial measures unrelated to quality or consistency. Results: We present a theoretical model as well as an experimental basis for comparing syntenic blocks and thus also for improving or designing tools for the identification of syntenic blocks. We illustrate the application of the model and the measures by applying them to syntenic blocks produced by three different contemporary tools (DRIMM-Synteny, i-ADHoRe and Cyntenator) on a dataset of eight yeast genomes. Our findings highlight the need for a well-founded, systematic approach to the decomposition of genomes into syntenic blocks. Our experiments demonstrate widely divergent results among these tools, throwing into question the robustness of the basic approach in comparative genomics.
We have taken the first step towards a formal approach to the construction of syntenic blocks by developing a simple quality criterion based on sound evolutionary principles. Contact: [email protected]
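One way to make "widely divergent results" measurable is a position-level agreement score between two block decompositions of the same genome. The sketch below is my own toy measure (interval representation and the agreement metric are assumptions, not the paper's model): it scores the fraction of genome positions on which two tools agree about block membership.

```python
# Hypothetical comparison measure for syntenic-block decompositions
# (my own illustration; not the paper's quality criterion).

def positions_covered(blocks, genome_len):
    """Mark which positions fall inside any block.
    Blocks are half-open intervals [start, end)."""
    covered = [False] * genome_len
    for start, end in blocks:
        for i in range(start, end):
            covered[i] = True
    return covered

def coverage_agreement(blocks_a, blocks_b, genome_len):
    """Fraction of positions where both decompositions agree on
    whether the position belongs to some syntenic block."""
    a = positions_covered(blocks_a, genome_len)
    b = positions_covered(blocks_b, genome_len)
    agree = sum(x == y for x, y in zip(a, b))
    return agree / genome_len

tool1 = [(0, 50), (60, 100)]   # blocks reported by one tool
tool2 = [(0, 40), (60, 100)]   # blocks reported by another
print(coverage_agreement(tool1, tool2, 100))  # -> 0.9
```

A real measure would also need to track block identity and ordering across multiple genomes, but even this coarse score exposes disagreement between tools.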
Transcranial random noise stimulation (tRNS): a wide range of frequencies is needed for increasing cortical excitability
Transcranial random noise stimulation (tRNS) is a recent neuromodulation protocol. The high-frequency band (hf-tRNS) has been shown to be the most effective in enhancing neural excitability. The frequency band of hf-tRNS typically spans from 100 to 640 Hz. Here we asked whether both the lower and the higher half of the high-frequency band are needed for increasing neural excitability. Three frequency ranges (100–400 Hz, 400–700 Hz, 100–700 Hz) and a Sham condition were delivered for 10 minutes at an intensity of 1.5 mA over the primary motor cortex (M1). Single-pulse transcranial magnetic stimulation (TMS) was delivered over the same area at baseline and 0, 10, 20, 30, 45 and 60 minutes after stimulation, while motor evoked potentials (MEPs) were recorded to evaluate changes in cortical excitability. Only the full-band condition (100–700 Hz) was able to modulate excitability by enhancing MEPs at 10 and 20 minutes after stimulation: neither the higher nor the lower sub-range of the high-frequency band significantly modulated cortical excitability. These results show that the efficacy of tRNS is strictly related to the width of the selected frequency range.
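The band-limited noise waveforms compared in the study can be illustrated in a few lines. This is a generic sketch, not the stimulator's actual signal generation (the sampling rate and FFT-masking method are my assumptions; only the band edges come from the abstract): white noise is generated, then Fourier components outside the chosen band are zeroed.

```python
# Sketch: band-limited random noise as used conceptually in tRNS
# (assumed method and sampling rate; not the study's stimulator code).
import numpy as np

def band_limited_noise(low_hz, high_hz, duration_s, fs):
    """White Gaussian noise with spectral content restricted to
    [low_hz, high_hz], normalised to a peak of 1 (scale by the
    desired intensity, e.g. 1.5 mA, when delivering)."""
    n = int(duration_s * fs)
    rng = np.random.default_rng(0)
    white = rng.standard_normal(n)
    spectrum = np.fft.rfft(white)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spectrum[(freqs < low_hz) | (freqs > high_hz)] = 0.0  # keep band only
    noise = np.fft.irfft(spectrum, n)
    return noise / np.max(np.abs(noise))

fs = 10_000                                   # assumed sampling rate, Hz
full_band = band_limited_noise(100, 700, 1.0, fs)   # the 100-700 Hz condition
low_half  = band_limited_noise(100, 400, 1.0, fs)   # the 100-400 Hz condition
```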
To Google or not: differences in how online searches predict names and faces
Word and face recognition are processes of interest for a large number of fields, including both clinical psychology and computer science. The research examined here aims to evaluate the ability of online frequency measures to predict both face and word recognition by examining the stability of these processes over a given period of time. The study further examines the differences between traditional theories and current contextual-frequency approaches. Reaction times were analysed through both a logarithmic transformation and a Bayesian approach. The Bayes factor notation was employed as an additional test to support the evidence provided by the data. Although differences between face and name recognition were found, the results suggest that latencies for both face and name recognition are stable over a period of six months and that online news frequencies better predict reaction times for both, under classical frequentist analyses. These findings support the use of the contextual diversity approach.
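The core analysis pattern, log-transforming latencies and regressing them on a frequency measure, can be sketched as follows. The data below are simulated for illustration only (the effect size and noise level are my assumptions, not the study's data); the point is the shape of the analysis, where a negative slope means higher-frequency names are recognised faster.

```python
# Toy sketch of a frequency-predicts-latency analysis (simulated data;
# not the study's dataset or exact model).
import numpy as np

rng = np.random.default_rng(1)
log_freq = rng.uniform(0, 6, 200)                        # hypothetical log online frequencies
log_rt = 7.0 - 0.05 * log_freq + rng.normal(0, 0.1, 200)  # simulated log reaction times

# Classical frequentist side: least-squares fit of log RT on log frequency.
slope, intercept = np.polyfit(log_freq, log_rt, 1)
r = np.corrcoef(log_freq, log_rt)[0, 1]
print(slope, r)  # negative slope / correlation: frequent items answered faster
```

A Bayes factor computed on the same regression would then quantify the strength of evidence for the frequency effect rather than just its significance.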
Further investigations into the single metal deposition (SMD II) technique for the detection of latent fingermarks
Single metal deposition (SMD II), a recently proposed method for the development of latent fingermarks, was investigated by systematically altering aspects of the procedure to assess their effect on the level of development and contrast achieved. Gold nanoparticle size, temperature of the deposition solution bath, and orbital shaking during detection were shown to affect the levels of development and contrast obtained. Gold nanoparticles of diameter 15–21 nm were found to be most effective for satisfactory visualisation of latent fingermarks, while solutions applied at room temperature were found to adequately balance the ratio between the contrast of the fingermark ridge detail and the level of background staining achieved. Finally, optimum levels of development and contrast were obtained through constant agitation of both solution baths at approximately 50 RPM throughout the submersion time. SMD II was also tested on a large variety of substrate types and shown to be effective on a range of porous, non-porous, and semi-porous surfaces; however, the detection quality can be significantly influenced by the nature of the substrate. This resulted in the production of dark grey, white, or gold coloured fingermarks on different surfaces, as well as reversed detection on certain types of plastic, similar to that seen with vacuum metal deposition. © 2016 Elsevier Ireland Ltd.
Simultaneous Orthogonal Planarity
We introduce and study the {\sc OrthoSEFE-$k$} problem: Given $k$ planar
graphs each with maximum degree 4 and the same vertex set, do they admit an
OrthoSEFE, that is, is there an assignment of the vertices to grid points and
of the edges to paths on the grid such that the same edges in distinct graphs
are assigned the same path and such that the assignment induces a planar
orthogonal drawing of each of the graphs?
We show that the problem is NP-complete for $k \geq 3$ even if the shared
graph is a Hamiltonian cycle and has sunflower intersection, and for $k \geq 2$
even if the shared graph consists of a cycle and of isolated vertices. On the
positive side, the problem is polynomial-time solvable for $k = 2$ when the union
graph has maximum degree five and the shared graph is biconnected. Further, when the
shared graph is biconnected and has sunflower intersection, we show that every
positive instance has an OrthoSEFE with at most three bends per edge.
Comment: Appears in the Proceedings of the 24th International Symposium on
Graph Drawing and Network Visualization (GD 2016)
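The two structural conditions in the statement are simple to check. The helpers below are my own illustration (not from the paper): an instance requires each input graph to have maximum degree 4, and "sunflower intersection" means every edge shared by at least two of the $k$ graphs belongs to all of them.

```python
# Illustrative validity checks for an OrthoSEFE-k instance
# (my own helpers, not from the paper).
from collections import Counter

def max_degree_ok(edges, limit=4):
    """Each vertex of an orthogonal drawing has at most 4 ports,
    so every input graph must have maximum degree <= 4."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return all(d <= limit for d in deg.values())

def has_sunflower_intersection(graphs):
    """Sunflower intersection: every edge occurring in at least two
    graphs occurs in all k of them."""
    count = Counter(frozenset(e) for g in graphs for e in g)
    k = len(graphs)
    return all(c == 1 or c == k for c in count.values())

g1 = [(1, 2), (2, 3), (3, 1)]            # shared cycle
g2 = [(1, 2), (2, 3), (3, 1), (1, 4)]    # cycle + a private edge
g3 = [(1, 2), (2, 3), (3, 1), (2, 4)]    # cycle + a different private edge
print(max_degree_ok(g1), has_sunflower_intersection([g1, g2, g3]))  # True True
```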
Information filtering via Iterative Refinement
With the explosive growth of accessible information, especially on the
Internet, evaluation-based filtering has become a crucial task. Various systems
have been devised aiming to sort through large volumes of information and
select what is likely to be more relevant. In this letter we analyse a new
ranking method, where the reputation of information providers is determined
self-consistently.
Comment: 10 pages, 3 figures. Accepted for publication in Europhysics Letters
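A self-consistent reputation scheme of this flavour can be sketched in a few lines. The update rule below is an assumption for illustration (inverse-squared-error weights; the letter's exact rule may differ): estimated qualities are reputation-weighted averages of the ratings, each rater's reputation is the inverse of its mean squared deviation from those estimates, and the two are iterated to a fixed point.

```python
# Sketch of iterative refinement of rater reputations (illustrative
# update rule; not necessarily the letter's exact scheme).
import numpy as np

def iterative_refinement(R, n_iter=50, eps=1e-8):
    """R: ratings matrix of shape (n_raters, n_objects).
    Returns (quality estimates, rater reputations)."""
    n_raters, _ = R.shape
    w = np.ones(n_raters)                  # start with equal reputations
    for _ in range(n_iter):
        q = w @ R / w.sum()                # reputation-weighted quality estimates
        err = ((R - q) ** 2).mean(axis=1)  # each rater's mean squared deviation
        w = 1.0 / (err + eps)              # inverse-error reputation
    return q, w

true_q = np.array([0.2, 0.5, 0.9])
# Two raters report the true qualities; one reports noise.
R = np.vstack([true_q, true_q, [0.9, 0.1, 0.2]])
q, w = iterative_refinement(R)
print(q)  # converges towards the honest raters' values
```

The noisy rater's deviations stay large, so its reputation, and hence its influence on the quality estimates, shrinks with each iteration.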