
    NLC-2 graph recognition and isomorphism

    NLC-width is a variant of clique-width with many applications in graph algorithmics. This paper is devoted to graphs of NLC-width two. After giving new structural properties of the class, we propose an $O(n^2 m)$-time recognition algorithm, improving Johansson's algorithm \cite{Johansson00}. Moreover, our algorithm is simple to understand. The above properties and algorithm allow us to propose a robust $O(n^2 m)$-time isomorphism algorithm for NLC-2 graphs. As far as we know, it is the first polynomial-time isomorphism algorithm for this class. Comment: submitted to WG 2007; 12 pages.
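
    For readers less familiar with NLC-width: graphs of NLC-width k are built from single vertices labelled in {1, ..., k} using two operations, relabelling and a labelled union that adds all edges between chosen label pairs of the two operands. The sketch below is a minimal illustration of these operations, using our own toy representation; it is not the paper's algorithm.

    ```python
    # A toy labelled-graph representation: (labels, edges), where labels
    # maps each vertex to a label in {1, 2} for NLC-width two.

    def single(v, label):
        """One-vertex graph carrying the given label."""
        return {v: label}, set()

    def union(g1, g2, S):
        """Disjoint union of g1 and g2; for each pair (a, b) in S, add an
        edge between every g1-vertex labelled a and g2-vertex labelled b."""
        (l1, e1), (l2, e2) = g1, g2
        labels = {**l1, **l2}            # vertex sets assumed disjoint
        edges = set(e1) | set(e2)
        for u, a in l1.items():
            for v, b in l2.items():
                if (a, b) in S:
                    edges.add((u, v))
        return labels, edges

    def relabel(g, rho):
        """Apply a relabelling rho: label -> label to every vertex."""
        labels, edges = g
        return {v: rho[l] for v, l in labels.items()}, edges

    # Example: building the path a-b-c as a two-label expression.
    g = union(union(single("a", 1), single("b", 2), {(1, 2)}),
              single("c", 2), {(2, 2)})
    print(sorted(g[1]))  # [('a', 'b'), ('b', 'c')]
    ```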

    Extending Partial Representations of Circle Graphs

    The partial representation extension problem is a recently introduced generalization of the recognition problem. A circle graph is an intersection graph of chords of a circle. We study the partial representation extension problem for circle graphs, where the input consists of a graph G and a partial representation R′ giving some pre-drawn chords that represent an induced subgraph of G. The question is whether one can extend R′ to a representation R of the entire G, i.e., whether one can draw the remaining chords into a partially pre-drawn representation. Our main result is a polynomial-time algorithm for partial representation extension of circle graphs. To show this, we describe the structure of all representations of a circle graph, based on split decomposition. This can be of independent interest.
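
    For intuition: a representation assigns each vertex a chord, and two vertices are adjacent exactly when their chords cross, i.e., when their endpoints interleave around the circle. Below is a minimal sketch of that crossing test and of building a circle graph from a chord diagram; it is illustrative only, not the paper's extension algorithm, and the names are ours.

    ```python
    # Chord endpoints are distinct positions around the circle, so a chord
    # is a pair of numbers; two chords cross iff exactly one endpoint of
    # one chord lies strictly between the endpoints of the other.

    def crosses(c1, c2):
        a, b = sorted(c1)
        return sum(1 for p in c2 if a < p < b) == 1

    def circle_graph(chords):
        """Intersection graph of a dict: chord name -> (endpoint, endpoint)."""
        names = list(chords)
        return {(u, v) for i, u in enumerate(names)
                for v in names[i + 1:] if crosses(chords[u], chords[v])}

    # x and y interleave (0 < 2 < 4 < 6); z is disjoint from both.
    print(circle_graph({"x": (0, 4), "y": (2, 6), "z": (7, 9)}))  # {('x', 'y')}
    ```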

    Practical and Efficient Split Decomposition via Graph-Labelled Trees

    Split decomposition of graphs was introduced by Cunningham (under the name join decomposition) as a generalization of modular decomposition. This paper undertakes an investigation into the algorithmic properties of split decomposition. We do so in the context of graph-labelled trees (GLTs), a new combinatorial object designed to simplify its consideration. GLTs are used to derive an incremental characterization of split decomposition, with a simple combinatorial description, and to explore its properties with respect to Lexicographic Breadth-First Search (LBFS). Applying the incremental characterization to an LBFS ordering results in a split decomposition algorithm that runs in time $O((n+m)\alpha(n+m))$, where $\alpha$ is the inverse Ackermann function, whose value is smaller than 4 for any practical graph. Compared to Dahlhaus' linear-time split decomposition algorithm (Dahlhaus in J. Algorithms 36(2):205-240, 2000), which does not rely on an incremental construction, our algorithm is just as fast in all but the asymptotic sense, and a full implementation is available.
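
    Because the algorithm consumes an LBFS ordering, a compact reminder of how LBFS works may help: each unvisited vertex carries a lexicographic label recording when its already-visited neighbours were reached, and the vertex with the largest label is taken next. The following O(n^2) version is a didactic sketch, not the paper's GLT-based implementation.

    ```python
    # Lexicographic Breadth-First Search over an adjacency-set dict.
    # Labels are lists of decreasing timestamps, compared lexicographically.

    def lbfs(adj, start):
        label = {v: [] for v in adj}
        order = []
        unvisited = set(adj)
        label[start].append(len(adj) + 1)        # force `start` to go first
        for i in range(len(adj), 0, -1):
            v = max(unvisited, key=lambda u: label[u])   # largest label wins
            unvisited.remove(v)
            order.append(v)
            for w in adj[v]:
                if w in unvisited:
                    label[w].append(i)           # i decreases, labels stay sorted
        return order

    adj = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"}, "d": {"c"}}
    print(lbfs(adj, "a"))   # one valid LBFS order, e.g. ['a', 'b', 'c', 'd']
    ```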

    Analysis of Congruency Effects of Corporate Responsibility Code Implementation on Corporate Sustainability in Bio-Economy

    The present study provides an overview of the congruency effect of social reporting on corporate sustainability in the bio-economy. The paper is structured to identify the main congruency forces that constitute both the strategic and operational dimensions linked to the implementation of a corporate responsibility code within companies from the bio-economy sector. The research shows that the congruency effect is stronger when companies report a greater tendency towards social reporting and pay more attention to their relationships with stakeholders. Moreover, the results shed new light on the role of a company's social identity, both in its internal processes and in consolidating its position in the market by using CSR reporting elements.

    Pseudorapidity densities of charged particles with transverse momentum thresholds in pp collisions at √ s = 5.02 and 13 TeV

    The pseudorapidity density of charged particles with minimum transverse momentum (pT) thresholds of 0.15, 0.5, 1, and 2 GeV/c is measured in pp collisions at the center-of-mass energies of √s = 5.02 and 13 TeV with the ALICE detector. The study is carried out for inelastic collisions with at least one primary charged particle having a pseudorapidity (η) within |η| < 0.8 and pT larger than the corresponding threshold. In addition, measurements without pT thresholds are performed for inelastic and non-single-diffractive events, as well as for inelastic events with at least one charged particle having |η| < 1. The agreement between the measurements and model predictions deteriorates at the highest threshold (pT > 2 GeV/c), highlighting the importance of such measurements for tuning event generators. The new measurements agree within uncertainties with results from the ATLAS and CMS experiments obtained at √s = 13 TeV.
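
    For readers unfamiliar with the variables: pseudorapidity is defined as η = -ln tan(θ/2), with θ the polar angle relative to the beam axis, and pT is the momentum component transverse to the beam. Below is a toy sketch of the per-event selection described above; it is illustrative only, not ALICE analysis code.

    ```python
    import math

    def pseudorapidity(px, py, pz):
        """eta = -ln(tan(theta/2)), equivalently atanh(pz / |p|)."""
        p = math.sqrt(px * px + py * py + pz * pz)
        return math.atanh(pz / p)

    def counts_above_thresholds(tracks, thresholds=(0.15, 0.5, 1.0, 2.0)):
        """Count tracks with |eta| < 0.8 and pT above each threshold (GeV/c)."""
        counts = {t: 0 for t in thresholds}
        for px, py, pz in tracks:
            pt = math.hypot(px, py)
            if abs(pseudorapidity(px, py, pz)) < 0.8:
                for t in thresholds:
                    if pt > t:
                        counts[t] += 1
        return counts

    # One toy event with three charged tracks (momenta in GeV/c); the third
    # track is too forward (|eta| > 0.8) and is excluded.
    print(counts_above_thresholds([(0.3, 0.1, 0.2), (1.2, 0.5, -0.4),
                                   (0.1, 0.05, 3.0)]))
    # {0.15: 2, 0.5: 1, 1.0: 1, 2.0: 0}
    ```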

    Direct observation of the dead-cone effect in quantum chromodynamics

    At particle collider experiments, elementary particle interactions with large momentum transfer produce quarks and gluons (known as partons) whose evolution is governed by the strong force, as described by the theory of quantum chromodynamics (QCD) [1]. The vacuum is not transparent to the partons and induces gluon radiation and quark pair production in a process that can be described as a parton shower [2]. Studying the pattern of the parton shower is one of the key experimental tools in understanding the properties of QCD. This pattern is expected to depend on the mass of the initiating parton, through a phenomenon known as the dead-cone effect, which predicts a suppression of the gluon spectrum emitted by a heavy quark of mass m and energy E, within a cone of angular size m/E around the emitter [3]. A direct observation of the dead-cone effect in QCD has not been possible until now, due to the challenge of reconstructing the cascading quarks and gluons from the experimentally accessible bound hadronic states. Here we show the first direct observation of the QCD dead cone by using new iterative declustering techniques [4, 5] to reconstruct the parton shower of charm quarks. This result confirms a fundamental feature of QCD, which is derived more generally from its origin as a gauge quantum field theory. Furthermore, the measurement of a dead-cone angle constitutes the first direct experimental observation of the non-zero mass of the charm quark, which is a fundamental constant in the standard model of particle physics.
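
    A back-of-the-envelope illustration of the m/E scaling quoted above: taking the charm-quark mass as roughly 1.27 GeV/c², the dead-cone opening angle shrinks as the quark's energy grows, which is why the analysis studies emission angles as a function of the radiator's energy. This is a toy calculation, not the paper's declustering procedure.

    ```python
    CHARM_MASS = 1.27   # GeV/c^2, approximate charm-quark mass

    def dead_cone_angle(mass, energy):
        """Characteristic dead-cone opening angle in radians, theta_c ~ m/E."""
        return mass / energy

    for e in (5.0, 10.0, 20.0):
        print(f"E = {e:4.0f} GeV -> theta_c ~ {dead_cone_angle(CHARM_MASS, e):.3f} rad")
    # E =    5 GeV -> theta_c ~ 0.254 rad
    # E =   10 GeV -> theta_c ~ 0.127 rad
    # E =   20 GeV -> theta_c ~ 0.064 rad
    ```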

    Nano-antimicrobials: A New Paradigm for Combating Mycobacterial Resistance


    Antiinflammatory therapy with canakinumab for atherosclerotic disease

    BACKGROUND: Experimental and clinical data suggest that reducing inflammation without affecting lipid levels may reduce the risk of cardiovascular disease. Yet, the inflammatory hypothesis of atherothrombosis has remained unproved. METHODS: We conducted a randomized, double-blind trial of canakinumab, a therapeutic monoclonal antibody targeting interleukin-1β, involving 10,061 patients with previous myocardial infarction and a high-sensitivity C-reactive protein level of 2 mg or more per liter. The trial compared three doses of canakinumab (50 mg, 150 mg, and 300 mg, administered subcutaneously every 3 months) with placebo. The primary efficacy end point was nonfatal myocardial infarction, nonfatal stroke, or cardiovascular death. RESULTS: At 48 months, the median reduction from baseline in the high-sensitivity C-reactive protein level was 26 percentage points greater in the group that received the 50-mg dose of canakinumab, 37 percentage points greater in the 150-mg group, and 41 percentage points greater in the 300-mg group than in the placebo group. Canakinumab did not reduce lipid levels from baseline. At a median follow-up of 3.7 years, the incidence rate for the primary end point was 4.50 events per 100 person-years in the placebo group, 4.11 events per 100 person-years in the 50-mg group, 3.86 events per 100 person-years in the 150-mg group, and 3.90 events per 100 person-years in the 300-mg group. The hazard ratios as compared with placebo were as follows: in the 50-mg group, 0.93 (95% confidence interval [CI], 0.80 to 1.07; P=0.30); in the 150-mg group, 0.85 (95% CI, 0.74 to 0.98; P=0.021); and in the 300-mg group, 0.86 (95% CI, 0.75 to 0.99; P=0.031). The 150-mg dose, but not the other doses, met the prespecified multiplicity-adjusted threshold for statistical significance for the primary end point and the secondary end point that additionally included hospitalization for unstable angina that led to urgent revascularization (hazard ratio vs. placebo, 0.83; 95% CI, 0.73 to 0.95; P=0.005). Canakinumab was associated with a higher incidence of fatal infection than was placebo. There was no significant difference in all-cause mortality (hazard ratio for all canakinumab doses vs. placebo, 0.94; 95% CI, 0.83 to 1.06; P=0.31). CONCLUSIONS: Antiinflammatory therapy targeting the interleukin-1β innate immunity pathway with canakinumab at a dose of 150 mg every 3 months led to a significantly lower rate of recurrent cardiovascular events than placebo, independent of lipid-level lowering.
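
    As a quick arithmetic cross-check of the figures above: the crude rate ratios implied by the reported incidence rates track the published hazard ratios closely. Note the trial itself used time-to-event models, so the crude ratios below are only an approximation.

    ```python
    placebo_rate = 4.50    # primary-end-point events per 100 person-years
    dose_rates = {"50 mg": 4.11, "150 mg": 3.86, "300 mg": 3.90}
    published_hr = {"50 mg": 0.93, "150 mg": 0.85, "300 mg": 0.86}

    for dose, rate in dose_rates.items():
        print(f"{dose}: crude rate ratio {rate / placebo_rate:.2f}"
              f" vs reported HR {published_hr[dose]:.2f}")
    # 50 mg: crude rate ratio 0.91 vs reported HR 0.93
    # 150 mg: crude rate ratio 0.86 vs reported HR 0.85
    # 300 mg: crude rate ratio 0.87 vs reported HR 0.86
    ```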