
    The paradox between resistance to hypoxia and liability to hypoxic damage in hyperglycemic peripheral nerves. Evidence for glycolysis involvement

    Isolated ventral and dorsal rat spinal roots incubated in normal (2.5 mM) or high glucose (25 mM) concentrations or in high concentrations of other hexoses were exposed transiently to hypoxia (30 min) in a solution of low buffering power. Compound nerve action potentials, extracellular direct current potentials, and interstitial pH were continuously recorded before, during, and after hypoxia. Ventral roots incubated in 25 mM D-glucose showed resistance to hypoxia. Dorsal roots, on the other hand, revealed electrophysiological damage by hyperglycemic hypoxia, as indicated by a lack of posthypoxic recovery. In both types of spinal roots, interstitial acidification was most pronounced during hyperglycemic hypoxia. The changes in the sensitivity to hypoxia induced by high concentrations of D-glucose were mimicked by high concentrations of D-mannose. In contrast, D-galactose, L-glucose, D-fructose, and L-fucose did not have such effects. Resistance to hypoxia, hypoxia-generated interstitial acidification, and hypoxia-induced electrophysiological damage were absent after pharmacological inhibition of nerve glycolysis with iodoacetate. These observations indicate 1) that enhanced anaerobic glycolysis produces resistance to hypoxia in hyperglycemic peripheral nerves and 2) that acidification may impair the function of peripheral axons when anaerobic glycolysis proceeds in a tissue with reduced buffering power.

    The Complexity of Repairing, Adjusting, and Aggregating of Extensions in Abstract Argumentation

    We study the computational complexity of problems that arise in abstract argumentation in the context of dynamic argumentation, minimal change, and aggregation. In particular, we consider the following problems, in each of which an argumentation framework F and a small positive integer k are given.
    - The Repair problem asks whether a given set of arguments can be modified into an extension by at most k elementary changes (i.e., the extension is at distance k from the given set).
    - The Adjust problem asks whether a given extension can be modified by at most k elementary changes into an extension that contains a specified argument.
    - The Center problem asks whether, given two extensions at distance k, there is a "center" extension at distance at most (k-1) from both given extensions.
    We study these problems in the framework of parameterized complexity, taking the distance k as the parameter. Our results cover several different semantics, including the admissible, complete, preferred, semi-stable, and stable semantics.
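    A toy illustration may help pin down the distance notion. The Python sketch below is ours, not the paper's: the function names and the choice of admissible semantics are our own, and the exponential search is meant only to make the Repair definition concrete, namely whether some admissible set lies within symmetric-difference distance k of a given argument set.

        from itertools import combinations

        def is_conflict_free(s, attacks):
            # no member of s attacks another member of s
            return not any((a, b) in attacks for a in s for b in s)

        def is_admissible(s, args, attacks):
            # conflict-free, and every attacker of a member is counter-attacked from s
            if not is_conflict_free(s, attacks):
                return False
            return all(any((d, b) in attacks for d in s)
                       for a in s for b in args if (b, a) in attacks)

        def repair(s, k, args, attacks):
            # Repair: is some admissible set within symmetric-difference distance k of s?
            for dist in range(k + 1):
                for flips in combinations(sorted(args), dist):
                    t = s.symmetric_difference(flips)
                    if is_admissible(t, args, attacks):
                        return t
            return None

        # toy framework: a attacks b, b attacks c
        args = {"a", "b", "c"}
        attacks = {("a", "b"), ("b", "c")}
        print(repair({"c"}, 1, args, attacks))  # {'a', 'c'}: one elementary change (add a)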

    Development of a device to simulate tooth mobility

    Objectives: The testing of new materials under simulated oral conditions is essential in medicine. For fracture-strength testing, different simulation devices are used in the test set-up. The results of these in vitro tests differ because there is no standardization of tooth mobility in simulation devices. The aim of this study is to develop a simulation device that reproduces the tooth mobility curve as accurately as possible and creates reproducible and scalable mobility curves. Materials and methods: With the aid of published literature and with the help of dentists, average forms of the tooth classes were generated. Based on these tooth data, different abutment tooth shapes and different simulation devices were designed with a CAD system and produced with a rapid prototyping system. Then, for all simulation devices, the displacement curves were recorded with a universal testing machine and compared with the tooth mobility curve; a comparison of this kind is sketched below. With this new information, an improved, adapted simulation device was constructed. Results: A simulation device was developed that simulates the mobility curve of natural teeth with high accuracy and whose mobility is reproducible and scalable.
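    As a minimal illustration of such a curve comparison (a Python sketch of our own, not part of the study; all displacement values are invented), one can score how closely a device's displacement curve follows the reference tooth mobility curve with a root-mean-square deviation over matched force steps:

        import math

        def rmse(measured, reference):
            # root-mean-square deviation between a device's displacement curve
            # and the reference tooth mobility curve at matched force steps
            assert len(measured) == len(reference)
            return math.sqrt(sum((m - r) ** 2 for m, r in zip(measured, reference))
                             / len(measured))

        # illustrative displacements (micrometres) at six force steps -- invented values
        reference = [0, 20, 35, 45, 52, 57]
        device_a = [0, 10, 22, 34, 46, 58]
        device_b = [0, 18, 33, 44, 52, 58]
        print(rmse(device_a, reference))  # ~8.4: poor fit
        print(rmse(device_b, reference))  # ~1.3: close fit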

    Parameterized Complexity of the k-anonymity Problem

    The problem of publishing personal data without giving up privacy is becoming increasingly important. An interesting formalization that has been recently proposed is k-anonymity. This approach requires that the rows of a table are partitioned into clusters of size at least k and that all the rows in a cluster become the same tuple after the suppression of some entries. The natural optimization problem, where the goal is to minimize the number of suppressed entries, is known to be APX-hard even when the record values are over a binary alphabet and k=3, and when the records have length at most 8 and k=4. In this paper we study how the complexity of the problem is influenced by different parameters, first showing that the problem is W[1]-hard when parameterized by the size of the solution (and the value k). Then we exhibit a fixed-parameter algorithm for the problem parameterized by the size of the alphabet and the number of columns. Finally, we investigate the computational (and approximation) complexity of the k-anonymity problem when the instance is restricted to records of length at most 3 and k=3. We show that such a restriction is APX-hard.
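    To make the objective concrete, the following Python sketch (ours, not from the paper) scores a candidate clustering: within each cluster of size at least k, every column on which the rows disagree must be suppressed in all of that cluster's rows, and the cost is the total number of suppressed entries.

        def suppression_cost(rows, clusters, k):
            # cost of a candidate k-anonymous clustering: each column on which a
            # cluster's rows disagree is suppressed ('*') in all of its rows
            cost = 0
            for cluster in clusters:
                assert len(cluster) >= k, "every cluster needs at least k rows"
                group = [rows[i] for i in cluster]
                disagreeing = sum(1 for column in zip(*group) if len(set(column)) > 1)
                cost += disagreeing * len(group)  # one suppression per row per bad column
            return cost

        # four binary records of length 3, partitioned into two clusters of size 2
        rows = [(0, 1, 1), (0, 1, 0), (1, 0, 0), (1, 1, 0)]
        print(suppression_cost(rows, [[0, 1], [2, 3]], k=2))  # 4 suppressed entries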

    Locality and Bounding-Box Quality of Two-Dimensional Space-Filling Curves

    Space-filling curves can be used to organise points in the plane into bounding-box hierarchies (such as R-trees). We develop measures of the bounding-box quality of space-filling curves that express how effective different space-filling curves are for this purpose. We give general lower bounds on the bounding-box quality measures and on locality according to Gotsman and Lindenbaum for a large class of space-filling curves. We describe a generic algorithm to approximate these and similar quality measures for any given curve. Using our algorithm, we find good approximations of the locality and the bounding-box quality of several known and new space-filling curves. Surprisingly, some curves with relatively bad locality by Gotsman and Lindenbaum's measure have good bounding-box quality, while the curve with the best-known locality has relatively bad bounding-box quality.
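    For readers unfamiliar with the setting, the Python sketch below (our own; the paper's quality measures are defined more carefully) shows the basic experiment: map consecutive positions along a Hilbert curve to grid points and compute the bounding box of the resulting point set. A curve has good bounding-box quality when such boxes stay small relative to the number of curve positions they cover.

        def hilbert_d2xy(order, d):
            # position d along a Hilbert curve filling a 2^order x 2^order grid
            # (standard iterative index-to-coordinate conversion)
            x = y = 0
            t = d
            s = 1
            while s < (1 << order):
                rx = 1 & (t // 2)
                ry = 1 & (t ^ rx)
                if ry == 0:  # rotate/flip the quadrant
                    if rx == 1:
                        x, y = s - 1 - x, s - 1 - y
                    x, y = y, x
                x += s * rx
                y += s * ry
                t //= 4
                s *= 2
            return x, y

        def bbox_of_range(order, start, length):
            # bounding box of `length` consecutive curve positions
            pts = [hilbert_d2xy(order, d) for d in range(start, start + length)]
            xs, ys = zip(*pts)
            return min(xs), min(ys), max(xs), max(ys)

        # 16 consecutive positions on an 8x8 grid stay inside a small box
        print(bbox_of_range(order=3, start=8, length=16))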

    Parameterized Edge Hamiltonicity

    We study the parameterized complexity of the classical Edge Hamiltonian Path problem and give several fixed-parameter tractability results. First, we settle an open question of Demaine et al. by showing that Edge Hamiltonian Path is FPT parameterized by vertex cover, and that it also admits a cubic kernel. We then show fixed-parameter tractability even for a generalization of the problem to arbitrary hypergraphs, parameterized by the size of a (supplied) hitting set. We also consider the problem parameterized by treewidth or clique-width. Surprisingly, we show that the problem is FPT for both of these standard parameters, in contrast to its vertex version, which is W-hard for clique-width. Our technique, which may be of independent interest, relies on a structural characterization of clique-width in terms of treewidth and complete bipartite subgraphs due to Gurski and Wanke.
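    The underlying problem is easy to state: order all edges of a graph so that consecutive edges share an endpoint, i.e., find a Hamiltonian path in the line graph. The brute-force Python sketch below (ours; exponential, for tiny illustrative graphs only, and far removed from the paper's FPT algorithms) checks exactly this definition.

        from itertools import permutations

        def edge_hamiltonian_path(edges):
            # brute force: try every ordering of the edges and accept one in which
            # consecutive edges share an endpoint
            for order in permutations(edges):
                if all(set(order[i]) & set(order[i + 1]) for i in range(len(order) - 1)):
                    return list(order)
            return None

        # a triangle with one pendant edge has an edge Hamiltonian path
        print(edge_hamiltonian_path([(1, 2), (2, 3), (1, 3), (3, 4)]))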

    Mass deposition fluxes of Saharan mineral dust to the tropical northeast Atlantic Ocean: an intercomparison of methods

    Mass deposition fluxes of mineral dust to the tropical northeast Atlantic Ocean were determined within this study. In the framework of SOPRAN (Surface Ocean Processes in the Anthropocene), the interaction between the atmosphere and the ocean in terms of material exchange was investigated at the Cape Verde atmospheric observatory (CVAO) on the island of São Vicente for January 2009. Five different methods were applied to estimate the deposition flux, using different meteorological and physical measurements, remote sensing, and regional dust transport simulations. The set of observations comprises micrometeorological measurements with an ultrasonic anemometer, profile measurements using 2-D anemometers at two different heights, and microphysical measurements of the size-resolved mass concentrations of mineral dust. In addition, the total mass concentration of mineral dust was derived from absorption photometer observations and passive sampling. The regional dust model COSMO-MUSCAT was used for simulations of dust emission and transport, including dry and wet deposition processes. This model was chosen because it reproduces the aerosol optical depths and mass concentrations realistically compared to the measurements and because it was run for the time period of the measurements. The four observation-based methods yield a monthly average deposition flux of mineral dust of 12–29 ng m⁻² s⁻¹. The simulation results come close to the upper range of the measurements, with an average value of 47 ng m⁻² s⁻¹. It is shown that the mass deposition flux of mineral dust obtained by the combination of micrometeorological (ultrasonic anemometer) and microphysical measurements (particle mass size distribution of mineral dust) is difficult to compare to modeled mass deposition fluxes when the mineral dust is inhomogeneously distributed over the investigated area.
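    As background for the observation-based methods, a common dry-deposition estimate takes the flux as the product of a size-dependent deposition velocity and the size-resolved mass concentration, summed over particle size bins. The Python sketch below is our own illustration of that formula, not the study's code, and all numbers in it are invented.

        def dry_deposition_flux(conc_ng_m3, v_dep_m_s):
            # F = sum_i C_i * v_d,i over particle size bins -> ng m^-2 s^-1
            return sum(c * v for c, v in zip(conc_ng_m3, v_dep_m_s))

        conc = [5e3, 2e4, 1e4]      # dust mass concentration per size bin, ng m^-3 (invented)
        v_dep = [1e-4, 5e-4, 2e-3]  # deposition velocity per size bin, m s^-1 (invented)
        print(dry_deposition_flux(conc, v_dep), "ng m^-2 s^-1")  # 30.5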