621 research outputs found

    On Temporal Graph Exploration

    A temporal graph is a graph in which the edge set can change from step to step. The temporal graph exploration problem TEXP is the problem of computing a foremost exploration schedule for a temporal graph, i.e., a temporal walk that starts at a given start node, visits all nodes of the graph, and has the smallest arrival time. In the first part of the paper, we consider only temporal graphs that are connected at each step. For such temporal graphs with n nodes, we show that it is NP-hard to approximate TEXP with ratio O(n^{1-Δ}) for any Δ > 0. We also provide an explicit construction of temporal graphs that require Θ(n^2) steps to be explored. We then consider TEXP under the assumption that the underlying graph (i.e., the graph that contains all edges that are present in the temporal graph in at least one step) belongs to a specific class of graphs. Among other results, we show that temporal graphs can be explored in O(n^{1.5} k^2 log n) steps if the underlying graph has treewidth k and in O(n log^3 n) steps if the underlying graph is a 2×n grid. In the second part of the paper, we replace the connectedness assumption by a weaker assumption and show that m-edge temporal graphs with regularly present edges and with random edges can always be explored in O(m) steps and O(m log n) steps with high probability, respectively. We finally show that the latter result can be used to obtain a distributed algorithm for the gossiping problem. Comment: This is an extended version of an ICALP 2015 paper.
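    As a concrete illustration of the model above, the sketch below computes earliest-arrival times in a temporal graph and uses them in a greedy exploration heuristic. It only illustrates the TEXP objective under a one-edge-per-step walk model; it is not one of the paper's algorithms, and the function names and the list-of-edge-lists representation are assumptions of this sketch.

```python
def earliest_arrival(edges_per_step, start, t0):
    # Earliest step at which each node can be reached by a temporal walk
    # that is at `start` at step t0 (waiting at a node is allowed; one
    # edge may be traversed per step).
    arrival = {start: t0}
    for t in range(t0, len(edges_per_step)):
        for u, v in edges_per_step[t]:
            for a, b in ((u, v), (v, u)):
                if a in arrival and arrival[a] <= t and b not in arrival:
                    arrival[b] = t + 1  # traverse (a, b) during step t
    return arrival

def greedy_exploration(edges_per_step, start):
    # Heuristic exploration schedule: repeatedly walk to the unvisited
    # node with the smallest earliest-arrival time.  Returns the arrival
    # time of the resulting schedule, or None if some node is unreachable
    # within the given horizon.  (Only each walk's target is marked
    # visited here, a deliberate simplification.)
    nodes = {v for step in edges_per_step for e in step for v in e}
    unvisited = nodes - {start}
    current, now = start, 0
    while unvisited:
        arrival = earliest_arrival(edges_per_step, current, now)
        reachable = [(arrival[v], v) for v in unvisited if v in arrival]
        if not reachable:
            return None
        now, current = min(reachable)
        unvisited.discard(current)
    return now
```

    For example, on the two-step temporal graph [[(0, 1)], [(1, 2)]] with start node 0, the schedule reaches node 1 at step 1 and node 2 at step 2.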

    The creative process in graphic design: breaking out of established work modes through modularity

    The field of graphic design is influenced by rapid technological and social changes, challenging us to redefine how we think about the creative design process. In this thesis, the well-known concept of modularity will be investigated from a contemporary perspective as a way to break out of established work modes which rely on a linear design process. Six types of modularity, as defined for use in product design, create the framework for a series of visual explorations. The underlying method is an iterative design process of graphic prototyping and modeling, followed by a critical review of the visual outcome. These explorations demonstrate how modularity can encourage creativity in the graphic design process. The benefit of a modular approach to the creative process is supported by research from the fields of psychology and design.

    Scheduling with Explorable Uncertainty

    We introduce a novel model for scheduling with explorable uncertainty. In this model, the processing time of a job can potentially be reduced (by an a priori unknown amount) by testing the job. Testing a job j takes one unit of time and may reduce its processing time from the given upper limit p'_j (which is the time taken to execute the job if it is not tested) to any value between 0 and p'_j. This setting is motivated, e.g., by applications where a code optimizer can be run on a job before executing it. We consider the objective of minimizing the sum of completion times on a single machine. All jobs are available from the start, but the reduction in their processing times as a result of testing is unknown, making this an online problem that is amenable to competitive analysis. The need to balance the time spent on tests and the time spent on job executions adds a novel flavor to the problem. We give the first and nearly tight lower and upper bounds on the competitive ratio for deterministic and randomized algorithms. We also show that minimizing the makespan is a considerably easier problem for which we give optimal deterministic and randomized online algorithms.
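    To make the test-versus-execute trade-off concrete, here is a minimal sketch of the model's cost accounting (the function name and dictionary layout are assumptions, and it simplifies by running every tested job immediately after its test): an untested job takes its upper limit, a tested job takes one extra unit plus its true, initially unknown time.

```python
def total_completion_time(jobs, tested, order):
    # jobs: name -> (upper_limit, true_time); tested: name -> bool.
    # Executes jobs in `order`, spending one unit on a test immediately
    # before running any tested job, and returns the sum of completion
    # times.  The paper's algorithms must choose `tested` online, before
    # the true times are revealed.
    t, total = 0, 0
    for j in order:
        upper, true = jobs[j]
        t += 1 + true if tested[j] else upper
        total += t
    return total
```

    For jobs = {'a': (5, 1), 'b': (3, 3)}, testing job a and running it first gives completion times 2 and 5 (total 7), whereas running both untested in the order b, a gives 3 and 8 (total 11).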

    Connectivity measures for internet topologies.

    The topology of the Internet has initially been modelled as an undirected graph, where vertices correspond to so-called Autonomous Systems (ASs), and edges correspond to physical links between pairs of ASs. However, in order to capture the impact of routing policies, it has recently become apparent that one needs to classify the edges according to the existing economic relationships (customer-provider, peer-to-peer or siblings) between the ASs. This leads to a directed graph model in which traffic can be sent only along so-called valley-free paths. Four different algorithms have been proposed in the literature for inferring AS relationships using publicly available data from routing tables. We investigate the differences in the graph models produced by these algorithms, focusing on connectivity measures. To this aim, we compute the maximum number of vertex-disjoint valley-free paths between ASs as well as the size of a minimum cut separating a pair of ASs. Although these problems are solvable in polynomial time for ordinary graphs, they are NP-hard in our setting. We formulate the two problems as integer programs, and we propose a number of exact algorithms for solving them. For the problem of finding the maximum number of vertex-disjoint paths, we discuss two algorithms; the first one is a branch-and-price algorithm based on the IP formulation, and the second algorithm is a non-LP-based branch-and-bound algorithm. For the problem of finding minimum cuts we use a branch-and-cut algorithm, based on the IP formulation of this problem. Using these algorithms, we obtain exact solutions for both problems in reasonable time. It turns out that there is a large gap in terms of the connectivity measures between the undirected and directed models. This finding supports our conclusion that economic relationships need to be taken into account when building a topology of the Internet.
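    The valley-free condition itself is easy to state and check: a path may climb customer-to-provider edges, cross at most one peer-to-peer edge at the top, and then descend provider-to-customer edges. A small sketch (the labels and function name are assumptions of this sketch, and sibling links, which the model above also allows, are omitted for brevity):

```python
def is_valley_free(path):
    # `path` lists the relationship of each edge along an AS path:
    # 'c2p' (customer -> provider), 'p2p' (peer-to-peer),
    # 'p2c' (provider -> customer).
    climbing = True  # still in the uphill (c2p) prefix of the path?
    for rel in path:
        if rel == 'c2p':
            if not climbing:
                return False  # no climbing after a p2p or p2c edge
        elif rel == 'p2p':
            if not climbing:
                return False  # at most one p2p edge, only at the top
            climbing = False
        elif rel == 'p2c':
            climbing = False
        else:
            raise ValueError(f"unknown relationship: {rel!r}")
    return True
```

    For instance, ['c2p', 'p2p', 'p2c'] is valley-free, while ['p2c', 'c2p'] is a valley and is rejected.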

    Wavelength Conversion in All-Optical Networks with Shortest-Path Routing

    We consider all-optical networks with shortest-path routing that use wavelength-division multiplexing and employ wavelength conversion at specific nodes in order to maximize their capacity usage. We present efficient algorithms for deciding whether a placement of wavelength converters allows the network to run at maximum capacity, and for finding an optimal wavelength assignment when such a placement of converters is known. Our algorithms apply to both undirected and directed networks. Furthermore, we show that the problem of designing such networks, i.e., finding an optimal placement of converters, is MAX SNP-hard in both the undirected and the directed case. Finally, we give a linear-time algorithm for finding an optimal placement of converters in undirected triangle-free networks, and show that the problem remains NP-hard in bidirected triangle-free planar networks.

    Effect and timing of operative treatment for teratoma associated N-Methyl-d-Aspartate receptor-antibody encephalitis: A systematic review with meta-analysis

    Resection of an underlying ovarian teratoma in patients with N-Methyl-d-Aspartate receptor (NMDAR)-antibody encephalitis is supported by pathophysiological studies demonstrating the production of NMDAR antibodies within the teratoma. This systematic review assesses the clinical effect of teratoma resection and compares early versus late resection. A literature search was performed on 1 October 2022 (MEDLINE, Embase, CENTRAL, Web of Science). Original studies including more than three patients with NMDAR encephalitis and associated ovarian teratoma were included and evaluated with the Study Quality Assessment Tool for risk of bias. Fourteen studies referring to 1499 patients were included and analyzed in four syntheses using the fixed Mantel-Haenszel method. The rate of relapse in patients with ovarian teratoma resection was lower than in patients without resection (risk ratio for relapse 0.30, 95% CI 0.17-0.51); however, the certainty of evidence is very low. Despite some evidence pointing to a beneficial effect of early teratoma resection in patients with NMDAR-antibody encephalitis, systematically accessible data are insufficient to provide recommendations for or against resection, or for the timing of surgery. The authors received no financial support for the research, authorship, or publication of this article. The systematic review was not registered in a clinical-trial database.

    DrugExBERT for Pharmacovigilance – A Novel Approach for Detecting Drug Experiences from User-Generated Content

    Pharmaceutical companies have to maintain drug safety through pharmacovigilance systems by monitoring various sources of information about adverse drug experiences. Recently, user-generated content (UGC) has emerged as a valuable source of real-world drug experiences, posing new challenges due to its high volume and variety. We present DrugExBERT, a novel approach to extract adverse drug experiences (adverse reaction, lack of effect) and supportive drug experiences (effectiveness, intervention, indication, and off-label use) from UGC. To be able to verify the extracted drug experiences, DrugExBERT additionally provides explications in the form of UGC phrases that were critical for the extraction. In our evaluation, we demonstrate that DrugExBERT outperforms state-of-the-art pharmacovigilance approaches as well as ChatGPT on several performance measures and that DrugExBERT is data- and drug-agnostic. Thus, our novel approach can help pharmaceutical companies meet their legal obligations and ethical responsibilities while ensuring patient safety and monitoring drug effectiveness.

    On a simple and accurate quantum correction for Monte Carlo simulation

    We investigate a quantum-correction method for Monte Carlo device simulation. The method consists of reproducing quantum-mechanical density-gradient simulation by classical drift-diffusion simulation with modified effective oxide thickness and work function, and using these modifications subsequently in Monte Carlo simulation. This approach is found to be highly accurate and can be used fully automatically in a technology computer-aided design (TCAD) workbench project. As an example, the methodology is applied to the Monte Carlo simulation of the on-current scaling in p- and n-type MOSFETs corresponding to a 65 nm node technology. In particular, it turns out that considering only the total threshold-voltage shift still leads to a significant difference from a Monte Carlo simulation based on the combined correction of oxide thickness and work function. Ultimately, this quantum correction makes it possible to treat surface scattering as a combination of specular and diffusive scattering, where the conservation of energy and parallel wave vector in the specular part takes stress-induced band-structure modifications, and hence the corresponding surface-mobility changes, into account on a physical basis.

    Which coping strategies and resources contribute to maintaining life satisfaction in geriatric patients?

    Background: Due to the growing number of older people in the population, gerontological research has so far focused mainly on individual factors meant to ensure successful ageing, so that older people with age-related impairments have received less attention. Given the increased likelihood of suffering physical and cognitive impairments in old age, research into preventive measures that help maintain life satisfaction in old age is of considerable importance. This thesis therefore focuses on coping strategies and resources that contribute to maintaining life satisfaction in people with age-related impairments and losses. Aim: The aim of this study was to investigate individual coping strategies and resources with regard to their contribution to maintaining life satisfaction in people with age-related impairments. Method: To assess the level of life satisfaction, as well as the influence of preventive and proactive coping strategies and individual resources on life satisfaction, 41 residents of the Geriatriezentrum Donaustadt and 43 people living at home and using various support services were interviewed with a standardized questionnaire. Results: Residents of the geriatric centre and people living at home do not differ in their level of life satisfaction. Both coping strategies and resources showed a significant influence on life satisfaction; preventive coping strategies had a stronger influence than proactive ones, and person-related resources had a stronger influence than environmental and social resources.
Conclusion: The level of life satisfaction of geriatric patients does not differ from that of people living at home who use support services. Preventive coping strategies and available person-related resources show a significant positive influence on the level of life satisfaction and can therefore be regarded as significant predictors of life satisfaction.

    Learning-Augmented Query Policies for Minimum Spanning Tree with Uncertainty

    We study how to utilize (possibly erroneous) predictions in a model for computing under uncertainty in which an algorithm can query unknown data. Our aim is to minimize the number of queries needed to solve the minimum spanning tree problem, a fundamental combinatorial optimization problem that has been central also to the research area of explorable uncertainty. For all integral γ ≄ 2, we present algorithms that are γ-robust and (1+1/γ)-consistent, meaning that they use at most γ·OPT queries if the predictions are arbitrarily wrong and at most (1+1/γ)·OPT queries if the predictions are correct, where OPT is the optimal number of queries for the given instance. Moreover, we show that this trade-off is best possible. Furthermore, we argue that a suitably defined hop distance is a useful measure for the amount of prediction error and design algorithms with performance guarantees that degrade smoothly with the hop distance. We also show that the predictions are PAC-learnable in our model. Our results demonstrate that untrusted predictions can circumvent the known lower bound of 2, without any degradation of the worst-case ratio. To obtain our results, we provide new structural insights for the minimum spanning tree problem that might be useful in the context of query-based algorithms regardless of predictions. In particular, we generalize the concept of witness sets - the key to lower-bounding the optimum - by proposing novel global witness set structures and completely new ways of adaptively using those.
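    The robustness-consistency trade-off described above can be tabulated directly. The helper name is an assumption of this sketch; the guarantee pair comes straight from the abstract (at most gamma·OPT queries with arbitrarily wrong predictions, at most (1 + 1/gamma)·OPT with correct ones).

```python
def query_guarantees(gamma):
    # For integral gamma >= 2, return the (robustness, consistency)
    # guarantee pair: gamma-robust and (1 + 1/gamma)-consistent.
    if int(gamma) != gamma or gamma < 2:
        raise ValueError("gamma must be an integer >= 2")
    return gamma, 1 + 1 / gamma
```

    Raising gamma expresses more trust in the predictions: gamma = 2 gives (2, 1.5) and gamma = 4 gives (4, 1.25), so consistency approaches 1 while the worst-case (robustness) factor grows.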
    • 
