
    Node-Weighted Prize Collecting Steiner Tree and Applications

    The Steiner Tree problem appeared on Karp's list of 21 NP-complete problems and is one of the most fundamental problems in network design. We study the node-weighted version of the Prize-Collecting Steiner Tree problem. In this problem, we are given a simple graph with a cost and a penalty value associated with each node, and our goal is to find a subtree T of the graph minimizing the cost of the nodes in T plus the penalties of the nodes not in T. By a reduction from the Set Cover problem, it can easily be shown that the problem cannot be approximated in polynomial time within a factor of (1-o(1)) ln n unless NP has quasi-polynomial time algorithms, where n is the number of vertices of the graph. Moss and Rabani claimed an O(log n)-approximation algorithm for the problem using a primal-dual approach in their STOC'01 paper \cite{moss2001}. We show that their algorithm is incorrect by providing a counterexample in which there is an O(n) gap between the dual solution constructed by their algorithm and the optimal solution. Further, we give evidence that their algorithm probably does not admit a simple fix. We propose a new algorithm that is more involved and introduces novel ideas in the primal-dual approach to network design problems. Our algorithm is also Lagrangian Multiplier Preserving, and we show how this property can be exploited to design an O(log n)-approximation algorithm for the Node-Weighted Quota Steiner Tree problem via the Lagrangian relaxation method. We also show an application of the Node-Weighted Quota Steiner Tree problem in designing an algorithm with a better approximation factor for the Technology Diffusion problem, proposed by Goldberg and Liu in \cite{goldberg2012} (SODA 2013). In Technology Diffusion, we are given a graph G and a threshold θ(v) associated with each vertex v, and we seek a set of initial nodes called the seed set.
    Technology Diffusion is a dynamic process defined over time in which each vertex is either active or inactive. The vertices in the seed set are initially active, and every other vertex v becomes active whenever there are at least θ(v) active nodes connected to v through other active nodes. The Technology Diffusion problem asks for a minimum-size seed set that activates all nodes. Goldberg and Liu gave an O(rl log n)-approximation algorithm for the problem, where r and l are the diameter of G and the number of distinct threshold values, respectively. We improve the approximation factor to O(min{r, l} log n) by establishing a close connection between the problem and the Node-Weighted Quota Steiner Tree problem.
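    The activation rule above can be sketched directly: a vertex joins the active set once at least θ(v) active vertices are reachable from it through other active vertices. The following is an illustrative simulation of the diffusion process, not the authors' algorithm; the graph, thresholds, and function names are invented for the example.

```python
from collections import deque

def diffuse(adj, theta, seed):
    """Simulate Technology Diffusion (illustrative sketch).

    adj   : adjacency list {vertex: [neighbors]}
    theta : {vertex: threshold}
    seed  : initially active vertices

    An inactive vertex v activates once at least theta[v] active
    vertices are reachable from v through active vertices.
    """
    active = set(seed)
    changed = True
    while changed:
        changed = False
        for v in adj:
            if v in active:
                continue
            # BFS from v's active neighbors, moving only through active vertices
            seen = set(u for u in adj[v] if u in active)
            queue = deque(seen)
            while queue:
                u = queue.popleft()
                for w in adj[u]:
                    if w in active and w not in seen:
                        seen.add(w)
                        queue.append(w)
            if len(seen) >= theta[v]:
                active.add(v)
                changed = True
    return active
```

    On a path 1-2-3-4 with thresholds (1, 1, 2, 3), the seed {1} already activates everything: once 2 and 3 are active, vertex 4 can reach three active vertices through them.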

    C1-continuous space-time discretization based on Hamilton's law of varying action

    We develop a class of C1-continuous time integration methods that are applicable to conservative problems in elastodynamics. These methods are based on Hamilton's law of varying action. From the action of the continuous system we derive a spatially and temporally weak form of the governing equilibrium equations. This expression is first discretized in space using standard finite elements. The resulting system is then discretized in time, approximating the displacement by piecewise cubic Hermite shape functions. Within the time domain we thus achieve C1-continuity for the displacement field and C0-continuity for the velocity field. From the discrete virtual action we finally construct a class of one-step schemes. These methods are examined both analytically and numerically. We study both linear and nonlinear systems as well as inherently continuous and discrete structures. The numerical examples focus on one-dimensional applications; the provided theory, however, is general and also valid for problems in 2D and 3D. We show that the most favorable candidate -- denoted as the p2-scheme -- converges with order four. Thus, especially if high accuracy of the numerical solution is required, this scheme can be more efficient than methods of lower order. For simple linear problems it further exhibits properties similar to variational integrators, such as symplecticity. While it remains to be investigated whether symplecticity holds for arbitrary systems, all our numerical results show excellent long-term energy behavior.
    Comment: slightly condensed the manuscript, added references; numerical results unchanged
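    The piecewise cubic Hermite approximation behind the C1-continuity can be illustrated in a few lines. The basis below is the standard cubic Hermite one on a reference time step, interpolating end displacements and end velocities; it is a generic sketch, not code from the paper.

```python
def hermite_basis(s):
    """Standard cubic Hermite shape functions on the reference
    interval s in [0, 1].  H1, H3 interpolate the end values;
    H2, H4 interpolate the end derivatives (scaled by the step
    size), which is what yields C1-continuity across steps."""
    H1 = 1 - 3*s**2 + 2*s**3
    H2 = s - 2*s**2 + s**3
    H3 = 3*s**2 - 2*s**3
    H4 = -s**2 + s**3
    return H1, H2, H3, H4

def hermite_interp(s, u0, v0, u1, v1, dt):
    """Displacement within a time step of length dt, given end
    displacements u0, u1 and end velocities v0, v1."""
    H1, H2, H3, H4 = hermite_basis(s)
    return H1*u0 + dt*H2*v0 + H3*u1 + dt*H4*v1
```

    By construction the interpolant matches u0, u1 at the step ends and its time derivative matches v0, v1 there, so chaining steps leaves displacement and velocity continuous.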

    The main transition in the Pink membrane model: finite-size scaling and the influence of surface roughness

    We consider the main transition in single-component membranes using computer simulations of the Pink model [D. Pink {\it et al.}, Biochemistry {\bf 19}, 349 (1980)]. We first show that the accepted parameters of the Pink model yield a main transition temperature that is systematically below experimental values. This resolves an issue first pointed out by Corvera and co-workers [Phys. Rev. E {\bf 47}, 696 (1993)]. To yield the correct transition temperature, the strength of the van der Waals coupling in the Pink model must be increased; using finite-size scaling, a set of optimal values is proposed. We also provide finite-size scaling evidence that the Pink model belongs to the universality class of the two-dimensional Ising model. This finding holds irrespective of the number of conformational states. Finally, we address the main transition in the presence of quenched disorder, which may arise when the membrane is deposited on a rough support. In this case, we observe a stable multi-domain structure of gel and fluid domains and the absence of a sharp transition in the thermodynamic limit.
    Comment: submitted to PR
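    A standard diagnostic behind such finite-size scaling arguments is the fourth-order Binder cumulant: curves of the cumulant versus temperature for different system sizes cross near the transition, and its value at criticality is universal within a universality class. The sketch below is a generic illustration of that estimator, not the simulation code used in the paper.

```python
import statistics

def binder_cumulant(samples):
    """Fourth-order Binder cumulant U4 = 1 - <m^4> / (3 <m^2>^2)
    computed from samples of an order parameter m (e.g. the
    magnetization-like order parameter of a lattice model).
    For a distribution concentrated at +/-m0, U4 -> 2/3; for a
    Gaussian (disordered phase), U4 -> 0."""
    m2 = statistics.fmean(m**2 for m in samples)
    m4 = statistics.fmean(m**4 for m in samples)
    return 1 - m4 / (3 * m2**2)
```

    In practice one records the order parameter along the Monte Carlo trajectory for several lattice sizes and locates the crossing of the resulting U4(T) curves.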

    Resource Management From Single-domain 5G to End-to-End 6G Network Slicing: A Survey

    Network Slicing (NS) is one of the pillars of the fifth/sixth generation (5G/6G) of mobile networks. It provides the means for Mobile Network Operators (MNOs) to leverage physical infrastructure across different technological domains to support different applications. This survey analyzes the progress made on NS resource management across these domains, with a focus on the interdependence between domains and unique issues that arise in cross-domain and End-to-End (E2E) settings. Based on a generic problem formulation, NS resource management functionalities (e.g., resource allocation and orchestration) are examined across domains, revealing their limits when applied separately per domain. The appropriateness of different problem-solving methodologies is critically analyzed, and practical insights are provided, explaining how resource management should be rethought in cross-domain and E2E contexts. Furthermore, the latest advancements are reported through a detailed analysis of the most relevant research projects and experimental testbeds. Finally, the core issues facing NS resource management are dissected, and the most pertinent research directions are identified, providing practical guidelines for new researchers.

    Cost Index Analysis for Reinforced Concrete Work Using the SNI 7394-2008 Method and the Field Method (Case Study: STIKES CHMK Dormitory Construction Project, Phase III)

    The cost index affects the unit price of construction work. The cost index used in unit-price analysis refers to the Standar Nasional Indonesia (SNI), which describes average labor productivity in Indonesia. Labor productivity differs with work experience, cultural origin, and other factors. This study was conducted to determine the labor cost indices in Kupang through a case study of the third phase of the STIKES CHMK Dormitory Construction Project. The cost indices for reinforced concrete work in this project were obtained by directly observing the number of laborers and the time required to complete each reinforced concrete work item, specifically columns, beams, and slabs, from rebar work and formwork through casting to formwork removal. The observations were then analyzed descriptively. Based on the analysis, the cost indices are 0.0208 foreman : 0.0377 head of craftsmen : 0.09929 craftsman : 0.2502 worker to install 1 m2 of formwork; 0.0044 foreman : 0.0177 head of craftsmen : 0.0268 craftsman : 0.0796 worker to work 10 kg of rebar; and 0.0340 foreman : 0.0272 head of craftsmen : 0.1427 craftsman : 1.1888 worker to produce 1 m3 of concrete. These indices were used to analyze the difference in labor percentages between the SNI and field methods, followed by the calculation of the unit price of each work item using both methods.
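    A cost index of this kind amounts to person-days of one labor type per unit of work (per m2 of formwork, per 10 kg of rebar, per m3 of concrete). A minimal sketch of the arithmetic, with made-up numbers rather than the study's field data:

```python
def labor_cost_index(workers, days, quantity):
    """Labor cost index as used in SNI-style unit-price analysis:
    person-days of a given labor type per unit of completed work.
    Inputs are illustrative observations, not the study's data."""
    return workers * days / quantity

# Hypothetical observation: 2 workers take 3 days to install 24 m2
# of formwork -> 0.25 worker-days per m2 of formwork.
index = labor_cost_index(2, 3, 24)
```

    Multiplying such an index by the daily wage of that labor type and summing over labor types gives the labor component of the unit price.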

    Evaluating Catchment Models as Multiple Working Hypotheses: on the Role of Error Metrics, Parameter Sampling, Model Structure, and Data Information Content

    To evaluate models as hypotheses, we developed the method of Flux Mapping to construct a hypothesis space based on dominant runoff generating mechanisms. Acceptable model runs, defined as total simulated flow with similar (and minimal) model error, are mapped to the hypothesis space given their simulated runoff components. In each modeling case, the hypothesis space is the result of an interplay of factors: model structure and parameterization, the chosen error metric, and data information content. The aim of this study is to disentangle the role of each factor in model evaluation. We used two model structures (SACRAMENTO and SIMHYD), two parameter sampling approaches (Latin Hypercube Sampling of the parameter space and guided search of the solution space), three widely used error metrics (Nash-Sutcliffe Efficiency - NSE, Kling-Gupta Efficiency skill score - KGEss, and Willmott's refined Index of Agreement - WIA), and hydrological data from a large sample of Australian catchments. First, we characterized how the three error metrics behave under different error types and magnitudes, independent of any modeling. We then conducted a series of controlled experiments to unpack the role of each factor in runoff generation hypotheses. We show that KGEss is a more reliable metric than NSE and WIA for model evaluation. We further demonstrate that changing only the error metric -- while all other factors remain constant -- can change the model solution space and hence vary model performance, parameter sampling sufficiency, and/or the flux map. We show how unreliable error metrics and insufficient parameter sampling impair model-based inferences, particularly runoff generation hypotheses.
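    For reference, the two most common of these metrics can be written in a few lines. This is a generic sketch of NSE and plain KGE; note that the paper uses the KGE skill score (KGEss), which rebenchmarks KGE against a reference, so the exact formulation there differs.

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 minus model error variance over
    the variance of the observations.  1 is a perfect fit; 0 means
    the model does no better than the observed mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s)**2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs)**2 for o in obs)
    return 1 - num / den

def kge(obs, sim):
    """Kling-Gupta Efficiency: combines linear correlation r,
    variability ratio alpha, and bias ratio beta:
    KGE = 1 - sqrt((r-1)^2 + (alpha-1)^2 + (beta-1)^2)."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    so = math.sqrt(sum((o - mo)**2 for o in obs) / n)
    ss = math.sqrt(sum((s - ms)**2 for s in sim) / n)
    r = sum((o - mo) * (s - ms) for o, s in zip(obs, sim)) / (n * so * ss)
    alpha, beta = ss / so, ms / mo
    return 1 - math.sqrt((r - 1)**2 + (alpha - 1)**2 + (beta - 1)**2)
```

    The decomposition in KGE is what makes it behave differently from NSE under bias and variability errors, which is exactly the kind of metric-dependence the study probes.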

    Histogram- and Diffusion-Based Medical Out-of-Distribution Detection

    Out-of-distribution (OOD) detection is crucial for the safety and reliability of artificial intelligence algorithms, especially in the medical domain. In the context of the Medical OOD (MOOD) detection challenge 2023, we propose a pipeline that combines a histogram-based method and a diffusion-based method. The histogram-based method is designed to accurately detect homogeneous anomalies in the toy examples of the challenge, such as blobs with constant intensity values. The diffusion-based method builds on one of the latest methods for unsupervised anomaly detection, DDPM-OOD. We explore this method and propose extensive post-processing steps for pixel-level and sample-level anomaly detection on brain MRI and abdominal CT data provided by the challenge. Our results show that the proposed DDPM method is sensitive to blur and bias field samples, but faces challenges with anatomical deformation, black slices, and swapped patches. These findings suggest that further research is needed to improve the performance of DDPM for OOD detection in medical images.
    Comment: 9 pages, 5 figures, submission to the Medical Out-of-Distribution (MOOD) challenge at MICCAI 2023
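    The intuition behind the histogram-based branch, flagging homogeneous blobs of constant intensity, can be sketched with a toy score: if most of the pixel mass lands in a single intensity bin, the sample is suspicious. This is an illustrative stand-in, not the challenge pipeline; the function name, bin count, and decision rule are invented.

```python
def histogram_ood_score(pixels, bins=64, lo=0.0, hi=1.0):
    """Toy histogram-based OOD score for homogeneous anomalies:
    the fraction of pixels falling into the single most populated
    intensity bin.  A score near 1.0 indicates a near-constant
    (homogeneous) intensity region; a natural image spreads its
    mass over many bins and scores low."""
    counts = [0] * bins
    for p in pixels:
        i = min(bins - 1, int((p - lo) / (hi - lo) * bins))
        counts[i] += 1
    return max(counts) / len(pixels)
```

    A real pipeline would compute such statistics per region and calibrate a threshold on in-distribution data; the point here is only the shape of the signal.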

    Case-Based Reasoning for Diagnosing Children's Diseases Using the Block City Method

    Case-Based Reasoning (CBR) is a method for building a system that diagnoses new cases based on old cases that have occurred, providing a solution to a new case from the old case with the highest similarity value. In this study, the authors apply CBR to diagnose diseases of children aged 1-12 years. The system's knowledge was obtained by collecting patient medical record files from 2014 and 2015. Similarity values are calculated with the Block City (Gower) method using a fairness threshold of 70%. The system can diagnose 10 illnesses based on 48 known symptoms. Its output is the illness experienced by the patient based on symptoms entered by non-physician medical personnel, a treatment solution, and the percentage similarity to the previous case, indicating the confidence of the diagnosis. In tests on 83 new cases, the system achieved an accuracy of 75.90%.
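    The retrieval step described above can be sketched as a weighted city-block (Manhattan) similarity over symptom vectors, with the 70% fairness value as the acceptance threshold. Function names, weights, and cases below are illustrative, not the paper's implementation.

```python
def city_block_similarity(case_a, case_b, weights):
    """Gower-style similarity from a weighted city-block (Manhattan)
    distance over binary symptom vectors; 1.0 means identical cases."""
    dist = sum(w * abs(a - b) for w, a, b in zip(weights, case_a, case_b))
    return 1 - dist / sum(weights)

def diagnose(new_case, case_base, weights, threshold=0.70):
    """Return the stored diagnosis with the highest similarity to the
    new case, provided it clears the fairness threshold."""
    best = max(case_base,
               key=lambda c: city_block_similarity(new_case, c["symptoms"], weights))
    sim = city_block_similarity(new_case, best["symptoms"], weights)
    return (best["illness"], sim) if sim >= threshold else (None, sim)
```

    With equal weights, a new case matching three of four symptoms of a stored case scores 0.75 and is accepted; a case that no stored case matches above 70% is returned as undiagnosed.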