Distributed Computing in the Asynchronous LOCAL model
The LOCAL model is among the main models for studying locality in the
framework of distributed network computing. This model is however subject to
pertinent criticisms, including the facts that all nodes wake up
simultaneously, operate in lockstep, and are failure-free. We show that
relaxing these hypotheses to some extent does not hurt local computing. In
particular, we show that, for any construction task associated to a locally
checkable labeling (LCL), if the task is solvable in $t$ rounds in the LOCAL model,
then it remains solvable in $O(t)$ rounds in the asynchronous LOCAL model.
This improves the result by Castañeda et al. [SSS 2016], which was restricted
to 3-coloring rings. More generally, the main contribution of this paper is
to show that, perhaps surprisingly, asynchrony and failures in the computations
do not restrict the power of the LOCAL model, as long as the communications
remain synchronous and failure-free.
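To make the notion of a locally checkable labeling concrete, here is a minimal sketch
(our illustration, not taken from the paper) of a local verifier for one of the tasks
mentioned above, proper 3-coloring of a ring: validity can be checked by inspecting
each node's radius-1 neighborhood only.

```python
# Minimal sketch: a locally checkable labeling (LCL) is one whose validity
# each node can verify by looking only at a constant-radius neighborhood.
# Hypothetical example: proper 3-coloring of a ring.

def is_valid_3_coloring(ring, coloring):
    """ring: node ids in cyclic order; coloring: dict node -> color in {0, 1, 2}."""
    n = len(ring)
    for i, v in enumerate(ring):
        left, right = ring[(i - 1) % n], ring[(i + 1) % n]
        # Each node checks only itself and its two neighbors (radius 1).
        if coloring[v] not in (0, 1, 2):
            return False
        if coloring[v] == coloring[left] or coloring[v] == coloring[right]:
            return False
    return True
```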
Dynamic and Multi-functional Labeling Schemes
We investigate labeling schemes supporting adjacency, ancestry, sibling, and
connectivity queries in forests. In the course of more than 20 years, the
existence of labeling schemes supporting each of these
functions was proven, with the most recent being ancestry [Fraigniaud and
Korman, STOC '10]. Several multi-functional labeling schemes also come with
lower or upper bounds; notably, an upper bound for adjacency+siblings and a
lower bound for each of the functions siblings, ancestry, and connectivity
[Alstrup et al., SODA '03]. We improve the constants hidden in the asymptotic
notation. In particular, we show an improved lower bound for connectivity+ancestry
and connectivity+siblings, as well as an improved upper bound for
connectivity+adjacency+siblings, obtained by altering existing methods.
In the context of dynamic labeling schemes it is known that ancestry requires
labels of $\Omega(n)$ bits [Cohen et al., PODS '02]. In contrast, we show upper and lower
bounds on the label size for adjacency, siblings, and connectivity, as well as
bounds for supporting all three functions simultaneously. There exist efficient
adjacency labeling schemes for planar, bounded-treewidth, bounded-arboricity,
and interval graphs. In a dynamic setting, we show a lower bound for each of
those families.
Comment: 17 pages, 5 figures
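For intuition about labeling schemes in forests (this is a textbook construction, not
the compressed schemes studied above), ancestry can be supported by labeling each node
with its DFS entry and exit timestamps; a node u is an ancestor of v exactly when u's
interval contains v's.

```python
# Sketch of the classic interval-based ancestry labeling for a forest
# (textbook scheme with roughly 2*log(n)-bit labels, shown only for intuition).

def label_forest(roots, children):
    """children: dict node -> list of child nodes; returns dict node -> (enter, exit)."""
    labels, clock = {}, 0

    def dfs(u):
        nonlocal clock
        enter = clock
        clock += 1
        for c in children.get(u, []):
            dfs(c)
        labels[u] = (enter, clock)
        clock += 1

    for r in roots:
        dfs(r)
    return labels

def is_ancestor(labels, u, v):
    """True iff u is an ancestor of v (a node counts as its own ancestor)."""
    return labels[u][0] <= labels[v][0] and labels[v][1] <= labels[u][1]
```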
Distributed Exact Shortest Paths in Sublinear Time
The distributed single-source shortest paths problem is one of the most
fundamental and central problems in message-passing distributed computing.
The classical Bellman-Ford algorithm solves it in $O(n)$ rounds, where $n$ is the
number of vertices in the input graph $G$. Peleg and Rubinovich (FOCS'99)
showed a lower bound of $\tilde{\Omega}(\sqrt{n} + D)$ for this problem, where $D$
is the hop-diameter of $G$.
Whether or not this problem can be solved in $o(n)$ time when $D$ is
relatively small is a major notorious open question. Despite intensive research
\cite{LP13,N14,HKN15,EN16,BKKL16} that yielded near-optimal algorithms for the
approximate variant of this problem, no progress was reported for the original
problem.
In this paper we answer this question in the affirmative. We devise an
algorithm whose running time is sublinear in $n$ for small hop-diameter $D$, and
degrades gracefully as $D$ grows. The running time remains sublinear in $n$ in
almost the entire range of parameters, specifically, whenever $D$ is moderately
smaller than $n$. For the all-pairs shortest paths problem, our algorithm's
running time does not depend on the value of $D$.
We also devise the first algorithm with non-trivial complexity guarantees for
computing exact shortest paths in the multipass semi-streaming model of
computation.
From the technical viewpoint, our algorithm computes a hopset of a
skeleton graph of $G$ without first computing the skeleton itself. We then conduct
a Bellman-Ford exploration in the skeleton augmented with the hopset, while
computing the required edges of the skeleton on the fly. As a result, our algorithm
computes exactly those edges of the skeleton that it really needs, rather than
computing (approximately) the entire skeleton.
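For context, the classical baseline mentioned above works as follows (a minimal
simulation sketch of synchronous Bellman-Ford, not the paper's sublinear algorithm):
in every round each vertex sends its tentative distance to its neighbors, so after h
rounds all vertices within h hops of the source hold exact distances.

```python
# Minimal sketch of synchronous Bellman-Ford from a source s, simulated centrally
# (the classical O(n)-round baseline, not the paper's sublinear-time algorithm).

import math

def bellman_ford_rounds(adj, s, rounds):
    """adj: dict u -> list of (neighbor, edge_weight); runs `rounds` communication rounds."""
    dist = {u: math.inf for u in adj}
    dist[s] = 0.0
    for _ in range(rounds):
        # One communication round: every vertex announces its tentative distance.
        updates = {}
        for u in adj:
            if dist[u] < math.inf:
                for v, w in adj[u]:
                    cand = dist[u] + w
                    if cand < updates.get(v, math.inf):
                        updates[v] = cand
        for v, d in updates.items():
            if d < dist[v]:
                dist[v] = d
    # After h rounds, dist[v] is exact for every v within h hops of s.
    return dist
```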
Reconceptualising the Child's Right to Development: Children and the Capability Approach
The article proposes adopting the Capability Approach as a theoretical framework to analyse
the child's right to development. Currently, the child's right to development is realised as the
child's right to become an adult. This interpretation is problematic on several grounds, primarily
its use of developmental psychology as an underlying narrative to conceptualise childhood
and interpret children's rights, and its lack of respect for children's agency. Using
the Capability Approach's conception of 'human development' as an alternative framework
can change the way in which childhood and children's development are conceptualised and,
consequently, change the interpretation of the child's right to development. It can simultaneously
accommodate care for the child's future and the child's life in the present; promote
respect for a child's agency and active participation in her own growth; and lay the foundations
for developing concrete measures of implementation.
The Child's Right to Development
Protecting children's development is a key principle of international children's rights law. However, while the meanings of children's development are a central concern of disciplines such as psychology, sociology, neurology and pedagogy, so far there has been no systematic analysis of the meaning of the child's legal right to development. This thesis remedies this significant gap in our knowledge by establishing the foundations for analysing the child's right to development, as protected by the UN Convention on the Rights of the Child. Interpreting the child's right to development first requires unpacking the meaning of the term 'children's development'. In international children's rights law, the thesis argues that the meaning of this term derives from the concept of children as 'human becomings'. The focal point of this concept is the protection of children's socio-psychological development and caring for their future, as adults. Consequently, the UN Convention on the Rights of the Child provides a broad protection for eight segments of children's development, on top of protecting children's overall right to development. Based on an analysis of the UN Committee on the Rights of the Child's jurisprudence between the years 1993 and 2010, the thesis concludes that the Committee interprets the Convention in a way that subjugates most of the Convention's rights to the protection of children's socio-psychological development, while overlooking the formulation of 'development' as a human right. Based on literature on childhood studies, children's rights theory, children's development, the Capability Approach, archival research of the drafting process of the Convention, the jurisprudence of the UN Committee on the Rights of the Child, and interviews with members of the UN Committee, the thesis challenges this absorption of 'children's development' into legal terms, and suggests a new framework for analysis. This framework accommodates a hybrid conception of childhood, a respect for children's agency, recognition of the importance of the process of maturation ('development') as well as its outcome, and a cross-disciplinary understanding of 'development'. Under the suggested framework, the child's right to development is interpreted as a composite right that aims to ensure the child's abilities to fulfill her or his human potential to the maximum during childhood and adulthood alike.
How Long It Takes for an Ordinary Node with an Ordinary ID to Output?
In the context of distributed synchronous computing, processors perform in
rounds, and the time-complexity of a distributed algorithm is classically
defined as the number of rounds before all computing nodes have output. Hence,
this complexity measure captures the running time of the slowest node(s). In
this paper, we are interested in the running time of the ordinary nodes, to be
compared with the running time of the slowest nodes. The node-averaged
time-complexity of a distributed algorithm on a given instance is defined as
the average, taken over every node of the instance, of the number of rounds
before that node outputs. We compare the node-averaged time-complexity with the
classical one in the standard LOCAL model for distributed network computing. We
show that there can be an exponential gap between the node-averaged
time-complexity and the classical time-complexity, as witnessed by, e.g.,
leader election. Our first main result is a positive one, stating that, in
fact, the two time-complexities behave the same for a large class of problems
on very sparse graphs. In particular, we show that, for LCL problems on cycles,
the node-averaged time complexity is of the same order of magnitude as the
slowest node time-complexity.
In addition, in the LOCAL model, the time-complexity is computed as a worst
case over all possible identity assignments to the nodes of the network. In
this paper, we also investigate the ID-averaged time-complexity, when the
number of rounds is averaged over all possible identity assignments. Our second
main result is that the ID-averaged time-complexity is essentially the same as
the expected time-complexity of randomized algorithms (where the expectation is
taken over all possible random bits used by the nodes, and the number of rounds
is measured for the worst-case identity assignment).
Finally, we study the node-averaged ID-averaged time-complexity.
Comment: (Submitted) Journal version
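In symbols (our notation, not necessarily the paper's), the measures compared above can
be written as follows, where $T_v(A, G, \mathrm{id})$ denotes the round in which node $v$
outputs when algorithm $A$ runs on network $G$ under identity assignment $\mathrm{id}$.

```latex
% Our notation, not necessarily the paper's.
\[
  \text{classical (worst-case) complexity:}\quad
  T(A, G, \mathrm{id}) \;=\; \max_{v \in V(G)} T_v(A, G, \mathrm{id}),
\]
\[
  \text{node-averaged complexity:}\quad
  \overline{T}_{\mathrm{node}}(A, G, \mathrm{id}) \;=\;
  \frac{1}{|V(G)|} \sum_{v \in V(G)} T_v(A, G, \mathrm{id}),
\]
\[
  \text{ID-averaged complexity:}\quad
  \overline{T}_{\mathrm{ID}}(A, G) \;=\;
  \mathbb{E}_{\mathrm{id}}\!\left[\, \max_{v \in V(G)} T_v(A, G, \mathrm{id}) \right],
\]
% where the expectation is over a uniformly random valid identity assignment.
```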
Exact bounds for distributed graph colouring
We prove exact bounds on the time complexity of distributed graph colouring.
If we are given a directed path that is properly coloured with $n$ colours, by
prior work it is known that we can find a proper 3-colouring in
$\frac{1}{2}\log^*(n) \pm O(1)$ communication rounds. We close the gap between upper and
lower bounds: we show that for infinitely many $n$ the time complexity is
precisely $\frac{1}{2}\log^*(n)$ communication rounds.
Comment: 16 pages, 3 figures
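For background, the upper bound in this setting comes from iterated colour reduction on a
directed path; below is a minimal sketch of one Cole-Vishkin-style reduction step (a
standard technique, not the paper's tight algorithm). Each node learns its predecessor's
colour in one communication round and recolours itself based on the lowest bit position
in which the two colours differ.

```python
# One Cole-Vishkin-style colour-reduction step on a directed path
# (standard background technique; the paper's contribution is the exact bound).
# New colour = 2*i + b, where i is the lowest bit index in which the node's colour
# differs from its predecessor's, and b is the node's own bit at that index.
# The head of the path (no predecessor) is handled separately.

def reduce_step(own, pred):
    """own, pred: distinct non-negative integer colours; returns the node's new colour."""
    diff = own ^ pred
    i = (diff & -diff).bit_length() - 1   # index of the lowest differing bit
    b = (own >> i) & 1
    return 2 * i + b
```

Iterating this step shrinks an initial $n$-colouring to a constant number of colours in
$O(\log^* n)$ rounds, after which a constant number of additional rounds brings it down to
3 colours; this is the task whose exact round complexity the paper determines.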
An ecohydrological journey of 4500 years reveals a stable but threatened precipitation–groundwater recharge relation around Jerusalem
Groundwater is a key water resource in semiarid and seasonally dry regions around the world, which is replenished
by intermittent precipitation events and mediated by vegetation, soil, and regolith properties. Here, a climate
reconstruction of 4500 years for the Jerusalem region was used to determine the relation between climate, vegetation,
and groundwater recharge. Despite changes in air temperature and vegetation characteristics, simulated recharge
remained linearly related to precipitation over the entire analyzed period, with drier decades having lower rates
of recharge for a given annual precipitation due to soil memory effects. We show that in recent decades, the lack of
changes in the precipitation–groundwater recharge relation results from the compensating responses of vegetation
to increasing CO2, i.e., increased leaf area and reduced stomatal conductance. This multicentury relation is
expected to be modified by climate change, with changes of up to −20% in recharge for unchanged precipitation,
potentially jeopardizing water resource availability.
Budgeted Dominating Sets in Uncertain Graphs
We study the Budgeted Dominating Set (BDS) problem on uncertain graphs, namely, graphs with a probability distribution p associated with the edges, such that an edge e exists in the graph with probability p(e). The input to the problem consists of a vertex-weighted uncertain graph G = (V, E, p, ω) and an integer budget (or solution size) k, and the objective is to compute a vertex set S of size k that maximizes the expected total domination (or total weight) of vertices in the closed neighborhood of S. We refer to the problem as the Probabilistic Budgeted Dominating Set (PBDS) problem. In this article, we present the following results on the complexity of the PBDS problem.
1) We show that the PBDS problem is NP-complete even when restricted to uncertain trees of diameter at most four. This is in sharp contrast with the well-known fact that the BDS problem is solvable in polynomial time on trees. We further show that PBDS is W[1]-hard for the budget parameter k, and that under the Exponential Time Hypothesis it cannot be solved in n^o(k) time.
2) We show that if one is willing to settle for a (1-ε) approximation, then there exists a PTAS for PBDS on trees. Moreover, for the scenario of uniform edge probabilities, the problem can be solved optimally in polynomial time.
3) We consider the parameterized complexity of the PBDS problem, and show that Uni-PBDS (where all edge probabilities are identical) is W[1]-hard for the parameter pathwidth. On the other hand, we show that it is FPT in the combined parameters of the budget k and the treewidth.
4) Finally, we extend some of our parameterized results to planar and apex-minor-free graphs.
Our first hardness proof (Thm. 1) makes use of the new problem of k-Subset Σ-Π Maximization (k-SPM), which we believe is of independent interest. We prove its NP-hardness by a reduction from the well-known k-SUM problem, establishing a close relationship between the two problems.
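To illustrate the objective being maximized, here is a minimal sketch (our own
illustration, assuming edges are present independently with probability p(e), and
writing w for the vertex weights, ω above) of the expected total weight dominated by a
chosen set S.

```python
# Expected total domination of a set S in an uncertain graph (illustration only,
# assuming independent edge presence with probability p(e)).
# A vertex v is dominated if v is in S, or if at least one edge between v and S exists.

def expected_domination(vertices, w, p, S):
    """vertices: iterable of nodes; w: dict node -> weight;
    p: dict frozenset({u, v}) -> existence probability; S: set of chosen vertices."""
    total = 0.0
    for v in vertices:
        if v in S:
            prob_dominated = 1.0
        else:
            prob_no_edge_to_S = 1.0
            for u in S:
                prob_no_edge_to_S *= 1.0 - p.get(frozenset((u, v)), 0.0)
            prob_dominated = 1.0 - prob_no_edge_to_S
        total += w[v] * prob_dominated
    return total
```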