Theory of Change
Horizon 2045 (H2045) is a 25-year initiative to end the nuclear weapons century.
We urgently need to manage the intertwined existential risks of the Anthropocene—the geological era that began with the 1945 Trinity Test and is characterized by humankind’s newfound capacity to destroy itself along with all life on the planet.
Recent research has shown that concerns about existential threats have become palpable, as has the desire to solve human-made problems and move to a brighter future. This offers an important opportunity: By considering nuclear weapons in the context of other dangers, we can dismantle the conventional wisdom that nuclear weapons are tools for maintaining global stability, drawing new energy to the effort to rid ourselves of them.
What makes H2045 unique is that we bring a new theory of change. Rather than centering solely on nuclear weapons, our theory of change creates common ground for organizations and thought leaders who share our vision: Humanity can, and will, move beyond the existential challenges we now face. By widening our focus from nuclear challenges alone to a broader conception of global security, we increase the surface area for collaboration and shared learning. In so doing, we lay the groundwork for a much larger-scale effort.
This document is an invitation to think with us. It is the product of a collaborative effort. It is a snapshot of a work in process. It raises more questions than it answers. It is intended to shake the current paradigm. It uses speculative techniques to bring alternate futures to life. It may cause discomfort. It may cause inspiration.
We think this kind of work is important in shaping debates, changing narratives, and provoking change. We invite you to use this document as a jumping off point for thinking big and long term. It does not need to be read all at once. You may skip to the section that seems most intriguing and start there. What questions does it raise for you? What questions remain to be asked and answered? What answers might you have?
There is a great deal that must be done. In our next phase we will be working to translate these insights into pragmatic solutions. Inspiration and vision light the way for that journey. H2045 will expand to include others in the development of a vision that inspires change.
Algorithms and Bounds for Very Strong Rainbow Coloring
A well-studied coloring problem is to assign colors to the edges of a graph
so that, for every pair of vertices, all edges of at least one shortest
path between them receive different colors. The minimum number of colors
necessary in such a coloring is the strong rainbow connection number
(\src(G)) of the graph. When proving upper bounds on \src(G), it is natural
to prove that a coloring exists where, for \emph{every} shortest path between
every pair of vertices in the graph, all edges of the path receive different
colors. Therefore, we introduce and formally define this more restricted edge
coloring number, which we call \emph{very strong rainbow connection number}
(\vsrc(G)).
In this paper, we give upper bounds on \vsrc(G) for several graph classes,
some of which are tight. These immediately imply new upper bounds on \src(G)
for these classes, showing that the study of \vsrc(G) enables meaningful
progress on bounding \src(G). Then we study the complexity of the problem to
compute \vsrc(G), particularly for graphs of bounded treewidth, and show this
is an interesting problem in its own right. We prove that \vsrc(G) can be
computed in polynomial time on cactus graphs; in contrast, this question is
still open for \src(G). We also observe that deciding whether \vsrc(G) = k
is fixed-parameter tractable in k and the treewidth of G. Finally, on
general graphs, we prove that there is no polynomial-time algorithm to decide
whether \vsrc(G) \leq 3, nor to approximate \vsrc(G) within a certain
factor, unless P = NP.
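The definition above invites a brute-force check: a coloring is a very strong rainbow coloring only if, on every shortest path between every pair of vertices, all edge colors are distinct. A minimal Python sketch of such a checker; the toy graph, the coloring, and all names are illustrative assumptions, not artifacts of the paper:

```python
from collections import deque
from itertools import combinations

def all_shortest_paths(adj, s, t):
    """Yield every shortest s-t path, via BFS predecessor lists."""
    dist = {s: 0}
    preds = {v: [] for v in adj}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                preds[v].append(u)
                queue.append(v)
            elif dist[v] == dist[u] + 1:
                preds[v].append(u)

    def backtrack(v):
        if v == s:
            yield [s]
            return
        for p in preds[v]:
            for path in backtrack(p):
                yield path + [v]

    yield from backtrack(t)

def is_very_strong_rainbow(adj, coloring):
    """Check the \vsrc condition: EVERY shortest path between EVERY
    pair of vertices uses pairwise-distinct edge colors."""
    for s, t in combinations(adj, 2):
        for path in all_shortest_paths(adj, s, t):
            cols = [coloring[frozenset(e)] for e in zip(path, path[1:])]
            if len(set(cols)) != len(cols):
                return False
    return True

# Toy example: the 4-cycle a-b-c-d admits a very strong rainbow
# coloring with just two colors (opposite edges share a color).
c4 = {"a": {"b", "d"}, "b": {"a", "c"},
      "c": {"b", "d"}, "d": {"a", "c"}}
two_colors = {frozenset(e): c for e, c in
              [("ab", 1), ("bc", 2), ("cd", 1), ("da", 2)]}
```

Both shortest a-c paths (a-b-c and a-d-c) then carry colors {1, 2}, so the check passes; coloring the adjacent edges ab and bc identically would make it fail.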
After the A-Bomb
RISD’s Center for Complexity launches Horizon 2045, a 25-year project aimed at eliminating the threat of nuclear war.
Multislice Modularity Optimization in Community Detection and Image Segmentation
Because networks can be used to represent many complex systems, they have
attracted considerable attention in physics, computer science, sociology, and
many other disciplines. One of the most important areas of network science is
the algorithmic detection of cohesive groups (i.e., "communities") of nodes. In
this paper, we algorithmically detect communities in social networks and image
data by optimizing multislice modularity. A key advantage of modularity
optimization is that it does not require prior knowledge of the number or sizes
of communities, and it is capable of finding network partitions that are
composed of communities of different sizes. By optimizing multislice modularity
and subsequently calculating diagnostics on the resulting network partitions,
it is thereby possible to obtain information about network structure across
multiple system scales. We illustrate this method on data from both social
networks and images, and we find that optimization of multislice modularity
performs well on these two tasks without the need for extensive
problem-specific adaptation. However, improving the computational speed of this
method remains a challenging open problem.

Comment: 3 pages, 2 figures; to appear in the IEEE International Conference on Data Mining PhD Forum proceedings.
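Multislice modularity extends the familiar single-slice quantity, which compares the observed intra-community edge weight to a degree-based null model. As a point of reference, here is a direct O(n^2) sketch of single-slice Newman-Girvan modularity for a given partition; the example graph and all names are our own illustrative assumptions, not from the paper:

```python
def modularity(adj, community):
    """Q = (1/2m) * sum_ij (A_ij - k_i*k_j / 2m) * delta(c_i, c_j),
    for an undirected, unweighted simple graph given as a dict of
    neighbour sets."""
    m = sum(len(nbrs) for nbrs in adj.values()) / 2  # number of edges
    deg = {u: len(adj[u]) for u in adj}
    total = 0.0
    for u in adj:
        for v in adj:
            if community[u] != community[v]:
                continue
            a_uv = 1.0 if v in adj[u] else 0.0
            total += a_uv - deg[u] * deg[v] / (2 * m)
    return total / (2 * m)

# Two triangles joined by a bridge; the natural partition is one
# community per triangle.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
       3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
part = {0: "left", 1: "left", 2: "left",
        3: "right", 4: "right", 5: "right"}
```

For this graph the two-community partition scores Q = 5/14 (about 0.357), while placing all six nodes in one community scores Q = 0. The multislice version adds an interslice coupling term linking copies of each node across slices, which is what lets community structure be tracked across multiple system scales.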
Complexity Estimates for Two Uncoupling Algorithms
Uncoupling algorithms transform a linear differential system of first order
into one or several scalar differential equations. We examine two approaches to
uncoupling: the cyclic-vector method (CVM) and the
Danilevski-Barkatou-Z\"urcher algorithm (DBZ). We give tight size bounds on the
scalar equations produced by CVM, and design a fast variant of CVM whose
complexity is quasi-optimal with respect to the output size. We exhibit a
strong structural link between CVM and DBZ, enabling us to show that, in the
generic case, DBZ has polynomial complexity and that it produces a single
equation, strongly related to the output of CVM. We prove that algorithm CVM is
faster than DBZ by almost two orders of magnitude, and provide experimental
results that validate the theoretical complexity analyses.

Comment: To appear in Proceedings of ISSAC'13 (21/01/2013).
Signals for the Future: New Ways to Tackle Nuclear Risk
N Square is a hands-on alliance between staff, funders, and advisors working together to bring new ideas, people, and perspectives to nuclear arms control. At the heart of N Square is the Innovators Network, a vibrant community of cross-sector leaders prototyping and piloting breakthrough approaches including open intelligence platforms, movement-building tools, and innovative uses of technologies such as blockchain, AI, and machine learning to verify arms control agreements. Innovators Network fellows are visionary leaders who recognize the opportunities embedded within tough challenges. They're scientists, game designers, Hollywood screenwriters, global security gurus. They're inventors, branding specialists, diplomats, Gen-Y satellite imagery analysts, and public radio producers. They're also philanthropists keenly interested in sustaining systemic innovation and helping collaborative networks flourish. This book describes creative solutions developed by our first cohort of innovation fellows.
Solving Partition Problems Almost Always Requires Pushing Many Vertices Around
A fundamental graph problem is to recognize whether the vertex set of a graph G can be bipartitioned into sets A and B such that G[A] and G[B] satisfy properties Pi_A and Pi_B, respectively. This so-called (Pi_A,Pi_B)-Recognition problem generalizes, among others, the recognition of 3-colorable, bipartite, split, and monopolar graphs. A powerful algorithmic technique that can be used to obtain fixed-parameter algorithms for many cases of (Pi_A,Pi_B)-Recognition, as well as several other problems, is the pushing process. For bipartition problems, the process starts with an "almost correct" bipartition (A',B'), and pushes appropriate vertices from A' to B' and vice versa to eventually arrive at a correct bipartition.
In this paper, we study whether (Pi_A,Pi_B)-Recognition problems for which the pushing process yields fixed-parameter algorithms also admit polynomial problem kernels. In our study, we focus on the first level above triviality, where Pi_A is the set of P_3-free graphs (disjoint unions of cliques, or cluster graphs), the parameter is the number of clusters in the cluster graph G[A], and Pi_B is characterized by a set H of connected forbidden induced subgraphs. We prove that, under the assumption that NP not subseteq coNP/poly, (Pi_A,Pi_B)-Recognition admits a polynomial kernel if and only if H contains a graph of order at most 2. In both the kernelization and the lower bound results, we make crucial use of the pushing process.
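For concreteness, the Pi_A side of this "first level above triviality" is easy to test directly: a graph is P_3-free exactly when every connected component is a clique. A short sketch of that membership check, with illustrative names and example graphs of our own, not from the paper:

```python
def is_cluster_graph(adj):
    """True iff the graph (dict of neighbour sets) is P_3-free,
    i.e. a disjoint union of cliques (a "cluster graph")."""
    seen = set()
    for start in adj:
        if start in seen:
            continue
        # Collect the connected component containing `start`.
        comp, stack = set(), [start]
        while stack:
            u = stack.pop()
            if u not in comp:
                comp.add(u)
                stack.extend(adj[u])
        seen |= comp
        # A component on n vertices is a clique iff every vertex in it
        # has exactly n - 1 neighbours inside the component.
        if any(len(adj[u] & comp) != len(comp) - 1 for u in comp):
            return False
    return True

# A triangle plus a disjoint edge is a cluster graph; the path
# a-b-c (an induced P_3) is the smallest graph that is not.
clusters = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}, 3: {4}, 4: {3}}
path3 = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}}
```

The paper's parameter, the number of clusters in G[A], is then simply the number of connected components found by this check.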