Climate change and the Delta (San Francisco Estuary and Watershed Science)
Anthropogenic climate change amounts to a rapidly approaching, “new” stressor in the Sacramento–San Joaquin Delta system. In response to California’s extreme natural hydroclimatic variability, complex water-management systems have been developed, even as the Delta’s natural ecosystems have been largely devastated. Climate change is projected to challenge these management and ecological systems in different ways that are characterized by different levels of uncertainty. For example, there is high certainty that the climate will warm by about 2 °C above late-20th-century averages by mid-century, and by about 4 °C by end of century, if greenhouse-gas emissions continue their current rates of acceleration. Future precipitation changes are much less certain, with as many climate models projecting wetter conditions as drier. However, the same projections agree that precipitation will be more intense when storms do arrive, even as more dry days will separate storms. Warmer temperatures will likely enhance evaporative demands and raise water temperatures. Consequently, climate change is projected to yield both more extreme flood risks and greater drought risks. Sea-level rise (SLR) during the 20th century was about 22 cm, and is projected to increase at least 3-fold this century. SLR, together with land subsidence, threatens the Delta with greater vulnerability to inundation and salinity intrusion. Effects on the Delta ecosystem traceable to warming include SLR, reduced snowpack, earlier snowmelt and larger storm-driven streamflows, warmer and longer summers, warmer summer water temperatures, and water-quality changes. These changes and their uncertainties will challenge the operations of water projects and uses throughout the Delta’s watershed and delivery areas. Although the effects of climate change on Delta ecosystems may be profound, the end results are difficult to predict, except that native species will fare worse than invaders. Successful preparation for the coming changes will require greater integration of monitoring, modeling, and decision making across time, variables, and space than has been historically normal.
Time Dependence of Particle Creation from Accelerating Mirrors
Particle production due to a quantized, massless, minimally coupled scalar field in two-dimensional flat spacetime with an accelerating mirror is investigated, with a focus on the time dependence of the process. We analyze first the classes of trajectories previously investigated by Carlitz and Willey and by Walker and Davies. We then analyze four new classes of trajectories, all of which can be expressed analytically and for which several ancillary properties can be derived analytically. The time dependence is investigated through the use of wave packets for the modes of the quantized field that are in the out vacuum state. It is shown for most of the trajectories studied that good time resolution of the particle production process can be obtained.
Comment: 21 pages, 5 figures
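The wave-packet localization referred to in the abstract above is the standard construction introduced in Hawking's black-hole analysis. As a sketch, with notation assumed here rather than taken from the paper, the out modes φ_ω are grouped into packets of width ε, labeled by a frequency bin j and a time bin n:

```latex
\phi_{jn} \;=\; \frac{1}{\sqrt{\epsilon}}
  \int_{j\epsilon}^{(j+1)\epsilon} d\omega \;
  e^{2\pi i n \omega/\epsilon} \, \phi_{\omega} ,
\qquad j \ge 0, \; n \in \mathbb{Z} .
```

Each packet is peaked about frequency ω_j ≈ (j + ½)ε and retarded time u ≈ 2πn/ε, with temporal width of order 2π/ε, so the choice of ε trades frequency resolution against the time resolution of the emission history.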
Black Hole - Moving Mirror II: Particle Creation
There is an exact correspondence between the simplest solution to Einstein's equations describing the formation of a black hole and a particular moving mirror trajectory. In both cases the Bogolubov coefficients in 1+1 dimensions are identical and can be computed analytically. Particle creation is investigated by using wave packets. The entire particle creation history is computed, incorporating the early-time non-thermal emission due to the formation of the black hole (or the early-time acceleration of the moving mirror) and the evolution to a Planckian spectrum.
Comment: Contribution to MG14 Proceedings, 5 pages, 4 figures
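For context, the Planckian endpoint mentioned above is the standard late-time limit of the mode occupation numbers built from the beta Bogolubov coefficients. Schematically, with notation assumed and units ħ = c = k_B = 1:

```latex
\langle N_{\omega} \rangle
\;=\; \int_{0}^{\infty} d\omega' \,
  \bigl| \beta_{\omega\omega'} \bigr|^{2}
\;\longrightarrow\;
\frac{1}{e^{2\pi\omega/\kappa} - 1}
\quad \text{(late times)} ,
```

i.e. a thermal spectrum at temperature T = κ/2π, where κ is the surface gravity of the black hole (or the corresponding acceleration parameter of the mirror trajectory).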
Mirror Reflections of a Black Hole
An exact correspondence between a black hole and an accelerating mirror is demonstrated. It is shown that for a massless minimally coupled scalar field the same Bogolubov coefficients connecting the "in" and "out" states occur for a (1+1)D flat spacetime with a particular perfectly reflecting accelerating boundary trajectory and a (1+1)D curved spacetime in which a null shell collapses to form a black hole. Generalization of the latter to the (3+1)D case is discussed. The spectral dynamics is computed in both (1+1)-dimensional spacetimes, along with the energy flux in the spacetime with a mirror. It is shown that the approach to equilibrium is monotonic, asymmetric in terms of the rate, and that there is a specific time which characterizes the system when it is the most out of equilibrium.
Comment: 25 pages, 7 figures
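The energy flux discussed above can be written, for a mirror described by a ray-tracing function p(u) (the advanced time at which a ray reflected at retarded time u originated), via the standard Davies–Fulling result. The notation here is assumed, not taken from the paper:

```latex
\langle T_{uu} \rangle
\;=\; -\frac{1}{24\pi}
  \left[ \frac{p'''(u)}{p'(u)}
  - \frac{3}{2} \left( \frac{p''(u)}{p'(u)} \right)^{2} \right]
\;=\; -\frac{1}{24\pi} \, \{ p(u), u \} ,
```

where {p, u} denotes the Schwarzian derivative. The flux vanishes exactly when p(u) is a Möbius transformation, which is why only genuinely accelerating, non-trivial trajectories radiate.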
Physical Representation-based Predicate Optimization for a Visual Analytics Database
Querying the content of images, video, and other non-textual data sources requires expensive content extraction methods. Modern extraction techniques are based on deep convolutional neural networks (CNNs) and can classify objects within images with astounding accuracy. Unfortunately, these methods are slow: processing a single image can take about 10 milliseconds on modern GPU-based hardware. As massive video libraries become ubiquitous, running a content-based query over millions of video frames is prohibitive.

One promising approach to reduce the runtime cost of queries of visual content is to use a hierarchical model, such as a cascade, where simple cases are handled by an inexpensive classifier. Prior work has sought to design cascades that optimize the computational cost of inference by, for example, using smaller CNNs. However, we observe that there are critical factors besides the inference time that dramatically impact the overall query time. Notably, by treating the physical representation of the input image as part of our query optimization---that is, by including image transforms, such as resolution scaling or color-depth reduction, within the cascade---we can optimize data handling costs and enable drastically more efficient classifier cascades.

In this paper, we propose Tahoma, which generates and evaluates many potential classifier cascades that jointly optimize the CNN architecture and input data representation. Our experiments on a subset of ImageNet show that Tahoma's input transformations speed up cascades by up to 35 times. We also find up to a 98x speedup over the ResNet50 classifier with no loss in accuracy, and a 280x speedup if some accuracy is sacrificed.
Comment: Camera-ready version of the paper submitted to ICDE 2019. In Proceedings of the 35th IEEE International Conference on Data Engineering (ICDE 2019).
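The cascade-with-representation idea described in the abstract above can be sketched as follows. This is a minimal illustration, not Tahoma's implementation: the classifiers are hypothetical stand-ins (simple pixel statistics rather than CNNs), and all names are invented for the example. The key point it shows is the early exit on a cheap classifier fed a downscaled representation, with an expensive classifier as fallback.

```python
# Sketch of a two-stage classifier cascade over physical representations.
# The classifiers below are hypothetical stand-ins, not real CNNs.
from dataclasses import dataclass
from typing import Callable, List, Tuple

Image = List[List[float]]  # grayscale pixels in [0, 1]

def downscale(img: Image, factor: int) -> Image:
    """Cheap physical transform: average-pool by `factor` to cut data-handling cost."""
    h, w = len(img), len(img[0])
    return [
        [
            sum(img[y][x] for y in range(r, r + factor)
                          for x in range(c, c + factor)) / (factor * factor)
            for c in range(0, w - factor + 1, factor)
        ]
        for r in range(0, h - factor + 1, factor)
    ]

@dataclass
class Stage:
    transform: Callable[[Image], Image]             # representation change
    classify: Callable[[Image], Tuple[str, float]]  # returns (label, confidence)
    threshold: float                                # early-exit confidence cutoff

def run_cascade(stages: List[Stage], img: Image) -> str:
    """Evaluate stages in order; return early when a cheap stage is confident."""
    label = "unknown"
    for stage in stages:
        label, conf = stage.classify(stage.transform(img))
        if conf >= stage.threshold:
            return label  # early exit: later, expensive stages never run
    return label          # fall through to the last stage's answer

# Hypothetical classifiers: mean brightness separates "bright" from "dark".
def cheap(img: Image) -> Tuple[str, float]:
    mean = sum(map(sum, img)) / (len(img) * len(img[0]))
    conf = abs(mean - 0.5) * 2  # confident only far from the decision boundary
    return ("bright" if mean > 0.5 else "dark", conf)

def expensive(img: Image) -> Tuple[str, float]:
    mean = sum(map(sum, img)) / (len(img) * len(img[0]))
    return ("bright" if mean > 0.5 else "dark", 1.0)  # always answers

cascade = [
    Stage(transform=lambda im: downscale(im, 2), classify=cheap, threshold=0.6),
    Stage(transform=lambda im: im, classify=expensive, threshold=0.0),
]

print(run_cascade(cascade, [[0.9] * 4 for _ in range(4)]))  # easy case: "bright"
```

Easy inputs exit at the first stage, so the expensive classifier (and the full-resolution data handling it implies) is only paid for ambiguous inputs.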
Black Hole - Moving Mirror I: An Exact Correspondence
An exact correspondence is shown between a new moving mirror trajectory in (1+1)D and a spacetime in (1+1)D in which a black hole forms from the collapse of a null shell. It is shown that the Bogolubov coefficients between the "in" and "out" states are identical, and the exact Bogolubov coefficients are displayed. Generalization to the (3+1)D black hole case is discussed.
Comment: Contribution to MG14 Proceedings, 5 pages, 1 figure
Logic, self-awareness and self-improvement: The metacognitive loop and the problem of brittleness
This essay describes a general approach to building perturbation-tolerant autonomous systems, based on the conviction that artificial agents should be able to notice when something is amiss, assess the anomaly, and guide a solution into place. We call this basic strategy of self-guided learning the metacognitive loop; it involves the system monitoring, reasoning about, and, when necessary, altering its own decision-making components. In this essay, we (a) argue that equipping agents with a metacognitive loop can help to overcome the brittleness problem, (b) detail the metacognitive loop and its relation to our ongoing work on time-sensitive commonsense reasoning, (c) describe specific, implemented systems whose perturbation tolerance was improved by adding a metacognitive loop, and (d) outline both short-term and long-term research agendas.
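The notice / assess / guide cycle described above can be sketched as a small control loop. This is an illustration of the strategy only, not the authors' system: the anomaly test, the diagnosis, and the "self-repair" response are all hypothetical stand-ins chosen to keep the example self-contained.

```python
# Minimal sketch of a notice / assess / guide metacognitive loop.
# All specifics (threshold test, diagnosis labels, repair rule) are invented.
from typing import List

class MetacognitiveLoop:
    """Monitors an agent's expectations, notes violations, and guides a response."""

    def __init__(self, expected: float, tolerance: float):
        self.expected = expected    # what the agent predicts it should observe
        self.tolerance = tolerance  # deviation considered "something is amiss"
        self.log: List[str] = []

    def note(self, observed: float) -> bool:
        """Notice: compare an observation against the current expectation."""
        return abs(observed - self.expected) > self.tolerance

    def assess(self, observed: float) -> str:
        """Assess: classify the anomaly (a stand-in for real diagnosis)."""
        return "drift-high" if observed > self.expected else "drift-low"

    def guide(self, anomaly: str, observed: float) -> None:
        """Guide: alter the agent's own decision component (here, its model)."""
        self.expected = (self.expected + observed) / 2  # simple re-anchoring repair
        self.log.append(f"repaired after {anomaly}")

    def step(self, observed: float) -> None:
        if self.note(observed):
            self.guide(self.assess(observed), observed)

mcl = MetacognitiveLoop(expected=10.0, tolerance=2.0)
for reading in [10.5, 9.8, 17.0, 16.5]:  # a perturbation appears at 17.0
    mcl.step(reading)
print(mcl.log)  # one repair entry per noted anomaly
```

The brittleness-reducing move is that anomalies trigger a change to the agent's own decision-making state rather than being silently absorbed or crashing the system.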