A software system for laboratory experiments in image processing
Laboratory experiments for image processing courses are usually software implementations of processing algorithms, but students of image processing come from diverse backgrounds with widely differing software experience. To avoid learning overhead, the software system should be easy to learn and use, even for those with no exposure to mathematical programming languages or object-oriented programming. The class library for image processing (CLIP) supports users who know C by providing three C++ types with small public interfaces, including natural and efficient operator overloading. CLIP programs are compact and fast. Experience in using the system in undergraduate and graduate teaching indicates that it supports subject-matter learning with little distraction from language/system learning.
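CLIP itself is a C++ library and its actual interface is not reproduced in the abstract; as a rough illustration of the operator-overloading style it describes, here is a hypothetical Python sketch of an image type whose arithmetic reads like the underlying formulas (all names and the pixel layout are invented, not CLIP's):

```python
class Image:
    """Toy image type; a hypothetical analogue of CLIP's overloaded
    C++ image classes (not the actual CLIP interface)."""

    def __init__(self, pixels):
        self.pixels = [[float(p) for p in row] for row in pixels]

    def _combine(self, other, op):
        # Accept either another Image or a scalar on the right-hand side.
        if isinstance(other, Image):
            rows = other.pixels
        else:
            rows = [[other] * len(r) for r in self.pixels]
        return Image([[op(a, b) for a, b in zip(ra, rb)]
                      for ra, rb in zip(self.pixels, rows)])

    def __add__(self, other):
        return self._combine(other, lambda a, b: a + b)

    def __sub__(self, other):
        return self._combine(other, lambda a, b: a - b)

    def __mul__(self, k):
        # Scalar scaling of every pixel.
        return Image([[a * k for a in row] for row in self.pixels])


# Background subtraction followed by contrast scaling reads like math:
img = Image([[10, 20], [30, 40]])
background = Image([[5, 5], [5, 5]])
enhanced = (img - background) * 2
```

The point, as in the abstract, is that a user who knows only C-style expressions can write the processing step as a formula rather than as explicit loops.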
Treatment of input uncertainty in hydrologic modeling: Doing hydrology backward with Markov chain Monte Carlo simulation
There is increasing consensus in the hydrologic literature that an appropriate framework for streamflow forecasting and simulation should include explicit recognition of forcing, parameter, and model structural error. This paper presents a novel Markov chain Monte Carlo (MCMC) sampler, entitled differential evolution adaptive Metropolis (DREAM), that is especially designed to efficiently estimate the posterior probability density function of hydrologic model parameters in complex, high-dimensional sampling problems. This MCMC scheme adaptively updates the scale and orientation of the proposal distribution during sampling and maintains detailed balance and ergodicity. It is then demonstrated how DREAM can be used to analyze forcing data error during watershed model calibration using a five-parameter rainfall-runoff model with streamflow data from two different catchments. Explicit treatment of precipitation error during hydrologic model calibration not only results in more appropriate prediction uncertainty bounds but also significantly alters the posterior distribution of the watershed model parameters. This has significant implications for regionalization studies. The approach also provides important new ways to estimate areal average watershed precipitation, information that is of utmost importance for testing hydrologic theory, diagnosing structural errors in models, and appropriately benchmarking rainfall measurement devices.
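The full DREAM sampler includes machinery (multiple chain pairs, crossover probabilities, outlier-chain handling) not reproduced here; the following Python sketch shows only its core idea, a differential-evolution proposal inside a multi-chain Metropolis sampler, on a toy one-parameter problem. The target density, chain count, and tuning values are illustrative assumptions, not the paper's setup:

```python
import math
import random

random.seed(1)

def log_post(x):
    # Toy posterior: a standard normal density (up to an additive constant).
    return -0.5 * x * x

n_chains, n_steps = 8, 2000
gamma = 2.38 / math.sqrt(2.0)        # a common DE jump rate for dimension 1
chains = [random.uniform(-3.0, 3.0) for _ in range(n_chains)]
samples = []

for _ in range(n_steps):
    for i in range(n_chains):
        # Differential-evolution proposal: jump along the difference of two
        # randomly chosen other chains, plus a small noise term. The proposal
        # scale thus adapts automatically to the spread of the chains.
        r1, r2 = random.sample([j for j in range(n_chains) if j != i], 2)
        proposal = (chains[i] + gamma * (chains[r1] - chains[r2])
                    + random.gauss(0.0, 1e-6))
        # Standard Metropolis accept/reject preserves detailed balance.
        if math.log(random.random()) < log_post(proposal) - log_post(chains[i]):
            chains[i] = proposal
        samples.append(chains[i])

half = samples[len(samples) // 2:]   # crude burn-in removal
mean = sum(half) / len(half)
var = sum((s - mean) ** 2 for s in half) / len(half)
```

After burn-in, the pooled samples should approximate the target's mean (0) and variance (1), which is the sense in which the scheme "adaptively updates the scale and orientation of the proposal distribution during sampling."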
Equifinality of formal (DREAM) and informal (GLUE) Bayesian approaches in hydrologic modeling?
In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. In particular, there is strong disagreement about whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed differential evolution adaptive Metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input, and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effects of forcing, parameter, and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understanding and predicting the flow of water through catchments.
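The informal (GLUE) side of this comparison is simple enough to sketch in a few lines of Python: parameter sets are sampled, scored with an informal likelihood (here Nash-Sutcliffe efficiency), and only "behavioral" sets above a threshold are retained. The toy one-parameter recession model, the threshold, and all numbers below are assumptions for illustration, not the conceptual watershed models used in the paper:

```python
import math
import random

random.seed(0)

def model(k, t):
    # Toy one-parameter recession "watershed model" (illustrative only).
    return 10.0 * math.exp(-k * t)

times = list(range(10))
true_k = 0.3
obs = [model(true_k, t) + random.gauss(0.0, 0.2) for t in times]
obs_mean = sum(obs) / len(obs)
ss_obs = sum((o - obs_mean) ** 2 for o in obs)

behavioral = []                       # (efficiency, parameter) pairs
for _ in range(5000):
    k = random.uniform(0.0, 1.0)
    sim = [model(k, t) for t in times]
    sse = sum((s - o) ** 2 for s, o in zip(sim, obs))
    nse = 1.0 - sse / ss_obs          # Nash-Sutcliffe efficiency
    if nse > 0.5:                     # informal "behavioral" threshold
        behavioral.append((nse, k))

best_nse, best_k = max(behavioral)
```

Predictive bounds in GLUE are then formed from the likelihood-weighted simulations of all behavioral sets; the formal (DREAM) alternative replaces the informal score and threshold with a proper likelihood function and MCMC sampling.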
A Denotational Semantics for First-Order Logic
In Apt and Bezem [AB99] (see cs.LO/9811017) we provided a computational interpretation of first-order formulas over arbitrary interpretations. Here we complement this work by introducing a denotational semantics for first-order logic. Additionally, by allowing an assignment of a non-ground term to a variable, we introduce logical variables into this framework. The semantics combines a number of well-known ideas from the areas of semantics of imperative programming languages and logic programming. In the resulting computational view, conjunction corresponds to sequential composition, disjunction to "don't know" nondeterminism, existential quantification to declaration of a local variable, and negation to the "negation as finite failure" rule. The soundness result shows correctness of the semantics with respect to the notion of truth. The proof resembles in some aspects the proof of the soundness of SLDNF-resolution.
Comment: 17 pages. Invited talk at the Computational Logic Conference (CL 2000). To appear in Springer-Verlag Lecture Notes in Computer Science.
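The computational view named in this abstract can be illustrated (not formalized) in Python, with a formula denoting a function from substitutions to streams of answer substitutions; the listed correspondences then become generator combinators. This is an informal analogy of my own, not the paper's semantics:

```python
def conj(f, g):
    # Conjunction as sequential composition: thread each answer
    # substitution produced by f into g.
    def run(subst):
        for s1 in f(subst):
            yield from g(s1)
    return run

def disj(f, g):
    # Disjunction as "don't know" nondeterminism: try both branches.
    def run(subst):
        yield from f(subst)
        yield from g(subst)
    return run

def unify(var, val):
    # Atomic formula var = val acting on a substitution (a dict).
    def run(subst):
        if var not in subst:
            yield {**subst, var: val}
        elif subst[var] == val:
            yield subst
    return run

def exists(var, f):
    # Existential quantification as declaration of a local variable:
    # run f with var freshly unbound, then restore the outer binding.
    def run(subst):
        inner = {k: v for k, v in subst.items() if k != var}
        for s in f(inner):
            out = {k: v for k, v in s.items() if k != var}
            if var in subst:
                out[var] = subst[var]
            yield out
    return run

def neg(f):
    # Negation as finite failure: succeed iff f yields no answers.
    def run(subst):
        if not any(True for _ in f(subst)):
            yield subst
    return run

# (x = 1 or x = 2) and not (x = 1)  -- only the x = 2 branch survives.
goal = conj(disj(unify('x', 1), unify('x', 2)), neg(unify('x', 1)))
answers = list(goal({}))   # → [{'x': 2}]
```

Because each combinator returns a lazy generator, failure is finite exactly when the stream of answers is finite, mirroring the "negation as finite failure" rule.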
Thermal environment in two broiler barns during the first three weeks of age
The objective of this research was to evaluate the internal thermal environment of two broiler barns featuring different ventilation systems representative of the Brazilian and South American poultry industries: (a) a negative-pressure tunnel and (b) a positive-pressure lateral ventilation system. Environmental parameters such as dry-bulb temperature, relative humidity, and the temperature-humidity index were assessed, and temperature maps for average day and night conditions were determined for the first three weeks of life. Better uniformity of the thermal environment and better comfort conditions were found inside the negative-pressure tunnel barn.
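The temperature-humidity index (THI) mentioned above combines dry-bulb temperature and relative humidity into a single comfort measure. Several formulations exist in the livestock literature; the sketch below uses one common Thom-style formulation, and the example reading is an assumption for illustration, not data from the study:

```python
def thi(t_db, rh):
    """Temperature-humidity index from dry-bulb temperature (deg C)
    and relative humidity (%). One common formulation; variants are
    used in the livestock literature, so coefficients may differ."""
    return 0.8 * t_db + (rh / 100.0) * (t_db - 14.4) + 46.4

# Example: a hypothetical brooding-phase reading of 32 deg C at 60% RH.
value = thi(32.0, 60.0)
```

Mapping such an index over the sensor grid of each barn, for day and night averages, yields exactly the kind of thermal-environment maps the study compares between the two ventilation systems.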
Simulations of neutron background in a time projection chamber relevant to dark matter searches
Presented here are results of simulations of neutron background performed for a time projection chamber acting as a particle dark matter detector in an underground laboratory. The investigated background includes neutrons from rock and detector components, generated via spontaneous fission and (alpha, n) reactions, as well as those due to cosmic-ray muons. Neutrons were propagated to the sensitive volume of the detector and the nuclear recoil spectra were calculated. Methods of neutron background suppression were also examined, and limitations to the sensitivity of a gaseous dark matter detector are discussed. Results indicate that neutrons should not limit sensitivity to WIMP-nucleon interactions down to a level of (1 - 3) x 10^{-8} pb in a 10 kg detector.
Comment: 27 pages (total, including 3 tables and 11 figures). Accepted for publication in Nuclear Instruments and Methods in Physics Research, Section A.
A Hybrid Artificial Bee Colony Algorithm for Graph 3-Coloring
The Artificial Bee Colony (ABC) is an optimization algorithm inspired by the intelligent behavior of a honey bee swarm. It is widely recognized as a quick, reliable, and efficient method for solving optimization problems. This paper proposes a hybrid ABC (HABC) algorithm for graph 3-coloring, a well-known discrete optimization problem. The results of HABC are compared with the results of well-known graph coloring algorithms, i.e. Tabucol and the Hybrid Evolutionary Algorithm (HEA), and with the results of the traditional evolutionary algorithm with the SAW method (EA-SAW). Extensive experimentation has shown that HABC matched the competitive results of the best graph coloring algorithms, and did better than the traditional heuristic EA-SAW when solving equi-partite, flat, and randomly generated medium-sized graphs.
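A minimal Python sketch of the ABC idea applied to graph 3-coloring follows. The employed and onlooker bee phases are merged for brevity, so this is far simpler than the paper's HABC; the graph, population size, and abandonment limit are all illustrative assumptions:

```python
import random

random.seed(2)

# A small 3-colorable graph: two triangles joined by the edge (2, 3).
edges = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (4, 5), (3, 5)]
n_vertices = 6

def conflicts(coloring):
    # Fitness to minimize: number of edges whose endpoints share a color.
    return sum(coloring[u] == coloring[v] for u, v in edges)

n_sources, limit = 10, 20
sources = [[random.randrange(3) for _ in range(n_vertices)]
           for _ in range(n_sources)]
trials = [0] * n_sources             # stagnation counter per food source

for _ in range(2000):
    for i in range(n_sources):
        # Bee phase (employed/onlooker merged): recolor one random vertex
        # and keep the candidate if it is no worse than the current source.
        candidate = sources[i][:]
        candidate[random.randrange(n_vertices)] = random.randrange(3)
        if conflicts(candidate) <= conflicts(sources[i]):
            sources[i], trials[i] = candidate, 0
        else:
            trials[i] += 1
        # Scout phase: abandon a source that has stopped improving.
        if trials[i] > limit:
            sources[i] = [random.randrange(3) for _ in range(n_vertices)]
            trials[i] = 0

best = min(sources, key=conflicts)
```

A proper coloring is found when the conflict count reaches zero; the paper's hybrid adds problem-specific local search on top of this basic swarm loop.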
Linguistic DNA: Investigating Conceptual Change in Early Modern English Discourse
This article describes the background and premises of the AHRC-funded project, ‘The Linguistic DNA of Modern Western Thought’. We offer an empirical, encyclopaedic approach to historical semantics, addressing ‘conceptual history’, i.e. the history of concepts that shape thought, culture and society in a particular period. We relate the project to traditional work in conceptual and semantic history and define our object of study as the discursive concept, a category of meaning encoded linguistically as a cluster of expressions that co-occur in discourse. We describe our principal data source, EEBO-TCP, and introduce our key research interests, namely, the contexts of conceptual change, the semantic structure of lexical fields and the nature of lexicalisation pressure. We outline our computational processes, which build upon the theoretical definition of discursive concepts, to discover the linguistically encoded forms underpinning the discursive concepts we seek to identify in EEBO-TCP. Finally, we share preliminary results via a worked example, exploring the discursive contexts in which paradigmatic terms of key cultural concepts emerge. We consider the extent to which particular genres, discourses and users in the early modern period make paradigms, and examine the extent to which these contexts determine the characteristics of key concepts.
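The notion of a discursive concept as "a cluster of expressions that co-occur in discourse" can be illustrated with a few lines of Python. The toy passages below merely stand in for EEBO-TCP text, and the pair-count threshold is an arbitrary assumption; the project's actual pipeline is far more sophisticated:

```python
from collections import Counter
from itertools import combinations

# Toy passages standing in for EEBO-TCP text (hypothetical data).
docs = [
    "grace faith works salvation",
    "faith grace scripture salvation",
    "trade merchant ship voyage",
    "merchant trade voyage profit",
]

# Count, across passages, how often each pair of distinct words co-occurs.
pair_counts = Counter()
for doc in docs:
    words = sorted(set(doc.split()))
    pair_counts.update(combinations(words, 2))

# A discursive concept is modeled here as a cluster of expressions that
# repeatedly co-occur: keep pairs attested in more than one passage.
clusters = [pair for pair, count in pair_counts.items() if count > 1]
```

On this toy corpus the surviving pairs split cleanly into a religious cluster (faith/grace/salvation) and a commercial one (merchant/trade/voyage), the kind of linguistically encoded grouping the project seeks at scale.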
Time-in-area represents foraging activity in a wide-ranging pelagic forager
Successful Marine Spatial Planning depends upon the identification of areas with high importance for particular species, ecosystems or processes. For seabirds, advancements in biologging devices have enabled us to identify these areas through the detailed study of at-sea behaviour. However, in many cases, only positional data are available and the presence of local biological productivity, and hence seabird foraging behaviour, is inferred from these data alone, under the untested assumption that foraging activity is more likely to occur in areas where seabirds spend more time. We fitted GPS devices and accelerometers to northern gannets Morus bassanus and categorised the behaviour of individuals outside the breeding colony as plunge diving, surface foraging, floating and flying. We then used the locations of foraging events to test the efficiency of 2 approaches: time-in-area and kernel density (KD) analyses, which are widely employed to detect highly-used areas and interpret foraging behaviour from positional data. For the KD analyses, the smoothing parameter (h) was either calculated using the ad hoc method (KD(ad hoc)) or fixed at h = 9.1 km (KD(h=9.1)) to designate core foraging areas from location data. A high proportion of foraging events occurred in core foraging areas designated using KD(ad hoc), KD(h=9.1), and time-in-area. Our findings demonstrate that foraging activity occurs in areas where seabirds spend more time, and that both KD analysis and the time-in-area approach are equally efficient methods for this type of analysis. However, the time-in-area approach is advantageous in its simplicity, and in its ability to provide the shapes commonly used in planning. Therefore, the time-in-area approach can be used as a simple way of using seabirds to identify ecologically important locations from both tracking and survey data.
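The time-in-area idea is simple enough to sketch in Python: with GPS fixes taken at a constant sampling interval, time spent per grid cell reduces to counting fixes, and a core area is any cell above a usage threshold. The coordinates, cell size, and threshold below are invented for illustration and are not the study's data or parameters:

```python
from collections import Counter

# Hypothetical GPS fixes (x, y in km) at a fixed sampling interval,
# plus the subset flagged as foraging events by the accelerometer.
fixes = [(0.4, 0.2), (0.6, 0.1), (0.5, 0.5), (0.2, 0.4),
         (5.1, 5.3), (5.4, 5.2), (9.0, 1.0)]
foraging = [(0.5, 0.5), (5.2, 5.1)]

def cell(point, size=1.0):
    # Map a position to its grid cell index.
    return (int(point[0] // size), int(point[1] // size))

# Time-in-area: at a constant sampling rate, time per cell is proportional
# to the number of fixes falling in that cell.
time_in_cell = Counter(cell(p) for p in fixes)
core = {c for c, n in time_in_cell.items() if n >= 2}   # simple threshold

# Fraction of foraging events that fall inside the designated core areas.
in_core = sum(cell(f) in core for f in foraging) / len(foraging)
```

The grid cells themselves are the planning-friendly "shapes" the abstract refers to, which is part of why the time-in-area approach is attractive despite its simplicity.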
Degeneracies when T=0 Two Body Matrix Elements are Set Equal to Zero and Regge's 6j Symmetry Relations
The effects of setting all T = 0 two-body interaction matrix elements equal to a constant (or zero) in shell-model calculations are investigated. Despite the apparent severity of such a procedure, one gets fairly reasonable spectra. We find that when this is done in single-j-shell calculations, degeneracies appear: e.g. certain pairs of states in Sc are at the same excitation energies; likewise certain states in Ti, including those with I = 9 and 10. The above degeneracies involve the vanishing of certain 6j and 9j symbols. The symmetry relations of Regge are used to explain why these vanishings are not accidental. Thus, for these states, the actual deviations from degeneracy are good indicators of the effects of the T = 0 matrix elements. A further indicator of the effects of the T = 0 interaction in an even-even nucleus is to compare the energies of states with odd angular momentum with those that are even
- …
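The Regge relations invoked here can be checked numerically. Below is a self-contained Python implementation of the Wigner 6j symbol via Racah's single-sum formula, used to verify one Regge symmetry, {a b c; d e f} = {a, s-b, s-c; d, s-e, s-f} with s = (b+c+e+f)/2. The specific angular momenta are arbitrary test values, not the states discussed in the abstract:

```python
from fractions import Fraction as F
from math import factorial, sqrt

def _tri(a, b, c):
    # Triangle coefficient Delta(a, b, c); None if the triad is invalid.
    sides = (a + b - c, a - b + c, -a + b + c)
    if any(x < 0 or F(x).denominator != 1 for x in sides):
        return None
    num = 1
    for x in sides:
        num *= factorial(int(x))
    return F(num, factorial(int(a + b + c + 1)))

def six_j(a, b, c, d, e, f):
    # Wigner 6j symbol {a b c; d e f} via Racah's formula.
    pref = F(1)
    for triad in ((a, b, c), (a, e, f), (d, b, f), (d, e, c)):
        t = _tri(*triad)
        if t is None:
            return 0.0          # a violated triangle rule gives zero
        pref *= t
    k_min = int(max(a + b + c, a + e + f, d + b + f, d + e + c))
    k_max = int(min(a + b + d + e, b + c + e + f, a + c + d + f))
    total = F(0)
    for k in range(k_min, k_max + 1):
        denom = (factorial(k - int(a + b + c)) * factorial(k - int(a + e + f))
                 * factorial(k - int(d + b + f)) * factorial(k - int(d + e + c))
                 * factorial(int(a + b + d + e) - k)
                 * factorial(int(b + c + e + f) - k)
                 * factorial(int(a + c + d + f) - k))
        total += F((-1) ** k * factorial(k + 1), denom)
    return sqrt(pref) * float(total)

# One Regge symmetry: {a b c; d e f} = {a, s-b, s-c; d, s-e, s-f},
# where s = (b + c + e + f) / 2 (this may turn integer j's half-integer).
a, b, c, d, e, f = map(F, (2, 1, 1, 1, 1, 2))
s = (b + c + e + f) / 2
lhs = six_j(a, b, c, d, e, f)
rhs = six_j(a, s - b, s - c, d, s - e, s - f)
```

Because the Regge transformation maps one valid symbol onto another with the same value, an "accidental" zero of one symbol propagates to its whole Regge orbit, which is the sense in which the abstract's vanishings are not accidental.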
