Auxiliary Variables in TLA+
Auxiliary variables are often needed for verifying that an implementation is correct with respect to a higher-level specification. They augment the formal description of the implementation without changing its semantics, that is, the set of behaviors that it describes. This paper explains rules for adding history, prophecy, and stuttering variables to TLA+ specifications, ensuring that the augmented specification is equivalent to the original one. The rules are explained with toy examples, and they are used to verify the correctness of a simplified version of a snapshot algorithm due to Afek et al.
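The key requirement, that an auxiliary variable must not change the set of behaviors over the original variables, can be sketched in plain Python (not TLA+; the state machine below is an invented toy, not the paper's snapshot algorithm). A history variable records the past, and projecting it away must recover exactly the original behaviors:

```python
# Toy illustration (plain Python, not TLA+) of a history variable:
# augmenting a spec with a variable recording the past must not change
# the set of behaviors over the original variables.

def behaviors(init, step, length):
    """Enumerate all state sequences of the given bounded length."""
    runs = [[s] for s in init]
    for _ in range(length - 1):
        runs = [run + [t] for run in runs for t in step(run[-1])]
    return runs

# Original spec: x starts at 0 and may either stay or increment.
init = [0]
step = lambda x: [x, x + 1]

# Augmented spec: pairs (x, h), where h is the history of past x values.
init_h = [(0, ())]
step_h = lambda s: [(x2, s[1] + (s[0],)) for x2 in step(s[0])]

orig = behaviors(init, step, 3)
aug = behaviors(init_h, step_h, 3)

# Hiding the history variable recovers exactly the original behaviors.
projected = [[x for (x, h) in run] for run in aug]
assert sorted(projected) == sorted(orig)
```

Prophecy and stuttering variables obey analogous projection conditions, but their soundness rules (which the paper develops) are subtler than this bounded check suggests.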
Spatial mapping of soil chemical properties using multivariate geostatistics: a study of cropland soils in eastern Croatia
The spatial variability of soil chemical properties is affected by soil-forming factors and human activities. Understanding their spatial variability will improve agricultural production, reduce environmental problems (e.g., soil pollution, off-site effects), and help achieve sustainable agroecosystems. The main objective was to study the spatial variability of pH, soil organic matter, available phosphorus, and available potassium using univariate and multivariate methods in cropland fields in eastern Croatia. For the study, 169 soil samples (0-30 cm) were collected in a 911 ha study area. The results showed that the soils had slightly acidic pH, adequate available phosphorus and potassium values for crop production, and low soil organic matter concentration. The variability was high in available phosphorus and low in pH. The nugget/sill ratios for soil pH, soil organic matter, available phosphorus, and available potassium were 0.00, 2.79, 18.68, and 22.08, respectively. Auxiliary variables increased the accuracy of the predictions. Soil organic matter levels were below the recommended level, which is very likely an anthropogenic effect, even though intrinsic processes also influence soil organic matter. The heterogeneous distribution of phosphorus and potassium highlighted the need for fertilization in some areas. For the sustainability of agroecosystems, adaptable site-specific soil management strategies need to be implemented.
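The nugget/sill ratio (expressed as a percentage) is commonly used to grade the strength of spatial dependence. A minimal sketch, assuming the widely used Cambardella-style thresholds (below 25% strong, 25-75% moderate, above 75% weak), which the abstract does not state explicitly:

```python
def dependence_class(nugget_sill_pct):
    """Classify spatial dependence from a nugget/sill ratio in percent.

    Thresholds follow a common convention (Cambardella-style) and are an
    assumption here; the abstract does not state which scheme was used.
    """
    if nugget_sill_pct < 25:
        return "strong"
    if nugget_sill_pct <= 75:
        return "moderate"
    return "weak"

# Under this convention, all four ratios reported in the abstract
# (0.00, 2.79, 18.68, 22.08) indicate strong spatial dependence.
for r in (0.00, 2.79, 18.68, 22.08):
    assert dependence_class(r) == "strong"
```

Strong spatial dependence (a small nugget relative to the sill) means most of the variance is spatially structured, which is what makes kriging with auxiliary variables effective here.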
Computer-Assisted Program Reasoning Based on a Relational Semantics of Programs
We present an approach to program reasoning which inserts between a program
and its verification conditions an additional layer, the denotation of the
program expressed in a declarative form. The program is first translated into
its denotation from which subsequently the verification conditions are
generated. However, even before (and independently of) any verification
attempt, one may investigate the denotation itself to get insight into the
"semantic essence" of the program, in particular to see whether the denotation
indeed gives reason to believe that the program has the expected behavior.
Errors in the program and in the meta-information may thus be detected and
fixed prior to actually performing the formal verification. More concretely,
following the relational approach to program semantics, we model the effect of
a program as a binary relation on program states. A formal calculus is devised
to derive from a program a logic formula that describes this relation and is
subject to inspection and manipulation. We have implemented this idea in a
comprehensive form in the RISC ProgramExplorer, a new program reasoning
environment for educational purposes which encompasses the previously developed
RISC ProofNavigator as an interactive proving assistant.
Comment: In Proceedings THedu'11, arXiv:1202.453
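The core idea, modeling a program's effect as a binary relation on states, with sequential composition as relational composition, can be sketched explicitly over a finite state space (a toy illustration, not the RISC ProgramExplorer's calculus):

```python
# Toy sketch: the effect of a program as a binary relation on states,
# with sequencing interpreted as relational composition.
# States are (x, y) pairs over a small finite domain.

def assign_x_plus_1(states):
    """Denotation of `x := x + 1`."""
    return {((x, y), (x + 1, y)) for (x, y) in states}

def swap(states):
    """Denotation of a simultaneous swap of x and y."""
    return {((x, y), (y, x)) for (x, y) in states}

def compose(r1, r2):
    """Relational composition: the effect of running r1, then r2."""
    return {(s, u) for (s, t1) in r1 for (t2, u) in r2 if t1 == t2}

states = {(x, y) for x in range(3) for y in range(3)}
prog = compose(assign_x_plus_1(states), swap(states))

# The denotation lets us inspect the program's effect directly,
# e.g. starting from (0, 2) the program ends in (2, 1):
assert ((0, 2), (2, 1)) in prog
```

Inspecting such a relation before attempting verification is exactly the kind of "semantic essence" check the abstract describes, here made concrete by explicit enumeration rather than by a symbolic formula.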
A multi-paradigm language for reactive synthesis
This paper proposes a language for describing reactive synthesis problems
that integrates imperative and declarative elements. The semantics is defined
in terms of two-player turn-based infinite games with full information.
Currently, synthesis tools accept linear temporal logic (LTL) as input, but
this description is less structured and does not facilitate the expression of
sequential constraints. This motivates the use of a structured programming
language to specify synthesis problems. Transition systems and guarded commands
serve as imperative constructs, expressed in a syntax based on that of the
modeling language Promela. The syntax allows defining which player controls
data and control flow, and separating a program into assumptions and
guarantees. These notions are necessary for input to game solvers. The
integration of imperative and declarative paradigms allows using the paradigm
that is most appropriate for expressing each requirement. The declarative part
is expressed in the LTL fragment of generalized reactivity(1), which admits
efficient synthesis algorithms, extended with past LTL. The implementation
translates Promela to input for the Slugs synthesizer and is written in Python.
The AMBA AHB bus case study is revisited and synthesized efficiently,
identifying the need to reorder binary decision diagrams during strategy
construction, in order to prevent the exponential blowup observed in previous
work.
Comment: In Proceedings SYNT 2015, arXiv:1602.0078
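The game semantics underlying such synthesis tools rests on attractor (controllable-predecessor) fixed points. A minimal sketch of a reachability-game solver over an invented explicit-state game (real tools such as Slugs operate symbolically with BDDs):

```python
# Toy sketch of the two-player turn-based game semantics: compute the
# set of states from which player 0 can force a visit to a target set,
# via the classical attractor fixed point. The game below is invented.

def attractor(states, owner, edges, target):
    """States from which player 0 can force reaching `target`."""
    attr = set(target)
    changed = True
    while changed:
        changed = False
        for s in states:
            if s in attr:
                continue
            succs = edges.get(s, [])
            if owner[s] == 0:
                # Player 0 picks the move: one good successor suffices.
                wins = any(t in attr for t in succs)
            else:
                # Player 1 picks the move: all successors must be good.
                wins = bool(succs) and all(t in attr for t in succs)
            if wins:
                attr.add(s)
                changed = True
    return attr

states = {"a", "b", "c", "d", "e"}
owner = {"a": 0, "b": 1, "c": 0, "d": 1, "e": 1}
edges = {"a": ["b", "c"], "b": ["e", "d"], "c": ["d"],
         "d": ["d"], "e": ["e"]}
win = attractor(states, owner, edges, {"d"})
assert win == {"a", "c", "d"}  # player 1 can escape via b -> e
```

GR(1) synthesis nests such fixed points (three levels deep), which is what makes it efficient compared to general LTL synthesis; the symbolic analogue of the loop above is a BDD image computation.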
Hiding variables when decomposing specifications into GR(1) contracts
We propose a method for eliminating variables from component specifications during the decomposition of GR(1) properties into contracts. The variables that can be eliminated are identified by parameterizing the communication architecture to investigate the dependence of realizability on the availability of information. We prove that the selected variables can be hidden from other components, while still expressing the resulting specification as a game with full information with respect to the remaining variables. The values of other variables need not be known all the time, so we hide them for part of the time, thus reducing the amount of information that needs to be communicated between components. We improve on our previous results on algorithmic decomposition of GR(1) properties, and prove existence of decompositions in the full information case. We use semantic methods of computation based on binary decision diagrams. To recover the constructed specifications so that humans can read them, we implement exact symbolic minimal covering over the lattice of integer orthotopes, thus deriving minimal formulae in disjunctive normal form over integer variable intervals.
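"Hiding" a variable amounts to existentially quantifying it out of the specification. An explicit-state sketch of this projection (the spec and variable names are invented; BDD-based tools perform the same operation symbolically):

```python
# Toy sketch of hiding a variable: existential projection of a boolean
# relation, the explicit-state analogue of BDD quantification (exists y. R).
from itertools import product

def models(f, varnames):
    """All satisfying assignments of boolean function f over varnames."""
    return {v for v in product((False, True), repeat=len(varnames))
            if f(dict(zip(varnames, v)))}

def hide(assignments, varnames, hidden):
    """Project away `hidden`, keeping only the visible variables."""
    keep = [i for i, n in enumerate(varnames) if n != hidden]
    return {tuple(v[i] for i in keep) for v in assignments}

# An invented component guarantee mentioning an internal variable y:
spec = lambda a: (a["x"] and a["y"]) or (a["z"] and not a["y"])
vis = hide(models(spec, ["x", "y", "z"]), ["x", "y", "z"], "y")

# Over the visible variables (x, z), the hidden spec is exactly x or z.
assert vis == models(lambda a: a["x"] or a["z"], ["x", "z"])
```

The paper's contribution goes beyond this one-shot projection: it hides variables for part of the time while preserving realizability, and then recovers readable formulae via minimal covering over integer orthotopes.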
Fast Reconstruction and Data Scouting
Data scouting, introduced by CMS in 2011, is the use of specialized data
streams based on reduced event content, enabling LHC experiments to record
unprecedented numbers of proton-proton collision events that would otherwise be
rejected by the usual filters. These streams were created to maintain
sensitivity to new light resonances decaying to jets or muons, while requiring
minimal online and offline resources, and taking advantage of the fast and
accurate online reconstruction algorithms of the high-level trigger. The
viability of this technique was demonstrated by CMS in 2012, when 18.8
fb⁻¹ of collision data at √s = 8 TeV were collected and analyzed.
For LHC Run 2, CMS, ATLAS, and LHCb implemented or expanded similar
reduced-content data streams, promoting the concept to an essential and
flexible discovery tool for the LHC.
Comment: 8 pages, 5 figures; submitted to proceedings of Connecting the Dots 201