Preliminary validation of an indirect method for discharge evaluation of Pertuso Spring (Central Italy)
This paper deals with the results of the first year of the Environmental Monitoring Plan related to the catchment project of Pertuso Spring, which is going to be exploited to supply an important water network in the southern part of the Roma district. The study area is located in the Upper Valley of the Aniene River (Latium, Central Italy), in the outcrop of Triassic-Cenozoic carbonate rocks, and belongs to an important karst aquifer. Pertuso Spring is the main outlet of this karst aquifer and one of the most important water resources in the southeastern part of the Latium Region, used for drinking water, agriculture and hydroelectric supply. The karst aquifer feeding Pertuso Spring is an open hydrogeological system characterized by complex interactions and exchanges between groundwater and surface water, which influence the aquifer water budget. Evaluation of groundwater discharge from this karst spring is therefore difficult to carry out by direct measurement, because of insufficient knowledge of the water transfer processes in the hydrological cycle and of the geometry of the drainage conduits.
The aim of this paper is to assess the interactions between the karst aquifer feeding Pertuso Spring and the Aniene River, based on stream discharge measurements and geochemical tracer data, in order to validate an indirect method for karst spring discharge evaluation. The paper presents the results of applying magnesium as a reliable tracer of karst spring discharge. This indirect method combines surface water discharge measurements with Mg2+ concentration values determined both for groundwater from Pertuso Spring and for surface water samples collected upstream and downstream of the spring along the Aniene River. The application of magnesium as an environmental tracer provides a means to evaluate the discharge of Pertuso Spring, as it proved to be a marker of the mixing of surface water and groundwater. In addition, the magnesium ion concentration provides information for the identification of groundwater flow systems and of the main hydrogeochemical processes affecting the composition of water within karst aquifers.
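The abstract does not state the mass-balance formula explicitly. Assuming the standard two-component mixing model it implies (river water upstream of the spring plus spring groundwater, fully mixed downstream), the spring discharge can be back-calculated from the three Mg2+ concentrations; the function name and the numbers below are illustrative, not taken from the paper:

```python
def spring_discharge(q_up, c_up, c_down, c_spring):
    """Estimate spring discharge from a two-component Mg2+ mass balance.

    Assumes complete mixing:  q_up*c_up + q_spring*c_spring = (q_up + q_spring)*c_down

    q_up     : river discharge upstream of the spring (m3/s)
    c_up     : Mg2+ concentration upstream (mg/L)
    c_down   : Mg2+ concentration downstream of the confluence (mg/L)
    c_spring : Mg2+ concentration of the spring groundwater (mg/L)
    """
    if c_down == c_spring:
        raise ValueError("downstream and spring concentrations must differ")
    # Solving the mass balance for the unknown spring discharge:
    return q_up * (c_up - c_down) / (c_down - c_spring)

# Illustrative values only: 2 m3/s upstream at 5 mg/L, 10 mg/L downstream,
# spring groundwater at 12 mg/L -> implies 5 m3/s of spring inflow.
print(spring_discharge(2.0, 5.0, 10.0, 12.0))
```

The estimate is only as good as the mixing assumption: it requires that the downstream sample is taken after complete mixing and that no other significant Mg2+ source enters the reach.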
An information architecture for validating courseware
Courseware validation should locate Learning Objects that are inconsistent with the instructional design of the courseware. For validation to take place, it is necessary to identify the implicit and explicit information needed for validation. In this paper, we identify this information and formally define an information architecture that models courseware validation information explicitly. This promotes tool support for courseware validation and its interoperability with courseware specifications.
Synthesizing Short-Circuiting Validation of Data Structure Invariants
This paper presents incremental verification-validation, a novel approach for checking rich data structure invariants expressed as separation logic assertions. Incremental verification-validation combines static verification of separation properties with efficient, short-circuiting dynamic validation of arbitrarily rich data constraints. A data structure invariant checker is an inductive predicate in separation logic with an executable interpretation; a short-circuiting checker is an invariant checker that stops checking whenever it detects at run time that an assertion for some sub-structure has been fully proven statically. At a high level, our approach does two things: it statically proves the separation properties of data structure invariants using a static shape analysis in a standard way, but then leverages this proof in a novel manner to synthesize short-circuiting dynamic validation of the data properties. As a consequence, we enable dynamic validation to make up for imprecision in sound static analysis while simultaneously leveraging the static verification to make the remaining dynamic validation efficient. We show empirically that short-circuiting can yield asymptotic improvements in dynamic validation, with low overhead over no validation, even in cases where static verification is incomplete.
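As a toy illustration of the short-circuiting idea (not the paper's actual separation-logic machinery), a dynamic checker for a sortedness invariant on a linked list can stop as soon as it reaches a node whose suffix was already covered by a static proof; here the `proven` set stands in for the statically discharged assertions:

```python
class Node:
    """Singly linked list node."""
    def __init__(self, val, nxt=None):
        self.val = val
        self.next = nxt

def check_sorted(node, proven):
    """Dynamically validate that the list starting at `node` is sorted.

    `proven` is a set of node ids whose tail invariant is assumed to have
    been established statically; reaching such a node short-circuits the
    remaining dynamic checks.
    """
    while node is not None and node.next is not None:
        if id(node) in proven:
            # Static proof covers this suffix: stop checking (short-circuit).
            return True
        if node.val > node.next.val:
            return False  # invariant violated at run time
        node = node.next
    return True
```

The point of the technique is visible even in this sketch: when large suffixes are statically proven, the dynamic check inspects only a short prefix instead of the whole structure.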
What does validation of cases in electronic record databases mean? The potential contribution of free text
Electronic health records are increasingly used for research. The definition of cases or endpoints often relies on the use of coded diagnostic data, using a pre-selected group of codes. Validation of these cases, as 'true' cases of the disease, is crucial. There are, however, ambiguities in what is meant by validation in the context of electronic records. Validation usually implies comparison of a definition against a gold standard of diagnosis and the ability to identify false negatives ('true' cases which were not detected) as well as false positives (detected cases which did not have the condition). We argue that two separate concepts of validation are often conflated in existing studies: firstly, whether the GP thought the patient was suffering from a particular condition (which we term confirmation or internal validation), and secondly, whether the patient really had the condition (external validation). Few studies have the ability to detect false negatives who have not received a diagnostic code. Natural language processing is likely to open up the use of free text within the electronic record, which will facilitate both the validation of the coded diagnosis and the search for false negatives.
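External validation against a gold standard is conventionally summarized with measures built from the counts the abstract mentions: false positives reduce positive predictive value, while the hard-to-find false negatives reduce sensitivity. A minimal sketch with invented counts (the function name and numbers are illustrative, not from the paper):

```python
def validation_metrics(tp, fp, fn, tn):
    """Summarize case validation against a gold standard.

    tp: coded cases confirmed by the gold standard (true positives)
    fp: coded cases the gold standard rejects (false positives)
    fn: true cases that never received a diagnostic code (false negatives)
    tn: non-cases correctly without a code (true negatives)
    """
    ppv = tp / (tp + fp)           # positive predictive value
    sensitivity = tp / (tp + fn)   # fraction of true cases detected
    return ppv, sensitivity

# Illustrative: 90 confirmed of 100 coded cases, but 30 uncoded true cases.
ppv, sens = validation_metrics(tp=90, fp=10, fn=30, tn=870)
```

The example makes the abstract's argument concrete: a code list can have high PPV (here 0.90) while still missing a quarter of true cases, which is exactly what studies without free-text searching cannot detect.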
Field validation of a dusting cloth for mycological surveillance of surfaces
Efficient monitoring of surfaces for spores of filamentous fungi is essential for detecting minor contamination, even when air samples test negative for fungi. This study evaluates a sampling pad prepared from a dusting cloth, comparing it with Rodac contact plates and humidified swabs for detecting mycological contamination, and concludes that the new method is both superior and cheaper.
Systems validation: application to statistical programs
BACKGROUND: In 2003, the United States Food and Drug Administration (FDA) released a guidance document on the scope of "Part 11" enforcement. In this guidance document, the FDA indicates an expectation of a risk-based approach to determining which systems should undergo validation. Since statistical programs manage and manipulate raw data, their implementation should be critically reviewed to determine whether or not they should undergo validation. However, the concepts of validation are not often discussed in biostatistics curricula. DISCUSSION: This paper summarizes a "Plan, Do, Say" approach to validation that can be incorporated into statistical training so that biostatisticians can understand and implement validation principles in their research. SUMMARY: Validation is a process that requires dedicated attention. The process of validation can be easily understood in the context of the scientific method.
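The abstract does not spell out the "Plan, Do, Say" steps, but validation of a statistical program typically takes the same shape: plan the expected output by hand, do run the program against those inputs, and say (document) that the two agree. A hedged sketch, with a routine and values invented purely for illustration:

```python
def weighted_mean(values, weights):
    """Statistical routine under validation: weighted arithmetic mean."""
    total_w = sum(weights)
    if total_w == 0:
        raise ValueError("weights must not sum to zero")
    return sum(v * w for v, w in zip(values, weights)) / total_w

def validate_weighted_mean():
    """'Do' step: execute against inputs whose results were planned by hand.

    (1*1 + 3*1) / 2 = 2.0 and (2*3 + 4*1) / 4 = 2.5 were computed manually
    in the 'Plan' step; passing checks are then documented in the 'Say' step.
    """
    assert weighted_mean([1.0, 3.0], [1.0, 1.0]) == 2.0
    assert weighted_mean([2.0, 4.0], [3.0, 1.0]) == 2.5
    return True
```

Keeping the planned expectations and the executed checks in one documented artifact is what turns ordinary testing into auditable validation evidence.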
Validation of supply chain integration scales and the effect that size, level of vertical integration, industry and country have on their value
Academic literature would appear to indicate that supply chain integration is a major factor in explaining the performance of companies. The results are nevertheless inconclusive. Certain authors suggest that the wide range of results obtained is due, amongst other things, to the fact that the set of scales used is not precisely defined, and that no analysis of the measurement models or clear benchmarks has yet been published. In this paper, we present the theoretical definition of four supply chain integration scales (clients, suppliers, external and internal) and the convergent and discriminant validity of a measurement instrument based on a list of items taken from earlier papers. We also propose a benchmark to interpret the scales by percentiles, using a diverse international sample broken down into sub-samples based on sector, type of company, size of company and degree of vertical integration.
Ministerio de Ciencia e Innovación DPI2010-18243; Ministerio de Educación y Ciencia DPI2006-0553
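Assessing a measurement scale usually begins with internal-consistency reliability before convergent and discriminant validity are examined; the abstract does not say which statistics the authors used, so as one common example, Cronbach's alpha for a multi-item scale can be sketched as:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    `items` is a list of k item-score lists, each with one score per
    respondent: alpha = k/(k-1) * (1 - sum(item variances)/variance of totals).
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        # Sample variance (n-1 denominator).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Per-respondent total score across all items of the scale.
    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

With perfectly correlated items the statistic reaches its maximum of 1.0; values above roughly 0.7 are conventionally read as acceptable reliability, after which convergent and discriminant validity are tested separately.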
Model-driven transformation and validation of adaptive educational hypermedia using CAVIAr
Authoring of Adaptive Educational Hypermedia is a complex activity requiring the combination of a range of design and validation techniques. We demonstrate how Adaptive Educational Hypermedia can be transformed into CAVIAr courseware validation models, allowing for its validation. The model-based representation and analysis of different concerns, and model-based mappings and transformations, are key contributors to this integrated solution. We illustrate the benefits of Model Driven Engineering methodologies that allow for interoperability between CAVIAr and a well-known Adaptive Educational Hypermedia framework. By allowing for the validation of Adaptive Educational Hypermedia, the course creator limits the risk of pedagogical problems when migrating to Adaptive Educational Hypermedia from static courseware.
