16 research outputs found
Options for a new integrated natural resources monitoring framework for Wales. Phase 1 project report
Healthy natural resources underpin significant economic sectors in Wales, including agriculture, fisheries, tourism and forestry, and they make a significant contribution across Cabinet policies, including the health and well-being agenda. In order to develop policies that build social, economic and environmental resilience, and to evaluate policy implementation, a robust natural resources monitoring framework is required. Current monitoring activities are of varying quality, not sufficiently aligned to the new legislative and policy landscape, disjointed and, when considered as a whole, potentially not as cost-effective as they could be. This project was tasked with identifying options and developing recommendations for an integrated natural resources monitoring framework for Wales, reflecting the ambitions and integrating principles of the Environment Act and the Well-being of Future Generations Act. The monitoring community, the Welsh Government and Natural Resources Wales Core Evidence Group, the project team, stakeholders and partners have agreed on a set of recommendations.
Novel approaches to plant pest risk assessment
Pest risk assessment is an essential yet problematic stage in pest risk analysis (PRA) that concerns the likelihood and consequences of pest introduction. The aim of this study was to develop methodologies for risk assessment and to explore different approaches that could lead to the development of new methods for practical PRA, in line with the requirement of "scientific justification" by the World Trade Organisation and the Food and Agriculture Organisation of the United Nations.
Current international practices were discussed and research reviewed on qualitative and quantitative approaches to risk assessment. It was proposed that risk assessment be divided into two steps: Pest risk identification (PRI) and pest risk evaluation (PRE). Mind Mapping was a valuable tool for PRI that reduced ambiguity and increased transparency. Approaches to PRE were proposed that facilitated the scoring and weighting of risk factors, and the subsequent combining of risk scores. Several methods were developed to incorporate weighting into PRA, which included subjectively assigned weighting and Delphi technique-derived weighting. Metrics for combining risk scores into an overall risk value were also explored, compared and evaluated.
Correlation and interaction between risk factors were analysed, which revealed that some risk factors were highly correlated and some were relatively independent, which meant there was some information redundancy, and therefore simplification of risk assessment was possible. Cluster analysis was applied to risk factor scores and different clusters of risk factors were identified: some more appropriate for preliminary assessment; some for determining the level of risk; and some could be eliminated.
A method to apply Principal Components Analysis (PCA) to derive weighting for individual risk factors was developed. PCA could be applied to historical data of pest introductions, previous PRA cases, or expert opinion. Genetic algorithms implemented in the software BEAGLE, were applied to PRA data. The rules obtained could distinguish high-risk situations with high accuracy, which was useful in predicting the risk of an organism by using a simplified set of conditions.
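As a rough illustration of the PCA-based weighting idea described above, the sketch below derives risk-factor weights from the loadings of the first principal component of a matrix of risk-factor scores. This is a minimal numpy sketch under stated assumptions, not the study's actual implementation; the score data are invented for illustration.

```python
import numpy as np

def pca_weights(scores):
    """Derive risk-factor weights from the loadings of the first
    principal component of a (cases x factors) score matrix."""
    X = scores - scores.mean(axis=0)           # centre each factor
    cov = np.cov(X, rowvar=False)              # factor covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigh: eigenvalues ascending
    first_pc = np.abs(eigvecs[:, -1])          # loadings of the top component
    return first_pc / first_pc.sum()           # normalise weights to sum to 1

# Hypothetical scores for 6 PRA cases on 4 risk factors (0-3 scale)
scores = np.array([
    [3, 2, 3, 1],
    [2, 2, 3, 0],
    [1, 0, 1, 2],
    [3, 3, 2, 1],
    [0, 1, 0, 3],
    [2, 2, 2, 1],
], dtype=float)

w = pca_weights(scores)
overall_risk = scores @ w    # weighted overall risk value per case
```

In practice the input matrix could come from historical introduction data, previous PRA cases or expert-elicited scores, as the abstract suggests; highly correlated factors load together, so the leading components also indicate where factors are redundant.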
The results showed that weightings and rules differed for different taxonomic groups; it was therefore implausible to develop a generic scheme in this way. However, it may be possible to develop patterns based on taxonomy. The results of applying several different techniques all suggested that, by grouping risk factors for different purposes, risk assessment could be simplified without compromising rigor, because a) some factors were redundant; b) some factors were more important than others; and c) high-risk situations could be predicted with a few key factors.
Software asset management processes and model
The industry must now focus on software assets in order to improve the management of purchased software and its associated licenses: over the years, organizations have purchased a significant amount of commercial software, and they now have to manage the related costs while ensuring that the licenses' terms and conditions are respected.
Until now, the industry has been offering incomplete solutions to the management of software assets, using different approaches, terminologies and tools with varying functional scopes. The industry recognizes the need to improve Software Asset Management (SAM) but does not agree on the means to do so. This thesis proposes to start with a common industry SAM definition. To help organizations use the processes that constitute the SAM definition, a descriptive analysis of the processes, an assessment method and a graphical representation are provided to facilitate its use in the industry. Furthermore, to ensure the set of processes reflects the views and needs of the industry, the author actively participated in the writing of the ISO standard on SAM; the panel of experts contributing to ISO also provided a means to validate several of the SAM topics discussed in this thesis.
The research objectives are to:
1. Actively contribute to the development and to the content of the ISO international standard on SAM (ISO/IEC 19770-1).
2. Capture, identify and analyze elements that are relevant to SAM, including those that would not make it into the final version of the international standard.
3. Provide an analysis of the international SAM standard with respect to the 27 processes within ISO/IEC 19770-1.
4. Develop an exploratory assessment method to allow organizations to determine their gaps against ISO/IEC 19770-1.
The approach selected was to align the research work of this thesis with the then new ISO working group created in 2002 to address issues related to the management of software assets and to contribute actively to the development of an international standard on SAM processes, that is: ISO/IEC 19770-1.
The results of this thesis are:
1. A common set of processes to describe the scope and content of SAM. This allows the industry to have a common point of reference and vocabulary when referring to SAM.
2. Through a literature review covering both the industry and the research community, it was possible to highlight the divergence of scope and terminology among software manufacturers and the lack of agreement on what a SAM manager is. This thesis addresses these issues by identifying the full set of SAM processes.
3. The thesis analyses the standard used as the basis of reference for the assessment, that is: the ISO/IEC 19770-1 standard on SAM. The description and analysis of this standard allows for a better understanding of the purpose of each process and the interactions across existing standards such as ISO/IEC 20000 on Service Management.
4. The thesis also proposes a method to assess and assign a maturity level to each of the processes of the ISO/IEC 19770-1 standard; the ISO/IEC 15504 standard is used to perform the assessment.
5. Organizations recognize that poor management of software assets puts the organization at risk. However, organizations did not have any common way of assessing these risks. With the use of the ISO/IEC 19770-1 standard and the assessment method, organizations can now identify the maturity levels of control points and assess their impact on the organization.
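The gap-assessment idea in result 4 above can be sketched as a simple comparison of assessed capability levels against a target level. This is an illustrative sketch only: the process names below are hypothetical placeholders, not the actual 27 processes of ISO/IEC 19770-1, and the 0-5 scale merely follows the general shape of ISO/IEC 15504 capability levels.

```python
# Hypothetical assessment results: process name -> observed capability
# level on a 0-5 scale (0 = incomplete .. 5 = optimizing), loosely
# modelled on ISO/IEC 15504. Names and levels are invented.
assessed = {
    "Software asset identification": 2,
    "Software asset inventory": 1,
    "License compliance": 3,
    "Acquisition process": 2,
}
target = 3   # capability level the organization aims for

# Gap = how many levels each process falls short of the target
gaps = {proc: target - level
        for proc, level in assessed.items()
        if level < target}

# Report the largest gaps first
for proc, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{proc}: {gap} level(s) below target")
```

A real assessment would of course rate all 27 standard processes against the evidence requirements of each capability level, but the output shape, a per-process gap list, is the same.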
Semantic discovery and reuse of business process patterns
Patterns currently play an important role in modern information systems (IS) development, and their use has mainly been restricted to the design and implementation phases of the development lifecycle. Given the increasing significance of business modelling in IS development, patterns have the potential to provide a viable solution for promoting reusability of recurrent generalized models in the very early stages of development. As a statement of research-in-progress, this paper focuses on business process patterns and proposes an initial methodological framework for the discovery and reuse of business process patterns within the IS development lifecycle. The framework borrows ideas from the domain engineering literature and proposes the use of semantics to drive both the discovery of patterns and their reuse.
Ultra high-density hybrid pixel sensors for the detection of charged particles
The abstract is in the attachment.
Statistical and Graph-Based Signal Processing: Fundamental Results and Application to Cardiac Electrophysiology
The goal of cardiac electrophysiology is to obtain information about the mechanism, function and performance of the electrical activities of the heart, to identify deviations from normal patterns and to design treatments. By offering better insight into the comprehension and management of cardiac arrhythmias, signal processing can help the physician enhance treatment strategies, in particular in the case of atrial fibrillation (AF), a very common atrial arrhythmia associated with significant morbidities, such as increased risk of mortality, heart failure and thromboembolic events. Catheter ablation of AF is a therapeutic technique that uses radiofrequency energy to destroy atrial tissue involved in sustaining the arrhythmia, typically aiming at the electrical disconnection of the pulmonary vein triggers. However, the recurrence rate is still very high, showing that the very complex and heterogeneous nature of AF remains a challenging problem.
Leveraging the tools of non-stationary and statistical signal processing, the first part of our work has a twofold focus. First, we compare the performance of two different ablation technologies, based on contact-force sensing or remote magnetic control, using signal-based criteria as surrogates for lesion assessment; we also investigate the role of ablation parameters in lesion formation using late gadolinium-enhanced magnetic resonance imaging. Second, we hypothesize that in human atria the frequency content of the bipolar signal is directly related to the local conduction velocity (CV), a key parameter characterizing substrate abnormality and influencing atrial arrhythmias. Comparing the degree of spectral compression among signals recorded at different points of the endocardial surface in response to a decreasing pacing rate, our experimental data demonstrate a significant correlation between CV and the corresponding spectral centroids.
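The spectral centroid mentioned above has a standard definition: the power-weighted mean frequency of the signal's spectrum. The sketch below computes it for a synthetic test signal; this is the generic formula, not necessarily the dissertation's exact estimation pipeline, and the signal is artificial.

```python
import numpy as np

def spectral_centroid(signal, fs):
    """Spectral centroid (Hz) of a real-valued signal: the
    power-weighted mean of its frequency components."""
    power = np.abs(np.fft.rfft(signal)) ** 2          # power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)  # bin frequencies
    return np.sum(freqs * power) / np.sum(power)

# Synthetic bipolar-like signal: a 4 Hz tone with a weaker 12 Hz component
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
x = np.sin(2 * np.pi * 4 * t) + 0.3 * np.sin(2 * np.pi * 12 * t)

c = spectral_centroid(x, fs)   # lies between 4 and 12 Hz, close to 4
```

Spectral compression at slower pacing rates would show up as a downward shift of this centroid, which is what makes it a usable surrogate for local conduction velocity.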
However, the complex spatio-temporal propagation patterns characterizing AF have spurred the need for new signal acquisition and processing methods. Multi-electrode catheters allow whole-chamber panoramic mapping of electrical activity, but produce an amount of data which needs to be preprocessed and analyzed to provide clinically relevant support to the physician. Graph signal processing has shown its potential in a variety of applications involving high-dimensional data on irregular domains and complex networks. Nevertheless, though state-of-the-art graph-based methods have been successful for many tasks, so far they have predominantly ignored the time dimension of data.
To address this shortcoming, in the second part of this dissertation we put forth a Time-Vertex Signal Processing Framework, as a particular case of multi-dimensional graph signal processing. Linking time-domain signal processing techniques with the tools of graph signal processing (GSP), the Time-Vertex Signal Processing Framework facilitates the analysis of graph-structured data which also evolve in time. We motivate our framework by leveraging the notion of partial differential equations on graphs. We introduce joint operators, such as time-vertex localization, and we present a novel approach that significantly improves the accuracy of fast joint filtering. We also illustrate how to build time-vertex dictionaries, providing conditions for efficient invertibility and examples of constructions.
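The core object of time-vertex signal processing, in its common formulation, is the joint Fourier transform: a graph Fourier transform (Laplacian eigenbasis) along the vertex dimension combined with an ordinary DFT along the time dimension. The sketch below illustrates this on a toy graph; it follows the standard construction from the time-vertex literature and is not necessarily the dissertation's exact operator set.

```python
import numpy as np

def joint_fourier(X, L):
    """Joint time-vertex Fourier transform of X (n_vertices x n_time):
    graph Fourier transform along vertices, DFT along time."""
    _, U = np.linalg.eigh(L)           # Laplacian eigenvectors = GFT basis
    X_hat = U.T @ X                    # GFT along the vertex dimension
    return np.fft.fft(X_hat, axis=1)   # DFT along the time dimension

# Toy example: a 3-vertex path graph observed over 8 time steps
L = np.array([[ 1, -1,  0],
              [-1,  2, -1],
              [ 0, -1,  1]], dtype=float)   # path-graph Laplacian
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 8))

X_jft = joint_fourier(X, L)

# Invertibility check: undo the DFT, then the GFT
_, U = np.linalg.eigh(L)
X_rec = U @ np.fft.ifft(X_jft, axis=1).real
```

Because the Laplacian eigenbasis is orthonormal and the DFT is invertible, the joint transform loses nothing, which is the property that joint filtering and time-vertex dictionaries build on.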
The experimental results on a variety of datasets suggest that the proposed tools can bring significant benefits in various signal processing and learning tasks involving time series on graphs. We close the gap between the two parts by illustrating the application of graph and time-vertex signal processing to the challenging case of multi-channel intracardiac signals.
Third International Symposium on Space Mission Operations and Ground Data Systems, part 2
Under the theme of 'Opportunities in Ground Data Systems for High Efficiency Operations of Space Missions,' the SpaceOps '94 symposium included presentations of more than 150 technical papers spanning five topic areas: Mission Management, Operations, Data Management, System Development, and Systems Engineering. The symposium papers focus on improvements in the efficiency, effectiveness, and quality of data acquisition, ground systems, and mission operations. New technology, methods, and human systems are discussed. Accomplishments are also reported in the application of information systems to improve data retrieval, reporting, and archiving; the management of human factors; the use of telescience and teleoperations; and the design and implementation of logistics support for mission operations. This volume covers expert systems, systems development tools and approaches, and systems engineering issues
Recent Perspectives in Pyrolysis Research
Recent Perspectives in Pyrolysis Research presents and discusses different routes of pyrolytic conversion. It contains comprehensive reports and studies on the use of pyrolysis for energy and materials production and for waste management.
The engineering of an object-oriented software development methodology
EThOS - Electronic Theses Online Service, United Kingdom.