ITR/IM: Enabling the Creation and Use of GeoGrids for Next Generation Geospatial Information
The objective of this project is to advance science in information management, focusing in particular on geospatial information. It addresses the development of concepts, algorithms, and system architectures to enable users on a grid to query, analyze, and contribute to multivariate, quality-aware geospatial information. The approach consists of three complementary research areas: (1) establishing a statistical framework for assessing geospatial data quality; (2) developing uncertainty-based query processing capabilities; and (3) supporting the development of space- and accuracy-aware adaptive systems for geospatial datasets. The results of this project will support the extension of the concept of the computational grid to facilitate ubiquitous access, interaction, and contributions of quality-aware next generation geospatial information. By developing novel query processes as well as quality and similarity metrics, the project aims to enable the integration and use of large collections of dispersed information of varying quality and accuracy. This supports the evolution of a novel geocomputational paradigm, moving away from current standards-driven approaches to an inclusive, adaptive system, with example potential applications in mobile computing, bioinformatics, and geographic information systems. This experimental research is linked to educational activities in three different academic programs among the three participating sites. The outreach activities of this project include collaboration with U.S. federal agencies involved in geospatial data collection, an international partner (Brazil's National Institute for Space Research), and the organization of a 2-day workshop with the participation of U.S. and international experts.
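The uncertainty-based query processing described above can be illustrated with a minimal sketch. All names and thresholds here are hypothetical and not part of the project: the idea is simply that each record carries a positional-accuracy estimate, and a query filters and ranks candidate results by it.

```python
# Hypothetical sketch of a quality-aware geospatial query: each record carries
# a positional-accuracy estimate, and the query filters and ranks results by it.
from dataclasses import dataclass
from math import hypot

@dataclass
class GeoRecord:
    x: float           # easting (arbitrary projected units)
    y: float           # northing
    value: float       # observed attribute
    accuracy_m: float  # estimated positional error (smaller is better)

def quality_aware_query(records, x, y, radius, max_error_m):
    """Return records within `radius` of (x, y) whose positional
    accuracy is better than `max_error_m`, best quality first."""
    hits = [r for r in records
            if hypot(r.x - x, r.y - y) <= radius and r.accuracy_m <= max_error_m]
    return sorted(hits, key=lambda r: r.accuracy_m)

records = [
    GeoRecord(10.0, 10.0, 1.2, accuracy_m=2.0),
    GeoRecord(10.5, 10.5, 1.4, accuracy_m=25.0),  # low-quality source, filtered out
    GeoRecord(50.0, 50.0, 0.9, accuracy_m=1.0),   # outside the query radius
]
result = quality_aware_query(records, 10.0, 10.0, radius=5.0, max_error_m=10.0)
```

A real system would of course push such predicates into a distributed spatial index rather than scan a list, but the quality threshold as a first-class query parameter is the point being sketched.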
Harnessing data flow and modelling potentials for sustainable development
Tackling some of the global challenges relating to health, poverty, business and the environment is known to be heavily dependent on the flow and utilisation of data. However, while enhancements in data generation, storage, modelling, dissemination and the related integration of global economies and societies are fast transforming the way we live and interact, the resulting dynamic, globalised information society remains digitally divided. On the African continent, in particular, the division has resulted in a gap between knowledge generation and its transformation into tangible products and services, which Kirsop and Chan (2005) attribute to a broken information flow. This paper proposes some fundamental approaches for a sustainable transformation of data into knowledge for the purpose of improving people's quality of life. Its main strategy is based on a generic data sharing model providing access to data-utilising and data-generating entities in a multidisciplinary environment. It highlights the great potential of unsupervised and supervised modelling in tackling the typically predictive challenges we face. Using both simulated and real data, the paper demonstrates how some of the key parameters may be generated and embedded in models to enhance their predictive power and reliability.
Its main outcomes include a proposed implementation framework setting the scene for the creation of decision support systems capable of addressing the key issues in society. It is expected that a sustainable data flow will forge synergies between the private sector and academic and research institutions within and between countries. It is also expected that the paper's findings will help in the design and development of knowledge extraction from data in the wake of cloud computing and, hence, contribute towards the improvement of people's overall quality of life. To avoid high implementation costs, selected open source tools are recommended for developing and sustaining the system.
Key words: Cloud Computing, Data Mining, Digital Divide, Globalisation, Grid Computing, Information Society, KTP, Predictive Modelling, STI
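The supervised modelling on simulated data that the paper describes can be sketched as follows. This is an illustrative example only, assuming a simple linear model fitted with numpy; the paper's actual models, variables and parameters are not specified here.

```python
# Illustrative sketch (not the paper's actual models): fit a supervised
# predictive model to simulated data and check its predictive power via R^2.
import numpy as np

rng = np.random.default_rng(42)

# Simulate data: an outcome that depends linearly on two indicators plus noise.
n = 500
X = rng.normal(size=(n, 2))
true_coef = np.array([2.0, -1.0])
y = X @ true_coef + 0.5 + rng.normal(scale=0.3, size=n)

# Supervised model: ordinary least squares with an intercept term.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predictive power: coefficient of determination on the training data.
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

In practice the fit would be validated on held-out data, and an unsupervised step (e.g. clustering) might first be used to segment the population before fitting per-segment predictive models.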
A FRAMEWORK FOR BIOPROFILE ANALYSIS OVER GRID
An important trend in modern medicine is towards individualisation of healthcare to tailor
care to the needs of the individual. This makes it possible, for example, to personalise
diagnosis and treatment to improve outcome. However, the benefits of this can only be fully
realised if healthcare and ICT resources are exploited (e.g. to provide access to relevant data,
analysis algorithms, knowledge and expertise). Potentially, grid can play an important role
in this by allowing sharing of resources and expertise to improve the quality of care. The
integration of grid and the new concept of bioprofile represents a new topic in the healthgrid
for individualisation of healthcare.
A bioprofile represents a personal dynamic "fingerprint" that fuses together a person's
current and past bio-history, biopatterns and prognosis. It combines not just data, but also
analysis and predictions of future or likely susceptibility to disease, such as brain diseases
and cancer. The creation and use of bioprofiles require the support of a number of healthcare
and ICT technologies and techniques, such as medical imaging and electrophysiology and
related facilities, analysis tools, data storage and computation clusters. The need to share
clinical data, storage and computation resources between different bioprofile centres creates
not only local problems, but also global problems.
Existing ICT technologies are inappropriate for bioprofiling because of the difficulties in the
use and management of heterogeneous IT resources at different bioprofile centres. Grid as an
emerging resource sharing concept fulfils the needs of bioprofile in several aspects, including
discovery, access, monitoring and allocation of distributed bioprofile databases, computation
resources, bioprofile knowledge bases, etc. However, the challenge of how to integrate the
grid and bioprofile technologies together in order to offer an advanced distributed bioprofile
environment to support individualised healthcare remains.
The aim of this project is to develop a framework for one of the key meta-level bioprofile
applications: bioprofile analysis over grid to support individualised healthcare. Bioprofile
analysis is a critical part of bioprofiling (i.e. the creation, use and update of bioprofiles).
Analysis makes it possible, for example, to extract markers from data for diagnosis and to
assess an individual's health status. The framework provides a basis for a "grid-based" solution
to the challenge of "distributed bioprofile analysis" in bioprofiling. The main contributions
of the thesis are fourfold:
A. An architecture for bioprofile analysis over grid. The design of a suitable architecture
is fundamental to the development of any ICT system. The architecture creates a
means for the categorisation, determination and organisation of core grid components to
support the development and use of grid for bioprofile analysis;
B. A service model for bioprofile analysis over grid. The service model proposes a
service design principle, a service architecture for bioprofile analysis over grid, and
a distributed EEG analysis service model. The service design principle addresses
the main service design considerations behind the service model, in the aspects of
usability, flexibility, extensibility, reusability, etc. The service architecture identifies
the main categories of services and outlines an approach to organising services to
realise certain functionalities required by distributed bioprofile analysis applications.
The EEG analysis service model demonstrates the utilisation and development of
services to enable bioprofile analysis over grid;
C. Two grid test-beds and a practical implementation of EEG analysis over grid. The two
grid test-beds, the BIOPATTERN grid and PlymGRID, are built on existing
grid middleware tools. They provide essential experimental platforms for research in
bioprofiling over grid. The work here demonstrates how resources, grid middleware
and services can be utilised, organised and implemented to support distributed EEG
analysis for early detection of dementia. The distributed Electroencephalography
(EEG) analysis environment can be used to support a variety of research activities in
EEG analysis;
D. A scheme for organising multiple (heterogeneous) descriptions of individual grid
entities for knowledge representation of grid. The scheme solves the compatibility
and adaptability problems in managing heterogeneous descriptions (i.e. descriptions
using different languages and schemas/ontologies) for the collaborative representation of
a grid environment at different scales. It underpins the concept of bioprofile analysis
over grid in the aspect of knowledge-based global coordination between components
of bioprofile analysis over grid.
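One building block of the distributed EEG analysis named in contribution C, relative band power (a commonly used EEG marker), can be sketched as follows; the sampling rate, band edges and synthetic signal are illustrative assumptions, not taken from the thesis.

```python
# Hedged sketch of a single EEG analysis step: relative band power, a widely
# used spectral marker. Values below (sampling rate, bands, signal) are
# illustrative only, not the thesis's data or algorithms.
import numpy as np

def relative_band_power(signal, fs, band):
    """Fraction of total spectral power falling in `band` (lo, hi) Hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs < band[1])
    return spectrum[in_band].sum() / spectrum.sum()

fs = 256                       # Hz, a typical EEG sampling rate
t = np.arange(0, 4, 1.0 / fs)  # one 4-second epoch
# Synthetic EEG-like trace: dominant 10 Hz (alpha) plus weaker 4 Hz (theta).
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.sin(2 * np.pi * 4 * t)

alpha = relative_band_power(eeg, fs, (8.0, 13.0))
theta = relative_band_power(eeg, fs, (4.0, 8.0))
```

In a grid setting, a service wrapping a function like this would be discovered and invoked remotely against EEG data held at another bioprofile centre, with only the derived markers returned.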
Regulation and optimization methodology for smart grid in Chinese electric grid operators using quality function deployment, equilibrium theory, fractal theory and mathematical programming
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London. As the world is increasingly dependent on energy for economic and social development, and China's Total Net Electricity Generation (TNEG) has remained the highest since 1996 due to its rapid economic growth, it is important to closely examine the operations of China's electric power market, particularly the State Grid Corporation of China (SGCC), since it is the largest Electric Power Grid Operator (EPGO) in both China and the world. This research has addressed the problem and the urgent need for the development of a sound framework and methodology for the effective regulation and optimization of the operations and quality management of the SGCC. Based on the critical literature review, the aspects and steps of the solution to the problem have been progressively presented. Firstly, a Country Wealth (CW) curve has been developed to characterize electricity generation in terms of TNEG, with China's unique position identified. Further, the data have clearly indicated that China's TNEG has also been closely correlated with economic growth and carbon emissions during the 30-year period 1980-2010. Secondly, compared with the Equilibrium Energy Regulation Model, there are clear deficiencies and problems with the current regulation of China's electric power market. Improvements in the integration of regulation strategies and the formation of one single effective regulator have been identified and proposed. Thirdly, a uniform regulation structure and framework based on fractal theory and QFD (quality function deployment) has been developed to integrate existing and future electric power strategies, including the smart grid strategy and the sustainable development strategy. Through the use of QFD, the EPGO (SGCC) functions and operations can be prioritized and appropriately designed.
Finally, the QFD methodology has been extended to achieve the optimization of quality and service operations given the target cost of the business processes. The methodology can be applied to both business and technical processes of the EPGOs, since quality may be interpreted as a total quality involving the needs and expectations of various customers or stakeholders.
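The extended QFD optimisation, maximising prioritised quality characteristics under a target cost, can be sketched as a small 0/1 selection problem. The actions, importance weights and costs below are invented for illustration and are not the thesis's data.

```python
# Illustrative sketch (hypothetical numbers, not the thesis's data): choose a
# subset of quality-improvement actions that maximises total QFD importance
# weight without exceeding a target cost, a tiny 0/1 knapsack solved exactly.
from itertools import combinations

# (action, QFD importance weight, cost in arbitrary units)
actions = [
    ("outage response",      9, 40),
    ("metering accuracy",    7, 25),
    ("billing transparency", 5, 20),
    ("call-centre service",  3, 10),
]
target_cost = 60

best_weight, best_set = 0, ()
for r in range(len(actions) + 1):
    for subset in combinations(actions, r):
        cost = sum(a[2] for a in subset)
        weight = sum(a[1] for a in subset)
        if cost <= target_cost and weight > best_weight:
            best_weight, best_set = weight, subset
```

Exhaustive search is fine at this toy scale; a realistic EPGO model with many processes would use an integer-programming solver, but the structure (QFD weights as the objective, target cost as the constraint) is the same.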
Challenges to the Integration of Renewable Resources at High System Penetration
Successfully integrating renewable resources into the electric grid at penetration levels sufficient to meet a 33 percent Renewables Portfolio Standard for California presents diverse technical and organizational challenges. This report characterizes these challenges as problems of coordination in time and space, balancing electric power on a range of scales from microseconds to decades and from individual homes to hundreds of miles. Crucial research needs were identified related to grid operation, standards and procedures, system design and analysis, incentives, and public engagement at each scale of analysis. Performing this coordination on more refined scales of time and space, independent of any particular technology, is defined as a "smart grid." "Smart" coordination of the grid should mitigate technical difficulties associated with intermittent and distributed generation, support grid stability and reliability, and maximize benefits to California ratepayers by using the most economic technologies and design and operating approaches.
GMES-service for assessing and monitoring subsidence hazards in coastal lowland areas around Europe. SubCoast D3.5.1
This document is version two of the user requirements for SubCoast work package 3.5; it is
SubCoast deliverable 3.5.1. Work package 3.5 aims to provide a European integrated GIS
product on subsidence and relative sea level rise. The first step of this process was to
contact the European Environment Agency, as the main user, to discover their user
requirements.
This document presents these requirements, the outline methodology that will be used to carry
out the integration, and the datasets that will be used. In outline, the main user requirements
of the EEA are:
1. Gridded approach using an Inspire compliant grid
2. The grid would hold data on:
a. Likely rate of subsidence
b. RSLR
c. Impact (Vulnerability)
d. Certainty (confidence map)
e. Contribution of ground motion to RSLR
f. A measure of certainty in the data provided
g. Metadata
3. Spatial Coverage - Ideally entire coastline of all 37 member states
a. Spatial resolution - 1km
4. Provide a measure of the degree of contribution of ground motion to RSLR
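The gridded requirement above can be illustrated with a short sketch that snaps a point to its 1 km cell and builds an EEA-style cell code. The coordinate values are made up, and a projected metre-based CRS such as ETRS89-LAEA (the basis of the European reference grid) is assumed.

```python
# Sketch of the gridded approach: snap a point to a 1 km reference-grid cell
# and build an EEA/INSPIRE-style cell code ("1kmE<easting/1km>N<northing/1km>").
# Coordinates are assumed to be in a projected CRS with metre units; the
# numeric values used below are made up for illustration.

def cell_code_1km(easting_m: float, northing_m: float) -> str:
    """Identifier of the 1 km grid cell containing the point."""
    e = int(easting_m // 1000)   # whole kilometres east
    n = int(northing_m // 1000)  # whole kilometres north
    return f"1kmE{e}N{n}"

code = cell_code_1km(4_321_456.0, 3_210_987.0)
```

Each input dataset (subsidence rate, RSLR, vulnerability, confidence) would be aggregated per cell code, giving the common key on which the GIS integration joins the layers.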
The European integration will be based around a GIS methodology. Datasets will be
integrated and interpreted to provide information on the data values above, the main value
being a likelihood of subsidence. This product will initially be developed at its lowest level of
detail for the London area. BGS have a wealth of data for London; this will enable this less
detailed product to be validated and also enable the generation of a more detailed product
using the best data available. Once the methodology has been developed it will be pushed
out to other areas of the European coastline.
The initial input data that have been reviewed for their suitability for the European integration
are listed below. These are the datasets that have Europe-wide availability; it is expected
that more detailed datasets will be used in areas where they are available.
1. Terrafirma Data
2. One Geology
3. One Geology Europe
4. Population Density (Geoland2)
5. The Urban Atlas (Geoland2)
6. Elevation Data
a. SRTM
b. GDEM
c. GTOPO 30
d. NextMap Europe
7. MyOceans Sea Level Data
8. Storm Surge Locations
9. European Environment Agency
a. Elevation breakdown 1km
b. Corine Land Cover 2000 (CLC2000) coastline
c. Sediment Discharges
d. Shoreline
e. Maritime Boundaries
f. Hydrodynamics and Sea Level Rise
g. Geomorphology, Geology, Erosion Trends and Coastal Defence Works
h. Corine land cover 1990
i. Five metre elevation contour line
10. FutureCoast
DESIGNING ICT COMPETENCES – INTEGRATED SYLLABUSES OF WRITING COURSES (DESIGN AND DEVELOPMENT STUDY OF ENGLISH LANGUAGE EDUCATION STUDY PROGRAM SYLLABUSES)
The development of Information and Communication Technology (ICT) has greatly affected the field of education. ICT allows for higher quality lessons through collaboration with teachers in planning and preparing resources, and it also develops writing skills: spelling, grammar, punctuation, editing and re-drafting. ICT thus appears as a strategy to address learners' difficulties and to support students' growth in writing skills. Therefore, this research aims to design ICT competences-integrated writing syllabuses for the English Language Education Study Program (ELESP) by analyzing the existing writing syllabuses from five universities in Indonesia. The analysis drew on the ICT Competences proposed by UNESCO, the Digital Media Descriptors of the English Profiling Grid (EPG), and other ICT-based theories. The researcher employed Design and Development Research (DDR) as the research design and a qualitative approach as the research method. The stages of DDR used in this study were: conducting a needs analysis, stating the objectives, developing the preliminary syllabus, evaluating the preliminary syllabus, and revising the syllabus prototype. The data sources of this research are 14 existing syllabuses of writing courses of undergraduate English Language Education Study Programs. The results revealed that ICT competences are mostly integrated in the Teaching Method and Media components of the syllabuses, and that the highest level of ICT competence applied in the existing syllabuses is the Knowledge Deepening level, although the integration or infusion of ICT competences was mentioned both explicitly and implicitly in the syllabuses of writing subjects. The research then provides a procedure for ICT integration and the designs of the ICT competences-integrated writing syllabuses: Basic Writing, Professional Writing, Creative Writing, and Academic Writing. The proposed syllabuses follow a skill-based syllabus design.
Keywords: ICT Competences, Writing Skills, Syllabus Design, DDR, UNESCO ICT Framework