Evaluation of the use of web technology by government of Sri Lanka to ensure food security for its citizens
Web technology is a key area of information and communication technology that can serve as a powerful tool for ensuring food security, one of the main issues in Sri Lanka. Web technology involves communicating and sharing resources across a worldwide network of computers. The main focus of food security is to ensure that all people have fair access to sufficient, good-quality food without endangering its future supply. In this context, websites play a vital role in achieving food security in Sri Lanka. In this case study, Sri Lankan government websites linked to food security were analysed to determine their impact on achieving food security goals through web technologies and how they are involved in ensuring food security in Sri Lanka. A further objective of this study is to make the Sri Lankan government aware of the present state of those websites in addressing food-security-related issues and of how modern web technologies could be used effectively and efficiently to address them. To this end, the relevant websites were checked against several criteria, and scores were used to assess their capability to address food security concerns. It was found that the emphasis these websites place on food security issues is not satisfactory. Further, the study showed that if these websites were improved, they would have a powerful impact on ensuring food security in Sri Lanka.
Comment: International Conference of Sabaragamuwa University of Sri Lanka 2015 (ICSUSL 2015)
Networks without wires: Human networks in the Information Society
It is the purpose of this paper to argue that the very significant skills we have brought as a profession to making the printed word uniformly and universally available have been overlooked. An electronic environment is being created which is inimical to scholarship and which is largely being designed by commercial and entertainment forces that are irrelevant to the scholarly process. Even if that environment is modified and the issues described are resolved, it will remain an essentially hostile commercial environment. The academy remains largely unaware of the dangers, particularly in the area of preservation of both primary and secondary research resources. Our electronic house is built on shifting sands, and a much more active approach is required from the profession to demonstrate that we can, like Sisyphus, reclimb the hill of bibliographic control and access, and use that most basic skill of library school courses, the Organisation of Knowledge, to define scholarly requirements for the emerging information society. It is in fact by ensuring that our human networks are active and effective, and by managing the flow of paper-based information effectively, that we can best serve our readers, earn their professional respect, and position ourselves to act as guides to, rather than bystanders at, the information revolution.
Adaptive Partitioning for Large-Scale Dynamic Graphs
In recent years, large-scale graph processing has gained increasing attention, with most recent systems placing particular emphasis on latency. One possible technique to improve runtime performance in a distributed graph processing system is to reduce network communication. The most notable way to achieve this goal is to partition the graph by minimizing the number of edges that connect vertices assigned to different machines, while keeping the load balanced. However, real-world graphs are highly dynamic, with vertices and edges being constantly added and removed. Carefully updating the partitioning of the graph to reflect these changes is necessary to avoid the introduction of an extensive number of cut edges, which would gradually worsen computation performance. In this paper we show that performance degradation in dynamic graph processing systems can be avoided by continuously adapting the graph partitions as the graph changes. We present a novel, highly scalable adaptive partitioning strategy, and show a number of refinements that make it work under the constraints of a large-scale distributed system. The partitioning strategy is based on iterative vertex migrations, relying only on local information. We have implemented the technique in a graph processing system, and we show through three real-world scenarios how adapting the graph partitioning reduces execution time by over 50% when compared to commonly used hash partitioning.
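To illustrate the kind of local, iterative vertex migration the abstract describes, a minimal greedy sketch is given below. This is not the authors' actual algorithm: the adjacency and partition representations, the single-pass structure, and the simple per-partition capacity rule are all assumptions made for illustration. Each vertex inspects only its neighbours (local information) and moves to the partition holding most of them, provided that partition has spare capacity.

```python
from collections import Counter

def migrate_step(adjacency, partition, capacity):
    """One greedy migration pass over a graph.

    adjacency: dict mapping each vertex to a list of neighbour vertices.
    partition: dict mapping each vertex to its current partition label.
    capacity:  maximum number of vertices allowed per partition.

    Each vertex moves to the partition that holds the plurality of its
    neighbours (reducing cut edges), but only if that partition is not full.
    """
    loads = Counter(partition.values())
    for v, neighbours in adjacency.items():
        if not neighbours:
            continue
        # Local information only: count where this vertex's neighbours live.
        counts = Counter(partition[u] for u in neighbours)
        target, _ = counts.most_common(1)[0]
        if target != partition[v] and loads[target] < capacity:
            loads[partition[v]] -= 1
            loads[target] += 1
            partition[v] = target
    return partition
```

In a real dynamic setting this pass would be repeated as vertices and edges arrive, converging the partitioning back toward a low edge cut after each batch of changes.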
Get yourself connected: conceptualising the role of digital technologies in Norwegian career guidance
This report outlines the role of digital technologies in the provision of career guidance. It was commissioned by the committee on career guidance which is advising the Norwegian Government following a review of the country's skills system by the OECD. In this report we argue that career guidance, and online career guidance in particular, can support the development of Norway's skills system to help meet the economic challenges that it faces.
Notes on Cloud computing principles
This letter provides a review of fundamental distributed systems and economic cloud computing principles. These principles are frequently deployed in their respective fields, but their inter-dependencies are often neglected. Given that cloud computing is first and foremost a new business model, a new model for selling computational resources, the understanding of these concepts is facilitated by treating them in unison. Here, we review some of the most important concepts and how they relate to each other.
Development of an Integrated Governance Strategy for the Voluntary and Community Sector
This report on governance provides a framework for thinking about how policy makers, funders, regulators and advisers can all work with Board members and staff to enhance the effectiveness of nonprofit organisations. It was commissioned by the Active Community Unit (ACU) of the Home Office, in parallel with other reviews designed to improve the capacity of the voluntary and community sector, at a time when the sector plays an increasingly important role in the delivery of services using public funds. That role has recently been investigated in two Government reports: the Cross Cutting Review carried out by the Treasury, and the Strategy Unit review of charities and nonprofits. Our report proposes actions of three types: some that can be taken immediately, some that require further discussion with key interests, and some that require integration with the other ACU reviews. Taken together they provide the starting point for an evolving strategy to improve governance across the sector. We recommend that the ACU chair a group charged with the responsibility for planning and implementing this. Our focus is on governance as 'the systems and processes concerned with ensuring the overall direction, supervision and accountability of an organisation'. This is often taken to mean the way that a Board, management committee or other governing body steers the overall development of an organisation, where day-to-day management is in the hands of staff or volunteers. Sometimes, of course, the committee and volunteers are the same. They, like all governing bodies, have to balance the interests of the organisation and those they are trying to serve, while being conscious of financial and legal responsibilities, and the requirements of funders and other supporters.
Quality Assessment of Linked Datasets using Probabilistic Approximation
With the increasing application of Linked Open Data, assessing the quality of datasets by computing quality metrics becomes an issue of crucial importance. For large and evolving datasets, an exact, deterministic computation of the quality metrics is too time-consuming or expensive. We employ probabilistic techniques such as Reservoir Sampling, Bloom Filters and Clustering Coefficient estimation for implementing a broad set of data quality metrics in an approximate but sufficiently accurate way. Our implementation is integrated in the comprehensive data quality assessment framework Luzzu. We evaluated its performance and accuracy on Linked Open Datasets of broad relevance.
Comment: 15 pages, 2 figures, to appear in ESWC 2015 proceedings
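The abstract names Reservoir Sampling as one of the probabilistic techniques used to approximate quality metrics over large streams of triples. A minimal, generic sketch of the classic Algorithm R is shown below; the function name and interface are illustrative and are not Luzzu's actual API.

```python
import random

def reservoir_sample(stream, k, rng=None):
    """Algorithm R: keep a uniform random sample of k items from a stream
    whose total length is unknown in advance, in a single pass.

    After processing i items, each item seen so far remains in the
    reservoir with probability k / i.
    """
    rng = rng or random.Random()
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            # Fill the reservoir with the first k items.
            reservoir.append(item)
        else:
            # Replace a random slot with probability k / (i + 1).
            j = rng.randint(0, i)  # inclusive bounds
            if j < k:
                reservoir[j] = item
    return reservoir
```

A quality metric such as a dereferenceability ratio could then be estimated on the sampled subset instead of the full dataset, trading exactness for a bounded, predictable computation cost.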