Data DNA: The Next Generation of Statistical Metadata
Describes the components of a complete statistical metadata system and suggests ways to create and structure metadata for better access to, and understanding of, data sets by diverse users.
Studying How E-Markets Evaluation Can Enhance Trust in Virtual Business Communities
One of the major drawbacks of conducting business online is the raised level of risk associated with business transactions. Potential business partners usually have limited information about each other's reliability or product/service quality before an online transaction. In this paper, we focus on the problem of selecting a trustworthy electronic market (e-market) in which to perform business transactions. In particular, we examine how the decision of selecting an appropriate e-market can be facilitated by an e-market recommendation algorithm. For this purpose, a metadata model for collecting and storing e-market evaluations from the members of a virtual business community in a reusable and interoperable manner is introduced. Then, an e-market recommendation algorithm that can synthesize existing e-market evaluations stored using the metadata model is designed. Finally, a scenario of how the presented e-market recommendation algorithm can support a virtual agribusiness community of the organic agriculture sector is discussed.
Keywords: e-market, metadata, recommender system, virtual community, Institutional and Behavioral Economics, Marketing
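The abstract does not give the recommendation algorithm itself. As a minimal illustrative sketch, synthesizing community evaluations could be as simple as averaging members' ratings per e-market and recommending the top scorer; the tuple format, member names, and e-market names below are hypothetical stand-ins for the paper's metadata model:

```python
from collections import defaultdict

def recommend_emarket(evaluations):
    """Rank e-markets by the average rating community members
    have assigned to them, and return the best one.

    `evaluations` is a list of (member, emarket, rating) tuples,
    a simplified stand-in for the paper's metadata model."""
    totals = defaultdict(lambda: [0.0, 0])  # emarket -> [rating sum, count]
    for _member, emarket, rating in evaluations:
        totals[emarket][0] += rating
        totals[emarket][1] += 1
    scores = {m: s / n for m, (s, n) in totals.items()}
    return max(scores, key=scores.get), scores

best, scores = recommend_emarket([
    ("alice", "agro-market-A", 4.5),
    ("bob",   "agro-market-A", 4.0),
    ("carol", "agro-market-B", 3.0),
])
print(best)  # agro-market-A
```

A real implementation would also weight evaluations by the evaluator's own reputation, which plain averaging ignores.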
Scottish academic publications implementing an effective networked service (SAPIENS) project
This article describes the aims and continuing progress of the Scottish Academic Periodicals Implementing an Effective Networked Service (SAPIENS) project, which has been running at the University of Strathclyde's Centre for Digital Library Research since September 2001. Initially funded for two years, the project has been extended until October 2004. The rationale behind SAPIENS is the concern that small Scottish publishers, operating on limited budgets, are in danger of finding themselves marginalised in the modern information environment. The project's primary objectives are to explore the viability of, and launch, an electronic publishing service to assist small-scale Scottish publishers of academic and cultural periodicals to publish online. It has achieved these aims by implementing a demonstration service which is gradually moving into an operational mode, delivering current journals.
Developing strategic learning alliances: partnerships for the provision of global education and training solutions
The paper describes a comprehensive model for the development of strategic alliances between the education and corporate sectors, which is required to ensure effective provision of education and training programmes for a global market. Global economic forces, combined with recent advances in information and communication technologies, have provided unprecedented opportunities for education providers to broaden the provision of their programmes both on an international scale and across new sectors. Lifelong learning strategies are increasingly recognized as an essential characteristic of a successful organization, and large organizations have therefore shown a preparedness to invest in staff training and development. The demands for lifelong learning span a wide range of training and educational levels, from school-level and vocational courses to graduate-level training for senior executives.
Data standardization
With data rapidly becoming the lifeblood of the global economy, the ability to improve its use significantly affects both social and private welfare. Data standardization is key to facilitating and improving the use of data when data portability and interoperability are needed. Absent data standardization, a "Tower of Babel" of different databases may be created, limiting synergetic knowledge production. Based on interviews with data scientists, this Article identifies three main technological obstacles to data portability and interoperability: metadata uncertainties, data transfer obstacles, and missing data. It then explains how data standardization can remove at least some of these obstacles and lead to smoother data flows and better machine learning. The Article then identifies and analyzes additional effects of data standardization. As shown, data standardization has the potential to support a competitive and distributed data collection ecosystem and to ease policing in cases where rights are infringed or unjustified harms are created by data-fed algorithms. At the same time, increasing the scale and scope of data analysis can create negative externalities in the form of better profiling, increased harms to privacy, and cybersecurity harms. Standardization also has implications for investment and innovation, especially if lock-in to an inefficient standard occurs. The Article then explores whether market-led standardization initiatives can be relied upon to increase welfare, and the role government-facilitated data standardization should play, if at all.
StackInsights: Cognitive Learning for Hybrid Cloud Readiness
Hybrid cloud is an integrated cloud computing environment utilizing a mix of public cloud, private cloud, and on-premise traditional IT infrastructures. Workload awareness, defined as a detailed, full-range understanding of each individual workload, is essential in implementing the hybrid cloud. While it is critical to perform an accurate analysis to determine which workloads are appropriate for on-premise deployment versus which workloads can be migrated to a cloud off-premise, the assessment is mainly performed by rule- or policy-based approaches. In this paper, we introduce StackInsights, a novel cognitive system to automatically analyze and predict the cloud readiness of workloads for an enterprise. Our system harnesses the critical metrics across the entire stack: 1) infrastructure metrics, 2) data relevance metrics, and 3) application taxonomy, to identify workloads that have characteristics of a) low sensitivity with respect to business security, criticality, and compliance, and b) low response time requirements and access patterns. Since capturing the data relevance metrics involves an intrusive and in-depth scanning of the content of storage objects, a machine learning model is applied to perform the business relevance classification by learning from the meta-level metrics harnessed across the stack. In contrast to traditional methods, StackInsights reduces the total time for hybrid cloud readiness assessment by orders of magnitude.
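The abstract does not specify which learner or features StackInsights uses. As a hedged sketch only, classifying business relevance from meta-level metrics instead of scanning object content might look like the following, where the features (object size, days since last access) and the nearest-centroid rule are hypothetical stand-ins for the paper's unspecified model:

```python
import math

def train_centroids(samples):
    """Fit a nearest-centroid classifier over meta-level metrics.

    `samples` is a list of (feature_vector, label) pairs; the
    features and labels here are illustrative assumptions, not
    the paper's actual model."""
    sums, counts = {}, {}
    for vec, label in samples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in acc] for lab, acc in sums.items()}

def classify(centroids, vec):
    """Assign the label whose centroid is closest in Euclidean distance."""
    def dist(center):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(center, vec)))
    return min(centroids, key=lambda lab: dist(centroids[lab]))

# Features: [object size in GB, days since last access]
centroids = train_centroids([
    ([900.0, 2.0], "sensitive"),     # large, recently accessed
    ([850.0, 5.0], "sensitive"),
    ([10.0, 400.0], "cloud-ready"),  # small, long idle
    ([15.0, 380.0], "cloud-ready"),
])
print(classify(centroids, [800.0, 3.0]))  # sensitive
```

The point of the approach, as the abstract describes it, is that cheap metadata like this can stand in for an intrusive content scan; any production system would validate the learned labels against a sample of actual scans.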