
    Cloud service localisation

    The essence of cloud computing is the provision of software and hardware services to a range of users in different locations. The aim of cloud service localisation is to facilitate the internationalisation and localisation of cloud services by allowing their adaptation to different locales. We address lingual localisation by providing service-level language translation techniques to adapt services to different languages, and regulatory localisation by providing standards-based mappings to achieve regulatory compliance with regionally varying laws, standards and regulations. The aim is to support and enforce the explicit modelling of aspects particularly relevant to localisation, with runtime support consisting of tools and middleware services to automate deployment based on models of locales, driven by the two localisation dimensions. We focus here on an ontology-based conceptual information model that integrates locale specification in a coherent way.
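
    As a rough illustration (not drawn from the paper itself), service-level language translation can be thought of as a lookup that rewrites the human-readable parts of a service description per locale while leaving technical identifiers untouched. The sketch below assumes this simple catalogue-based approach; all names and the catalogue contents are hypothetical.

        # Hypothetical sketch of service-level lingual localisation: interface
        # labels are rewritten per locale while operation identifiers stay stable.
        from dataclasses import dataclass

        @dataclass
        class ServiceOperation:
            name: str          # stable technical identifier, never translated
            label: str         # human-readable label, subject to localisation
            description: str

        # Illustrative translation catalogue keyed by (locale, source text).
        CATALOGUE = {
            ("de-DE", "Get quote"): "Angebot abrufen",
            ("de-DE", "Returns a price quote"): "Liefert ein Preisangebot",
        }

        def localise(op: ServiceOperation, locale: str) -> ServiceOperation:
            """Return a copy of the operation with its texts translated for `locale`."""
            return ServiceOperation(
                name=op.name,
                label=CATALOGUE.get((locale, op.label), op.label),
                description=CATALOGUE.get((locale, op.description), op.description),
            )

        op = ServiceOperation("getQuote", "Get quote", "Returns a price quote")
        print(localise(op, "de-DE").label)   # -> "Angebot abrufen"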

    A coordination protocol for user-customisable cloud policy monitoring

    Cloud computing will see an increasing demand for end-user customisation and personalisation of multi-tenant cloud service offerings. Combined with the identified need to address QoS and governance aspects in cloud computing, this creates a need for user-customised QoS and governance policy management and monitoring as part of an SLA management infrastructure for clouds. We propose a user-customisable policy definition solution that can be enforced in multi-tenant cloud offerings through an automated instrumentation and monitoring technique. In particular, we allow service processes that are run by cloud and SaaS providers to be made policy-aware in a transparent way.
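
    A minimal sketch of how per-tenant policies might be enforced transparently around a service invocation: a decorator stands in here for the paper's automated instrumentation technique, and the policy store, tenant names, and policy fields are all illustrative assumptions rather than the paper's actual design.

        # Hypothetical sketch: per-tenant policies checked transparently around
        # a service invocation, standing in for automated instrumentation.
        import functools
        import time

        # Illustrative per-tenant policies: maximum allowed response time (seconds).
        TENANT_POLICIES = {
            "tenant-a": {"max_response_s": 0.5},
            "tenant-b": {"max_response_s": 2.0},
        }

        def policy_monitored(func):
            @functools.wraps(func)
            def wrapper(tenant_id, *args, **kwargs):
                policy = TENANT_POLICIES.get(tenant_id, {})
                start = time.monotonic()
                result = func(tenant_id, *args, **kwargs)
                elapsed = time.monotonic() - start
                if elapsed > policy.get("max_response_s", float("inf")):
                    # A real monitor would report to SLA management middleware;
                    # here we simply record the violation.
                    print(f"policy violation for {tenant_id}: {elapsed:.3f}s")
                return result
            return wrapper

        @policy_monitored
        def process_order(tenant_id, order):
            time.sleep(0.1)            # stand-in for the actual service process
            return {"tenant": tenant_id, "order": order, "status": "ok"}

        process_order("tenant-a", {"id": 1})

    The service function itself stays unaware of the policy machinery, which is the sense in which the monitoring is transparent to the provider's process.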

    A robust client-driven distributed service localisation architecture

    The fundamental purpose of service-oriented computing is the ability to quickly provide software resources to global users. The main aim of service localisation is to provide a method for facilitating the internationalisation and localisation of software services by allowing them to be adapted to different locales. We address lingual localisation by providing a service interface translation using the latest web services technology to adapt services to different languages, and currency conversion as an example of regulatory localisation by using real-time data provided by the European Central Bank. Unit and regulatory localisation are performed through a conversion mapping, which we have generated for a subset of locales. The aim is to investigate a standardised view on the localisation of services by using runtime and middleware services to deploy a localisation implementation. We apply traditional software localisation ideas to service interfaces. Our contribution is a localisation platform consisting of a conceptual model classifying localisation concerns and the definition of a number of specific platform services. The architecture is client-centric in that it allows localisation to be controlled and managed by the client, ultimately providing more personalisation and trust. It also addresses robustness concerns by enabling a fault-tolerant architecture for third-party service localisation in a distributed setting.
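
    The ECB does publish daily euro reference rates as a public XML feed, which is presumably the kind of real-time data the abstract refers to. Below is a minimal, illustrative sketch of a currency-conversion service built on that feed; the feed URL is real, but the function names and the conversion service structure are our assumptions, not the paper's implementation.

        # Illustrative currency conversion using the ECB's daily reference rates.
        # The feed quotes all rates against EUR, so conversion goes through EUR.
        import urllib.request
        import xml.etree.ElementTree as ET

        ECB_URL = "https://www.ecb.europa.eu/stats/eurofxref/eurofxref-daily.xml"

        def fetch_rates():
            """Download and parse the ECB daily reference rates (base: EUR)."""
            with urllib.request.urlopen(ECB_URL) as resp:
                tree = ET.parse(resp)
            rates = {"EUR": 1.0}
            for elem in tree.iter():
                if "currency" in elem.attrib:
                    rates[elem.attrib["currency"]] = float(elem.attrib["rate"])
            return rates

        def convert(amount, source, target, rates):
            """Convert `amount` from `source` to `target` via the EUR base rates."""
            return amount / rates[source] * rates[target]

        rates = fetch_rates()                      # requires network access
        print(convert(100, "USD", "GBP", rates))   # 100 USD expressed in GBP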

    Software service adaptation based on interface localisation

    The aim of Web services is the provision of software services to a range of different users in different locations. Service localisation in this context can facilitate the internationalisation and localisation of services by allowing their adaptation to different locales. The authors investigate three dimensions: (i) lingual localisation by providing service-level language translation techniques to adapt services to different languages, (ii) regulatory localisation by providing standards-based mappings to achieve regulatory compliance with regionally varying laws, standards and regulations, and (iii) social localisation by taking into account preferences and customs of individuals and the groups or communities in which they participate. The objective is to support and implement an explicit modelling of aspects that are relevant to localisation, with runtime support consisting of tools and middleware services to automate deployment based on models of locales, driven by these localisation dimensions. The authors focus here on an ontology-based conceptual information model that integrates locale specification into service architectures in a coherent way.
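
    To make the three dimensions concrete, here is a hypothetical sketch of a locale specification object combining lingual, regulatory and social settings and applying them to a service description. It mirrors the idea of an integrated locale model; it is not the paper's actual ontology, and every field name is an assumption.

        # Hypothetical sketch of an integrated locale specification covering the
        # three localisation dimensions discussed above.
        from dataclasses import dataclass, field

        @dataclass
        class LocaleSpec:
            # Lingual dimension: target language for interface translation.
            language: str                                     # e.g. "de"
            # Regulatory dimension: standards/regulations to map the service to.
            regulations: list = field(default_factory=list)   # e.g. ["GDPR"]
            # Social dimension: preferences and customs of the target community.
            preferences: dict = field(default_factory=dict)   # e.g. date formats

        def adapt_service(service_desc: dict, locale: LocaleSpec) -> dict:
            """Apply each localisation dimension to a service description."""
            adapted = dict(service_desc)
            adapted["language"] = locale.language
            adapted["compliance"] = locale.regulations
            adapted.update(locale.preferences)
            return adapted

        de_locale = LocaleSpec("de", ["GDPR"], {"date_format": "DD.MM.YYYY"})
        print(adapt_service({"name": "orderService"}, de_locale))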

    Knowing public services: Cross-sector intermediaries and algorithmic governance in public sector reform

    Discourses of public sector reform in the UK have been shaped in recent years by the participation of new kinds of hybrid cross-sector intermediaries such as think tanks, social enterprises and other third sector organisations. This article provides a documentary analysis of Demos, NESTA and the Innovation Unit as intermediary organisations in public sector reform, exploring their promotion of modes of digital governance and their mobilisation of new software technologies as models for new kinds of governing practices. These intermediary organisations are generating a model of knowing public services that operates through collecting and analysing big data, consisting of personal information and behavioural data on individual service users, in order to co-produce personalised services. Their objective is a new style of political governance based on human-computer interaction and machine learning techniques, in which citizens are to be governed as co-producers of personalised services interacting with the algorithms of database software.

    Broadcasting to the masses or building communities: Polish political parties online communication during the 2011 election

    The professionalisation of political communication is an evolutionary process (Lilleker & Negrine, 2002), a process that adapts to trends in communication in order to better engage and persuade the public. One of the most dramatic developments in communication has been the move towards social communication via the Internet, which is argued to affect every area of public communication, from commercial advertising and public relations to education (Macnamara, 2010). It is no longer sufficient to have an online presence; we are now in an age of i-branding, with the ‘i’ standing for interactive. Yet trends in online political electoral campaigning over recent years indicate a shallow adoption of Web 2.0 tools, features and platforms; limited interactivity; and managed co-production. The Internet is now embedded as a campaigning tool; however, the technologies are largely adapted to the norms of political communication rather than impacting upon internal organisational structures, party relationships to members and supporters, or the content and style of their communication. We examine these themes in more detail in the context of the Polish parliamentary election of 2011, developing them through a focus on the targeting and networking strategies of political parties. Through a sophisticated content analysis and coding scheme, our paper examines the extent to which parties use features that are designed to inform, engage, mobilise or allow interaction, which audiences they seek to communicate with, and how these fit communication strategies. Comparing these findings with maps built from webcrawler analysis, we build a picture of the strategies of the parties and the extent to which this links to short- and long-term political goals.

    This paper firstly develops our rationale for studying party and candidate use of the Internet during elections within the Polish context. Secondly, we develop a conceptual framework which contrasts the politics-as-usual thesis (Margolis & Resnick, 2000) with arguments surrounding the social shaping of technologies (Lievrouw, 2006), the impact on organisational adoption of communication technologies, and post-Obama trends in Internet usage (Lilleker & Jackson, 2011), and posit that, despite the threats from an interactive strategy (Stromer-Galley, 2000), one would be expected within the context of a networked society (Van Dijk, 2006). Following an overview of our methodology and innovative analysis strategy, we present our data, which focuses on three key elements. Firstly, we focus on the extent to which party and candidate websites inform, engage, mobilise or permit interaction (Lilleker et al., 2011). Secondly, we assess the extent to which websites attract different visitor groups (Lilleker & Jackson, 2011) and build communities (Lilleker & Koc-Michalska, 2012). Thirdly, we assess the reach strategies of the websites using webcrawler technology, which analyses the use of hyperlinks and whether parties lock themselves within cyberghettoes (Sunstein, 2007) or attempt to harness the power of the network (Benkler, 2006).

    Threats to Autonomy from Emerging ICTs

    This thesis investigates possible future threats to human autonomy created by currently emerging ICTs. Prepared for evaluation as a PhD by Publication, it consists of four journal papers and one book chapter, together with explanatory material. The ICTs under examination are drawn from the results of the ETICA project, which sought to identify emerging ICTs of ethical import. We first evaluate this research and identify elements in need of enhancement: the social aspects pertaining to ethical impact, and the need to introduce elements of General Systems Theory in order to account for ICTs as socio-technical systems. The first two publications for evaluation present arguments from Marxist and capitalist perspectives which provide an account of the social dimensions through which an ICT can reduce human autonomy. There are many competing accounts of what constitutes human autonomy; these may be grouped into classes by their primary characteristics. The third publication for evaluation cross-references these classes with the ICTs identified by the ETICA project, showing which version of autonomy could be restricted by each ICT and how. Finally, this paper induces from this analysis some general characteristics which any ICT must exhibit if it is to restrict autonomy of any form. Since ICTs all operate in the same environment, the ultimate effect on the individual is the aggregated effect of all the ICTs with which they interact, which can be treated as an open system. Our fourth paper for evaluation therefore develops a theory of ICTs as systems of a socio-technical nature, titled "Integrated Domain Theory". Our fifth publication uses Integrated Domain Theory to explore the manner in which socio-technical systems can restrict human autonomy, no matter how conceived. This thesis thus offers two complementary answers to the primary research question.

    Think tanks and third sector intermediaries of pro-am power

    The paper traces the formation of "pro-am power" as a policy discourse through an analysis of key texts produced by the think tank Demos, the social enterprise Innovation Unit, and the National Endowment for Science, Technology and the Arts (NESTA). These organisations have made public service reform thinkable and intelligible through ideas and concepts that are intended to change ways of thinking about the public sector. The paper aims to conceptualise the organisational character and "intellectual style" of these institutions. These, I argue, are "innovation intermediaries" and ideational institutions staffed by intellectual workers with careers in ideas. They are structurally located in a blurry, interstitial space between think tanks, social enterprises and technology R&D labs, as well as between public, private and third sector styles of service provision. Such organisations are preoccupied with the promises and problems of new software analytics, big data and social media applications and services, and with the promotion of a new kind of interactive citizen subject.