
    Rethinking De-Perimeterisation: Problem Analysis And Solutions

    For businesses, the traditional security approach is the hard-shell model: an organisation secures all its assets behind a fixed security border, trusting the inside and distrusting the outside. However, as technologies and business processes change, this model loses its attractiveness. In a networked world, “inside” and “outside” can no longer be clearly distinguished. The Jericho Forum, an industry consortium that is part of the Open Group, coined the term de-perimeterisation for this process and suggested an approach aimed at securing data rather than complete systems and infrastructures. We do not question the reality of de-perimeterisation; however, we believe that the existing analysis of the exact problem, as well as the usefulness of the proposed solutions, has fallen short. First, there is no linear process of blurring boundaries in which security mechanisms are placed at lower and lower levels until they only surround data. On the contrary, we observe a cyclic process of connecting and disconnecting systems: as conditions change, the basic trade-off between accountability and business opportunities is made (and should be made) anew each time. Apart from that, data-level security has several limitations to start with, and there is great potential for solving security problems differently: by rearranging the responsibilities between businesses and individuals. The results of this analysis can be useful for security professionals who need to trade off different security mechanisms for their organisations and their information systems.

    Back to Keynes?

    After a brief review of classical, Keynesian, New Classical and New Keynesian theories of macroeconomic policy, we assess whether New Keynesian economics captures the quintessential features stressed by J.M. Keynes. Particular attention is paid to Keynesian features omitted from New Keynesian workhorses such as the micro-founded Keynesian multiplier and the New Keynesian Phillips curve. These theories capture wage and price sluggishness and aggregate-demand externalities by departing from a competitive framework, and they give a key role to expectations. The main deficiencies, however, are the inability to predict a pro-cyclical real wage in the face of demand shocks; the absence of inventories, credit constraints and bankruptcies in explaining the business cycle; and the lack of any effect of either the nominal or the real interest rate on aggregate demand. Furthermore, they fail to allow for quantity rationing and to model unemployment as a catastrophic event. The macroeconomics based on the New Keynesian Phillips curve has quite a way to go before the quintessential Keynesian features are captured.
    Keywords: Keynesian economics, New Keynesian Phillips curve, monopolistic competition, nominal wage rigidity, welfare, pro-cyclical real wage, inventories, liquidity, bankruptcy, unemployment, monetary policy
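    For reference, the New Keynesian Phillips curve discussed above is usually written in its standard textbook form (general macroeconomics background, not a formula taken from the paper itself):

        % Standard New Keynesian Phillips curve
        \pi_t = \beta\,\mathbb{E}_t[\pi_{t+1}] + \kappa x_t
        % \pi_t: inflation; \beta: discount factor; \mathbb{E}_t: expectations at time t;
        % x_t: output gap; \kappa > 0: slope, decreasing in the degree of price stickiness.

    In this relation inflation depends only on expected future inflation and the output gap, which is why features such as inventories, credit constraints and quantity rationing, stressed in the abstract, have no channel through which to act.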

    Machine-Readable Privacy Certificates for Services

    Privacy-aware processing of personal data on the web of services requires managing a number of issues arising from both the technical and the legal domain. Several approaches have been proposed for matching privacy requirements (on the client's side) with privacy guarantees (on the service provider's side). Still, the assurance of effective data protection (where possible) relies on substantial human effort and exposes organizations to significant (non-)compliance risks. In this paper we put forward the idea that a privacy certification scheme producing and managing machine-readable artifacts, in the form of privacy certificates, can play an important role in solving this problem. Digital privacy certificates represent the reasons why a privacy property holds for a service and describe the privacy measures supporting it. Privacy certificates can also be used to automatically select services whose certificates match the client's policies (privacy requirements). Our proposal relies on an evolution of the conceptual model developed in the ASSERT4SOA project and on a certificate format specifically tailored to represent privacy properties. To validate our approach, we present a worked-out instance showing how the privacy property “retention-based unlinkability” can be certified for a banking financial service.
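    As an illustration of how machine-readable certificates enable automatic service selection, the sketch below encodes a certificate and a client policy as simple data structures and matches them. The field names and the matching rule are our own invention for illustration, not the ASSERT4SOA certificate format:

        from dataclasses import dataclass

        @dataclass
        class PrivacyCertificate:
            """Hypothetical machine-readable privacy certificate (field names invented)."""
            service: str
            properties: frozenset        # certified privacy properties
            retention_days: int          # maximum retention period for personal data

        @dataclass
        class ClientPolicy:
            """Client-side privacy requirements (also hypothetical)."""
            required_properties: frozenset
            max_retention_days: int

        def matches(cert, policy):
            # A service qualifies if it certifies every property the client requires
            # and retains personal data no longer than the client allows.
            return (policy.required_properties <= cert.properties
                    and cert.retention_days <= policy.max_retention_days)

        certs = [
            PrivacyCertificate("bank-svc", frozenset({"retention-based-unlinkability"}), 30),
            PrivacyCertificate("ad-svc", frozenset(), 365),
        ]
        policy = ClientPolicy(frozenset({"retention-based-unlinkability"}), 90)
        print([c.service for c in certs if matches(c, policy)])   # ['bank-svc']

    A real certificate would also carry the evidence behind each property (the "reasons why it holds"); the point here is only that, once the artifact is structured, selection reduces to a mechanical comparison.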

    Taking our learning and teaching strategy to the next level through technology enhanced campus development

    Over the last three years Abertay University has radically evolved its strategy for teaching and supporting learning. This paper outlines Abertay's journey over that period, including the key features of our new pedagogic approach and its impact so far. For example, in 2016 Abertay was the highest-ranked modern Scottish university in the National Student Survey (NSS) and was shortlisted for the prestigious Times Higher Education “University of the Year” award. In order to further enhance our students' progression, attainment and employability, we have recognised the need to invest further in two key (and related) areas: technology enhanced learning and estate development, in order to create a so-called “sticky campus”, i.e. somewhere our students will want to come and stay. This has included full implementation of electronic management of assessment (EMA); blended learning; new technology-rich collaborative learning environments and science laboratories which promote richer student-staff interactions and new ways of learning; and a planned complete refurbishment of the University library, which will provide a variety of formal and informal learning environments from summer 2017. The paper details the drivers for these changes; the change management processes, based on a staff-student partnership across management, academic and professional services; successes; challenges; lessons learned; and future plans.

    The Shift of Techno-Economic Paradigm and Its Effects on Regional Disparities

    During the 1900s we first lived through the shift from the agricultural era to the industrial era. Nowadays, we are in the middle of the shift from the industrial era to the information era. The new era has several definitions based on different theories. At the same time, we talk about the information society (knowledge forms the main productivity factor), the network society (new communication technology connects people), the post-industrial society (a change in the production paradigm), the service society (an emphasis on services instead of production), the expert society (the increasing importance of skilled people and experts), the learning society (learning ability becomes a critical factor), the postmodern society (modernisation leads to individualism), the innovation society (innovation is the driving force of economic growth), the risk society (risks and uncertainty are increasing in society) and the consumer society (consumer needs steer economic activities). These definitions reflect the different points of view from which the development of recent years can be assessed. Each of them emphasises different phenomena embedded in the change of the present techno-economic paradigm, and each builds a basis for assessing the requirements of the changing environment. Although the definitions and theories describing the present change are mostly very abstract, some concrete indicators can be determined to describe the phase of the trajectory in the changing process of society. The changes in society should be assessed at the regional level, especially as the regional dimension is gaining importance in development policies at the European level. In the regional context the first question to raise is how the shift of the techno-economic paradigm appears at the regional level and what its effect is on emerging regional disparities. The second is whether it is possible to evaluate how a region's adaptability to the shift of the techno-economic paradigm correlates with its economic success. In the current study, an indicator is created to describe a region's adaptability to the shift of the techno-economic paradigm. The variables included in the adaptability indicator are derived from the theories describing the present society. The Finnish urban regions are used as the source of empirical data: all of them are assessed with the adaptability indicator, and the values of the indicator are then compared to the respective values of indicators describing the economic success of the same regions. Admittedly, the adaptability indicator does not describe the studied phenomenon completely; it might even be considered provocative. However, it yields some interesting results about the different development trajectories of urban regions and gives valuable information for regional decision-making.
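    As a rough sketch of the kind of computation such a study involves, a composite adaptability indicator can be built by standardising theory-derived variables across regions, averaging them per region, and correlating the result with an economic-success measure. The variable names and figures below are invented for illustration and are not the study's actual data or method (statistics.correlation needs Python 3.10+):

        import statistics

        # Hypothetical theory-derived variables per urban region (invented values).
        regions = {
            "Region A": {"ict_jobs_share": 0.12, "tertiary_edu_share": 0.35, "rnd_intensity": 0.030},
            "Region B": {"ict_jobs_share": 0.05, "tertiary_edu_share": 0.22, "rnd_intensity": 0.010},
            "Region C": {"ict_jobs_share": 0.08, "tertiary_edu_share": 0.28, "rnd_intensity": 0.018},
        }
        gdp_growth = {"Region A": 0.034, "Region B": 0.011, "Region C": 0.021}

        variables = ["ict_jobs_share", "tertiary_edu_share", "rnd_intensity"]
        names = list(regions)

        def zscores(values):
            # Standardise a variable across regions so all variables are comparable.
            mu, sd = statistics.mean(values), statistics.stdev(values)
            return [(v - mu) / sd for v in values]

        standardized = {var: zscores([regions[r][var] for r in names]) for var in variables}

        # Composite adaptability indicator: per-region average of standardised variables.
        adaptability = {r: statistics.mean(standardized[var][i] for var in variables)
                        for i, r in enumerate(names)}

        # Compare adaptability with economic success (Pearson correlation).
        corr = statistics.correlation([adaptability[r] for r in names],
                                      [gdp_growth[r] for r in names])
        print(adaptability, round(corr, 3))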

    When the signal is in the noise: Exploiting Diffix's Sticky Noise

    Anonymized data is highly valuable to both businesses and researchers. A large body of research has, however, shown the strong limits of the de-identification release-and-forget model, in which data is anonymized and then shared. This has led to the development of privacy-preserving query-based systems. Based on the idea of “sticky noise”, Diffix has recently been proposed as a novel query-based mechanism that, on its own, satisfies the EU Article 29 Working Party's definition of anonymization. According to its authors, Diffix adds less noise to answers than solutions based on differential privacy while allowing an unlimited number of queries. This paper presents a new class of noise-exploitation attacks, which exploit the noise added by the system to infer private information about individuals in the dataset. Our first, differential attack uses samples extracted from Diffix in a likelihood ratio test to discriminate between two probability distributions. We show that using this attack against a synthetic best-case dataset allows us to infer private information with 89.4% accuracy using only 5 attributes. Our second, cloning attack uses dummy conditions whose effect on the query output depends strongly on the value of the private attribute. Using this attack on four real-world datasets, we show that we can infer private attributes of at least 93% of the users in the dataset with accuracy between 93.3% and 97.1%, issuing a median of 304 queries per user. We show how to optimize this attack, targeting 55.4% of the users and achieving 91.7% accuracy using a maximum of only 32 queries per user. Our attacks demonstrate that adding data-dependent noise, as done by Diffix, is not sufficient to prevent inference of private attributes. We furthermore argue that Diffix alone fails to satisfy the Art. 29 WP's definition of anonymization. [...]
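    To make the statistical core of the differential attack concrete, the sketch below runs a likelihood-ratio test over repeated noisy answers, assuming (a simplification we introduce here, not the paper's exact noise model) that the accumulated sticky-noise layers behave like independent zero-mean Gaussians with known standard deviation. All names and values are illustrative:

        import math

        NOISE_SD = 2.0  # assumed std. dev. of the summed noise layers (illustrative)

        def log_likelihood(samples, true_count, sd=NOISE_SD):
            # Log-likelihood of the noisy answers if the true count were `true_count`,
            # modelling the added noise as independent zero-mean Gaussians.
            return sum(-0.5 * math.log(2 * math.pi * sd ** 2)
                       - (x - true_count) ** 2 / (2 * sd ** 2) for x in samples)

        def infer_attribute(samples, count_if_absent, count_if_present):
            # Likelihood-ratio test: pick the hypothesis about the target's private
            # attribute that better explains the observed noisy answers.
            llr = (log_likelihood(samples, count_if_present)
                   - log_likelihood(samples, count_if_absent))
            return "present" if llr > 0 else "absent"

        # Noisy answers to semantically equivalent query variants (made-up values).
        samples = [101.8, 102.9, 100.7, 103.1, 102.2]
        print(infer_attribute(samples, count_if_absent=100, count_if_present=103))

    The attack works because each differently phrased but semantically equivalent query draws fresh noise around the same true count, so repeated sampling lets the test separate hypotheses that differ by a single individual.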