
    Collaborative economy

    The importance of analysing the main legal challenges posed by the collaborative economy – given the implications raised by the paradigm shifts in business models and in the parties involved – is indisputable, and it reflects the need to foster legal certainty around these practices, which drive economic growth and social well-being. The Research Centre for Justice and Governance (JusGov) assembled a multidisciplinary team that, in addition to legal scholars, includes researchers from other fields such as economics and management, drawn from the various JusGov groups – with particular participation from researchers in the E-TEC (Estado, Empresa e Tecnologia) group – and from other prestigious national and international institutions, to develop a project in this domain. Its aim is to identify the legal problems raised by the collaborative economy and to assess whether solutions for them already exist, while also reflecting on whether amendments are advisable or whether entirely new regulation is needed. The results of this research are presented in this volume, with which we hope to encourage continued debate on the topic. This work is financed by national funds through the FCT — Fundação para a Ciência e a Tecnologia, I.P., under Funding UID/05749/202

    Technical Dimensions of Programming Systems

    Programming requires much more than just writing code in a programming language. It is usually done in the context of a stateful environment, by interacting with a system through a graphical user interface. Yet this wide space of possibilities lacks a common structure for navigation. Work on programming systems fails to form a coherent body of research, making it hard to improve on past work and advance the state of the art. In computer science, much has been said and done to allow comparison of programming languages, yet no similar theory exists for programming systems; we believe that programming systems deserve a theory too. We present a framework of technical dimensions which capture the underlying characteristics of programming systems and provide a means for conceptualizing and comparing them. We identify technical dimensions by examining past influential programming systems and reviewing their design principles, technical capabilities, and styles of user interaction. Technical dimensions capture characteristics that can be studied, compared, and advanced independently. This makes it possible to talk about programming systems in a way that can be shared and constructively debated, rather than relying solely on personal impressions. Our framework is derived through a qualitative analysis of past programming systems. We outline two concrete ways of using it: first, we show how it can be used to analyze a recently developed novel programming system; then, we use it to identify an interesting unexplored point in the design space of programming systems. Much research effort focuses on building programming systems that are easier to use, accessible to non-experts, moldable, and/or powerful, but such efforts are disconnected. They are informal, guided by the personal vision of their authors, and can thus be evaluated and compared only on the basis of individual experience using them. By providing foundations for more systematic research, we can help programming systems researchers to stand, at last, on the shoulders of giants.
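
    As a rough illustration of the idea, a dimension profile can be encoded as a simple data structure so that two systems can be compared dimension by dimension. This is a minimal sketch in Python; the dimension names and values below are invented placeholders, not the framework's actual catalogue.

    # Compare programming systems as profiles over named technical dimensions.
    # Dimension names and values here are illustrative assumptions only.
    from dataclasses import dataclass, field

    @dataclass
    class SystemProfile:
        name: str
        dimensions: dict[str, str] = field(default_factory=dict)

    def differing_dimensions(a: SystemProfile, b: SystemProfile) -> dict[str, tuple[str, str]]:
        """Return the dimensions on which two systems take different values."""
        keys = set(a.dimensions) | set(b.dimensions)
        return {k: (a.dimensions.get(k, "?"), b.dimensions.get(k, "?"))
                for k in sorted(keys)
                if a.dimensions.get(k) != b.dimensions.get(k)}

    smalltalk = SystemProfile("Smalltalk-80", {
        "feedback loop": "live, immediate",
        "state persistence": "image-based",
    })
    unix_c = SystemProfile("UNIX + C", {
        "feedback loop": "edit-compile-run",
        "state persistence": "file-based",
    })
    print(differing_dimensions(smalltalk, unix_c))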

    Defining Service Level Agreements in Serverless Computing

    The emergence of serverless computing has brought significant advancements to the delivery of computing resources to cloud users. With the abstraction of infrastructure, ecosystem, and execution environments, users can focus on their code while relying on the cloud provider to manage the abstracted layers. In addition, desirable features such as autoscaling and high availability became a provider's responsibility and can be adopted by the user's application at no extra overhead. Despite such advancements, significant challenges must be overcome as applications transition from monolithic stand-alone deployments to the ephemeral and stateless microservice model of serverless computing. These challenges pertain to the uniqueness of the conceptual and implementation models of serverless computing. One of the notable challenges is the complexity of defining Service Level Agreements (SLA) for serverless functions. As the serverless model shifts the administration of resources, ecosystem, and execution layers to the provider, users become mere consumers of the provider's abstracted platform with no insight into its performance. Suboptimal conditions of the abstracted layers are not visible to the end user, who has no means to assess their performance. Thus, SLA in serverless computing must take into consideration the unique abstraction of its model. This work investigates SLA modeling for the executions of serverless functions and serverless chains. We highlight how serverless SLA fundamentally differs from earlier cloud delivery models. We then propose an approach to define SLA for serverless functions by utilizing resource utilization fingerprints of functions' executions, along with a method to assess whether executions adhere to that SLA. We evaluate the approach's accuracy in detecting SLA violations for a broad range of serverless application categories. Our validation results illustrate high accuracy in detecting SLA violations resulting from resource contentions and provider ecosystem degradations. We conclude by presenting the empirical validation of our proposed approach, which could detect Execution-SLA violations with accuracy of up to 99%.
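
    As a rough sketch of what fingerprint-based SLA checking could look like, the snippet below summarizes known-good executions per metric and flags runs that deviate beyond a tolerance. The metric names, the (mean, standard deviation) fingerprint form, and the 3-sigma threshold are illustrative assumptions, not the authors' actual model.

    # Flag executions whose resource utilization deviates from a fingerprint
    # built over known-good runs. All metrics and thresholds are assumptions.
    from statistics import mean, stdev

    def fingerprint(baseline_runs: list[dict[str, float]]) -> dict[str, tuple[float, float]]:
        """Per-metric (mean, stdev) over known-good executions."""
        metrics = baseline_runs[0].keys()
        return {m: (mean(r[m] for r in baseline_runs),
                    stdev(r[m] for r in baseline_runs))
                for m in metrics}

    def sla_violations(run: dict[str, float],
                       fp: dict[str, tuple[float, float]],
                       k: float = 3.0) -> list[str]:
        """Metrics deviating more than k standard deviations from the fingerprint."""
        return [m for m, (mu, sigma) in fp.items()
                if abs(run[m] - mu) > k * max(sigma, 1e-9)]

    baseline = [{"cpu_ms": 120.0, "mem_mb": 64.0},
                {"cpu_ms": 118.0, "mem_mb": 63.0},
                {"cpu_ms": 123.0, "mem_mb": 65.0}]
    fp = fingerprint(baseline)
    print(sla_violations({"cpu_ms": 310.0, "mem_mb": 64.0}, fp))  # ['cpu_ms']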

    A productive response to legacy system petrification

    Requirements change. The requirements of a legacy information system change, often in unanticipated ways, and at a more rapid pace than the rate at which the information system itself can be evolved to support them. The capabilities of a legacy system progressively fall further and further behind their evolving requirements, in a degrading process termed petrification. As systems petrify, they deliver diminishing business value, hamper business effectiveness, and drain organisational resources. To address legacy systems, the first challenge is to understand how to shed their resistance to tracking requirements change. The second challenge is to ensure that a newly adaptable system never again petrifies into a change-resistant legacy system. This thesis addresses both challenges. The approach outlined herein is underpinned by an agile migration process - termed Productive Migration - that homes in on the specific causes of petrification within each particular legacy system and provides guidance on how to address them. That guidance comes in part from a personalised catalogue of petrifying patterns, which capture recurring themes underlying petrification. These steer us to the problems actually present in a given legacy system, and lead us to suitable antidote productive patterns via which we can deal with those problems one by one. To prevent newly adaptable systems from again degrading into legacy systems, we appeal to a follow-on process, termed Productive Evolution, which embraces and keeps pace with change rather than resisting and falling behind it. Productive Evolution teaches us to be vigilant against signs of system petrification and helps us to nip them in the bud. The aim is to nurture systems that remain supportive of the business, that are adaptable in step with ongoing requirements change, and that continue to retain their value as significant business assets.
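
    The catalogue-driven diagnosis described above can be pictured as a lookup from petrifying patterns to their antidote productive patterns. The sketch below uses invented pattern names purely for illustration; the actual catalogue is the thesis's own.

    # Map observed petrifying patterns to candidate productive (antidote) patterns.
    # All pattern names below are invented for illustration.
    CATALOGUE = {
        "hard-coded business rules": "externalise rules into configurable policies",
        "schema-coupled modules": "introduce an anti-corruption layer over legacy tables",
        "big-bang release cycle": "migrate in small, reversible increments",
    }

    def diagnose(observed: list[str]) -> list[tuple[str, str]]:
        """Pair each observed petrifying pattern with its antidote, if catalogued."""
        return [(p, CATALOGUE[p]) for p in observed if p in CATALOGUE]

    print(diagnose(["schema-coupled modules", "big-bang release cycle"]))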

    Industry 4.0: product digital twins for remanufacturing decision-making

    Currently there is a desire to reduce natural resource consumption and expand circular business principles, whilst Industry 4.0 (I4.0) is regarded as the evolutionary and potentially disruptive movement of technology, automation, digitalisation, and data manipulation into the industrial sector. The remanufacturing industry is recognised as vital to the circular economy (CE), as it extends the in-use life of products, but its synergy with I4.0 has had little attention thus far. This thesis documents the first investigation into I4.0 in remanufacturing for a CE, contributing the design and demonstration of a model that optimises remanufacturing planning using data from different instances in a product's life cycle. The initial aim of this work was to identify the I4.0 technology that would enhance stability in remanufacturing, with a view to reducing resource consumption. As the project progressed, it narrowed to focus on the development of a product digital twin (DT) model to support data-driven decision making for operations planning. The model's architecture was derived using a bottom-up approach in which requirements were extracted from the identified complications in production planning and control that differentiate remanufacturing from manufacturing. Simultaneously, the benefits of enabling visibility of an asset's through-life health were obtained using a DT as the modus operandi. A product simulator and DT prototype were designed to use Internet of Things (IoT) components, a neural network for remaining-life estimations, and a search algorithm for operational planning optimisation. The DT was iteratively developed using case studies to validate and examine the real opportunities that exist in deploying a business model that harnesses, and commodifies, early-life product data for end-of-life processing optimisation. Findings suggest that, using intelligent programming networks and algorithms, a DT can enhance decision-making if it has visibility of the product and access to reliable remanufacturing process information. Existing IoT components provide rudimentary "smart" capabilities, but their integration is complex, and the durability of the systems over extended product life cycles needs to be further explored.
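
    A minimal sketch of the decision-support loop such a product DT might implement: sensor history feeds a remaining-life estimate, which gates a search over candidate remanufacturing operations. The linear "estimator" stands in for the thesis's neural network, and all operation names, costs, and values are illustrative assumptions.

    # Digital-twin sketch: a remaining-life estimate gates an exhaustive search
    # over remanufacturing operations for the highest net recovered value.
    from itertools import combinations

    class ProductDigitalTwin:
        def __init__(self, usage_hours: float, load_factor: float):
            self.usage_hours = usage_hours    # through-life usage from IoT telemetry
            self.load_factor = load_factor    # average operating load, 0..1

        def remaining_life(self) -> float:
            """Stand-in linear model for the neural-network life estimator."""
            return max(0.0, 10_000 - self.usage_hours * (1 + self.load_factor))

    # Candidate operations: (name, cost, value recovered by performing it).
    OPERATIONS = [("clean", 5.0, 8.0), ("recoat", 20.0, 35.0), ("rebore", 40.0, 30.0)]

    def best_plan(twin: ProductDigitalTwin) -> tuple[list[str], float]:
        """Search all operation subsets for the plan with maximum net value;
        remanufacture only if the core still has useful life."""
        best: tuple[list[str], float] = ([], 0.0)
        if twin.remaining_life() <= 0:
            return best
        for r in range(1, len(OPERATIONS) + 1):
            for plan in combinations(OPERATIONS, r):
                net = sum(value - cost for _, cost, value in plan)
                if net > best[1]:
                    best = ([name for name, _, _ in plan], net)
        return best

    twin = ProductDigitalTwin(usage_hours=3_000, load_factor=0.4)
    print(best_plan(twin))  # (['clean', 'recoat'], 18.0)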

    Platform protocol place: a practice-based study of critical media art practice (2007-2020)

    This practice-based research project focuses on critical media art practices in contemporary digital culture. The theoretical framework employed in this inquiry draws from the work of the Frankfurt School, in particular Theodor Adorno and Max Horkheimer's The Culture Industry: Enlightenment as Mass Deception. Using Adorno and Horkheimer's thesis as a theoretical guide, this research project formulates the concept of the digital culture industry - a concept that refers to the contemporary era of networked capitalism, an era defined by the unprecedented extraction, accumulation, and manipulation of data and by the material and digital infrastructures that facilitate it. This concept is used as a framing mechanism that articulates certain techno-political concerns within networked capitalism and responds to them through practice. The second concept formulated within this research project is Platform Protocol Place. The function of this second concept is to frame and outline the body of practice-based work developed in this study. It is also used to make complex technological issues accessible and to communicate these issues through public exhibition and within this written thesis. The final concept developed in this research project is tactical media archaeology, which describes the techniques and approaches employed in developing the body of practice-based work that is the central focus of this research project. This approach is a synthesis of two subfields of media art practice and theory: tactical media and media archaeology. Through practice, tactical media archaeology critiques the geopolitical machinations and systems beneath the networked devices and interfaces of the digital culture industry.

    Influence diagrams for complex litigation

    Effective advocacy depends critically on the ability of attorneys to formulate, analyze, and compare rival courses of action. Whereas attorneys have been doing these things for centuries using little more than their gut instincts and experiences, sophisticated decision aids are now available that can improve the way attorneys assess the value of their cases and the strategic decisions that they make. These aids are proving valuable in medicine and business, but they have not impacted legal practice. This Article seeks to correct this oversight by showing how easy-to-use graphical models provide guidance for strategic legal decisions. Beginning with a paradigmatic example of a plaintiff who must choose between proceeding to trial or settling out of court, the Article shows how decision aids handle the uncertainties and interdependencies that arise when real-world considerations are introduced. In particular, the Article makes the case that influence diagrams, a relative newcomer in the field of decision analysis, should be the decision aid of choice in complex litigation matters.
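
    In its simplest form, the Article's paradigm case reduces to comparing a settlement offer against the probability-weighted value of trial. A full influence diagram would also model dependencies between the uncertainties (libraries such as pyAgrum support influence diagrams); the probabilities and amounts below are illustrative assumptions.

    # Settle-or-trial as a bare expected-monetary-value comparison.
    # All figures are invented for illustration.
    def expected_value(outcomes: list[tuple[float, float]]) -> float:
        """Sum of probability-weighted payoffs."""
        return sum(p * payoff for p, payoff in outcomes)

    settlement_offer = 250_000.0
    trial_ev = expected_value([
        (0.6, 600_000.0),   # plaintiff prevails at trial
        (0.4, 0.0),         # plaintiff loses
    ]) - 150_000.0          # costs of proceeding to trial

    decision = "settle" if settlement_offer >= trial_ev else "go to trial"
    print(decision, f"(net trial EV: {trial_ev:,.0f})")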