1,982 research outputs found

    Joint Chance-constrained Game for Coordinating Microgrids in Energy and Reserve Markets: A Bayesian Optimization Approach

    Full text link
    Microgrids incorporate distributed energy resources (DERs) and flexible loads, which can provide energy and reserve services for the main grid. However, due to uncertain renewable generation such as solar power, microgrids might under-deliver reserve services and breach day-ahead contracts in real time. If multiple microgrids breach their reserve contracts simultaneously, this could lead to a severe grid contingency. This paper designs a distributionally robust joint chance-constrained (DRJCC) game-theoretical framework considering uncertain real-time reserve provisions and the value of lost load (VoLL). Leveraging historical error samples, the reserve bidding strategy of each microgrid is formulated as a two-stage Wasserstein-metric distributionally robust optimization (DRO) model. A joint chance constraint (JCC) is employed to regulate the under-delivered reserve capacity of all microgrids in a non-cooperative game. Considering the unknown correlation among players, a novel Bayesian optimization method approximates the optimal individual violation rates of microgrids and the market equilibrium. The proposed game framework with the optimal rates is simulated with up to 14 players in a 30-bus network. Case studies are conducted using California power market data. The proposed Bayesian method can effectively regulate the joint violation rate of the under-delivered reserve and secure the profit of microgrids in the reserve market.
    Comment: Received and Revised in 2023; IEEE Journa
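    The sketch below is a minimal illustration (not from the paper) of the Bayesian optimization step the abstract describes: a Gaussian-process surrogate with expected improvement searches over a candidate individual violation rate. The function `market_profit_given_rate`, the 5% joint violation target, and the crude independence proxy over 14 players are all assumptions introduced for the toy example; in the paper this evaluation would involve solving the DRJCC game and simulating real-time reserve delivery.

```python
# Hedged sketch: Bayesian optimization of an individual violation rate for a
# joint chance constraint, using a GP surrogate and expected improvement.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)
JOINT_TARGET = 0.05  # admissible joint violation probability (assumed)
N_PLAYERS = 14       # player count taken from the abstract's case study

def market_profit_given_rate(eps: float) -> float:
    """Hypothetical oracle: stand-in for solving the reserve-market game at
    individual violation rate `eps` and returning aggregate microgrid profit,
    penalized when the empirical joint violation rate exceeds the target."""
    joint_violation = 1.0 - (1.0 - eps) ** N_PLAYERS   # crude independence proxy
    profit = np.log1p(20.0 * eps)                      # looser constraint -> more profit
    penalty = 50.0 * max(0.0, joint_violation - JOINT_TARGET)
    return profit - penalty + 0.01 * rng.standard_normal()

def expected_improvement(x, gp, best_y, xi=0.01):
    """Expected improvement of candidates `x` over the best observed value."""
    mu, sigma = gp.predict(x.reshape(-1, 1), return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - best_y - xi) / sigma
    return (mu - best_y - xi) * norm.cdf(z) + sigma * norm.pdf(z)

# Initial design, then iterate: fit the GP surrogate and pick the rate maximizing EI.
X = rng.uniform(1e-4, 0.02, size=5)
y = np.array([market_profit_given_rate(e) for e in X])
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(25):
    gp.fit(X.reshape(-1, 1), y)
    cand = np.linspace(1e-4, 0.02, 500)
    eps_next = cand[np.argmax(expected_improvement(cand, gp, y.max()))]
    X = np.append(X, eps_next)
    y = np.append(y, market_profit_given_rate(eps_next))

print(f"suggested individual violation rate: {X[np.argmax(y)]:.4f}")
```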

    Price formation in Local Electricity Markets

    Get PDF

    A long short-term memory relation network for real-time prediction of patient-specific ventilator parameters

    Get PDF
    Accurate prediction of patient-specific ventilator parameters is crucial for optimizing patient-ventilator interaction. Current approaches encounter difficulties in concurrently observing long-term, time-series dependencies and capturing complex, significant features that influence the ventilator treatment process, thereby hindering accurate prediction of ventilator parameters. To address these challenges, we propose a novel approach called the long short-term memory relation network (LSTMRnet). Our approach uses a long short-term memory bank to store rich information and a feature selection step to extract relevant features related to respiratory parameters. This information is obtained from the prior knowledge of the follow-up model. We also concatenate the embeddings of both information types to maintain the joint learning of spatio-temporal features. Our LSTMRnet effectively preserves both time-series and complex spatial-critical feature information, enabling accurate prediction of ventilator parameters. We extensively validate our approach using the publicly available Medical Information Mart for Intensive Care (MIMIC-III) dataset and achieve superior results, which can potentially be utilized for ventilator treatment (i.e., sleep apnea-hypopnea syndrome ventilator treatment and intensive care unit ventilator treatment).
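    Below is a minimal, hypothetical sketch of the kind of architecture the abstract describes, not the authors' LSTMRnet: an LSTM branch encodes the ventilation time series, an MLP branch embeds a pre-selected feature subset, and the two embeddings are concatenated before a regression head that predicts ventilator parameters. All layer sizes, input dimensions, and the feature-selection step itself are placeholder assumptions.

```python
# Hedged sketch of an LSTM + selected-feature fusion model (not the paper's code).
import torch
import torch.nn as nn

class LSTMRelationSketch(nn.Module):
    def __init__(self, n_signals=12, n_selected=8, hidden=64, n_targets=3):
        super().__init__()
        self.lstm = nn.LSTM(n_signals, hidden, batch_first=True)            # temporal memory over signals
        self.feat = nn.Sequential(nn.Linear(n_selected, hidden), nn.ReLU()) # embedding of selected features
        self.head = nn.Sequential(nn.Linear(2 * hidden, hidden), nn.ReLU(),
                                  nn.Linear(hidden, n_targets))             # regression head

    def forward(self, series, selected):
        # series: (batch, time, n_signals); selected: (batch, n_selected)
        _, (h_n, _) = self.lstm(series)
        temporal = h_n[-1]                          # last hidden state as temporal embedding
        static = self.feat(selected)                # embedding of pre-selected clinical features
        return self.head(torch.cat([temporal, static], dim=-1))

model = LSTMRelationSketch()
pred = model(torch.randn(4, 60, 12), torch.randn(4, 8))  # e.g. 60 time steps per patient
print(pred.shape)  # torch.Size([4, 3])
```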

    Collective Embodiment and Communal Feeling: A Critical Somatics Approach to Performance for Social Change

    Get PDF
    “Collective Embodiment and Communal Feeling: A Critical Somatics Approach to Performance for Social Change” argues for a novel approach to performance for social change that focuses on the sensory and somatic dimensions of collectivity as the basis for countering the atomizing politics of neoliberalism. It proposes a critical somatics approach to the deconstruction and reconfiguration of participants’ embodied subjectivities, emphasizing the cultivation of conditions that facilitate experiences of collective embodiment and affective interdependence. Whether in the kinesthetic awareness of bodies dancing together, the situational or proprioceptive awareness of a collective engaged in creative disruption, or the physical contact of activists’ clasped arms forming a human chain in protest, these conditions require multisensory engagement, improvisational coordination, and shared feeling. Based on ethnographic accounts of the phenomenological experience of collective embodiment, I argue that such experiences enact—rather than merely argue for—forms of collectivity through their operation on the level of the body. This approach to performance for social change builds on the experience of practitioners and artist-activists in an effort to preserve the core contributions of existing techniques while seeking avenues to overcome their susceptibility to the influence of increasingly ubiquitous neoliberal frameworks. Opening with a consideration of Augusto Boal’s Theatre of the Oppressed as a touchstone example, I argue that the technique’s cognitive approach to social change and its emphasis on discursive techniques contribute to the manner in which it individualizes responsibility for combating systemic oppression. Turning to Cynthia Winton-Henry and Phil Porter’s InterPlay as an example of an affective approach to performance for social change, I critique its practitioners’ culture of individualism, but identify the critical potential of its recognition of collective embodiment. Extending this analysis to protest and direct action, I explore the existential prefiguration of communities of care and the cultivation of communal feeling, an affective and collective form of embodied cognition. After offering a series of activities designed to create the conditions for experiences of collective embodiment and develop the affective bonds of communal feeling, I close with a consideration of the broader implications of positioning speculative theory at the forefront of movements’ political practice.

    Grounds for a Third Place : The Starbucks Experience, Sirens, and Space

    Get PDF
    My goal in this dissertation is to help demystify or “filter” the “Starbucks Experience” for a post-pandemic world, taking stock of how a multi-national company has long outgrown its humble beginnings as a wholesale coffee bean supplier to become a digitally-integrated and hypermodern cafĂ©. I look at the role Starbucks plays within the larger cultural history of the coffee house and also consider how Starbucks has been idyllically described in corporate discourse as a comfortable and discursive “third place” for informal gathering, a term that also prescribes its own radical ethos as a globally recognized customer service platform. Attempting to square Starbucks’ iconography and rhetoric with a new critical methodology, in a series of interdisciplinary case studies, I examine the role Starbucks’ “third place” philosophy plays within larger conversations about urban space and commodity culture, analyze Starbucks advertising, architecture and art, and trace the mythical rise of the Starbucks Siren (and the reiterations and re-imaginings of the Starbucks Siren in art and media). While in corporate rhetoric Starbucks’ “third place” is depicted as an enthralling adventure, full of play, discovery, authenticity, or “romance,” I draw on critical theory to discuss how it operates today as a space of distraction, isolation, and loss

    Resource Allocation in Networking and Computing Systems: A Security and Dependability Perspective

    Get PDF
    In recent years, there has been a trend to integrate networking and computing systems, whose management is getting increasingly complex. Resource allocation is one of the crucial aspects of managing such systems and is affected by this increased complexity. Resource allocation strategies aim to effectively maximize performance, system utilization, and profit by considering virtualization technologies, heterogeneous resources, context awareness, and other features. In such a complex scenario, security and dependability are vital concerns that need to be considered in future computing and networking systems in order to provide future advanced services, such as mission-critical applications. This paper provides a comprehensive survey of existing literature that considers security and dependability for resource allocation in computing and networking systems. The current research works are categorized by considering the allocated type of resources for different technologies, scenarios, issues, attributes, and solutions. The paper presents the research works on resource allocation that include security and dependability, both singularly and jointly. The future research directions on resource allocation are also discussed. The paper shows that there are only a few works that, even singularly, consider security and dependability in resource allocation in future computing and networking systems, and highlights the importance of jointly considering security and dependability and the need for intelligent, adaptive and robust solutions. This paper aims to help researchers effectively consider security and dependability in future networking and computing systems.

    Flooded with terror: Identifying existential threat in water crisis communication and exploring gender bias in the depths of water management

    Get PDF
    The purpose of this dissertation was to advance understanding of gender inequity in water management and the ways in which threatening water communication may contribute to that inequity. Water crises are increasing with climate change, and the communication of potentially fatal outcomes is ever-present via media and ongoing catastrophic climate events. While climate scholars have demonstrated that diverse decision-making groups lead to improved environmental and ethical outcomes – outcomes that include effective solutions to water crises – top-level water management in the Global North remains largely homogeneously male. I explored this disconnect through the lens of Terror Management Theory (TMT) to identify how life-threatening water crisis communication may influence environmental attitudes and intergroup relations within water decision-making contexts. Terror Management Theory empirically tests the influence of mortality reminders on human behaviour and has identified predictable and replicable ways in which we respond to reminders of our eventual demise (Chapter One). Climate change has been established as a mortality reminder within Terror Management Theory research, as it evokes existential anxieties in those who consider experiencing climate change or its consequences. Water, however, had not previously been tested as a mortality reminder. The research within this dissertation was guided by three interconnected objectives: (1) to determine if threatening water messages evoke mortality salience similarly to typical TMT mortality reminders; (2) to identify how pro-environmental worldviews or identities are influenced by mortality salience and/or life-threatening water reminders; and (3) building on prior objectives, to determine whether judgments about same or different gendered water decision-makers are influenced by mortality salience from a typical and/or water-related mortality reminder. This dissertation followed social psychology methods as developed and applied within Terror Management Theory to identify the psychosocial responses to threatening water reminders (Chapters Two and Three) and the influence of these responses on gender dynamics within water crisis decision-making (Chapter Four). Findings provided confirmation that some framings of water crises evoked mortality anxieties in American and Canadian populations (Chapter Two) and delivered evidence of environmental identity reinforcement following a typical mortality or life-threatening water reminder (Chapter Three). Findings also illustrated that mortality salience influenced appraisal of male and female water managers, and that these appraisals were also influenced by underlying levels of sexism and, potentially, connected gender role stereotypes. In addition to academic contributions from this research, outcomes from this dissertation inform water communication campaigns (e.g., when threatening communications might motivate pro-environmental change and when they might not) and provide guidance for equity efforts, particularly among leadership contexts that are presently male-dominated. Understanding how to develop and implement water crisis solutions is necessary in our changing climate. These solutions require recognizing how to best create and foster diverse, equitable decision-making groups that retain and respect that diversity so all can be meaningfully included.

    Towards a human-centric data economy

    Get PDF
    Spurred by widespread adoption of artificial intelligence and machine learning, “data” is becoming a key production factor, comparable in importance to capital, land, or labour in an increasingly digital economy. In spite of an ever-growing demand for third-party data in the B2B market, firms are generally reluctant to share their information. This is due to the unique characteristics of “data” as an economic good (a freely replicable, non-depletable asset holding a highly combinatorial and context-specific value), which moves digital companies to hoard and protect their “valuable” data assets, and to integrate across the whole value chain seeking to monopolise the provision of innovative services built upon them. As a result, most of those valuable assets still remain unexploited in corporate silos nowadays. This situation is shaping the so-called data economy around a number of champions, and it is hampering the benefits of a global data exchange on a large scale. Some analysts have estimated the potential value of the data economy at US$2.5 trillion globally by 2025. Not surprisingly, unlocking the value of data has become a central policy of the European Union, which also estimated the size of the data economy at €827 billion for the EU27 over the same period. Within the scope of the European Data Strategy, the European Commission is also steering initiatives aimed at identifying relevant cross-industry use cases involving different verticals and at enabling sovereign data exchanges to realise them. Among individuals, the massive collection and exploitation of personal data by digital firms in exchange for services, often with little or no consent, has raised a general concern about privacy and data protection. Apart from spurring recent legislative developments in this direction, this concern has prompted voices warning about the unsustainability of the existing digital economy (few digital champions, potential negative impact on employment, growing inequality), some of which propose that people be paid for their data in a sort of worldwide data labour market as a potential solution to this dilemma [114, 115, 155]. From a technical perspective, we are far from having the required technology and algorithms that will enable such a human-centric data economy. Even its scope is still blurry, and the question of the value of data is, at the very least, controversial. Research works from different disciplines have studied the data value chain, different approaches to the value of data, how to price data assets, and novel data marketplace designs. At the same time, complex legal and ethical issues with respect to the data economy have arisen around privacy, data protection, and ethical AI practices. In this dissertation, we start by exploring the data value chain and how entities trade data assets over the Internet. We carry out what is, to the best of our understanding, the most thorough survey of commercial data marketplaces. In this work, we have catalogued and characterised ten different business models, including those of personal information management systems, companies born in the wake of recent data protection regulations that aim to empower end users to take control of their data. We have also identified the challenges faced by different types of entities, and what kind of solutions and technology they are using to provide their services. Then we present a first-of-its-kind measurement study that sheds light on the prices of data in the market using a novel methodology.
    We study how ten commercial data marketplaces categorise and classify data assets, and which categories of data command higher prices. We also develop classifiers for comparing data products across different marketplaces, and we study the characteristics of the most valuable data assets and the features that specific vendors use to set the price of their data products. Based on this information and adding data products offered by another 33 data providers, we develop a regression analysis for revealing features that correlate with prices of data products. As a result, we also implement the basic building blocks of a novel data pricing tool capable of providing a hint of the market price of a new data product using just its metadata as inputs. This tool would provide more transparency on the prices of data products in the market, which would help in pricing data assets and in avoiding the inherent price fluctuation of nascent markets.
    Next we turn to topics related to data marketplace design. Particularly, we study how buyers can select and purchase suitable data for their tasks without requiring a priori access to such data in order to make a purchase decision, and how marketplaces can distribute payoffs for a data transaction combining data of different sources among the corresponding providers, be they individuals or firms. The difficulty of both problems is further exacerbated in a human-centric data economy where buyers have to choose among data of thousands of individuals, and where marketplaces have to distribute payoffs to thousands of people contributing personal data to a specific transaction. Regarding the selection process, we compare different purchase strategies depending on the level of information available to data buyers at the time of making decisions. A first methodological contribution of our work is proposing a data evaluation stage prior to datasets being selected and purchased by buyers in a marketplace. We show that buyers can significantly improve the performance of the purchasing process just by being provided with a measurement of the performance of their models when trained by the marketplace with individual eligible datasets. We design purchase strategies that exploit such functionality, calling the resulting algorithm Try Before You Buy, and our work demonstrates over synthetic and real datasets that it can lead to near-optimal data purchasing with only O(N) evaluations instead of the exponential O(2^N) execution time needed to calculate the optimal purchase. With regards to the payoff distribution problem, we focus on computing the relative value of spatio-temporal datasets combined in marketplaces for predicting transportation demand and travel time in metropolitan areas. Using large datasets of taxi rides from Chicago, Porto and New York, we show that the value of data is different for each individual and cannot be approximated by its volume. Our results reveal that even more complex approaches based on the “leave-one-out” value are inaccurate. Instead, more complex and acknowledged notions of value from economics and game theory, such as the Shapley value, need to be employed if one wishes to capture the complex effects of mixing different datasets on the accuracy of forecasting algorithms. However, the Shapley value entails serious computational challenges. Its exact calculation requires repetitively training and evaluating every combination of data sources, and hence O(N!) or O(2^N) computational time, which is unfeasible for complex models or thousands of individuals. Moreover, our work paves the way to new methods of measuring the value of spatio-temporal data. We identify heuristics such as entropy or similarity to the average that show a significant correlation with the Shapley value and therefore can be used to overcome the significant computational challenges posed by Shapley approximation algorithms in this specific context.
    We conclude with a number of open issues and propose further research directions that leverage the contributions and findings of this dissertation. These include monitoring data transactions to better measure data markets, and complementing market data with actual transaction prices to build a more accurate data pricing tool. A human-centric data economy would also require that the contributions of thousands of individuals to machine learning tasks be calculated daily. For that to be feasible, we need to further optimise the efficiency of data purchasing and payoff calculation processes in data marketplaces. In that direction, we also point to some alternatives to repetitively training and evaluating a model in order to select data based on Try Before You Buy and approximate the Shapley value. Finally, we discuss the challenges and potential technologies that help with building a federation of standardised data marketplaces. The data economy will develop fast in the upcoming years, and researchers from different disciplines will work together to unlock the value of data and make the most out of it. Perhaps the proposal that people be paid for their data and their contribution to the data economy will finally take off, or perhaps other proposals, such as a robot tax, will ultimately be used to balance the power between individuals and tech firms in the digital economy. Still, we hope our work sheds light on the value of data, and contributes to making the price of data more transparent and, eventually, to moving towards a human-centric data economy.
    This work has been supported by IMDEA Networks Institute. Doctoral Programme in Telematics Engineering, Universidad Carlos III de Madrid. Thesis committee: Chair: Georgios Smaragdakis; Secretary: Ángel Cuevas Rumín; Member: Pablo Rodríguez Rodrígue
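    As a purely illustrative companion to the dissertation abstract above (not its actual code), the sketch below shows a simplified reading of the two computational ideas discussed: a Try Before You Buy selection that needs only one model evaluation per candidate dataset, and a permutation-sampling (Monte Carlo) estimate of Shapley values that avoids exact O(N!) enumeration. The data sources, the `train_and_score` utility, and the toy accuracy numbers are all hypothetical stand-ins for training the buyer's forecasting model on a set of purchased datasets.

```python
# Hedged sketch: greedy "Try Before You Buy" selection and Monte Carlo Shapley
# estimation over a toy set of data providers.
import random

SOURCES = ["provider_A", "provider_B", "provider_C", "provider_D"]

def train_and_score(sources: frozenset) -> float:
    """Placeholder utility: accuracy of a model trained on `sources` (toy numbers)."""
    base = {"provider_A": 0.30, "provider_B": 0.25, "provider_C": 0.10, "provider_D": 0.05}
    return min(1.0, sum(base[s] for s in sources))

def try_before_you_buy(budget: int) -> list:
    """Simplified reading of Try Before You Buy: the marketplace reports model
    performance when trained on each eligible dataset alone (N evaluations),
    and the buyer ranks datasets by that score before purchasing."""
    individual = {s: train_and_score(frozenset({s})) for s in SOURCES}   # O(N) evaluations
    return sorted(SOURCES, key=individual.get, reverse=True)[:budget]

def monte_carlo_shapley(n_permutations: int = 200) -> dict:
    """Permutation-sampling estimate of each source's Shapley value, avoiding
    the exact O(N!) enumeration of orderings."""
    values = {s: 0.0 for s in SOURCES}
    for _ in range(n_permutations):
        order = random.sample(SOURCES, len(SOURCES))   # random ordering of providers
        coalition, prev = frozenset(), 0.0
        for s in order:
            coalition = coalition | {s}
            score = train_and_score(coalition)
            values[s] += (score - prev) / n_permutations   # marginal contribution
            prev = score
    return values

print(try_before_you_buy(budget=2))
print(monte_carlo_shapley())
```

    In this reading, the entropy and similarity-to-the-average heuristics mentioned in the abstract would stand in for the repeated `train_and_score` calls inside the Shapley loop when even the sampled estimate is too costly.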