76 research outputs found

    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    Get PDF
    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    IT'S NOT RAINBOWS AND UNICORNS: REGULATED COMMODITY AND WASTE PRODUCTION IN THE ALBERTA OILSANDS

    Get PDF
    This dissertation examines the regulated oilsands mining industry of Alberta, Canada, widely considered the world’s largest surface mining project. The industrial processes of oilsands mining produce well over one million barrels of petroleum commodities daily, plus even larger quantities of airborne and semisolid waste. The project argues for a critical account of production concretized in the co-constitutional relations of obdurate materiality and labor activity within a framework of regulated petro-capitalism. This pursuit requires multiple methods that combine archives, participant observation, and semi-structured interviews to understand workers’ shift-to-shift relations inside the “black box” of regulated oilsands mining production, where materiality co-constitutes the processes and outcomes of resource development and waste-intensive production. Here, the central contradiction pits the industry’s colossal environmental impact against its regulated environmental relations, which – despite chronic exceedances – are held under some control by provincial and federal environmental agents, further attenuated by firms’ selective voluntary compliance with global quality standards as well as by whistleblowers and otherwise “troublesome” employees. ‘It’s not rainbows and unicorns,’ explains one informant, distilling workers’ views of the safety and environmental hazards they simultaneously produce and endure as wage laborers despite pervasive regulation. In addition to buttressing geographical conceptualizations of socionatural resource production, contributions arise from the sympathetic engagement with workers, which may hold useful insights for activism against the industry’s environmental outcomes.

    Vector Semantics

    Get PDF
    This open access book introduces Vector semantics, which links the formal theory of word vectors to the cognitive theory of linguistics. The computational linguists and deep learning researchers who developed word vectors have relied primarily on the ever-increasing availability of large corpora and of computers with highly parallel GPU and TPU compute engines, and their focus is on endowing computers with natural language capabilities for practical applications such as machine translation or question answering. Cognitive linguists investigate natural language from the perspective of human cognition, the relation between language and thought, and questions about conceptual universals, relying primarily on in-depth investigation of language in use. Although both schools have ‘linguistics’ in their name, there has so far been very limited communication between them, as their historical origins, data collection methods, and conceptual apparatuses are quite different. Vector semantics bridges the gap by presenting a formal theory, cast in terms of linear polytopes, that generalizes both word vectors and conceptual structures, by treating each dictionary definition as an equation, and the entire lexicon as a set of equations mutually constraining all meanings.
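The core idea, that each dictionary definition acts as an equation constraining word meanings, can be illustrated with a toy sketch. The three-dimensional vectors and the simple averaging rule below are hypothetical stand-ins for illustration only, not the book's linear-polytope formalism:

```python
import numpy as np

# Toy vocabulary vectors (hypothetical 3-d embeddings, chosen by hand).
vec = {
    "feline": np.array([1.0, 0.2, 0.0]),
    "pet":    np.array([0.1, 1.0, 0.3]),
    "animal": np.array([0.5, 0.5, 0.5]),
}

# Treat the dictionary definition "cat = feline pet animal" as an equation:
# the meaning of "cat" is constrained to be (here, crudely approximated by)
# the average of the vectors of its defining words.
vec["cat"] = np.mean([vec["feline"], vec["pet"], vec["animal"]], axis=0)

def cosine(a, b):
    """Standard cosine similarity between two word vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# A word defined this way ends up close to the words of its definition.
print(cosine(vec["cat"], vec["feline"]))  # ≈ 0.77
```

In the book's actual theory each definition constrains meanings inside a linear polytope rather than pinning them to an average, but the sketch shows the sense in which the lexicon becomes a system of mutually constraining equations.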

    Deep Learning for Abstraction, Control and Monitoring of Complex Cyber-Physical Systems

    Get PDF
    Cyber-Physical Systems (CPS) consist of digital devices that interact with some physical components. Their popularity and complexity are growing exponentially, giving birth to new, previously unexplored, safety-critical application domains. As CPS permeate our daily lives, it becomes imperative to reason about their reliability. Formal methods provide rigorous techniques for verification, control and synthesis of safe and reliable CPS. However, these methods do not scale with the complexity of the system, so their applicability to real-world problems is limited. A promising strategy is to leverage deep learning techniques to tackle the scalability issue of formal methods, transforming unfeasible problems into approximately solvable ones. The approximate models are trained over observations which are solutions of the formal problem. In this thesis, we focus on the following computationally challenging tasks: the modeling and simulation of a complex stochastic model, the design of a safe and robust control policy for a system acting in a highly uncertain environment, and the runtime verification problem under full or partial observability. Our approaches, based on deep learning, are applicable to real-world complex and safety-critical systems acting under strict real-time constraints and in the presence of a significant amount of uncertainty.
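The strategy of training an approximate model on observations that are solutions of the exact formal problem can be sketched in miniature. The safety oracle, the quadratic features, and the logistic-regression surrogate below are illustrative assumptions, not the thesis's models:

```python
import numpy as np

rng = np.random.default_rng(0)

def exact_safety_check(state):
    # Stand-in for a costly formal verification oracle: a state (x, v) is
    # declared "safe" (1.0) when it lies inside the unit disc.
    return 1.0 if state @ state < 1.0 else 0.0

# Training observations: exact labels computed for sampled states.
X = rng.uniform(-2, 2, size=(2000, 2))
y = np.array([exact_safety_check(s) for s in X])

# Tiny logistic-regression surrogate on quadratic features [x^2, v^2],
# trained by plain gradient descent on the logistic loss.
F = X ** 2
w, b = np.zeros(2), 0.0
for _ in range(5000):
    p = 1 / (1 + np.exp(-(F @ w + b)))
    g = p - y
    w -= 0.2 * (F.T @ g) / len(y)
    b -= 0.2 * g.mean()

# The cheap surrogate now approximates the expensive oracle on unseen states.
Xt = rng.uniform(-2, 2, size=(500, 2))
pred = (1 / (1 + np.exp(-((Xt ** 2) @ w + b))) > 0.5).astype(float)
truth = np.array([exact_safety_check(s) for s in Xt])
accuracy = (pred == truth).mean()
```

The thesis uses deep networks rather than logistic regression, but the workflow is the same: query the exact (slow) formal procedure offline to generate labeled observations, then deploy the learned approximation where real-time constraints rule out the exact check.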

    Three Risky Decades: A Time for Econophysics?

    Get PDF
    We publish this Special Issue at a turning point of a kind not faced since World War II. Interconnected long-term global shocks such as the coronavirus pandemic, the war in Ukraine, and catastrophic climate change have imposed significant humanitarian, socio-economic, political, and environmental restrictions on the globalization process and on all aspects of economic and social life, including the existence of individual people. The planet is trapped—the current situation seems to be the prelude to an apocalypse whose long-term effects will be felt for decades. Therefore, a concept of the planet's survival urgently needs to be built—only on this basis can the conditions for its development be created. The Special Issue gives evidence of the state of econophysics before the current situation. It can therefore provide an excellent econophysics-based, inter- and cross-disciplinary starting point for a rational approach to a new era.

    Collected Papers (on Neutrosophic Theory and Its Applications in Algebra), Volume IX

    Get PDF
    This ninth volume of Collected Papers includes 87 papers comprising 982 pages on Neutrosophic Theory and its applications in Algebra, written between 2014 and 2022 by the author alone or in collaboration with the following 81 co-authors (alphabetically ordered) from 19 countries: E.O. Adeleke, A.A.A. Agboola, Ahmed B. Al-Nafee, Ahmed Mostafa Khalil, Akbar Rezaei, S.A. Akinleye, Ali Hassan, Mumtaz Ali, Rajab Ali Borzooei, Assia Bakali, Cenap Özel, Victor Christianto, Chunxin Bo, Rakhal Das, Bijan Davvaz, R. Dhavaseelan, B. Elavarasan, Fahad Alsharari, T. Gharibah, Hina Gulzar, Hashem Bordbar, Le Hoang Son, Emmanuel Ilojide, TĂšmĂ­tĂłpĂ© GbĂłlĂĄhĂ n JaĂ­yĂ©olĂĄ, M. Karthika, Ilanthenral Kandasamy, W.B. Vasantha Kandasamy, Huma Khan, Madad Khan, Mohsin Khan, Hee Sik Kim, Seon Jeong Kim, Valeri Kromov, R. M. Latif, Madeleine Al-Tahan, Mehmat Ali Ozturk, Minghao Hu, S. Mirvakili, Mohammad Abobala, Mohammad Hamidi, Mohammed Abdel-Sattar, Mohammed A. Al Shumrani, Mohamed Talea, Muhammad Akram, Muhammad Aslam, Muhammad Aslam Malik, Muhammad Gulistan, Muhammad Shabir, G. Muhiuddin, Memudu Olaposi Olatinwo, Osman Anis, Choonkil Park, M. Parimala, Ping Li, K. Porselvi, D. Preethi, S. Rajareega, N. Rajesh, Udhayakumar Ramalingam, Riad K. Al-Hamido, Yaser Saber, Arsham Borumand Saeid, Saeid Jafari, Said Broumi, A.A. Salama, Ganeshsree Selvachandran, Songtao Shao, Seok-Zun Song, Tahsin Oner, M. Mohseni Takallo, Binod Chandra Tripathy, Tugce Katican, J. Vimala, Xiaohong Zhang, Xiaoyan Mao, Xiaoying Wu, Xingliang Liang, Xin Zhou, Yingcang Ma, Young Bae Jun, Juanjuan Zhang.

    Advances in Computer Science and Engineering

    Get PDF
    The book Advances in Computer Science and Engineering comprises a revised selection of 23 chapters written by scientists and researchers from all over the world. The chapters cover topics in the scientific fields of Applied Computing Techniques, Innovations in Mechanical Engineering, Electrical Engineering and Applications, and Advances in Applied Modeling.

    Sublinear Computation Paradigm

    Get PDF
    This open access book gives an overview of cutting-edge work on a new paradigm called the “sublinear computation paradigm,” which was proposed in the large multiyear academic research project “Foundations of Innovative Algorithms for Big Data.” That project ran in Japan from October 2014 to March 2020. To handle the unprecedented explosion of big data sets in research, industry, and other areas of society, there is an urgent need to develop novel methods and approaches for big data analysis. To meet this need, innovative changes in algorithm theory for big data are being pursued. For example, polynomial-time algorithms have thus far been regarded as “fast,” but if a quadratic-time algorithm is applied to a petabyte-scale or larger big data set, problems are encountered in terms of computational resources or running time. To deal with this critical computational and algorithmic bottleneck, linear, sublinear, and constant-time algorithms are required. The sublinear computation paradigm is proposed here in order to support innovation in the big data era. A foundation of innovative algorithms has been created by developing computational procedures, data structures, and modelling techniques for big data. The project is organized into three teams that focus on sublinear algorithms, sublinear data structures, and sublinear modelling. The work has provided high-level academic research results of strong computational and algorithmic interest, which are presented in this book. The book consists of five parts: Part I, which consists of a single chapter on the concept of the sublinear computation paradigm; Parts II, III, and IV review results on sublinear algorithms, sublinear data structures, and sublinear modelling, respectively; Part V presents application results. The information presented here will inspire the researchers who work in the field of modern algorithms.
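The contrast drawn above between linear or quadratic-time scans and sublinear methods can be illustrated with a sampling-based estimator whose cost depends on the desired accuracy, not on the input size. This is a generic sketch of the idea, not an algorithm taken from the book:

```python
import random

def estimate_fraction(data, predicate, samples=2000, seed=0):
    """Estimate the fraction of items satisfying `predicate` by uniform
    random sampling. Cost is O(samples), independent of len(data)."""
    rng = random.Random(seed)
    hits = sum(predicate(data[rng.randrange(len(data))]) for _ in range(samples))
    return hits / samples

# A full scan is Theta(n) and a pairwise pass Theta(n^2) -- infeasible at
# petabyte scale.  Here the cost stays fixed at `samples` lookups no matter
# how large the data set grows.
data = list(range(1_000_000))
est = estimate_fraction(data, lambda x: x % 2 == 0)
```

By standard concentration bounds, the estimate is within additive error ε of the true fraction with high probability once the number of samples is on the order of 1/ε², which is the sense in which property-testing-style algorithms achieve constant running time.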

    Parameterized analysis of complexity

    Get PDF
