3,677 research outputs found

    Climate Change and Critical Agrarian Studies

    Climate change is perhaps the greatest threat to humanity today and acts as a cruel engine of myriad forms of injustice, violence and destruction. The effects of climate change from human-made emissions of greenhouse gases are devastating and accelerating, yet uncertain and uneven in both their geography and their socio-economic impacts. Emerging from the dynamics of capitalism since the industrial revolution, as well as from industrialisation under state-led socialism, the consequences of climate change are especially profound for the countryside and its inhabitants. The book interrogates the narratives and strategies that frame climate change and examines the institutionalised responses in agrarian settings, highlighting the exclusions and inclusions that result. It explores how different people, in relation to class and other co-constituted axes of social difference such as gender, race, ethnicity, age and occupation, are affected by climate change, as well as by the climate adaptation and mitigation responses being implemented in rural areas. The book then explores how climate change, and the responses to it, affect processes of social differentiation, trajectories of accumulation and, in turn, agrarian politics. Finally, the book examines what strategies are required to confront climate change, and the underlying political-economic dynamics that cause it, reflecting on what this means for agrarian struggles across the world. The 26 chapters in this volume explore how the relationship between capitalism and climate change plays out in the rural world and, in particular, how agrarian struggles connect with the huge challenge of climate change. Through a wide variety of case studies alongside more conceptual chapters, the book makes the often-missing connection between climate change and critical agrarian studies, arguing that making the connection between climate and agrarian justice is crucial.

    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    Auditable and performant Byzantine consensus for permissioned ledgers

    Permissioned ledgers allow users to execute transactions against a data store and retain proof of their execution in a replicated ledger. Each replica verifies the transactions' execution and ensures that, in perpetuity, a committed transaction cannot be removed from the ledger. Unfortunately, this is not guaranteed by today's permissioned ledgers, which can be re-written if an arbitrary number of replicas collude. In addition, the transaction throughput of permissioned ledgers is low, hampering real-world deployments, because they do not take advantage of multi-core CPUs and hardware accelerators. This thesis explores how permissioned ledgers and their consensus protocols can be made auditable in perpetuity, even when all replicas collude and re-write the ledger. It also addresses how Byzantine consensus protocols can be changed to increase the execution throughput of complex transactions. This thesis makes the following contributions: 1. Always-auditable Byzantine consensus protocols. We present a permissioned ledger system that can assign blame to individual replicas regardless of how many of them misbehave. This is achieved by signing and storing consensus protocol messages in the ledger and providing clients with signed, universally-verifiable receipts. 2. Performant transaction execution with hardware accelerators. Next, we describe a cloud-based ML inference service that provides strong integrity guarantees while staying compatible with current inference APIs. We change the Byzantine consensus protocol to execute machine learning (ML) inference computation on GPUs, optimizing the throughput and latency of inference. 3. Parallel transaction execution on multi-core CPUs. Finally, we introduce a permissioned ledger that executes transactions in parallel on multi-core CPUs. We separate the execution of transactions between the primary and secondary replicas: the primary replica executes transactions on multiple CPU cores and creates a dependency graph of the transactions, which the backup replicas use to execute transactions in parallel.
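    The dependency-graph scheduling described in the third contribution can be sketched in a few lines. The following is a minimal illustration, not the thesis's actual implementation: the key-based read/write conflict model, the transaction format, and all names are assumptions for the sketch. Transactions within a level are mutually independent, so a backup replica could hand each level's members to separate CPU cores.

```python
def build_dependency_graph(txns):
    """Map each transaction id to the ids of earlier transactions it
    conflicts with. Two transactions conflict when one writes a key that
    the other reads or writes."""
    deps = {t["id"]: set() for t in txns}
    for i, t in enumerate(txns):
        for u in txns[:i]:
            if (t["writes"] & (u["reads"] | u["writes"])
                    or u["writes"] & t["reads"]):
                deps[t["id"]].add(u["id"])
    return deps

def schedule_levels(deps):
    """Group transactions into levels: every transaction in a level depends
    only on transactions in earlier levels, so a level's members can be
    executed in parallel."""
    levels, done = [], set()
    remaining = dict(deps)
    while remaining:
        ready = sorted(t for t, d in remaining.items() if d <= done)
        levels.append(ready)
        done.update(ready)
        for t in ready:
            del remaining[t]
    return levels
```

    For example, if t2 reads a key written by t1 while t3 touches a disjoint key, the schedule comes out as [["t1", "t3"], ["t2"]]: t1 and t3 run concurrently, then t2.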

    Specialized translation at work for a small, expanding business: my experience internationalizing Bioretics© S.r.l. into Chinese

    Global markets are currently immersed in two all-encompassing and unstoppable processes: internationalization and globalization. While the former pushes companies to look beyond the borders of their country of origin to forge relationships with foreign trading partners, the latter fosters standardization across countries by reducing spatiotemporal distances and breaking down geographical, political, economic and socio-cultural barriers. In recent decades, another domain has emerged to propel these unifying drives: Artificial Intelligence, together with its high technologies aiming to implement human cognitive abilities in machinery. The "Language Toolkit – Le lingue straniere al servizio dell'internazionalizzazione dell'impresa" project, promoted by the Department of Interpreting and Translation (Forlì Campus) in collaboration with the Romagna Chamber of Commerce (Forlì-Cesena and Rimini), seeks to help Italian SMEs make their way into the global market. It is precisely within this project that this dissertation was conceived. Its purpose is to present the translation and localization project from English into Chinese of a series of texts produced by Bioretics© S.r.l.: an investor deck, the company website, and part of the installation and use manual of the Aliquis© framework software, the company's flagship product. This dissertation is structured as follows: Chapter 1 presents the project and the company in detail; Chapter 2 outlines the internationalization and globalization processes and the Artificial Intelligence market in both Italy and China; Chapter 3 provides the theoretical foundations for every aspect of specialized translation, including website localization; Chapter 4 describes the resources and tools used to perform the translations; Chapter 5 proposes an analysis of the source texts; Chapter 6 is a commentary on translation strategies and choices.

    Business Functions Capabilities and Small and Medium Enterprises' Internationalization

    Ineffective global expansion can adversely affect small and medium enterprises' (SMEs) business outcomes. Business leaders are concerned with developing effective global expansion strategies to penetrate potential international markets and thus enhance sustainability. Grounded in business management systems theory, the purpose of this qualitative multi-case study was to explore strategies that leaders of Sub-Saharan African manufacturing SMEs use for global expansion. The participants were five leaders of value-adding manufacturing SMEs participating in export markets. Using Yin's five-step data analysis process, six themes emerged: (a) enterprise characterization, (b) understanding the enterprise's product, (c) intra-enterprise factor-based strategies for export participation, (d) the enterprise's external factor-based strategies for a successful export venture, (e) global expansion strategies, and (f) serendipitous findings. A key recommendation for SME leaders is to analyze the critical components of their products and prepare to adjust them to the demand dimensions of the target market. The implications for positive social change include the potential to increase the enterprise's wealth, increase employment, reduce poverty for all value chain participants, and contribute to growth in gross domestic product.

    Insights into temperature controls on rockfall occurrence and cliff erosion

    A variety of environmental triggers have been associated with the occurrence of rockfalls; however, their role and relative significance remain poorly constrained. This is in part due to the lack of concurrent data on rockfall occurrence and cliff face conditions at temporal resolutions that mirror the variability of environmental conditions, and over durations long enough for large numbers of rockfall events to be captured. The aim of this thesis is to fill this data gap and then to focus specifically on the role of temperature in triggering rockfall that these data illuminate. To achieve this, a long-term, multiannual 3D rockfall dataset and contemporaneous Infrared Thermography (IRT) monitoring of cliff surface temperatures have been generated. The approaches used in this thesis are applied at East Cliff, Whitby, a coastal cliff in North Yorkshire, UK. The monitored section is ~200 m wide and ~65 m high, with a total cliff face area of ~9,592 m². A method for the automated quantification of rockfall volumes is used to explore data collected between 2017–2019 and in 2021, with the resulting inventory including > 8,300 rockfalls from 2017–2019 and > 4,100 rockfalls in 2021, totalling > 12,400 rockfalls. Analysis of the inventory demonstrates that during dry conditions, increases in rockfall frequency coincide with diurnal surface temperature fluctuations, notably at sunrise, noon and sunset in all seasons, leading to a marked diurnal pattern of rockfall. Statistically significant relationships link cliff temperature and rockfall, highlighting the response of rock slopes to both absolute temperatures and changes in temperature. This research also shows that inclement weather constitutes the dominant control over the annual production of rockfalls, but it also quantifies the periods when temperature controls are dominant. Temperature-controlled rockfall activity is shown to have an important erosional role, particularly in periods of iterative erosion dominated by small rockfalls. As such, this thesis provides the first high-resolution evidence of temperature controls on rockfall activity, cliff erosion and landform development.
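    The reported coincidence between diurnal temperature fluctuations and rockfall frequency can be illustrated with a toy correlation check. This is a hedged sketch, not the thesis's methodology: the hour-of-day binning, the use of the absolute hourly temperature change as a thermal-stress proxy, and the function names are all assumptions for illustration.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def diurnal_signal(counts, temps):
    """counts[h]: rockfalls observed in hour-of-day h (0-23);
    temps[h]: mean cliff surface temperature in that hour.
    Correlates hourly rockfall counts with the absolute hour-to-hour
    temperature change, a crude proxy for thermal stress cycling."""
    dT = [abs(temps[(h + 1) % 24] - temps[h]) for h in range(24)]
    return pearson(counts, dT)
```

    A correlation near 1 would indicate that rockfall frequency peaks in the hours of fastest surface temperature change, the sunrise/noon/sunset pattern described above.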

    Nonlocal games and their device-independent quantum applications

    Device-independence is a property of certain protocols that allows one to ensure their proper execution given only classical interaction with devices and assuming the correctness of the laws of physics. This scenario describes the most general form of cryptographic security, in which no trust is placed in the hardware involved; indeed, one may even take it to have been prepared by an adversary. Many quantum tasks have been shown to admit device-independent protocols by augmentation with "nonlocal games". These are games in which non-communicating parties jointly attempt to fulfil conditions imposed by a referee. We introduce examples of such games and examine the optimal strategies of players who are allowed access to different possible shared resources, such as entangled quantum states. We then study their role in self-testing, private random number generation, and secure delegated quantum computation. Hardware imperfections are naturally incorporated in the device-independent scenario as adversarial, and we thus also perform noise robustness analysis where feasible. We first study a generalization of the Mermin–Peres magic square game to arbitrary rectangular dimensions. After exhibiting some general properties, these "magic rectangle" games are fully characterized in terms of their optimal win probabilities for quantum strategies. We find that for m×n magic rectangle games with dimensions m, n ≥ 3, there are quantum strategies that win with certainty, while for dimensions 1×n quantum strategies do not outperform classical strategies. The final case of dimensions 2×n is richer, and we give upper and lower bounds that both outperform the classical strategies. As an initial usage scenario, we apply our findings to quantum certified randomness expansion to find noise tolerances and rates for all magic rectangle games. To do this, we use our previous results to obtain the winning probabilities of games with a distinguished input for which the devices give a deterministic outcome, and follow the analysis of C. A. Miller and Y. Shi [SIAM J. Comput. 46, 1304 (2017)]. Self-testing is a method to verify that one has a particular quantum state from purely classical statistics. For practical applications, such as device-independent delegated verifiable quantum computation, it is crucial that one self-tests multiple Bell states in parallel while keeping the quantum capabilities required of one side to a minimum. We use our 3×n magic rectangle games to obtain a self-test for n Bell states in which one side needs only to measure single-qubit Pauli observables. The protocol requires small input sizes [constant for Alice and O(log n) bits for Bob] and is robust with robustness O(n^{5/2}√ε), where ε is the closeness of the ideal (perfect) correlations to those observed. To achieve the desired self-test, we introduce a one-side-local quantum strategy for the magic square game that wins with certainty, generalize this strategy to the family of 3×n magic rectangle games, and supplement these nonlocal games with extra check rounds (of single observables and pairs of observables). Finally, we introduce a device-independent two-prover scheme in which a classical verifier can use a simple untrusted quantum measurement device (the client device) to securely delegate a quantum computation to an untrusted quantum server. To do this, we construct a parallel self-testing protocol to perform device-independent remote state preparation of n qubits and compose it with the unconditionally secure universal verifiable blind quantum computation (VBQC) scheme of J. F. Fitzsimons and E. Kashefi [Phys. Rev. A 96, 012303 (2017)]. Our self-test achieves a multitude of desirable properties for the application we consider, giving rise to practical and fully device-independent VBQC. It certifies parallel measurements of all cardinal and intercardinal directions in the XY-plane as well as the computational basis, uses few input questions (of size logarithmic in n for the client and a constant number communicated to the server), and requires only single-qubit measurements to be performed by the client device.
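    The classical baseline that quantum strategies for the magic square game beat can be checked by brute force. The sketch below assumes the standard formulation of the game (Alice fills her assigned row with ±1 entries of product +1, Bob fills his assigned column with entries of product −1, and they win if they agree on the shared cell); since the classical value is attained by deterministic strategies, enumerating them suffices.

```python
from itertools import product

# Alice's legal fills per row (product +1) and Bob's per column (product -1).
ROW_FILLS = [t for t in product([1, -1], repeat=3) if t[0] * t[1] * t[2] == 1]
COL_FILLS = [t for t in product([1, -1], repeat=3) if t[0] * t[1] * t[2] == -1]

def classical_value():
    """Best win probability over all deterministic classical strategies,
    with the (row, column) question pair drawn uniformly from the 9 pairs."""
    best = 0
    for alice in product(ROW_FILLS, repeat=3):    # one fill per row question
        for bob in product(COL_FILLS, repeat=3):  # one fill per column question
            wins = sum(alice[r][c] == bob[c][r]
                       for r in range(3) for c in range(3))
            best = max(best, wins)
    return best / 9
```

    The search confirms the well-known classical value of 8/9: the row and column parity constraints cannot all hold simultaneously, so every deterministic strategy pair disagrees on at least one of the nine cells, whereas a quantum strategy sharing two Bell states wins with certainty.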

    Advances and Applications of DSmT for Information Fusion. Collected Works, Volume 5

    This ļ¬fth volume on Advances and Applications of DSmT for Information Fusion collects theoretical and applied contributions of researchers working in different ļ¬elds of applications and in mathematics, and is available in open-access. The collected contributions of this volume have either been published or presented after disseminating the fourth volume in 2015 in international conferences, seminars, workshops and journals, or they are new. The contributions of each part of this volume are chronologically ordered. First Part of this book presents some theoretical advances on DSmT, dealing mainly with modiļ¬ed Proportional Conļ¬‚ict Redistribution Rules (PCR) of combination with degree of intersection, coarsening techniques, interval calculus for PCR thanks to set inversion via interval analysis (SIVIA), rough set classiļ¬ers, canonical decomposition of dichotomous belief functions, fast PCR fusion, fast inter-criteria analysis with PCR, and improved PCR5 and PCR6 rules preserving the (quasi-)neutrality of (quasi-)vacuous belief assignment in the fusion of sources of evidence with their Matlab codes. 
Because more applications of DSmT have emerged in the past years since the apparition of the fourth book of DSmT in 2015, the second part of this volume is about selected applications of DSmT mainly in building change detection, object recognition, quality of data association in tracking, perception in robotics, risk assessment for torrent protection and multi-criteria decision-making, multi-modal image fusion, coarsening techniques, recommender system, levee characterization and assessment, human heading perception, trust assessment, robotics, biometrics, failure detection, GPS systems, inter-criteria analysis, group decision, human activity recognition, storm prediction, data association for autonomous vehicles, identiļ¬cation of maritime vessels, fusion of support vector machines (SVM), Silx-Furtif RUST code library for information fusion including PCR rules, and network for ship classiļ¬cation. Finally, the third part presents interesting contributions related to belief functions in general published or presented along the years since 2015. These contributions are related with decision-making under uncertainty, belief approximations, probability transformations, new distances between belief functions, non-classical multi-criteria decision-making problems with belief functions, generalization of Bayes theorem, image processing, data association, entropy and cross-entropy measures, fuzzy evidence numbers, negator of belief mass, human activity recognition, information fusion for breast cancer therapy, imbalanced data classiļ¬cation, and hybrid techniques mixing deep learning with belief functions as well
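    As an illustration of the PCR5 rule that recurs throughout the volume, here is a minimal two-source sketch over focal sets represented as frozensets. The numeric example below is hypothetical; the redistribution formula is the standard two-source PCR5, and the function name is an assumption for the sketch.

```python
from itertools import product

def pcr5(m1, m2):
    """Two-source PCR5 combination. m1 and m2 map focal sets (frozensets)
    to masses summing to 1. Conjunctive masses on non-empty intersections
    are kept; each partial conflict m1(X)*m2(Y) with X ∩ Y = ∅ is
    redistributed back to X and Y in proportion to the masses that
    generated it."""
    out = {}
    for (X, a), (Y, b) in product(m1.items(), m2.items()):
        Z = X & Y
        if Z:
            out[Z] = out.get(Z, 0.0) + a * b
        else:
            # X receives a/(a+b) of the conflict a*b, Y the remaining b/(a+b)
            out[X] = out.get(X, 0.0) + a * a * b / (a + b)
            out[Y] = out.get(Y, 0.0) + a * b * b / (a + b)
    return out
```

    For example, with the hypothetical inputs m1(A) = 0.6, m1(B) = 0.4 and m2(A) = 0.7, m2(B) = 0.3, the combined masses still sum to 1: unlike the unnormalized conjunctive rule, PCR5 loses no mass to the empty set.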