
    Digital Subjectivation and Financial Markets: Criticizing Social Studies of Finance with Lazzarato

    The recently rising field of Critical Data Studies is still facing fundamental questions. Among these is the enigma of digital subjectivation. Who are the subjects of Big Data? A field where this question is particularly pressing is finance. Since the 1990s, traders have been steadily integrated into computerized data assemblages, which calls for an ontology that eliminates the distinction between human sovereign subjects and non-human instrumental objects. These assemblages subjectivize traders in pre-conscious ways, because human consciousness runs too slowly to follow the volatility of the market. In response to this conundrum, Social Studies of Finance has drawn on Actor-Network Theory to interpret financial markets as technically constructed networks of human and non-human actors. I argue that, in order to develop an explicitly critical data study, it might be advantageous to refer instead to Maurizio Lazzarato’s theory of machinic subjugation. Although both accounts describe financial digital subjectivation similarly, Lazzarato has the advantage of coupling his description to a clear critique of, and resistance to, finance.

    Toward Business Integrity Modeling and Analysis Framework for Risk Measurement and Analysis

    Financialization has contributed to economic growth, but it has also caused scandals, mis-selling, rogue trading, tax evasion, and market speculation, and to a certain extent it has contributed to social and economic instability. It is an important aspect of Enterprise Security, Privacy, and Risk (ESPR), particularly in risk research and analysis. In order to minimize the damaging impacts caused by a lack of regulatory compliance, governance, ethical responsibility, and trust, we propose a Business Integrity Modeling and Analysis (BIMA) framework to unify business integrity with performance using big data predictive analytics and business intelligence. Comprehensive services include modeling risk and asset prices and, consequently, aligning them with business strategies, making our services, according to market trend analysis, both transparent and fair. The BIMA framework uses Monte Carlo simulation, the Black–Scholes–Merton model, and the Heston model to perform financial, operational, and liquidity risk analysis, and it presents outputs in the form of analytics and visualization. Our results and analysis demonstrate supplier bankruptcy modeling, risk pricing, high-frequency pricing simulations, London Interbank Offered Rate (LIBOR) simulation, and speculation detection, providing a variety of critical risk analyses. Our approach to problems caused by financial services and operational risk demonstrates that the BIMA framework, as an output of our data analytics research, can effectively combine integrity and risk analysis with overall business performance and can contribute to operational risk research.
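    The abstract does not disclose the framework's actual implementation, but a minimal sketch of one named ingredient, Monte Carlo pricing under Black–Scholes–Merton assumptions, may help make the risk-pricing step concrete. All parameter values below (spot, strike, rate, volatility, maturity) are illustrative assumptions, not figures from the BIMA framework.

# Minimal Monte Carlo pricing sketch under Black-Scholes assumptions.
# Parameter values are illustrative only, not taken from the paper.
import math
import numpy as np
from scipy.stats import norm

def bsm_call(spot, strike, rate, vol, maturity):
    """Closed-form Black-Scholes-Merton price of a European call."""
    d1 = (math.log(spot / strike) + (rate + 0.5 * vol**2) * maturity) / (vol * math.sqrt(maturity))
    d2 = d1 - vol * math.sqrt(maturity)
    return spot * norm.cdf(d1) - strike * math.exp(-rate * maturity) * norm.cdf(d2)

def mc_call(spot, strike, rate, vol, maturity, n_paths=200_000, seed=0):
    """Monte Carlo price: simulate terminal prices under GBM, discount the mean payoff."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    terminal = spot * np.exp((rate - 0.5 * vol**2) * maturity + vol * math.sqrt(maturity) * z)
    payoff = np.maximum(terminal - strike, 0.0)
    return math.exp(-rate * maturity) * payoff.mean()

if __name__ == "__main__":
    args = dict(spot=100.0, strike=105.0, rate=0.02, vol=0.25, maturity=1.0)
    print(f"closed-form BSM: {bsm_call(**args):.4f}")
    print(f"Monte Carlo:     {mc_call(**args):.4f}")  # should agree to ~2 decimals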

    Challenges of Big Data Analysis

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and of how these features change the paradigm for statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that the exogeneity assumptions made by most statistical methods for Big Data cannot be validated due to incidental endogeneity; they can lead to wrong statistical inferences and, consequently, wrong scientific conclusions.
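    A minimal sketch of one of the named challenges, spurious correlation, may be useful: with a fixed sample size, the largest sample correlation between a pure-noise response and a growing collection of independent noise features increases with dimensionality. The sample sizes and feature counts below are illustrative choices, not values from the article.

# Sketch of spurious correlation in high dimensions: even when all features
# are independent noise, the largest sample correlation with an unrelated
# response grows as the number of features grows.
import numpy as np

def max_spurious_correlation(n_samples, n_features, seed=0):
    rng = np.random.default_rng(seed)
    y = rng.standard_normal(n_samples)                 # response: pure noise
    X = rng.standard_normal((n_samples, n_features))   # features: independent noise
    y_centered = y - y.mean()
    X_centered = X - X.mean(axis=0)
    corr = X_centered.T @ y_centered / (
        np.linalg.norm(X_centered, axis=0) * np.linalg.norm(y_centered)
    )
    return np.abs(corr).max()

for p in (10, 1_000, 100_000):
    print(p, round(max_spurious_correlation(n_samples=60, n_features=p), 3))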

    Advanced power routing framework for optimal economic operation and control of solar photovoltaic-based islanded microgrid

    © 2019 Institution of Engineering and Technology. All rights reserved. Energy sharing through a microgrid (MG) is essential for islanded communities to maximise the use of distributed energy resources (DERs) and battery energy storage systems (BESSs). Proper energy management and control strategies for such MGs can offer revenue to prosumers (active consumers with DERs) by routing excess energy to their neighbours while maintaining grid constraints. This paper proposes an advanced power-routing framework for a solar photovoltaic (PV)-based islanded MG with a central storage system (CSS). An optimisation-based economic operation for the MG is developed that determines the power routing and energy sharing in the MG in the day-ahead stage. A modified droop-controller-based real-time control strategy is established that maintains the voltage constraints of the MG. The proposed power-routing framework is verified via a case study of a typical islanded MG. The outcomes of the optimal economic operation and of the controller verification are presented to demonstrate the effectiveness of the proposed framework. Results reveal that the proposed framework achieves stable control operation and provides a profit of AU$57/day at optimal conditions.
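    As a rough illustration of what a day-ahead economic operation stage can look like, the sketch below poses a toy battery-scheduling problem for an islanded PV microgrid as a linear program. The forecasts, battery parameters, and penalty costs are invented for illustration and this is not the formulation used in the paper.

# Toy day-ahead scheduling sketch for an islanded PV microgrid with a
# central storage system, posed as a linear program. All numbers are
# illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

pv   = np.array([0.0, 6.0, 8.0, 1.0])   # PV forecast per period, kWh (assumed)
load = np.array([3.0, 2.0, 4.0, 5.0])   # demand per period, kWh (assumed)
T = len(pv)
e_max, p_max, eta, soc0 = 10.0, 4.0, 0.95, 2.0  # battery size, power limit, efficiency, initial SOC

# Decision vector x = [charge(T), discharge(T), unserved(T)]
n = 3 * T
cost = np.concatenate([np.full(T, 0.01), np.full(T, 0.01), np.full(T, 1.0)])  # penalise unserved load

A_ub, b_ub = [], []
# Supply must cover load: charge - discharge - unserved <= pv - load
for t in range(T):
    row = np.zeros(n)
    row[t], row[T + t], row[2 * T + t] = 1.0, -1.0, -1.0
    A_ub.append(row)
    b_ub.append(pv[t] - load[t])
# State of charge stays within [0, e_max] in every period
for t in range(T):
    row = np.zeros(n)
    row[: t + 1] = eta               # charging adds energy to the battery
    row[T : T + t + 1] = -1.0 / eta  # discharging removes energy
    A_ub.append(row)
    b_ub.append(e_max - soc0)        # soc(t) <= e_max
    A_ub.append(-row)
    b_ub.append(soc0)                # soc(t) >= 0

bounds = [(0, p_max)] * (2 * T) + [(0, None)] * T
res = linprog(cost, A_ub=np.array(A_ub), b_ub=b_ub, bounds=bounds)
charge, discharge, unserved = res.x[:T], res.x[T:2 * T], res.x[2 * T:]
print("charge   ", charge.round(2))
print("discharge", discharge.round(2))
print("unserved ", unserved.round(2))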

    What Europe Knows and Thinks About Algorithms: Results of a Representative Survey. Bertelsmann Stiftung eupinions, February 2019

    We live in an algorithmic world. Day by day, each of us is affected by decisions that algorithms make for and about us – generally without us being aware of or consciously perceiving this. Personalized advertisements on social media, the invitation to a job interview, the assessment of our creditworthiness – in all these cases, algorithms already play a significant role, and their importance is growing day by day. The algorithmic revolution in our daily lives undoubtedly brings with it great opportunities. Algorithms are masters at handling complexity. They can manage huge amounts of data quickly and efficiently, processing it consistently every time. Where humans reach their cognitive limits, find themselves making decisions influenced by the day’s events or feelings, or let themselves be influenced by existing prejudices, algorithmic systems can be used to benefit society. For example, according to a study by the Expert Council of German Foundations on Integration and Migration, automotive mechatronic engineers with Turkish names must submit about 50 percent more applications than candidates with German names before being invited to an in-person job interview (Schneider, Yemane and Weinmann 2014). If an algorithm were to make this decision, such discrimination could be prevented. However, automated decisions also carry significant risks: algorithms can reproduce existing societal discrimination and reinforce social inequality, for example, if computers, using historical data as a basis, identify the male gender as a labor-market success factor and thus systematically discard job applications from women, as recently took place at Amazon (Nickel 2018).
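    The Amazon example describes a general mechanism: a model fitted to biased historical decisions will reproduce that bias. The sketch below illustrates this with fully synthetic data and an invented "skill" feature; it is not based on the cited studies or on Amazon's system.

# Synthetic sketch: a model fit to biased historical hiring decisions
# reproduces the bias. Data and features are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000
skill = rng.normal(size=n)              # true qualification signal (synthetic)
gender = rng.integers(0, 2, size=n)     # 0 = female, 1 = male (synthetic)
# Historical decisions favour men regardless of skill:
hired = (skill + 1.0 * gender + rng.normal(scale=0.5, size=n)) > 1.0

model = LogisticRegression().fit(np.column_stack([skill, gender]), hired)
# Two identical candidates, differing only in the gender attribute:
candidates = np.array([[0.5, 0], [0.5, 1]])
print(model.predict_proba(candidates)[:, 1])  # the learned score is higher for the male row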

    Integration of Blockchain Technologies and Machine Learning with Deep Analysis

    The successful development of the digital economy, which we have been able to observe since the advent of the internet, is closely related to progress in several "frontier technologies". Among these, the most important, according to the scientific community and international organizations, are software-oriented technologies such as blockchain, Big Data analytics, Artificial Intelligence (AI), and Cloud Computing, as well as specialized machine-oriented equipment: 3D printers, Internet of Things (IoT) devices, automation, and robotics. Significant progress in the application of these technologies contributes to growth in production capabilities, labor productivity, and return on capital for both digital companies and enterprises of the non-digital economy, while transforming their established business models and their principles for generating income and expenses. This makes it necessary to study the above technologies in detail, analyzing their essence, role, and potential for use in various spheres of economic life. Although the term "blockchain" has only recently entered scientific and public use, the idea behind the technology appeared in the late 1980s: in 1989, Lamport proposed "a model for achieving consensus on results in a network of computers, where computers or the network itself can be unreliable". In 2008, Satoshi Nakamoto proposed the concept of using a decentralized computer network to operate a P2P electronic money system. In the article "Bitcoin: A Peer-to-Peer Electronic Cash System", published on the internet, the innovator described the algorithm of the Bitcoin cryptocurrency as an electronic cash system completely independent of any single issuing center, one that does not require the trust (mediation) of a third party but relies on direct operations between the parties to a transaction, protected by cryptographic encryption.
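    To make the "chain of blocks" idea concrete, the sketch below links blocks by hashing each block's contents together with the hash of its predecessor, so that tampering with history becomes detectable. It illustrates hash-linking only and omits everything else in Nakamoto's design (proof of work, networking, transaction validation).

# Minimal sketch of a hash-linked chain of blocks; illustrative only,
# not the Bitcoin protocol.
import hashlib
import json
import time

def block_hash(block):
    """Hash the block's contents (excluding its own stored hash)."""
    payload = {k: block[k] for k in ("timestamp", "transactions", "prev_hash")}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(transactions, prev_hash):
    block = {"timestamp": time.time(), "transactions": transactions, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block

def is_valid(chain):
    """Each block's stored hash must match its contents and its predecessor's hash."""
    return all(
        b["hash"] == block_hash(b) and (i == 0 or b["prev_hash"] == chain[i - 1]["hash"])
        for i, b in enumerate(chain)
    )

genesis = make_block(["genesis"], prev_hash="0" * 64)
chain = [genesis, make_block(["alice -> bob: 5"], genesis["hash"])]
print(is_valid(chain))                             # True
chain[0]["transactions"] = ["alice -> bob: 500"]   # tampering with history
print(is_valid(chain))                             # False: hash no longer matches contents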

    Is cloud computing the digital solution to the future of banking?

    Acknowledgment: We express our thanks to the editors and anonymous referees for their constructive comments and suggestions, which have helped us significantly improve this paper. We have benefitted from discussions with colleagues and participants of different seminars/workshops in China and the UK. All remaining errors are our own. This research is financially supported by the Natural Science Foundation of China (72173036, 71973148) and the Chinese National Funding of Social Sciences (19CJY065).

    Salmon, sensors, and translation: the agency of Big Data in environmental governance

    This paper explores the emerging role of Big Data in environmental governance. We focus on the case of salmon aquaculture management from 2011 to 2017 in Macquarie Harbour, Australia, and compare this with the foundational case that inspired the development of the concept of ‘translation’ in actor-network theory, that of scallop domestication in St Brieuc Bay, France, in the 1970s. A key difference is the salience of environmental data in the contemporary case. Recent dramatic events in the environmental governance of Macquarie Harbour have been driven by increasing spatial and temporal resolution of environmental monitoring, including real-time data collection from sensors mounted on the fish themselves. The resulting environmental data now takes centre stage in increasingly heated debates over how the harbour should be managed: overturning long-held assumptions about environmental interactions, inducing changes in regulatory practices and institutions, fracturing historical alliances and shaping the on-going legitimacy of the industry. Environmental Big Data is now a key actor within the networks that constitute and enact environmental governance. Given its new and unpredictable agency, control over access to data is likely to become critical in future power struggles over environmental resources and their governance. © The Author(s) 2018