63 research outputs found

    Factorization in Cybersecurity: a Dual Role of Defense and Vulnerability in the Age of Quantum Computing

    One of the most critical assumptions underpinning modern cryptography, and thus cybersecurity, is that large integers cannot be factored quickly and efficiently. RSA encryption, one of the most widely used schemes, rests largely on the assumption that factoring large numbers is computationally infeasible for any classical computer. A quantum computer running Shor’s algorithm, however, could perform the same task exponentially faster than any classical device. This investigation examines the strength and vulnerability of RSA encryption, through the lens of factorization, in the age of quantum computing. We begin with the foundations of both classical and quantum factoring, looking in greater detail at the number field sieve (NFS) and Shor’s algorithm, and we examine the mathematical background of each approach. We continue with theoretical analysis and experimental simulations that address the difficulty of these algorithms and their implications for cryptography. Finally, we discuss the current state of quantum computing, the threat it poses to the cryptographic systems we use every day, and the resulting need for post-quantum cryptography: algorithms designed to remain attack-resistant even against large-scale quantum computers. This investigation highlights the changing dynamics of cybersecurity in the quantum era and helps us understand both the challenges and the need to innovate today’s cryptographic systems.
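    To make the dependency concrete, the following minimal sketch (not taken from the paper; the toy primes, exponent, and message are illustrative assumptions) shows why an efficient factoring algorithm breaks RSA: given only the public key (n, e), recovering the private exponent is believed to be hard, but once n is factored into p and q it follows immediately.

```python
# Minimal sketch, assuming toy parameters: why fast factoring breaks RSA.
# Anyone who can factor n (e.g. via Shor's algorithm on a quantum computer)
# can reconstruct the private key and decrypt traffic.

p, q = 61, 53                    # toy primes; real RSA uses ~1024-bit primes
n = p * q                        # public modulus
e = 17                           # public exponent
phi = (p - 1) * (q - 1)          # Euler's totient, computable only from p and q
d = pow(e, -1, phi)              # private exponent: modular inverse of e mod phi

message = 42
ciphertext = pow(message, e, n)           # encrypt with the public key
assert pow(ciphertext, d, n) == message   # decrypt once n has been factored
```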

    Quantum Computing Standards & Accounting Information Systems

    This research investigates the potential implications of quantum technology for accounting information systems and for business overall. It focuses on the vulnerabilities posed by quantum computers and the emergence of quantum-resistant encryption algorithms. The paper critically analyzes quantum standards and their transformative effects on the efficiency, expediency, and security of commerce. By comparing the differences, similarities, and limitations of quantum standards, the research presents a collection of best practices and adaptation methods to fortify organizations against cyber threats in the quantum era. The study provides a guide to understanding and navigating the interplay between quantum technology and standard-setting organizations, enabling organizations to safeguard the integrity of their practices and adapt proactively to the challenges ushered in by the advent of quantum supremacy. The work also contributes to research by mapping the standard-setting ecosystem and documenting its intricate processes. The findings include the identification of organizations involved with quantum standards, as well as observed distinctions, similarities, and limitations between American and European standards.

    Psychopower and Ordinary Madness: Reticulated Dividuals in Cognitive Capitalism

    Despite the seemingly neutral vantage of using nature for widely-distributed computational purposes, neither post-biological nor post-humanist teleology simply concludes with the real “end of nature” as entailed in the loss of the specific ontological status embedded in the identifier “natural.” As evinced by the ecological crises of the Anthropocene—of which the 2019 Brazil Amazon rainforest fires are only the most recent—our epoch has transfixed the “natural order” and imposed entropic artificial integration, producing living species that become “anoetic,” made to serve as automated exosomatic residues, or digital flecks. I further develop Gilles Deleuze’s description of control societies to upturn Foucauldian biopower, replacing its spatio-temporal bounds with the exographic excesses in psycho-power; culling and further detailing Bernard Stiegler’s framework of transindividuation and hyper-control, I examine how becoming-subject is predictively facilitated within cognitive capitalism and what Alexander Galloway terms “deep digitality.” Despite the loss of material vestiges qua virtualization—which I seek to trace in an historical review of industrialization to postindustrialization—the drive-based and reticulated “internet of things” facilitates a closed loop from within the brain to the outside environment, such that the aperture of thought is mediated and compressed. The human brain, understood through its material constitution, is susceptible to total datafication’s laminated process of “becoming-mnemotechnical,” and, as neuroplasticity is now a valid description for deep-learning and neural nets, we are privy to the rebirth of the once-discounted metaphor of the “cybernetic brain.” Probing algorithmic governmentality while posing noetic dreaming as both technical and pharmacological, I seek to analyze how spirit is blithely confounded with machine-thinking’s gelatinous cognition, as prosthetic organ-adaptation becomes probabilistically molded, networked, and agentially inflected (rather than simply externalized).

    A Heterogeneous Parallel Non-von Neumann Architecture System for Accurate and Efficient Machine Learning Molecular Dynamics

    This paper proposes a special-purpose system to achieve high-accuracy, high-efficiency machine learning (ML) molecular dynamics (MD) calculations. The system consists of a field-programmable gate array (FPGA) and an application-specific integrated circuit (ASIC) working in heterogeneous parallel fashion. Specifically, a multiplication-less neural network (NN) is deployed on the non-von Neumann (NvN) ASIC (SilTerra 180 nm process) to evaluate atomic forces, the most computationally expensive part of MD, while all other MD calculations are performed on the FPGA (Xilinx XC7Z100). It is shown that, at a similar level of accuracy, the proposed NvN-based system built on a low-end fabrication technology (180 nm) is 1.6x faster and 10^2-10^3x more energy efficient than state-of-the-art von Neumann ML-MD running on graphics processing units (GPUs) built on much more advanced technology (12 nm), indicating the superiority of the proposed NvN-based heterogeneous parallel architecture.
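    As a rough illustration of the division of labour described above (a hypothetical sketch, not the paper's implementation; the toy force model, atom count, and step size are assumptions), an MD time step isolates the expensive force evaluation behind a single call, which is the part offloaded to the NN accelerator, while the remaining regular arithmetic corresponds to the work kept on the FPGA side:

```python
import numpy as np

def nn_forces(positions):
    # Stand-in for the multiplication-less NN on the ASIC; a real surrogate
    # would return learned interatomic forces for the given configuration.
    return -positions  # toy harmonic restoring force toward the origin

def velocity_verlet_step(pos, vel, forces, dt, mass=1.0):
    # Everything except the force call is simple, regular arithmetic,
    # corresponding to the MD work handled by the FPGA in the proposed system.
    vel_half = vel + 0.5 * dt * forces / mass
    new_pos = pos + dt * vel_half
    new_forces = nn_forces(new_pos)                  # the offloaded, expensive part
    new_vel = vel_half + 0.5 * dt * new_forces / mass
    return new_pos, new_vel, new_forces

pos = np.random.randn(64, 3)        # 64 toy atoms in 3D
vel = np.zeros_like(pos)
forces = nn_forces(pos)
for _ in range(1000):               # integrate 1000 time steps
    pos, vel, forces = velocity_verlet_step(pos, vel, forces, dt=0.01)
```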

    Hype in Science Communication: Exploring Scientists' Attitudes and Practices in Quantum Physics

    An interpretive phenomenological approach is adopted to investigate scientists' attitudes and practices related to hype in science communication. Twenty-four active quantum physicists participated in five focus groups. A semi-structured questionnaire was used to observe their use of hype and their attitudes, behaviours, and perspectives on hype in science communication. The main results show that scientists primarily attribute hype generation to themselves, major corporations, and marketing departments. They see hype as crucial for research funding and use it strategically, despite concerns. Scientists view hype as coercive and as compromising their work's integrity, leading to mostly negative feelings about it, except for collaborator-generated hype. A dissonance exists between scientists' involvement in hype, their opinions, and the negative emotions it triggers. They manage this by attributing responsibility to the academic system and downplaying their own practices. This reveals hype in science communication as a calculated, persuasive tactic by academic stakeholders, aligning with a neoliberal view of science. Implications extend to science communication, media studies, regulation, and academia. Comment: 23 pages.

    Low-Power Computer Vision: Improve the Efficiency of Artificial Intelligence

    Energy efficiency is critical for running computer vision on battery-powered systems, such as mobile phones or UAVs (unmanned aerial vehicles, or drones). This book collects the methods that have won the annual IEEE Low-Power Computer Vision Challenges since 2015. The winners share their solutions and provide insight into how to improve the efficiency of machine learning systems.

    EXPLORING PARALLELS BETWEEN ISLAMIC THEOLOGY AND TECHNOLOGICAL METAPHORS

    As the scope of innovative technologies expands, their implications and applications increasingly intersect with various facets of society, including the deeply rooted traditions of religion. This paper embarks on an exploratory journey to bridge the perceived divide between advancements in technology and faith, aiming to catalyze a dialogue between the religious and scientific communities. The religious community often views technological progress through a lens of conflict rather than compatibility. By utilizing a technology-centric perspective, we draw metaphorical parallels between the functionalities of new technologies and some theological concepts of Islam. The purpose is not to reinterpret religious concepts but to illustrate how these two domains can coexist harmoniously. This comparative analysis serves as a conversation starter, with the intention of mitigating apprehensions towards technology by highlighting its potential to align with religious concepts. By fostering an environment where technological innovations are seen as tools for enhancement rather than threats to tradition, we contribute to a more inclusive discourse that encourages the religious community to engage with and potentially embrace contemporary technological advancements.

    TPU v4: An Optically Reconfigurable Supercomputer for Machine Learning with Hardware Support for Embeddings

    In response to innovations in machine learning (ML) models, production workloads changed radically and rapidly. TPU v4 is the fifth Google domain-specific architecture (DSA) and its third supercomputer for such ML models. Optical circuit switches (OCSes) dynamically reconfigure its interconnect topology to improve scale, availability, utilization, modularity, deployment, security, power, and performance; users can pick a twisted 3D torus topology if desired. Much cheaper, lower power, and faster than InfiniBand, OCSes and the underlying optical components are <5% of system cost and <3% of system power. Each TPU v4 includes SparseCores, dataflow processors that accelerate models that rely on embeddings by 5x-7x yet use only 5% of die area and power. Deployed since 2020, TPU v4 outperforms TPU v3 by 2.1x and improves performance/Watt by 2.7x. The TPU v4 supercomputer is 4x larger at 4096 chips and thus ~10x faster overall, which along with OCS flexibility helps large language models. For similarly sized systems, it is ~4.3x-4.5x faster than the Graphcore IPU Bow and is 1.2x-1.7x faster and uses 1.3x-1.9x less power than the Nvidia A100. TPU v4s inside the energy-optimized warehouse-scale computers of Google Cloud use ~3x less energy and produce ~20x less CO2e than contemporary DSAs in a typical on-premise data center. Comment: 15 pages; 16 figures; to be published at ISCA 2023 (the International Symposium on Computer Architecture).
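    As a hypothetical illustration (not Google's code; the table size, batch, and feature ids below are assumptions), the kind of operation SparseCores target is the sparse embedding lookup at the heart of recommendation-style models: gathering a few rows per example from a very large table and reducing them.

```python
import numpy as np

# Toy embedding-bag lookup: the memory-bound gather-and-reduce pattern that
# embedding-heavy models spend most of their time on, and which the abstract
# says SparseCores accelerate by 5x-7x in hardware.

vocab_size, dim = 100_000, 64
table = np.random.randn(vocab_size, dim).astype(np.float32)  # embedding table

def embedding_bag(table, ids_per_example):
    # Gather the rows for each example's sparse feature ids and sum them.
    return np.stack([table[ids].sum(axis=0) for ids in ids_per_example])

batch = [np.array([3, 17, 42]), np.array([7, 99])]  # ragged sparse ids per example
pooled = embedding_bag(table, batch)                # shape: (2, 64) dense features
```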

    Hackers: a case-study of the social shaping of computing

    The study is an examination of hacking, placing the act in the context of theories of technological change. The account of hacking is used to substantiate those theories that emphasise the societal shaping of technology over the notion of technological determinism. The evolution of hacking is traced, showing how it reflects changing trends in the nature of information: the most vivid of these is the conceptualisation of information known as 'cyberspace'. Instead of simply cataloguing the impact of technical changes within computing, and the effects they have had upon information, the study shows how technical change takes place in a process of negotiation and conflict between groups. The two main groups analysed are those of the Computer Underground (CU) and the Computer Security Industry (CSI). The experiences and views of both groups are recounted in what constitute internalist and externalist accounts of hacking and its significance. The internalist account is the evidence provided by hackers themselves. It addresses such issues as what motivates the act of hacking; whether there is an identifiable hacking culture; and why it is almost an exclusively male activity. The externalist account contains the perceptions of hacking held by those outside the activity. The state of computing's security measures and its vulnerability to hacking are described, and evidence is provided of the extent to which hacking gives rise to technical knowledge that could be of potential use in the fixing of security weaknesses. The division within the CSI between those broadly cooperative with hackers and those largely hostile to them is examined, and the reasons why hacking knowledge is not generally utilised are explored. Hackers are prevented from gaining legitimacy within computing in a process referred to as 'closure'. Examples include hackers being stigmatised through the use of analogies that compare their computing activities to conventional crimes such as burglary and trespass. Stigmatisation is carried out by the CSI, who use it in a process of professional boundary formation to distinguish themselves from hackers. It is also used by other authority figures such as Members of Parliament, whose involvement in the process of closure takes the form of the anti-hacking legislation they have passed, an analysis of which concludes this study.