
    The Depth Conditions of Possibility: The Data Episteme

    Book review of Colin Koopman's How We Became Our Data (2019).

    Wide-address spaces - exploring the design space

    In a recent issue of Operating Systems Review, Hayter and McAuley [1991] argue that future high-performance systems will trade a traditional, bus-based organization for one in which all components are linked together by network switches (the Desk-Area Network). In this issue of Operating Systems Review, Leslie, McAuley and Mullender conclude that DAN-based architectures allow the exploitation of shared memory on a wider scale than just a single (multi)processor. In this paper, we explore how emerging 64-bit processors can be used to implement shared address spaces spanning multiple machines.
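A wide shared address space of this kind is typically carved into fields that identify the machine, a memory region on it, and an offset within that region. The following Python sketch illustrates the general idea only; the field widths and names are purely illustrative assumptions, not the layout proposed in the paper:

```python
# Hypothetical partitioning of a 64-bit global address into
# (node, segment, offset) fields. The 16/16/32 split is an
# illustrative assumption, not the paper's actual layout.
NODE_BITS, SEG_BITS, OFF_BITS = 16, 16, 32

def make_address(node: int, segment: int, offset: int) -> int:
    """Pack the three fields into one 64-bit global address."""
    assert node < (1 << NODE_BITS)
    assert segment < (1 << SEG_BITS)
    assert offset < (1 << OFF_BITS)
    return (node << (SEG_BITS + OFF_BITS)) | (segment << OFF_BITS) | offset

def split_address(addr: int):
    """Recover (node, segment, offset) from a packed address."""
    offset = addr & ((1 << OFF_BITS) - 1)
    segment = (addr >> OFF_BITS) & ((1 << SEG_BITS) - 1)
    node = addr >> (SEG_BITS + OFF_BITS)
    return node, segment, offset

addr = make_address(node=3, segment=7, offset=0x1000)
assert split_address(addr) == (3, 7, 0x1000)
```

With such a scheme, the node field lets any machine's memory-management code route a reference to the machine that owns the segment, which is what makes the address space span multiple machines.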

    Emulating Digital Logic using Transputer Networks (Very High Parallelism = Simplicity = Performance)

    Modern VLSI technology has changed the economic rules by which the balance between processing power, memory and communications is decided in computing systems. This will have a profound impact on the design rules for the controlling software. In particular, the criteria for judging the efficiency of algorithms will be somewhat different. This paper explores some of these implications through the development of highly parallel and highly distributable algorithms based on occam and transputer networks. The major results reported are a new simplicity for software designs, a corresponding ability to reason (formally and informally) about their properties, the reusability of their components, and some real performance figures which demonstrate their practicality. Some guidelines to assist in these designs are also given. As a vehicle for discussion, an interactive simulator is developed for checking the functional and timing characteristics of digital logic circuits of arbitrary complexity.
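A logic simulator of the kind the abstract describes is naturally event-driven: a change on a wire is an event that wakes up only the gates driven by that wire. The sketch below illustrates that general technique in Python rather than occam, with unit gate delays; all class, wire, and method names are assumptions for illustration, not the paper's design:

```python
import heapq
import itertools

class Simulator:
    """Minimal event-driven simulator for gate-level digital logic."""
    def __init__(self):
        self.time = 0
        self.seq = itertools.count()   # tiebreaker: same-time events pop in schedule order
        self.events = []               # heap of (time, seq, wire, value)
        self.values = {}               # wire name -> current logic value (default 0)
        self.fanout = {}               # wire name -> gates driven by that wire

    def connect(self, gate):
        for w in gate.inputs:
            self.fanout.setdefault(w, []).append(gate)

    def schedule(self, wire, value, delay=0):
        heapq.heappush(self.events, (self.time + delay, next(self.seq), wire, value))

    def run(self):
        while self.events:
            t, _, wire, value = heapq.heappop(self.events)
            self.time = t
            if self.values.get(wire, 0) == value:
                continue               # no change on the wire: nothing to propagate
            self.values[wire] = value
            for gate in self.fanout.get(wire, []):
                gate.evaluate(self)

class Nand:
    """Two-input NAND gate with a unit propagation delay."""
    def __init__(self, a, b, out, delay=1):
        self.inputs, self.out, self.delay = (a, b), out, delay

    def evaluate(self, sim):
        a, b = (sim.values.get(w, 0) for w in self.inputs)
        sim.schedule(self.out, 1 - (a & b), self.delay)

sim = Simulator()
sim.connect(Nand("a", "b", "y"))
sim.schedule("a", 1)
sim.schedule("b", 1)
sim.run()
# after settling, y holds NAND(1, 1) = 0
```

In the occam/transputer setting each gate would instead be a process and each wire a channel, which is precisely where the "very high parallelism = simplicity" argument comes from: the event queue above disappears into the communication structure of the network.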

    Book Review: E. Martín-Monje, I. Elorza and B. García Riaza (eds) (2016). Technology-Enhanced Language Learning for Specialized Domains. Practical applications and mobility. New York: Routledge, pp. 286, ISBN: 978-1-315-65172-9.

    Computers have had a significant presence in language teaching since the 1960s, while the emergence of "educational technology" as a field can be traced to the early 1980s. By then, the term had begun to gain significant popularity, as instructional media started to have a wider impact on educational practices. Since then, terminology has shifted significantly, from the initial Computer-Assisted Language Learning (CALL) to Technology-Enhanced Language Learning (TELL), subtly acknowledging that present-day computers are less obviously transformative "on the surface" while, at the same time, being completely necessary. Computers now mediate other kinds of technology, such as audio, video and the World Wide Web, so that the current focus is on the communication facilitated by the computer rather than on the machine itself.

    Smart plugs: A low cost solution for programmable control of domestic loads

    Balancing energy demand and production is becoming an increasingly challenging task for energy utilities. This is due to a number of reasons, among them the growing penetration of renewable energy sources, which are more difficult to predict, and the meagre availability of financial resources to upgrade the existing power grid. While the traditional solution is to dynamically adapt energy production to follow the time-varying demand, a new trend is to drive the demand itself by means of Direct Load Control (DLC). In this paper we consider a scenario where DLC functionalities are deployed at a large set of small deferrable energy loads, such as the appliances of residential users. The required additional intelligence and communication capabilities may be introduced through smart plugs, without the need to replace older 'dumb' appliances. Smart plugs are inserted between the appliance plugs and the power sockets and are directly connected to the Internet. An open software architecture abstracts the hardware sensors and actuators integrated in the plug and makes it easy to program different load control applications.
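The open architecture described above can be pictured as a thin hardware-abstraction layer sitting between load-control applications and vendor-specific plug firmware. The Python sketch below is only an illustration of that layering; all class and method names are assumptions, not the paper's actual API:

```python
# Illustrative hardware-abstraction layer for a smart plug:
# load-control applications talk to Plug, never to a vendor driver.
# Names (PlugDriver, defer, resume, ...) are hypothetical.
from abc import ABC, abstractmethod

class PlugDriver(ABC):
    """Vendor-specific access to the relay and power meter inside the plug."""
    @abstractmethod
    def set_relay(self, on: bool) -> None: ...
    @abstractmethod
    def read_power_w(self) -> float: ...

class MockDriver(PlugDriver):
    """Stand-in driver pretending to control a 1.2 kW appliance."""
    def __init__(self):
        self.on = False
    def set_relay(self, on: bool) -> None:
        self.on = on
    def read_power_w(self) -> float:
        return 1200.0 if self.on else 0.0

class Plug:
    """Hardware-independent view used by DLC applications."""
    def __init__(self, driver: PlugDriver):
        self._driver = driver
    def defer(self) -> None:           # DLC command: shed this load
        self._driver.set_relay(False)
    def resume(self) -> None:          # DLC command: restore this load
        self._driver.set_relay(True)
    def power_w(self) -> float:
        return self._driver.read_power_w()

plug = Plug(MockDriver())
plug.resume()
assert plug.power_w() == 1200.0
plug.defer()
assert plug.power_w() == 0.0
```

Keeping the driver behind an abstract interface is what lets the same load-control application run unchanged on plugs from different vendors, which is the point of making the architecture open.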

    IUC Independent Policy Report: At the End of the End of History

    The IUC Independent Policy Report was drafted by the IUC Legal Standards Research Group, organized by a Steering Committee chaired by Ugo Mattei (International University College of Turin), coordinated by Edoardo Reviglio (International University College of Turin) and Giuseppe Mastruzzo (International University College of Turin), and composed of Franco Bassanini (University of Rome "La Sapienza"), Guido Calabresi (Yale University), Antoine Garapon (Institut des Hautes Etudes sur la Justice, Paris), and Tibor Varady (Central European University, Budapest). Contributors include Eugenio Barcellona (Eastern Piedmont University), Mauro Bussani (University of Trieste), Giuliano G. Castellano (Ecole Polytechnique Preg/CRG), Moussa Djiré (Bamako University), Liu Guanghua (Lanzhou University), Golnoosh Hakimdavar (University of Turin), John Haskell (SOAS), Jedidiah J. Kroncke (Yale Law School), Andrea Lollini (Bologna University), Alberto Lucarelli (Federico II University), Boris N. Mamlyuk (University of Turin), Alberto Monti (Bocconi University), Sergio Ariel Muro (Torquato di Tella University), Domenico Nicolò (Mediterranean University of Reggio Calabria), and Nicola Sartori (University of Michigan). The IUC Independent Policy Report argues for a radical change of perspective, capable of restoring the supremacy of the law over the economic system. It is not only about finance, nor is it only about economics or policy. In this sense a transnational set of normative principles is needed in order to establish a global legal system capable of controlling economic processes, rather than being controlled by them. Within this framework a series of policy proposals are presented in order to effectively implement a new system of global standards. The current Western standard of living is unsustainable. Should the rest share the model of development of the West, our planet will simply not be capable of resisting the growth in consumption and pollution.
Within this fundamental setting of scarcity in resources, using the rhetoric of the end of history as the polar star for growth, development and ultimately the happiness of the whole world is simply a cynical lie. We argue here for the beginning of a necessary process aimed at the development of a legal system that is much less about creating an efficient backbone for an exploitative economy and much more about a vision of civilization, justice and respect where the laws of nature and those of humans converge in a sustainable long-term philosophy. Principles of justice, responsibility and long-term environmental protection, rather than short-term economic contingency and strong interests, must set the legal agenda. A new governance and bottom-up inclusive integration of knowledge-based economies (wherever located), which is crucial to the very survival of humankind, cannot happen without defining new terms of a widely accepted standard of long-term justice in the transnational context, hence the urgency to conceive legitimate transnational legal structures and possibly some apparatus of "superlegality." The report is composed of five sections. After presenting the pitfalls of the prevailing theoretical apparatus, an alternative cultural grid upon which policy actions should be shaped is presented. In this sense several normative proposals, revisiting the key characteristics of the current system, are offered with the aim of acquiring a wider perspective on the current global crisis.

    Generative Model with Deep Fake Augmentation for Phonocardiogram and Electrocardiogram Signals in LSGAN and Cycle GAN Architectures

    In order to diagnose a range of cardiac conditions, it is important to conduct an accurate evaluation of both phonocardiogram (PCG) and electrocardiogram (ECG) data. Artificial intelligence and machine learning-based computer-assisted diagnostics are becoming increasingly commonplace in modern medicine, assisting clinicians in making life-or-death decisions. The enormous amount of data required to train a deep learning-based technique is an empirical challenge in the field of medicine, and it increases the risk of personal information being misused. As a direct result of this issue, there has been an explosion in the study of methods for creating synthetic patient data, and researchers have attempted to generate synthetic ECG or PCG readings. To balance the dataset, ECG data were first created from the MIT-BIH arrhythmia database using LSGAN and Cycle GAN. Next, using VGGNet, studies were conducted to classify arrhythmias for the synthesized ECG signals. The synthesized signals resembled the originals and performed well, yielding a precision of 91.20%, a recall of 89.52% and an F1 score of 90.35%.
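The reported figures are internally consistent: the F1 score is the harmonic mean of precision and recall, which can be checked directly from the numbers quoted in the abstract:

```python
# Consistency check on the reported classification metrics:
# F1 is the harmonic mean of precision and recall.
precision, recall = 91.20, 89.52
f1 = 2 * precision * recall / (precision + recall)
assert abs(f1 - 90.35) < 0.01   # matches the reported F1 score of 90.35%
```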