Form and Data - from linear Calculus to cybernetic Computation and Interaction
Digital architecture developed in the 1960s and, supported by CAAD in the 1990s, has created the path towards an architecture produced by computer and architect in a mutual relationship. The evolution of architecture since the 1970s led to the first digital turn in the 1990s, and subsequently to the emergence of new typologies of buildings, architects, and design tools: atom-based, bit-based (virtual) [1], and cyber-physical as a combination of both. The paper provides an insight into the historical foundations of CAAD insofar as it engages with complexity in mechanics, geometry, and space between the 1600s and the 1950s. I will address a selection of principles discovered and mechanisms invented before computer-aided architectural design; these include the typewriter, the Cartesian grid, and a pre-cyber-physical system by Hermann von Helmholtz. The paper concludes with a summary and an outlook on the future of CAAD, challenged by the variety of correlations among disparate data sets.
A general guide to applying machine learning to computer architecture
The resurgence of machine learning since the late 1990s has been enabled by significant advances in computing performance and the growth of big data. The ability of these algorithms to detect complex patterns in data that would be extremely difficult to identify manually helps to produce effective predictive models. Whilst computer architects have been accelerating the performance of machine learning algorithms with GPUs and custom hardware, there have been few implementations leveraging these algorithms to improve computer system performance. The work that has been conducted, however, has produced highly promising results.
The purpose of this paper is to serve as a foundational base and guide for future computer architecture research seeking to make use of machine learning models to improve system efficiency. We describe a method that highlights when, why, and how to utilize machine learning models for improving system performance, and provide a relevant example showcasing the effectiveness of applying machine learning in computer architecture. We describe a process of data generation at every execution quantum and of parameter engineering. This is followed by a survey of a set of popular machine learning models. We discuss their strengths and weaknesses and provide an evaluation of implementations for the purpose of creating a workload performance predictor for different core types in an x86 processor. The predictions can then be exploited by a scheduler for heterogeneous processors to improve system throughput. The algorithms of focus are stochastic gradient descent based linear regression, decision trees, random forests, artificial neural networks, and k-nearest neighbors. This work has been supported by the European Research Council (ERC) Advanced Grant RoMoL (Grant Agreement 321253) and by the Spanish Ministry of Science and Innovation (contract TIN2015-65316-P).
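Of the algorithms the abstract lists, stochastic gradient descent based linear regression is the simplest to sketch. The toy example below illustrates the general idea of a per-quantum workload performance predictor; the feature names, the synthetic counter data, and the linear relation between counters and IPC are all invented for illustration and are not the paper's actual features or model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical features collected each execution quantum:
# [branch_ratio, l1_miss_rate, l2_miss_rate] -- invented for illustration.
n = 500
X = rng.uniform(0.0, 1.0, size=(n, 3))

# Assumed (purely synthetic) ground-truth relation between counters and
# a core's IPC, plus measurement noise.
true_w = np.array([-0.4, -1.2, -2.0])
y = 2.5 + X @ true_w + rng.normal(0.0, 0.05, size=n)

# Stochastic gradient descent on squared error, one sample at a time:
#   err = prediction - target;  w <- w - lr*err*x;  b <- b - lr*err
w = np.zeros(3)
b = 0.0
lr = 0.05
for epoch in range(200):
    for i in rng.permutation(n):
        err = (X[i] @ w + b) - y[i]
        w -= lr * err * X[i]
        b -= lr * err

# Mean squared error of the fitted IPC predictor on the training data.
mse = float(np.mean(((X @ w + b) - y) ** 2))
print(f"MSE: {mse:.4f}")
```

In the setting the abstract describes, one such model would be trained per core type, and a heterogeneous scheduler could compare the predicted performance of a thread across core types before placing it.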
Deep Space Network information system architecture study
The purpose of this article is to describe an architecture for the Deep Space Network (DSN) information system in the years 2000-2010 and to provide guidelines for its evolution during the 1990s. The study scope is defined to be from the front-end areas at the antennas to the end users (spacecraft teams, principal investigators, archival storage systems, and non-NASA partners). The architectural vision provides guidance for major DSN implementation efforts during the next decade. A strong motivation for the study is an expected dramatic improvement in information-systems technologies, such as the following: computer processing, automation technology (including knowledge-based systems), networking and data transport, software and hardware engineering, and human-interface technology. The proposed Ground Information System has the following major features: unified architecture from the front-end area to the end user; open-systems standards to achieve interoperability; DSN production of level 0 data; delivery of level 0 data from the Deep Space Communications Complex, if desired; dedicated telemetry processors for each receiver; security against unauthorized access and errors; and highly automated monitor and control.
CORPORATE INFRASTRUCTURE FOR FIFTH GENERATION COMPUTERS
Arithmetically, each human generation contributes to 6.72 generations of computer upgrading. But this effect can pay back to human society only by tuning the corporate infrastructure to utilise these computer innovations optimally. Future computer electronics works towards drastic cost reduction and process-speed optimisation. The pre-fifth-generation computing environment of the late 1980s and early 1990s will be dominated by a circularly integrated general-purpose computer network, with vertical integration in the hierarchical administration and horizontal interaction at implementation levels. The fifth-generation architecture, with its innovative techniques, will be tuned to accept keyed, voice, and picture inputs and to process them towards decision and action guidelines, using knowledge-based management and problem-solving and inference modules. The machine is also expected to become more intelligent with the passage of time. The proposed fifth-generation organisational structure is hence designed with the maintenance of 1990s hierarchical and machine interfaces, circularly integrated policy, management workflow, intensified problem solving, knowledge/intelligence orientation, an integrated human-machine interface, and a mutual training setup. Based on a survey of hardware and software development in the USA and Japan by the author in 1984. The paper is supported with 4 schematic diagrams and a post-publication appraisal. Published in Indian Management, Journal of the All India Management Association, New Delhi, India, June 1985, pages 19 to 28. Keywords: Computers, Computer Architecture, Computer Generations, Computer Network, Corporate Infra-structure, Circular Integration, Fifth Generation Computers, Group Work, Horizontal Inter-action, Human Machine Interface, Mutual Training, Infra-structure, Intelligent Machine, Knowledge base, Problem Solving, Vertically Integrated Administration.
Introduction to the Special Issue on Software Architecture for Language Engineering
Every building, and every computer program, has an architecture: structural and organisational principles that underpin its design and construction. The garden shed
once built by one of the authors had an ad hoc architecture, extracted (somewhat painfully) from the imagination during a slow and non-deterministic process that, luckily, resulted in a structure which keeps the rain on the outside and the mower on the inside (at least for the time being). As well as being ad hoc (i.e. not informed by analysis of similar practice or relevant science or engineering) this architecture is implicit: no explicit design was made, and no records or documentation kept of the construction process. The pyramid in the courtyard of the Louvre, by contrast, was constructed in a process involving explicit design performed by qualified engineers with a wealth of theoretical and practical knowledge of the properties of materials, the relative merits and strengths of different construction techniques, et cetera. So it is with software: sometimes it is thrown together by enthusiastic amateurs; sometimes it is architected, built to last, and intended to be 'not something you finish, but something you start' (to paraphrase Brand (1994)). A number of researchers argued in the early and middle 1990s that the field of computational infrastructure or architecture for human language computation merited an increase in attention. The reasoning was that the increasingly large-scale and technologically significant nature of language processing science was placing increasing burdens of an engineering nature on research and development workers seeking robust and practical methods (as was the increasingly collaborative nature of research in this field, which puts a large premium on software integration and interoperation). Over the intervening period a number of significant systems and practices have been developed in what we may call Software Architecture for Language Engineering (SALE).
This special issue represented an opportunity for practitioners in this area to report their work in a coordinated setting, and to present a snapshot of the state-of-the-art in infrastructural work, which may indicate where further development and further take-up of these systems can be of benefit.
Governing by internet architecture
In the past thirty years, the exponential rise in the number of Internet users around the world and the intensive use of digital networks have brought to light crucial political issues. The Internet is now the object of regulation; that is, it is a policy domain. Yet its own architecture represents a new regulative structure, one deeply affecting politics and everyday life. This article considers some of the main transformations of the Internet induced by privatization and militarization processes, as well as their consequences for societies and human beings.
How open is open enough?: Melding proprietary and open source platform strategies
Computer platforms provide an integrated architecture of hardware and software standards as a basis for developing complementary assets. The most successful platforms were owned by proprietary sponsors that controlled platform evolution and appropriated associated rewards.
Responding to the Internet and open source systems, three traditional vendors of proprietary platforms experimented with hybrid strategies that attempted to combine the advantages of open source software while retaining control and differentiation. Such hybrid standards strategies reflect the competing imperatives of adoption and appropriability, and suggest the conditions under which such strategies may be preferable to either the purely open or purely proprietary alternatives.
A new test framework for communications-critical large scale systems
None of today's large scale systems could function without the reliable availability of a varied range of network communications capabilities. Whilst software, hardware and communications technologies have been advancing throughout the past two decades, the methods commonly used by industry for testing large scale systems which incorporate critical communications interfaces have not kept pace. This paper argues the need for a specifically tailored framework to achieve effective and precise testing of communications-critical large scale systems (CCLSSs). The paper briefly discusses how generic test approaches are leading to inefficient and costly test activities in industry. The paper then outlines the features of an alternative CCLSS domain-specific test framework, and provides an example based on a real case study. The paper concludes with an evaluation of the benefits observed during the case study and an outline of the available evidence that such benefits can be realised with other comparable systems.