Technology infrastructure in information technology industries
Abstract not available. Subjects: economics of technology; business administration and economics
The engineers and the urban system, 1968-1974
Thesis (S.M. in Architecture Studies)--Massachusetts Institute of Technology, Dept. of Architecture, 2012. Cataloged from PDF version of thesis. Includes bibliographical references (p. 76-81).

The social policy agenda of the Great Society was shaped by the recognition that if broad social improvement was to be achieved in urban America, social planning and state intervention based on systematically acquired expertise about the city would have to be developed. As a case study in the development of such expertise, in this thesis I explore the work of the Urban Systems Laboratory (USL), established in 1968 at the Massachusetts Institute of Technology (MIT) with funding from the Ford Foundation. Using computationally intensive methods, behavioral models, and the latest techniques of the information sciences, research at the USL emphasized the role of rational, analytical, social-scientific expertise in managing social conflict. In particular, I explore the work of Jay W. Forrester, a member of the USL whose research on the city was published in 1969 as Urban Dynamics. Using an IBM System/360 Model 67, Forrester built one of the first large-scale, interactive, computational models of a city, specifically to explore the consequences of the social policies of the period and, more generally, the possibility of the social engineering of complex social systems in a postindustrial society. This project of producing expertise at the USL struggled to secure legitimacy in the early 1970s, as the attempt to treat the problems of urban America as phenomena to be handled by a new class of experts was overwhelmed by the sheer scale of urban turmoil.

by Alexander Hilton Wood.

S.M. in Architecture Studies
Explaining the Vertical-to-Horizontal Transition in the Computer Industry
This paper seeks to explain the technological forces that led to the rise of vertically integrated corporations in the late 19th century and the opposing forces that led to a vertical-to-horizontal transition in the computer industry one hundred years later. I first model the technology of step processes with bottlenecks and show how this technology rewards vertical integration, a hierarchical organization, and the use of direct authority. These properties in turn became the organizational hallmarks of so-called “modern” corporations. I then model platform systems, showing that, in contrast to step processes, this technology rewards the multiplication of options, increasing risk, and modularity. Moreover, given a modular architecture, a platform system can be open, with different components supplied by separate firms with no loss of interoperability or efficiency. Openness multiplies options and expands diversity, thus increasing the platform system’s value. The last two decades of the 20th century saw the rise of three distinct types of open platforms in the computer industry: (1) “forward open” platforms with downstream complementors; (2) “backward open” modular supply networks; and (3) “open exchange” platforms designed to facilitate transactions and other forms of social interaction. Whereas in 1980, vertically integrated firms dominated the industry, by 2000, the “verticals” had essentially disappeared. The largest firms in the industry in 2000 were sponsors and participants in open platform systems. I argue that the vertical-to-horizontal transition in the computer industry was an organizational response to a fundamental change in the economic rewards to the technologies of rationalized step processes vs. open platform systems.
The Silent Arms Race: The Role of the Supercomputer During the Cold War, 1947-1963
One of the central features of the Cold War is the arms race. The United States and the Union of Soviet Socialist Republics vied for supremacy over the globe for a fifty-year period in which there were several arms races: atomic weapons, thermonuclear weapons, and various kinds of conventional weapons. However, there is another arms race that goes unsung during this period of history, and that is in the area of supercomputing. The other types of arms races are taken for granted by historians and others, but the technological competition between the superpowers would have been impossible without the historically silent arms race in the area of supercomputers. The construction of missiles and jets, as well as the testing of nuclear weapons, had serious implications for international relations. Often perception is more important than fact. Perceived power maintained a deterrent effect on the two superpowers. If one superpower suspected that it, in fact, had an advantage over the other, then the balance of power would be upset, and more aggressive measures might have been taken on various fronts of the conflict, perhaps leading to war. Due to this, it was necessary to maintain a balance of power not only in weapons but in supercomputing as well. Considering the role that the computers played, it is time for closer historical scrutiny.
Continuous path : the evolution of process control technologies in post-war Britain
Automation - the alliance of a series of advances in manufacturing technology with the academic discipline of cybernetics - was the centre of both popular and technical debate for a number of years in the mid-1950s. Alarmists predicted social disruption, economic hardship, and a massive de-skilling of the workforce, while technological positivists saw automation as an enabling technology that would introduce a new age of prosperity. At the same time as this debate was taking place, increasingly sophisticated control technologies based on digital electronics and the principle of feedback control were being developed and applied to industrial manufacturing systems. This thesis examines two stages in the evolution of process control technology: the numerical control of machine tools, and the development of the small computer, or minicomputer. In each case two key themes are explored: the notion of industrial failure, and the role of new technologies in Britain's industrial decline.

In Britain, four projects were undertaken to develop point-to-point or continuous path automatic controllers for machine tools in the mid-1950s - three by electronics firms and one by a traditional machine tool manufacturer. However, although automation was dominating popular debate at the time, the anticipated market for numerically controlled systems failed to appear, and all of the early projects were abandoned. It is argued that while the electronics firms naively misdirected their limited marketing capabilities, the root of the problem was the traditional machine tool manufacturers' conservatism and their failure to embrace the new technology.

A decade later, small computers based on new semiconductor technologies had emerged in the United States. Originally developed for roles in industrial automation, they soon began to compete at the low end of the mainframe computer market. Soon afterwards a number of British firms - electronic goods manufacturers, entrepreneurial start-ups, and even office machinery suppliers - began to develop minicomputers. The Wilson government saw computers as a central element of industrial modernisation, and thus a part of its solution to Britain's economic decline, so the Ministry of Technology was charged with the promotion of the British minicomputer industry. However, US-built systems proved more competitive, and by the mid-1970s they had come to dominate the market, with the few remaining British firms relegated to niche players. It is argued that government involvement in the minicomputer industry was ineffectual, and that the minicomputer manufacturers' organisational cultures played a major role in the failure of the British industry.
The Origins of the Underline as Visual Representation of the Hyperlink on the Web: A Case Study in Skeuomorphism
This thesis investigates the process by which the underline came to be used as the default signifier of hyperlinks on the World Wide Web. Created in 1990 by Tim Berners-Lee, the web quickly became the most used hypertext system in the world, and most browsers default to indicating hyperlinks with an underline. To answer the question of why the underline was chosen over competing demarcation techniques, the thesis applies the methods of the history of technology and the sociology of technology. Before the invention of the web, the underline, also known as the vinculum, was used in many contexts in writing systems: collecting entities together to form a whole and ascribing additional meaning to the content. This early usage made the underline a natural choice, semantically, as a hyperlink signifier. The technological context in which the web was created also played a significant role in the use of the underline. Early computer systems were limited in their display capabilities, and the interfaces created on them were influenced by these constraints. Specifically, the NeXT computer on which Berners-Lee developed the web made underlining text very simple using its Interface Builder software. The existing semantic meaning of the underline in written language, coupled with the affordances of contemporary computer systems, resulted in the commonplace use of the underline to signify hyperlinks.
Battle of the Brains: Election-Night Forecasting at the Dawn of the Computer Age
This dissertation examines journalists' early encounters with computers as tools for news reporting, focusing on election-night forecasting in 1952. Although election night 1952 is frequently mentioned in histories of computing and journalism as a quirky but seminal episode, it has received little scholarly attention. This dissertation asks how and why election night and the nascent field of television news became points of entry for computers in news reporting.
The dissertation argues that although computers were employed as pathbreaking "electronic brains" on election night 1952, they were used in ways consistent with a long tradition of election-night reporting. As central events in American culture, election nights had long served to showcase both news reporting and new technology, whether with 19th-century devices for displaying returns to waiting crowds or with 20th-century experiments in delivering news by radio.
In 1952, key players - television news broadcasters, computer manufacturers, and critics - showed varied reactions to employing computers for election coverage. But this computer use in 1952 did not represent wholesale change. While live use of the new technology was a risk taken by broadcasters and computer makers in a quest for attention, the underlying methodology of forecasting from early returns did not represent a sharp break with pre-computer approaches. And while computers were touted in advance as key features of election-night broadcasts, the "electronic brains" did not replace "human brains" as primary sources of analysis on election night in 1952.
This case study chronicles the circumstances under which a new technology was employed by a relatively new form of the news media. On election night 1952, the computer was deployed not so much to revolutionize news reporting as to capture public attention. It functioned in line with existing values and practices of election-night journalism. In this important instance, therefore, the new technology's technical features were less a driving force for adoption than its usefulness as a wonder and as a symbol to enhance the prestige of its adopters. This suggests that a new technology's capacity to provide both technical and symbolic social utility can be key to its chances for adoption by the news media.
Accountants' index. Twenty-fourth supplement, January-December 1975