
    The Carnegie Image Tube Committee and the Development of Electronic Imaging Devices in Astronomy, 1953-1976

    This dissertation examines the efforts of the Carnegie Image Tube Committee (CITC), a group created by Vannevar Bush and composed of astronomers and physicists who sought to develop a photoelectric imaging device, generally called an image tube, to aid astronomical observations. The Carnegie Institution of Washington’s Department of Terrestrial Magnetism coordinated the CITC, but the committee included members from observatories and laboratories across the United States. The CITC, which operated from 1954 to 1976, sought to replace direct photography as the primary means of astronomical imaging. Physicists, who gained training in electronics during World War II, led the early push for the development of image tubes in astronomy. Vannevar Bush’s concern for scientific prestige led him to form a committee to investigate image tube technology, and postwar federal funding for the sciences helped the CITC sustain development efforts for a decade. During those development years, the CITC acted as a mediator between the astronomical community and the image tube producers but failed to engage astronomers concerning various development paths, resulting in a user group without real buy-in on the final product. After a decade of development efforts, the CITC designed an image tube, which the Radio Corporation of America manufactured and which, with additional funding from the National Science Foundation, the committee distributed to observatories around the world. While excited about the potential of electronic imaging, few astronomers used the Carnegie-developed device regularly. Although the CITC’s efforts did not result in an overwhelming adoption of image tubes by the astronomical community, examining the design, funding, production, and marketing of the Carnegie image tube shows the many and varied processes through which astronomers have acquired new tools. Astronomers’ use of the Carnegie image tube to acquire useful scientific data illustrates factors that contribute to astronomers’ adoption or non-adoption of such new tools.

    Bodies, Ideas, and Dynamics: Historical Perspectives on Systems Thinking in Engineering

    Today, the idea that technology consists not simply of individual machines but of systems of components and interconnections underlies much of engineering theory and practice. Yet this idea is relatively new in the history of technology; it evolved over a long period, spanning more than a century, as engineers grappled with the implications of machinery and collections of apparatus that spread over broad geographical areas. A historical perspective on systems thinking provides a critical background for contemplating new directions in “engineering systems,” by highlighting the problems that have constantly challenged engineers, as well as the new puzzles posed by today’s world.

    The ingenuity of common workmen: and the invention of the computer

    Since World War II, state support for scientific research has been assumed crucial to technological and economic progress, and governments have accordingly spent tremendous sums to that end. Nothing epitomizes the alleged fruits of that involvement better than the electronic digital computer. The first such computer has been widely reputed to be the ENIAC, financed by the U.S. Army for the war but finished afterwards. Vastly improved computers followed, initially paid for in good share by the Federal Government of the United States, but with the private sector then dominating both development and use.

    Despite the supposed success of publicly supported science, evidence suggests that computers would have evolved much the same without it, but at less expense. Indeed, the foundations of modern computer theory and technology were articulated before World War II, both as a tool of applied mathematics and for information processing, and the computer was itself on the cusp of reality. Contrary to popular understanding, the ENIAC actually represented a movement backwards and a dead end.

    Rather, modern computation derived more directly from, for example, the prewar work of John Vincent Atanasoff and Clifford Berry, a physics professor and graduate student, respectively, at Iowa State College (now University) in Ames, Iowa. They built the Atanasoff Berry Computer (ABC), which, although special-purpose and inexpensive, heralded the efficient and elegant design of modern computers. Moreover, while no one foresaw commercialization of computers based on the ungainly and costly ENIAC, the commercial possibilities of the ABC were immediately evident, although unrealized due to the war. Evidence indicates, furthermore, that the private sector was willing and able to develop computers beyond the ABC, up to the most sophisticated machines, and could have done so more effectively than government.

    A full and inclusive history of computers suggests that Adam Smith, the eighteenth-century Scottish philosopher, had it right. He believed that minimal and aloof government best served society, and that the inherent genius of citizens was itself enough to ensure the general prosperity.

    Intertwingled: The Work and Influence of Ted Nelson

    History of Computing; Computer Appl. in Arts and Humanities; Data Structures; User Interfaces and Human Computer Interaction

    Lost in the archive: vision, artefact and loss in the evolution of hypertext

    How does one write the history of a technical machine? Can we say that technical machines have their own genealogies, their own evolutionary dynamic? The technical artefact constitutes a series of objects, a lineage or a line. At a cursory level, we can see this in the fact that technical machines come in generations - they adapt and adopt characteristics over time, one suppressing the other as it becomes obsolete. It is argued that technics has its own evolutionary dynamic, and that this dynamic stems neither from biology nor from human societies. Yet 'it is impossible to deny the role of human thought in the creation of technical artefacts' (Guattari 1995, p. 37). Stones do not automatically rise up into a wall - humans 'invent' technical objects. This, then, raises the question of technical memory. Is it humans that remember previous generations of machines and transfer their characteristics to new machines? If so, how and where do they remember them? It is suggested that humans learn techniques from technical artefacts and transfer these between machines. This theory of technical evolution is then used to understand the genealogy of hypertext. The historical differentiations of hypertext in different technical systems are traced. Hypertext is defined as both a technical artefact and a set of techniques: both are part of this third milieu, technics. The difference between technical artefact and technical vision is highlighted, and it is suggested that technique and vision change when they are externalised as material artefact. The primary technique traced is association, the organisational principle behind the hypertext systems explored in the manuscript. In conclusion, invention is shown to be an act of exhumation, the transfer and retroactivation of techniques from the past. This thesis presents an argument for a new model of technical evolution, a model which claims that technics constitutes its own dynamic, and that this dynamic exceeds human evolution. It traces the genealogy of hypertext as a set of techniques and as a series of material artefacts. To create this genealogy I draw on interviews conducted with Douglas Engelbart, Ted Nelson and Andries van Dam, as well as a wide variety of primary and secondary resources.

    The Emergence of Modernity in the Early Aerospace Industry 1950-1970

    The literature on the management of science in the early aerospace industry between 1950 and 1970, and more specifically on Lockheed’s transition from aeronautics into aerospace, is sparse, and the subject is insufficiently studied. In this dissertation I examine how the practice of science changed in the United States in the decades after World War II. The scientific endeavor in the United States underwent a transformation from an enlightenment-based set of norms to a new modern ethos. For all intents and purposes, the traditional intellectual boundaries between basic science, applied science, and technology dissolved across the majority of the American scientific endeavor from 1950 to 1970 as science became objective-oriented. This contrasts with the common acceptance among historians of science, who place this change in the 1970s and 1980s, concomitant with the development of the entrepreneurial university. The majority of the original research presented here comes from the archives of Lockheed executives Willis Hawkins, who started as an engineer and ultimately ascended to company President, and Ben Rich, a staff scientist who rose to lead Lockheed’s semi-autonomous Advanced Development Program, or Skunk Works. The dissertation is grounded in Etzkowitz and Leydesdorff’s Triple-Helix Model of science and demonstrates that the early aerospace industry conducted and managed science with the modern characteristics of human resources circulation, the development of innovation networks, reflexive output circulation, and non-linear innovation. Additionally, I amend the Triple-Helix Model by proposing that the original version, which is scaled to describe large organizational and national science policy, can be used to model modern science management at three discrete scales: macro, meso, and micro. Finally, the study identifies as outcomes the role of industry associations, the loaning of human resources, and the role of capitalism during the period, as well as the persistence of the Linear Model.

    Interface Fantasies and Futures: Designing Human-Computer Relations in the Shadow of Memex

    This dissertation is about how designers, experimental writers, and innovative thinkers have imagined both computer interfaces and the human/machine relations that might emerge through engagement with different kinds of interfaces. Although futuristic thinking about digital media and their interfaces has changed over time, we can isolate some constants that have persisted through almost all mainstream practices of interface design, particularly in American culture. Drawing from a historical trajectory that I associate with Vannevar Bush and his speculative invention, which he called “memex” in a 1945 essay, I name these constants sterilization and compartmentalization. They are two tendencies or values that I identify in mid-20th-century dreams of mastering information spaces by mastering their interfaces. My project shows how individuals and groups have reinforced or resisted these values in the engineering and design of computer interfaces, both speculative and real. The urge to sterilize and compartmentalize computers has directly and indirectly shaped what we expect and demand from our computers (and the things we make with them) today, and these values trace the horizon of which human-computer relations might be possible in the future.

    The DARPA Model for Transformative Technologies

    "The U.S. Defense Advanced Research Projects Agency (DARPA) has played a remarkable role in the creation new transformative technologies, revolutionizing defense with drones and precision-guided munitions, and transforming civilian life with portable GPS receivers, voice-recognition software, self-driving cars, unmanned aerial vehicles, and, most famously, the ARPANET and its successor, the Internet. Other parts of the U.S. Government and some foreign governments have tried to apply the ‘DARPA model’ to help develop valuable new technologies. But how and why has DARPA succeeded? Which features of its operation and environment contribute to this success? And what lessons does its experience offer for other U.S. agencies and other governments that want to develop and demonstrate their own ‘transformative technologies’? This book is a remarkable collection of leading academic research on DARPA from a wide range of perspectives, combining to chart an important story from the Agency’s founding in the wake of Sputnik, to the current attempts to adapt it to use by other federal agencies. Informative and insightful, this guide is essential reading for political and policy leaders, as well as researchers and students interested in understanding the success of this agency and the lessons it offers to others.

    Enhancing Inter-Document Similarity Using Sub Max

    Document similarity, a core theme in Information Retrieval (IR), is a machine learning (ML) task associated with natural language processing (NLP). It is a measure of the distance between two documents given a set of rules. For the purposes of this thesis, two documents are similar if they are semantically alike and describe similar concepts. While document similarity can be applied to multiple tasks, we focus our work on the accuracy of models in detecting referenced papers as similar documents using their sub max similarity. Multiple approaches have been used to determine the similarity of documents with regard to literature reviews; some use the number of shared citations, the similarity of the body text, or the figures present in those documents. This researcher hypothesized that documents with sections of high similarity (sub max) but low global similarity are prone to being overlooked by existing models, since the global score of the documents is used to measure similarity. In this study, we aim to detect, measure, and show the similarity of documents based on the maximum similarity of their subsections. The sub max of any two given documents is the pair of subsections of those documents with the highest similarity. By comparing subsections of the documents in our corpus and using the sub max, we were able to improve the performance of some models by over 100%.
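    The core idea admits a short sketch. The following is a minimal, hypothetical illustration of sub max scoring, assuming TF-IDF vectors with cosine similarity as the underlying measure and documents represented as lists of subsection strings; the function name submax_similarity and these particular choices are assumptions made for illustration, not the thesis's actual pipeline.

        # Hypothetical sketch of "sub max": compare every subsection of one
        # document against every subsection of another and take the maximum
        # pairwise similarity, alongside the usual whole-document (global) score.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        def submax_similarity(doc_a_sections, doc_b_sections):
            """Return (global, sub max) cosine similarity for two documents,
            each given as a list of subsection strings."""
            # Fit one shared vocabulary so all vectors are comparable.
            vectorizer = TfidfVectorizer(stop_words="english")
            vectors = vectorizer.fit_transform(doc_a_sections + doc_b_sections)
            a_vecs = vectors[:len(doc_a_sections)]
            b_vecs = vectors[len(doc_a_sections):]

            # Global score: the two documents treated as single texts.
            global_score = cosine_similarity(
                vectorizer.transform([" ".join(doc_a_sections)]),
                vectorizer.transform([" ".join(doc_b_sections)]),
            )[0, 0]

            # Sub max: the best-matching pair of subsections.
            sub_max = cosine_similarity(a_vecs, b_vecs).max()
            return global_score, sub_max

    Two papers that share one highly similar section (say, a method description) amid otherwise unrelated text would score low globally but high on sub max, which is precisely the case the thesis argues existing models overlook.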