11 research outputs found

    The ingenuity of common workmen: and the invention of the computer

    Since World War II, state support for scientific research has been assumed crucial to technological and economic progress, and governments have accordingly spent tremendous sums to that end. Nothing epitomizes the alleged fruits of that involvement better than the electronic digital computer. The first such computer has been widely reputed to be the ENIAC, financed by the U.S. Army for the war but finished afterwards. Vastly improved computers followed, initially paid for in large part by the Federal Government of the United States, though the private sector soon came to dominate both development and use, and computers are now of major significance.

    Despite the supposed success of publicly supported science, the evidence indicates that computers would have evolved much the same without it, and at less expense. Indeed, the foundations of modern computer theory and technology were articulated before World War II, both as a tool of applied mathematics and for information processing, and the computer was itself on the cusp of reality. Contrary to popular understanding, the ENIAC actually represented a step backwards and a dead end.

    Rather, modern computation derived more directly, for example, from the prewar work of John Vincent Atanasoff and Clifford Berry, a physics professor and graduate student, respectively, at Iowa State College (now University) in Ames, Iowa. They built the Atanasoff-Berry Computer (ABC), which, although special purpose and inexpensive, heralded the efficient and elegant design of modern computers. Moreover, while no one foresaw commercialization of computers based on the ungainly and costly ENIAC, the commercial possibilities of the ABC were immediately evident, although unrealized because of the war. Evidence indicates, furthermore, that the private sector was willing and able to develop computers beyond the ABC, up to the most sophisticated machines, and could have done so more effectively than government.

    A full and inclusive history of computers suggests that Adam Smith, the eighteenth-century Scottish philosopher, had it right. He believed that minimal and aloof government best served society, and that the inherent genius of citizens was itself enough to ensure the general prosperity.

    Critical Programming: Toward a Philosophy of Computing

    Beliefs about the relationship between human beings and computing machines, and their destinies, have alternated from heroic counterparts to conspirators of automated genocide, from apocalyptic extinction events to evolutionary cyborg convergences. Many fear that people are losing key intellectual and social abilities as tasks are offloaded to the everywhere of the built environment, which is developing a mind of its own. If digital technologies have contributed to forming a dumbest generation and ushering in a robotic moment, we all have a stake in addressing this collective intelligence problem. While digital humanities continue to flourish and introduce new uses for computer technologies, the basic modes of philosophical inquiry remain in the grip of print media, and default philosophies of computing prevail, or experimental ones propagate false hopes.

    I cast this as-is situation as the post-postmodern network dividual cyborg, recognizing that the rational enlightenment of modernism and the regressive subjectivity of postmodernism now operate in an empire of extended mind cybernetics combined with techno-capitalist networks forming societies of control. Recent critical theorists identify a justificatory scheme foregrounding participation in projects, valorizing social network linkages over heroic individualism, and commending flexibility and adaptability through lifelong learning over stable career paths. It seems to reify one possible, contingent configuration of global capitalism as if it were the reflection of a deterministic evolution of commingled technogenesis and synaptogenesis.

    To counter this trend I offer a theoretical framework focused on the phenomenology of software and code, joining social critiques with textuality and media studies, the former proposing that theory be done through practice, and the latter seeking to understand their schematism of perceptibility by taking into account engineering techniques like time axis manipulation. The social construction of technology makes additional theoretical contributions, dispelling closed-world, deterministic historical narratives and requiring that voices be given to the engineers and technologists who best know their subject area. This theoretical slate has recently been deployed to produce rich histories of computing, networking, and software, to inform the nascent disciplines of software studies and code studies, and to guide ethnographers of software development communities.

    I call my syncretism of these approaches the procedural rhetoric of diachrony in synchrony, recognizing that multiple explanatory layers, operating in their individual temporal and physical orders of magnitude, simultaneously undergird post-postmodern network phenomena. Its touchstone is that the human-machine situation is best contemplated by doing, which as a methodology for digital humanities research I call critical programming. Philosophers of computing explore working code places by designing, coding, and executing complex software projects as an integral part of their intellectual activity, reflecting on how developing theoretical understanding necessitates iterative development of code as it does other texts, and on how resolving coding dilemmas may clarify or modify provisional theories as our minds struggle to intuit the alien temporalities of machine processes.

    Ramon Llull's Ars Magna
