    Complementarity constraints and induced innovation: Some evidence from the first IT regime

    Technological search is often depicted as random. This paper takes a different view and analyses how innovative recombinant search is triggered, how it is carried out, and what initial conditions influence the final design of technological artefacts. We argue that complementarities (non-separabilities) play an important role as focusing devices guiding the search for new combinations. Our analysis takes the perspective of technology adopters rather than that of inventors or innovators of new products. We illustrate the process of decomposition and re-composition in the presence of binding complementarity constraints with a historical case study on the establishment of the First IT Regime at the turn of the 19th century.
    Keywords: technological regimes, systemic innovation, adoption of technologies, complexity, information technology 1870-1930

    Proceedings of the ECSCW'95 Workshop on the Role of Version Control in CSCW Applications

    The workshop entitled "The Role of Version Control in Computer Supported Cooperative Work Applications" was held on September 10, 1995, in Stockholm, Sweden, in conjunction with the ECSCW'95 conference. Version control, the ability to manage relationships between successive instances of artifacts, organize those instances into meaningful structures, and support navigation and other operations on those structures, is an important problem in CSCW applications. It has long been recognized as a critical issue for inherently cooperative tasks such as software engineering, technical documentation, and authoring. The primary challenge for versioning in these areas is to support opportunistic, open-ended design processes requiring the preservation of historical perspectives in the design process, the reuse of previous designs, and the exploitation of alternative designs. The primary goal of this workshop was to bring together a diverse group of individuals interested in examining the role of versioning in Computer Supported Cooperative Work. Participation was encouraged from members of the research community currently investigating the versioning process in CSCW as well as application designers and developers who are familiar with the real-world requirements for versioning in CSCW. Both groups were represented at the workshop, resulting in an exchange of ideas and information that helped to familiarize developers with the most recent research results in the area, and to provide researchers with an updated view of the needs and challenges faced by application developers. In preparing for this workshop, the organizers were able to build upon the results of their previous one, entitled "The Workshop on Versioning in Hypertext", held in conjunction with the ECHT'94 conference. The following section of this report contains a summary in which the workshop organizers report the major results of the workshop. The summary is followed by a section that contains the position papers that were accepted to the workshop. The position papers provide more detailed information describing recent research efforts of the workshop participants as well as current challenges that are being encountered in the development of CSCW applications. A list of workshop participants is provided at the end of the report. The organizers would like to thank all of the participants for their contributions, which were, of course, vital to the success of the workshop. We would also like to thank the ECSCW'95 conference organizers for providing a forum in which this workshop was possible.
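
    To make the notion of version control described above concrete, the sketch below models artifact versions and their predecessor relationships as a small graph that supports branching, merging, and history navigation. It is a minimal illustration under assumed names (Version, VersionGraph, commit, history), not an API drawn from any of the workshop papers.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Version:
        vid: str              # version identifier
        content: str          # artifact contents at this version
        parents: tuple = ()   # predecessor versions; two or more parents mean a merge

    class VersionGraph:
        """Organizes successive instances of an artifact and supports navigation."""

        def __init__(self):
            self.versions = {}

        def commit(self, vid, content, parents=()):
            # Derive a new version from zero or more existing ones.
            version = Version(vid, content, tuple(self.versions[p] for p in parents))
            self.versions[vid] = version
            return version

        def history(self, vid):
            # Walk back through all ancestors of a version.
            seen, stack = [], [self.versions[vid]]
            while stack:
                v = stack.pop()
                if v.vid not in [s.vid for s in seen]:
                    seen.append(v)
                    stack.extend(v.parents)
            return [v.vid for v in seen]

    # Usage: two authors branch from a shared draft and their changes are later merged.
    g = VersionGraph()
    g.commit("v1", "draft")
    g.commit("v2a", "draft + figures", parents=("v1",))
    g.commit("v2b", "draft + references", parents=("v1",))
    g.commit("v3", "merged draft", parents=("v2a", "v2b"))
    print(g.history("v3"))    # e.g. ['v3', 'v2b', 'v1', 'v2a']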

    Validation and verification of the interconnection of hardware intellectual property blocks for FPGA-based packet processing systems

    As networks become more versatile, the computational requirement for supporting additional functionality increases. The increasing demands of these networks can be met by Field Programmable Gate Arrays (FPGAs), which are an increasingly popular technology for implementing packet processing systems. The fine-grained parallelism and density of these devices can be exploited to meet the computational requirements and implement complex systems on a single chip. However, the increasing complexity of FPGA-based systems makes them susceptible to errors and difficult to test and debug. To tackle the complexity of modern designs, system-level languages have been developed to provide abstractions suited to the domain of the target system. Unfortunately, the lack of formality in these languages can give rise to errors that are not caught until late in the design cycle. This thesis presents three techniques for verifying and validating FPGA-based packet processing systems described in a system-level description language. First, a type system is applied to the system description language to detect errors before implementation. Second, system-level transaction monitoring is used to observe high-level events on-chip following implementation. Third, the high-level information embodied in the system description language is exploited to allow the system to be automatically instrumented for on-chip monitoring. This thesis demonstrates that these techniques catch errors that go undetected by traditional verification and validation tools. Fault locations are pinpointed and errors are caught earlier in the design flow, which saves time by reducing synthesis iterations.
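
    The first of these techniques, type-checking the system-level description, can be illustrated with a small sketch. The Port and Block names and the width/protocol attributes below are assumptions introduced for illustration, not the thesis's actual language or tool; the point is only that structural mismatches between connected IP blocks can be rejected before synthesis.

    from dataclasses import dataclass

    @dataclass
    class Port:
        name: str
        width: int        # bus width in bits
        protocol: str     # handshake/streaming convention both ends must share

    @dataclass
    class Block:
        name: str
        inputs: dict      # port name -> Port
        outputs: dict     # port name -> Port

    def check_connection(src_block, src_port, dst_block, dst_port):
        # Type-check a single point-to-point link between two IP blocks.
        src = src_block.outputs[src_port]
        dst = dst_block.inputs[dst_port]
        errors = []
        if src.width != dst.width:
            errors.append(f"width mismatch ({src.width} vs {dst.width} bits) on "
                          f"{src_block.name}.{src_port} -> {dst_block.name}.{dst_port}")
        if src.protocol != dst.protocol:
            errors.append(f"protocol mismatch ({src.protocol} vs {dst.protocol}) on "
                          f"{src_block.name}.{src_port} -> {dst_block.name}.{dst_port}")
        return errors

    # Usage: a 64-bit parser feeding a 32-bit checksum block is rejected at
    # description time instead of surfacing later as a silent on-chip error.
    parser = Block("parser", inputs={}, outputs={"out": Port("out", 64, "stream")})
    checksum = Block("checksum", inputs={"in": Port("in", 32, "stream")}, outputs={})
    print(check_connection(parser, "out", checksum, "in"))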

    Body language, security and e-commerce

    Security is becoming an increasingly important concern at both the desktop level and the network level. This article discusses several approaches to authenticating individuals through the use of biometric devices. While libraries might not implement such devices, they may appear in the near future of desktop computing, particularly for access to institutional computers or to sensitive information. Other approaches to computer security focus on protecting the contents of electronic transmissions and on verifying individual users. After a brief overview of encryption technologies, the article examines public-key cryptography, which is receiving considerable attention in the business world in the form of public key infrastructure. It also examines other efforts, such as IBM’s Cryptolope, the Secure Sockets Layer of Web browsers, and digital certificates and signatures. Secure electronic transmissions are an important condition for conducting business on the Net, and such transactions are not limited to purchase orders, invoices, and contracts: secure transmission could also become an important tool for information vendors and publishers to control access to the electronic resources they license. As license negotiators and contract administrators, librarians need to be aware of what is happening with these new technologies and the impact they will have on their operations.
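
    As a rough illustration of the public-key idea behind the digital signatures mentioned above, the toy example below signs a number with a private exponent and verifies it with the matching public exponent. The tiny primes and bare modular arithmetic are textbook RSA shown for clarity only; they are not how any of the products named in the article work in practice, which rely on vetted libraries, padding schemes, and large keys.

    # Textbook RSA with deliberately tiny numbers (illustration only, not secure).
    p, q = 61, 53                    # small primes
    n = p * q                        # public modulus
    phi = (p - 1) * (q - 1)
    e = 17                           # public exponent
    d = pow(e, -1, phi)              # private exponent (modular inverse, Python 3.8+)

    message = 42                     # stands in for a hash of the real message
    signature = pow(message, d, n)   # "sign": only the private-key holder can compute this
    recovered = pow(signature, e, n) # "verify": anyone with the public key (n, e) can check

    print(signature, recovered == message)   # True means the signature verifies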

    Bounded Rationality in the Economics of Organization: Present Use and (Some) Future Possibilities

    The way in which bounded rationality enters contemporary organizational economics theorizing is examined. It is argued that, as it is being used, bounded rationality is neither necessary nor sufficient for producing the results of organizational economics. It is at best a rhetorical device, used for the purpose of loosely explaining incomplete contracts. However, it is possible to incorporate much richer notions of bounded rationality, founded on research in cognitive psychology, and to illuminate the study of economic organization by means of such notions. A number of examples are provided.
    Keywords: varieties of bounded rationality, incomplete contracts, economic organization, cognitive psychology

    The theory of the lemon markets in IS research

    The “lemon” problem was initially posed by Nobel Prize winner Akerlof in his seminal 1970 article, which showed how a market with unbalanced information, known as information asymmetry, can lead to the complete disappearance of the market or to offerings of poor quality, where bad products (lemons) drive out the good ones. Empirical evidence for Akerlof’s theory came originally from the market for used cars, where the lemon is a well-known problem. However, the theoretical model of the “lemon” problem has also proven valid in other markets and in comparable situations such as internal markets. The theory has been used more and more in IS research, especially since the emergence of e-commerce initiatives and the continuous growth of e-markets and auctions. In this chapter we present a description of the theory, its nomological network and its linkages to other well-known theories in IS research. The relevance of the theory for explaining phenomena in the IS discipline is shown. An overview is given of current and past IS articles using the Lemon Market theory (LMT), together with a bibliographical analysis of the references to the original Akerlof article.
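
    The unraveling logic behind the lemon problem can be shown with a small numerical sketch. The setup below is the standard textbook version of Akerlof's model (uniform quality, a buyer premium of 1.5, and buyers who observe only the average quality of what is actually offered); it is chosen here for illustration rather than taken from the chapter.

    def trade_occurs(price, max_quality=2000.0, buyer_premium=1.5):
        # Sellers offer a car only if the price covers its quality, so under a
        # uniform quality distribution the average quality on offer is price / 2.
        avg_quality_offered = min(price, max_quality) / 2
        buyer_willingness = buyer_premium * avg_quality_offered
        return buyer_willingness >= price

    # At every positive price, buyers will pay only 0.75 * price for the average
    # car on offer, so owners of good cars withdraw and the market unravels even
    # though every trade would have created value.
    print(any(trade_occurs(p) for p in range(1, 2001)))   # False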

    Lex Informatica: The Formulation of Information Policy Rules through Technology

    Historically, law and government regulation have established default rules for information policy, including constitutional rules on freedom of expression and statutory rights of ownership of information. This Article will show that for network environments and the Information Society, however, law and government regulation are not the only source of rule-making. Technological capabilities and system design choices impose rules on participants. The creation and implementation of information policy are embedded in network designs and standards as well as in system configurations. Even user preferences and technical choices create overarching, local default rules. This Article argues, in essence, that the set of rules for information flows imposed by technology and communication networks forms a “Lex Informatica” that policymakers must understand, consciously recognize, and encourage.