
    L’option européenne du Canada dans les années 80 [Canada's European Option in the 1980s]

    The year 1982 marked the twenty-fifth anniversary of the Treaty of Rome, which created the EEC, and the tenth anniversary of Canada's Third Option, which aimed in part at using the new Europe as a counterweight to American influence. This article attempts, first, to set the last ten years of Canada's European policy in the context of postwar economic and strategic relationships and, secondly, to evaluate the European aspect of the Third Option and its prospects for the 1980s. Canada's image of, and policy towards, Western Europe has always had two distinct elements, economic and strategic, whose interplay describes various phases of the relationship. Until about 1957, the two images were roughly congruent, and Canada's economic and military policies were mutually reinforcing. A multilateral Atlantic economic community was seen as the underpinning of collective defence. The next fifteen years or so saw the breakup of this pattern, with the emergence of the Community, continued British attempts to join from 1961 on, and the decline in Canadian support for NATO. The birth of the Third Option and the pursuit of the Framework Agreement with the Community saw the gradual revival of Canada's military relationship with Europe, this time in the service of economic diversification and growth. A variety of factors, however (global, European, North American, and domestic Canadian), hampered the development of the new link with the Community. Many of these, especially the structural ones underlying the crisis of the world economy and the stagnation of European integration, are likely to persist through the 1980s. For the next few years, then, the European option will probably be (a) less communautaire, (b) a less important focus in trade policy relative to other areas, and (c) once more in the service of military and political ends.

    Useful Descriptions of Organizational Processes: Collecting Data for the Process Handbook

    This paper describes a data collection methodology for business process analysis. Unlike static objects, business processes are semi-repetitive sequences of events that are often widely distributed in time and space, with ambiguous boundaries. To redesign or even just describe a business process requires an approach that is sensitive to these aspects of the phenomena. The method described here is intended to generate semi-formal process representations suitable for inclusion in a "handbook" of organizational processes. Using basic techniques of ethnographic interviewing and observation, the method helps users map decomposition, specialization, and dependency relationships at an intermediate level of abstraction meaningful to participants. By connecting new process descriptions to an existing taxonomy of similar descriptions in the Handbook, this method helps build a common vocabulary for process description and analysis.

    Why aspect graphs are not (yet) practical for computer vision

    A panel discussion on the theme of this report was organized for the 1991 IEEE Workshop on Directions in Automated CAD-Based Vision. This report contains the revised comments of the panel discussion participants.

    Focusing and Calibration of Large Scale Network Sensors using GraphBLAS Anonymized Hypersparse Matrices

    Defending community-owned cyber space requires community-based efforts. Large-scale network observations that uphold the highest regard for privacy are key to protecting our shared cyberspace. Deployment of the necessary network sensors requires careful sensor placement, focusing, and calibration with significant volumes of network observations. This paper demonstrates novel focusing and calibration procedures on a multi-billion packet dataset using high-performance GraphBLAS anonymized hypersparse matrices. The run-time performance on a real-world data set confirms previously observed real-time processing rates for high-bandwidth links while achieving significant data compression. The output of the analysis demonstrates the effectiveness of these procedures at focusing the traffic matrix and revealing the underlying stable heavy-tail statistical distributions that are necessary for anomaly detection. A simple model of the corresponding probability of detection (p_d) and probability of false alarm (p_fa) for these distributions highlights the criticality of network sensor focusing and calibration. Once a sensor is properly focused and calibrated, it is then in a position to carry out two of the central tenets of good cybersecurity: (1) continuous observation of the network and (2) minimizing unbrokered network connections. Comment: Accepted to IEEE HPEC; 9 pages, 12 figures, 1 table, 63 references, 2 appendices.
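    The traffic-matrix idea in this abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' code: scipy.sparse stands in for the paper's GraphBLAS hypersparse matrices, and the anonymized address-space size, Zipf exponent, and detection quantile are all illustrative assumptions. It aggregates (source, destination) packet observations into a sparse matrix, inspects the heavy-tailed per-source activity the paper says calibration should reveal, and applies a toy threshold detector.

```python
# Sketch only: scipy.sparse in place of GraphBLAS hypersparse matrices.
# Address space size (n), Zipf exponent, and the 0.99 quantile are
# illustrative assumptions, not values from the paper.
import numpy as np
from scipy.sparse import coo_matrix

rng = np.random.default_rng(0)
n = 1024  # anonymized address space (assumption)

# Heavy-tailed (Zipf-like) source activity: a few hosts send most packets.
src = rng.zipf(2.0, size=100_000) % n
dst = rng.integers(0, n, size=100_000)

# Traffic matrix: entry (i, j) counts packets from source i to dest j.
# coo_matrix sums duplicate (i, j) entries on conversion to CSR.
A = coo_matrix((np.ones_like(src), (src, dst)), shape=(n, n)).tocsr()

fan_out = np.asarray((A > 0).sum(axis=1)).ravel()  # distinct dests per source
total = np.asarray(A.sum(axis=1)).ravel()          # packets per source

# Toy threshold detector: flag sources whose packet count exceeds a
# quantile calibrated on the observed (active) sources.
threshold = np.quantile(total[total > 0], 0.99)
flagged = total > threshold
```

Sweeping the threshold over such a heavy-tailed distribution is what trades off the p_d and p_fa quantities the abstract refers to: a higher threshold lowers false alarms but also lowers detections.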