
    Unattended network operations technology assessment study. Technical support for defining advanced satellite systems concepts

    The results of an unattended network operations technology assessment study for the Space Exploration Initiative (SEI) are summarized. The scope of the work included: (1) identifying possible enhancements to the proposed Mars communications network; (2) identifying network operations on Mars; (3) performing a technology assessment of possible supporting technologies based on current and future approaches to network operations; and (4) developing a plan for the testing and development of these technologies. The most important results obtained are as follows: (1) the addition of a third Mars Relay Satellite (MRS) and MRS cross-link capabilities will enhance the network's fault tolerance through improved connectivity; (2) network functions can be divided into the six basic ISO network functional groups; (3) distributed artificial intelligence technologies will augment more traditional network management technologies to form the technological infrastructure of a virtually unattended network; and (4) considerable effort is required to bring current network technology for manned space communications up to the level needed for an automated, fault-tolerant Mars communications network.
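
    The connectivity result lends itself to a small illustration: model the relay constellation as a graph and check whether every surface asset can still reach Earth after any single relay fails. The Python sketch below uses an assumed topology; node names and links are illustrative, not the study's actual link geometry.

        # Minimal sketch: an assumed Mars relay topology checked for single-relay
        # fault tolerance; names and links are illustrative, not from the study.

        def reachable(links, removed, start, goal):
            """Breadth-first search over undirected 'links', ignoring node 'removed'."""
            adj = {}
            for a, b in links:
                if removed in (a, b):
                    continue
                adj.setdefault(a, set()).add(b)
                adj.setdefault(b, set()).add(a)
            seen, frontier = {start}, [start]
            while frontier:
                node = frontier.pop()
                for nxt in adj.get(node, ()):
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append(nxt)
            return goal in seen

        def survives_single_relay_loss(links, relays, sites):
            """True if every surface site still reaches Earth after any one relay fails."""
            return all(reachable(links, lost, site, "earth")
                       for lost in relays for site in sites)

        # Two-relay baseline: each surface site sees only one relay.
        two_mrs = [("siteA", "MRS1"), ("siteB", "MRS2"),
                   ("MRS1", "earth"), ("MRS2", "earth")]
        # Third relay plus cross-links: each site sees two relays, and relays see each other.
        three_mrs = two_mrs + [("siteA", "MRS3"), ("siteB", "MRS3"),
                               ("MRS1", "MRS2"), ("MRS2", "MRS3"), ("MRS1", "MRS3")]

        print(survives_single_relay_loss(two_mrs, ["MRS1", "MRS2"], ["siteA", "siteB"]))           # False
        print(survives_single_relay_loss(three_mrs, ["MRS1", "MRS2", "MRS3"], ["siteA", "siteB"]))  # True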

    Advanced information processing system: The Army fault tolerant architecture conceptual study. Volume 2: Army fault tolerant architecture design and analysis

    Described here are the Army Fault Tolerant Architecture (AFTA) hardware architecture and components and the operating system. The architectural and operational theory of the AFTA Fault Tolerant Data Bus is discussed. The test and maintenance strategy developed for use in fielded AFTA installations is presented. An approach to reducing the probability of AFTA failure due to common-mode faults is described. Analytical models for AFTA performance, reliability, availability, life cycle cost, weight, power, and volume are developed. An approach is presented for using VHSIC Hardware Description Language (VHDL) to describe and design AFTA's developmental hardware. A plan is described for verifying and validating key AFTA concepts during the Dem/Val phase. Analytical models and partial mission requirements are used to generate AFTA configurations for the TF/TA/NOE and Ground Vehicle missions.
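
    The analytical reliability models mentioned above are not reproduced here, but a minimal combinatorial sketch of the general kind of model (majority voting over redundant channels, constant failure rate) gives the flavor. The failure rate, mission time, and redundancy levels below are assumptions for illustration, not AFTA figures.

        # Minimal sketch of a combinatorial reliability model for a majority-voted
        # redundant channel group; all parameter values are illustrative assumptions.
        from math import comb, exp

        def channel_reliability(failure_rate_per_hr, hours):
            """Exponential (constant failure rate) reliability of a single channel."""
            return exp(-failure_rate_per_hr * hours)

        def nmr_reliability(n, needed, r):
            """Probability that at least 'needed' of n independent channels survive."""
            return sum(comb(n, k) * r**k * (1 - r)**(n - k) for k in range(needed, n + 1))

        r = channel_reliability(1e-4, 10.0)                   # assumed 1e-4 /hr, 10 h mission
        print(f"simplex   : {r:.8f}")
        print(f"triplex   : {nmr_reliability(3, 2, r):.8f}")  # 2-of-3 majority
        print(f"quadruplex: {nmr_reliability(4, 3, r):.8f}")  # 3-of-4 majority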

    Data Synchronization Technology: Standards, Business Values and Implications

    Internet-enabled connectivity has created opportunities for businesses to conduct various forms of collaborative activities. However, the findings of several surveys indicate that deficiencies in data quality can compromise the potential benefits of these joint efforts. Global data synchronization (GDS), the process of updating product data in a timely manner to maintain data consistency among business partners, is viewed as the key to realizing the benefits of e-collaboration in the global supply chain setting. In this paper, we present the need for data synchronization, discuss the evolution of technical standards for data identification schemes, and introduce the Global Data Synchronization Network (GDSN), the platform on which global data synchronization is carried out. We detail the structure of GDSN and the protocols for the GDS process. Furthermore, we discuss business and management implications of GDS, different approaches to implementing GDS, and challenges to its implementation. The emergence of GDS and GDSN presents research opportunities on issues relating to the implementation of GDS, the relationship between GDSN and the EPCglobal Network, the impact of GDS on inter-organizational relationships, the network effect of global standards, and the evolution of complementary standards; we discuss these opportunities as well. In brief, the article covers the history, present status, and future of GDS and GDSN, as well as their potential, benefits, and implementation issues.
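
    For illustration, one concrete element of the GS1 identification keys used in GDSN is the standard mod-10 check digit shared by GTIN-8/12/13/14 identifiers. The Python sketch below implements that standard rule; the example number is illustrative only.

        # GS1 mod-10 check digit used by GTIN identifiers (example number illustrative).
        def gtin_check_digit(data_digits: str) -> int:
            """Check digit for a GTIN given all digits except the final one."""
            total = 0
            for i, ch in enumerate(reversed(data_digits)):
                weight = 3 if i % 2 == 0 else 1   # rightmost data digit weighted 3
                total += weight * int(ch)
            return (10 - total % 10) % 10

        def is_valid_gtin(gtin: str) -> bool:
            return gtin.isdigit() and int(gtin[-1]) == gtin_check_digit(gtin[:-1])

        print(gtin_check_digit("400638133393"))   # -> 1
        print(is_valid_gtin("4006381333931"))     # -> True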

    Interprocess communication in highly distributed systems

    Issued as the final technical report, Project no. G-36-632. The final technical report has the title: Interprocess communication in highly distributed systems.

    Executable Model Synthesis and Property Validation for Message Sequence Chart Specifications

    Message sequence charts (MSCs) are a formal language for the specification of scenarios in concurrent real-time systems. The thesis addresses the synthesis of executable object-oriented design-time models from MSC specifications. The synthesis integrates with the software development process, its purpose being to automatically create working prototypes from specifications without error and to create executable models on which properties may be validated. The usefulness of existing algorithms for the synthesis of ROOM (Real-Time Object Oriented Modeling) models from MSCs has been evaluated from the perspective of an applications programmer according to various criteria. A number of new synthesis features have been proposed to address the identified shortcomings and applied to a telephony call management system for illustration. These include the specification and construction of hierarchical structure and behavior of ROOM actors, views, multiple containment, replication, resolution of non-determinism, and automatic coordination. Generalizations and algorithms have been provided. The hierarchical actor structure, replication, FSM merging, and global coordinator algorithms have been implemented in the Mesa CASE tool. A comparison is made to other specification and modeling languages and their synthesis, such as SDL, LSCs, and statecharts. Another application of synthesis is to generate a model with support for the automated validation of safety and liveness properties. The Mobility Management services of the GSM digital mobile telecommunications system were specified in MSCs. A Promela model of the system was then synthesized. A number of optimizations have been proposed to reduce the complexity of the model in order to successfully validate it. Properties of the system were encoded in Linear Temporal Logic, and the Promela model was used to automatically validate a number of identified properties using the model checker Spin. A ROOM model was then synthesized from the validated MSC specification using the proposed refinement features.
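
    A common first step in this kind of synthesis is projecting a chart onto each participating instance to obtain per-process event sequences, from which communicating state machines are then built. The Python sketch below illustrates only that projection step on an assumed, simplified call scenario; it is not the thesis's ROOM or Promela generation.

        # Minimal sketch: project an MSC, given as ordered (sender, receiver, message)
        # events, onto each instance; process and message names are illustrative.
        from collections import defaultdict

        def project(msc):
            """Map each instance to its ordered send/receive events."""
            per_process = defaultdict(list)
            for sender, receiver, msg in msc:
                per_process[sender].append(("send", msg, receiver))
                per_process[receiver].append(("recv", msg, sender))
            return dict(per_process)

        chart = [                              # simplified call-setup scenario
            ("User", "Handset", "dial"),
            ("Handset", "Network", "setup"),
            ("Network", "Handset", "connect"),
            ("Handset", "User", "ring_back"),
        ]

        for proc, events in project(chart).items():
            print(proc, events)                # linear FSM skeleton per process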

    TRACKING AND TRACING PORTAL FOR PROJECT LOGISTICS. A Review on the Interconnectivity of EDI, ERP and Cloud-based Systems

    Tracking and tracing is becoming an essential factor in the success of project logistics, and the safety and on-time arrival of shipments have become primary concerns for manufacturing companies. This paper introduces an overall approach for tracking and tracing deliveries from the starting point to the end customer. A detailed implementation of the whole solution is not presented, but each component in the system is analyzed and discussed. Electronic Data Interchange (EDI) has been around for the last 30 years and is known for providing logistics companies with a fast, reliable way to exchange information electronically. EDI, together with Enterprise Resource Planning (ERP), is considered among the key technologies that play an important role in supply chain management tracking networks. Although EDI and ERP systems are not straightforward or easy to establish, many logistics companies still see them as vital factors that can help a company achieve sustainable development, increase productivity, and reduce costs. In this paper, the interconnectivity of EDI, ERP, and cloud-based systems in a tracking and tracing portal is analyzed from a business perspective in order to define what benefits it could bring to logistics and supply chain management tracking networks. A case study of the Logistics Tracking Network (LogTrack) project is presented and examined with a view to implementing, evaluating, and managing the interconnectivity of EDI, ERP, and cloud-based systems from a practical point of view. Information collected from this research project is analyzed to provide a list of mapping attributes between these systems, to be used as a basis for the further development of the tracking and tracing portal. The impacts and implications of such a system for managing business logistics are discussed and presented in the conclusion.
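
    As a rough indication of what mapping attributes between systems can look like, the sketch below translates one parsed EDI shipment-status record into the field names a portal or ERP might store. Every field name here is hypothetical; the LogTrack project's actual mapping is not reproduced.

        # Hypothetical EDI-to-portal attribute mapping; all field names are assumptions.
        EDI_TO_PORTAL = {
            "shipment_id":   "ShipmentNumber",
            "status_code":   "CurrentStatus",
            "status_time":   "StatusTimestamp",
            "location_code": "LastKnownLocation",
            "eta":           "EstimatedArrival",
        }

        def to_portal_record(edi_segment: dict) -> dict:
            """Translate one parsed EDI segment into the portal's field names."""
            return {portal: edi_segment.get(edi) for edi, portal in EDI_TO_PORTAL.items()}

        sample = {"shipment_id": "SHP-001", "status_code": "DEP",
                  "status_time": "2015-06-01T08:30:00Z",
                  "location_code": "FIVAA", "eta": "2015-06-03T12:00:00Z"}
        print(to_portal_record(sample))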

    Connected Information Management

    Society is currently inundated with more information than ever, making efficient management a necessity. Alas, most current information management suffers from several levels of disconnectedness: applications partition data into segregated islands, small notes don't fit into traditional application categories, navigating the data is different for each kind of data, and data is either available on a certain computer or only online, but rarely both. Connected information management (CoIM) is an approach to information management that avoids these kinds of disconnectedness. The core idea of CoIM is to keep all information in a central repository, with generic means for organization such as tagging. The heterogeneity of data is taken into account by offering specialized editors. The central repository eliminates the islands of application-specific data and is formally grounded by a CoIM model. The foundation for structured data is an RDF repository. The RDF editing meta-model (REMM) enables form-based editing of this data, similar to database applications such as MS Access. Further kinds of data are supported by extending RDF, as follows. Wiki text is stored as RDF and can both contain structured text and be combined with structured data. Files are also supported by the CoIM model and are kept externally. Notes can be quickly captured and annotated with meta-data. Generic means for organization and navigation apply to all kinds of data. Ubiquitous availability of data is ensured via two CoIM implementations, the web application HYENA/Web and the desktop application HYENA/Eclipse. All data can be synchronized between these applications. The applications were used to validate the CoIM ideas.
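
    The central-repository idea can be made concrete with a few RDF triples: any item, here a quick note, is stored in the repository and annotated with generic tags, so the same organization mechanism applies to every kind of data. The sketch below uses the Python rdflib package and a hypothetical vocabulary; it is not HYENA's actual schema.

        # Minimal sketch of a tagged note as RDF triples (hypothetical vocabulary,
        # not HYENA's schema). Requires the rdflib package.
        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import RDF, RDFS

        EX = Namespace("http://example.org/coim#")   # assumed namespace

        g = Graph()
        g.bind("ex", EX)

        note = EX.note42
        g.add((note, RDF.type, EX.Note))
        g.add((note, RDFS.label, Literal("Call Alice about the CoIM draft")))
        g.add((note, EX.tag, Literal("todo")))
        g.add((note, EX.tag, Literal("coim")))

        # Files, wiki pages, and other items could be tagged with the same predicate.
        print(g.serialize(format="turtle"))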