8,485 research outputs found

    SMiT: Local System Administration Across Disparate Environments Utilizing the Cloud

    System administration can be tedious. Most IT departments maintain several (if not several hundred) computers, each of which requires periodic housecleaning: updating software, clearing log files, removing old cache files, etc. Compounding the problem is the computing environment itself. Because of the distributed nature of these computers, system administration time is often consumed by repetitive tasks that should be automated. Although system administration tools exist, they are often centralized, unscalable, unintuitive, or inflexible. To meet the needs of system administrators and IT professionals, we developed the Script Management Tool (SMiT). SMiT is a web-based tool that permits administration of distributed computers from virtually anywhere via a common web browser. SMiT consists of a cloud-based server running on Google App Engine enabling users to intuitively create, manage, and deploy administration scripts. To support local execution of scripts, SMiT provides an execution engine that runs on the organization’s local machines and communicates with the server to fetch scripts, execute them, and deliver results back to the server. Because of its distributed, asynchronous architecture, SMiT is scalable to thousands of machines. SMiT is also extensible to a wide variety of system administration tasks via its plugin architecture.
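The fetch-execute-report cycle described above can be sketched in a few lines. This is a minimal illustration, not SMiT's actual API: the `fetch_scripts`/`report_result` interfaces and the job identifiers are invented for the example.

```python
import subprocess

class ExecutionEngine:
    """Hypothetical sketch of a SMiT-style local execution engine.

    The engine periodically fetches pending scripts from the cloud
    server, runs them on the local machine, and delivers the output
    back to the server. Interfaces here are assumptions."""

    def __init__(self, fetch_scripts, report_result):
        self.fetch_scripts = fetch_scripts   # callable -> [(job_id, shell_script)]
        self.report_result = report_result   # callable(job_id, output)

    def poll_once(self):
        # One polling round: run every pending script locally and
        # report its captured output back to the server.
        for job_id, script in self.fetch_scripts():
            proc = subprocess.run(script, shell=True,
                                  capture_output=True, text=True)
            self.report_result(job_id, proc.stdout.strip())

# Stand-in for the cloud server: one housecleaning script to run.
results = {}
engine = ExecutionEngine(
    fetch_scripts=lambda: [("job-1", "echo housecleaning done")],
    report_result=lambda jid, out: results.update({jid: out}),
)
engine.poll_once()
```

Because each engine polls independently and reports asynchronously, the server never blocks on any one machine, which is what makes this style of architecture scale out.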

    Actors and factors - bridging social science findings and urban land use change modeling

    Recent uneven land use dynamics in urban areas, resulting from demographic change, economic pressure and the cities’ mutual competition in a globalising world, challenge both scientists and practitioners, among them social scientists, modellers and spatial planners. Processes of growth and decline specifically affect the urban environment and the residents’ demands on social and natural resources. Social and environmental research seeks a better understanding of, and ways of explaining, the interactions between society and landscape in urban areas. Such understanding is also needed for making life in cities attractive, secure and affordable within, or despite, these uneven dynamics. This position paper, “Actors and factors – bridging social science findings and urban land use change modeling”, presents approaches and ideas on how social science findings on the interaction of the social system (actors) and the land use (factors) can be taken up and formalised using modelling and gaming techniques. It should be understood as a first sketch compiling major challenges and proposing exemplary solutions in the field of interest.

    Domain specific software architectures: Command and control

    GTE is the Command and Control contractor for the Domain Specific Software Architectures program. The objective of this program is to develop and demonstrate an architecture-driven, component-based capability for the automated generation of command and control (C2) applications. Such a capability will significantly reduce the cost of C2 application development and will lead to improved system quality and reliability through the use of proven architectures and components. A major focus of GTE's approach is the automated generation of application components in particular subdomains. Our initial work in this area has concentrated on the message handling subdomain; we have defined and prototyped an approach that can automate one of the most software-intensive parts of C2 systems development. This paper provides an overview of the GTE team's DSSA approach and then presents our work on automated support for message processing.

    Inference as a data management problem

    Inference over OWL ontologies with large A-Boxes has been researched as a data management problem in recent years. This work adopts the strategy of applying a tableaux-based reasoner for complete T-Box classification, and using a rule-based mechanism for scalable A-Box reasoning. Specifically, we establish for the classified T-Box an inference framework, which can be used to compute and materialise inference results. The inference we focus on is type inference in A-Box reasoning, which we define as the process of deriving for each A-Box instance its memberships of OWL classes and properties. As our approach materialises the inference results, it in general provides faster query processing than non-materialising techniques, at the expense of larger space requirements and slower update speed. When the A-Box size is suitable for an RDBMS, we compile the inference framework to triggers, which incrementally update the inference materialisation on both data inserts and data deletes, without needing to re-compute the whole inference. More importantly, triggers make inference available as atomic consequences of inserts or deletes, which preserves the ACID properties of transactions; such inference is known as transactional reasoning. When the A-Box size is beyond the capability of an RDBMS, we compile the inference framework to Spark programmes, which provide scalable inference materialisation in a Big Data system; our evaluation considers reasoning over up to 270 million A-Box facts. Evaluating our work and comparing with two state-of-the-art reasoners, we empirically verify that our approach is able to perform scalable inference materialisation and to provide faster query processing with comparable completeness of reasoning.
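The core idea of materialised type inference can be illustrated with a toy example. This is a sketch of the general technique, not the paper's trigger or Spark code: the class hierarchy and instance names are invented, and a real deployment would express the insert hook as an RDBMS trigger.

```python
# Classified T-Box as a direct subclass -> superclass map (illustrative).
superclasses = {
    "GradStudent": "Student",
    "Student": "Person",
}

# A-Box storage that holds materialised inference results, so that
# queries read derived memberships directly instead of reasoning.
materialised = set()

def insert_type(instance, cls):
    """Trigger-like hook: on inserting a type fact, incrementally
    materialise every superclass membership it entails."""
    while cls is not None:
        materialised.add((instance, cls))
        cls = superclasses.get(cls)   # walk up the classified hierarchy

insert_type("alice", "GradStudent")
```

Because each insert materialises only its own consequences, the whole inference never needs re-computation, which is what lets the trigger version preserve transactional atomicity.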

    Active databases, business rules and reactive agents - what is the connection?

    These three technologies were, and still are, mainly treated separately. Since not much work has been carried out on defining and combining them, we present what has been done and emphasise what could be done. They rely upon similar paradigms and concepts, as will be shown later on, and can be treated as complementary technologies. In this paper we show that reactive agents react according to some set of business rules, and that active databases can be used as a suitable means of implementing business rules, and in this way reactive agents as well. Since reactive agents have been well defined, recent improvements in the fields of active database technology and especially business rules provide the reason to consider the benefits to be achieved from combining these fields.
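The shared paradigm the abstract alludes to is the event-condition-action (ECA) rule. The sketch below is an assumption-laden illustration, not from the paper: it shows a business rule expressed as an ECA triple and a minimal "reactive agent" that fires matching rules, the construct active databases implement natively.

```python
# A rule base of event-condition-action (ECA) triples, the construct
# shared by active databases, business rules and reactive agents.
rules = []

def on(event, condition, action):
    """Register a business rule as an ECA triple."""
    rules.append((event, condition, action))

def raise_event(event, db):
    # The "reactive agent": for each rule whose event matches, check
    # its condition against current state and, if it holds, act.
    for ev, cond, act in rules:
        if ev == event and cond(db):
            act(db)

# Hypothetical inventory scenario: reorder when stock runs low.
db = {"stock": 3, "orders": []}
on("sale",
   condition=lambda db: db["stock"] < 5,              # business rule
   action=lambda db: db["orders"].append("reorder"))  # reaction
raise_event("sale", db)
```

In an active database the same triple would be a trigger (event = insert/update, condition = WHERE clause, action = SQL statement), which is precisely the correspondence the paper exploits.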

    Synthesis of ATM switch controller modules with the Protocol Compiler from Synopsys

    In order to manage the higher complexity of VLSI chips and to reach shorter design cycles, the design effort becomes increasingly focused on higher levels of abstraction. We describe the modeling of some modules of a high-speed telecommunication circuit, an ATM Switch Controller (ASC), using the Protocol Compiler, or Dali (TM), from Synopsys (TM). Dali (TM) supports fast and compact graphical entry of a protocol controller hardware description, with graphical signs similar to formula symbols. The output of the synthesis with Dali (TM) is simulated. Advantages of this design method are discussed and the results of the synthesis are presented.

    Towards an Efficient Evaluation of General Queries

    Database applications often need to evaluate queries containing quantifiers or disjunctions, e.g., for handling general integrity constraints. Existing efficient methods for processing quantifiers depart from the relational model, as they rely on non-algebraic procedures. Looking at quantified query evaluation from a new angle, we propose an approach to processing quantifiers that makes use of relational algebra operators only. Our approach proceeds in two phases. The first phase normalizes the queries, producing a canonical form. This form improves the translation into relational algebra performed during the second phase. The improved translation relies on a new operator, the complement-join, that generalizes the set difference; on algebraic expressions of universal quantifiers that avoid the expensive division operator in many cases; and on a special processing of disjunctions by means of constrained outer-joins. Our method achieves an efficiency at least comparable with that of previous proposals, and better in most cases. Furthermore, it is considerably simpler to implement, as it relies entirely on relational data structures and operators.
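The intuition behind avoiding the division operator can be shown on a classic universally quantified query: "suppliers who supply ALL parts." The sketch below is an illustration in the spirit of the complement-and-subtract rewriting, not the paper's complement-join itself; the relations and tuple values are invented.

```python
# Toy relations (invented for the example).
parts = {"p1", "p2"}
supplies = {("s1", "p1"), ("s1", "p2"), ("s2", "p1")}

suppliers = {s for s, _ in supplies}

# A supplier violates the universal condition iff some (supplier, part)
# pair is missing: build the complement of `supplies` over the full
# cross product, then subtract the offending suppliers.
missing = {(s, p) for s in suppliers for p in parts} - supplies
answer = suppliers - {s for s, _ in missing}
```

Both steps are plain set difference over joins/products, i.e. operations a relational engine already optimises well, which is the point of expressing universal quantification this way rather than via division.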