
    An authorization model for XML databases

    Université de Pau et des Pays de l’Adour

    The Development of a graduate course on identity management for the Department of Networking, Security, and Systems Administration

    Digital identities are being utilized more than ever as a means to authenticate computer users in order to control access to systems, web services, and networks. To maintain these digital identities, administrators turn to Identity Management solutions to offer protection for users, business partners, and networks. This paper proposes an analysis of Identity Management to be accomplished in the form of a graduate-level course of study over a ten-week period for the Networking, Security, and Systems Administration department at Rochester Institute of Technology. This course will be designed for this department because of its emphasis on securing, protecting, and managing the identities of users within and across networks. Most of the security-related courses offered by the department focus primarily on security within enterprises. Therefore, Identity Management, a topic that is becoming more popular within enterprises each day, would complement these courses. Students who enroll in this course will be better equipped to satisfy the needs of modern enterprises when they graduate because they will have a better understanding of how to address security issues that involve managing user identities across networks, systems, and enterprises. This course will focus on several aspects of Identity Management and its use in enterprises today. Covered during the course will be the frameworks of Identity Management, for instance, the Liberty Identity Federation Framework and OASIS SAML 2.0; the Identity Management models; and some of the major Identity Management solutions in use today, such as Liberty Alliance, Microsoft Passport, and Shibboleth. This course will also provide the opportunity to gain hands-on experience with exemplar technologies used in laboratory investigations.
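
    As a rough illustration of the federated authentication flow that frameworks such as OASIS SAML 2.0 standardize, the sketch below builds a minimal SAML 2.0 AuthnRequest and encodes it for the HTTP-Redirect binding; the entity ID and URLs are hypothetical placeholders, and a real deployment would rely on a full SAML library rather than hand-built XML.

    # Minimal sketch, assuming a service provider wants to start SAML 2.0 web SSO.
    # The entity ID and URLs below are hypothetical placeholders.
    import base64, datetime, uuid, zlib
    import xml.etree.ElementTree as ET
    from urllib.parse import urlencode

    SAMLP = "urn:oasis:names:tc:SAML:2.0:protocol"
    SAML = "urn:oasis:names:tc:SAML:2.0:assertion"

    def build_authn_request(sp_entity_id, acs_url):
        # AuthnRequest element with the mandatory protocol attributes
        req = ET.Element("{%s}AuthnRequest" % SAMLP, {
            "ID": "_" + uuid.uuid4().hex,
            "Version": "2.0",
            "IssueInstant": datetime.datetime.utcnow().strftime("%Y-%m-%dT%H:%M:%SZ"),
            "AssertionConsumerServiceURL": acs_url,
        })
        ET.SubElement(req, "{%s}Issuer" % SAML).text = sp_entity_id
        return ET.tostring(req)

    def redirect_query(xml_bytes):
        # HTTP-Redirect binding: raw DEFLATE, then base64, then URL-encode
        co = zlib.compressobj(9, zlib.DEFLATED, -15)
        deflated = co.compress(xml_bytes) + co.flush()
        return urlencode({"SAMLRequest": base64.b64encode(deflated).decode()})

    print(redirect_query(build_authn_request("https://sp.example.edu", "https://sp.example.edu/acs")))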

    PolyOrBAC: a security framework for critical infrastructures

    Due to physical and logical vulnerabilities, a critical infrastructure (CI) can encounter failures of various degrees of severity, and since there are many interdependencies between CIs, simple failures can have dramatic consequences for users. In this paper, we mainly focus on malicious threats that might affect the information and communication system that controls the Critical Infrastructure, i.e., the Critical Information Infrastructure (CII). To address the security challenges that are specific to CIIs, we propose a collaborative access control framework called PolyOrBAC. This approach offers each organization taking part in the CII the capacity to collaborate with the other ones, while maintaining control over its resources and its internal security policy. The interactions between organizations participating in the CII are implemented through web services (WS), and for each WS a contract is signed between the service-provider organization and the service-user organization. The contract describes the WS functions and parameters, the liability of each party, and the security rules controlling the interactions. At runtime, the compliance of all interactions with these security rules is checked. Every deviation from the signed contracts triggers an alarm; the concerned parties are notified, and audits can be used as evidence for sanctioning the party responsible for the deviation. Our approach is illustrated by a practical scenario, based on real emergency actions in an electric power grid infrastructure, and a simulation test bed has been implemented to animate this scenario and experiment with its security issues.
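
    The following sketch illustrates, in simplified form, the kind of runtime contract checking the abstract describes: each web-service interaction is evaluated against the signed contract, and any deviation raises an alarm and is recorded for audit. The contract structure, role names, and operations are hypothetical and do not reproduce the actual PolyOrBAC rule model.

    # Simplified sketch of runtime contract checking, in the spirit of PolyOrBAC
    # but not the authors' implementation; roles and operations are made up.
    from dataclasses import dataclass, field

    @dataclass
    class Contract:
        provider: str
        consumer: str
        # operation name -> set of consumer-side roles allowed to invoke it
        allowed: dict = field(default_factory=dict)

    @dataclass
    class Interaction:
        consumer: str
        role: str
        operation: str

    audit_log = []

    def check(contract: Contract, call: Interaction) -> bool:
        # Permit only interactions that match the signed contract; log everything for audit
        ok = (call.consumer == contract.consumer
              and call.role in contract.allowed.get(call.operation, set()))
        audit_log.append((call, "permitted" if ok else "ALARM: contract violation"))
        return ok

    # Hypothetical emergency scenario: a distributor may request load shedding
    # from a transmission operator only through the 'requestLoadShedding' operation.
    c = Contract("TSO", "Distributor", {"requestLoadShedding": {"grid_operator"}})
    print(check(c, Interaction("Distributor", "grid_operator", "requestLoadShedding")))  # True
    print(check(c, Interaction("Distributor", "billing_clerk", "readMeter")))            # False -> alarm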

    Improving the Authentication Mechanism of Business to Consumer (B2C) Platform in a Cloud Computing Environment: Preliminary Findings

    The reliance of e-commerce infrastructure on the cloud computing environment has undoubtedly increased the security challenges in web-based e-commerce portals. This has necessitated a built-in security feature, essentially to improve the authentication mechanism during the execution of its dependent transactions. Comparative analysis of the existing works and studies on XML-based authentication and non-XML signature-based security mechanisms for authentication in Business to Consumer (B2C) e-commerce showed the advantage of using XML-based authentication, as well as its inherent weaknesses and limitations. It is against this background that this study, based on a review and meta-analysis of previous works, proposes an improved XML digital signature with the RSA algorithm as a novel algorithmic framework that improves the authentication strength of XML digital signatures in B2C e-commerce in a cloud-based environment. Our future work includes testing, validation, and simulation of the proposed authentication framework in Cisco’s XML Management Interface with its built-in NETCONF feature. The evaluation will be done in conformity with international standards and guidelines, such as those of the W3C and NIST.
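
    To make the core mechanism concrete, the sketch below shows the essential steps behind an RSA-based XML digital signature (canonicalize, digest, sign, verify) on a made-up order document. It is a simplified illustration, not the proposed framework or the full W3C XML-DSig envelope, and it assumes the third-party Python 'cryptography' package.

    # Simplified sketch of RSA-signing an XML payload; the order document is a
    # made-up example and the full <Signature>/<SignedInfo> structure is omitted.
    import base64, hashlib
    import xml.etree.ElementTree as ET
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    order = "<Order><Item>book</Item><Total currency='USD'>25</Total></Order>"

    # Canonicalization ensures sender and verifier hash identical bytes
    canonical = ET.canonicalize(order).encode()
    digest = base64.b64encode(hashlib.sha256(canonical).digest())  # would become <DigestValue>

    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    signature = key.sign(canonical, padding.PKCS1v15(), hashes.SHA256())  # would become <SignatureValue>

    # Verification on the consumer side raises InvalidSignature if the XML was tampered with
    key.public_key().verify(signature, canonical, padding.PKCS1v15(), hashes.SHA256())
    print("digest:", digest.decode(), "- signature verified")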

    A Geographic Information System Framework for the Management of Sensor Deployments

    A prototype Geographic Information System (GIS) framework has been developed to map, manage, and monitor sensors with respect to other geographic features, including land base and in-plant features. The GIS framework supports geographic placement and subsequent discovery, query, and tasking of sensors in a network-centric environment using Web services. The framework couples the GIS feature placement logic of sensors with an extensible ontology that captures the capabilities, properties, protocols, integrity constraints, and other parameters of interest for a large variety of sensor types. The approach is significant in that, by leveraging the service-oriented computing infrastructure within the GIS framework, custom GIS-based interfaces can be rapidly developed and sensors and sensor networks can be integrated into applications without detailed knowledge of the sensors’ underlying device drivers.
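
    A minimal sketch of the idea of coupling geographic placement with ontology-style sensor metadata is given below: each sensor record carries a position plus capability and protocol attributes, and a discovery query filters by bounding box and required capability. The record fields and sample sensors are illustrative assumptions, not the framework’s actual schema.

    # Illustrative sketch only: sensor records with location plus capability metadata,
    # and a discovery query over them. Field names and sample sensors are made up.
    from dataclasses import dataclass

    @dataclass
    class Sensor:
        sensor_id: str
        lon: float
        lat: float
        sensor_type: str
        capabilities: frozenset   # e.g. {"temperature", "ppm_CO"}
        protocol: str             # e.g. "SOAP", "REST"

    def discover(sensors, bbox, capability):
        """Return sensors inside (min_lon, min_lat, max_lon, max_lat) offering a capability."""
        min_lon, min_lat, max_lon, max_lat = bbox
        return [s for s in sensors
                if min_lon <= s.lon <= max_lon
                and min_lat <= s.lat <= max_lat
                and capability in s.capabilities]

    plant = [Sensor("T-101", -77.03, 38.90, "thermocouple", frozenset({"temperature"}), "SOAP"),
             Sensor("G-007", -77.02, 38.91, "gas_monitor", frozenset({"ppm_CO", "temperature"}), "REST")]
    print([s.sensor_id for s in discover(plant, (-77.05, 38.88, -77.00, 38.92), "ppm_CO")])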

    Microarrays in cancer research

    Microarray technology has presented the scientific community with a compelling approach that allows for simultaneous evaluation of all cellular processes. Cancer, being one of the most challenging diseases due to its polygenic nature, presents itself as a perfect candidate for evaluation by this approach. Several recent articles have provided significant insight into the strengths and limitations of microarrays. Nevertheless, there are strong indications that this approach will provide new molecular markers that could be used in diagnosis and prognosis of cancers (1, 2). To achieve these goals it is essential that there is a seamless integration of clinical and molecular biological data that allows us to elucidate genes and pathways involved in various cancers. To this end, we are currently evaluating gene expression profiles in human brain, ovarian, breast, hematopoietic, lung, colorectal, head and neck, and biliary tract cancers. To address these issues, we have a joint team of scientists, doctors, and computer scientists from two Virginia universities and a major healthcare provider. The study has been divided into several focus groups that include: Tissue Bank Clinical & Pathology Laboratory Data, Chip Fabrication, QA/QC, Tissue Devitalization, Database Design, and Data Analysis, using multiple microarray platforms. Currently over 300 consenting patients have been enrolled in the study, with the largest number being breast cancer patients. Clinical data on each patient are being compiled into a secure and interactive relational database, and integration of these data elements will be accomplished through a common programming interface. This clinical database contains several key parameters on each patient, including demographic (risk factors, nutrition, co-morbidity, familial history), histopathology (non-genetic predictors), tumor, treatment, and follow-up information. Gene expression data derived from the tissue samples will be linked to this database, which allows us to query the data at multiple levels. The challenge of tissue acquisition and processing is of paramount importance to the success of this venture. A tissue devitalization timeline protocol was devised to ensure sample and RNA integrity. Stringent protocols are being employed to ascertain accurate tumor homogeneity, by serial dissection of each tumor sample into 10 µm frozen sections followed by histopathological evaluation. The multiple platforms being utilized in this study include Affymetrix, Oligo-Chips, and custom-designed cDNA arrays. Selected RNA samples will be evaluated on each platform between the groups. Analysis steps will involve normalization and standardization of gene expression data followed by hierarchical clustering to determine co-regulation profiles. The aim of this conjoint effort is to elucidate pathways and genes involved in various cancers, resistance mechanisms, and molecular markers for diagnosis and prognosis.
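
    The sketch below illustrates the analysis steps named at the end of the abstract, normalization followed by hierarchical clustering of expression profiles to find co-regulated genes, using a small synthetic matrix rather than any study data; parameter choices such as correlation distance and average linkage are assumptions made for illustration.

    # Minimal sketch of normalize-then-cluster on a synthetic expression matrix;
    # the data, distance metric, and linkage method are illustrative assumptions.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(0)
    expr = rng.lognormal(mean=2.0, sigma=1.0, size=(50, 12))    # 50 genes x 12 samples

    log_expr = np.log2(expr + 1.0)                               # variance-stabilizing log transform
    z = (log_expr - log_expr.mean(axis=1, keepdims=True)) / log_expr.std(axis=1, keepdims=True)

    # Correlation distance groups genes with similar (co-regulated) profiles
    dist = pdist(z, metric="correlation")
    tree = linkage(dist, method="average")
    clusters = fcluster(tree, t=5, criterion="maxclust")
    print("cluster sizes:", np.bincount(clusters)[1:])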

    Service composition based on SIP peer-to-peer networks

    Today the telecommunication market is faced with the situation that customers are requesting new telecommunication services, especially value-added services. The concept of Next Generation Networks (NGN) appears to be a solution for this, so this concept is finding its way into the telecommunication area. These customer expectations have emerged in the context of NGN and the associated migration of telecommunication networks from traditional circuit-switched towards packet-switched networks. One fundamental aspect of the NGN concept is to move the intelligence of services out of the switching plane onto separate Service Delivery Platforms, using SIP (Session Initiation Protocol) to provide the required signalling functionality. Driven by this migration process towards NGN, SIP has emerged as the major signalling protocol for IP (Internet Protocol) based NGN. In contrast to ISDN (Integrated Services Digital Network) and IN (Intelligent Network), this leads to significantly lower dependencies between the network and services and makes it possible to implement new services much more easily and quickly. In addition, further concepts from IT (Information Technology), namely SOA (Service-Oriented Architecture), have largely influenced the telecommunication sector, driven by the amalgamation of IT and telecommunications. The benefit of applying SOA to telecommunication services is the acceleration of service creation and delivery. Main features of SOA are that services are reusable, discoverable, combinable, and independently accessible from any location. Integration of these features offers broader flexibility and efficiency for varying demands on services. This thesis proposes a novel framework for service provisioning and composition in SIP-based peer-to-peer networks applying the principles of SOA. One key contribution of the framework is the approach of enabling the provisioning and composition of services by means of SIP. Based on this, the framework provides a flexible and fast way to request the creation of composite services. Furthermore, the framework makes it possible to request and combine multimodal value-added services, which means that they are no longer limited with regard to media types such as audio, video, and text. The proposed framework has been validated by a prototype implementation.
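
    As a purely illustrative sketch of carrying a composition request over SIP, the snippet below assembles a SIP INVITE whose body names the component services to be combined; the header values, addresses, and body format are hypothetical assumptions and do not reproduce the conventions defined in the thesis.

    # Illustrative sketch only: a SIP INVITE carrying a hypothetical composition request.
    import uuid

    def composition_invite(caller, callee, services):
        # Plain-text body listing the component services to be orchestrated (made-up format)
        body = "\r\n".join("compose: " + s for s in services) + "\r\n"
        headers = [
            "INVITE sip:%s SIP/2.0" % callee,
            "Via: SIP/2.0/UDP client.example.org;branch=z9hG4bK" + uuid.uuid4().hex[:8],
            "From: <sip:%s>;tag=%s" % (caller, uuid.uuid4().hex[:8]),
            "To: <sip:%s>" % callee,
            "Call-ID: %s@client.example.org" % uuid.uuid4(),
            "CSeq: 1 INVITE",
            "Content-Type: text/plain",
            "Content-Length: %d" % len(body),
        ]
        return "\r\n".join(headers) + "\r\n\r\n" + body

    print(composition_invite("alice@example.org", "composer@services.example.org",
                             ["speech-to-text", "translation-en-de"]))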