
    Improvement of DHRA-DMDC Physical Access Software DBIDS Using Cloud Computing Technology: a Case Study

    The U.S. government has created and is executing an Identity Management (IdM) vision to support a global, robust, trusted, and interoperable identity management capability that can correctly identify individuals and non-person entities in support of DoD mission operations. Many Directives and Instructions have been issued to standardize the process of designing new systems, and redesigning old ones, with the latest available technologies to meet the vision's requirements. In this thesis we introduce a cloud-based architecture for the Defense Biometric Identification System (DBIDS), along with a set of DBIDS Cloud Services that supports the proposed architecture. This cloud-based architecture moves DBIDS in the right direction to meet the DoD IdM vision and goals by decoupling current DBIDS functions into DBIDS core services, creating the interoperability and flexibility needed to extend DBIDS to future requirements. The thesis shows how DBIDS Cloud Services will help the Defense Manpower Data Center (DMDC) easily expand DBIDS functionality, for example by connecting to other DMDC services or to federated services for vetting purposes. It also serves as a recommended blueprint for a DBIDS architecture supporting the next generation of DBIDS applications, a step closer to realizing the DoD IdM vision and goals in the DMDC Identity Enterprise Solution. The thesis further discusses how virtualized DBIDS workstations can address the software deployment, configuration, and maintenance issues that have been costly problems for DMDC over the years.
    http://archive.org/details/improvementofdhr109457379
    Civilian, Department of Defense
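    The abstract describes the architecture only at a high level; purely as an illustration of what "decoupling DBIDS functions into core services" could mean in code, here is a minimal Python sketch in which every class and field name is hypothetical, not taken from the thesis:

        from abc import ABC, abstractmethod
        from dataclasses import dataclass

        # Illustrative sketch only: names and fields are invented, not from the thesis.

        @dataclass
        class CredentialRecord:
            subject_id: str
            credential_type: str  # e.g. "CAC", "visitor badge"
            status: str           # e.g. "valid", "revoked"

        class VettingService(ABC):
            """One decoupled core service: vetting a presented credential."""

            @abstractmethod
            def vet(self, credential_id: str) -> CredentialRecord: ...

        class LocalVettingService(VettingService):
            """Backed by a local store; a federated variant could satisfy the
            same interface by calling another DMDC or partner endpoint."""

            def __init__(self, records: dict):
                self._records = records

            def vet(self, credential_id: str) -> CredentialRecord:
                record = self._records.get(credential_id)
                if record is None:
                    return CredentialRecord("unknown", "unknown", "revoked")
                return record

        # Client code (e.g. a virtualized workstation image) depends only on the
        # interface, so the backing service can be swapped without redeployment.
        def check_entry(service: VettingService, credential_id: str) -> bool:
            return service.vet(credential_id).status == "valid"

    Because the client depends only on the abstract interface, the same workstation could be pointed at a local, enterprise, or federated vetting service, which is the interoperability argument the abstract makes.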

    Middleware Technologies for Cloud of Things - a survey

    The next wave of communication and applications relies on the new services provided by the Internet of Things (IoT), which is becoming an important part of the future of humans and machines. IoT services are a key solution for providing smart environments in homes, buildings, and cities. In an era of a massive and rapidly growing number of connected things and objects, several challenges have been raised, such as the management, aggregation, and storage of the large volumes of data produced. To tackle some of these issues, cloud computing has been brought to the IoT as the Cloud of Things (CoT), which provides virtually unlimited cloud services to enhance large-scale IoT platforms. Several factors must be considered in the design and implementation of a CoT platform. One of the most important and challenging problems is the heterogeneity of different objects. This problem can be addressed by deploying suitable "middleware", which sits between things and applications and provides a reliable platform for communication among things with different interfaces, operating systems, and architectures. The main aim of this paper is to study middleware technologies for the CoT. Toward this end, we first present the main features and characteristics of middlewares. Next, we study different architectural styles and service domains. We then present several middlewares that are suitable for CoT-based platforms, and lastly we discuss current challenges and issues in the design of CoT-based middlewares.
    Comment: http://www.sciencedirect.com/science/article/pii/S2352864817301268, Digital Communications and Networks, Elsevier (2017)
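    The survey's central point, that middleware hides device heterogeneity behind one uniform interface, can be illustrated with a small adapter-style sketch; the class names, driver, and payload formats below are invented for illustration and are not from the paper:

        import json
        import urllib.request
        from abc import ABC, abstractmethod

        class Thing(ABC):
            """Uniform interface the middleware exposes to applications,
            hiding each device's native protocol and data format."""

            @abstractmethod
            def read(self) -> dict: ...

        class FakeZigbeeDriver:
            """Stand-in for a vendor driver; real middleware would wrap an
            actual protocol stack here."""
            def poll(self) -> dict:
                return {"t": 215}  # tenths of a degree, device-native format

        class ZigbeeSensorAdapter(Thing):
            def __init__(self, driver: FakeZigbeeDriver):
                self._driver = driver

            def read(self) -> dict:
                raw = self._driver.poll()
                return {"temp_c": raw["t"] / 10}  # translate to a common schema

        class HttpSensorAdapter(Thing):
            def __init__(self, url: str):
                self._url = url

            def read(self) -> dict:
                with urllib.request.urlopen(self._url) as resp:
                    return json.load(resp)  # assumes the endpoint returns JSON

        def collect(things: list) -> list:
            """Application code sees one interface for every device."""
            return [t.read() for t in things]

        print(collect([ZigbeeSensorAdapter(FakeZigbeeDriver())]))  # [{'temp_c': 21.5}]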

    The DBCLS BioHackathon: standardization and interoperability for bioinformatics web services and workflows
    The DBCLS BioHackathon Consortium*

    Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems that do not require transferring entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project, and researchers of emerging areas where a standard exchange data format is not well established to an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and the Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues that arose from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security, are discussed. Consequently, we improved the interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for effective advances in bioinformatics web service technologies.
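    Among the technical challenges the report discusses is asynchronous services, where an analysis is submitted and its result fetched later. A common client-side pattern for this is submit-then-poll, sketched below in Python; the /jobs routes, response fields, and state names are illustrative placeholders rather than an API defined by the report:

        import json
        import time
        import urllib.request

        def submit_and_poll(base_url: str, payload: dict, interval_s: float = 5.0) -> dict:
            """Submit a job to an assumed asynchronous web service, then poll
            for completion. All routes and fields here are hypothetical."""
            request = urllib.request.Request(
                f"{base_url}/jobs",
                data=json.dumps(payload).encode(),
                headers={"Content-Type": "application/json"},
            )
            with urllib.request.urlopen(request) as resp:
                job_id = json.load(resp)["job_id"]  # assumed response field

            while True:
                with urllib.request.urlopen(f"{base_url}/jobs/{job_id}") as resp:
                    status = json.load(resp)
                if status["state"] in ("done", "failed"):  # assumed states
                    return status
                time.sleep(interval_s)  # wait before polling again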

    MPEG21 DI Browser: an MPEG-21-based architecture for the consumption of digital items

    One copy is paginated in leaves and the other in pages, but both number 125 (leaves or pages). Master's thesis in Multimedia Technology, Engineering Profile. 2006. Faculdade de Engenharia, Universidade do Porto.

    Data mining for anomaly detection in maritime traffic data

    For the past few years, the oceans have once again become an important means of communication and transport. Traffic density throughout the globe has grown substantially, which has raised some concerns. With this expansion, the need to achieve high Maritime Situational Awareness (MSA) is imperative. At present, this need can be met more easily thanks to the vast amount of data available on maritime traffic. However, this brings in another issue: data overload. There are currently so many data sources, and so much data from which to extract information, that operators cannot handle it all. There is a pressing need for systems that help sift through all the data, analysing and correlating it, and thereby supporting the decision-making process. The main goal of this dissertation is to use different sources of data to detect anomalies and contribute to a clear Recognised Maritime Picture (RMP). To do so, it is necessary to know what types of data exist and which are available for further analysis. The data chosen for this dissertation were Automatic Identification System (AIS) data and Monitorização Contínua das Atividades da Pesca (MONICAP) data, also known as Vessel Monitoring System (VMS) data. A PostgreSQL database was created to store one year's worth of AIS and MONICAP data. To analyse and draw conclusions from the data, a data mining tool, Orange, was used. Tests were conducted to assess the correlation between data sources and to find anomalies. Data correlation has never been more important, and this dissertation aims to show that there is a simple and effective way to get answers from large amounts of data.
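    The abstract names the storage layer (PostgreSQL) and the mining tool (Orange) but not the schema. Purely as an assumed example, a minimal AIS position table and a naive speed-threshold anomaly query might look like the following Python sketch; the table layout, field names, and 40-knot threshold are guesses for illustration, not the dissertation's design:

        import psycopg2  # PostgreSQL driver

        DDL = """
        CREATE TABLE IF NOT EXISTS ais_position (
            mmsi      BIGINT,            -- vessel identifier
            ts        TIMESTAMPTZ,       -- report timestamp
            lat       DOUBLE PRECISION,
            lon       DOUBLE PRECISION,
            sog_knots REAL,              -- speed over ground
            PRIMARY KEY (mmsi, ts)
        );
        """

        # Naive rule: flag position reports whose speed exceeds a plausible
        # maximum; real anomaly detection would also correlate AIS tracks
        # with MONICAP/VMS tracks.
        ANOMALY_QUERY = """
        SELECT mmsi, ts, sog_knots
        FROM ais_position
        WHERE sog_knots > %s
        ORDER BY ts;
        """

        def find_speed_anomalies(dsn: str, max_knots: float = 40.0) -> list:
            with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
                cur.execute(DDL)
                cur.execute(ANOMALY_QUERY, (max_knots,))
                return cur.fetchall()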