Learning regulatory compliance data for data governance in financial services industry by machine learning models
While regulatory compliance data has long been governed in the financial services industry to identify, assess, remediate and prevent risks, improving data governance ("DG") has emerged as a new paradigm that uses machine learning models to enhance the level of data management.
There is a research gap in the literature: machine learning models have not been extensively applied to DG processes by a) predicting data quality ("DQ") in supervised learning while taking temporal sequences and correlations of data noise into account; b) predicting DQ in unsupervised learning while learning the importance of data noise jointly with temporal sequences and correlations of data noise; c) analyzing DQ prediction at a granular level; d) measuring network run-time savings in DQ prediction; and e) predicting information security compliance levels.
Our main research focus is whether our ML models accurately predict DQ and information security compliance levels during DG processes of financial institutions by learning regulatory compliance data from both theoretical and experimental perspectives.
We propose five machine learning models including a) a DQ prediction sequential learning model in supervised learning; b) a DQ prediction sequential learning model with an attention mechanism in unsupervised learning; c) a DQ prediction analytical model; d) a DQ prediction network efficiency improvement model; and e) an information security compliance prediction model.
Experimental results demonstrate the effectiveness of these models: they accurately predict DQ in supervised learning, precisely predict DQ in unsupervised learning, analyze DQ prediction along different dimensions such as risk types and business segments, save significant network run-time in DQ prediction to improve network efficiency, and accurately predict information security compliance levels.
Our models strengthen the DG capabilities of financial institutions by improving DQ, data risk management, bank-wide risk management, and information security based on regulatory requirements in the financial services industry, including Basel Committee on Banking Supervision Standard Number 239, Australian Prudential Regulation Authority ("APRA") Standard CPG 235 and APRA Standard CPG 234. These models are part of DG programs under the DG framework of financial institutions.
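The abstract's core idea — predicting data quality from the temporal sequence of observed data noise — can be illustrated with a deliberately minimal sketch. This is not the authors' actual models (which are sequential learners with attention); it only shows the intuition that recent noise should weigh more than older noise when forecasting next-step DQ. The function name and smoothing scheme are illustrative assumptions.

```python
# Illustrative sketch (not the paper's models): predict next-step data
# quality from a temporal sequence of error rates, using an exponentially
# weighted moving average so recent noise counts more than older noise.

def predict_dq(error_rates, alpha=0.5):
    """Return a predicted data-quality score in [0, 1] for the next step.

    error_rates: observed per-period error rates in [0, 1], oldest first.
    alpha: smoothing factor; higher alpha tracks recent noise more closely.
    """
    if not error_rates:
        return 1.0  # no observed noise: assume full quality
    ewma = error_rates[0]
    for e in error_rates[1:]:
        ewma = alpha * e + (1 - alpha) * ewma  # recency-weighted noise level
    return 1.0 - ewma  # quality is the complement of the expected noise

# A clean series predicts high quality; a noisy one predicts low quality.
clean = predict_dq([0.01, 0.02, 0.01])
noisy = predict_dq([0.30, 0.40, 0.50])
```

A real sequential model would additionally learn correlations between noise sources, which this single-signal smoother cannot capture.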
Context Aware Adaptable Applications - A global approach
The requirements of today's (mostly component-based) applications cannot be expressed without a ubiquitous and mobile part, for end-users as well as for M2M (Machine-to-Machine) applications. Such an evolution implies context management, in order to evaluate the consequences of mobility, and corresponding mechanisms to adapt, or be adapted, to the new environment. Applications are then qualified as context-aware applications. The first part of this paper presents an overview of context and its management through application adaptation. It starts with a definition of context and proposes a model for it. It also presents various techniques for adapting applications to the context, from self-adaptation to supervised approaches. The second part is an overview of architectures for adaptable applications. It focuses on platform-based solutions and shows the information flows between application, platform and context. Finally, it makes a synthesis proposal with a platform for adaptable context-aware applications called Kalimucho. We then present implementation tools for software components and a dataflow model in order to implement the Kalimucho platform.
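The adaptation loop the abstract describes — detect a context change, then adapt the application — can be sketched with a minimal observer pattern. This is an illustrative toy, not the Kalimucho platform itself; the class names and the bandwidth threshold are assumptions.

```python
# Minimal sketch (illustrative, not Kalimucho): a context manager
# notifies registered components of environment changes, and each
# component adapts its own behavior accordingly.

class ContextManager:
    def __init__(self):
        self._context = {}
        self._listeners = []

    def subscribe(self, listener):
        self._listeners.append(listener)

    def update(self, key, value):
        # Record the new context value and push it to every listener.
        self._context[key] = value
        for listener in self._listeners:
            listener.on_context_change(key, value)

class AdaptiveComponent:
    def __init__(self):
        self.mode = "full"

    def on_context_change(self, key, value):
        # Example adaptation: degrade rendering when bandwidth drops.
        if key == "bandwidth_kbps":
            self.mode = "degraded" if value < 256 else "full"

ctx = ContextManager()
component = AdaptiveComponent()
ctx.subscribe(component)
ctx.update("bandwidth_kbps", 128)  # component switches to degraded mode
```

A platform-based solution, as surveyed in the paper, would put this mediation logic in the platform rather than in each application.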
Temporal verification in secure group communication system design
The paper discusses an experience in using a real-time UML/SysML profile and a formal verification toolkit to check a secure group communication system against temporal requirements. A generic framework is proposed and specialized for hierarchical groups.
Multimedia delivery in the future internet
The term "Networked Media" implies that all kinds of media including text, images, 3D graphics, audio
and video are produced, distributed, shared, managed and consumed on-line through various networks,
like the Internet, fiber, WiFi, WiMAX, GPRS, 3G and so on, in a convergent manner [1]. This white
paper is the contribution of the Media Delivery Platform (MDP) cluster and aims to cover the
challenges of Networked Media in the transition to the Future Internet.
The Internet has evolved and changed the way we work and live. End users of the Internet have been
confronted with a bewildering range of media, services and applications, and with technological
innovations concerning media formats, wireless networks, terminal types and capabilities. There is
little evidence that the pace of this innovation is slowing. Today, over one billion users access the
Internet on a regular basis, more than 100 million users have downloaded at least one (multi)media
file and over 47 million of them do so regularly, searching in more than 160 Exabytes of content. In
the near future these numbers are expected to rise exponentially. Internet content is expected to grow
by at least a factor of 6, rising to more than 990 Exabytes before 2012, fuelled mainly by the users
themselves. Moreover, it is envisaged that in the near- to mid-term future, the Internet will provide
the means to share and distribute (new) multimedia content and services with superior quality and
striking flexibility, in a trusted and personalized way, improving citizens' quality of life, working
conditions, edutainment and safety.
In this evolving environment, new transport protocols, new multimedia encoding schemes, cross-layer
in-network adaptation, machine-to-machine communication (including RFIDs), rich 3D content as well as
community networks and the use of peer-to-peer (P2P) overlays are expected to generate new models of
interaction and cooperation, and to support enhanced perceived quality of experience (PQoE) and
innovative applications "on the move", such as virtual collaboration environments, personalised
services/media, virtual sport groups, on-line gaming and edutainment. In this context, interaction
with content combined with interactive/multimedia search capabilities across distributed repositories,
opportunistic P2P networks and dynamic adaptation to the characteristics of diverse mobile terminals
are expected to contribute towards such a vision.
Based on work that has taken place in a number of EC co-funded projects in Framework Programme 6 (FP6)
and Framework Programme 7 (FP7), a group of experts and technology visionaries have voluntarily
contributed to this white paper, aiming to describe the status, the state of the art, the challenges
and the way ahead in the area of content-aware media delivery platforms.
Moving Object Trajectories Meta-Model And Spatio-Temporal Queries
In this paper, a general moving-object trajectories framework is put forward
to allow independent applications that process trajectory data to benefit from
a high level of interoperability, information sharing and efficient answering
of a wide range of complex trajectory queries. Our proposed meta-model is based
on an ontology- and event-based approach, incorporates existing representations
of trajectories and integrates new patterns, such as the space-time path, to
describe activities in geographical space-time. We introduce recursive Region
of Interest concepts and handle mobile-object trajectories with diverse
spatio-temporal sampling protocols and different available sensors, which
traditional data models alone cannot support.
Comment: International Journal of Database Management Systems (IJDMS) Vol.4, No.2, April 201
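The kind of spatio-temporal query the abstract mentions can be made concrete with a minimal sketch: a trajectory as a sequence of timestamped points, queried against a rectangular Region of Interest over a time window. This is an illustration only, not the paper's ontology-based meta-model; all names are hypothetical.

```python
# Illustrative sketch (not the paper's meta-model): a trajectory is a
# time-ordered list of points; the query returns the points that fall
# inside a rectangular Region of Interest during a given time window.

from dataclasses import dataclass

@dataclass(frozen=True)
class Point:
    t: float  # timestamp
    x: float
    y: float

def in_roi(trajectory, xmin, ymin, xmax, ymax, t_start, t_end):
    """Return the trajectory points inside the ROI during [t_start, t_end]."""
    return [p for p in trajectory
            if xmin <= p.x <= xmax and ymin <= p.y <= ymax
            and t_start <= p.t <= t_end]

track = [Point(0, 0.0, 0.0), Point(1, 1.0, 1.0), Point(2, 5.0, 5.0)]
hits = in_roi(track, 0.0, 0.0, 2.0, 2.0, 0.5, 2.5)  # only the point at t=1
```

The paper's recursive ROI concept would extend this by nesting regions inside regions, which a flat predicate like the one above cannot express.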
Dynamic deployment of context-aware access control policies for constrained security devices
Securing access to a server, guaranteeing a certain level of protection over an encrypted communication channel, and executing particular countermeasures when attacks are detected are examples of security requirements. Such requirements are identified based on organizational purposes and expectations in terms of resource access and availability, and also on system vulnerabilities and threats. All these requirements belong to the so-called security policy. Deploying the policy means enforcing it, i.e., configuring the security components and mechanisms so that the system behavior is finally the one specified by the policy. The deployment issue becomes more difficult as growing organizational requirements and expectations generally outpace the integration of new security functionalities in the information system: the information system will not always embed the security functionalities necessary for the proper deployment of contextual security requirements. To overcome this issue, our solution is based on a central-entity approach, which takes charge of unmanaged contextual requirements and dynamically redeploys the policy when context changes are detected by this central entity. We also present an improvement over the OrBAC (Organization-Based Access Control) model. Up to now, a controller based on a contextual OrBAC policy has been passive, in the sense that it assumes policy evaluation is triggered by access requests. Therefore, it does not allow reasoning about policy state evolution when actions occur. The modifications introduced by our work overcome this limitation and provide a proactive version of the model by integrating concepts from action specification languages.
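The contrast between passive and proactive policy evaluation can be sketched with a toy context-aware rule engine. This is loosely in the spirit of OrBAC but is not the OrBAC model itself (OrBAC additionally abstracts subjects, actions and objects into roles, activities and views); all names and the example rule are hypothetical.

```python
# Illustrative sketch of a context-aware access-control policy. A rule
# grants a permission only while its context predicate holds, and the
# policy's context is updated as soon as a change is detected, rather
# than only when an access request arrives.

class Rule:
    def __init__(self, role, action, resource, context_pred):
        self.role = role
        self.action = action
        self.resource = resource
        self.context_pred = context_pred  # callable(context) -> bool

class Policy:
    def __init__(self, rules):
        self.rules = rules
        self.context = {}

    def on_context_change(self, key, value):
        # Proactive update: permissions shift as soon as the context does.
        self.context[key] = value

    def is_permitted(self, role, action, resource):
        return any(r.role == role and r.action == action
                   and r.resource == resource
                   and r.context_pred(self.context)
                   for r in self.rules)

# Hypothetical rule: nurses may read records only during working hours.
working_hours = Rule("nurse", "read", "medical_record",
                     lambda ctx: 8 <= ctx.get("hour", 0) < 18)
policy = Policy([working_hours])

policy.on_context_change("hour", 10)
allowed_day = policy.is_permitted("nurse", "read", "medical_record")
policy.on_context_change("hour", 22)
allowed_night = policy.is_permitted("nurse", "read", "medical_record")
```

In the passive scheme the context would only be sampled inside `is_permitted`; the proactive scheme above lets a central entity also redeploy configuration when `on_context_change` fires.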
Rayman: Interoperability use of Meteorological Observation
The observation of atmospheric phenomena enables the generation of knowledge about the weather and the occurrence of meteors in a region. When this information is georeferenced it becomes useful for a great number of professional and public activities, in fields such as building, infrastructure, aeronautics, biota, tourism, agriculture and energy. At present, access to that information is limited. Few meteorological agencies apply geo-standards, hindering the development of GIS tools for monitoring, threshold alerts and decision support. This work describes how public agencies publish meteorological data and the solution developed at the Spanish Electrical Network (REE) to store the information provided by the Spanish Meteorological Agency (AEMET). The implemented solution enables access, in an interoperable way, to the weather observations collected by the meteorological agency and to the lightning strikes captured by the detection network, and their exploitation both by a desktop GIS capable of connecting to an Oracle Spatial database and through the interfaces of the OGC standardized services (WMS, WFS and SOS).
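The OGC services the abstract lists (WMS, WFS, SOS) are plain HTTP interfaces with standardized query parameters. As a small illustration, a WMS 1.3.0 GetMap request can be assembled as below; the endpoint URL and layer name are hypothetical, not AEMET's or REE's actual services.

```python
# Illustrative sketch: building an OGC WMS 1.3.0 GetMap request URL with
# the standard query parameters. The base URL and layer name below are
# made up for the example.

from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=800, height=600,
                   fmt="image/png", crs="EPSG:4326", version="1.3.0"):
    """Return a WMS GetMap URL for one layer over a bounding box.

    bbox: (min1, min2, max1, max2) in the axis order of the chosen CRS.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/wms", "observations",
                     (35.0, -10.0, 44.0, 5.0))
```

Interoperability comes precisely from this: any WMS-aware GIS client can issue the same request against any compliant server, regardless of the database (e.g. Oracle Spatial) behind it.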