Model-Based Development of firewall rule sets: Diagnosing model inconsistencies
The design and management of firewall rule sets is a difficult and error-prone task because of the
complexity of translating access control requirements into low-level firewall languages. Although
high-level languages have been proposed to model firewall access control lists, none has been widely
adopted by the industry. We think that the main reason is that their complexity is close to that of many
existing low-level languages. In addition, none of the high-level languages that automatically generate
firewall rule sets verifies the model prior to the code-generation phase. Error correction in the early
stages of the development process is far cheaper than correcting errors in
the production phase. In addition, errors that reach the production phase usually have a huge impact
on the reliability and robustness of the generated code and final system.
In this paper, we propose the application of the ideas of Model-Based Development to firewall access control
list modelling and automatic rule set generation. First, an analysis of the most widely used firewall
languages in the industry is conducted. Next, a Platform-Independent Model for firewall ACLs is proposed.
This model is the result of an exhaustive analysis and a discussion of different modelling alternatives,
following a bottom-up methodology. Then, it is proposed that a verification stage be added in the early stages
of the Model-Based Development methodology, and a polynomial time complexity process and algorithms
are proposed to detect and diagnose inconsistencies in the Platform-Independent Model. Finally,
a theoretical complexity analysis and empirical tests with real models were conducted in order to demonstrate
the feasibility of our proposal in real environments.
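The paper's diagnosis algorithms are not reproduced in the abstract, but the kind of inconsistency they target can be illustrated with a minimal sketch. The rule representation below is hypothetical (integer ranges rather than a full PIM with protocols, ports and addresses); it shows "shadowing", a classic ACL inconsistency, detected by polynomial-time pairwise comparison:

```python
# Minimal sketch of one classic firewall ACL inconsistency: shadowing.
# A rule is shadowed when an earlier rule with a different action matches
# every packet the later rule would match, so the later rule never fires.
# Fields are simplified to integer ranges; a real PIM is richer.

def covers(a, b):
    """True if range a fully contains range b (ranges are (lo, hi))."""
    return a[0] <= b[0] and b[1] <= a[1]

def shadowed_rules(rules):
    """rules: ordered list of dicts with 'src', 'dst', 'action'.
    Returns indices of rules fully shadowed by an earlier, conflicting
    rule. Pairwise comparison keeps the check at O(n^2), i.e. polynomial."""
    result = []
    for j, rj in enumerate(rules):
        for ri in rules[:j]:
            if (ri['action'] != rj['action']
                    and covers(ri['src'], rj['src'])
                    and covers(ri['dst'], rj['dst'])):
                result.append(j)
                break
    return result

acl = [
    {'src': (0, 255), 'dst': (0, 255), 'action': 'deny'},
    {'src': (10, 20), 'dst': (30, 40), 'action': 'allow'},  # shadowed by rule 0
]
print(shadowed_rules(acl))  # [1]
```

Catching such dead rules in the model, before code generation, is exactly the kind of early-stage diagnosis the paper argues for.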
An ontology framework for developing platform-independent knowledge-based engineering systems in the aerospace industry
This paper presents the development of a novel knowledge-based engineering (KBE) framework for implementing platform-independent knowledge-enabled product design systems within the aerospace industry. The aim of the KBE framework is to strengthen the structure, reuse and portability of knowledge consumed within KBE systems in view of supporting the cost-effective and long-term preservation of knowledge within such systems. The proposed KBE framework uses an ontology-based approach for semantic knowledge management and adopts a model-driven architecture style from the software engineering discipline. Its phases are mainly (1) capture of the knowledge required for the KBE system; (2) construction of the ontology model of the KBE system; (3) platform-independent model (PIM) technology selection and implementation; and (4) integration of PIM KBE knowledge with a computer-aided design system. A rigorous methodology is employed which comprises five qualitative phases, namely: requirement analysis for the KBE framework, identification of software and ontological engineering elements, integration of both elements, a proof-of-concept prototype demonstrator and, finally, expert validation. A case study investigating four primitive three-dimensional geometry shapes is used to demonstrate the applicability of the KBE framework in the aerospace industry. Additionally, experts within the aerospace and software engineering sectors validated the strengths/benefits and limitations of the KBE framework. The major benefits of the developed approach are the reduction of man-hours required for developing KBE systems within the aerospace industry and the maintainability and abstraction of the knowledge required for developing them. This approach strengthens knowledge reuse and eliminates platform-specific approaches to developing KBE systems, ensuring the preservation of KBE knowledge for the long term.
Leverage Ratios and Basel III: Proposed Basel III Leverage and Supplementary Leverage Ratios
The Basel III Leverage Ratio, as originally agreed upon in December 2010, has recently undergone
revisions and updates – both those proposed by the Basel Committee on Banking
Supervision and proposals introduced in the United States. Whilst recent proposals have
been introduced by the Basel Committee to improve, particularly, the denominator component of
the Leverage Ratio, new requirements have been introduced in the U.S. to upgrade and increase
these ratios, and it is those updates which relate to the Basel III Supplementary Leverage Ratio that
have generated the most interest. This is attributed not only to concerns that many
subsidiaries of US Bank Holding Companies (BHCs) will find it cumbersome to meet such
requirements, but also to potential or possible increases in regulatory capital arbitrage: a
phenomenon which plagued the era of the original 1988 Basel Capital Accord and which also
partially provided impetus for the introduction of Basel II.
This paper is aimed at providing an analysis of the recent updates which have taken place in respect
of the Basel III Leverage Ratio and the Basel III Supplementary Leverage Ratio – both in respect
of recent amendments introduced by the Basel Committee and proposals introduced in the United
States. It will also consider the consequences – as well as the impact – which the U.S. Leverage
Ratios could have on Basel III. There are ongoing debates in relation to the revisions by the Basel
Committee, as well as the most recent U.S proposals to update Basel III Leverage ratios and whilst
these revisions have been welcomed to a large extent, in view of the need to address Tier One
capital requirements and exposure criteria, there is every likelihood
that many global systemically important banks (G-SIBs), and particularly their subsidiaries, will
resort to capital arbitrage. What is likely to be the impact of the recent proposals in the U.S.?
The recent U.S. proposals are certainly very encouraging and should also serve as impetus for other
jurisdictions to adopt a pro-active approach – particularly where existing ratios or standards appear
to be inadequate. This paper also adopts the approach of evaluating the causes and consequences of
the most recent updates by the Basel Committee, as well as those revisions which have taken place
in the U.S., by attempting to balance the merits of the respective legislative updates and proposals.
The value of adopting leverage ratios as a supplementary regulatory tool will also be illustrated by
way of reference to the impact of the recent legislative changes on risk taking activities, as well as
the need to also supplement capital adequacy requirements with the Basel Leverage ratios and the
Basel liquidity standards.
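The leverage ratio itself is a simple quotient of Tier 1 capital over a total exposure measure. A hedged illustration follows: the balance-sheet figures are hypothetical, and the thresholds shown are the 3% Basel III minimum and a 5% supplementary level of the kind discussed in the U.S. proposals.

```python
# Illustrative computation of a Basel III-style leverage ratio:
# Tier 1 capital divided by the total (on- and off-balance-sheet)
# exposure measure. All figures below are hypothetical.

def leverage_ratio(tier1_capital, total_exposure):
    return tier1_capital / total_exposure

tier1 = 90e9          # hypothetical Tier 1 capital, USD
exposure = 2_000e9    # hypothetical total exposure measure, USD

ratio = leverage_ratio(tier1, exposure)
print(f"{ratio:.2%}")    # 4.50%
print(ratio >= 0.03)     # meets the 3% Basel III minimum: True
print(ratio >= 0.05)     # meets a 5% supplementary level: False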
Autonomic management of multiple non-functional concerns in behavioural skeletons
We introduce and address the problem of concurrent autonomic management of
different non-functional concerns in parallel applications built as a
hierarchical composition of behavioural skeletons. We first define the problems
arising when multiple concerns are dealt with by independent managers, then we
propose a methodology supporting coordinated management, and finally we discuss
how autonomic management of multiple concerns may be implemented in a typical
use case. The paper concludes with an outline of the challenges involved in
realizing the proposed methodology on distributed target architectures such as
clusters and grids. Being based on the behavioural skeleton concept proposed in
the CoreGRID GCM, it is anticipated that the methodology will be readily
integrated into the current reference implementation of GCM based on Java
ProActive and running on top of major grid middleware systems.
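The coordination problem can be illustrated with a toy sketch. Everything here is hypothetical (the class names, the priority-based policy), not the GCM/ProActive implementation: two independent managers propose conflicting reconfigurations, and a simple coordinator arbitrates.

```python
# Toy illustration of the multi-concern coordination problem: independent
# autonomic managers (performance vs. power) propose conflicting parallelism
# degrees; left uncoordinated they would oscillate, so a coordinator
# arbitrates. Names and the priority policy are hypothetical.

class Manager:
    def __init__(self, concern, priority):
        self.concern, self.priority = concern, priority

    def propose(self, current_degree):
        # Performance wants to scale out; power wants to scale in.
        delta = 1 if self.concern == 'performance' else -1
        return current_degree + delta

def coordinate(managers, current_degree):
    """Resolve conflicting proposals by letting the highest-priority
    manager win for this adaptation step."""
    winner = max(managers, key=lambda m: m.priority)
    return winner.propose(current_degree)

managers = [Manager('performance', 2), Manager('power', 1)]
print(coordinate(managers, 4))  # 5: the performance manager wins
```

A real coordinated-management protocol, as the abstract suggests, would negotiate between managers rather than apply a fixed priority, but the sketch shows why some arbitration layer is needed at all.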
Losing the War Against Dirty Money: Rethinking Global Standards on Preventing Money Laundering and Terrorism Financing
Following a brief overview in Part I.A of the overall system to prevent money laundering, Part I.B describes the role of the private sector, which is to identify customers, create a profile of their legitimate activities, keep detailed records of clients and their transactions, monitor their transactions to see if they conform to their profile, examine further any unusual transactions, and report to the government any suspicious transactions. Part I.C continues the description of the preventive measures system by describing the government's role, which is to assist the private sector in identifying suspicious transactions, ensure compliance with the preventive measures requirements, and analyze suspicious transaction reports to determine those that should be investigated.
Parts I.D and I.E examine the effectiveness of this system. Part I.D discusses successes and failures in the private sector's role. Borrowing from theory concerning the effectiveness of private sector unfunded mandates, this Part reviews why many aspects of the system are failing, focusing on the subjectivity of the mandate, the disincentives to comply, and the lack of comprehensive data on client identification and transactions. It notes that the system includes an inherent contradiction: the public sector is tasked with informing the private sector how best to detect launderers and terrorists, but to do so could act as a road map on how to avoid detection should such information fall into the wrong hands. Part I.D discusses how financial institutions do not and cannot use scientifically tested statistical means to determine if a particular client or set of transactions is more likely than others to indicate criminal activity. Part I.D then turns to a discussion of a few issues regarding the system's impact that are not related to effectiveness, followed by a summary and analysis of how flaws might be addressed.
Part I.E continues by discussing the successes and failures in the public sector's role. It reviews why the system is failing, focusing on the lack of assistance to the private sector and the lack of necessary data on client identification and transactions. It also discusses how financial intelligence units, like financial institutions, do not and cannot use scientifically tested statistical means to determine probabilities of criminal activity. Part I concludes with a summary and analysis tying both private and public roles together.
Part II then turns to a review of certain current techniques for selecting income tax returns for audit. After an overview of the system, Part II first discusses the limited role of the private sector in providing tax administrators with information, comparing this to the far greater role the private sector plays in implementing preventive measures. Next, this Part turns to consider how tax administrators, particularly the U.S. Internal Revenue Service, select taxpayers for audit, comparing this to the role of both the private and public sectors in implementing preventive measures. It focuses on how some tax administrations use scientifically tested statistical means to determine probabilities of tax evasion. Part II then suggests how flaws in both private and public roles of implementing money laundering and terrorism financing preventive measures might be theoretically addressed by borrowing from the experience of tax administration. Part II concludes with a short summary and analysis that relates these conclusions to the preventive measures system.
Referring to the analyses in Parts I and II, Part III suggests changes to the current preventive measures standard. It suggests that financial intelligence units should be uniquely tasked with analyzing and selecting clients and transactions for further investigation for money laundering and terrorism financing. The private sector's role should be restricted to identifying customers, creating an initial profile of their legitimate activities, and reporting such information and all client transactions to financial intelligence units.
KEMNAD: A Knowledge Engineering Methodology for Negotiating Agent Development
Automated negotiation is widely applied in various domains. However, the development of such systems is a complex knowledge and software engineering task, so a methodology would be helpful. Unfortunately, none of the existing methodologies offers sufficient, detailed support for such system development. To remove this limitation, this paper develops a new methodology made up of: (1) a generic framework (architectural pattern) for the main task, and (2) a library of modular and reusable design patterns (templates) for subtasks. Thus, it is much easier to build a negotiating agent by assembling these standardised components rather than reinventing the wheel each time. Moreover, since these patterns are identified from a wide variety of existing negotiating agents (especially high-impact ones), they can also improve the quality of the final systems developed. In addition, our methodology reveals what types of domain knowledge need to be input into the negotiating agents. This in turn provides a basis for developing techniques to acquire the domain knowledge from human users. This is important because negotiating agents act faithfully on behalf of their human users, and thus the relevant domain knowledge must be acquired from them. Finally, our methodology is validated with one high-impact system.
Criticality analysis for improving maintenance, felling and pruning cycles in power lines
16th IFAC Symposium on Information Control Problems in Manufacturing INCOM 2018
Bergamo, Italy, 11–13 June 2018.
Edited by Marco Macchi, László Monostori, Roberto Pinto.
This paper deals with the process of criticality analysis in overhead power lines, as a tool to improve maintenance, felling & pruning programs. Felling & pruning activities are tasks that utility companies must accomplish to respect the servitudes of the overhead lines, concerned with distances to vegetation, buildings, infrastructures and other network crossings. Conceptually, these power line servitudes can be considered as failure modes of the maintainable items under analysis (power line spans), and the criticality analysis methodology developed will therefore help to optimize actions to avoid these as well as other failure modes of the line maintainable items. The approach is interesting, but another relevant contribution of the paper is the process followed for the automation of the analysis. Automation is possible by utilizing companies' existing IT systems and databases. The paper explains how to use data located in Enterprise Asset Management systems, GIS and dispatching systems for a fast, reliable, objective and dynamic criticality analysis. Promising results are included, along with a discussion of how this technique may have important implications for this type of business.
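The paper's actual criticality model draws on EAM, GIS and dispatching data; the frequency-times-consequence scoring idea behind such analyses can be sketched minimally as follows. All field names and weights here are hypothetical, not the paper's model:

```python
# Minimal sketch of criticality scoring for power-line spans.
# Criticality is taken here as failure frequency x weighted consequence;
# the weights and span attributes are hypothetical illustrations.

WEIGHTS = {'safety': 0.5, 'service': 0.3, 'cost': 0.2}

def criticality(span):
    consequence = sum(WEIGHTS[k] * span[k] for k in WEIGHTS)
    return span['failure_rate'] * consequence

spans = [
    {'id': 'S1', 'failure_rate': 0.8, 'safety': 9, 'service': 7, 'cost': 4},
    {'id': 'S2', 'failure_rate': 0.2, 'safety': 3, 'service': 5, 'cost': 2},
]

# Rank spans so felling & pruning cycles can be shortened where risk is highest.
ranked = sorted(spans, key=criticality, reverse=True)
print([s['id'] for s in ranked])  # ['S1', 'S2']
```

Feeding the failure-rate and consequence fields automatically from EAM, GIS and dispatching databases, rather than from expert workshops, is what makes the analysis fast, objective and dynamic in the sense the abstract describes.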
Combining goal-oriented and model-driven approaches to solve the Payment Problem Scenario
Motivated by the objective to provide improved participation of business domain experts in the design of service-oriented integration solutions, we extend our previous work on using the COSMO methodology for service mediation by introducing a goal-oriented approach to requirements engineering. With this approach, business requirements, including the motivations behind the mediation solution, are better understood, specified, and aligned with their technical implementations. We use the Payment Problem Scenario of the SWS Challenge to illustrate the extension.