Advanced attack tree based intrusion detection
Computer network systems are constantly under attack, or must deal with attack
attempts. The first step in any network's ability to fight intrusive attacks
is being able to detect intrusions as they occur. Intrusion Detection
Systems (IDS) are therefore vital in any kind of network, just as antivirus
software is a vital part of a computer system. With the increasing sophistication
and complexity of network intrusions, most victim systems are compromised by
sophisticated multi-step attacks. To provide advanced intrusion detection
capability against multi-step attacks, it makes sense to adopt a rigorous and
generalising view of tackling intrusions. One direction towards achieving
this goal is modelling and, consequently, modelling-based detection.
An IDS is required that offers good quality of detection: it should not only
detect higher-level attacks and describe the state of ongoing multi-step
attacks, but also determine that a high-level attack has been achieved even
when some of the modelled low-level attacks are missed by the detector. The
absence of an alert may mean either that the corresponding low-level attack is
not being conducted by the adversary, or that it is being conducted but evades
detection.
This thesis presents an attack tree based intrusion detection approach to detect
multi-step attacks. An advanced attack tree modelling technique, the Attack
Detection Tree, is proposed to model multi-step attacks and facilitate intrusion
detection. In addition, the notion of Quality of Detectability is proposed to
describe the ongoing states of both intrusion and intrusion detection. Moreover,
a detection uncertainty assessment mechanism is proposed that applies the
measured evidence to deal with uncertainty during the assessment process,
determining the achievement of high-level attacks even when some modelled
low-level incidents are missed.
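As a rough illustration of the general attack-tree idea, the sketch below evaluates a toy multi-step attack over AND/OR gates. The node names and the Boolean evaluation rule are hypothetical simplifications, not the thesis's Attack Detection Tree or its uncertainty assessment mechanism.

```python
# Minimal attack-tree sketch (illustrative only; node names and the
# evaluation rule are invented, not the thesis's Attack Detection Tree).

class Node:
    def __init__(self, name, gate=None, children=None):
        self.name = name
        self.gate = gate            # "AND", "OR", or None for a leaf
        self.children = children or []

    def detected(self, alerts):
        """Return True if this (sub)goal is considered achieved."""
        if not self.children:       # leaf: a low-level attack step
            return self.name in alerts
        hits = [c.detected(alerts) for c in self.children]
        return all(hits) if self.gate == "AND" else any(hits)

# A toy multi-step attack: root achieved via scan AND (exploit OR phishing).
root = Node("compromise", "AND", [
    Node("scan"),
    Node("gain_access", "OR", [Node("exploit"), Node("phishing")]),
])

# Even if the "exploit" alert is missed, the OR gate lets the detector
# still conclude the high-level attack from the "phishing" evidence.
print(root.detected({"scan", "phishing"}))   # True
print(root.detected({"scan"}))               # False
```

The OR gate is what lets a missed low-level alert be tolerated, which is the intuition behind determining high-level attack achievement despite missed detections.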
Towards a Formalism-Based Toolkit for Automotive Applications
The success of a number of projects has been shown to be significantly
improved by the use of a formalism. However, an open issue remains: to
what extent can a development process based on a single formal notation and
method succeed? The majority of approaches demonstrate a low level of
flexibility by attempting to use a single notation to express all of the
different aspects encountered in software development. Often, these approaches
leave a number of scalability issues open. We prefer a more eclectic approach.
In our experience, the use of a formalism-based toolkit with adequate notations
for each development phase is a viable solution. Following this principle, any
specific notation is used only where and when it is really suitable and not
necessarily over the entire software lifecycle. The approach explored in this
article is perhaps slowly emerging in practice; we hope to accelerate its
adoption. However, the major challenge is still finding the best way to
instantiate it for each specific application scenario. In this work, we
describe a development process and method for automotive applications which
consists of five phases. The process recognizes the need for having adequate
(and tailored) notations (Problem Frames, Requirements State Machine Language,
and Event-B) for each development phase as well as direct traceability between
the documents produced during each phase. This allows for a stepwise
verification/validation of the system under development. The ideas for the
formal development method have evolved over two significant case studies
carried out in the DEPLOY project.
Distributed First Order Logic
Distributed First Order Logic (DFOL) was introduced more than ten years
ago with the purpose of formalising distributed knowledge-based systems, where
knowledge about heterogeneous domains is scattered into a set of interconnected
modules. DFOL formalises the knowledge contained in each module by means of
first-order theories, and the interconnections between modules by means of
special inference rules called bridge rules. Despite their restricted form in
the original DFOL formulation, bridge rules have influenced several works in
the areas of heterogeneous knowledge integration, modular knowledge
representation, and schema/ontology matching. This, in turn, has fostered
extensions and modifications of the original DFOL that have never been
systematically described and published. This paper tackles the lack of a
comprehensive description of DFOL by providing a systematic account of a
completely revised and extended version of the logic, together with a sound and
complete axiomatisation of a general form of bridge rules based on Natural
Deduction. The resulting DFOL framework is then proposed as a clear formal tool
for the representation of, and reasoning about, distributed knowledge and bridge
rules.
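A toy sketch of the bridge-rule idea, under heavy simplification: the modules below hold ground atoms rather than first-order theories, and the rule simply propagates facts from one module to another. Module names, predicates and facts are invented for illustration.

```python
# Toy illustration of a bridge rule between two knowledge modules
# (names and facts are hypothetical; real DFOL bridge rules relate
# first-order formulas across modules, not just ground atoms).

modules = {
    "zoo":  {("penguin", "tweety")},
    "taxo": {("bird", "opus")},
}

# Bridge rule: zoo : penguin(x)  ->  taxo : bird(x)
def apply_bridge(modules, src, src_pred, dst, dst_pred):
    derived = {(dst_pred, arg) for (p, arg) in modules[src] if p == src_pred}
    modules[dst] = modules[dst] | derived
    return modules

apply_bridge(modules, "zoo", "penguin", "taxo", "bird")
print(("bird", "tweety") in modules["taxo"])   # True
```

The point is only that knowledge derived in one module can license conclusions in another, which is what the inference-rule reading of bridge rules captures.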
The Vampire and the FOOL
This paper presents new features recently implemented in the theorem prover
Vampire, namely support for first-order logic with a first class boolean sort
(FOOL) and polymorphic arrays. In addition to having a first class boolean
sort, FOOL also contains if-then-else and let-in expressions. We argue that
the presented extensions facilitate reasoning-based program analysis, both by
increasing the expressivity of first-order reasoners and through gains in
efficiency.
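One way to picture why a first-class if-then-else term matters: it can be eliminated for a plain first-order reasoner by naming the term with a fresh symbol and adding two guarded axioms. The sketch below is an ad hoc illustration of that translation idea over strings, not Vampire's actual implementation or data structures.

```python
# Ad hoc sketch of eliminating an if-then-else term, in the spirit of
# a FOOL-to-FOL translation (representation is invented, not Vampire's).

counter = 0
axioms = []

def name_ite(cond, then_t, else_t):
    """Replace ite(cond, then, else) with a fresh constant plus two axioms."""
    global counter
    counter += 1
    fresh = f"sK{counter}"                       # fresh Skolem-like symbol
    axioms.append(f"{cond} => {fresh} = {then_t}")
    axioms.append(f"~({cond}) => {fresh} = {else_t}")
    return fresh

t = name_ite("x > 0", "x", "-x")                 # |x| as an ite-term
print(t)          # sK1
print(axioms[0])  # x > 0 => sK1 = x
```

Supporting such terms natively, rather than through translations like this, is part of what the expressivity and efficiency claims are about.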
Historical collaborative geocoding
The latest developments in digital technology have provided large data sets that can
increasingly easily be accessed and used. These data sets often contain
indirect localisation information, such as historical addresses. Historical
geocoding is the process of transforming the indirect localisation information
to direct localisation that can be placed on a map, which enables spatial
analysis and cross-referencing. Many efficient geocoders exist for current
addresses, but they do not deal with the temporal aspect and are based on a
strict hierarchy (..., city, street, house number) that is hard or impossible
to use with historical data. Indeed, historical data are full of uncertainties
(temporal aspect, semantic aspect, spatial precision, confidence in the
historical source, ...) that cannot be resolved, as there is no way to go back in time to
check. We propose an open source, open data, extensible solution for geocoding
that is based on the building of gazetteers composed of geohistorical objects
extracted from historical topographical maps. Once the gazetteers are
available, geocoding an historical address is a matter of finding the
geohistorical object in the gazetteers that is the best match to the historical
address. The matching criteria are customisable and include several dimensions
(fuzzy semantic, fuzzy temporal, scale, spatial precision ...). As the goal is
to facilitate historical work, we also propose web-based user interfaces that
help geocode (one address or batch mode) and display over current or historical
topographical maps, so that they can be checked and collaboratively edited. The
system is tested on the city of Paris for the 19th and 20th centuries, shows a
high return rate, and is fast enough to be used interactively.
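A minimal sketch of what a multi-dimensional matching score might look like, combining the semantic, temporal and spatial-precision dimensions the abstract lists. The field names, weights and scoring formulas are invented assumptions, not the system's actual customisable criteria.

```python
# Hypothetical matching score over a gazetteer candidate; field names,
# weights and formulas are illustrative assumptions only.
from difflib import SequenceMatcher

def match_score(query, candidate, weights=(0.5, 0.3, 0.2)):
    w_sem, w_tmp, w_spa = weights
    # semantic: fuzzy string similarity between address labels
    sem = SequenceMatcher(None, query["label"], candidate["label"]).ratio()
    # temporal: overlap of validity intervals, as a fraction of the query span
    lo = max(query["start"], candidate["start"])
    hi = min(query["end"], candidate["end"])
    tmp = max(0, hi - lo) / max(1, query["end"] - query["start"])
    # spatial precision: prefer candidates with small positional uncertainty
    spa = 1.0 / (1.0 + candidate["precision_m"] / 100.0)
    return w_sem * sem + w_tmp * tmp + w_spa * spa

q = {"label": "12 rue st denis, paris", "start": 1860, "end": 1880}
cands = [
    {"label": "rue saint denis, paris", "start": 1850, "end": 1900, "precision_m": 20},
    {"label": "rue denis papin, paris", "start": 1950, "end": 2000, "precision_m": 5},
]
best = max(cands, key=lambda c: match_score(q, c))
print(best["label"])   # the 19th-century candidate wins on temporal overlap
```

The temporal term illustrates why a purely textual geocoder fails here: the second candidate is textually close and spatially precise, but its validity interval does not overlap the query's.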
Manifest domains: analysis and description
We show that manifest domains, an understanding of which is a prerequisite for software requirements prescriptions, can be precisely described: narrated and formalised. We show that such manifest domains can be understood as a collection of endurant (that is, basically spatial) entities: parts, components and materials, and perdurant (that is, basically temporal) entities: actions, events and behaviours. We show that parts can be modelled in terms of external qualities, whether atomic or composite parts, having internal qualities: unique identifications, mereologies (which model relations between parts), and attributes. We show that the manifest domain analysis endeavour can be supported by a calculus of manifest domain analysis prompts: is_entity, is_endurant, is_perdurant, is_part, is_component, is_material, is_atomic, is_composite, has_components, has_materials, has_concrete_type, attribute_names, is_stationary, etcetera; and show how the manifest domain description endeavour can be supported by a calculus of manifest domain description prompts: observe_part_sorts, observe_part_type, observe_components, observe_materials, observe_unique_identifier, observe_mereology, observe_attributes. We show how to model attributes, essentially following Michael Jackson (Software Requirements & Specifications: a Lexicon of Practice, Principles and Prejudices. ACM Press, Addison-Wesley, Reading, 1995), but with a twist: the attribute model introduces the attribute analysis prompts is_static_attribute, is_dynamic_attribute, is_inert_attribute, is_reactive_attribute, is_active_attribute, is_autonomous_attribute, is_biddable_attribute and is_programmable_attribute. The twist suggests ways of modelling "access" to the values of these kinds of attributes: the static attributes by simply "copying" them, once; the reactive and programmable attributes by "carrying" them as function parameters whose values are kept always updated; and the remaining, the external_attributes, by inquiring, when needed, as to their value, as if they were always offered on CSP-like channels (Hoare, Communicating Sequential Processes. C.A.R. Hoare Series in Computer Science. Prentice-Hall International, London, 2004). We show how to model essential aspects of perdurants in terms of their signatures based on the concepts of endurants. And we show how one can "compile" descriptions of endurant parts into descriptions of perdurant behaviours. We do not show prompt calculi for perdurants. The above contributions express a method with principles, techniques and tools for constructing domain descriptions. It is important to realise that we neither wish nor claim that the method can describe everything that is interesting to know about domains.
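To give a flavour of how analysis prompts could be read as predicates over an entity model, here is a toy rendering in Python. The dictionary representation and the particular checks are illustrative assumptions of ours, not the calculus defined in the paper.

```python
# Toy rendering of a few analysis prompts as predicates over a tagged
# entity model (the representation is invented, not the paper's calculus).

entity = {"kind": "part", "composite": True,
          "parts": ["wheel", "engine"], "attributes": {"colour": "red"}}

def is_endurant(e):  return e["kind"] in {"part", "component", "material"}
def is_perdurant(e): return e["kind"] in {"action", "event", "behaviour"}
def is_composite(e): return e["kind"] == "part" and e.get("composite", False)
def attribute_names(e): return sorted(e.get("attributes", {}))

print(is_endurant(entity), is_composite(entity), attribute_names(entity))
# True True ['colour']
```

Each prompt, so read, is a question the analyst asks of an observed entity; the description prompts would then emit corresponding specification text rather than Booleans.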
A CASE STUDY ON HOW PRIMARY-SCHOOL IN-SERVICE TEACHERS CONJECTURE AND PROVE: AN APPROACH FROM THE MATHEMATICAL COMMUNITY
This paper studies how four primary-school in-service teachers develop the mathematical practices of conjecturing and proving. From the consideration of professional development as legitimate peripheral participation in communities of practice, these teachers' mathematical practices have been characterised by using a theoretical framework (consisting of categories of activities) that describes and explains how a research mathematician develops these two mathematical practices. This research has adopted a qualitative methodology and, in particular, a case study methodological approach. Data were collected in a working session on professional development while the four participants discussed two questions that invoked the development of the mathematical practices of conjecturing and proving. The results of this study show the significant presence of informal activities when the four participants conjecture, while few informal activities have been observed when they strive to prove a result. In addition, the use of examples (an informal activity) differs in the two practices, since examples support the conjecturing process but constitute obstacles for the proving process. Finally, the findings are contrasted with other related studies and several suggestions are presented that may be derived from this work to enhance professional development.
(Re)ordering and (dis)ordering of street trade: the case of Recife, Brazil
Informal urban street trade is a prevalent feature across the Global South, where much of the production and/or buying and selling of goods and services is unregulated. For this reason, local authorities have historically seen it as backward, inefficient and detrimental to the development of urban areas and have thus developed formalisation programmes aimed to control it and ultimately make it disappear. Critics argue that the design and implementation of these programmes can marginalise and disempower informal traders, as they act against the traders' livelihoods and the long-established practices they have developed for decades. This research speaks to these concerns and aims to investigate how informal urban street trade manages to continuously reproduce itself despite formalising efforts to make it vanish. The study follows a post-structuralist approach informed by post-development sensibilities (Escobar, 2011). The purpose is two-fold. First, to critically investigate the implications of the imposed power-knowledge essentialism inherent to formalisation processes (Foucault, 1980). Second, to analyse the ways in which cultural and socioeconomic development is enacted through the daily assembling of informal urban street trade (Farías and Bender, 2012; McFarlane, 2011). The research offers a thick ethnographic inquiry, conducted over a year-long period (2014-2015) in the urban centre of Recife, capital of the north-eastern state of Pernambuco, Brazil. Recife is a particularly rich site to investigate these issues, as informal urban street trade has historically been pervasive in its squares and streets, and the municipality has in place a formalisation programme aimed to gather information about traders, license them and relocate them into purpose-built facilities.
The ethnographic inquiry focused on the practices, knowledges, materials and technologies associated with the daily work of both informal traders, selling on the streets, and governing officials implementing the formalisation programme, both on the streets and in the City Council office. Primary data were gathered through ethnographic observations and fieldnote diaries enriched with pictures and audio recordings of the day-to-day sensorial experience of informal urban street trade. This was enhanced with informal conversations as well as semi-structured and unstructured interviews with governing bodies' officials, licensed and unlicensed street traders, formal shop owners, and a diversified set of urban citizens. The thesis highlights that formalisation, through the introduction of regulations, classification schemes and practices of classifying traders through an information system, seeks to establish and expand an individualistic developmentality among all actors. Through this, formalisation aims to shape and normalise their everyday practices to serve the City Council's agenda of rendering informal street trade problematic and presenting formalised trade as a solution that is not only unquestionable but desirable to all. More problematically, the formalisation programme's overdetermination of what a socioeconomic order is to be, and its imposition of individualising subjectivities to assist in its implementation, act against the traders' collective and community-based understanding of work and livelihoods which, contrary to the formalisation discourse, greatly benefits the cultural and socioeconomic development of these communities. This is achieved through the traders' daily assembling of work, value and supply on the streets.
The findings reveal that the collective organisation of traders' work is strongly based on a "cooperative ethos" that is not only efficient in taking advantage of and adapting to the challenging conditions of street markets, but is also key to the ongoing fostering and strengthening of local community identity. The findings also show that traders, through their tacit knowledge of the best fits between products, services and sites, are key in shaping the valuation of both formal and informal enterprises as well as urban sites, thus bolstering the local economy. Lastly, the findings reveal that, through their interactions with formal and informal supply circuits, street traders are fundamental to the distribution and promotion of local artists and producers, thus helping to support and foster local culture. The main contribution of this research is that it offers novel empirical and theoretical insights into the ways in which formalisation and informality are performed. It richly reveals the contested nature of development that is negotiated daily between the individualist developmentality imposed by formalisation and the communitarian-based development possibilities which are enacted through informal trading practices. These developmental possibilities are rendered invisible by formalisation, as classification enforces a strong reading of street trade which is ontologically distant and even contrary to the community-based values which make street trade not only resilient to formalising efforts but also adaptive to challenging conditions and, more importantly, central to the cultural and socioeconomic development of these communities.
Suggestions On How To Combine The Platonic Forms To Overcome The Interpretative Difficulties Of The Parmenides Dialogue
This paper provides an original approach to research on the logical processes that determine how certain forms participate in others. By introducing the concept of relational participation, the problems of self-referentiality of the Platonic forms can be dealt with more effectively. Applying this to the forms of likeness and unlikeness in Parmenides 132d-133a reveals a possible way to resolve different versions of the Third Man Argument. The method of generating numbers from oddness and evenness may also be of interest; relational participation in these forms clarifies the interpretation of Parmenides 143e-144a.
- …