Validating digital forensic evidence
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. This dissertation focuses on the forensic validation of computer evidence. The field is, by necessity, a burgeoning one, and there have been significant advances in the detection and gathering of evidence related to electronic crimes. What makes computer forensics similar to other forensic fields is the considerable emphasis placed on the validity of the digital evidence. The concern is not only the methods used to collect the evidence: perpetrators of digital crimes may also engage in what is called anti-forensics, in which digital forensic techniques are deliberately thwarted and corrupted by those under investigation. In traditional forensics the link between evidence and a perpetrator's actions is often straightforward: a fingerprint on an object indicates that someone has touched it. Anti-forensic activity would be the equivalent of being able to alter the nature of that fingerprint before or during the investigation, rendering the forensic evidence collected invalid or less reliable. This thesis reviews existing security models and digital forensics, paying particular attention to anti-forensic activity that affects the validity of data collected in the form of digital evidence. Building on current models in this field, it suggests a tentative first-step model for managing and detecting the possibility of anti-forensic activity. Because the model is concerned with stopping anti-forensic activity, it is not a forensic model in the usual sense; it is what will be called a “meta-forensic” model. A meta-forensic approach is one intended to stop attempts to invalidate digital forensic evidence. This thesis proposes a formal procedure and guides forensic examiners to look at evidence in a meta-forensic way.
Knowledge Discovery in Database: Induction Graph and Cellular Automaton
In this article we present the general architecture of a cellular machine that makes it possible to reduce the size of induction graphs and to optimize the generation of symbolic rules automatically. Our objective is to propose a tool for detecting and eliminating non-relevant variables from a database. The goal, after acquisition by machine learning from a data set, is to reduce storage complexity and thus decrease computing time. This work experiments with a cellular machine for rule-based inference systems. Our system relies on the graphs generated by the SIPINA method. After an introduction positioning our contribution within the area of machine learning, we briefly present the SIPINA method for automatically extracting knowledge from data. We then describe our cellular system and the knowledge post-processing phase, in particular the validation and use of the extracted knowledge. The presentation of our system is illustrated mainly through an example taken from medical diagnosis.
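The abstract's central idea of detecting and eliminating non-relevant variables before rule induction can be illustrated with a simple relevance filter. The sketch below is a hedged stand-in, not the paper's cellular machine or the SIPINA method: it scores each column by information gain against the class labels and drops columns below a threshold. The function names and the `threshold` value are illustrative assumptions.

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, col):
    """Reduction in label entropy obtained by splitting on column `col`."""
    n = len(labels)
    by_value = {}
    for row, y in zip(rows, labels):
        by_value.setdefault(row[col], []).append(y)
    conditional = sum(len(ys) / n * entropy(ys) for ys in by_value.values())
    return entropy(labels) - conditional

def relevant_columns(rows, labels, threshold=0.01):
    """Keep only the column indices whose information gain exceeds the threshold."""
    return [c for c in range(len(rows[0]))
            if information_gain(rows, labels, c) > threshold]

# Toy example: column 0 determines the label, column 1 is pure noise.
rows = [(0, 'a'), (0, 'b'), (1, 'a'), (1, 'b')]
labels = ['no', 'no', 'yes', 'yes']
print(relevant_columns(rows, labels))  # → [0]
```

A filter like this is a pre-processing step: pruning uninformative variables shrinks the resulting induction graph and the rule set derived from it, which is the storage and computing-time reduction the abstract aims at.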
How effective indeed is present-day mathematics?
We argue that E. Wigner’s well-known claim that mathematics is unreasonably effective in physics (and not in the natural sciences in general, as the title of his article suggests) is only one side of the hill. The other side is the surprising insufficiency of present-day mathematics to capture the uniformities that arise in science outside physics. We describe roughly what the situation is in the areas of (a) everyday reasoning, (b) theory of meaning and (c) vagueness. We also make the point that mathematics as we know it today, founded on the concept of set, need not be a conceptually final and closed system, but only a stage in a developing subject.