Hardware implementation of the SubBytes transformation of the Rijndael (AES) encryption algorithm
An analysis of existing hardware implementations of the substitution module S-Box, the basic component of the SubBytes transformation in the Rijndael encryption algorithm, is provided. Synthesis, modelling, and verification of the dynamic and functional characteristics of the most widespread variants are performed on the basis of their VHDL descriptions.
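For reference, the SubBytes transformation the abstract refers to is defined in FIPS-197: each byte is replaced by its multiplicative inverse in GF(2^8) followed by an affine transformation. The sketch below is an illustrative software model (hardware S-Box variants typically use a lookup table or composite-field logic); the function names are invented, and the brute-force inverse is for clarity, not performance.

```python
def gf_mul(a, b):
    # Multiply in GF(2^8) modulo the AES polynomial x^8 + x^4 + x^3 + x + 1.
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        carry = a & 0x80
        a = (a << 1) & 0xFF
        if carry:
            a ^= 0x1B  # reduce modulo the AES polynomial
        b >>= 1
    return p

def gf_inv(a):
    # Brute-force multiplicative inverse in the 256-element field (0 maps to 0).
    if a == 0:
        return 0
    for x in range(1, 256):
        if gf_mul(a, x) == 1:
            return x

def sbox(a):
    # S-Box = affine transformation applied to the field inverse (FIPS-197, sec. 5.1.1).
    x = gf_inv(a)
    res = 0
    for i in range(8):
        bit = ((x >> i) ^ (x >> ((i + 4) % 8)) ^ (x >> ((i + 5) % 8)) ^
               (x >> ((i + 6) % 8)) ^ (x >> ((i + 7) % 8)) ^ (0x63 >> i)) & 1
        res |= bit << i
    return res
```

A hardware design would precompute `sbox(0)` through `sbox(255)` once and synthesise the resulting 256-byte table (or equivalent combinational logic) in VHDL.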
Technical and legal perspectives on forensics scenario
The dissertation concerns digital forensics. The expression digital forensics (sometimes called digital forensic science)
denotes the science that studies the identification, storage, protection, retrieval, documentation, use, and every
other form of computer data processing so that the data can be evaluated in a legal trial. Digital forensics is a branch of
forensic science. First of all, digital forensics represents the extension of theories, principles and procedures that
are typical and important elements of forensic science, computer science and new technologies. From this
conceptual viewpoint, the logical consideration is that forensic science studies the legal value
of specific events in order to identify possible sources of evidence. The branches of forensic science are: physiological
sciences, social sciences, forensic criminalistics and digital forensics. Moreover, digital forensics includes
several categories relating to the investigation of various types of devices, media or artefacts. These categories are:
- computer forensics: the aim is to explain the current state of a digital artefact, such as a computer system,
storage medium or electronic document;
- mobile device forensics: the aim is to recover digital evidence or data from mobile devices, such as images,
call logs, SMS logs and so on;
- network forensics: the aim relates to the monitoring and analysis of network traffic (local, WAN/Internet,
UMTS, etc.) to detect intrusions and, more generally, to find network evidence;
- forensic data analysis: the aim is to examine structured data to discover evidence, usually related to financial
crime;
- database forensics: the aim relates to databases and their metadata.
The origin and historical development of digital forensics as a discipline of study and research are closely
related to progress in information and communication technology in the modern era. In parallel with the changes
in society due to new technologies and, in particular, the advent of the computer and electronic networks, there
has been a change in the way evidence is collected, managed and analysed. Indeed, in addition to
the more traditional natural and physical elements, procedures have come to include further evidence which,
although equally capable of identifying an occurrence, is inextricably related to a computer, a computer network
or other electronic means. The birth of computer forensics can be traced back to 1984, when the FBI and other
American investigative agencies began to use software for the extraction and analysis of data on personal
computers. In the 1980s, the CART (Computer Analysis and Response Team) was created within
the FBI, with the express purpose of seeking so-called digital evidence. This term denotes all the
information stored or transmitted in digital form that may have some probative value. While the term evidence,
more precisely, indicates the judicial nature of the digital data, the term forensic emphasizes the procedural nature
of the matter: literally, "to be presented to the Court". Digital forensics has a huge variety of applications. The
most common applications are related to crime or cybercrime. Cybercrime is a growing problem for governments,
businesses and private individuals.
- Government: security of the country (terrorism, espionage, etc.) or social problems (child pornography,
child trafficking and so on).
- Business: mainly economic problems, for example industrial espionage.
- Private: personal safety and possessions, for example phishing and identity theft.
Many techniques used in digital forensics are not formally defined, and the relation between technical
procedure and the law is frequently not taken into consideration. From this conceptual perspective, the research
work intends to define and optimize the procedures and methodologies of digital forensics in relation to Italian
regulation, testing, analysing and defining best practices, where none are defined, for common software.
The research questions are:
1. The problem of cybercrime is becoming increasingly significant for governments, businesses and citizens.
- In relation to governments, cybercrime involves problems concerning national security, such as terrorism
and espionage, and social questions, such as trafficking in children and child pornography.
- In relation to businesses, cybercrime entails problems concerning mainly economic issues, such as
industrial espionage.
- In relation to citizens, cybercrime involves problems concerning personal security, such as identity
thefts and fraud.
2. Many techniques used within digital forensics are not formally defined.
3. The relation between procedures and legislation is not always applied and taken into consideration.
The impact of information security awareness training on information security behaviour
Information Security awareness initiatives are seen as critical to any
information security programme. But how do we determine the
effectiveness of these awareness initiatives? We could get our employees
to write a test after the awareness training to determine how well they
understand the policies, but this does not show how the training affects
employees' on-the-job behaviour. Does awareness training have a direct
influence on the security behaviour of individuals, and what is the direct
benefit of awareness training? This research report aims to answer the
question: To what extent does information security awareness training
influence information security behaviour?
Technologies meant to provide security ultimately depend on the
effective implementation and operation of these technologies by people.
Thus awareness of policies is needed by all individuals in an organisation
to ensure that policies are well understood and not misinterpreted. Some
researchers have maintained that educating users is futile, firstly
because it is believed to be difficult to teach users complex security
issues and, secondly, because if users see security as secondary
they will not pay enough attention to it.
This research found that, firstly, there is a shortage of in-depth
information security awareness research and that behavioural concepts
are not properly taken into account for security awareness programmes.
There is a shortage of theoretical models explaining how awareness
training affects behaviour. Secondly, this research tested a proposed
model empirically using system-generated data as indicators of behaviour
in a pretest-posttest experimental design. It was found that security
awareness training was effective in terms of end-users retaining security
knowledge. However, there was no evidence to suggest that security
awareness by itself is sufficient to ensure compliant behaviour by end-users.
Security awareness training is a necessary, integral component
that can influence compliant behaviour, but it is not adequate to do so
fully. Practitioners must insist that their security awareness programmes
are measured in terms of effectiveness and focus on behavioural aspects
to complement traditional security awareness initiatives.
CBAC – a model for conflict-based access control
Organisations that seek a competitive advantage cannot afford to compromise their brand reputation or expose it to disrepute. When employees leak information, it is not only the breach of confidentiality that is a problem, but it also causes a major brand reputation problem for the organisation. Any possible breach of confidentiality should be minimised by implementing adequate security within the organisation and among its employees. An important issue to address is the development of suitable access control models that are able to restrict access not only to unauthorised data sets, but also to unauthorised combinations of data sets. Within organisations such as banks, clients may exist that are in conflict with one another. This conflict results from the fact that clients are functioning in the same business domain and that their information should be shielded from one another because they are in competition for various reasons. When information on any of these conflicting clients is extracted from their data sets via a data-mining process and used to their detriment or to the benefit of the guilty party, this is considered a breach of confidentiality. In data-mining environments, access control usually strips the data of any identity so as to concentrate on tendencies and ensure that data cannot be traced back to a respondent. There is an active research field in data mining that focuses specifically on 'preserving' the privacy of the data during the data-mining process. However, this approach does not account for those situations when data mining needs to be performed to give answers to specific clients. In such cases, when the clients' identity cannot be stripped, it is essential to minimise the chances of a possible breach of confidentiality.
For this reason, this thesis investigated an environment where conflicting clients' information can easily be gathered and used or sold, so as to justify the inclusion of conflict management in the proposed access control model. This thesis presents the Conflict-based Access Control (CBAC) model. The model makes it possible to manage conflict on different levels of severity among the clients of an organisation – not only as specified by the clients, but also as calculated by the organisation. Both types of conflict have their own cut-off points beyond which the conflict is considered to be of no value any longer. Finally, a proof-of-concept prototype illustrates that the incorporation of conflict management is a viable solution to the problem of access control, as it minimises the chances of a breach of confidentiality. Thesis (PhD), University of Pretoria, 2012 (Computer Science).
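The CBAC model itself is specified in the thesis; purely as a hypothetical illustration of the underlying idea (denying access to conflicting combinations of client data sets, subject to a severity cut-off, in the spirit of Chinese Wall-style policies), a minimal sketch might look as follows. All class and method names here are invented for illustration.

```python
class ConflictRegistry:
    """Records pairwise conflict severities (0.0-1.0) between clients."""
    def __init__(self):
        self.conflicts = {}  # client -> {conflicting client: severity}

    def declare(self, a, b, severity):
        self.conflicts.setdefault(a, {})[b] = severity
        self.conflicts.setdefault(b, {})[a] = severity

    def severity(self, a, b):
        return self.conflicts.get(a, {}).get(b, 0.0)

class CBACMonitor:
    """Denies access to a client's data set when it conflicts, above a
    cut-off, with a data set the subject has already accessed."""
    def __init__(self, registry, cutoff=0.5):
        self.registry = registry
        self.cutoff = cutoff      # conflicts below this severity are ignored
        self.history = {}         # subject -> set of clients already accessed

    def request(self, subject, client):
        accessed = self.history.setdefault(subject, set())
        for prior in accessed:
            if self.registry.severity(client, prior) >= self.cutoff:
                return False      # unauthorised combination of data sets
        accessed.add(client)
        return True
```

For example, once a subject has accessed a bank's data set, a request for a competing bank's data set (declared in conflict above the cut-off) would be denied, while an unrelated client's data remains accessible.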
An exploratory study of techniques in passive network telescope data analysis
Careful examination of the composition and concentration of malicious traffic in transit on the channels of the Internet provides network administrators with a means of understanding and predicting damaging attacks directed towards their networks. This allows for action to be taken to mitigate the effect that these attacks have on the performance of their networks and the Internet as a whole by readying network defences and providing early warning to Internet users. One approach to malicious traffic monitoring that has garnered some success in recent times, as exhibited by the study of fast-spreading Internet worms, involves analysing data obtained from network telescopes. While some research has considered using measures derived from network telescope datasets to study large-scale network incidents such as Code-Red, SQL Slammer and Conficker, there is very little documented discussion on the merits and weaknesses of approaches to analysing network telescope data. This thesis is an introductory study in network telescope analysis and aims to consider the variables associated with the data received by network telescopes and how these variables may be analysed. The core research of this thesis considers both novel and previously explored analysis techniques from the fields of security metrics, baseline analysis, statistical analysis and technical analysis as applied to analysing network telescope datasets. These techniques were evaluated as approaches to recognising unusual behaviour by observing their ability to identify notable incidents in network telescope datasets.
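As an illustrative sketch of the kind of baseline analysis the thesis evaluates (not the thesis's actual technique), a simple moving-average detector can flag days whose packet counts to a monitored port deviate strongly from recent history. The function name and parameters are invented for illustration.

```python
from statistics import mean, stdev

def flag_anomalies(daily_counts, window=7, k=3.0):
    """Return indices of days whose count deviates from the mean of the
    preceding `window` days by more than `k` standard deviations."""
    flagged = []
    for i in range(window, len(daily_counts)):
        base = daily_counts[i - window:i]
        mu, sigma = mean(base), stdev(base)
        if sigma > 0 and abs(daily_counts[i] - mu) > k * sigma:
            flagged.append(i)
    return flagged
```

Applied to counts of, say, 445/tcp packets per day, a worm outbreak would show up as a run of flagged days; quiet, stable traffic produces no flags.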
A framework for the application of network telescope sensors in a global IP network
The use of Network Telescope systems has become increasingly popular amongst security researchers in recent years. This study provides a framework for the utilisation of this data. The research is based on a primary dataset of 40 million events spanning 50 months collected using a small (/24) passive network telescope located in African IP space. This research presents a number of differing ways in which the data can be analysed, ranging from low-level protocol-based analysis to higher-level analysis at the geopolitical and network topology level. Anomalous traffic and illustrative anecdotes are explored in detail and highlighted. A discussion relating to bogon traffic observed is also presented. Two novel visualisation tools are presented, which were developed to aid in the analysis of large network telescope datasets. The first is a three-dimensional visualisation tool which allows for live, near-real-time analysis, and the second is a two-dimensional fractal-based plotting scheme which allows for plots of the entire IPv4 address space to be produced and manipulated. Using the techniques and tools developed for the analysis of this dataset, a detailed analysis of traffic recorded as destined for port 445/tcp is presented. This includes the evaluation of traffic surrounding the outbreak of the Conficker worm in November 2008. A number of metrics relating to the description and quantification of network telescope configuration and the resultant traffic captures are described, the use of which it is hoped will facilitate greater and easier collaboration among researchers utilising this network security technology. The research concludes with suggestions relating to other applications of the data and intelligence that can be extracted from network telescopes, and their use as part of an organisation's integrated network security system.
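The thesis's two-dimensional plotting scheme is fractal-based; as a much cruder, hypothetical stand-in for the same idea of mapping telescope sources onto the IPv4 address space, one can bucket source addresses by their first two octets onto a 256x256 density grid. The function name is invented for illustration.

```python
def ipv4_grid(addresses):
    """Bucket dotted-quad IPv4 source addresses into a 256x256 grid
    keyed by their first two octets (a coarse map of IPv4 space)."""
    grid = [[0] * 256 for _ in range(256)]
    for addr in addresses:
        o1, o2 = (int(x) for x in addr.split(".")[:2])
        grid[o1][o2] += 1
    return grid
```

Rendering such a grid as an image gives a coarse picture of which parts of the address space are sourcing traffic; a space-filling-curve layout, as used in the thesis, additionally keeps numerically adjacent prefixes visually adjacent.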
A Code of Conduct for Computer Forensic Investigators
The amount of electronic data that is held about individuals and their activities is
staggering. Tools enabling the recovery of data believed deleted vary in the consistency and
reliability of their results. The data under review can be fed into investigative tools which also
vary immensely in reliability, consistency, quality and indeed price.
Conclusions and inferences drawn from the use of these tools can be morally, socially
and commercially damaging for the individuals or entities being investigated, often not
purely because of the lack of experience of the investigator, but also because of the
simplistic operation of the toolsets.
Whilst prescriptive guidelines exist in the public sector for the proper handling, analysis
and reporting of computer evidence, little commercially independent professional
guidance exists in the private sector. This lack of guidance has led to a position whereby
actors in the field of data forensics face few challenges as to their expertise or
experience. Recent cases of incompetence and the crossing of ethical and professional
boundaries provide strong support for a national, preferably international, certification
and training scheme for data forensic analysts, supported by clear ethical codes.
This research, in light of the above challenges, provides examples of failures in
extrapolation, operator understanding and tool use; argues for a code of
conduct to ensure that correct and repeatable process is followed; and suggests an
outline for the supervision of conformity to that code in the private
sector. The current forensics community and academic research body of knowledge,
supported by the extensive experience of the researcher have been the major inputs to
the work. The outputs of this work are intended to form a solid base for the furtherance
of the Computer Forensics profession, and as such will represent a significant
contribution to the advancement and knowledge base of that profession.