A Decade of Shared Tasks in Digital Text Forensics at PAN
Digital text forensics aims at examining the originality and credibility of information in electronic documents and, in this regard, at extracting and analyzing information about the authors of these documents. The research field has developed substantially during the last decade. PAN is a series of shared tasks that started in 2009 and has contributed significantly to attracting the attention of the research community to well-defined digital text forensics tasks. Several benchmark datasets have been developed to assess state-of-the-art performance in a wide range of tasks. In this paper, we present the evolution of both the examined tasks and the developed datasets during the last decade. We also briefly introduce the upcoming PAN 2019 shared tasks.
We are indebted to many colleagues and friends who contributed greatly to PAN's tasks: Maik Anderka, Shlomo Argamon, Alberto Barrón-Cedeño, Fabio Celli, Fabio Crestani, Walter Daelemans, Andreas Eiselt, Tim Gollub, Parth Gupta, Matthias Hagen, Teresa Holfeld, Patrick Juola, Giacomo Inches, Mike Kestemont, Moshe Koppel, Manuel Montes-y-Gómez, Aurelio Lopez-Lopez, Francisco Rangel, Miguel Angel Sánchez-Pérez, Günther Specht, Michael Tschuggnall, and Ben Verhoeven. Our special thanks go to PAN's sponsors throughout the years and not least to the hundreds of participants.
Potthast, M.; Rosso, P.; Stamatatos, E.; Stein, B. (2019). A Decade of Shared Tasks in Digital Text Forensics at PAN. Lecture Notes in Computer Science. 11438:291-300. https://doi.org/10.1007/978-3-030-15719-7_39
A Forensic First Look at a POS Device: Searching For PCI DSS Data Storage Violations
According to the Verizon 2018 Data Breach Investigations Report, 321 POS terminals (user devices) were involved in data breaches in 2017 [1]. These breaches involved standalone POS terminals as well as associated controller systems. This paper examines a standalone Point-of-Sale (POS) system commonly used in smaller retail stores and restaurants, with the aim of extracting unencrypted data and identifying possible violations of the Payment Card Industry Data Security Standard (PCI DSS) requirement to protect stored cardholder data. The persistent storage (flash memory chips) was removed from the devices and its contents were successfully acquired. Information about the device and the code running on it was successfully extracted, although no PCI DSS data storage violations were identified.
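A search of an acquired flash image for unencrypted cardholder data can be sketched as a scan for card-number-like digit runs, filtered by the Luhn checksum that real primary account numbers satisfy. This is a minimal illustration of the general technique, not the authors' actual procedure; the buffer contents and function names are invented.

```python
import re

def luhn_valid(digits: str) -> bool:
    """Luhn checksum that valid payment card numbers satisfy."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_candidate_pans(image: bytes):
    """Yield (offset, digits) for 13-19 digit runs that pass the Luhn check."""
    text = image.decode("latin-1")  # 1:1 byte-to-character mapping
    for m in re.finditer(r"(?<!\d)\d{13,19}(?!\d)", text):
        if luhn_valid(m.group()):
            yield m.start(), m.group()

# 4111111111111111 is a well-known test card number; the second digit run fails Luhn.
dump = b"\x00header4111111111111111\xfffiller1234567890123\x00"
print(list(find_candidate_pans(dump)))
```

A real examination would also search for track-2-like patterns and run the scan across carved and slack space, but the Luhn filter above is the standard way to cut down false positives from arbitrary digit runs.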
Hominid butchers and biting crocodiles in the African Plio-Pleistocene.
Zooarchaeologists have long relied on linear traces and pits found on the surfaces of ancient bones to infer ancient hominid behaviors such as slicing, chopping, and percussive actions during butchery of mammal carcasses. However, such claims about Plio-Pleistocene hominids rely mostly on very small assemblages of bony remains. Furthermore, recent experiments on trampling animals and biting crocodiles have shown each to be capable of producing mimics of such marks. This equifinality, the creation of similar products by different processes, makes deciphering early archaeological bone assemblages difficult. Bone modifications among Ethiopian Plio-Pleistocene hominid and faunal remains at Asa Issie, Maka, Hadar, and Bouri were reassessed in light of these findings. The results show that crocodiles were important modifiers of these bone assemblages. The relative roles of hominids, mammalian carnivores, and crocodiles in the formation of Oldowan zooarchaeological assemblages will only be accurately revealed by better bounding equifinality. Critical analysis within a consilience-based approach is identified as the pathway forward. More experimental studies and increased archaeological fieldwork aimed at generating adequate samples are now required.
Reconstructing the progress of digital forensic evidence examination and analysis
Abstract. Many commands and tools are used during the evidence examination and analysis stages of digital forensics. If the need to replicate the exact steps from these stages arises later, doing so without proper documentation can be an arduous task. Thus, this thesis focuses on determining how the story of digital forensics progression could be told. To tell the story, this thesis contributes a three-piece system consisting of an updated version of the data collection tool titled Hardtrace, an Application Programming Interface (API) for summarizing and storing collected data to the cloud, and lastly a visualizer application allowing forensic researchers to visually inspect the steps taken during examination and analysis.
To obtain data on digital forensics progression and to test the system, a case study was conducted. The study's participants had to complete a memory forensics Capture the Flag challenge while using Hardtrace. The data collected from each participant was sent to the cloud API. The system's ability to reconstruct and detail the progression of participants' work was tested by performing visual and statistical analysis on the summarized data. System performance testing was also conducted.
The results demonstrated that the presented system was able to detail, through visualization, the steps case study participants took while solving the challenge. Statistical summary analysis provided a large quantity of information on how each participant worked, deepening the understanding gained from visual analysis alone. Finally, performance analysis showed that the system is able to summarize and visualize data in seconds. Updates to Hardtrace reduced command execution times significantly; nonetheless, the more system calls a tool or command performs, the more execution time overhead Hardtrace still adds.
Reconstructing the examination and analysis of digital forensic evidence. Abstract. Several commands and tools are used during the examination and analysis phases of digital forensics. If these phases later need to be repeated exactly, or the investigator must explain how the evidence was handled, recalling the steps taken can be challenging without proper documentation. This thesis therefore focuses on how the progression of the examination and analysis phases could be reported programmatically. For this purpose, a three-part system was implemented, consisting of Hardtrace, an API, and a visualizer application. Hardtrace is an existing data collection tool that is updated in this work. The cloud-hosted API receives and stores the data produced by Hardtrace and creates summaries of it. With the visualizer application, a forensic researcher can visually inspect the progression of the forensic examination they performed.
The implemented system's ability to reconstruct the progression of the digital forensics phases was tested with a case study. The study's participants completed a memory forensics Capture the Flag challenge, and data on their performance was collected with Hardtrace. The summaries that the API produced from the collected data were analyzed visually and statistically.
The results of the thesis showed that the system was able to convey, by means of visualization, how the case study participants overcame the challenge given to them. Statistical analysis of the participants' performance provided much additional information about their activity. The system's performance was found to be good when summarizing and visualizing data. The updates made to Hardtrace reduced the execution times of commands and tools considerably, but despite this, the more system calls a command or tool uses, the more Hardtrace increases execution time.
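The collector-to-API pipeline described above can be sketched as a wrapper that runs a command and emits a JSON record of what was run, when, and for how long. This is a hypothetical sketch only: the record schema and function name are invented and do not reflect Hardtrace's actual implementation or data format.

```python
import json
import shlex
import subprocess
import time
from datetime import datetime, timezone

def record_command(cmdline: str) -> dict:
    """Run a command and record what/when/how long, in the spirit of a
    collection tool like Hardtrace (the record fields here are invented)."""
    start = time.perf_counter()
    proc = subprocess.run(shlex.split(cmdline), capture_output=True, text=True)
    return {
        "command": cmdline,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "duration_s": round(time.perf_counter() - start, 4),
        "exit_code": proc.returncode,
        "stdout_bytes": len(proc.stdout),
    }

# Each record could then be POSTed to a cloud API that stores and summarizes
# the session, and a visualizer could replay the records as a timeline.
rec = record_command("echo volatility3 -f memory.raw windows.pslist")
print(json.dumps(rec, indent=2))
```

A timeline of such records per participant is what makes both the visual replay and the statistical summaries described in the abstract possible.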
A Framework for the Systematic Evaluation of Malware Forensic Tools
Following a series of high profile miscarriages of justice linked to questionable expert evidence, the post of the Forensic Science Regulator was created in 2008 with a remit to improve the standard of practitioner competences and forensic procedures. It has since moved to incorporate a greater level of scientific practice in these areas, as used in the production of expert evidence submitted to the UK Criminal Justice System. Accreditation to their codes of practice and conduct will become mandatory for all forensic practitioners by October 2017. A variety of challenges with expert evidence are explored and linked to a lack of a scientific methodology underpinning the processes followed. In particular, the research focuses upon investigations where malicious software (‘malware’) has been identified.
A framework, called the 'Malware Analysis Tool Evaluation Framework' (MATEF), has been developed to address this lack of a methodology for evaluating software tools used during investigations involving malware. A prototype implementation of the framework was used to evaluate two tools against a population of over 350,000 samples of malware. Analysis of the findings indicated that the choice of tool can impact the number of artefacts observed in malware forensic investigations, and also identified the optimal execution time for a given tool when observing malware artefacts.
Three different measures were used to evaluate the framework. The first evaluated the framework against its requirements and determined that these were largely met; where they were not, this is attributed to matters either outside the scope or to the fledgling nature of the research. Another measure considered the framework's performance in terms of speed and resource utilisation, which identified scope for improvement in the time taken to complete a test and the need for more economical use of disk space. Finally, the framework provides a scientific means to evaluate malware analysis tools, hence addressing the research question, subject to the level at which ground truth is established.
A number of contributions are produced as the output of this work. First, there is confirmation of the case that trusted practice is lacking in the field of malware forensics. Second, the MATEF itself, as it facilitates the production of empirical evidence of a tool's ability to detect malware artefacts. A third contribution is a set of requirements for establishing trusted practice in the use of malware artefact detection tools. Finally, there is empirical evidence supporting both the notion that the choice of tool can impact the number of artefacts observed in malware forensic investigations and the identification of an optimal execution time for a given tool when observing malware artefacts.
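A harness in the spirit of MATEF, executing a tool over a sample set at increasing runtimes and looking for the point where the mean artefact count stops improving, might be structured as follows. Everything here is illustrative: the artefact counter is a simulated stub, and the selection criterion is an assumption, not the framework's published method.

```python
import random

def count_artefacts(tool: str, sample: str, runtime_s: int) -> int:
    """Stub for observing artefacts (files, registry keys, connections)
    left by a malware sample under a monitoring tool. Simulated here:
    counts grow with runtime, then plateau at a per-sample ceiling."""
    rng = random.Random(f"{tool}:{sample}")  # deterministic per (tool, sample)
    ceiling = rng.randint(5, 20)
    return min(ceiling, runtime_s // 2 + rng.randint(0, 2))

def optimal_execution_time(tool: str, samples: list, budget_s: int = 60) -> int:
    """Shortest runtime after which the mean artefact count over the
    sample set no longer improves (an assumed optimality criterion)."""
    best_time, best_mean = budget_s, -1.0
    for runtime in range(5, budget_s + 1, 5):
        mean = sum(count_artefacts(tool, s, runtime) for s in samples) / len(samples)
        if mean > best_mean:
            best_mean, best_time = mean, runtime
    return best_time

samples = [f"sample_{i:03d}" for i in range(10)]
print("optimal runtime for toolA:", optimal_execution_time("toolA", samples), "s")
```

In a real deployment the stub would be replaced by instrumented execution of each sample in an isolated environment, and comparing `optimal_execution_time` across tools yields exactly the kind of per-tool evidence the thesis argues for.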
Technical and legal perspectives on forensics scenario
The dissertation concerns digital forensics. The expression digital forensics (sometimes called digital forensic science) denotes the science that studies the identification, storage, protection, retrieval, documentation, use, and every other form of processing of computer data so that it can be evaluated in a legal trial. Digital forensics is a branch of forensic science. First of all, digital forensics represents the extension of theories, principles and procedures that are typical and important elements of forensic science, computer science and new technologies. From this conceptual viewpoint, the logical consideration concerns the fact that forensic science studies the legal value of specific events in order to identify possible sources of evidence. The branches of forensic science are: physiological sciences, social sciences, forensic criminalistics and digital forensics. Moreover, digital forensics includes several categories relating to the investigation of various types of devices, media or artefacts. These categories are:
- computer forensics: the aim is to explain the current state of a digital artefact, such as a computer system, storage medium or electronic document;
- mobile device forensics: the aim is to recover digital evidence or data from a mobile device, such as images, call logs, SMS logs and so on;
- network forensics: the aim is the monitoring and analysis of network traffic (local, WAN/Internet, UMTS, etc.) to detect intrusions or, more generally, to find network evidence;
- forensic data analysis: the aim is to examine structured data to discover evidence, usually related to financial crime;
- database forensics: the aim relates to databases and their metadata.
The origin and historical development of digital forensics as a discipline of study and research are closely related to progress in information and communication technology in the modern era. In parallel with the changes in society due to new technologies and, in particular, the advent of the computer and electronic networks, there has been a change in the way evidence is collected, managed and analyzed. Indeed, in addition to the more traditional natural and physical elements, the procedures have come to include further evidence that, although equally capable of identifying an occurrence, is inextricably tied to a computer, a computer network or other electronic means. The birth of computer forensics can be traced back to 1984, when the FBI and other American investigative agencies began to use software for the extraction and analysis of data on personal computers. In the same period, the CART (Computer Analysis and Response Team) was created within the FBI, with the express purpose of seeking so-called digital evidence. This term denotes all information stored or transmitted in digital form that may have some probative value. While the term evidence, more precisely, captures the judicial nature of digital data, the term forensic emphasizes the procedural nature of the matter, literally, "to be presented to the Court". Digital forensics has a huge variety of applications. The most common applications are related to crime or cybercrime. Cybercrime is a growing problem for governments, businesses and private individuals.
- Governments: security of the country (terrorism, espionage, etc.) or social problems (child pornography, child trafficking and so on).
- Businesses: purely economic problems, for example industrial espionage.
- Private individuals: personal safety and possessions, for example phishing, identity theft.
Often, many techniques used in digital forensics are not formally defined, and the relation between technical procedure and the law is frequently not taken into consideration. From this conceptual perspective, the research work intends to define and optimize the procedures and methodologies of digital forensics in relation to Italian regulation, testing, analysing and defining best practice, where it is not yet defined, concerning common software. The research questions arise from the following observations:
1. The problem of cybercrime is becoming increasingly significant for governments, businesses and citizens.
- In relation to governments, cybercrime involves problems concerning national security, such as terrorism and espionage, and social questions, such as trafficking in children and child pornography.
- In relation to businesses, cybercrime entails problems concerning mainly economic issues, such as industrial espionage.
- In relation to citizens, cybercrime involves problems concerning personal security, such as identity theft and fraud.
2. Many techniques used within digital forensics are not formally defined.
3. The relation between procedures and legislation is not always applied and taken into consideration.
Deep Learning Methods for Malware and Intrusion Detection: A Systematic Literature Review
Android and Windows are the predominant operating systems used in mobile environments and on personal computers, and it is expected that their use will rise during the next decade. Malware is one of the main threats faced by these platforms as well as by Internet of Things (IoT) environments and the web. With time, these threats are becoming more and more sophisticated, and detecting them using traditional machine learning techniques is a hard task. Several research studies have shown that deep learning methods achieve comparatively better accuracy and can learn to efficiently detect and classify new malware samples. In this paper, we present a systematic literature review of recent studies that focused on intrusion and malware detection and classification in various environments using deep learning techniques. We searched five well-known digital libraries and collected a total of 107 papers that were published in scholarly journals or as preprints. We carefully read the selected literature and critically analyze it to find out which types of threats and which platforms researchers are targeting, and how accurately deep learning-based systems can detect new security threats. This survey will have a positive impact on the learning capabilities of beginners who are interested in starting their research in the area of malware detection using deep learning methods. From the detailed critical analysis, it is identified that CNNs, LSTMs, DBNs, and autoencoders are the most frequently used deep learning methods and have been used effectively in various application scenarios.