Understanding Hackers' Work: An Empirical Study of Offensive Security Practitioners
Offensive security tests are a common way to proactively discover potential vulnerabilities. They are performed by specialists, often called penetration testers or white-hat hackers. The chronic shortage of available white-hat hackers prevents sufficient security-test coverage of software.
Research into automation tries to alleviate this problem by improving the
efficiency of security testing. To achieve this, researchers and tool builders
need a solid understanding of how hackers work, their assumptions, and pain
points.
In this paper, we present a first data-driven, exploratory qualitative study of twelve security professionals, their work, and the problems that arise therein. We perform a thematic analysis to gain insights into the execution of security assignments, hackers' thought processes, and the challenges they encounter.
This analysis allows us to conclude with recommendations for researchers and
tool builders to increase the efficiency of their automation and identify novel
areas for research.
Comment: 11 pages. We have chosen the category "Software Engineering" and not "Cryptography and Security" because, while this is a paper about security practices, we target software engineering researchers.
Exploring Text Mining and Analytics for Applications in Public Security: An in-depth dive into a systematic literature review
Text mining and related analytics emerge as a technological approach to support human activities in extracting useful knowledge from texts in several formats. From a managerial point of view, it can help organizations in planning and decision-making processes, providing information that was not previously evident in textual materials produced internally or even externally. In this context, within the public/governmental scope, public security agencies are great beneficiaries of the tools associated with text mining, in several aspects, from applications in the criminal area to the collection of people's opinions and sentiments about the actions taken to promote their welfare. This article reports details of a systematic literature review focused on identifying the main areas of text mining application in public security, the most recurrent technological tools, and future research directions. The searches covered four major article databases (Scopus, Web of Science, IEEE Xplore, and ACM Digital Library), selecting 194 materials published between 2014 and the first half of 2021, among journals, conferences, and book chapters. There were several findings concerning the targets of the literature review, as presented in the results of this article.
Defense Against Model Extraction Attacks on Recommender Systems
The robustness of recommender systems has become a prominent topic within the
research community. Numerous adversarial attacks have been proposed, but most rely on extensive prior knowledge: all white-box attacks, and most black-box attacks, assume that certain external knowledge is available. Among these attacks, the model extraction attack stands out as a
promising and practical method, involving training a surrogate model by
repeatedly querying the target model. However, there is a significant gap in
the existing literature when it comes to defending against model extraction
attacks on recommender systems. In this paper, we introduce Gradient-based
Ranking Optimization (GRO), which is the first defense strategy designed to
counter such attacks. We formalize the defense as an optimization problem,
aiming to minimize the loss of the protected target model while maximizing the
loss of the attacker's surrogate model. Since top-k ranking lists are
non-differentiable, we transform them into swap matrices which are instead
differentiable. These swap matrices serve as input to a student model that
emulates the surrogate model's behavior. By back-propagating the loss of the
student model, we obtain gradients for the swap matrices. These gradients are
used to compute a swap loss, which maximizes the loss of the student model. We
conducted experiments on three benchmark datasets to evaluate the performance
of GRO, and the results demonstrate its superior effectiveness in defending
against model extraction attacks.
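The abstract does not give GRO's exact swap-matrix construction, so the sketch below only illustrates the general idea it relies on, using a NeuralSort-style relaxation (an assumption on my part, not the paper's formulation): a non-differentiable top-k ranking is replaced by a row-stochastic "soft" permutation matrix of the item scores, through which a student model's loss could then be back-propagated.

```python
import numpy as np

def soft_permutation(scores, tau=1.0):
    """NeuralSort-style relaxation of the sort operator.

    Returns a row-stochastic matrix P where row i is a soft indicator
    of which item holds rank i (descending by score). As tau -> 0,
    P approaches the hard permutation matrix, so the non-differentiable
    ranking is replaced by a differentiable surrogate.
    """
    s = np.asarray(scores, dtype=float).reshape(-1, 1)
    n = s.shape[0]
    A = np.abs(s - s.T)                            # pairwise |s_i - s_j|
    b = A.sum(axis=0)                              # sum_k |s_j - s_k| per item j
    ranks = np.arange(1, n + 1).reshape(-1, 1)
    logits = ((n + 1 - 2 * ranks) * s.T - b) / tau
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    P = np.exp(logits)
    return P / P.sum(axis=1, keepdims=True)

scores = np.array([0.1, 0.9, 0.5])                 # hypothetical item scores
P = soft_permutation(scores, tau=0.01)
# At low temperature the soft matrix recovers the hard ranking:
# row 0 (top rank) peaks at item 1, row 1 at item 2, row 2 at item 0.
```

Because P is differentiable in the scores, a defender could feed such matrices to a student model that mimics the attacker's surrogate and use the resulting gradients to form a swap loss, in the spirit the abstract describes.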
Regulating competition in the digital network industry: A proposal for progressive ecosystem regulation
The digital sector is a cornerstone of the modern economy, and regulating digital enterprises can be considered the new frontier for regulators and competition authorities. To capture and address the competitive dynamics of digital markets we need to rethink our (competition) laws and regulatory strategies. The thesis develops new approaches to regulating digital markets by viewing them as part of a network industry. By combining insights from our experiences with existing regulation in telecommunications with insights from economics literature and management theory, the thesis concludes by proposing a new regulatory framework called ‘progressive ecosystem regulation’. The thesis is divided into three parts and has three key findings or contributions. The first part explains why digital platforms such as Google’s search engine, Meta’s social media platforms and Amazon’s Marketplace are prone to monopolization. Here, the thesis develops a theory of ‘digital natural monopoly’, which explains why competition in digital platform markets is likely to lead to concentration by its very nature. The second part of the thesis puts forward that competition in digital markets persists, even if there is monopoly in a market. Here, the thesis develops a conceptual framework for competition between digital ecosystems, which consist of groups of actors and products. Digital enterprises compete to carve out a part of the digital network industry where they can exert control, and their strong position in a platform market can be used offensively or defensively to steer competition between ecosystems. The thesis then sets out four phases of ecosystem competition, which helps to explain when competition in the digital network industry is healthy and when it is likely to become problematic. The third and final part of the thesis brings together these findings and draws lessons from our experiences of regulating the network industry for telecommunications.
Based on the insights developed in the thesis, it puts forward a proposal for ‘progressive ecosystem regulation’. The purpose of this regulation is to protect entrants against large digital ecosystems and empower them to develop new products and innovate disruptively. This regulatory framework would create three regulatory pools: a heavily regulated pool, a lightly regulated pool, and an entrant pool. The layered regulatory framework allows regulators to adjust relatively quickly who receives protection under the regulation and who faces the burdens, so that the regulatory framework reflects the fast pace of innovation and changing nature of digital markets. With this proposal, the thesis challenges and enriches our existing notions on regulation and specifically how we should regulate digital markets.
2023- The Twenty-seventh Annual Symposium of Student Scholars
The full program book from the Twenty-seventh Annual Symposium of Student Scholars, held on April 18-21, 2023. Includes abstracts from the presentations and posters.
A Survey on Malware Detection with Graph Representation Learning
Malware detection has become a major concern due to the increasing number and
complexity of malware. Traditional detection methods based on signatures and heuristics suffer from poor generalization to unknown attacks and can be easily circumvented using
obfuscation techniques. In recent years, Machine Learning (ML) and notably Deep
Learning (DL) achieved impressive results in malware detection by learning
useful representations from data and have become a solution preferred over
traditional methods. More recently, the application of such techniques on
graph-structured data has achieved state-of-the-art performance in various
domains and demonstrates promising results in learning more robust
representations from malware. Yet, no literature review focusing on graph-based
deep learning for malware detection exists. In this survey, we provide an
in-depth literature review to summarize and unify existing works under the
common approaches and architectures. We notably demonstrate that Graph Neural
Networks (GNNs) reach competitive results in learning robust embeddings from
malware represented as expressive graph structures, leading to efficient detection by downstream classifiers. This paper also reviews adversarial
attacks that are utilized to fool graph-based detection methods. Challenges and
future research directions are discussed at the end of the paper.
Comment: Preprint, submitted to ACM Computing Surveys in March 2023. For any suggestions or improvements, please contact me directly by e-mail.
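As a minimal illustration of the graph-based approach the survey covers, the sketch below hand-rolls one mean-aggregation message-passing layer (GraphSAGE-style) over a toy function-call graph; the graph, features, and weights are invented for illustration, and real systems would use a GNN library and richer program graphs (CFGs, call graphs, API-call graphs) with learned features.

```python
import numpy as np

# Toy function-call graph of a binary: 4 functions, directed call edges.
adj = np.array([[0, 1, 1, 0],
                [0, 0, 0, 1],
                [0, 0, 0, 1],
                [0, 0, 0, 0]], dtype=float)
A_hat = ((adj + adj.T) > 0).astype(float) + np.eye(4)  # undirected + self-loops
deg = A_hat.sum(axis=1, keepdims=True)

X = np.eye(4)                        # one-hot node features (placeholder)
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))          # layer weights (learned in practice)

def message_passing_layer(X, A_hat, deg, W):
    """Mean-aggregate neighbour features, then linear transform + ReLU."""
    H = (A_hat @ X) / deg            # each node averages its neighbourhood
    return np.maximum(H @ W, 0.0)

H = message_passing_layer(X, A_hat, deg, W)
graph_embedding = H.mean(axis=0)     # mean-pool readout: one vector per binary
# A downstream classifier would score graph_embedding as malicious/benign.
```

Stacking such layers lets each function's embedding absorb information from callers and callees several hops away, which is the robustness property the survey highlights.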
Rulemaking as Play: A Transdisciplinary Inquiry about Virtual Worldmaking
In the age of computing, we rely on software to manage our days, from the moment we wake up until we go to sleep. Software predicts the future based on actualized data from the past. It produces procedures instead of experiences and solutions instead of care. Software systems tend to perpetuate a normalized state of equilibrium. Their application in social media, predictive policing, and social profiling is increasingly erasing diversity in culture and identity. Our immediate reality is narrowing towards cultural conventions shared among the powerful few, whose voices directly influence contemporary digital culture.
On the other hand, computational collective intelligence can sometimes generate emergent forces to counter this tendency and force software systems to open up. Historically, artists from different artistic movements have adopted collaborative making to redefine the boundary of creative expression. Video gaming, especially open-world simulation games, is rapidly being adopted as an emerging form of communication, expression, and self-organization.
How can gaming conventions such as Narrative Emergence, Hacking, and Modding help us understand collective play as a countering force against the systematic tendency of normalization? How can people from diverse backgrounds come together to contemplate, make, and simulate rules and conditions for an alternative virtual world? What does it mean to design and virtually inhabit a world where rules are rewritten continuously by everyone, and no one is in control?