Interoperability-Guided Testing of QUIC Implementations using Symbolic Execution
The main reason for standardizing network protocols such as QUIC is
to ensure interoperability between implementations, which is a challenging
task. Manual tests are currently used to check the existing implementations
for interoperability, but given the complex nature of network protocols, it
is hard to cover all possible edge cases.
State-of-the-art automated software testing techniques, such as Symbolic
Execution (SymEx), have proven capable of analyzing complex real-world
software and finding hard-to-detect bugs. We present a SymEx-based method
for finding interoperability issues in QUIC implementations, and explore
its merit in a case study that analyzes the interoperability of picoquic and
QUANT. We find that, while SymEx can analyze deep interactions between
different implementations and uncovers several bugs, implementations need to
provide additional information about their current protocol state to enable
efficient interoperability testing.
Comment: 6 pages
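The interoperability-testing idea above can be illustrated with a much simpler differential check (this is a toy sketch, not the paper's SymEx tooling; the two parsers and the frame layout are invented for illustration): feed the same input to two independent decoders and flag any disagreement, which a symbolic-execution engine would do systematically over all input paths.

```python
# Hypothetical differential interoperability check: feed the same bytes to
# two independent frame parsers and report any disagreement. A SymEx engine
# would explore the input space systematically; here we enumerate a corpus.

def parse_impl_a(data: bytes):
    # Implementation A: rejects frames shorter than 2 bytes.
    if len(data) < 2:
        return None
    return {"type": data[0], "length": data[1]}

def parse_impl_b(data: bytes):
    # Implementation B: also accepts a 1-byte, type-only frame.
    if len(data) < 1:
        return None
    return {"type": data[0], "length": data[1] if len(data) > 1 else 0}

def find_divergences(inputs):
    """Return the inputs on which the two implementations disagree."""
    return [d for d in inputs if parse_impl_a(d) != parse_impl_b(d)]

corpus = [b"", b"\x01", b"\x01\x05", b"\x02\x00\xff"]
print(find_divergences(corpus))  # the 1-byte frame exposes the disagreement
```

Each divergence is a candidate interoperability bug: one peer will accept a frame that the other rejects.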
OpenSBT: A Modular Framework for Search-based Testing of Automated Driving Systems
Search-based software testing (SBT) is an effective and efficient approach
for testing automated driving systems (ADS). However, testing pipelines for ADS
testing are particularly challenging as they involve integrating complex
driving simulation platforms and establishing communication protocols and APIs
with the desired search algorithm. This complexity hinders wide adoption of
SBT and thorough empirical comparisons across different simulators
and search approaches. We present OpenSBT, an open-source, modular and
extensible framework to facilitate the SBT of ADS. With OpenSBT, it is possible
to integrate simulators with an embedded system under test, search algorithms
and fitness functions for testing. We describe the architecture and show the
usage of our framework by applying different search algorithms for testing
Automated Emergency Braking Systems in CARLA as well as in the high-fidelity
Prescan simulator in collaboration with our industrial partner DENSO. OpenSBT
is available at https://git.fortiss.org/opensbt
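The core loop of search-based testing can be sketched in a few lines (a toy model, not OpenSBT's actual API: the simulator stand-in, its physics constants, and the parameter ranges are invented for illustration). A fitness function scores how critical a simulated scenario is, and a search algorithm minimizes it to find failure-revealing scenarios.

```python
# Toy search-based testing loop in the spirit of SBT frameworks such as
# OpenSBT. The "simulator" and its parameters are invented for illustration.
import random

def simulate_aeb(ped_speed: float, trigger_dist: float) -> float:
    """Crude stand-in for a driving simulator: returns the minimal gap (m)
    between the ego vehicle and a crossing pedestrian. Smaller = more critical."""
    ego_speed = 10.0                                # m/s, fixed in this toy model
    braking_distance = ego_speed ** 2 / (2 * 6.0)   # constant 6 m/s^2 braking
    gap = trigger_dist - braking_distance - 2.0 * ped_speed
    return max(gap, 0.0)

def search_critical_scenario(iterations: int = 200, seed: int = 0):
    """Random search minimizing the fitness (minimal gap) to find a
    near-collision scenario; real frameworks plug in evolutionary search."""
    rng = random.Random(seed)
    best_params, best_fitness = None, float("inf")
    for _ in range(iterations):
        params = (rng.uniform(0.5, 3.0),    # pedestrian speed, m/s
                  rng.uniform(5.0, 30.0))   # AEB trigger distance, m
        fitness = simulate_aeb(*params)
        if fitness < best_fitness:
            best_params, best_fitness = params, fitness
    return best_params, best_fitness

params, fitness = search_critical_scenario()
print("most critical scenario:", params, "gap:", fitness)
```

A modular framework separates exactly these three pieces: the simulator adapter, the fitness function, and the search algorithm, so each can be swapped independently.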
Formal-Guided Fuzz Testing: Targeting Security Assurance from Specification to Implementation for 5G and Beyond
Softwarization and virtualization in 5G and beyond necessitate thorough
testing to ensure the security of critical infrastructure and networks,
requiring the identification of vulnerabilities and unintended emergent
behaviors from protocol designs to their software stack implementation. To
provide an efficient and comprehensive solution, we propose a novel and
first-of-its-kind approach that connects the strengths and coverage of formal
and fuzzing methods to efficiently detect vulnerabilities across protocol logic
and implementation stacks in a hierarchical manner. We design and implement
formal verification to detect attack traces in critical protocols, which are
used to guide subsequent fuzz testing and incorporate feedback from fuzz
testing to broaden the scope of formal verification. This innovative approach
significantly improves efficiency and enables the auto-discovery of
vulnerabilities and unintended emergent behaviors from the 3GPP protocols to
software stacks. Following this approach, we discover one identifier leakage
model, one DoS attack model, and two eavesdrop attack models due to the absence
of rudimentary MITM protection within the protocol, despite the existence of a
Transport Layer Security (TLS) solution to this issue for over a decade. More
remarkably, guided by the identified formal analysis and attack models, we
exploit 61 vulnerabilities using fuzz testing demonstrated on srsRAN platforms.
These identified vulnerabilities contribute to fortifying protocol-level
assumptions and refining the search space. Compared to state-of-the-art fuzz
testing, our combined formal and fuzzing methodology enables auto-assurance by
systematically discovering vulnerabilities. It significantly reduces
computational complexity, transforming an impractical exponential growth in
computational cost into linear growth.
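The guiding idea, attack traces from formal analysis seeding a mutational fuzzer, can be sketched minimally (this is an illustration only: the seed messages, mutation operator, and target stand-in are invented, not the paper's 3GPP models or srsRAN harness).

```python
# Sketch of "formal analysis guides fuzzing": counterexample traces from a
# model checker become seed inputs for a mutational fuzzer. The message
# format and the target below are invented for illustration.
import random

# Hypothetical attack traces produced by formal analysis of the protocol
# model, e.g. an exchange lacking integrity protection.
FORMAL_SEEDS = [b"REG|imsi=001|nosec", b"AUTH|token=0|replay"]

def mutate(msg: bytes, rng: random.Random) -> bytes:
    """Single random byte-flip mutation, keeping inputs near the seed."""
    if not msg:
        return msg
    pos = rng.randrange(len(msg))
    return msg[:pos] + bytes([msg[pos] ^ rng.randrange(1, 256)]) + msg[pos + 1:]

def target_stack(msg: bytes) -> bool:
    """Stand-in for the implementation under test: mishandles ('crashes' on,
    returns True for) any message without exactly two field separators."""
    return msg.count(b"|") != 2

def fuzz(rounds: int = 500, seed: int = 1):
    rng = random.Random(seed)
    findings = []
    for _ in range(rounds):
        candidate = mutate(rng.choice(FORMAL_SEEDS), rng)
        if target_stack(candidate):
            findings.append(candidate)
    return findings

print(len(fuzz()), "crashing inputs derived from formal seeds")
```

Starting from formally derived traces keeps the fuzzer in the interesting region of the input space instead of mutating blindly, which is what yields the claimed efficiency gain.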
Applying Formal Methods to Networking: Theory, Techniques and Applications
Despite its great importance, modern network infrastructure is remarkable for
the lack of rigor in its engineering. The Internet, which began as a research
experiment, was never designed to handle the users and applications it hosts
today. The lack of formalization of the Internet architecture meant limited
abstractions and modularity, especially for the control and management planes,
thus requiring a new protocol built from scratch for every new need. This led
to an unwieldy, ossified Internet architecture resistant to any attempts at
formal verification, and an Internet culture where expediency and pragmatism
are favored over formal correctness. Fortunately, recent work in the space of
clean slate Internet design---especially, the software defined networking (SDN)
paradigm---offers the Internet community another chance to develop the right
kind of architecture and abstractions. This has also led to a great resurgence
in interest of applying formal methods to specification, verification, and
synthesis of networking protocols and applications. In this paper, we present a
self-contained tutorial of the formidable amount of work that has been done in
formal methods, and present a survey of its applications to networking.
Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorials
Opaque Service Virtualisation: A Practical Tool for Emulating Endpoint Systems
Large enterprise software systems make many complex interactions with other
services in their environment. Developing and testing for production-like
conditions is therefore a very challenging task. Current approaches include
emulation of dependent services using either explicit modelling or
record-and-replay approaches. Models require deep knowledge of the target
services while record-and-replay is limited in accuracy. Both face
developmental and scaling issues. We present a new technique that improves the
accuracy of record-and-replay approaches, without requiring prior knowledge of
the service protocols. The approach uses Multiple Sequence Alignment to derive
message prototypes from recorded system interactions and a scheme to match
incoming request messages against prototypes to generate response messages. We
use a modified Needleman-Wunsch algorithm for distance calculation during
message matching. Our approach has shown greater than 99% accuracy for four
evaluated enterprise system messaging protocols. The approach has been
successfully integrated into the CA Service Virtualization commercial product
to complement its existing techniques.
Comment: In Proceedings of the 38th International Conference on Software
Engineering Companion (pp. 202-211)
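The distance calculation mentioned above can be sketched as a plain Needleman-Wunsch global alignment over message bytes (the scoring weights and the example prototypes here are illustrative assumptions, not the product's actual parameters; the paper uses a modified variant).

```python
# Minimal Needleman-Wunsch global alignment score between two messages,
# the kind of distance used to match an incoming request against recorded
# message prototypes. Scoring weights here are illustrative only.

def nw_score(a: bytes, b: bytes, match=1, mismatch=-1, gap=-1) -> int:
    """Dynamic-programming global alignment score; higher = more similar.
    Keeps only two DP rows, so memory is O(len(b))."""
    rows, cols = len(a) + 1, len(b) + 1
    prev = [j * gap for j in range(cols)]        # first row: all gaps
    for i in range(1, rows):
        curr = [i * gap] + [0] * (cols - 1)      # first column: all gaps
        for j in range(1, cols):
            diag = prev[j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            curr[j] = max(diag, prev[j] + gap, curr[j - 1] + gap)
        prev = curr
    return prev[-1]

def best_prototype(request: bytes, prototypes):
    """Pick the recorded prototype most similar to the incoming request."""
    return max(prototypes, key=lambda p: nw_score(request, p))

protos = [b"GET /stock?id=?", b"BUY /order id=? qty=?"]
print(best_prototype(b"GET /stock?id=42", protos))
```

Matching against a small set of aligned prototypes rather than raw recordings is what lets the emulated endpoint respond sensibly to requests it never saw verbatim.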
LZfuzz: a fast compression-based fuzzer for poorly documented protocols
Real-world infrastructure offers many scenarios where protocols (and other details) are not released due to being considered too sensitive or for other reasons. This situation makes it hard to apply fuzzing techniques to test their security and reliability, since their full documentation is only available to their developers, and domain developer expertise does not necessarily intersect with fuzz-testing expertise (nor deployment responsibility). State-of-the-art fuzzing techniques, however, work best when protocol specifications are available. Still, operators whose networks include equipment communicating via proprietary protocols should be able to reap the benefits of fuzz-testing them. In particular, administrators should be able to test proprietary protocols in the absence of end-to-end application-level encryption to understand whether they can withstand injection of bad traffic, and thus be able to plan adequate network protection measures. Such protocols can be observed in action prior to fuzzing, and packet captures can be used to learn enough about the structure of the protocol to make fuzzing more efficient. Various machine learning approaches, e.g. bioinformatics methods, have been proposed for learning models of the targeted protocols. The problem with most of these approaches to date is that, although sometimes quite successful, they are very computationally heavy and thus are hardly practical for application by network administrators and equipment owners who cannot easily dedicate a compute cluster to such tasks. We propose a simple method that, despite its roughness, allowed us to learn facts useful for fuzzing from protocol traces at much smaller CPU and time costs. Our fuzzing approach proved itself empirically in testing actual proprietary SCADA protocols in an isolated control network test environment, and was also successful in triggering flaws in implementations of several popular commodity Internet protocols. 
Our fuzzer, LZfuzz (pronounced "lazy-fuzz"), relies on a variant of the Lempel-Ziv compression algorithm to guess boundaries between the structural units of the protocol, and builds on the well-known free software GPF fuzzer.
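The compression-based boundary guessing can be illustrated with a plain LZ78-style parse (a rough sketch of the idea, not the actual LZfuzz code; the sample "capture" is invented): repeated structure in captured traffic yields longer dictionary phrases, so phrase boundaries tend to fall between protocol fields.

```python
# Rough sketch of LZfuzz's core idea: an LZ78-style parse of captured
# traffic. Phrase boundaries hint at the structural units of an
# undocumented protocol, which a fuzzer can then mutate field by field.

def lz78_phrases(data: bytes):
    """Split data into LZ78 phrases: each phrase is the longest
    previously unseen prefix, i.e. a seen phrase plus one new byte."""
    seen, phrases, current = set(), [], b""
    for byte in data:
        current += bytes([byte])
        if current not in seen:
            seen.add(current)
            phrases.append(current)
            current = b""
    if current:                      # trailing, already-seen fragment
        phrases.append(current)
    return phrases

# A keyword repeated across captured packets grows into longer phrases,
# while one-off payload bytes stay as short phrases.
capture = b"CMD:SET CMD:GET CMD:SET"
print(lz78_phrases(capture))
```

This costs little more than one pass over the capture, which is the point of the approach: usable boundary guesses at a fraction of the CPU cost of heavyweight protocol-inference methods.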
On the Experimental Evaluation of Vehicular Networks: Issues, Requirements and Methodology Applied to a Real Use Case
One of the most challenging fields in vehicular communications has been the
experimental assessment of protocols and novel technologies. Researchers
usually tend to simulate vehicular scenarios and/or partially validate new
contributions in the area by using constrained testbeds and carrying out minor
tests. In this line, the present work reviews the issues that pioneers in the
area of vehicular communications and, in general, in telematics, have to deal
with if they want to perform a good evaluation campaign by real testing. The
key needs for a good experimental evaluation are the use of proper software
tools for gathering testing data, post-processing and generating relevant
figures of merit and, finally, properly showing the most important results. For
this reason, a key contribution of this paper is the presentation of an
evaluation environment called AnaVANET, which covers the previous needs. By
using this tool and presenting a reference case study, a generic testing
methodology is described and applied. In this way, the use of the IPv6 protocol
over a vehicle-to-vehicle routing protocol, with support for IETF-based network
mobility, is tested while the main features of the AnaVANET system
are presented. This work contributes in laying the foundations for a proper
experimental evaluation of vehicular networks and will be useful for many
researchers in the area.
Comment: in EAI Endorsed Transactions on Industrial Networks and Intelligent
Systems, 201
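Two of the figures of merit such an evaluation environment typically derives from test traces can be computed very simply (a minimal sketch: the log format and timestamps below are invented, and the metric definitions are the generic ones, not necessarily AnaVANET's exact output).

```python
# Illustrative computation of two common figures of merit from matched
# sender/receiver packet logs: packet delivery ratio and mean one-way delay.
# The log contents here are invented example data.

sent = {                      # packet id -> send timestamp (s)
    1: 0.000, 2: 0.100, 3: 0.200, 4: 0.300,
}
received = {                  # packet id -> receive timestamp (s)
    1: 0.042, 2: 0.151, 4: 0.345,        # packet 3 was lost
}

def packet_delivery_ratio(sent, received) -> float:
    """Fraction of sent packets that were received."""
    return len(received) / len(sent)

def mean_delay(sent, received) -> float:
    """Mean one-way delay over the delivered packets (assumes synced clocks)."""
    delays = [received[i] - sent[i] for i in received]
    return sum(delays) / len(delays)

print(f"PDR = {packet_delivery_ratio(sent, received):.2f}")
print(f"mean delay = {mean_delay(sent, received) * 1000:.1f} ms")
```

In a real campaign the two logs come from GPS-synchronized captures on different vehicles, which is precisely the kind of data gathering and post-processing the tooling discussed above automates.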