Injection of IP packet loss is a versatile method for emulating real-world
network conditions in performance studies. To reproduce realistic
packet-loss patterns, stochastic fault models are used. In this report we
describe our implementation of a Linux kernel module using a Continuous-Time
Gilbert Model for packet-loss injection.