    Strategic attacks on trust models via bandit optimization

    Trust and reputation systems are designed to mitigate the risks of relying on systems over which there is no direct control. The effectiveness of trust models is typically evaluated against relatively shallow metrics that assume unsophisticated attackers. In reality, such systems may be open to strategic attacks, which need to be investigated in depth if trust model resilience is to be more fully understood. Here, we devise an orchestrated attack strategy against a specific state-of-the-art statistical trust model (HABIT), and evaluate how these intelligent attack strategies can influence the model's predictions of target trustworthiness. Our conjecture is that this approach represents a stronger benchmark for the assessment of trust models in general.
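
    To make the bandit-optimization framing concrete, the sketch below shows how an attacker might treat candidate attack behaviours as arms of a multi-armed bandit and learn which behaviour most shifts a trust model's prediction. This is an illustrative assumption, not the paper's actual method: the paper's interaction with HABIT is not specified here, so `simulated_trust_shift`, the UCB1 policy, and the three hypothetical behaviours are placeholders chosen for the example.

```python
import math
import random

class UCB1Attacker:
    """Illustrative UCB1 bandit choosing among candidate attack behaviours.

    Each arm stands for a hypothetical attack behaviour; the reward is assumed
    to be the observed shift in the trust model's prediction of the target's
    trustworthiness after that behaviour is played.
    """

    def __init__(self, n_arms):
        self.counts = [0] * n_arms    # times each behaviour was tried
        self.values = [0.0] * n_arms  # running mean reward per behaviour

    def select_arm(self):
        # Try each behaviour once before applying the UCB rule.
        for arm, count in enumerate(self.counts):
            if count == 0:
                return arm
        total = sum(self.counts)
        scores = [
            self.values[arm] + math.sqrt(2 * math.log(total) / self.counts[arm])
            for arm in range(len(self.counts))
        ]
        return max(range(len(scores)), key=scores.__getitem__)

    def update(self, arm, reward):
        self.counts[arm] += 1
        # Incremental update of the mean reward for this behaviour.
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]


def simulated_trust_shift(arm):
    """Stand-in for querying a trust model: noisy per-behaviour effect."""
    base = [0.10, 0.40, 0.25][arm]  # hypothetical effect sizes
    return base + random.gauss(0, 0.05)


attacker = UCB1Attacker(n_arms=3)
for _ in range(200):
    arm = attacker.select_arm()
    attacker.update(arm, simulated_trust_shift(arm))
print("estimated effect per behaviour:", [round(v, 3) for v in attacker.values])
```

    In this toy setting the bandit concentrates its plays on the behaviour with the largest estimated effect, which mirrors the idea of an orchestrated, adaptive attack rather than a fixed, pre-scripted one.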