Out-of-distribution (OOD) detection is critical for ensuring the reliability
of open-world intelligent systems. Despite the notable advancements in existing
OOD detection methodologies, our study identifies a significant performance
drop when training samples are scarce. In this context, we introduce a
novel few-shot OOD detection benchmark, carefully constructed to address this
gap. Our empirical analysis reveals the superiority of Parameter-Efficient
Fine-Tuning (PEFT) strategies, such as visual prompt tuning and visual adapter
tuning, over conventional techniques, including full fine-tuning and linear
probing, in the few-shot OOD detection task. Recognizing that crucial
information from the pre-trained model, which is pivotal for OOD detection, may
be lost during the fine-tuning process, we propose a method termed
Domain-Specific and General Knowledge Fusion (DSGF). This approach is designed
to be compatible with diverse fine-tuning frameworks. Our experiments show that
the integration of DSGF significantly enhances the few-shot OOD detection
capabilities across various OOD detection methods and fine-tuning paradigms,
including full fine-tuning, visual adapter tuning, and visual prompt tuning.
The code will be released.
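
To make the fusion idea concrete, the following is a minimal sketch of the kind of mechanism the abstract describes: keeping a frozen copy of the pre-trained backbone alongside the fine-tuned one and combining their features before OOD scoring. The fusion operator (simple concatenation), the linear head, and the energy-based score used here are illustrative assumptions, not the paper's actual DSGF implementation.

```python
import torch
import torch.nn as nn

class FusedOODScorer(nn.Module):
    """Illustrative sketch (not the paper's implementation): fuse general
    features from a frozen pre-trained backbone with domain-specific features
    from a fine-tuned backbone, then compute an OOD score."""

    def __init__(self, pretrained_backbone: nn.Module,
                 finetuned_backbone: nn.Module,
                 feat_dim: int, num_classes: int):
        super().__init__()
        self.general = pretrained_backbone    # frozen: general knowledge
        self.specific = finetuned_backbone    # tuned (fully / adapter / prompt)
        for p in self.general.parameters():
            p.requires_grad = False
        # Assumed fusion: concatenate the two feature vectors.
        self.head = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        with torch.no_grad():
            f_general = self.general(x)       # pre-trained (general) features
        f_specific = self.specific(x)         # fine-tuned (domain-specific) features
        fused = torch.cat([f_general, f_specific], dim=-1)
        return self.head(fused)               # logits for in-distribution classes

    @torch.no_grad()
    def ood_score(self, x: torch.Tensor) -> torch.Tensor:
        # Energy score as one common choice; higher => more likely in-distribution.
        logits = self.forward(x)
        return torch.logsumexp(logits, dim=-1)
```

In such a setup, the fusion head would be trained on the few-shot in-distribution data, and the score threshold calibrated on held-out in-distribution samples.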