    FMAS: Fast Multi-Objective SuperNet Architecture Search for Semantic Segmentation

    We present FMAS, a fast multi-objective neural architecture search framework for semantic segmentation. FMAS subsamples the structure and pre-trained parameters of DeepLabV3+, without fine-tuning, dramatically reducing training time during search. To further reduce candidate evaluation time, we use a subset of the validation dataset during the search. Only the final, Pareto non-dominated, candidates are ultimately fine-tuned using the complete training set. We evaluate FMAS by searching for models that effectively trade accuracy and computational cost on the PASCAL VOC 2012 dataset. FMAS finds competitive designs quickly, e.g., taking just 0.5 GPU days to discover a DeepLabV3+ variant that reduces FLOPs and parameters by 10% and 20% respectively, for less than 3% increased error. We also search on an edge device called GAP8 and use its latency as the metric. FMAS is capable of finding a 2.2× faster network with a 7.61% MIoU loss.

    Comment: Accepted as a full paper by the TinyML Research Symposium 202
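
    The abstract's key step is keeping only the Pareto non-dominated candidates (those for which no other candidate is at least as good in every objective and strictly better in one) before fine-tuning. The sketch below illustrates that filtering step in Python; the candidate names, objective values, and function name are hypothetical and are not taken from the FMAS implementation.

    # Illustrative sketch only: a generic Pareto non-domination filter over
    # (error, FLOPs) pairs, as used conceptually in multi-objective NAS.
    # Candidate names and values below are hypothetical, not FMAS code.

    from typing import List, Tuple

    Candidate = Tuple[str, float, float]  # (name, error, flops); both objectives are minimized

    def pareto_front(candidates: List[Candidate]) -> List[Candidate]:
        """Return the candidates not dominated by any other candidate.

        Candidate a dominates b if a is no worse than b in every objective
        and strictly better in at least one.
        """
        front = []
        for name, err, flops in candidates:
            dominated = any(
                (e2 <= err and f2 <= flops) and (e2 < err or f2 < flops)
                for _, e2, f2 in candidates
            )
            if not dominated:
                front.append((name, err, flops))
        return front

    if __name__ == "__main__":
        # Hypothetical sub-networks sampled from a DeepLabV3+-like supernet,
        # each scored by (validation error, FLOPs in billions).
        evaluated = [
            ("subnet_a", 0.22, 9.0),
            ("subnet_b", 0.24, 7.5),
            ("subnet_c", 0.25, 8.0),  # dominated by subnet_b in both objectives
            ("subnet_d", 0.30, 5.0),
        ]
        for cand in pareto_front(evaluated):
            print(cand)

    In this toy example, subnet_c is discarded because subnet_b has both lower error and fewer FLOPs; the surviving candidates trace the accuracy/cost trade-off curve and would be the ones worth fully fine-tuning.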