
    Exploiting Counter-Examples for Active Learning with Partial Labels

    This paper studies a new problem, \emph{active learning with partial labels} (ALPL). In this setting, an oracle annotates the query samples with partial labels, relieving the oracle of the demanding requirement of accurate labeling. To address ALPL, we first build an intuitive baseline that can be seamlessly incorporated into existing AL frameworks. Though effective, this baseline remains susceptible to \emph{overfitting} and falls short of selecting representative partial-label-based samples during the query process. Drawing inspiration from human inference in cognitive science, where accurate inferences can be explicitly derived from \emph{counter-examples} (CEs), our objective is to leverage this human-like learning pattern to tackle overfitting while enhancing the selection of representative samples in ALPL. Specifically, we construct CEs by reversing the partial labels for each instance, and we then propose a simple but effective WorseNet that learns directly from this complementary pattern. By leveraging the distribution gap between WorseNet and the predictor, this adversarial evaluation manner improves both the predictor itself and the sample selection process, allowing the predictor to capture more accurate patterns in the data. Experimental results on five real-world datasets and four benchmark datasets show that our proposed method achieves comprehensive improvements over ten representative AL frameworks, highlighting the superiority of WorseNet. The source code will be available at \url{https://github.com/Ferenas/APLL}. Comment: 29 pages, under review
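    The counter-example construction described above can be illustrated with a minimal sketch: a partial label marks a set of candidate classes, and reversing it yields the complementary set of definitely-wrong classes that WorseNet learns from. The function name and representation below are illustrative assumptions, not the paper's actual implementation.

    ```python
    # Sketch: constructing a counter-example (CE) by reversing a partial label.
    # A partial label is the set of candidate classes for an instance; its CE
    # is the complementary set of classes known NOT to be the true label.
    # Names and data layout here are illustrative, not from the paper's code.

    def reverse_partial_label(partial, num_classes):
        """Return the complementary (counter-example) label set for one instance."""
        return sorted(set(range(num_classes)) - set(partial))

    # Example: with 5 classes and candidate set {1, 3},
    # the CE marks classes {0, 2, 4} as certainly wrong.
    ce = reverse_partial_label([1, 3], num_classes=5)
    print(ce)  # [0, 2, 4]
    ```

    A network trained on such CEs sees a complementary supervision signal, which is what allows the distribution gap between it and the ordinary predictor to be exploited.
    
    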

    Context-dependent compensation among phosphatidylserine-recognition receptors

    Phagocytes express multiple phosphatidylserine (PtdSer) receptors that recognize apoptotic cells. It is unknown whether these receptors are interchangeable or whether they play unique roles during cell clearance. Loss of the PtdSer receptor MerTK is associated with apoptotic corpse accumulation in the testes and degeneration of photoreceptors in the eye. Both phenotypes are linked to impaired phagocytosis by specialized phagocytes: Sertoli cells and the retinal pigmented epithelium (RPE). Here, we overexpressed the PtdSer receptor BAI1 in mice lacking MerTK (Mertk(-/-) Bai1(Tg)) to evaluate PtdSer receptor compensation in vivo. While Bai1 overexpression rescues the clearance of apoptotic germ cells in the testes of Mertk(-/-) mice, it fails to enhance RPE phagocytosis or prevent photoreceptor degeneration. To determine why MerTK is critical to RPE function, we examined visual cycle intermediates and performed unbiased RNAseq analysis of RPE from Mertk(+/+) and Mertk(-/-) mice. Prior to the onset of photoreceptor degeneration, Mertk(-/-) mice showed reduced accumulation of retinyl esters and dysregulation of a striking array of genes, including genes related to phagocytosis, metabolism, and retinal disease in humans. Collectively, these experiments establish that not all phagocytic receptors are functionally equal and that compensation among specific engulfment receptors is context and tissue dependent.

    A Modified Differential Coherent Bit Synchronization Algorithm for BeiDou Weak Signals with Large Frequency Deviation

    BeiDou system navigation messages are modulated with a 1 kbps secondary NH (Neumann-Hoffman) code, whose frequent bit transitions limit the coherent integration time to 1 millisecond. A bit synchronization algorithm is therefore necessary to obtain bit edges and NH code phases. To achieve bit synchronization for BeiDou weak signals with large frequency deviation, a bit synchronization algorithm based on differential coherence and maximum likelihood is proposed. First, a differential coherent approach is used to remove the effect of frequency deviation, with the differential delay time set to a multiple of the bit cycle to remove the influence of the NH code. Second, maximum likelihood detection is used to improve the detection probability of weak signals. Finally, Monte Carlo simulations are conducted to compare the detection performance of the proposed algorithm against a traditional algorithm at C/N0 values of 20-40 dB-Hz and different frequency deviations. The results show that the proposed algorithm outperforms the traditional method at a frequency deviation of 50 Hz: it effectively removes the effect of the BeiDou NH code and weakens the influence of frequency deviation. Real data tests confirm the feasibility of the proposed algorithm, which is thus suitable for BeiDou weak-signal bit synchronization with large frequency deviation.
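    The key property behind the differential coherent step can be sketched numerically: multiplying the signal by the conjugate of a copy delayed by one bit period leaves a constant phase term that depends only on the frequency deviation times the delay, so the residual carrier no longer smears the accumulation. The parameter values below are illustrative, not taken from the paper.

    ```python
    import numpy as np

    # Sketch: differential coherent detection cancelling a carrier frequency
    # offset. For s[n] = exp(j*2*pi*f_dev*n/fs), the differential product
    # s[n] * conj(s[n - D]) equals exp(j*2*pi*f_dev*D/fs) for every n, a
    # constant phasor, so the products accumulate in phase despite the offset.
    # Values below (fs, f_dev, bit length) are illustrative only.

    fs = 1000.0       # sample rate in Hz (illustrative)
    f_dev = 50.0      # residual carrier frequency deviation in Hz
    bit_len = 20      # samples per bit period; delay D = one bit cycle
    n = np.arange(200)
    sig = np.exp(2j * np.pi * f_dev * n / fs)   # residual carrier after wipe-off

    # Differential product: each element pairs a sample with the sample one
    # bit cycle earlier.
    diff = sig[bit_len:] * np.conj(sig[:-bit_len])

    # All differential samples share the same phase 2*pi*f_dev*bit_len/fs.
    phases = np.angle(diff)
    print(np.allclose(phases, phases[0]))  # True: constant phase, coherent sum
    ```

    Setting the delay to a multiple of the bit cycle additionally aligns the NH code, so its sign flips cancel in the product as well.
    
    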

    Research on Lightweight Disaster Classification Based on High-Resolution Remote Sensing Images

    With natural disasters occurring more frequently, classifying and identifying them quickly has become very important. We propose a lightweight disaster classification model with lower computation and parameter counts and higher accuracy than other classification models. To this end, this paper proposes the SDS-Network algorithm, built on ResNet, to handle high-resolution remote sensing images. First, it implements a spatial attention mechanism to improve accuracy; then, depthwise separable convolution is introduced to reduce the model's computation and parameter counts while preserving accuracy; finally, performance is further improved by tuning selected hyperparameters. The experimental results show that, compared with the classic AlexNet, ResNet18, VGG16, VGG19, and DenseNet121 classification models, the SDS-Network algorithm achieves higher accuracy, and compared with the lightweight MobileNet, ShuffleNet, SqueezeNet, and MnasNet series, it has lower model complexity and a higher accuracy rate. A comprehensive performance comparison across the charts in this article further shows that SDS-Network also outperforms the RegNet series. Finally, verification on public data sets shows that the SDS-Network algorithm has good generalization ability and portability, making it well suited for disaster classification tasks.
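    The parameter saving from depthwise separable convolution, the main lightweighting device mentioned above, can be sketched with a simple count: a standard k x k convolution mapping C_in to C_out channels uses k*k*C_in*C_out weights, while the separable form uses k*k*C_in (depthwise) plus C_in*C_out (1x1 pointwise). The channel sizes below are illustrative, not figures from the SDS-Network paper.

    ```python
    # Sketch: weight counts for standard vs. depthwise separable convolution
    # (biases ignored). Channel sizes here are illustrative examples, not
    # values from the SDS-Network architecture.

    def conv_params(k, c_in, c_out):
        """Weights in a standard k x k convolution."""
        return k * k * c_in * c_out

    def separable_params(k, c_in, c_out):
        """Weights in a depthwise (k x k per channel) + pointwise (1x1) pair."""
        return k * k * c_in + c_in * c_out

    k, c_in, c_out = 3, 64, 128
    std = conv_params(k, c_in, c_out)       # 73728 weights
    sep = separable_params(k, c_in, c_out)  # 8768 weights
    print(std, sep, round(std / sep, 1))    # roughly 8.4x fewer parameters
    ```

    The ratio approaches k*k for wide layers, which is why swapping this block into a ResNet-style backbone cuts both parameters and multiply-accumulate operations substantially.
    
    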