
    MiR-126 enhances cisplatin chemosensitivity in hepatocellular carcinoma cells by targeting IRS1

    Purpose: To investigate the potential role of miR-126 in regulating the proliferation and cisplatin chemosensitivity of human hepatocellular carcinoma (HCC) cells. Methods: The expression of miR-126 was evaluated in clinical HCC specimens. MiR-126-mediated downregulation of Insulin Receptor Substrate 1 (IRS1) was determined by qRT-PCR, western blot and luciferase reporter assay. Cell Counting Kit-8 (CCK-8) and colony formation assays were performed to examine the proliferation of HCC cells. Results: Decreased expression of miR-126 was found in HCC tumors and was correlated with poor survival in HCC patients. In HCC cells, miR-126 targeted IRS1 for downregulation, through which miR-126 suppressed the growth of HCC cells and sensitized HCC cells to cisplatin treatment. Conclusion: MiR-126 impairs the proliferation and cisplatin chemoresistance of HCC cells by targeting IRS1. Keywords: miR-126, IRS1, HCC, cisplatin, chemosensitivity

    Generative Adversarial Mapping Networks

    Generative Adversarial Networks (GANs) have shown impressive performance in generating photo-realistic images. They fit generative models by minimizing a distance measure between the real image distribution and the generated data distribution. Several distance measures have been used, such as Jensen-Shannon divergence, f-divergence, and Wasserstein distance, and choosing an appropriate distance measure is very important for training the generative network. In this paper, we choose the maximum mean discrepancy (MMD) as the distance metric, which has several nice theoretical guarantees. In fact, the generative moment matching network (GMMN) (Li, Swersky, and Zemel 2015) is such a generative model: it contains only one generator network G trained by directly minimizing the MMD between the real and generated distributions. However, it fails to generate meaningful samples on challenging benchmark datasets such as CIFAR-10 and LSUN. To improve on GMMN, we propose to add an extra network F, called the mapper. F maps both the real data distribution and the generated data distribution from the original data space to a feature representation space R, and it is trained to maximize the MMD between the two mapped distributions in R, while the generator G tries to minimize that MMD. We call the new model generative adversarial mapping networks (GAMNs). We demonstrate that the adversarial mapper F can help G better capture the underlying data distribution. We also show that GAMN significantly outperforms GMMN and is superior to or comparable with other state-of-the-art GAN-based methods on the MNIST, CIFAR-10 and LSUN-Bedrooms datasets. Comment: 9 pages, 7 figures
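    The abstract describes a min-max game over MMD: the mapper F is trained to maximize MMD between the mapped real and generated distributions, while the generator G minimizes it. The following is a minimal sketch of that objective, assuming a Gaussian kernel for the MMD estimate and simple MLP stand-ins for G and F; the network sizes, bandwidth, batch sizes, and training loop are illustrative assumptions, not the authors' actual configuration.

```python
# Minimal sketch of a GAMN-style adversarial MMD objective (assumptions noted above).
import torch
import torch.nn as nn

def gaussian_kernel(x, y, bandwidth=1.0):
    # k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2)), evaluated pairwise.
    dists = torch.cdist(x, y) ** 2
    return torch.exp(-dists / (2 * bandwidth ** 2))

def mmd2(x, y, bandwidth=1.0):
    # Biased estimate of squared MMD between two sample batches.
    k_xx = gaussian_kernel(x, x, bandwidth).mean()
    k_yy = gaussian_kernel(y, y, bandwidth).mean()
    k_xy = gaussian_kernel(x, y, bandwidth).mean()
    return k_xx + k_yy - 2 * k_xy

# Illustrative generator G (noise -> data space) and mapper F (data space -> feature space R).
G = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 32))
F = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 8))
opt_g = torch.optim.Adam(G.parameters(), lr=1e-4)
opt_f = torch.optim.Adam(F.parameters(), lr=1e-4)

for step in range(1000):
    real = torch.randn(128, 32)          # placeholder for a batch of real data
    fake = G(torch.randn(128, 16))       # generated batch

    # The mapper F is trained to maximize MMD between the mapped distributions ...
    loss_f = -mmd2(F(real), F(fake.detach()))
    opt_f.zero_grad(); loss_f.backward(); opt_f.step()

    # ... while the generator G tries to minimize the same MMD in feature space.
    loss_g = mmd2(F(real), F(G(torch.randn(128, 16))))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```

    In this reading, F plays a role analogous to a GAN discriminator: rather than classifying real versus fake, it learns a feature space in which the MMD between the two distributions is as visible as possible, and G is trained to close that gap.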