MKP1 mediates resistance to therapy in HER2-positive breast tumors.
Mitogen-activated protein kinase phosphatase 1 (MKP1, also known as DUSP1) is an antiapoptotic phosphatase that is overexpressed in many cancers, including breast cancer. MKP1 expression is inducible in radiation-treated breast cancer cells and correlates with human epidermal growth factor receptor 2 (ERBB2, HER2) expression. The role of MKP1 in therapy resistance suggests that targeting MKP1 in HER2-positive breast tumors may significantly enhance the efficacy of anti-HER2 and other anticancer therapies.
A method to take account of inhomogeneity in mechanical component reliability calculations
This paper proposes a method by which material inhomogeneity may be taken into account in a reliability calculation. The method employs Monte-Carlo simulation and introduces a material strength index, together with a standard deviation of material strength, to model the variation in strength throughout a component's volume. The method is compared to conventional load-strength interference theory: the results are identical for the case of a homogeneous material, but for the same load the reliability is shown to decrease as the component volume increases. The case of a tensile bar is used to explore the variation of reliability with component volume. A minimal sketch of the weakest-link intuition follows.
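To make the weakest-link intuition concrete, here is a minimal Python sketch of a Monte-Carlo load-strength interference calculation. It is not the paper's implementation: the function names, the normal load and strength distributions, and the use of a sub-element count as a stand-in for component volume are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def reliability_mc(load_mean, load_sd, strength_mean, strength_sd,
                   n_elements=1, n_trials=100_000):
    """Monte-Carlo load-strength interference reliability estimate.

    n_elements models inhomogeneity: the component is treated as
    n_elements volume elements with independently sampled strengths,
    and the weakest element governs failure (weakest-link assumption).
    n_elements=1 reduces to conventional homogeneous interference theory.
    """
    loads = rng.normal(load_mean, load_sd, size=n_trials)
    strengths = rng.normal(strength_mean, strength_sd,
                           size=(n_trials, n_elements)).min(axis=1)
    return np.mean(strengths > loads)  # fraction of trials that survive

# Same load and nominal strength throughout; reliability drops as the
# element count (a proxy for volume) grows, matching the tensile-bar trend.
for n in (1, 10, 100):
    print(n, reliability_mc(400.0, 30.0, 500.0, 40.0, n_elements=n))
```

With n_elements = 1 the estimate coincides with conventional interference theory; increasing the element count reproduces the reported effect of reliability falling with volume at a fixed load.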
A Simple Proximal Stochastic Gradient Method for Nonsmooth Nonconvex Optimization
We analyze stochastic gradient algorithms for optimizing nonconvex, nonsmooth
finite-sum problems. In particular, the objective function is given by the
summation of a differentiable (possibly nonconvex) component, together with a
possibly non-differentiable but convex component. We propose a proximal
stochastic gradient algorithm based on variance reduction, called ProxSVRG+.
Our main contribution lies in the analysis of ProxSVRG+. It recovers several
existing convergence results and improves/generalizes them (in terms of the
number of stochastic gradient oracle calls and proximal oracle calls). In
particular, ProxSVRG+ generalizes the best results given by the SCSG algorithm,
recently proposed by [Lei et al., 2017] for the smooth nonconvex case.
ProxSVRG+ is also more straightforward than SCSG and admits a simpler analysis.
Moreover, ProxSVRG+ outperforms deterministic proximal gradient descent (ProxGD) for a wide range of minibatch sizes, which partially solves an open problem posed in [Reddi et al., 2016b]. ProxSVRG+ also requires far fewer proximal oracle calls than ProxSVRG [Reddi et al., 2016b]. Furthermore, for nonconvex functions satisfying the Polyak-Łojasiewicz (PL) condition, we prove that ProxSVRG+ achieves a global linear convergence rate without restart, unlike ProxSVRG; it can thus automatically switch to the faster linear convergence in any region where the objective function satisfies the PL condition locally. In this setting, ProxSVRG+ again improves on ProxGD and ProxSVRG/SAGA and generalizes the results of SCSG. Finally, we conduct several experiments, and the experimental results are consistent with the theoretical results.

Comment: 32nd Conference on Neural Information Processing Systems (NeurIPS 2018).
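To illustrate the class of methods the abstract analyzes, the sketch below shows one outer epoch of a generic proximal SVRG-style update in Python: snapshot the full gradient once, then take variance-reduced stochastic gradient steps, each followed by a proximal step on the convex nonsmooth part (an L1 penalty here). This is a hedged illustration, not ProxSVRG+ itself: the least-squares objective, the step size, the batch and epoch sizes, and the helper names (prox_l1, grad_fi, proxsvrg_epoch) are assumptions; the paper's specific minibatch and epoch-length choices are what distinguish ProxSVRG+ in its analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def prox_l1(x, t):
    """Proximal operator of t * ||x||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def grad_fi(x, a_i, b_i):
    """Gradient of one least-squares component f_i(x) = 0.5 * (a_i @ x - b_i)^2."""
    return a_i * (a_i @ x - b_i)

def proxsvrg_epoch(x, A, b, eta=0.01, lam=0.1, inner_steps=50, batch=8):
    """One outer epoch of a generic proximal SVRG-style method (sketch).

    Snapshot the full gradient at x_snap, then take inner steps using the
    variance-reduced estimate v = grad_i(x) - grad_i(x_snap) + full_grad,
    followed by a proximal step handling the convex nonsmooth term.
    """
    n = len(b)
    x_snap = x.copy()
    full_grad = np.mean([grad_fi(x_snap, A[i], b[i]) for i in range(n)], axis=0)
    for _ in range(inner_steps):
        idx = rng.integers(n, size=batch)
        v = np.mean([grad_fi(x, A[i], b[i]) - grad_fi(x_snap, A[i], b[i])
                     for i in idx], axis=0) + full_grad
        x = prox_l1(x - eta * v, eta * lam)  # proximal step on the L1 part
    return x

# Toy usage: sparse least squares, min (0.5/n) * ||Ax - b||^2 + lam * ||x||_1
A = rng.normal(size=(200, 20))
x_true = np.zeros(20)
x_true[:3] = 1.0
b = A @ x_true + 0.01 * rng.normal(size=200)
x = np.zeros(20)
for _ in range(30):
    x = proxsvrg_epoch(x, A, b)
print(np.round(x[:5], 2))  # first coordinates approach the sparse x_true
```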