A Family of Subgradient-Based Methods for Convex Optimization Problems in a Unifying Framework
We propose a new family of subgradient- and gradient-based methods that converge with optimal complexity for convex optimization problems whose feasible region is simple enough. This includes cases where the objective function is non-smooth, smooth, has composite/saddle structure, or is given by an inexact oracle model. We unify the construction of the subproblems that must be solved at each iteration of these methods, which lets us analyze their convergence in a unified way, whereas previous results required a different analysis for each method/algorithm. Our contribution relies on two well-known methods in non-smooth convex optimization: the mirror-descent method of Nemirovski-Yudin and the dual-averaging method of Nesterov. Our family therefore includes these and many other methods as particular cases. For instance, the proposed family of classical gradient methods and their accelerations generalizes the methods of Devolder et al., Nesterov's primal/dual gradient methods, and Tseng's accelerated proximal gradient methods. Some members of our family also arise as special cases of other universal methods. As an additional contribution, the novel extended mirror-descent method removes both the compactness assumption on the feasible region and the need to fix the total number of iterations in advance, which the original mirror-descent method requires in order to attain the optimal complexity.
Comment: 31 pages. v3: Major revision. Research Report B-477, Department of Mathematical and Computing Sciences, Tokyo Institute of Technology, February 201
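To make the classical starting point concrete, here is a minimal Python sketch of entropic mirror descent on the probability simplex, the textbook special case underlying the family described above. The test objective and all names are illustrative choices, not the paper's construction, and the fixed-horizon step size shown is exactly the requirement the extended method is designed to remove.

    import numpy as np

    def mirror_descent_simplex(subgrad, x0, num_iters, step):
        # Entropic mirror descent: with the negative-entropy mirror map,
        # the update is an exponentiated-gradient step followed by a
        # Bregman projection (renormalization) onto the simplex. The
        # averaged iterate attains the optimal O(1/sqrt(K)) rate for
        # Lipschitz non-smooth convex objectives.
        x = np.asarray(x0, dtype=float)
        avg = np.zeros_like(x)
        for _ in range(num_iters):
            g = subgrad(x)               # any subgradient at x
            x = x * np.exp(-step * g)    # mirror (multiplicative) step
            x /= x.sum()                 # projection onto the simplex
            avg += x
        return avg / num_iters

    # Illustrative problem: minimize f(x) = max_i <a_i, x> over the simplex.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(5, 20))
    subgrad = lambda x: A[np.argmax(A @ x)]   # a subgradient of the max
    n, K = A.shape[1], 2000
    # Classical fixed-horizon step; assumes sup-norm of subgradients <= 1,
    # rescale by the Lipschitz constant otherwise.
    step = np.sqrt(2 * np.log(n) / K)
    x_hat = mirror_descent_simplex(subgrad, np.full(n, 1.0 / n), K, step)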
An Investigation of Sponsorship Opportunities in Athletic Training Rooms of NCAA Universities
The purpose of this study was to identify the barriers, avenues, and possibilities for marketing intercollegiate athletic training rooms. In particular, this study examined potential sources of support for athletic training rooms by addressing a) current trends in sponsorship within athletic training rooms; b) marketing tactics used to substantiate sponsorships in athletic training rooms; c) which existing marketing tactics have the greatest potential for growth; and d) the need for athletic training rooms to acquire and maintain sponsorship.
In this study, an online survey and a voluntary telephone interview were administered to head athletic trainers representing athletic training rooms at Division I-A, I-AA, II, and III institutions. The institutions were chosen based on their football programs as listed in the 2003 National Association of Collegiate Directors of Athletics Directory.
According to the data collected, 75.6% of the participants desired sponsorship. Current trends in sponsorships within athletic training rooms revealed that 13.3% of the participating athletic trainers utilized sponsorships as the primary outside revenue source. Volunteer services to an athletic training room were the most utilized form of sponsorship. Nearly 75% of the participants desired equipment sponsorship in athletic training rooms, and equipment was also the category to which participants most desired to allocate additional money when available.
Even though there are potential sponsors for athletic training rooms, 38.3% of the participants had no sponsorship within their athletic training rooms. Athletic trainers utilized donations, fundraising, and sponsorship to financially strengthen an athletic training room. However, the majority of athletic training rooms received outside funding of $999 or less in the 2003-2004 academic year.
Only 3.5% of sponsorships were initiated by sponsors, so athletic trainers cannot simply wait for sponsorships to happen naturally in their athletic training rooms. This research showed that securing sponsorships in athletic training rooms is entirely feasible. Athletic trainers interested in beginning to market their facilities should also consider how to think outside of the traditional box and act accordingly. This study guides intercollegiate athletic trainers in beginning to market their athletic training rooms, so that they can discover greater possibilities within marketing the athletic training profession.
Automorphisms of rank-one generated hyperbolicity cones and their derivative relaxations
A hyperbolicity cone is said to be rank-one generated (ROG) if all its
extreme rays have rank one, where the rank is computed with respect to the
underlying hyperbolic polynomial. This is a natural class of hyperbolicity
cones that strictly generalizes the ROG spectrahedral cones. In this
work, we present a study of the automorphisms of ROG hyperbolicity cones and
their derivative relaxations. One of our main results states that the
automorphisms of the derivative relaxations are exactly the automorphisms of
the original cone fixing a certain direction. As an application, we completely
determine the automorphisms of the derivative relaxations of the nonnegative
orthant and of the cone of positive semidefinite matrices. More generally, we
also prove relations between the automorphisms of a spectral cone and the
underlying permutation-invariant set, which might be of independent interest.
Comment: 25 pages. Some minor fixes and changes. To appear in the SIAM Journal on Applied Algebra and Geometry
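For readers meeting these terms for the first time, here is a standard worked example in LaTeX of a hyperbolicity cone and its first derivative relaxation, using the common notation \Lambda_+(p,e) for the hyperbolicity cone. This is classical material in the sense of Garding and Renegar, not a result of the paper above.

    % p is hyperbolic along e: t -> p(x - t e) has only real roots,
    % and the nonnegative orthant is the associated hyperbolicity cone.
    \[
      p(x) = \prod_{i=1}^{n} x_i, \qquad e = (1, \dots, 1),
      \qquad \Lambda_+(p, e) = \{ x : x_i \ge 0 \text{ for all } i \}.
    \]
    % The first derivative relaxation replaces p by its directional
    % derivative along e, the elementary symmetric polynomial of
    % degree n - 1:
    \[
      D_e p(x) = \sum_{i=1}^{n} \prod_{j \ne i} x_j = e_{n-1}(x),
      \qquad \Lambda_+(D_e p, e) \supsetneq \Lambda_+(p, e).
    \]
    % Each derivative step yields a strictly larger cone on which the
    % lower-degree polynomial is still hyperbolic along e; for n = 2
    % the relaxed cone is already the half-space x_1 + x_2 >= 0.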
A Parameter-Free Conditional Gradient Method for Composite Minimization under Hölder Condition
In this paper we consider a composite optimization problem that minimizes the
sum of a weakly smooth function and a convex function with either a bounded
domain or a uniformly convex structure. In particular, we first present a
parameter-dependent conditional gradient method for this problem, whose step
sizes require prior knowledge of the parameters associated with the Hölder
continuity of the gradient of the weakly smooth function, and establish its
rate of convergence. Given that these parameters could be unknown, or known but
possibly conservative, such a method may suffer from implementation issues or
slow convergence. We therefore propose a parameter-free conditional gradient
method whose step size is determined by using a constructive local quadratic
upper approximation and an adaptive line search scheme, without using any
problem parameter. We show that this method achieves the same rate of
convergence as the parameter-dependent conditional gradient method. Preliminary
experiments are also conducted and illustrate the superior performance of the
parameter-free conditional gradient method over the methods with some other
step size rules.
Comment: 33 pages, 3 figures
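To illustrate the flavor of such a scheme, here is a minimal Python sketch of a conditional gradient (Frank-Wolfe) method whose step size comes from backtracking on a local quadratic upper model, in the spirit of the abstract. The oracle, test problem, and backtracking constants are my own choices for a self-contained example, not the authors' exact method.

    import numpy as np

    def frank_wolfe_backtracking(f, grad, lmo, x0, num_iters, M0=1.0):
        # lmo(g) must return argmin of <g, v> over the feasible set
        # (the linear minimization oracle). The curvature estimate M is
        # adapted online, so no Holder/Lipschitz constants are needed.
        x = np.asarray(x0, dtype=float)
        M = M0
        for _ in range(num_iters):
            g = grad(x)
            d = lmo(g) - x                  # Frank-Wolfe direction
            gap = -g @ d                    # FW gap, an optimality measure
            if gap <= 1e-12:
                break
            M *= 0.5                        # let the estimate shrink first
            while True:
                step = min(1.0, gap / (M * (d @ d)))
                # accept once the quadratic model upper-bounds f
                if f(x + step * d) <= f(x) - step * gap \
                        + 0.5 * M * step**2 * (d @ d):
                    break
                M *= 2.0                    # backtrack: raise the curvature
            x = x + step * d
        return x

    # Illustrative problem: min 0.5*||Ax - b||^2 over the l1 ball.
    rng = np.random.default_rng(0)
    A, b = rng.normal(size=(30, 100)), rng.normal(size=30)
    f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
    grad = lambda x: A.T @ (A @ x - b)
    def lmo(g):                             # a minimizing vertex of the ball
        v = np.zeros_like(g)
        i = np.argmax(np.abs(g))
        v[i] = -np.sign(g[i])
        return v
    x_hat = frank_wolfe_backtracking(f, grad, lmo, np.zeros(100), 500)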