Utah Suicide Prevention Plan 2017-2021
Living in Utah has many advantages, including the best snow on Earth and many beautiful national and state parks in which the opportunity for outdoor adventure is almost unlimited. Utah also ranks high in a number of health- and happiness-related outcomes. In spite of all that Utah has to offer, it continually ranks among the top ten states for suicide rates in the U.S., and people in Utah also experience higher rates of associated mood disorders. The Utah Suicide Prevention Coalition is dedicated to better understanding this paradox and to implementing prevention, intervention, and postvention strategies to decrease suicide and the suffering it brings.

Suicide is a major preventable public health problem in Utah and the 8th leading cause of death (2010-2015 inclusive). Every suicide death causes a ripple effect of immeasurable pain to individuals, families, and communities throughout the state. From 2009 to 2015, Utah's age-adjusted suicide rate was 19.9 per 100,000 persons, an average of 525 suicide deaths per year. Suicide was the second-leading cause of death for Utahns ages 10 to 39 in 2013 and the leading cause of death for youth ages 10-17. Many more people attempt suicide than die by suicide. The most recent data show that 6,039 Utahns were seen in emergency departments and 2,314 Utahns were hospitalized for self-inflicted injuries, including suicide attempts (UDOH Indicator-Based Information System for Public Health, 2014). One in fifteen Utah adults reports having had serious thoughts of suicide in the past year (SAMHSA National Survey on Drug Use and Health, 2008-2009). According to the Student Health and Risk Prevention Survey, 14.4% of youth in grades 6-12 reported seriously considering suicide, 6.7% attempted suicide one or more times, and 13.9% reported harming themselves without the intention of dying in the prior year.

While suicide is a leading cause of death and many people report thoughts of suicide, the topic is still largely met with silence and shame. It is critical for all of us to challenge this silence using both research and personal stories of recovery. Everyone plays a role in suicide prevention, and it is up to each one of us to help create communities in which people feel safe and supported in disclosing suicide risk, including mental illness and substance use problems. We need to break down the barriers that keep people from accessing care and support for prevention, early intervention, and crisis services. As you review this plan, we encourage you to identify how you can implement any of the strategies and help create suicide-safer communities.
Permanental Vectors
A permanental vector is a generalization of a vector with components that are
squares of the components of a Gaussian vector, in the sense that the matrix
that appears in the Laplace transform of the vector of Gaussian squares is not
required to be either symmetric or positive definite. In addition the power of
the determinant in the Laplace transform of the vector of Gaussian squares,
which is -1/2, is allowed to be any number less than zero.
It was not at all clear which vectors are permanental vectors. In this paper
we characterize all permanental vectors in and give applications to
permanental vectors in and to the study of permanental processes.
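For context, a common formulation from the permanental-process literature (a sketch of the usual convention, not taken from the abstract itself; the placement of the factor 1/2 varies between authors): a nonnegative vector $(\psi_1,\dots,\psi_n)$ is called permanental with kernel $G$ and index $\beta>0$ if
\[
\mathbb{E}\Big[\exp\Big(-\tfrac{1}{2}\sum_{i=1}^{n}\alpha_i\psi_i\Big)\Big]
= \det\big(I+\alpha G\big)^{-\beta},
\qquad \alpha=\operatorname{diag}(\alpha_1,\dots,\alpha_n),\ \alpha_i\ge 0.
\]
When $G$ is a symmetric positive-definite covariance matrix and $\beta=\tfrac{1}{2}$, one may take $\psi_i=\eta_i^2$ for a centered Gaussian vector $(\eta_1,\dots,\eta_n)$ with covariance $G$; the generalization described above drops the symmetry and positive-definiteness requirements on $G$ and allows the exponent to be any negative number.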
Computing covariant vectors, Lyapunov vectors, Oseledets vectors, and dichotomy projectors: a comparative numerical study
Covariant vectors, Lyapunov vectors, or Oseledets vectors are increasingly
being used for a variety of model analyses in areas such as partial
differential equations, nonautonomous differentiable dynamical systems, and
random dynamical systems. These vectors identify spatially varying directions
of specific asymptotic growth rates and obey equivariance principles. In recent
years new computational methods for approximating Oseledets vectors have been
developed, motivated by increasing model complexity and greater demands for
accuracy. In this numerical study we introduce two new approaches based on
singular value decomposition and exponential dichotomies and comparatively
review and improve two recent popular approaches of Ginelli et al. (2007) and
Wolfe and Samelson (2007). We compare the performance of the four approaches
via three case studies with very different dynamics in terms of symmetry,
spectral separation, and dimension. We also investigate which methods perform
well with limited data.
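For orientation, the classical QR (Benettin-style) sweep that underlies this family of methods can be sketched in a few lines. This is a minimal illustration, not one of the two new approaches introduced in the study; the Henon map and all parameter choices are assumed toy examples.

import numpy as np

def henon(x, a=1.4, b=0.3):
    # one step of the Henon map
    return np.array([1.0 - a * x[0] ** 2 + x[1], b * x[0]])

def henon_jac(x, a=1.4, b=0.3):
    # Jacobian of the map at the current point
    return np.array([[-2.0 * a * x[0], 1.0],
                     [b, 0.0]])

def qr_lyapunov(x0, n_steps=20000, n_transient=1000):
    x = np.asarray(x0, dtype=float)
    Q = np.eye(2)                      # orthonormal tangent frame
    lyap_sums = np.zeros(2)
    for k in range(n_transient + n_steps):
        A = henon_jac(x) @ Q           # push the frame forward along the orbit
        Q, R = np.linalg.qr(A)         # re-orthonormalize
        signs = np.sign(np.diag(R))    # fix signs so R has a positive diagonal
        Q, R = Q * signs, (R.T * signs).T
        if k >= n_transient:
            lyap_sums += np.log(np.diag(R))
        x = henon(x)
    # columns of Q approximate the backward Lyapunov vectors at the final point
    return lyap_sums / n_steps, Q

exponents, backward_vectors = qr_lyapunov([0.1, 0.1])
print(exponents)   # roughly (0.42, -1.62) for the standard Henon parameters

Methods for covariant vectors, such as that of Ginelli et al., add a backward pass on top of a forward sweep of this kind.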
Skip-Thought Vectors
We describe an approach for unsupervised learning of a generic, distributed
sentence encoder. Using the continuity of text from books, we train an
encoder-decoder model that tries to reconstruct the surrounding sentences of an
encoded passage. Sentences that share semantic and syntactic properties are
thus mapped to similar vector representations. We next introduce a simple
vocabulary expansion method to encode words that were not seen as part of
training, allowing us to expand our vocabulary to a million words. After
training our model, we extract and evaluate our vectors with linear models on 8
tasks: semantic relatedness, paraphrase detection, image-sentence ranking,
question-type classification and 4 benchmark sentiment and subjectivity
datasets. The end result is an off-the-shelf encoder that can produce highly
generic sentence representations that are robust and perform well in practice.
We will make our encoder publicly available. Comment: 11 pages
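The vocabulary expansion step mentioned above can be sketched as a linear map fitted between embedding spaces; the names and shapes below are illustrative assumptions rather than the authors' released code.

import numpy as np

def fit_expansion_map(pretrained, encoder_emb, shared_words):
    # pretrained:  dict word -> vector in a large pretrained space (e.g. word2vec)
    # encoder_emb: dict word -> vector in the encoder's (smaller) training vocabulary
    X = np.stack([pretrained[w] for w in shared_words])   # (n, d_pretrained)
    Y = np.stack([encoder_emb[w] for w in shared_words])  # (n, d_encoder)
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)              # least-squares linear map
    return W

def embed_unseen(word, pretrained, W):
    # a word never seen in training gets an embedding by mapping its
    # pretrained vector through the fitted linear transformation
    return pretrained[word] @ W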
Co-occurrence Vectors from Corpora vs. Distance Vectors from Dictionaries
A comparison was made of vectors derived by using ordinary co-occurrence
statistics from large text corpora and of vectors derived by measuring the
inter-word distances in dictionary definitions. The precision of word sense
disambiguation by using co-occurrence vectors from the 1987 Wall Street Journal
(20M total words) was higher than that by using distance vectors from the
Collins English Dictionary (60K head words + 1.6M definition words). However,
other experimental results suggest that distance vectors contain some different
semantic information from co-occurrence vectors. Comment: 6 pages, appeared in the Proc. of COLING94 (pp. 304-309)
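For illustration, corpus-derived co-occurrence vectors of this kind can be collected with a simple symmetric context window; this is a minimal sketch under assumed tokenization and window size, not the paper's exact configuration.

from collections import Counter, defaultdict

def cooccurrence_vectors(sentences, window=5):
    counts = defaultdict(Counter)
    for tokens in sentences:                      # each sentence is a list of tokens
        for i, word in enumerate(tokens):
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    counts[word][tokens[j]] += 1  # count words appearing in the window
    return counts                                 # word -> sparse context-count vector

vecs = cooccurrence_vectors([["the", "cat", "sat", "on", "the", "mat"],
                             ["the", "dog", "sat", "on", "the", "rug"]])
print(vecs["cat"])   # Counter({'the': 2, 'sat': 1, 'on': 1, 'mat': 1})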
Kochen-Specker Vectors
We give a constructive and exhaustive definition of Kochen-Specker (KS)
vectors in a Hilbert space of any dimension as well as of all the remaining
vectors of the space. KS vectors are elements of any set of orthonormal states,
i.e., vectors in n-dim Hilbert space, H^n, n >= 3, to which it is impossible to
assign 1s and 0s in such a way that no two mutually orthogonal vectors from the
set are both assigned 1 and that not all mutually orthogonal vectors are
assigned 0. Our constructive definition of such KS vectors is based on
algorithms that generate MMP diagrams corresponding to blocks of orthogonal
vectors in R^n, on algorithms that single out those diagrams on which algebraic
0-1 states cannot be defined, and on algorithms that solve nonlinear equations
describing the orthogonalities of the vectors by means of statistically
polynomially complex interval analysis and self-teaching programs. The
algorithms are limited neither by the number of dimensions nor by the number of
vectors. To demonstrate the power of the algorithms, all 4-dim KS vector
systems containing up to 24 vectors were generated and described, all 3-dim
vector systems containing up to 30 vectors were scanned, and several general
properties of KS vectors were found. Comment: 19 pages, 6 figures, title changed, introduction thoroughly
rewritten, n-dim rotation of KS vectors defined, original Kochen-Specker 192
(117) vector system translated into MMP diagram notation with a new graphical
representation, results on Tkadlec's dual diagrams added, several other new
results added, journal version: to be published in J. Phys. A, 38 (2005). Web
page: http://m3k.grad.hr/pavici
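The 0-1 assignment property in the definition can be verified by brute force on small examples. The sketch below assumes the given blocks enumerate all maximal sets of mutually orthogonal vectors; it is not the authors' MMP-diagram or interval-analysis machinery, which handles far larger systems.

from itertools import product

def admits_01_assignment(blocks):
    # blocks: iterable of tuples of labels, each tuple a maximal set of
    # mutually orthogonal vectors (a complete basis)
    vectors = sorted({v for block in blocks for v in block})
    for bits in product((0, 1), repeat=len(vectors)):
        value = dict(zip(vectors, bits))
        # a classical valuation: exactly one vector per block is assigned 1
        if all(sum(value[v] for v in block) == 1 for block in blocks):
            return True
    return False   # no valuation exists: the set is non-colorable (a KS set)

# Toy labels standing in for actual orthogonal triads; this small example is colorable.
print(admits_01_assignment([("a", "b", "c"), ("c", "d", "e"), ("e", "f", "a")]))   # True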