The liminality of trajectory shifts in institutional entrepreneurship
In this paper, we develop a process model of trajectory shifts in institutional entrepreneurship. We focus on the liminal periods experienced by institutional entrepreneurs when they, unlike the rest of the organization, recognize limits in the present and seek to shift a familiar past into an unfamiliar and uncertain future. Such periods involve a situation where the new possible future, not yet fully formed, exists side-by-side with established innovation trajectories. Trajectory shifts are moments of truth for institutional entrepreneurs, but little is known about the underlying mechanisms of how entrepreneurs reflectively deal with liminality to conceive and bring forth new innovation trajectories. Our in-depth case study research at CarCorp traces three such mechanisms (reflective dissension, imaginative projection, and eliminatory exploration) and builds the basis for understanding the liminality of trajectory shifts. The paper offers theoretical implications for the institutional entrepreneurship literature.
Tunable Double Negative Band Structure from Non-Magnetic Coated Rods
A system of periodic poly-disperse coated nano-rods is considered. Both the
coated nano-rods and host material are non-magnetic. The exterior nano-coating
has a frequency dependent dielectric constant and the rod has a high dielectric
constant. A negative effective magnetic permeability is generated near the Mie
resonances of the rods while the coating generates a negative permittivity
through a field resonance controlled by the plasma frequency of the coating and
the geometry of the crystal. The explicit band structure for the system is
calculated in the sub-wavelength limit. Tunable pass bands exhibiting negative
group velocity are generated and correspond to simultaneously negative
effective dielectric permittivity and magnetic permeability. These can be
explicitly controlled by adjusting the distance between rods, the coating
thickness, and rod diameters.
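In standard effective-medium notation, the mechanism described above can be summarized as follows. The Drude-type dispersion for the coating is an illustrative assumption (the abstract specifies only a frequency-dependent dielectric constant controlled by the plasma frequency):

```latex
% Assumed Drude-type dispersion for the coating permittivity,
% with plasma frequency \omega_p:
\epsilon_c(\omega) = 1 - \frac{\omega_p^2}{\omega^2}.
% Double-negative pass bands arise where both effective parameters
% are simultaneously negative, giving negative group velocity:
\epsilon_{\mathrm{eff}}(\omega) < 0, \quad \mu_{\mathrm{eff}}(\omega) < 0,
\qquad n(\omega) = -\sqrt{\epsilon_{\mathrm{eff}}\,\mu_{\mathrm{eff}}}.
```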
Inducing Probabilistic Grammars by Bayesian Model Merging
We describe a framework for inducing probabilistic grammars from corpora of
positive samples. First, samples are `incorporated' by adding ad-hoc rules
to a working grammar; subsequently, elements of the model (such as states or
nonterminals) are `merged' to achieve generalization and a more compact
representation. The choice of what to merge and when to stop is governed by the
Bayesian posterior probability of the grammar given the data, which formalizes
a trade-off between a close fit to the data and a default preference for
simpler models (`Occam's Razor'). The general scheme is illustrated using three
types of probabilistic grammars: Hidden Markov models, class-based n-grams,
and stochastic context-free grammars. Comment: To appear in Grammatical Inference and Applications, Second
International Colloquium on Grammatical Inference; Springer Verlag, 1994. 13
pages
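As an illustration of the merging loop for one of the three model types (class-based n-grams), here is a minimal, self-contained sketch: each word type starts in its own class, and classes are greedily merged as long as the merge improves a posterior that trades a class-bigram log-likelihood (with word-emission terms) against a size prior. The toy corpus, the prior weight `alpha`, and the exhaustive pairwise search are illustrative simplifications, not the paper's actual procedure:

```python
import math
from collections import Counter
from itertools import combinations

def log_posterior(seq, cls, alpha=2.0):
    """Class-bigram log-likelihood (with word-emission terms) plus a
    size prior penalizing the number of classes (Occam's Razor)."""
    cseq = [cls[w] for w in seq]
    bigrams = Counter(zip(cseq, cseq[1:]))
    context = Counter(cseq[:-1])
    ll = sum(c * math.log(c / context[a]) for (a, b), c in bigrams.items())
    word_counts, class_counts = Counter(seq), Counter(cseq)
    ll += sum(c * math.log(c / class_counts[cls[w]]) for w, c in word_counts.items())
    return ll - alpha * len(set(cls.values()))

def best_merge(seq, cls, alpha=2.0):
    """Return the class merge that most improves the posterior, or None
    if no merge improves it (the Bayesian stopping criterion)."""
    base = log_posterior(seq, cls, alpha)
    best, best_gain = None, 0.0
    for a, b in combinations(sorted(set(cls.values())), 2):
        trial = {w: (a if c == b else c) for w, c in cls.items()}
        gain = log_posterior(seq, trial, alpha) - base
        if gain > best_gain:
            best, best_gain = trial, gain
    return best

corpus = "the cat sat the dog sat the cat ran the dog ran".split()
cls = {w: i for i, w in enumerate(dict.fromkeys(corpus))}  # one class per word type
while (merged := best_merge(corpus, cls)) is not None:
    cls = merged  # greedy merging until the posterior stops improving
```

On this corpus the loop merges `cat` with `dog` and `sat` with `ran`, then stops: further merges would improve compactness but lose too much likelihood, which is exactly the trade-off the posterior formalizes.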
On Hilberg's Law and Its Links with Guiraud's Law
Hilberg (1990) supposed that finite-order excess entropy of a random human
text is proportional to the square root of the text length. Assuming that
Hilberg's hypothesis is true, we derive Guiraud's law, which states that the
number of word types in a text is greater than proportional to the square root
of the text length. Our derivation is based on some mathematical conjecture in
coding theory and on several experiments suggesting that words can be defined
approximately as the nonterminals of the shortest context-free grammar for the
text. Such an operational definition of words can be applied even to texts
deprived of spaces, which do not allow for Mandelbrot's ``intermittent
silence'' explanation of Zipf's and Guiraud's laws. In contrast to
Mandelbrot's, our model assumes some probabilistic long-memory effects in human
narration and might be capable of explaining Menzerath's law. Comment: To appear in Journal of Quantitative Linguistics
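In symbols, the two laws as stated above read as follows (a paraphrase for orientation, not the paper's derivation):

```latex
% Hilberg's hypothesis: the finite-order excess entropy of a text
% of length n grows like the square root of the length,
E(n) \propto \sqrt{n},
% while Guiraud's law, in the form derived here, states that the
% number of word types V(n) grows at least that fast:
V(n) \gtrsim C\,\sqrt{n}.
```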
Frame Permutation Quantization
Frame permutation quantization (FPQ) is a new vector quantization technique
using finite frames. In FPQ, a vector is encoded using a permutation source
code to quantize its frame expansion. This means that the encoding is a partial
ordering of the frame expansion coefficients. Compared to ordinary permutation
source coding, FPQ produces a greater number of possible quantization rates and
a higher maximum rate. Various representations for the partitions induced by
FPQ are presented, and reconstruction algorithms based on linear programming,
quadratic programming, and recursive orthogonal projection are derived.
Implementations of the linear and quadratic programming algorithms for uniform
and Gaussian sources show performance improvements over entropy-constrained
scalar quantization for certain combinations of vector dimension and coding
rate. Monte Carlo evaluation of the recursive algorithm shows that mean-squared
error (MSE) decays as 1/M^4 for an M-element frame, which is consistent with
previous results on optimal decay of MSE. Reconstruction using the canonical
dual frame is also studied, and several results relate properties of the
analysis frame to whether linear reconstruction techniques provide consistent
reconstructions. Comment: 29 pages, 5 figures; detail added to proof of Theorem 4.3 and a few
minor corrections
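A minimal numerical sketch of the encoder/decoder structure described above: a random analysis frame, a fixed non-increasing codebook `mu`, encoding as the ordering of the frame coefficients, and linear reconstruction via the canonical dual frame. The random frame and arbitrary `mu` are illustrative assumptions; in actual FPQ the codebook values are optimized and the reconstruction algorithms are the LP/QP/recursive ones of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 3, 7                           # signal dimension, frame size (M > N)
F = rng.standard_normal((M, N))       # analysis frame (rows = frame vectors)
mu = np.sort(rng.standard_normal(M))[::-1]  # fixed non-increasing codebook values

def fpq_encode(x, F):
    """Encode x by the ordering of its frame-expansion coefficients."""
    y = F @ x                         # frame expansion
    return np.argsort(-y)             # permutation: coefficient indices, largest first

def fpq_decode(perm, F, mu):
    """Assign the codebook values according to the permutation, then
    reconstruct linearly with the canonical dual frame (pseudoinverse)."""
    y_hat = np.empty(len(perm))
    y_hat[perm] = mu                  # largest coefficient position gets mu[0], etc.
    return np.linalg.pinv(F) @ y_hat

x = rng.standard_normal(N)
perm = fpq_encode(x, F)
x_hat = fpq_decode(perm, F, mu)
```

Note that the decoder sees only the permutation, which is the sense in which the encoding is a partial ordering of the frame expansion rather than a scalar quantization of its values.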
Monitoring Temporal Changes in the Specificity of an Oral HIV Test: A Novel Application for Use in Postmarketing Surveillance
BACKGROUND: Postmarketing surveillance is routinely conducted to monitor performance of pharmaceuticals and testing devices in the marketplace. However, these surveillance methods are often done retrospectively and, as a result, are not designed to detect issues with performance in real-time. METHODS AND FINDINGS: Using HIV antibody screening test data from New York City STD clinics, we developed a formal, statistical method of prospectively detecting temporal clusters of poor performance of a screening test. From 2005 to 2008, New York City, as well as other states, observed unexpectedly high false-positive (FP) rates in an oral fluid-based rapid test used for screening HIV. We attempted to formally assess whether the performance of this HIV screening test statistically deviated from both local expectation and the manufacturer's claim for the test. Results indicate that there were two significant temporal clusters in the FP rate of the oral HIV test, both of which exceeded the manufacturer's upper limit of the 95% CI for the product. Furthermore, the FP rate of the test varied significantly by both STD clinic and test lot, though not by test operator. CONCLUSIONS: Continuous monitoring of surveillance data has the benefit of providing information regarding test performance, and if conducted in real-time, it can enable programs to examine reasons for poor test performance in close proximity to the occurrence. Techniques used in this study could be a valuable addition for postmarketing surveillance of test performance and may become particularly important with the increase in rapid testing methods
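The core comparison against the manufacturer's claim can be sketched as an exact binomial tail test per surveillance window (the study's actual method is a temporal cluster/scan statistic; the window counts and the 0.2% claimed false-positive rate below are hypothetical illustrations):

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p), via the complement of the CDF."""
    return 1.0 - sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k))

# Hypothetical surveillance window: 9 false positives among 2000
# HIV-negative specimens, tested against a claimed false-positive
# rate of 0.2% (i.e., 99.8% specificity).
p_value = binom_sf(9, 2000, 0.002)
flag = p_value < 0.05  # flag the window for real-time investigation
```

Running such a test prospectively on each new window of screening data is what turns retrospective performance review into the real-time monitoring the abstract argues for.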
Search algorithms as a framework for the optimization of drug combinations
Combination therapies are often needed for effective clinical outcomes in the
management of complex diseases, but presently they are generally based on
empirical clinical experience. Here we suggest a novel application of search
algorithms, originally developed for digital communication, modified to
optimize combinations of therapeutic interventions. In biological experiments
measuring the restoration of the decline with age in heart function and
exercise capacity in Drosophila melanogaster, we found that search algorithms
correctly identified optimal combinations of four drugs with only one third of
the tests performed in a fully factorial search. In experiments identifying
combinations of three doses of up to six drugs for selective killing of human
cancer cells, search algorithms resulted in a highly significant enrichment of
selective combinations compared with random searches. In simulations using a
network model of cell death, we found that the search algorithms identified the
optimal combinations of 6-9 interventions in 80-90% of tests, compared with
15-30% for an equivalent random search. These findings suggest that modified
search algorithms from information theory have the potential to enhance the
discovery of novel therapeutic drug combinations. This report also helps to
frame a biomedical problem that will benefit from an interdisciplinary effort
and suggests a general strategy for its solution. Comment: 36 pages, 10 figures, revised version
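The contrast with a fully factorial search can be illustrated with a deliberately simple stand-in: a greedy coordinate search over dose levels that queries a (hypothetical) assay only when needed and caches results so no experiment is repeated. The response surface, dose grid, and search rule below are illustrative assumptions; the paper's algorithms are adapted from digital communication and its assays are biological:

```python
# Greedy coordinate search over dose combinations: change one drug's
# dose at a time, keep the best level, repeat until no change helps.
DOSES = (0, 1, 2)   # three dose levels per drug
N_DRUGS = 4

def response(combo):
    """Hypothetical stand-in for a biological assay; the searcher
    treats it as a black box queried once per combination."""
    target = (2, 0, 1, 2)  # the (unknown) optimal combination
    return -sum((c - t) ** 2 for c, t in zip(combo, target))

evals = 0
def measure(combo, _cache={}):   # cache: never repeat an experiment
    global evals
    if combo not in _cache:
        _cache[combo] = response(combo)
        evals += 1
    return _cache[combo]

combo = [0] * N_DRUGS
improved = True
while improved:
    improved = False
    for i in range(N_DRUGS):
        best = max(DOSES, key=lambda d: measure(tuple(combo[:i] + [d] + combo[i + 1:])))
        if best != combo[i]:
            combo[i], improved = best, True
```

On this toy surface the search reaches the optimal combination in far fewer assays than the 3^4 = 81 required by a fully factorial design, mirroring the savings reported in the Drosophila experiments.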
The first study of 54 new eccentric eclipsing binaries in our Galaxy
We present an analysis of the apsidal motion and light curve parameters of 54 galactic Algol-type binaries never before studied. This is the first analysis of such a large sample of eccentric eclipsing binaries in our Galaxy, and it has enabled us to identify several systems that are worthy of further study. Bringing together data from various databases and surveys, supplemented with new observations, we have been able to trace the long-term evolution of the eccentric orbits over durations extending back up to several decades. Our present study explores a rather different sample of stars from that presented in the previously published catalogue of eccentric eclipsing binaries by Bulut & Demircan (2007): it samples to fainter magnitudes, covers later spectral types, and is sensitive to different orbital periods, with more than 50% of our systems having periods longer than 6 days. The typical apsidal motion in the sample is rather slow (apsidal periods mostly of the order of centuries), although in some cases the apsidal period is less than 50 years. All of the systems except one have eccentricities less than 0.5, with an average value of 0.23. Several of the stars also show evidence for additional period variability. In particular, we can identify three systems in the sample, HD 44093, V611 Pup, and HD 313631, which likely represent relativistic apsidal rotators.
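For orientation, the standard textbook relations behind these quantities (not results from the paper): the apsidal-motion period U follows from the apsidal rate, and the general-relativistic contribution to the periastron advance per orbital cycle is

```latex
% Apsidal-motion period, with \dot\omega in degrees per orbital cycle:
U = \frac{360^{\circ}}{\dot\omega}\, P_{\mathrm{orb}},
\qquad
% GR periastron advance per orbit (radians), for component masses
% M_1, M_2, semi-major axis a, and eccentricity e:
\Delta\omega_{\mathrm{GR}} = \frac{6\pi G (M_1 + M_2)}{a\,(1 - e^2)\,c^{2}}.
```

A system is called a relativistic apsidal rotator when this GR term dominates the classical (tidal and rotational) contributions to the observed apsidal motion.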