
    II. Branzburg and the Protection of Reporters' Sources

    Get PDF

    Legal Pitfalls in the Right to Know

    Get PDF
    I hope that my response to Professor Emerson is more than chauvinistic, and does not reflect merely an Olympian view from the vantage point of a powerful press that easily obtains access, an access perhaps not available to the less powerful. I believe my objection amounts to more than that, however. As Emerson concedes, the right to know is a qualified right, whereas the right to communicate is substantially absolute. My fear is that if the courts begin to enforce the right to know, the qualifications applicable to the right to know may be applied to the right to communicate, thus curtailing existing first amendment rights.

    Statistical Power in Operations Management Research

    Get PDF
    This paper discusses the need for and importance of statistical power analysis in field-based empirical research in Production and Operations Management (POM) and related disciplines. The concept of statistical power analysis is explained in detail and its relevance in designing and conducting empirical experiments is discussed. Statistical power reflects the degree to which differences in sample data in a statistical test can be detected. A high power is required to reduce the probability of failing to detect an effect when it is present. This paper also examines the relationship between statistical power, significance level, sample size and effect size. A probability tree analysis further explains the importance of statistical power by showing the relationship between Type II errors and the probability of making wrong decisions in statistical analysis. A power analysis of 28 articles (524 statistical tests) in the Journal of Operations Management and in Decision Sciences shows that 60% of empirical studies do not have high power levels. This means that several of these tests will have a low degree of repeatability. This and other similar issues involving statistical power will become increasingly important as empirical studies in POM study relatively smaller effects.
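
    As a rough illustration of the relationships the abstract describes (a minimal sketch, not code from the paper; the two-sample t-test, effect size, sample sizes, and alpha below are assumed for illustration), a Monte Carlo simulation makes the dependence of power on sample size, effect size, and significance level concrete, and the Type II error rate is simply one minus the estimated power:

        # Minimal sketch: estimate statistical power for a two-sample t-test
        # by simulation. Effect size, sample sizes, and alpha are illustrative
        # placeholders, not values taken from the paper.
        import numpy as np
        from scipy import stats

        def simulated_power(effect_size, n_per_group, alpha=0.05, n_sims=10_000, seed=0):
            rng = np.random.default_rng(seed)
            rejections = 0
            for _ in range(n_sims):
                a = rng.normal(0.0, 1.0, n_per_group)          # control group
                b = rng.normal(effect_size, 1.0, n_per_group)   # treatment shifted by the effect size
                _, p = stats.ttest_ind(a, b)
                if p < alpha:
                    rejections += 1
            return rejections / n_sims                          # fraction of true effects detected

        # Small effects need much larger samples to reach the conventional 0.80 power level.
        for n in (20, 50, 100, 200):
            print(n, round(simulated_power(effect_size=0.3, n_per_group=n), 3))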

    Non-symmetric trapped surfaces in the Schwarzschild and Vaidya spacetimes

    Full text link
    Marginally trapped surfaces (MTSs) are commonly used in numerical relativity to locate black holes. For dynamical black holes, it is not known generally whether this procedure is sufficiently reliable. Even for Schwarzschild black holes, Wald and Iyer constructed foliations which come arbitrarily close to the singularity but do not contain any MTSs. In this paper, we review the Wald-Iyer construction, discuss some implications for numerical relativity, and generalize to the well-known Vaidya spacetime describing spherically symmetric collapse of null dust. In the Vaidya spacetime, we numerically locate non-spherically symmetric trapped surfaces which extend outside the standard spherically symmetric trapping horizon. This shows that MTSs are common in this spacetime and that the event horizon is the most likely candidate for the boundary of the trapped region. Comment: 4 pages, 3 figures; v2: minor modifications; v3: clarified conclusion
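
    For context, a standard piece of background the abstract assumes (conventions and notation here may differ from the paper): in ingoing coordinates the Vaidya line element and the round marginally trapped surfaces are

        ds^2 = -\left(1 - \frac{2m(v)}{r}\right)dv^2 + 2\,dv\,dr + r^2\,d\Omega^2,
        \qquad
        \theta_{(\ell)} \propto \frac{1}{r}\left(1 - \frac{2m(v)}{r}\right) = 0
        \;\Longrightarrow\; r = 2m(v),

    so the spherically symmetric trapping horizon referred to above is the hypersurface r = 2m(v), and the non-symmetric trapped surfaces located in the paper extend outside it.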

    The Asymptotic Falloff of Local Waveform Measurements in Numerical Relativity

    Get PDF
    We examine current numerical relativity computations of gravitational waves, which typically determine the asymptotic waves at infinity by extrapolation from finite (small) radii. Using simulations of a black hole binary with accurate wave extraction at r = 1000M, we show that extrapolations from the near zone are self-consistent in approximating measurements at this radius, although with a somewhat reduced accuracy. We verify that ψ_4 is the dominant asymptotic contribution to the gravitational energy (as required by the peeling theorem) but point out that gauge effects may complicate the interpretation of the other Weyl components.
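
    The extrapolation mentioned above is commonly done by fitting the extracted quantity at fixed retarded time against powers of 1/r and keeping the constant term; the sketch below is schematic (the radii and amplitudes are placeholders, not data from the paper):

        # Schematic extrapolation of a finite-radius waveform quantity to
        # r -> infinity by fitting a polynomial in 1/r at fixed retarded time.
        # Radii and amplitudes are illustrative placeholders only.
        import numpy as np

        radii = np.array([100.0, 200.0, 300.0, 500.0, 1000.0])   # extraction radii in units of M
        r_psi4 = np.array([1.020, 1.010, 1.006, 1.004, 1.002])   # e.g. |r*psi_4| at one retarded time

        # Fit r*psi_4(u, r) ~ A_inf + a1/r + a2/r^2 and read off the constant term.
        coeffs = np.polyfit(1.0 / radii, r_psi4, deg=2)
        A_inf = coeffs[-1]        # polyfit returns the highest-order coefficient first
        print(f"extrapolated asymptotic amplitude: {A_inf:.4f}")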

    FMRI Reveals a Dissociation between Grasping and Perceiving the Size of Real 3D Objects

    Get PDF
    Background: Almost 15 years after its formulation, evidence for the neuro-functional dissociation between a dorsal action stream and a ventral perception stream in the human cerebral cortex is still based largely on neuropsychological case studies. To date, there is no unequivocal evidence for separate visual computations of object features for performance of goal-directed actions versus perceptual tasks in the neurologically intact human brain. We used functional magnetic resonance imaging to test explicitly whether or not brain areas mediating size computation for grasping are distinct from those mediating size computation for perception. Methodology/Principal Findings: Subjects were presented with the same real graspable 3D objects and were required to perform a number of different tasks: grasping, reaching, size discrimination, pattern discrimination or passive viewing. As in prior studies, the anterior intraparietal area (AIP) in the dorsal stream was more active during grasping, when object size was relevant for planning the grasp, than during reaching, when object properties were irrelevant for movement planning (grasping > reaching). Activity in AIP showed no modulation, however, when size was computed in the context of a purely perceptual task (size = pattern discrimination). Conversely, the lateral occipital (LO) cortex in the ventral stream was modulated when size was computed for perception (size > pattern discrimination) but not for action (grasping = reaching). Conclusions/Significance: While areas in both the dorsal and ventral streams responded to the simple presentation of 3D objects (passive viewing), these areas were differentially activated depending on whether the task was grasping or perceptual discrimination, respectively. The demonstration of dual coding of an object for the purposes of action on the one hand and perception on the other in the same healthy brains offers a substantial contribution to the current debate about the nature of the neural coding that takes place in the dorsal and ventral streams.

    Preserved Haptic Shape Processing after Bilateral LOC Lesions.

    Get PDF
    UNLABELLED: The visual and haptic perceptual systems are understood to share a common neural representation of object shape. A region thought to be critical for recognizing visual and haptic shape information is the lateral occipital complex (LOC). We investigated whether LOC is essential for haptic shape recognition in humans by studying behavioral responses and brain activation for haptically explored objects in a patient (M.C.) with bilateral lesions of the occipitotemporal cortex, including LOC. Despite severe deficits in recognizing objects using vision, M.C. was able to accurately recognize objects via touch. M.C.'s psychophysical response profile to haptically explored shapes was also indistinguishable from controls. Using fMRI, M.C. showed no object-selective visual or haptic responses in LOC, but her pattern of haptic activation in other brain regions was remarkably similar to healthy controls. Although LOC is routinely active during visual and haptic shape recognition tasks, it is not essential for haptic recognition of object shape. SIGNIFICANCE STATEMENT: The lateral occipital complex (LOC) is a brain region regarded as critical for recognizing object shape, both in vision and in touch. However, causal evidence linking LOC with haptic shape processing is lacking. We studied recognition performance, psychophysical sensitivity, and brain response to touched objects in a patient (M.C.) with extensive lesions involving LOC bilaterally. Despite being severely impaired in visual shape recognition, M.C. was able to identify objects via touch, and she showed normal sensitivity to a haptic shape illusion. M.C.'s brain response to touched objects in areas of undamaged cortex was also very similar to that observed in neurologically healthy controls. These results demonstrate that LOC is not necessary for recognizing objects via touch.

    A Market-Utility Approach to Scheduling Employees

    Get PDF
    [Excerpt] Scheduling front-line service providers is a constant challenge for hospitality managers, given the inevitable tradeoff between service standards and operating expense. Traditional employee scheduling typically applies a cost-minimization approach to specify the level of front-line service providers who will be available to meet periodic demand. That cost includes the opportunity cost of lost customers, which is part of the pseudo-costs of understaffing. A confounding and often ignored effect, however, is the benefit generated by maintaining high service levels in a system where capacity exceeds demand. That is, scheduling more front-line service providers than the minimum level necessary to provide acceptable customer service (what might be considered overstaffing in some rubrics) may mean that customers receive service that is better than they expected (or than company standards prescribe). In this paper we report on a scheduling approach that explicitly considers the interrelationships among customer preferences, customer demand, waiting times, and scheduling decisions. This approach, which we call the market-utility model for scheduling (MUMS), helps managers consider the dynamics of scheduling service employees. First, we discuss the components that make up this approach, which includes methods from customer-preferences modeling, service-capacity planning, and the four tasks of labor scheduling proposed by Thompson. Next, we'll show how the model applies to balancing queue lengths and operating costs for an airport food-court vendor. Finally, we discuss the value of MUMS for hospitality managers.
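
    The staffing tradeoff described above can be made concrete with a simple queueing calculation; the sketch below uses a generic M/M/c (Erlang C) waiting-time formula with made-up arrival, service, and cost figures, and is not the MUMS model itself:

        # Illustrative staffing tradeoff: more servers raise labor cost but cut
        # expected customer waiting. Uses the Erlang C (M/M/c) formula with
        # made-up parameters; this is not the paper's MUMS model.
        from math import factorial

        def erlang_c_wait(lam, mu, c):
            """Expected time in queue for an M/M/c system (requires lam < c * mu)."""
            a = lam / mu                           # offered load
            rho = a / c
            p_wait = (a**c / (factorial(c) * (1 - rho))) / (
                sum(a**k / factorial(k) for k in range(c)) + a**c / (factorial(c) * (1 - rho))
            )
            return p_wait / (c * mu - lam)

        lam, mu = 90.0, 20.0                       # arrivals/hour and service rate per server/hour
        wage, wait_cost = 15.0, 60.0               # $/server-hour and $ per customer-hour of waiting
        for c in range(5, 11):                     # candidate staffing levels
            Wq = erlang_c_wait(lam, mu, c)
            total = wage * c + wait_cost * lam * Wq    # labor cost plus expected waiting cost per hour
            print(c, round(Wq * 60, 2), round(total, 2))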