
    Teaching Tax Law after Tax Reform

    Professor Ginsburg compares the teaching of individual income taxation before and after the extensive statutory revisions of the 1980s. The pervasive question, what is income, remains the central inquiry in the basic tax course, he observes, and the great classifications, personal versus commercial and current versus capital, unavoidably persist. The development in tax law that has most significantly changed the way the subject is taught, he believes, is embodied in the recent enactment of a variety of Internal Revenue Code provisions which, while facially inconsistent in their approach to particular cases, have in common an appreciation of differences in present values. The generalized question that now dominates much of tax teaching, when is income, is explored in his paper, as in the tax course, through a series of hypothetical cases designed to demonstrate that what appear to be disparate problems in fact are variations on a single theme, and that the tax law's seemingly disparate resolutions often reach economically equivalent results.
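
    A minimal numerical sketch of the present-value point, using hypothetical figures (an 8% return, a 30% tax rate, a ten-year horizon) that are assumptions rather than anything taken from the article. It shows both that the timing of taxation changes after-tax wealth and that two facially different rules, deferral and yield exemption, reach economically equivalent results:

        # Illustrative only: hypothetical rate, tax rate, and horizon, not drawn from the article.
        income, r, n, t = 100.0, 0.08, 10, 0.30

        # (A) Tax the income now; invest after-tax; tax the interest every year.
        wealth_a = income * (1 - t) * (1 + r * (1 - t)) ** n

        # (B) Defer the income and all earnings for n years, then tax everything on withdrawal.
        wealth_b = income * (1 + r) ** n * (1 - t)

        # (C) Tax the income now, but exempt the investment return entirely.
        wealth_c = income * (1 - t) * (1 + r) ** n

        print(f"A (tax now, tax yield)    : {wealth_a:.2f}")
        print(f"B (defer, tax at the end) : {wealth_b:.2f}")
        print(f"C (tax now, exempt yield) : {wealth_c:.2f}")  # equals B: different rules, same economics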

    A Comparison of the Merger and Acquisition Provisions of Present Law with the Provisions in the Senate Finance Committee's Draft Bill

    Mr. Thompson presents a detailed comparison of the corporate merger and acquisition provisions of present law with the changes proposed in the Senate Finance Committee Staff Report and the Draft Bill prepared by the Senate Finance Committee staff. Professor Ginsburg attempts to illustrate how the changes proposed by the SFC Report do not resolve many of the more sophisticated problems generated by the use of multiple corporations and selective acquisitions of some of the target's assets or stock. Professor Ginsburg argues that the rules work well in the unreal world of one corporation operating one business, but not in the real world of multiple commonly controlled entities. He provides several examples of corporations operating through affiliates that may end up with results not necessarily intended by the proposed changes.

    TEFRA: Purchase and Sale of a Corporate Business

    Interaction of mammalian cells with polymorphonuclear leukocytes: Relative sensitivity to monolayer disruption and killing

    Monolayers of murine fibrosarcoma cells that had been treated either with histone-opsonized streptococci, histone-opsonized Candida globerata, or lipoteichoic acid-anti-lipoteichoic acid complexes underwent disruption when incubated with human polymorphonuclear leukocytes (PMNs). Although the architecture of the monolayers was destroyed, the target cells were not killed. The destruction of the monolayers was totally inhibited by proteinase inhibitors, suggesting that the detachment of the cells from the monolayers and their aggregation in suspension were induced by proteinases released from the activated PMNs. Monolayers of normal endothelial cells and fibroblasts were much more resistant to the monolayer-disrupting effects of the PMNs than were the fibrosarcoma cells. Although the fibrosarcoma cells were resistant to killing by PMNs, killing was promoted by the addition of sodium azide (a catalase inhibitor). This suggests that the failure of the PMNs to kill the target cells was due to catalase-mediated destruction of the hydrogen peroxide produced by the activated PMNs. Target cell killing that occurred in the presence of sodium azide was reduced by the addition of a “cocktail” containing methionine, histidine, and deferoxamine mesylate, suggesting that hydroxyl radicals, but not myeloperoxidase-catalyzed products, were responsible for cell killing. The relative ease with which the murine fibrosarcoma cells can be released from their substratum by the action of PMNs, coupled with their insensitivity to PMN-mediated killing, may explain why the presence of large numbers of PMNs at the site of tumors produced in experimental animals by the fibrosarcoma cells is associated with an unfavorable outcome.

    Algorithmic Complexity for Short Binary Strings Applied to Psychology: A Primer

    Human randomness production has long been studied and is widely used to assess executive functions (especially inhibition), and many measures have been suggested to assess the degree to which a sequence is random-like. However, each of them focuses on one feature of randomness, forcing authors to use multiple measures. Here we describe and advocate the use of the accepted universal measure of randomness based on algorithmic complexity, by means of a previously presented technique that builds on the definition of algorithmic probability. A re-analysis of the classical Radio Zenith data in light of the proposed measure and methodology is provided as a case study.
    Comment: To appear in Behavior Research Methods
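
    A minimal sketch of the coding-theorem idea the abstract refers to, in which the algorithmic complexity of a short string s is estimated as roughly -log2 of its algorithmic probability, i.e. the frequency with which small Turing machines output s. The frequency table below is a made-up placeholder rather than the authors' published distribution, and the entropy fallback is only a rough proxy:

        # Hypothetical sketch of the coding-theorem method; frequencies are placeholders,
        # NOT the published output distributions from enumerating small Turing machines.
        import math

        toy_output_frequency = {   # placeholder m(s)-style values for a few short strings
            "0000": 0.00120,
            "0101": 0.00030,
            "0110": 0.00025,
        }

        def complexity_estimate(s: str) -> float:
            """Coding-theorem-style estimate: K(s) ~ -log2 m(s) when a frequency is available."""
            freq = toy_output_frequency.get(s)
            if freq is not None:
                return -math.log2(freq)
            # Fallback proxy (NOT algorithmic complexity): total Shannon entropy in bits.
            counts = {c: s.count(c) for c in set(s)}
            return -sum((k / len(s)) * math.log2(k / len(s)) for k in counts.values()) * len(s)

        for s in ("0000", "0101", "0011"):
            print(s, round(complexity_estimate(s), 2))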

    The role of the user within the medical device design and development process: medical device manufacturers' perspectives

    Copyright © 2011 Money et al.

    Background: Academic literature and international standards bodies suggest that user involvement, via the incorporation of human factors engineering methods within the medical device design and development (MDDD) process, offers many benefits that enable the development of safer and more usable medical devices better suited to users' needs. However, little research has been carried out to explore medical device manufacturers' beliefs and attitudes towards user involvement within this process, or what value they believe can be added by doing so.

    Methods: In-depth interviews with representatives from 11 medical device manufacturers are carried out. We ask them to specify who they believe the intended users of the device to be, who they consult to inform the MDDD process, what role they believe the user plays within this process, and what value (if any) they believe users add. Thematic analysis is used to analyse the fully transcribed interview data, to gain insight into medical device manufacturers' beliefs and attitudes towards user involvement within the MDDD process.

    Results: A number of high-level themes emerged, relating to who the user is perceived to be, the methods used, the perceived value of and barriers to user involvement, and the nature of user contributions. The findings reveal that, despite standards agencies and the academic literature offering strong support for the employment of formal methods, manufacturers are still hesitant due to a range of factors, including: perceived barriers to obtaining ethical approval; the speed at which such activity may be carried out; the belief that there is no need given the 'all-knowing' nature of senior health care staff and clinical champions; and a belief that effective results are achievable by consulting a minimal number of champions. Furthermore, less senior health care practitioners and patients were rarely seen as being able to provide valuable input into the process.

    Conclusions: Medical device manufacturers often do not see the benefit of employing formal human factors engineering methods within the MDDD process. Research is required to better understand the day-to-day requirements of manufacturers within this sector. The development of new or adapted methods may be required if user involvement is to be fully realised.

    This study was in part funded by grant number Ref: GR/S29874/01 from the Engineering and Physical Sciences Research Council. This article is made available through the Brunel University Open Access Publishing Fund.

    Measurement of the Lifetime Difference Between B_s Mass Eigenstates

    We present measurements of the lifetimes and polarization amplitudes for B_s --> J/psi phi and B_d --> J/psi K*0 decays. Lifetimes of the heavy (H) and light (L) mass eigenstates in the B_s system are separately measured for the first time by determining the relative contributions of amplitudes with definite CP as a function of the decay time. Using 203 +/- 15 B_s decays, we obtain tau_L = (1.05 +0.16/-0.13 +/- 0.02) ps and tau_H = (2.07 +0.58/-0.46 +/- 0.03) ps. Expressed in terms of the difference DeltaGamma_s and the average Gamma_s of the decay rates of the two eigenstates, the results are DeltaGamma_s/Gamma_s = (65 +25/-33 +/- 1)% and DeltaGamma_s = (0.47 +0.19/-0.24 +/- 0.01) inverse ps.
    Comment: 8 pages, 3 figures, 2 tables; as published in Physical Review Letters on 16 March 2005; revisions are for length and typesetting only, no changes in results or conclusion
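
    As a quick arithmetic cross-check of the quoted central values (uncertainties ignored; this is not the analysis itself), the DeltaGamma_s numbers follow directly from the two lifetimes:

        # Consistency check on the quoted central values from the abstract.
        tau_L, tau_H = 1.05, 2.07                      # lifetimes in ps
        Gamma_L, Gamma_H = 1.0 / tau_L, 1.0 / tau_H    # decay rates in ps^-1
        delta_Gamma = Gamma_L - Gamma_H                # width difference
        mean_Gamma = 0.5 * (Gamma_L + Gamma_H)         # average width
        print(f"DeltaGamma_s         = {delta_Gamma:.2f} inverse ps")   # ~0.47
        print(f"DeltaGamma_s/Gamma_s = {delta_Gamma / mean_Gamma:.0%}")  # ~65%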

    Measurement of Wγ and Zγ Production in pp̄ Collisions at √s = 1.96 TeV

    The Standard Model predictions for Wγ and Zγ production are tested using an integrated luminosity of 200 pb⁻¹ of pp̄ collision data collected by the Collider Detector at Fermilab. The cross sections are measured by selecting leptonic decays of the W and Z bosons, and photons with transverse energy E_T > 7 GeV that are well separated from the leptons. The production cross sections and kinematic distributions for Wγ and Zγ are compared to SM predictions.
    Comment: 7 pages, 4 figures, submitted to PR
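
    For readers unfamiliar with how such cross sections are extracted, the sketch below shows the generic counting-experiment relation sigma = (N_observed - N_background) / (efficiency × integrated luminosity). The event counts and efficiency are made-up placeholders; only the 200 pb⁻¹ dataset size comes from the abstract:

        # Generic illustration with hypothetical numbers (not the CDF measurement itself).
        n_observed = 350.0          # hypothetical candidate events passing selection
        n_background = 110.0        # hypothetical estimated background
        efficiency = 0.12           # hypothetical selection efficiency times acceptance
        int_luminosity_pb = 200.0   # pb^-1, the dataset size quoted in the abstract

        sigma_pb = (n_observed - n_background) / (efficiency * int_luminosity_pb)
        print(f"sigma = {sigma_pb:.1f} pb")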

    Evidence for the exclusive decay Bc+- to J/psi pi+- and measurement of the mass of the Bc meson

    We report first evidence for a fully reconstructed decay mode of the B_c^{\pm} meson in the channel B_c^{\pm} \to J/psi \pi^{\pm}, with J/psi \to mu^+ mu^-. The analysis is based on an integrated luminosity of 360 pb^{-1} in p\bar{p} collisions at 1.96 TeV center-of-mass energy collected by the Collider Detector at Fermilab. We observe 14.6 \pm 4.6 signal events with a background of 7.1 \pm 0.9 events, and a fit to the J/psi \pi^{\pm} mass spectrum yields a B_c^{\pm} mass of 6285.7 \pm 5.3(stat) \pm 1.2(syst) MeV/c^2. The probability of a peak of this magnitude occurring by random fluctuation in the search region is estimated as 0.012%.
    Comment: 7 pages, 3 figures. Version 3, accepted by PR
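
    Two small, hedged illustrations of the statistics language in the abstract (not the collaboration's actual procedure, which accounts for the width of the search region): converting the quoted 0.012% p-value to an equivalent one-sided Gaussian significance, and a naive Poisson tail probability for an assumed total event count over the quoted background:

        # Hedged illustrations; the total count below is an assumption, not a quoted number.
        from scipy.stats import norm, poisson

        p_value = 0.00012                       # 0.012% from the abstract
        print(f"~{norm.isf(p_value):.1f} sigma (one-sided Gaussian equivalent)")

        n_total, b = 22, 7.1                    # assumed observed count, expected background
        print(f"P(N >= {n_total} | b = {b}) = {poisson.sf(n_total - 1, b):.2e}")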