
    Apple v. Pepper: Applying the Indirect Purchaser Rule to Online Platforms

    Long-established antitrust precedent bars customers who buy a firm's product through intermediaries from suing that firm for antitrust damages. In Apple Inc. v. Pepper, this "indirect purchaser rule" is brought into the smartphone age in a price-fixing dispute between technology giant Apple and iPhone users. This case will determine whether iPhone users buy smartphone applications directly from Apple through the App Store, or whether Apple is merely an intermediary seller-agent of app developers. The indirect purchaser rule is generally considered settled precedent. How the rule should apply to online platforms, however, differs between circuit courts, which have split on the question of how to determine which users of online marketplaces are direct purchasers and which are indirect purchasers. Here, the Supreme Court must decide whether the app purchasers are direct purchasers of apps from Apple. If so, the plaintiff app purchasers can proceed in bringing an antitrust suit against Apple. Alternatively, the Court could decide that the consumer app purchasers are merely indirect purchasers from Apple, who actually buy apps directly from third-party software developers. In that case, Apple would be classified as a passive middleman, immune from antitrust suit by app purchasers. The Court should take the former approach and affirm the Ninth Circuit's holding that consumer app purchasers have standing to sue Apple over its App Store. Otherwise, consumers will be unable to recover for potentially legitimate antitrust injuries, and Apple's conduct will be unlikely to be challenged by another party.

    Stand in the Place Where Data Live: Data Breaches as Article III Injuries

    Every day, another hacker gains unauthorized access to information, be it credit card data from grocery stores or fingerprint records from federal databases. Bad actors who orchestrate these data breaches, if they can be found, face clear criminal liability. Still, a hacker's conviction may not be satisfying to victims whose data was accessed, and so victims may seek redress through lawsuits against compromised organizations. In those lawsuits, plaintiff-victims allege promising theories, including that the compromised organization negligently caused the data breach or broke an implied contract to protect customers' personal information. However, many federal courts see a data breach as essentially harmless, reasoning that data breach plaintiff-victims do not necessarily suffer cognizable legal injuries. In practice, this means that the plaintiffs do not have Article III standing, and courts do not reach merits determinations of fault. Instead, a data breach to these courts is harmful only to the extent that it leads to a subsequent injury, like identity theft or fraud. Therefore, data breach victims must suffer even more harm before they can bring a lawsuit. Other courts under this framework do nonetheless find that data breach plaintiff-victims have standing. However, even those courts still wrongly check whether the plaintiffs suffered subsequent identity theft, fraud, or other harm; those courts simply find that such subsequent harm is readily apparent. This Note offers a proper approach to standing in data breach lawsuits. I argue that the moment victims' data is exposed without their authorization, they suffer a cognizable common law injury, regardless of whether that data exposure actually causes subsequent harm. Rather than thinking of data breaches as a means to future data misuse, courts should think of data breaches as injurious in and of themselves.

    3D Printed Biliary Anatomy for Surgical Planning

    Background: Studies have demonstrated that 3D-printed livers derived from CT or MRI data can be accurate models of actual patient anatomy. However, it has yet to be established whether 3D printing improves clinical outcomes in surgery. This project seeks to optimize these applications for use at Jefferson by producing 3D-printed models from patient CT scans to guide liver resection surgeries. Methods: A liver transplant attending was interviewed about challenges encountered during hepatectomies. A publicly available abdominal computed tomography scan was used to render a liver and its vasculature in 3DSlicer. The liver surface was cut into two halves in MeshMixer to allow visualization of the underlying vasculature. These models were printed using an Ultimaker S5. Results: We successfully 3D printed a model of a liver capsule containing branches of the right and left hepatic arteries. The processing time included 3 hours to render liver anatomy and 31 minutes to edit the model into a form best suited for visualization of internal structures. The printing time was 47 hours and 25 minutes. 232 g of PLA, 51 g of Breakaway, and 24 g of polyvinyl alcohol were used to create the model. Conclusions: Creation of 3D-printed models of biliary anatomy is feasible, time-efficient, and inexpensive. In future work, we plan to encapsulate the hepatic vasculature and biliary tree (and potentially also the tumor, when applicable) in a translucent silicone model of the liver parenchyma, using a 3D-printed liver shell as a mold. This silicone model could be used pre-operatively and intra-operatively to help plan and guide the surgery. Measurable endpoints will include procedure time and intra-operative blood loss. This work has the potential to improve surgical outcomes for patients while facilitating the work of the surgeons.

    Reasons to Accept Vaccine Refusers in Primary Care

    Vaccine refusal forces us to confront tensions between many values, including scientific expertise, parental rights, children's best interests, social responsibility, public trust, and community health. Recent outbreaks of vaccine-preventable and emerging infectious diseases have amplified these issues. The prospect of a coronavirus disease 2019 vaccine signals even more friction on the horizon. In this contentious sociopolitical landscape, it is therefore more important than ever for clinicians to identify ethically justified responses to vaccine refusal.

    Problematics of Grounded Theory: Innovations for Developing an Increasingly Rigorous Qualitative Method

    Our purpose in this article is to identify and suggest resolutions for two core problematics of grounded theory. First, while grounded theory provides transparency to one part of the conceptualization process, where codes emerge directly from the data, it provides no such systematic or transparent way of gaining insight into the conceptual relationships between discovered codes. Producing a grounded theory depends not only on the definition of conceptual pieces, but also on the delineation of a relationship between at least two of those pieces. Second, the conceptualization process of grounded theory is done in hierarchical fashion, where individual codes emerge from the data but are then used to generate insight into more general concepts and thematic statements. But various works on grounded theory have failed to provide any systematic way of using data-specific levels of scale (the codes) to gain insight into more macro levels of scale (concepts and themes). We offer fractal concept analysis as a means of resolving both of these issues. By using a logic structure generator, fractal concept analysis delineates self-similar conceptual frameworks at various levels of abstraction, yielding a method for linking concepts together within and between the levels of scale encountered in the grounded theory coding and categorization process. We conclude that this fractal analytic technique can bolster the aims of grounded theory as a formalized and systematic process for generating theory from empirical data.

    Colonisation with pathogenic drug-resistant bacteria and Clostridioides difficile among residents of residential care facilities in Cape Town, South Africa: a cross-sectional prevalence study

    Background: Residential care facilities (RCFs) act as reservoirs for multidrug-resistant organisms (MDROs). There are scarce data on colonisation with MDROs in Africa. We aimed to determine the prevalence of MDROs and C. difficile, and risk factors for carriage, amongst residents of RCFs in Cape Town, South Africa. Methods: We performed a cross-sectional surveillance study at three RCFs. Chromogenic agar was used to screen skin swabs for methicillin-resistant S. aureus (MRSA) and stool samples for extended-spectrum beta-lactamase-producing Enterobacteriaceae (ESBL-E). Antigen testing and PCR were used to detect Clostridioides difficile. Risk factors for colonisation were determined with logistic regression. Results: One hundred fifty-four residents were enrolled, providing 119 stool samples and 152 sets of skin swabs. Twenty-seven (22.7%) stool samples were positive for ESBL-E, and 13 (8.6%) residents had at least one skin swab positive for MRSA. Two (1.6%) stool samples tested positive for C. difficile. Poor functional status (OR 1.3; 95% CI, 1.0–1.6) and incontinence (OR 2.9; 95% CI, 1.2–6.9) were significant predictors of ESBL-E colonisation. MRSA colonisation appeared higher in frail care areas (8/58 v 5/94, p = 0.07). Conclusions: There was a relatively high prevalence of colonisation with MDROs, particularly ESBL-E, but low C. difficile carriage, with implications for antibiotic prescribing and infection control practice.
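    The frail-care comparison quoted above (8/58 v 5/94, p = 0.07) can be reproduced from the counts alone. The sketch below is not from the paper; it assumes the reported p-value comes from a Pearson chi-squared test of the 2x2 table without continuity correction, and uses only the Python standard library:

    ```python
    from math import erfc, sqrt

    # 2x2 table reconstructed from the counts in the abstract (an assumption
    # about how the comparison was tabulated):
    # rows = frail care area vs other areas; columns = MRSA-positive vs negative.
    a, b = 8, 58 - 8    # frail care: positive, negative
    c, d = 5, 94 - 5    # other areas: positive, negative
    n = a + b + c + d

    # Pearson chi-squared statistic for a 2x2 table, no continuity correction.
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

    # With 1 degree of freedom, the chi-squared survival function reduces to
    # erfc(sqrt(x / 2)), so no scipy is needed.
    p = erfc(sqrt(chi2 / 2))
    print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
    ```

    The resulting p-value lands close to the 0.07 reported in the abstract, which is consistent with the uncorrected-chi-squared assumption.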

    Efficacy and Safety of Evolocumab in Reducing Lipids and Cardiovascular Events

    BACKGROUND: Evolocumab, a monoclonal antibody that inhibits proprotein convertase subtilisin-kexin type 9 (PCSK9), significantly reduced low-density lipoprotein (LDL) cholesterol levels in short-term studies. We conducted two extension studies to obtain longer-term data. METHODS: In two open-label, randomized trials, we enrolled 4465 patients who had completed 1 of 12 phase 2 or 3 studies ("parent trials") of evolocumab. Regardless of study-group assignments in the parent trials, eligible patients were randomly assigned in a 2:1 ratio to receive either evolocumab (140 mg every 2 weeks or 420 mg monthly) plus standard therapy or standard therapy alone. Patients were followed for a median of 11.1 months with assessment of lipid levels, safety, and (as a prespecified exploratory analysis) adjudicated cardiovascular events including death, myocardial infarction, unstable angina, coronary revascularization, stroke, transient ischemic attack, and heart failure. Data from the two trials were combined. RESULTS: As compared with standard therapy alone, evolocumab reduced the level of LDL cholesterol by 61%, from a median of 120 mg per deciliter to 48 mg per deciliter (P<0.001). Most adverse events occurred with similar frequency in the two groups, although neurocognitive events were reported more frequently in the evolocumab group. The risk of adverse events, including neurocognitive events, did not vary significantly according to the achieved level of LDL cholesterol. The rate of cardiovascular events at 1 year was reduced from 2.18% in the standard-therapy group to 0.95% in the evolocumab group (hazard ratio in the evolocumab group, 0.47; 95% confidence interval, 0.28 to 0.78; P=0.003). CONCLUSIONS: During approximately 1 year of therapy, the use of evolocumab plus standard therapy, as compared with standard therapy alone, significantly reduced LDL cholesterol levels and reduced the incidence of cardiovascular events in a prespecified but exploratory analysis. 
(Funded by Amgen; OSLER-1 and OSLER-2 ClinicalTrials.gov numbers, NCT01439880 and NCT01854918.)
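    As a back-of-the-envelope check (using only the event rates quoted in the abstract above, not patient-level trial data), the crude one-year risk ratio implied by those rates is consistent with the reported hazard ratio of 0.47:

    ```python
    # One-year cardiovascular event rates reported in the OSLER abstract above.
    rate_standard = 2.18 / 100    # standard therapy alone
    rate_evolocumab = 0.95 / 100  # evolocumab plus standard therapy

    # Crude risk ratio. A hazard ratio additionally accounts for time-to-event
    # and censoring, but over a short follow-up with rare events the two
    # should be of similar magnitude.
    risk_ratio = rate_evolocumab / rate_standard
    relative_reduction = 1 - risk_ratio
    print(f"crude risk ratio = {risk_ratio:.2f}, "
          f"relative reduction = {relative_reduction:.0%}")
    ```

    The crude ratio comes out near 0.44, in line with the adjudicated hazard ratio of 0.47 (95% CI, 0.28 to 0.78).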

    The Long-Baseline Neutrino Experiment: Exploring Fundamental Symmetries of the Universe

    The preponderance of matter over antimatter in the early Universe, the dynamics of the supernova bursts that produced the heavy elements necessary for life, and whether protons eventually decay --- these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our Universe, its current state and its eventual fate. The Long-Baseline Neutrino Experiment (LBNE) represents an extensively developed plan for a world-class experiment dedicated to addressing these questions. LBNE is conceived around three central components: (1) a new, high-intensity neutrino source generated from a megawatt-class proton accelerator at Fermi National Accelerator Laboratory, (2) a near neutrino detector just downstream of the source, and (3) a massive liquid argon time-projection chamber deployed as a far detector deep underground at the Sanford Underground Research Facility. This facility, located at the site of the former Homestake Mine in Lead, South Dakota, is approximately 1,300 km from the neutrino source at Fermilab -- a distance (baseline) that delivers optimal sensitivity to neutrino charge-parity symmetry violation and mass ordering effects. This ambitious yet cost-effective design incorporates scalability and flexibility and can accommodate a variety of upgrades and contributions. With its exceptional combination of experimental configuration, technical capabilities, and potential for transformative discoveries, LBNE promises to be a vital facility for the field of particle physics worldwide, providing physicists from around the globe with opportunities to collaborate in a twenty- to thirty-year program of exciting science. In this document we provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess.
    Comment: Major update of previous version; this is the reference document for the LBNE science program and current status, with the comprehensive overview given in Chapters 1, 3, and 9. 288 pages, 116 figures.

    Comprehensive Analysis of Transcript Start Sites in Ly49 Genes Reveals an Unexpected Relationship with Gene Function and a Lack of Upstream Promoters

    Comprehensive analysis of the transcription start sites of the Ly49 genes of C57BL/6 mice using the oligo-capping 5′-RACE technique revealed that the genes encoding the "missing self" inhibitory receptors, Ly49A, C, G, and I, were transcribed from multiple broad regions: in exon 1, in the intron 1/exon 2 region, and upstream of exon -1b. Ly49E was also transcribed in this manner, and uniquely showed a transcriptional shift from exon 1 to exon 2 when NK cells were activated in vitro with IL-2. Remarkably, a large proportion of Ly49E transcripts was then initiated from downstream of the translational start codon. By contrast, the genes encoding Ly49B and Q in myeloid cells, the activating Ly49D and H receptors in NK cells, and Ly49F in activated T cells were predominantly transcribed from a conserved site in a pyrimidine-rich region upstream of exon 1. An ~200 bp fragment from upstream of the Ly49B start site displayed tissue-specific promoter activity in dendritic cell lines, but the corresponding upstream fragments from all other Ly49 genes lacked detectable tissue-specific promoter activity. In particular, none displayed any significant activity in a newly developed adult NK cell line that expressed multiple Ly49 receptors. Similarly, no promoter activity could be found in fragments upstream of intron 1/exon 2. Collectively, these findings reveal a previously unrecognized relationship between the pattern of transcription and the expression/function of Ly49 receptors, and indicate that transcription of the Ly49 genes expressed in lymphoid cells is achieved in a manner that does not require classical upstream promoters.