
    Directors’ Use of Data in Decision Making in 21st Century Community Learning Center Grants

    Evidence of the positive impact of youth participation in 21st Century Community Learning Center (CCLC) grants remains mixed (Leos-Urbel, 2015). Reports of conflicting results may damage the reputation of 21st CCLC programs, potentially reducing support and the capacity with which they can serve at-risk youth. Stakeholders such as school districts and policy makers lack an understanding of the data on the effectiveness of these 21st CCLC after-school programs, which puts current funding at risk (Farrell, Collier‐Meek, & Furman, 2019). Throughout the field of education, policy makers increasingly expect evidence-based practices grounded in research and data to guide practical decision making (Mahoney, 2016, p. 35). Success in today’s data-oriented environment requires program leaders and stakeholders to think analytically about data (Provost & Fawcett, 2013). The underlying concern is a science-to-practice gap that further undermines the credibility of the 21st CCLC initiative (Mahoney, 2016, p. 34). The problem this research addresses, and its importance, is that how directors collect, analyze, and report data may be contributing to the lack of stakeholder understanding and the mixed reviews of program success. This formative research examines how 21st CCLC directors use data as the basis for decisions about their programs. “The question is not if or why programs are successful but how is data used at program level to drive decisions and program improvement” (Granger, 2010). Through semi-structured interviews with Washington State 21st CCLC grant directors, this inquiry seeks to identify the professional development directors need in order to collect program data accurately, to analyze and develop a deeper understanding of those data, and to implement program improvements and decisions based on the data.

    Hassle

    Before police perform a search or seizure, they typically must meet the probable cause or reasonable suspicion standard. Moreover, even if they meet the appropriate standard, their evidence must be individualized to the suspect and cannot rely on purely probabilistic inferences. Scholars and courts have long defended the distinction between individualized and purely probabilistic evidence, but existing theories of individualization fail to articulate principles that are descriptively accurate or normatively desirable. They overlook the only benefit that the individualization requirement can offer: reducing hassle. Hassle measures the chance that an innocent person will experience a search or seizure. Because some investigation methods meet the relevant suspicion standards but nevertheless impose too many stops and searches on the innocent, courts must have a lever independent of the suspicion standard to constrain the scope of criminal investigations. The individualization requirement has unwittingly performed this function, but not in an optimal way. Individualization has kept hassle low by entrenching old methods of investigation. Because courts designate practices as individualized when they are costly (for example, gumshoe methods) or lucky (for example, tips), the requirement has confined law enforcement to practices that cannot scale. New investigation methods such as facial-recognition software and pattern-based data mining, by contrast, can scale up law-enforcement activities very quickly. Although these innovations have the potential to increase the accuracy of stops and searches, they may also increase the total number of innocent individuals searched because of the innovations’ speed and cost-effectiveness. By reforming individualization to minimize hassle, courts can enable law-enforcement innovations that are fairer and more accurate than traditional police investigations without increasing burdens on the innocent.
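    The trade-off at the heart of this abstract lends itself to a back-of-the-envelope computation. A minimal Python sketch follows, using invented stop counts and hit rates (none of these figures appear in the article) to show how a method that is more accurate per stop can still burden more innocent people in total once it scales.

```python
# Hedged illustration of "hassle": the expected number of innocent people
# subjected to a search or seizure under a given investigation method.
# All numbers below are invented for illustration.

def innocents_hassled(stops: int, hit_rate: float) -> float:
    """Expected count of innocent people stopped.

    stops    -- total stops or searches the method produces
    hit_rate -- fraction of stops that target a guilty person
    """
    return stops * (1.0 - hit_rate)

# A costly "gumshoe" method: few stops, moderate accuracy.
gumshoe = innocents_hassled(stops=100, hit_rate=0.60)         # 40 innocents

# A scalable method (e.g., pattern-based data mining): more accurate
# per stop, but so cheap and fast that total hassle rises anyway.
data_mining = innocents_hassled(stops=10_000, hit_rate=0.90)  # 1,000 innocents

print(f"gumshoe: {gumshoe:.0f} innocents burdened")
print(f"data mining: {data_mining:.0f} innocents burdened")
```

    Holding the suspicion standard fixed, scale alone drives hassle up, which is why the abstract argues for a lever independent of that standard.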

    Surface-Treated versus Untreated Large-Bore Catheters as Vascular Access in Hemodialysis and Apheresis Treatments

    Background. Catheter-related infections, thrombosis, and stenosis are among the most frequent complications associated with catheters inserted into vessels. Surface-treatment processes applied to the outer surface, such as ion-beam-assisted deposition, can be used to mitigate such complications. Methods. This retrospective study (1992–2007) evaluated silver-coated (54 patients) and noncoated (105 patients) implanted large-bore catheters used for extracorporeal detoxification. The catheters were inserted into the internal jugular or subclavian veins. After removal, the catheters were cultured for bacterial colonization using standard microbiologic assays and were examined with a scanning electron microscope. Results. The silver-coated catheters showed a tendency toward longer in situ time. Microbiologic examination of the catheter tips was highly positive for both catheter types, with no significant difference between them. Conclusion. On evaluation of all data collected in this retrospective study, the silver-coated catheters showed no significant reduction in infection rate, and neither catheter type was associated with significant savings in treatment costs or reductions in patient discomfort. Other newly developed catheter materials, such as those with a microdomain-structured inner and outer surface, are considered more biocompatible because they mimic the structure of natural biological surfaces.

    The New Intrusion

    The article focuses on a new taxonomy introduced for organizing privacy regulations across several stages of information flow, including observation, capture, and dissemination. It states that the tort of intrusion upon seclusion imposes liability and provides recourse for the observation of data. It also highlights the application of intrusion in modern settings such as web-tracking technologies and global positioning systems.

    LDL-Apheresis: Technical and Clinical Aspects

    The prognosis of patients suffering from severe hyperlipidemia, sometimes combined with elevated lipoprotein(a) levels, and coronary heart disease refractory to diet and lipid-lowering drugs is poor. For such patients, regular treatment with low-density lipoprotein (LDL) apheresis is the therapeutic option. Today, five different LDL-apheresis systems are available: cascade filtration (lipid filtration), immunoadsorption, heparin-induced LDL precipitation, dextran sulfate LDL adsorption, and LDL hemoperfusion. There is a strong correlation between hyperlipidemia and atherosclerosis. Besides the elimination of other risk factors, therapeutic strategies in severe hyperlipidemia should focus on a drastic reduction of serum lipoproteins. Despite maximum conventional therapy with a combination of different kinds of lipid-lowering drugs, the goal of therapy sometimes cannot be reached; in such patients, treatment with LDL-apheresis is indicated. Technical and clinical aspects of these five LDL-apheresis methods are presented here. No significant differences among the methods were observed with respect to cholesterol or triglyceride reduction. With respect to elevated lipoprotein(a) levels, however, the immunoadsorption method seems to be most effective. The published data clearly demonstrate that treatment with LDL-apheresis in patients suffering from severe hyperlipidemia refractory to maximum conservative therapy is effective and safe in long-term application.

    The MacGuffin and the Net: Taking Internet Listeners Seriously

    To date, listeners and readers have played little more than bit parts in First Amendment jurisprudence. The advent of digital networked communication over the Internet supports moving these interests to center stage in free speech doctrine and offers new empirical data to evaluate the regulation of online information. Such a shift will have important and unexpected consequences for other areas, including ones seemingly orthogonal to First Amendment concerns. This Essay explores likely shifts in areas that include intellectual property, tort, and civil procedure, all of which have been able to neglect certain free speech issues because of the lack of listener interests in the canon. For good or ill, these doctrines will be forced to evolve by free speech precedent that prioritizes consumers.

    Information Hacking

    The 2016 U.S. presidential election is seen as a masterpiece of effective disinformation tactics. Commentators credit the Russian Federation with a set of targeted, effective information interventions that led to the surprise election of Republican candidate Donald Trump. On this account, Russia hacked not only America’s voting systems, but also American voters, plying them with inaccurate data, especially on Internet platforms, that changed political views. This Essay examines the 2016 election narrative through the lens of cybersecurity; it treats foreign efforts to influence the outcome as information hacking. It critically assesses unstated assumptions of the narrative, including whether these attacks can be replicated; the size of their effect; the role of key influencers in targeted groups; and the normative claim that citizens voted against their preferences. Next, the Essay offers examples of other successful information hacks and argues that these attacks have multiple, occasionally conflicting goals. It uses lessons from cybersecurity to analyze possible responses, including prevention, remediation, and education. Finally, it draws upon the security literature to propose quarantines for suspect information, protection of critical human infrastructure, and whitelists as tactics that defenders might usefully employ to counteract political disinformation efforts.
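    As a rough illustration of the whitelist tactic the Essay borrows from cybersecurity, the Python sketch below admits only content whose source appears on a pre-vetted list. The trusted-source set, URLs, and the allow function are hypothetical placeholders for illustration, not a mechanism the Essay specifies.

```python
# Minimal sketch of a source whitelist for political information,
# in the spirit of application whitelisting from cybersecurity.
# The trusted-source set below is a hypothetical placeholder.

from urllib.parse import urlparse

TRUSTED_SOURCES = {"apnews.com", "reuters.com"}  # hypothetical whitelist

def allow(url: str) -> bool:
    """Admit a URL only if its host is on the whitelist."""
    host = (urlparse(url).hostname or "").lower()
    host = host[4:] if host.startswith("www.") else host  # drop leading "www."
    return host in TRUSTED_SOURCES

print(allow("https://www.reuters.com/world/story"))  # True
print(allow("https://totally-real-news.example/x"))  # False
```

    A whitelist inverts the usual moderation default: instead of blocking known-bad sources, it passes only known-good ones, which is why the Essay treats it as a distinct tactic from quarantine.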

    Conundrum
