56 research outputs found
Consumer Bankruptcy Practice in Kentucky: Chapter 7 Practice
A handbook for Kentucky practitioners covering Chapter 7 debtors, automatic stay and abandonment, dealing with secured creditors, discharging debts, bankrupt estates, and bankruptcy ethics
Science and Ideology in Economic, Political, and Social Thought
This paper has two sources. One is my own research in three broad areas: business cycles, economic measurement, and social choice. In all of these fields I attempted to apply the basic precepts of the scientific method as it is understood in the natural sciences. I found that my effort at using natural-science methods in economics was met with little understanding and often considerable hostility. I found economics to be driven less by common sense and empirical evidence than by various ideologies that exhibited either a political or a methodological bias, or both. This brings me to the second source: several books have appeared recently that describe in historical terms the ideological forces that have shaped either the specific areas in which I worked, or a broader background. These books taught me that the ideological forces in the social sciences are even stronger than I had imagined on the basis of my own experiences.
The scientific method is the antipode to ideology. I feel that the scientific work that I have done on specific, long-standing, and fundamental problems in economics and political science has given me additional insights into the destructive role of ideology, beyond the history-of-thought orientation of the works I will be discussing.
Duhemian Themes in Expected Utility Theory
This monographic chapter explains how expected utility (EU) theory arose in von Neumann and Morgenstern, how it was called into question by Allais and others, and how it gave way to non-EU theories, at least among the specialized quarters of decision theory. I organize the narrative around the idea that the successive theoretical moves amounted to resolving Duhem-Quine underdetermination problems, so they can be assessed in terms of the philosophical recommendations made to overcome these problems. I actually follow Duhem's recommendation, which was essentially to rely on the passing of time to make many experiments and arguments available, and eventually strike a balance between competing theories on the basis of this improved knowledge. Although Duhem's solution seems disappointingly vague, relying as it does on "bon sens" to bring an end to the temporal process, I do not think there is any better one in the philosophical literature, and I apply it here for what it is worth.
In this perspective, EU theorists were justified in resisting the first attempts at refuting their theory, including Allais's in the 1950s, but they would have lacked "bon sens" in not acknowledging their defeat in the 1980s, after the long process of pros and cons had sufficiently matured.
This primary Duhemian theme is combined with a secondary theme: normativity. I suggest that EU theory was normative at its very beginning and has remained so all along, and I express dissatisfaction with the orthodox view that it could be treated as a straightforward descriptive theory for purposes of prediction and scientific test. This view is usually accompanied by a faulty historical reconstruction, according to which EU theorists initially formulated the VNM axioms descriptively and retreated to a normative construal once they felt threatened by empirical refutation. From my historical study, things did not evolve in this way, and the theory was both proposed and rebutted on the basis of normative arguments already in the 1950s. The ensuing, major problem was to make choice experiments compatible with this inherently normative feature of the theory. Compatibility was obtained in some experiments, but implicitly and somewhat confusingly, for instance by excluding overtly incoherent subjects or by creating strong incentives for the subjects to reflect on the questions and provide answers they would be able to defend.
I also claim that Allais had an intuition of how to combine testability and normativity, unlike most later experimenters, and that it would have been more fruitful to work from his intuition than to make choice experiments of the naively empirical style that flourished after him.
In sum, it can be said that the underdetermination process accompanying EUT was resolved in a Duhemian way, but not without major inefficiencies. Embodying explicit rationality considerations into experimental schemes right from the beginning would have limited the scope of empirical research, avoided wasting resources on merely minor findings, and sped up the Duhemian process of groping towards a choice among competing theories.
Consumer Bankruptcy Update
Materials from the Consumer Bankruptcy Update presentations held by UK/CLE in December 2000
A flexible framework for sparse simultaneous component based data integration
<p>Abstract</p> <p>1 Background</p> <p>High-throughput data are complex, and methods that reveal the structure underlying the data are most useful. Principal component analysis, frequently implemented as a singular value decomposition, is a popular technique in this respect. Nowadays the challenge is often to reveal structure in several sources of information (e.g., transcriptomics, proteomics) that are available for the same biological entities under study. Simultaneous component methods are most promising in this respect. However, the interpretation of the principal and simultaneous components is often daunting because contributions of each of the biomolecules (transcripts, proteins) have to be taken into account.</p> <p>2 Results</p> <p>We propose a sparse simultaneous component method that makes many of the parameters redundant by shrinking them to zero. It includes principal component analysis, sparse principal component analysis, and ordinary simultaneous component analysis as special cases. Several penalties can be tuned that account in different ways for the block structure present in the integrated data. This recovers known sparse approaches such as the lasso, the ridge penalty, the elastic net, the group lasso, the sparse group lasso, and the elitist lasso. In addition, the algorithmic results can easily be transposed to the context of regression. Metabolomics data obtained with two measurement platforms for the same set of <it>Escherichia coli </it>samples are used to illustrate the proposed methodology and the properties of the different penalties with respect to sparseness across and within data blocks.</p> <p>3 Conclusion</p> <p>Sparse simultaneous component analysis is a useful method for data integration: first, simultaneous analyses of multiple blocks offer advantages over sequential and separate analyses, and second, interpretation of the results is highly facilitated by their sparseness.
The approach offered is flexible and allows the block structure to be taken into account in different ways. As such, structures can be found that are exclusively tied to one data platform (group lasso approach) as well as structures that involve all data platforms (elitist lasso approach).</p> <p>4 Availability</p> <p>The additional file contains a MATLAB implementation of the sparse simultaneous component method.</p>
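The core idea of this family of methods can be sketched in a few lines of Python. This is a toy illustration under stated assumptions, not the authors' MATLAB implementation: it shows only the plain lasso penalty (no group or elitist variants), uses a simple alternating least-squares scheme with soft-thresholding on the loadings, and all function names and synthetic data are hypothetical.

```python
import numpy as np

def soft_threshold(x, lam):
    # Elementwise soft-thresholding: the proximal operator of the L1 (lasso)
    # penalty; entries smaller than lam in magnitude become exactly zero.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_simultaneous_components(blocks, n_comp=2, lam=0.5, n_iter=100):
    # Toy sparse simultaneous component analysis: the data blocks share
    # samples (rows); their columns are concatenated and decomposed into
    # common scores T and sparse loadings P by alternating least squares
    # with soft-thresholding applied to the loadings.
    X = np.hstack(blocks)                        # (n_samples, total_vars)
    # Initialize from an ordinary SVD (= plain simultaneous components).
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    T = U[:, :n_comp] * s[:n_comp]               # component scores
    P = Vt[:n_comp].T                            # loadings
    for _ in range(n_iter):
        # Regress X on the scores, then shrink small loadings to zero.
        P = soft_threshold(X.T @ T @ np.linalg.pinv(T.T @ T), lam)
        # Update scores by least squares given the current sparse loadings.
        T = X @ P @ np.linalg.pinv(P.T @ P)
    return T, P

# Hypothetical example: two noisy blocks driven by the same two components.
rng = np.random.default_rng(0)
scores = rng.normal(size=(40, 2))
block_a = scores @ rng.normal(size=(2, 25)) + 0.1 * rng.normal(size=(40, 25))
block_b = scores @ rng.normal(size=(2, 10)) + 0.1 * rng.normal(size=(40, 10))
T, P = sparse_simultaneous_components([block_a, block_b])
```

Setting lam = 0 reduces this sketch to ordinary simultaneous component analysis; a group-wise penalty on the per-block slices of P would correspond to the group lasso variant described above.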
Real-Life Evidence for Tedizolid Phosphate in the Treatment of Cellulitis and Wound Infections: A Case Series
Introduction
Tedizolid phosphate 200 mg, once daily for 6 days, has recently been approved for the treatment of patients with acute bacterial skin and skin structure infections (ABSSSIs) in several countries; however, clinical experience in real-life settings is currently limited. Here, we report on the use of tedizolid with an extended treatment duration for complex and severe ABSSSIs in real-world clinical settings.
Methods
Two patients with cellulitis and two patients with surgical site infection (SSI), aged 26-60 years, were treated with tedizolid phosphate 200 mg, intravenous/oral (IV/PO) or IV only, once daily at four different institutions.
Results
Two morbidly obese patients had non-necrotizing, non-purulent severe cellulitis, complicated by sepsis or by systemic inflammatory response syndrome plus myositis. One female patient failed first-line empiric therapy with IV cefalotin, clindamycin and imipenem (3-4 days) and was switched to IV/PO tedizolid (7 + 5 days). One male patient received IV clindamycin plus IV/PO tedizolid (5 + 5 days), but clindamycin was discontinued on Day 3 due to an adverse event. For both patients, clinical signs and symptoms improved within 72 h, and laboratory results normalized by Days 7 and 8, respectively. Two other patients (one an obese, diabetic female with chronic hepatitis and chronic obstructive pulmonary disease) had complicated SSIs occurring 10 days after hernia repair with mesh or 3 months after spinal fusion surgery with a metal implant. The first patient, with previous methicillin-resistant Staphylococcus aureus (MRSA) bacteremia, received a 7-day empiric IV tedizolid course. The second patient, with culture-confirmed MRSA infection, received a 14-day IV course. Both patients responded within 72 h, and local and systemic signs normalized by the end of treatment. There were no reports of thrombocytopenia.
Conclusion
Tedizolid phosphate 200 mg for 7-14 days was a favored treatment option for patients with severe/complex ABSSSIs, and was effective following previous treatment failure or in late-onset infections.
Assumption without representation: the unacknowledged abstraction from communities and social goods
We have not clearly acknowledged the abstraction from unpriceable "social goods" (derived from communities) which, unlike private and public goods, simply disappear if an attempt is made to market them. Their separability from markets and economics has not been argued, much less established. Acknowledging communities would reinforce rather than undermine them, and thus facilitate the production of social goods. But it would also help economics by facilitating our understanding of, and response to, financial crises as well as environmental destruction and many social problems, and by reducing the alienation from economics often felt by students and the public.
Omecamtiv mecarbil in chronic heart failure with reduced ejection fraction, GALACTIC-HF: baseline characteristics and comparison with contemporary clinical trials
Aims:
The safety and efficacy of the novel selective cardiac myosin activator, omecamtiv mecarbil, in patients with heart failure with reduced ejection fraction (HFrEF) are being tested in the Global Approach to Lowering Adverse Cardiac outcomes Through Improving Contractility in Heart Failure (GALACTIC-HF) trial. Here we describe the baseline characteristics of participants in GALACTIC-HF and how these compare with other contemporary trials.
Methods and Results:
Adults with established HFrEF, New York Heart Association (NYHA) functional class ≥ II, EF ≤ 35%, elevated natriuretic peptides, and either current hospitalization for HF or a history of hospitalization/emergency department visit for HF within a year were randomized to either placebo or omecamtiv mecarbil (pharmacokinetic-guided dosing: 25, 37.5 or 50 mg bid). 8256 patients [male (79%), non-white (22%), mean age 65 years] were enrolled with a mean EF of 27%, ischemic etiology in 54%, NYHA class II in 53% and III/IV in 47%, and median NT-proBNP of 1971 pg/mL. HF therapies at baseline were among the most effectively employed in contemporary HF trials. GALACTIC-HF randomized patients representative of recent HF registries and trials, with substantial numbers of patients also having characteristics understudied in previous trials, including more from North America (n = 1386), enrolled as inpatients (n = 2084), with systolic blood pressure < 100 mmHg (n = 1127), with estimated glomerular filtration rate < 30 mL/min/1.73 m² (n = 528), and treated with sacubitril-valsartan at baseline (n = 1594).
Conclusions:
GALACTIC-HF enrolled a well-treated, high-risk population from both inpatient and outpatient settings, which will provide a definitive evaluation of the efficacy and safety of this novel therapy, as well as inform its potential future implementation.
- …