Sonic stuff: objects and objectiles
PhD Thesis. This thesis investigates the role of objects in creative practice as alluring and evocative
materials that disrupt compositional intentions and trajectories. This research does not
begin from music as a cultural text but rather from the deeper experiences of sound as
resistant materials that animate experiential space with their own styles of atmosphere,
ambience and inaudible-audible signatures. Working across and often at the peripheries
of the theoretical disciplines of object-oriented ontology and process philosophy, I
address the philosophical issue of how sounds and objects possess the potential to
unsettle, agitate and reconfigure networks of relation.
Practice has informed a hybridisation of concepts derived from various disciplines,
which are held together by threads of fictionalised prose that contribute alternative
insights into the field of studio-based composition. This research employs a
phenomenological method of reduction and at times an object-oriented approach in
theorising the autonomous life of sounds and objects. Dense descriptions of
experiences, observations, thoughts and poetics form the basis for developing an
informed creative treatise. Deviating descriptions of sensuous experiences are
deployed throughout this research in order to find personal and meaningful ways of
articulating sonic encounter.
What are the multiple contours of Sonic Stuff? Is there an identity of sonic potential?
What tensions/relations occur between the composer, studio and sonic object? In what
form does Sonic Stuff reveal and characterise experiential time and space? What do the
concepts of the withdrawn and revealed afford an understanding of sonic objects and
sound in-itself?
Disposable Personal Goodwill, Frosty the Snowman, and Martin Ice Cream All Melt Away in the Bright Sunlight of Analysis
The current rage in dispositional tax planning for closely-held C corporations is to bifurcate the sale transaction into two components comprising: (a) a sale by (i) the target C corporation's shareholders of their target C corporation stock or (ii) the target C corporation of its assets; and (b) a sale by some or all of the target C corporation's shareholders of "personal goodwill" associated with the business conducted by the target C corporation. The documented purchase price paid for the first component of the transaction (either the stock of the C corporation or the assets of the C corporation) is based on a fair market value determination that excludes consideration of the personal goodwill component of the transaction. If successful, this tax planning technique allows the selling shareholders to report only shareholder-level capital gain on the personal goodwill component of the transaction and allows the buyer to claim that this portion of the purchase price is allocable to an acquired intangible, i.e., goodwill, that is amortizable over fifteen years under § 197. More specifically, from the selling shareholders' perspective, if the first component of the transaction involves a sale of the target C corporation's assets, the portion of the purchase price attributable to the personal goodwill component of the transaction does not bear the burden of a corporate level of taxation. From the buyer's perspective, if the first component of the transaction involves a purchase of the target C corporation's stock, the portion of the purchase price attributable to the personal goodwill component of the transaction is not capitalized into the stock. This planning is premised on the position that certain goodwill associated with the target C corporation's business can be, and is in fact, owned for tax purposes by one or more shareholders.
If all goodwill associated with the target C corporation's business activities were in fact owned for tax purposes by the target C corporation, then the personal goodwill component of the transaction is properly viewed as a sale by the target C corporation of such goodwill, creating a corporate-level gain, followed by a distribution from the target C corporation to the shareholders, which in turn creates a shareholder-level gain. If, however, the personal goodwill can be, and in fact is, owned by the selling shareholders and can be, and in fact is, sold by the selling shareholders to the buyer for tax purposes, then its disposition is not subject to corporate-level taxation. Although this planning has garnered much attention recently and could provide significant tax benefits if effective, we believe it deserves further scrutiny before being accepted as an appropriate component of dispositional tax planning for closely-held businesses. This planning technique also highlights the continuing horizontal equity problems associated with the current tax law's treatment of closely-held businesses. In Part II of this article, we discuss the place that this tax planning technique occupies within a historical context. In Part III, we set forth a substantive discussion of the issues raised by the technique. In Part IV, we discuss the tax policy implications that are raised by the existing application of the corporate income tax regime. Finally, in Part V, we offer some final thoughts about the implications of the analysis contained in this paper.
Examining Appropriacy of CFI and TLI Cutoff Value in Multiple-Group CFA Test of Measurement Invariance to Enhance Accuracy of Test Score Interpretation
The most common effect size when using a multiple-group confirmatory factor analysis approach to measurement invariance is ΔCFI and ΔTLI with a cutoff value of 0.01. However, this recommended cutoff value may not be ubiquitously appropriate and may be of limited application for some tests (e.g., measures using dichotomous items or different estimation methods, sample sizes, or model complexity). Moreover, prior cutoff value estimations have often ignored consequences, resulting in measures that estimate countries' or learners' proficiency more accurately for some countries or groups than for others. In this study, we investigate whether the cutoff value proposed by Cheung and Rensvold (ΔCFI or ΔTLI > 0.01) is appropriate across educational measurement contexts. Specifically, we investigated the performance of ΔCFI and ΔTLI in capturing lack of invariance (LOI) at the scalar level in dichotomous items within item response theory on groups whose test characteristic curves differed by 0.5. Simulation results showed that the proposed cutoff value of 0.01 in ΔCFI and ΔTLI was not appropriate to capture LOI under the study conditions, which may result in the misinterpretation of test results or inaccurate inferences.
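The decision rule being evaluated can be sketched in a few lines. The fit-index values below are hypothetical illustrations, not results from the study:

```python
# Hypothetical fit indices for two nested multiple-group CFA models;
# these numbers are illustrative, not results from the study.
fits = {
    "configural": {"cfi": 0.962, "tli": 0.955},
    "scalar":     {"cfi": 0.948, "tli": 0.940},
}

def invariance_holds(less_constrained, more_constrained, cutoff=0.01):
    """Cheung-Rensvold rule: invariance is rejected when CFI or TLI
    drops by more than `cutoff` under the added constraints."""
    delta_cfi = less_constrained["cfi"] - more_constrained["cfi"]
    delta_tli = less_constrained["tli"] - more_constrained["tli"]
    return delta_cfi <= cutoff and delta_tli <= cutoff

scalar_ok = invariance_holds(fits["configural"], fits["scalar"])
# Here ΔCFI = 0.014 > 0.01, so scalar invariance would be rejected.
```

The study's point is that this single fixed cutoff behaves differently across estimation methods, item types, and sample sizes, so the boolean above may be wrong for some measurement contexts.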
The cost-effectiveness of upfront point-of-care testing in the emergency department: a secondary analysis of a randomised, controlled trial
Abstract: Background: Time-saving is constantly sought after in the Emergency Department (ED), and Point-of-Care (POC) testing has been shown to be an effective time-saving intervention. However, when direct costs are compared, these tests commonly appear to be cost-prohibitive. Economic viability may become apparent when the time-saving is translated into financial benefits from staffing, time- and cost-saving. The purpose of this study was to evaluate the cost-effectiveness of diagnostic investigations utilised prior to medical contact for ED patients with common medical complaints. Methods: This was a secondary analysis of data from a prospective, randomised, controlled trial in order to assess the cost-effectiveness of upfront POC testing. Eleven combinations of POC equivalents of commonly-used special investigations (blood tests (i-STAT and complete blood count (CBC)), electrocardiograms (ECGs) and x-rays (LODOX® (Low Dose X-ray))) were evaluated compared to the standard ED pathway with traditional diagnostic tests. The economic viability of each permutation was assessed using the Incremental Cost-Effectiveness Ratio and Cost-Effectiveness Acceptability Curves. Expenses related to the POC test implementation were compared to the control group while taking staffing costs and time-saving into account. Results: There were 897 medical patients randomised to receive various combinations of POC tests. The most cost-effective combination was the i-STAT+CBC permutation which, based on the time saving, would ultimately save money if implemented. All LODOX®-containing permutations were costlier but still saved time. Non-LODOX® permutations were virtually 100% cost-effective if an additional cost of US$50 per patient was considered acceptable. Higher staffing costs would make using POC testing even more economical.
Conclusions: In certain combinations, upfront POC testing is more cost-effective than standard diagnostic testing for common undifferentiated ED medical presentations, the most economical POC test combination being the i-STAT + CBC. Upfront POC testing in the ED has the potential to not only save time but also to save money.
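The cost-effectiveness calculation used in such analyses can be sketched numerically. The per-patient figures below are invented for illustration and are not data from the trial:

```python
# Invented per-patient figures for illustration; not data from the trial.
cost_poc, cost_standard = 120.0, 85.0   # US$ per patient
time_poc, time_standard = 2.1, 3.0      # hours in the ED per patient

def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra cost per unit of
    benefit (here, US$ per ED hour saved by the POC pathway)."""
    return delta_cost / delta_effect

ratio = icer(cost_poc - cost_standard, time_standard - time_poc)
# A pathway is judged cost-effective when the ICER falls below a
# willingness-to-pay threshold (cf. the abstract's US$50-per-patient framing).
acceptable = ratio <= 50.0
```

A cost-effectiveness acceptability curve, as used in the study, is essentially this comparison repeated over a range of willingness-to-pay thresholds, reporting the probability of acceptability at each one.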
Activities to support the implementation of complex interventions as part of routine care: a review of the quality of reporting in cluster randomised controlled trials
Objective: To review a sample of cluster randomised controlled trials and explore the quality of reporting of (1) enabling or support activities provided to the staff during the trial, (2) strategies used to monitor fidelity throughout the trial and (3) the extent to which the intervention being tested was delivered as planned. Design: A descriptive review. Data sources and study selection: We searched MEDLINE for trial reports published between 2008 and 2014 with combinations of the search terms 'randomised', 'cluster', 'trial', 'study', 'intervention' and 'implement*'. We included trials in which healthcare professionals (HCPs) implemented the intervention being tested as part of routine practice. We excluded trials (1) conducted in non-health services settings, (2) where the intervention explicitly aimed to change the behaviours of the HCPs and (3) where the trials were ongoing or for which only trial protocols were available. Data collection: We developed a data extraction form using the Template for Intervention Description and Replication (TIDieR) checklist. Review authors independently extracted data from the included trials and assessed quality of reporting for individual items. Results: We included 70 publications (45 results publications, 25 related publications). 89% of trials reported using enabling or support activities. How these activities were provided (75.6%, n=34) and how much was provided (73.3%, n=33) were the most frequently reported items. Less than 20% (n=8) of the included trials reported that competency checking occurred prior to implementation and data collection. 64% (n=29) of trials reported collecting measures of implementation. 44% (n=20) of trials reported data from these measures. Conclusions: Although enabling and support activities are reported in trials, important gaps exist when assessed using an established checklist.
Better reporting of the support provided in effectiveness trials will allow informed decisions to be made about the financial and resource implications of wide-scale implementation of effective interventions.
Recurrent atypical fibroxanthoma of the limbus
We report an unusual presentation of recurrent atypical fibroxanthoma of the limbus. The clinical and histological appearance and management are discussed, and the current literature is reviewed.
Effect of Adjusting Pseudo-Guessing Parameter Estimates on Test Scaling When Item Parameter Drift Is Present
In item response theory test scaling/equating with the three-parameter model, the scaling coefficients A and B have no impact on the c-parameter estimates of the test items, since the c-parameter estimates are not adjusted in the scaling/equating procedure. The main research question in this study concerned how serious the consequences would be if c-parameter estimates are not adjusted in the test equating procedure when item parameter drift (IPD) is present. This drift is commonly observed in equating studies and hence has been the source of considerable research. The results from a series of Monte Carlo simulation studies conducted under 32 different combinations of conditions showed that some calibration strategies in the study, where the c-parameters were adjusted to be identical across two test forms, resulted in more robust equating performance in the presence of IPD. This paper discusses the practical effectiveness and the theoretical importance of appropriately adjusting c-parameter estimates in equating.
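The linear scaling in question can be illustrated with a short sketch. Under the three-parameter logistic model, placing one form's parameters on another form's scale with coefficients A and B transforms a to a/A and b to A·b + B, while c is conventionally left untouched. The parameter values below are hypothetical:

```python
def rescale_3pl(items, A, B):
    """Put 3PL item-parameter estimates (a, b, c) from one form onto the
    base form's scale using linear coefficients A and B.  Note that c,
    the pseudo-guessing parameter, is a probability and is conventionally
    left unchanged -- the practice whose consequences the paper examines."""
    return [{"a": a / A, "b": A * b + B, "c": c} for (a, b, c) in items]

# Hypothetical item-parameter estimates, not values from the simulations.
items = [(1.2, -0.5, 0.20), (0.8, 1.1, 0.25)]
rescaled = rescale_3pl(items, A=1.1, B=0.3)
```

Because c passes through unchanged, any drift in the pseudo-guessing estimates between forms survives the transformation, which is the scenario the simulations probe.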
Understanding the Role of Theory on Instrument Development: An Examination of Strengths and Weaknesses of Discriminant Validity Analysis Techniques
Numerous researchers have called attention to many important issues in instrument development throughout the relatively short history of the information systems (IS) academic research discipline (e.g., Petter, Straub, & Rai 2007; Straub, Boudreau, & Gefen 2004; Straub 1989). With the accumulation of knowledge related to the process of instrument development, it has become necessary to take a closer look at specific aspects of this process. This paper focuses on construct validity, specifically discriminant validity, and examines some popular methods of supporting this type of validity when using cross-sectional data. We examine the strengths and weaknesses of these analysis techniques with a focus on the role of theory and informed interpretation. We illustrate the applicability of these techniques by analyzing a sample dataset in which we theorize two constructs to be highly correlated. With this paper, we provide both researchers and reviewers with a greater understanding of the highlighted discriminant validity analysis techniques.
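The abstract does not enumerate the specific techniques it examines; as one widely used example of a discriminant validity check, the Fornell-Larcker criterion compares each construct's average variance extracted (AVE) against the squared inter-construct correlation. The loadings below are made up:

```python
def ave(loadings):
    """Average variance extracted from standardized indicator loadings."""
    return sum(l * l for l in loadings) / len(loadings)

def fornell_larcker(loadings_x, loadings_y, corr_xy):
    """Discriminant validity is supported when each construct's AVE
    exceeds the squared correlation between the two constructs."""
    return min(ave(loadings_x), ave(loadings_y)) > corr_xy ** 2

# Hypothetical standardized loadings for two highly correlated constructs
x = [0.82, 0.78, 0.85]
y = [0.75, 0.80, 0.72]
distinct = fornell_larcker(x, y, corr_xy=0.70)
```

The paper's argument is that a mechanical pass/fail from such a rule is insufficient without theory: two constructs theorized to be highly correlated may legitimately fail it.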
Risk factors for COPD exacerbations in inhaled medication users: the COPDGene study biannual longitudinal follow-up prospective cohort.
Background: Despite inhaled medications that decrease exacerbation risk, some COPD patients experience frequent exacerbations. We determined prospective risk factors for exacerbations among subjects in the COPDGene Study taking inhaled medications. Methods: 2113 COPD subjects were categorized into four medication use patterns: triple therapy with tiotropium (TIO) plus long-acting beta-agonist/inhaled corticosteroid (ICS ± LABA), tiotropium alone, ICS ± LABA, and short-acting bronchodilators. Self-reported exacerbations were recorded in telephone and web-based longitudinal follow-up surveys. Associations with exacerbations were determined within each medication group using four separate logistic regression models. A head-to-head analysis compared exacerbation risk among subjects using tiotropium vs. ICS ± LABA. Results: In separate logistic regression models, the presence of gastroesophageal reflux, female gender, and higher scores on the St. George's Respiratory Questionnaire (SGRQ) were significant predictors of exacerbator status within multiple medication groups (reflux: OR 1.62-2.75; female gender: OR 1.53-1.90; SGRQ: OR 1.02-1.03). Subjects taking either ICS ± LABA or tiotropium had similar baseline characteristics, allowing comparison between these two groups. In the head-to-head comparison, tiotropium users showed a trend towards lower rates of exacerbations (OR = 0.69 [95% CI 0.45, 1.06], p = 0.09) compared with ICS ± LABA users, especially in subjects without comorbid asthma (OR = 0.56 [95% CI 0.31, 1.00], p = 0.05). Conclusions: Each common COPD medication usage group showed unique risk factor patterns associated with increased risk of exacerbations, which may help clinicians identify subjects at risk. Compared to similar subjects using ICS ± LABA, those taking tiotropium showed a trend towards reduced exacerbation risk, especially in subjects without asthma. Trial registration: ClinicalTrials.gov NCT00608764, first received 1/28/2008.
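The odds ratios and confidence intervals reported above come from logistic regression. As a generic illustration, using a made-up coefficient rather than study data, a logit coefficient and its standard error convert to an OR with 95% CI as follows:

```python
import math

def or_with_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with an approximate 95% confidence interval."""
    return (math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se))

# Made-up coefficient for a binary risk factor; not study data.
odds_ratio, ci_low, ci_high = or_with_ci(beta=0.48, se=0.22)
```

When, as in the head-to-head comparison above, the interval includes 1.0, the association is reported as a trend rather than a statistically significant effect.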
On the evaluation of methods for the recovery of plant root systems from X-ray computed tomography images
X-ray micro computed tomography (µCT) allows non-destructive visualisation of plant root systems within their soil environment and thus offers an alternative to commonly used destructive methodologies for the examination of plant roots and their interaction with the surrounding soil. Various methods for the recovery of root system information from X-ray CT image data have been presented in the literature. Detailed, ideally quantitative, evaluation is essential, in order to determine the accuracy and limitations of the proposed methods, and to allow potential users to make informed choices between them. This, however, is a complicated task. Three-dimensional ground truth data is expensive to produce, and the complexity of X-ray CT data means that manually generated ground truth may not be definitive. Similarly, artificially generated data is not entirely representative of real samples. The aims of this work are to raise awareness of the evaluation problem and to propose experimental approaches that allow the performance of root extraction methods to be assessed, ultimately improving the techniques available. To illustrate the issues, tests are conducted using both artificially generated images and real data samples.
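One common way to quantify agreement between a recovered root system and a ground-truth segmentation, offered here as a generic illustration rather than the metric used in the paper, is a voxel-overlap score such as the Dice coefficient:

```python
def dice(pred, truth):
    """Dice overlap between two voxel sets (e.g. recovered vs. reference
    root voxels): 1.0 is perfect agreement, 0.0 is no overlap."""
    pred, truth = set(pred), set(truth)
    if not pred and not truth:
        return 1.0
    return 2 * len(pred & truth) / (len(pred) + len(truth))

# Tiny made-up voxel coordinate sets, purely for illustration
recovered = {(0, 0, 0), (0, 0, 1), (0, 1, 1)}
reference = {(0, 0, 0), (0, 0, 1), (1, 1, 1)}
overlap = dice(recovered, reference)   # 2*2 / (3+3)
```

The difficulty the paper highlights is precisely that the reference set itself is uncertain: scores like this are only as trustworthy as the ground truth they are computed against.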