Appraisal of patient-reported outcome measures in analogous diseases and recommendations for use in phase II and III clinical trials of pyruvate kinase deficiency
Purpose: Pyruvate kinase deficiency (PKD) is a rare disease, and understanding of its epidemiology and associated burden remains limited. With no current curative therapy, clinical manifestations can be life threatening; they are managed by maintaining adequate hemoglobin levels through transfusion and supportive care, but with frequent complications. Treatment goals are to maintain or improve the patient’s quality of life. With new therapies emerging, reliable, valid, and relevant patient-reported outcome (PRO) tools are required for use in clinical trials. Methods: A systematic literature search identified no existing PRO tools for capturing and measuring the impact of PKD and its treatments in clinical trials. The search strategy was therefore revised to consider conditions analogous to PKD in terms of symptoms and impacts that might serve as parallels to the PKD experience; these included sickle cell anemia, thalassemia, and hemolytic anemia. The psychometric properties, strengths, and weaknesses of appropriate PRO instruments were compared, and recommendations made for the choice of PRO tools. Results: In adult populations, the EORTC QLQ-C30 and SF-36v2 are recommended, the former as a basic minimum covering generic HRQoL and core symptoms such as fatigue. In pediatric populations, the PedsQL Generic Core Scale is recommended to measure HRQoL and the PedsQL MFS scale to measure fatigue. Conclusions: Some symptoms and life impacts may be unique to PKD and not observable in analogous conditions. A ‘Physico-Psychosocial Model’ derived from the ‘Medical Model’ is proposed as the basis for a hypothesized conceptual framework to guide the development of PKD-specific PRO instruments.
Who Put the Quo in Quid Pro Quo?: Why Courts Should Apply McDonnell’s “Official Act” Definition Narrowly
Federal prosecutors have several tools at their disposal to bring criminal charges against state and local officials for their engagement in corrupt activity. Section 666 federal funds bribery and § 1951 Hobbs Act extortion, two such statutory tools, have coexisted for the past thirty-six years, during which time § 666 has seen an increasing share of total prosecutions while the Hobbs Act’s share has fallen commensurately. In the summer of 2016, the U.S. Supreme Court decided McDonnell v. United States, a decision that threatens to quicken the demise of Hobbs Act extortion in favor of § 666. If McDonnell is interpreted to apply to Hobbs Act extortion but not to § 666, we can expect the latter to become the unchallenged favorite of federal prosecutors, along with increased litigation over whether § 666 bribery contains a quid pro quo requirement. This is likely given § 666’s coverage of the same corrupt behavior, its expansive jurisdictional hook, and, following McDonnell, the lower difficulty of proving violations within some circuits. To avoid this eventuality, lower courts should distinguish McDonnell because of its unique procedural posture and continue to apply the existing quid pro quo framework. Before meaningful change to our federal bribery statutes can take place, the courts of appeals must first find consensus over whether and when § 666 requires the government to prove the existence of a quid pro quo.
A Tannakian interpretation of the elliptic infinitesimal braid Lie algebras
Let . The pro-unipotent completion of the pure braid group of
points on a genus 1 surface has been shown to be isomorphic to an explicit
pro-unipotent group with graded Lie algebra using two types of tools: (a)
minimal models (Bezrukavnikov), (b) the choice of a complex structure on the
genus 1 surface, making it into an elliptic curve , and an appropriate flat
connection on the configuration space of points in (joint work of the
authors with D. Calaque). Following a suggestion by P. Deligne, we give an
interpretation of this isomorphism in the framework of the Riemann-Hilbert
correspondence, using the total space of an affine line bundle over ,
which identifies with the moduli space of line bundles over equipped with a
flat connection.Comment: 52 pages, dedicated to A.A. Kirillov on his 81th birthda
Modern CACSD using the Robust-Control Toolbox
The Robust-Control Toolbox is a collection of 40 M-files which extend the capability of PC/PRO-MATLAB to do modern multivariable robust control system design. Included are robust analysis tools such as singular values and structured singular values; robust synthesis tools such as continuous/discrete H2/H-infinity synthesis and Linear Quadratic Gaussian Loop Transfer Recovery methods; and a variety of robust model reduction tools such as Hankel approximation, balanced truncation, and balanced stochastic truncation. The capabilities of the toolbox are described and illustrated with examples to show how easily they can be used in practice. Examples include structured singular value analysis, H-infinity loop-shaping, and large space structure model reduction.
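As a hedged illustration of the singular-value analysis mentioned above (a Python/NumPy sketch, not the toolbox's MATLAB API): the singular values of a MIMO frequency-response matrix bound the system's gain at that frequency, which is the basic quantity in robustness analysis.

```python
import numpy as np

def singular_values(G):
    """Return the singular values of a complex frequency-response matrix G.

    The largest and smallest singular values bound the gain of a MIMO
    system at this frequency, the basic quantity in robust analysis.
    """
    return np.linalg.svd(G, compute_uv=False)

# Example: a hypothetical 2x2 frequency-response matrix at one frequency
G = np.array([[1.0 + 0.5j, 0.2],
              [0.1, 0.8 - 0.3j]])
sv = singular_values(G)
print(sv.max(), sv.min())  # maximum and minimum gain at this frequency
```

NumPy returns the singular values sorted in descending order, so `sv[0]` and `sv[-1]` give the gain bounds directly.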
Cutting tracks, making CDs: a comparative study of audio time-correction techniques in the desktop age.
Producers have long sought to “tighten” studio performances. Software-based DAWs now come with proprietary functions to facilitate this, but only the latest generation of platforms allows relative ease of use on longer takes. Each method has advantages and disadvantages in terms of ease and speed of use, transient preservation, implied subsequent workflow, and (usually) unwanted artifacts. Whilst rhythmically consistent material with clear transients is readily controllable with contemporary tools, working with complex mixtures of note-values still presents a challenge and requires much user intervention.
This paper performs a comparative study of different audio quantization techniques on percussive material, often on rhythmically complex performances. It seeks to compare the necessary methodologies and workflow implications through the use of several contemporary systems: Recycle, Pro Tools, Logic, Cubase, Live, Melodyne, and Nuendo. The current level of man-machine interaction is explored, and the best features from each platform are collated. A model for the future is speculatively presented.
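The core operation these platforms share, snapping detected note onsets toward the nearest grid position, optionally by only a fraction of the distance (often called "strength"), can be sketched as follows. The function name and parameters are illustrative, not any DAW's actual API.

```python
def quantize(onsets, grid, strength=1.0):
    """Move each onset time (in seconds) toward the nearest multiple of `grid`.

    strength=1.0 snaps fully to the grid; smaller values move onsets only
    part of the way, preserving some of the original performance's feel.
    """
    quantized = []
    for t in onsets:
        target = round(t / grid) * grid   # nearest grid line
        quantized.append(t + strength * (target - t))
    return quantized

# Eighth-note grid at 120 BPM: 0.25 s per division
hits = [0.02, 0.27, 0.49, 0.77]
print(quantize(hits, 0.25))        # snapped fully to the grid
print(quantize(hits, 0.25, 0.5))   # moved halfway toward the grid
```

Partial-strength quantization is one way real tools trade transient accuracy against preserving feel; transient preservation itself (time-stretching the audio between moved onsets) is a separate, harder problem that the paper's comparison addresses.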
Separating cyclic subgroups in graph products of groups
We prove that the property of being cyclic subgroup separable, that is having
all cyclic subgroups closed in the profinite topology, is preserved under
forming graph products.
Furthermore, we develop the tools to study the analogous question in the
pro- case. For a wide class of groups we show that the relevant cyclic
subgroups - which are called -isolated - are closed in the pro- topology
of the graph product. In particular, we show that every -isolated cyclic
subgroup of a right-angled Artin group is closed in the pro- topology, and
we fully characterise such subgroups. Comment: 37 pages, revised following referee's comments, to appear in Journal of Algebra.
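For reference, the separability property in question can be stated without the profinite topology: a cyclic subgroup is closed in the profinite topology exactly when it can be separated from any outside element in a finite quotient.

```latex
% G is cyclic subgroup separable if every cyclic subgroup H = <h> is
% closed in the profinite topology; equivalently:
\forall\, H = \langle h \rangle \le G,\ \forall\, g \in G \setminus H,\
\exists\, \varphi\colon G \twoheadrightarrow Q,\ |Q| < \infty,\
\text{ such that } \varphi(g) \notin \varphi(H).
```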
Unix Tools for Application and System Profiling
The main goal of this thesis was to demonstrate the use of tools for application and system profiling. First, these tools were identified and studied, and divided into categories according to their purpose. They were then compared by complexity of use and invasiveness, and on that basis divided into three groups expressing the degree of complexity or invasiveness. An Apache server and an NFS server were chosen as the technologies for the models, which were put into operation using Hyper-V virtualization. Four virtual machines were created: one for the Apache server, one for the NFS server, a third for mirroring the content of the Apache server, and the last for load generation. The final part of the thesis demonstrates the use of the identified tools on the created model situations.
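As a minimal illustration of the kind of application profiling the thesis surveys (here using Python's standard-library cProfile rather than the Unix tools themselves, and a deliberately slow placeholder function), a workload can be profiled and its hotspots reported:

```python
import cProfile
import io
import pstats

def busy_work(n):
    """A deliberately slow function so the profiler has something to measure."""
    total = 0
    for i in range(n):
        total += i * i
    return total

profiler = cProfile.Profile()
profiler.enable()
busy_work(100_000)
profiler.disable()

# Collect the statistics into a string, sorted by cumulative time,
# and show the top 5 entries.
buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(5)
report = buf.getvalue()
print(report)
```

Instrumenting profilers like cProfile are more invasive than sampling tools such as `perf`, which is exactly the complexity/invasiveness trade-off the thesis uses to group the surveyed tools.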
