    Constriction size distributions of granular filters: a numerical study

    The retention capability of granular filters is controlled by the narrow constrictions connecting the voids within the filter. The theoretical justification for empirical filter rules used in practice includes consideration of an idealised soil fabric in which constrictions form between co-planar combinations of spherical filter particles. This idealised fabric has not been confirmed by experimental or numerical observations of real constrictions. This paper reports the results of direct, particle-scale measurement of the constriction size distribution (CSD) within virtual samples of granular filters created using the discrete-element method (DEM). A previously proposed analytical method that predicts the full CSD using inscribed circles to estimate constriction sizes is found to predict the CSD poorly for widely graded filters, owing to an over-idealisation of the soil fabric. The DEM data generated are used to explore quantitatively the influence of the coefficient of uniformity, particle size distribution and relative density of the filter on the CSD. For a given relative density, CSDs form a narrow band of similarly shaped curves when normalised by characteristic filter diameters. This lends support to the practical use of characteristic diameters to assess filter retention capability.
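The idealised fabric described above can be made concrete: for three mutually tangent co-planar circles (the cross-section of touching spherical particles), the inscribed-circle estimate of the constriction size follows from the Descartes circle theorem. A minimal sketch with arbitrary example radii, not data or code from the paper:

```python
import math

def constriction_radius(r1, r2, r3):
    """Radius of the circle tangent internally to three mutually
    tangent coplanar circles of radii r1, r2, r3 (Descartes circle
    theorem: k4 = k1 + k2 + k3 + 2*sqrt(k1*k2 + k2*k3 + k3*k1))."""
    k1, k2, k3 = 1.0 / r1, 1.0 / r2, 1.0 / r3
    k4 = k1 + k2 + k3 + 2.0 * math.sqrt(k1 * k2 + k2 * k3 + k3 * k1)
    return 1.0 / k4

# Three equal touching particles of unit radius: the classical result
# gives an inscribed radius of 2/sqrt(3) - 1, about 0.155.
r = constriction_radius(1.0, 1.0, 1.0)
```

For uniform particles this recovers the well-known inscribed-circle ratio of about 0.155; the paper's point is that for widely graded filters this co-planar idealisation breaks down.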

    The Achilles tendon Total Rupture Score is a responsive primary outcome measure: an evaluation of the Dutch version including minimally important change

    PURPOSE: The aim of this study was to evaluate the responsiveness of the Dutch version of the Achilles tendon Total Rupture Score (ATRS-NL). METHODS: Patients (N = 47) completed the ATRS-NL at 3 and 6 months after Achilles tendon rupture (ATR). Additionally, they filled out the Euroqol-5D-5L (EQ-5D-5L) and Global Rating of Change Score (GRoC). Effect sizes (ES) and standardized response means (SRM) were calculated. The anchor-based method for determining the minimally important change (MIC) was used. GRoC and improvement on the items mobility and usual activities on the EQ-5D-5L served as external criteria. The scores on these anchors were used to categorize patients' physical functioning as improved or unchanged between 3 and 6 months after ATR. Receiver operating characteristic (ROC) analysis was performed, with calculation of the area under the ROC curve (AUC) and estimation of MIC values using the optimal cut-off points. RESULTS: There was a large change (ES: 1.58) and good responsiveness (SRM: 1.19) of the ATRS-NL between 3 and 6 months after ATR. Using ROC analysis, the MIC values ranged from 13.5 to 28.5 for reporting improvement on EQ-5D-5L mobility and GRoC, respectively. The AUCs for improvement on mobility and improvement on GRoC were > 0.70. CONCLUSION: The ATRS-NL showed good responsiveness in ATR patients between 3 and 6 months after injury. Use of this questionnaire is recommended in clinical follow-up and longitudinal research of ATR patients. MIC values of 13.5 and 28.5 are recommended to consider ATR patients as improved and greatly improved between 3 and 6 months after ATR. LEVEL OF EVIDENCE: II.
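For readers unfamiliar with the anchor-based method, the ROC step can be sketched as follows. This is a minimal illustration with invented toy change scores, not the study's data or code; it assumes the "optimal cut-off point" is the one maximising Youden's J (tpr - fpr), one common choice:

```python
def roc_points(scores, labels):
    """Yield (fpr, tpr, threshold) for each candidate cutoff,
    classifying a patient as 'improved' when score >= threshold."""
    pos = sum(labels)
    neg = len(labels) - pos
    pts = []
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and not y)
        pts.append((fp / neg, tp / pos, t))
    return pts

def mic_youden(scores, labels):
    """MIC estimate: the cutoff maximising Youden's J = tpr - fpr."""
    return max(roc_points(scores, labels), key=lambda p: p[1] - p[0])[2]

# Toy ATRS change scores with anchor labels (1 = improved, 0 = unchanged)
mic = mic_youden([5, 10, 14, 20, 25, 30], [0, 0, 0, 1, 1, 1])  # → 20
```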

    Special section on advances in reachability analysis and decision procedures: contributions to abstraction-based system verification

    Reachability analysis asks whether a system can evolve from legitimate initial states to unsafe states. It is thus a fundamental tool in the validation of computational systems - be they software, hardware, or a combination thereof. We recall a standard approach for reachability analysis, which captures the system in a transition system, forms another transition system as an over-approximation, and performs an incremental fixed-point computation on that over-approximation to determine whether unsafe states can be reached. We show this method to be sound for proving the absence of errors, and discuss its limitations for proving the presence of errors, as well as some means of addressing those limitations. We then sketch how program annotations for data integrity constraints and interface specifications - as in Bertrand Meyer's paradigm of Design by Contract - can facilitate the validation of modular programs, e.g., by obtaining more precise verification conditions for software verification supported by automated theorem proving. Then we recap how the decision problem of satisfiability for formulae of logics with theories - e.g., bit-vector arithmetic - can be used to construct an over-approximating transition system for a program. Programs with data types comprising bit-vectors of finite width require bespoke decision procedures for satisfiability. Finite-width data types challenge the reduction of that decision problem to one that off-the-shelf tools can solve effectively, e.g., SAT solvers for propositional logic. In that context, we recall the Tseitin encoding, which converts formulae from that logic into conjunctive normal form - the standard format for most SAT solvers - with only linear blow-up in the size of the formula, but a linear increase in the number of variables. Finally, we discuss the contributions that the three papers in this special section make in the areas that we sketched above. © Springer-Verlag 2009
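The incremental fixed-point computation recalled above can be sketched for an explicit-state finite system. This is a hedged toy illustration, not the paper's formalism: `step` enumerates a state's successors, and the loop computes the least fixed point of the reachable-states operator before intersecting with the unsafe set:

```python
def reachable(init, step, unsafe):
    """Incrementally compute the set of reachable states of a finite
    transition system and return which unsafe states are reached."""
    seen = set(init)
    frontier = list(init)
    while frontier:                     # fixed point: frontier empties
        state = frontier.pop()
        for succ in step(state):
            if succ not in seen:        # incremental: only new states
                seen.add(succ)
                frontier.append(succ)
    return seen & unsafe

# Toy system: a counter modulo 5 starting at 0, stepping by +1.
# State 3 is reachable, so the check reports it.
hits = reachable({0}, lambda s: {(s + 1) % 5}, {3})  # → {3}
```

An empty result proves absence of errors for the (over-approximated) system; a non-empty result on an over-approximation may be spurious, which is exactly the limitation the abstract discusses.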

    Heart Failure and Pancreas Exocrine Insufficiency: Pathophysiological Mechanisms and Clinical Point of View

    Heart failure is associated with decreased tissue perfusion and increased venous congestion that may result in organ dysfunction. This dysfunction has been investigated extensively for many organs, but data regarding pancreatic (exocrine) dysfunction are scarce. In the present review we discuss the available data on the mechanisms of pancreatic damage, how heart failure can lead to exocrine dysfunction, and its clinical consequences. We show that heart failure causes significant impairment of pancreatic exocrine function, particularly in the elderly, which may exacerbate the clinical syndrome of heart failure. In addition, pancreatic exocrine insufficiency may lead to further deterioration of cardiovascular disease and heart failure, thus constituting a true vicious circle. We aim to provide insight into the pathophysiological mechanisms that constitute this reciprocal relation. Finally, novel treatment options for pancreatic dysfunction in heart failure are discussed.

    Exploring patient satisfaction after operative and nonoperative treatment for midshaft clavicle fractures: a focus group analysis

    Background: There is no consensus on the optimal treatment for displaced midshaft clavicle fractures. Several studies indicate superior patient satisfaction in favour of operative reconstruction. It is unknown what drives superior satisfaction in this treatment group. The aim of this study was to explore patient satisfaction and identify contributors to patient satisfaction after operative and nonoperative treatment for displaced midshaft clavicle fractures in adults using a focus group approach. Methods: Four face-to-face and two web-based focus groups were hosted. A total of 24 participants who were treated nonoperatively (n = 14) or operatively (n = 10) agreed to participate. Participants were selected using purposive sampling, ensuring variation in gender, age, treatment complications and outcomes. A question script was developed to systematically explore patient expectations, attitudes and satisfaction with different dimensions of care. All focus groups were voice-recorded and transcribed verbatim. Thematic analysis was conducted on all face-to-face and web-based transcripts. Results: The main emerging themes across treatment groups were: need for more information, functional recovery, speed of recovery and patient-doctor interaction. There was no difference in themes observed between operative and nonoperative focus groups. The lack of information was the most important complaint in dissatisfied patients. Conclusion: Our study shows that informing patients about their injury, treatment options and expectations for recovery is paramount for overall patient satisfaction after treatment for a displaced midshaft clavicle fracture. Level of evidence: Level III, focus group study.

    Computing Nash Equilibrium in Wireless Ad Hoc Networks: A Simulation-Based Approach

    This paper studies the problem of computing a Nash equilibrium in wireless networks modelled by Weighted Timed Automata. This formalism comes with a logic that can be used to describe complex features such as timed energy constraints. Our contribution is a method for solving this problem using Statistical Model Checking. The method has been implemented in the UPPAAL model checker and has been applied to the analysis of the Aloha CSMA/CD and IEEE 802.15.4 CSMA/CA protocols.

    Comment: In Proceedings IWIGP 2012, arXiv:1202.422
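Statistical Model Checking estimates the probability that a property holds by sampling system runs rather than exploring the full state space. A minimal Monte Carlo sketch, with an invented simulator standing in for a timed-automaton run (this is not the UPPAAL implementation):

```python
import random

def smc_estimate(simulate, satisfies, n=10000, seed=0):
    """Estimate P(property holds) by simulating n independent runs
    and counting how many satisfy the property."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if satisfies(simulate(rng)))
    return hits / n

# Toy 'run': a single uniform draw; property: value below 0.3.
# The estimate converges to the true probability 0.3 as n grows.
p = smc_estimate(lambda rng: rng.random(), lambda x: x < 0.3)
```

In a real SMC tool the number of runs is chosen from a desired confidence level and error bound (e.g. via Chernoff-Hoeffding bounds) rather than fixed in advance.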

    Prioritizing Stream Barrier Removal to Maximize Connected Aquatic Habitat and Minimize Water Scarcity

    Instream barriers, such as dams, culverts, and diversions, alter hydrologic processes and aquatic habitat. Removing uneconomical and aging instream barriers is increasingly used for river restoration. Historically, selection of barrier removal projects used score‐and‐rank techniques, ignoring cumulative change and the spatial structure of stream networks. Likewise, most water supply models prioritize either human water uses or aquatic habitat, failing to incorporate both human and environmental water use benefits. Here, a dual‐objective optimization model identifies barriers to remove that maximize connected aquatic habitat and minimize water scarcity. Aquatic habitat is measured using monthly average streamflow, temperature, channel gradient, and geomorphic condition as indicators of aquatic habitat suitability. Water scarcity costs are minimized using economic penalty functions while a budget constraint specifies the money available to remove barriers. We demonstrate the approach using a case study in Utah's Weber Basin to prioritize removal of instream barriers for Bonneville cutthroat trout, while maintaining human water uses. Removing 54 instream barriers reconnects about 160 km of quality‐weighted habitat and costs approximately US$10 M. After this point, the cost‐effectiveness of removing barriers to connect river habitat decreases. The modeling approach expands barrier removal optimization methods by explicitly including both economic and environmental water uses.
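The budget-constrained selection problem can be illustrated with a toy sketch. The barrier data below are invented, and the paper itself solves a dual-objective optimization over the stream network; this simple greedy heuristic (habitat per dollar) only approximates that and ignores network connectivity effects:

```python
def select_barriers(barriers, budget):
    """Greedy budget-constrained selection.
    barriers: list of (name, removal_cost, habitat_km) tuples.
    Picks barriers in decreasing order of habitat gained per dollar
    until the budget is exhausted."""
    chosen, spent, habitat = [], 0.0, 0.0
    ranked = sorted(barriers, key=lambda b: b[2] / b[1], reverse=True)
    for name, cost, km in ranked:
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
            habitat += km
    return chosen, habitat

# Invented example: three barriers, US$6M budget.
barriers = [("A", 2.0, 10.0), ("B", 5.0, 12.0), ("C", 4.0, 20.0)]
chosen, km = select_barriers(barriers, budget=6.0)
```

The diminishing cost-effectiveness the abstract reports corresponds to the flattening of habitat gained as the budget grows and only low-ratio barriers remain.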