Optimal Systolic Blood Pressure Target, Time-to-Intensification and Time-to-Follow-up in the Treatment of Hypertension
Objective:
I sought to determine the systolic intensification threshold, time-to-intensification and time-to-follow-up associated with the lowest risk of cardiovascular events or death in primary care patients with hypertension.
Methods:
A retrospective cohort study of 88,756 patients was performed. Systolic intensification threshold, time-to-intensification and time-to-follow-up were analyzed with respect to the risk of an acute cardiovascular event or death. The Cox model was adjusted for age, sex, smoking status, socioeconomic deprivation, history of diabetes, cardiovascular disease or CKD, Charlson Comorbidity Index, BMI, medication possession ratio, and baseline blood pressure.
Results:
During a median follow-up of 37.4 months, 9,985 participants (11.3%) experienced an acute cardiovascular event or death. Systolic intensification thresholds of 130-150 mmHg were associated with no difference in risk, while higher thresholds were associated with progressively greater risk. Risk increased progressively from the lowest (0-1.4 months) to the highest quintile of time to medication intensification. The highest quintile of time-to-follow-up (>2.7 months) was also associated with increased risk.
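As a rough illustration of the quintile analysis described above, the sketch below bins a hypothetical cohort by time-to-intensification and compares crude event rates per quintile. The data and the risk relationship are synthetic assumptions; the study itself used a Cox model adjusted for the covariates listed in the Methods.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cohort: time to medication intensification (months) and
# whether an acute cardiovascular event or death occurred during follow-up.
n = 10_000
time_to_intensification = rng.exponential(scale=2.0, size=n)
# Assume risk rises with delay (illustrative relationship only).
event_prob = 0.08 + 0.02 * np.clip(time_to_intensification, 0, 8)
event = rng.random(n) < event_prob

# Split the cohort into quintiles of time-to-intensification.
edges = np.quantile(time_to_intensification, [0.2, 0.4, 0.6, 0.8])
quintile = np.digitize(time_to_intensification, edges)  # values 0..4

# Crude event rate per quintile (an adjusted Cox model would be used in practice).
rates = [event[quintile == q].mean() for q in range(5)]
for q, r in enumerate(rates, start=1):
    print(f"quintile {q}: event rate {r:.3f}")
```

With synthetic risk that grows with delay, the top quintile shows a visibly higher crude event rate than the bottom one, mirroring the gradient reported in the Results.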
Conclusions:
A systolic intensification threshold higher than 150 mmHg, a delay of greater than 1.4 months before medication intensification following systolic blood pressure elevation, and a delay of greater than 2.7 months before blood pressure follow-up after medication intensification were each associated with increased risk of an acute cardiovascular event or death.
Failure Prediction of Carbon-Fibre Epoxy Composite Subsea Flowline Using Response Surface Methodology
Master's thesis in Offshore Technology. Carbon Fibre Epoxy Composite (CFEC) is a relatively new material utilized in the subsea oil and gas industry. The fibre-reinforced composite is competitive for its high strength-to-weight ratio and good corrosion resistance. This thesis applies a stochastic process to the CFEC flowline using response surface methodology. The material properties, geometries, and loadings are the input parameters of the finite element model of the CFEC flowline, while the failure criteria are the output parameters. To better understand which parameters affect the results, correlation matrices are studied. Input parameters with higher correlation coefficients are identified and chosen to generate the response surfaces. A stochastic process that requires a large set of "measured results" can then be substituted by approximate "response values". The accuracy of the response surface is an essential issue that determines whether the approximate results are meaningful. Many factors affect the quality of the response surfaces, e.g. the response surface type, the number of selected parameters, and the size of the response surface. Comparison studies of these factors are discussed in this thesis.
It is found that parameters with correlation coefficients above a threshold should be selected for response surface generation. Selecting more parameters increases both the time needed to generate the response surface and its accuracy, while selecting very few parameters, e.g. five, significantly reduces the accuracy. A larger response surface size slightly reduces the accuracy of the response values, but makes the surface applicable to more design cases. It is noted that the sample population should not be concentrated at the boundary of the response surface. With these approaches, the efficiency of using the response surface methodology in composite flowline design can be improved, with the percentage differences of predicted exceedance probabilities usually below 10%. Based on these findings, a safety factor can be defined to describe these uncertainties.
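The screen-then-fit workflow described above can be sketched as follows. The data, the correlation threshold, and the quadratic form of the surface are all illustrative assumptions, not the thesis's actual finite element model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "measured" data: six candidate input parameters, one response.
# Only x0 and x1 actually drive the response; the rest are noise.
n = 500
X = rng.normal(size=(n, 6))
y = 1.5 * X[:, 0] + 0.8 * X[:, 1] + 0.3 * X[:, 1] ** 2 + 0.05 * rng.normal(size=n)

# Step 1: screen inputs by correlation coefficient with the response.
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(6)])
selected = np.where(corr > 0.15)[0]   # threshold is illustrative

# Step 2: fit a quadratic response surface on the selected parameters only.
Xs = X[:, selected]
terms = ([np.ones(n)]
         + [Xs[:, j] for j in range(Xs.shape[1])]
         + [Xs[:, j] ** 2 for j in range(Xs.shape[1])])
A = np.column_stack(terms)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Step 3: cheap approximate "response values" replace expensive "measured results".
y_hat = A @ coef
rel_err = np.linalg.norm(y - y_hat) / np.linalg.norm(y)
print(f"selected parameters: {selected.tolist()}, relative error: {rel_err:.3f}")
```

Screening keeps the two genuinely influential parameters, and the fitted surface reproduces the response with a small relative error, illustrating why parameter selection by correlation precedes surface generation.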
INTERVENTION EFFECT OF SENSORY INTEGRATION TRAINING ON THE BEHAVIORS AND QUALITY OF LIFE OF CHILDREN WITH AUTISM
Background: Autism is a pervasive developmental disorder that occurs mostly among children. Children with autism are prone to problematic behaviors due to their deficits in language communication and social development. Thus, children with a high degree of autism suffer lower life satisfaction. Moreover, sensory integration dysfunction is closely related to autism. Therefore, the effect of Sensory Integration Training (SIT) on the behaviors and quality of life of children with autism was explored in this study.
Subjects and methods: From September 2017 to December 2018, 108 patients from Fuzhou Fourth Hospital and Xiangtan Fifth Hospital were assigned to an intervention group (group A) and a control group (group B), with 54 members in each group. The members of group B, with an average age of 5.18±2.94 years, received routine treatment. In addition to the same routine treatment, the members of group A also received sensory integration training and physical exercise intervention, which lasted for three months. The Childhood Autism Rating Scale (CARS) and Autism Behavior Checklist (ABC) were used before and after the intervention to evaluate the curative effect.
Results: After the treatment, statistically significant differences were observed in the CARS and ABC scores (P<0.05); the total effective rate was 86.11% in group A and 64.10% in group B. Both the difference in CARS scores and the difference in ABC scores were statistically significant (P<0.05). Overall, group A outperformed group B on the CARS (t=3.492, df=73, two-tailed P=0.001<0.01).
Conclusions: SIT intervention had a demonstrable effect on autism and is of great value for the future development of SIT courses or intervention programs for children with autism.
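For readers unfamiliar with the reported statistics, a between-group comparison of this kind can be reproduced in outline with an independent two-sample t-test. The scores and group sizes below are synthetic stand-ins, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical post-intervention CARS scores (lower = better); values illustrative.
group_a = rng.normal(loc=32.0, scale=4.0, size=38)   # SIT + routine treatment
group_b = rng.normal(loc=36.0, scale=4.5, size=37)   # routine treatment only

# Independent two-sample t-test (two-tailed), as used to compare group scores.
t_stat, p_value = stats.ttest_ind(group_a, group_b)
df = len(group_a) + len(group_b) - 2
print(f"t = {t_stat:.3f}, df = {df}, p = {p_value:.4f}")
```

A negative t statistic here means group A's mean score is lower (better) than group B's; the reported df of 73 corresponds to a total of 75 analyzed scores across the two groups.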
Using the Kriging Response Surface Method for the Estimation of Failure Values of Carbon-Fibre-Epoxy Subsea Composite Flowlines under the Influence of Stochastic Processes
This paper investigates the use of the Kriging response surface method to estimate failure values in carbon-fibre-epoxy composite flow-lines under the influence of stochastic processes. A case study of a 125 mm flow-line was investigated. The maximum stress, Tsai-Wu and Hashin failure criteria were used to assess the burst design under combined loading with axial forces, torsion and bending moments. An extensive set of measured values was generated using Monte Carlo simulation and used as the base-case population to which the results from the response surfaces were compared. The response surfaces were evaluated in detail in their ability to reproduce the statistical moments, probability and cumulative distributions, and failure values at low probabilities of failure. In addition, the optimisation of the response surface calculation was investigated in terms of reducing the number of input parameters and the size of the response surface. Finally, a decision chart that can be used to build a response surface to calculate failures in a carbon-fibre-epoxy-composite (CFEC) flow-line was proposed based on the findings obtained. The results show that the response surface method is suitable and can calculate failure values close to those calculated using a large set of measured values. The results from this paper provide an analytical framework for identifying the principal design parameters, response surface generation, and failure prediction for CFEC flow-lines.
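As a hedged sketch of the Monte Carlo step, the snippet below evaluates the plane-stress Tsai-Wu failure index under randomly sampled ply stresses and estimates a failure probability. The strengths, stress distributions, and the F12 interaction term are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical lamina strengths (MPa) for a carbon-fibre-epoxy ply.
Xt, Xc = 1500.0, 1200.0   # longitudinal tensile / compressive strength
Yt, Yc = 50.0, 250.0      # transverse tensile / compressive strength
S = 70.0                  # in-plane shear strength

# Tsai-Wu coefficients (plane stress).
F1, F2 = 1 / Xt - 1 / Xc, 1 / Yt - 1 / Yc
F11, F22, F66 = 1 / (Xt * Xc), 1 / (Yt * Yc), 1 / S ** 2
F12 = -0.5 * np.sqrt(F11 * F22)   # a common default for the interaction term

def tsai_wu(s1, s2, s6):
    """Tsai-Wu failure index; failure is predicted when the index >= 1."""
    return (F1 * s1 + F2 * s2 + F11 * s1 ** 2 + F22 * s2 ** 2
            + F66 * s6 ** 2 + 2 * F12 * s1 * s2)

# Monte Carlo over uncertain ply stresses (hypothetical distributions, MPa).
n = 100_000
s1 = rng.normal(800.0, 120.0, n)   # longitudinal stress
s2 = rng.normal(20.0, 8.0, n)      # transverse stress
s6 = rng.normal(30.0, 10.0, n)     # in-plane shear stress

p_fail = np.mean(tsai_wu(s1, s2, s6) >= 1.0)
print(f"estimated failure probability: {p_fail:.4f}")
```

A response surface would be trained to reproduce exactly this kind of tail probability from far fewer expensive finite element evaluations.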
Strategic management of operational resources under uncertainty
This dissertation addresses and answers two questions: (1) What are the impacts of market uncertainty and technological uncertainty? (2) What is the best way for a firm to manage demand information and technological knowledge in the face of competition?
The first essay (Chapter 2) investigates a problem of competitive investment with payoff externalities and uncertain but partially observable profitability. This essay examines a duopoly game, which, under appropriate conditions, reduces to a war of attrition game in the sense that both firms have incentives to be the follower. We find that due to the strategic interactions, payoff externalities and learning opportunities have counterintuitive effects on investment strategies and on the time to the first investment. In particular, we find that an increase in the rate of learning, which usually benefits the follower, may hasten or delay the first investment depending on the rate of learning and the prior probability that the investment is profitable. Overall, the results of this chapter suggest that firms facing entry into an unproven market need to consider the strategic effects arising from learning and externalities.
The second essay (Chapter 3) investigates the strategy of investment in R&D projects when the completion time of R&D is uncertain. By examining a game-theoretic model of two firms competitively engaged in R&D projects, we find that the more innovative firm may or may not have an incentive to unilaterally share technological knowledge with its opponent; the result depends on the more innovative firm's tradeoff between reduction of competitive pressure and reduction of the competitor's imitation. A direct implication of this result is that a firm may achieve superior performance by strategically managing its technological knowledge without incurring cost.
The third essay (Chapter 4) investigates a problem of competitive investment in R&D projects to examine (1) the impacts of uncertainties and (2) the strategies of managing demand information and technological knowledge. We find that market uncertainty can improve or diminish a firm's payoff due to strategic interactions between firms and the interplay of learning effects and externalities. Our results also indicate that technological uncertainty can alter the relationship between the time to completion and the fierceness of competition. More specifically, we find that an increase in the time to completion may or may not increase the fierceness of the competition. Lastly, this essay compares the impact of disclosing demand information and that of disclosing technological knowledge. The results show that disclosing technological knowledge can only improve a firm's ex-ante payoff, whereas disclosing demand information can improve both the ex-ante and ex-post payoffs. Hence, our results indicate that the disclosed content and the timing of disclosure are important when firms consider voluntary disclosure to reduce competition.
Tetraaquabis[4-(4H-1,2,4-triazol-4-yl)benzoato-κN1]copper(II) dihydrate
In the title compound, [Cu(C9H6N3O2)2(H2O)4]·2H2O, the CuII atom lies on an inversion center and is six-coordinated by two N atoms from two 4-(1,2,4-triazol-4-yl)benzoate ligands and four water molecules in a distorted octahedral geometry. In the crystal, intermolecular O—H⋯O hydrogen bonds lead to a three-dimensional supramolecular network. Intramolecular O—H⋯N hydrogen bonds and π–π interactions between the benzene rings and between the benzene and triazole rings [centroid–centroid distances = 3.657 (1) and 3.752 (1) Å] are observed.
Intersection-free Robot Manipulation with Soft-Rigid Coupled Incremental Potential Contact
This paper presents a novel simulation platform, ZeMa, designed for robotic manipulation tasks concerning soft objects. Such simulation ideally requires three properties: two-way soft-rigid coupling, intersection-free guarantees, and frictional contact modeling, with runtime acceptable for deep and reinforcement learning tasks. Current simulators often satisfy only a subset of these needs, primarily focusing on distinct rigid-rigid or soft-soft interactions. The proposed ZeMa prioritizes physical accuracy and integrates the incremental potential contact method, offering unified dynamics simulation for both soft and rigid objects. It efficiently manages soft-rigid contact, operating 75x faster than baseline tools with similar methodologies, such as IPC-GraspSim. To demonstrate its applicability, we employ it for parallel grasp generation, penetrated grasp repair, and reinforcement learning for grasping, successfully transferring the trained RL policy to real-world scenarios.
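The incremental potential contact idea can be illustrated in one dimension: each implicit time step minimizes an inertia term plus a log-barrier contact energy that diverges as the gap closes, so the minimizer can never cross the obstacle. This is a toy sketch under simplifying assumptions, not ZeMa's soft-rigid coupled 3D formulation.

```python
import numpy as np

# A particle of mass m falls under gravity toward a floor at x = 0; the barrier
# keeps it strictly above the floor (intersection-free by construction).
m, g, h = 1.0, 9.8, 0.01   # mass (kg), gravity (m/s^2), time step (s)
d_hat = 1e-3               # barrier activation distance (m)
kappa = 1e8                # barrier stiffness (illustrative)

def barrier(d):
    """IPC-style log barrier: zero for d >= d_hat, diverging as d -> 0+."""
    d = np.asarray(d, dtype=float)
    safe = np.maximum(d, 1e-12)
    return np.where(d < d_hat,
                    -kappa * (safe - d_hat) ** 2 * np.log(safe / d_hat),
                    0.0)

def incremental_potential(x, x_prev, v_prev):
    """Implicit-Euler incremental potential: inertia term plus contact barrier."""
    x_hat = x_prev + h * v_prev - h * h * g   # predictor with gravity folded in
    return 0.5 * m / (h * h) * (x - x_hat) ** 2 + barrier(x)

def step(x_prev, v_prev, n_grid=4000):
    # Toy minimization by dense sampling over admissible heights
    # (real IPC uses projected Newton with a barrier-aware line search).
    xs = np.linspace(1e-6, x_prev + 0.05, n_grid)
    x_new = xs[int(np.argmin(incremental_potential(xs, x_prev, v_prev)))]
    return x_new, (x_new - x_prev) / h

x, v = 0.05, 0.0           # drop from 5 cm at rest
for _ in range(150):
    x, v = step(x, v)
print(f"final height: {x:.6f} m")
```

Because the barrier energy grows without bound as the gap shrinks, the energy minimizer settles just above the floor instead of passing through it, which is the mechanism behind the intersection-free guarantee.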
How Early Participation Determines Long-Term Sustained Activity in GitHub Projects?
Although the open source model bears many advantages in software development, open source projects are always hard to sustain. Previous research on open source sustainability mainly focuses on projects that have already reached a certain level of maturity (e.g., with communities, releases, and downstream projects). However, limited attention has been paid to the development of (sustainable) open source projects in their infancy, and we believe an understanding of early sustainability determinants is crucial for project initiators, incubators, newcomers, and users.
In this paper, we aim to explore the relationship between early participation factors and long-term project sustainability. We leverage a novel methodology combining the Blumberg model of performance and machine learning to predict the sustainability of 290,255 GitHub projects. Specifically, we train an XGBoost model based on early participation (first three months of activity) in these projects and interpret the model using LIME. We quantitatively show that early participants have a positive effect on a project's future sustained activity if they have prior experience in OSS project incubation and demonstrate concentrated focus and steady commitment. Participation from non-code contributors and detailed contribution documentation also promote a project's sustained activity. Compared with individual projects, building a community that consists of more experienced core developers and more active peripheral developers is important for organizational projects. This study provides unique insights into the incubation and recognition of sustainable open source projects, and our interpretable prediction approach can also offer guidance to open source project initiators and newcomers.
Comment: The 31st ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering (ESEC/FSE 2023).
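The notion of early-participation features (first three months of activity) can be made concrete with a small sketch. The event log and feature definitions below are hypothetical illustrations, not the paper's actual feature schema; in the real pipeline, features like these feed an XGBoost classifier that is then interpreted with LIME.

```python
from collections import Counter
from datetime import datetime

# Hypothetical activity log for one project's first three months:
# (contributor, ISO date, kind of contribution).
events = [
    ("alice", "2023-01-03", "commit"), ("alice", "2023-01-10", "commit"),
    ("alice", "2023-01-17", "commit"), ("bob", "2023-01-05", "issue_comment"),
    ("alice", "2023-02-01", "commit"), ("carol", "2023-02-14", "doc"),
    ("alice", "2023-02-21", "commit"), ("bob", "2023-03-02", "issue_comment"),
]

start = datetime(2023, 1, 1)

def week_index(date_str):
    return (datetime.fromisoformat(date_str) - start).days // 7

# Steadiness: fraction of the 13 early weeks in which any commit landed.
commit_weeks = {week_index(d) for _, d, kind in events if kind == "commit"}
steadiness = len(commit_weeks) / 13

# Non-code participation: share of events that are not commits.
kinds = Counter(kind for _, _, kind in events)
non_code_share = 1 - kinds["commit"] / len(events)

# Concentration: share of commits by the most active contributor.
commit_authors = Counter(a for a, _, kind in events if kind == "commit")
top_share = max(commit_authors.values()) / sum(commit_authors.values())

features = {"steadiness": steadiness, "non_code_share": non_code_share,
            "top_share": top_share}
print(features)
```

Each project would contribute one such feature row, and the trained model's LIME explanations would then attribute the sustainability prediction back to features like steadiness and non-code participation.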
Automatic Detection of Alzheimer's Disease with Multi-Modal Fusion of Clinical MRI Scans
The aging population of the U.S. drives the prevalence of Alzheimer's disease. Brookmeyer et al. forecast that approximately 15 million Americans will have either clinical AD or mild cognitive impairment by 2060. In response to this urgent call, methods for early detection of Alzheimer's disease have been developed for prevention and pre-treatment. Notably, literature on the application of deep learning in the automatic detection of the disease has been proliferating. This study builds upon previous literature and maintains a focus on leveraging multi-modal information to enhance automatic detection. We aim to predict the stage of the disease - Cognitively Normal (CN), Mild Cognitive Impairment (MCI), and Alzheimer's Disease (AD) - based on two different types of brain MRI scans. We design an AlexNet-based deep learning model that learns the synergy of complementary information from both T1 and FLAIR MRI scans - …
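The multi-modal fusion idea (one branch per MRI modality, with features combined before classification) can be sketched in miniature. The linear "branches" below are stand-ins for the paper's AlexNet-based convolutional extractors, and all shapes and weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def branch(x, W, b):
    """One 'modality branch': a linear map with ReLU (stand-in for a CNN)."""
    return np.maximum(W @ x + b, 0.0)

d_in, d_feat, n_classes = 64, 16, 3   # classes: CN, MCI, AD
W_t1, b_t1 = rng.normal(size=(d_feat, d_in)) * 0.1, np.zeros(d_feat)  # T1 branch
W_fl, b_fl = rng.normal(size=(d_feat, d_in)) * 0.1, np.zeros(d_feat)  # FLAIR branch
W_head = rng.normal(size=(n_classes, 2 * d_feat)) * 0.1               # fusion head

def predict(t1_scan, flair_scan):
    # Concatenate per-modality features, then classify with a linear head.
    fused = np.concatenate([branch(t1_scan, W_t1, b_t1),
                            branch(flair_scan, W_fl, b_fl)])
    logits = W_head @ fused
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()            # softmax over CN / MCI / AD

probs = predict(rng.normal(size=d_in), rng.normal(size=d_in))
print(probs)
```

The key design choice mirrored here is intermediate fusion: each modality is encoded separately so the model can learn modality-specific representations, and the complementary features are only combined for the final staging decision.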