Characterising Place by Scene Depth
Turner and Penn (1999) introduced the notion of integration of isovist fields as a means to understand such fields syntactically, as a set of components with a structural relationship to a global whole. This work was further refined into the concept of visibility graph analysis (VGA) as a tool for architectural analysis (Turner, Doxa, O'Sullivan, & Penn, 2001), which has become widely used. We suggest a complementary method of characterising place that uses neither integration nor a graph, yet which, like visibility graph analysis, allows discrete viewpoints to be dimensioned in relation to a set of such viewpoints. In our method, Principal Component Analysis (PCA), a statistical technique, is employed to infer salient characteristics of a set of views and then to situate these component views within a low-dimensional space, so that the extent to which each view corresponds to these characteristics can be compared. We demonstrate the method with reference to two distinct urban areas with differing spatial characteristics. Because PCA operates on vectors, the order of the data has important implications. We consider some of these implications, including view orientation and chirality (handedness), and assess the variance of the results with regard to these factors.
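The core of the pipeline can be sketched in a few lines. Everything here is an assumed illustration, not the authors' setup: views are encoded as hypothetical 360-sample radial vectors (one distance per degree, where reversing the vector would mirror the view's chirality), the data are random, and three components are retained arbitrarily.

```python
import numpy as np

# Hypothetical setup: each row is one "view", sampled as a fixed-length
# vector of 360 radial distances (one per degree of view orientation).
rng = np.random.default_rng(0)
views = rng.random((50, 360))

# PCA via eigendecomposition of the covariance of mean-centred data.
centred = views - views.mean(axis=0)
cov = centred.T @ centred / (len(views) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]       # largest variance first
components = eigvecs[:, order[:3]]      # keep 3 principal components

# Situate each view in the low-dimensional component space, so views
# can be compared by how strongly they express each component.
scores = centred @ components           # shape (50, 3)
print(scores.shape)
```

Because the components are directions in the original vector space, the ordering issues the abstract raises (orientation, handedness) enter directly: rotating or reflecting a view permutes or reverses its vector, which changes its projection.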
Leaders Like Me
The Workshop Program at the University of Rochester infuses collaborative learning into a variety of introductory STEM and non-STEM courses through small, weekly, peer-led problem-solving sessions called Workshops. Decades of data from these Workshops indicate that 1) African American, Black, Hispanic, and Latinx students are less likely to attend them than White and Asian students, and 2) every additional Workshop a student attends improves their final course grade, even when they miss only a single Workshop out of the 13 or 14 offered each semester. To address this situation, the UR Workshop Program has partnered with the People Like Me project at Bucknell University. Before the start of the Fall 2018 semester, Workshop leaders were asked to respond to the People Like Me survey questions, and we crafted their responses into profiles. We then posted these profiles for students in the courses to view on a platform on which we could track those views at the individual student level. In this work-in-progress, we hope to answer the question: to what extent does viewing personal information about Workshop leaders affect students' likelihood of attending Workshops?
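The "grade gain per additional Workshop attended" claim is a per-unit slope, which can be estimated with an ordinary least-squares fit. The numbers below are synthetic stand-ins, not the Workshop Program's data; they only illustrate the shape of the analysis.

```python
import numpy as np

# Synthetic illustration (not the Workshop Program's data): estimate the
# average grade gain per additional Workshop attended as an OLS slope.
attendance = np.array([5, 8, 10, 12, 13, 14])    # Workshops attended
grade = np.array([72, 78, 80, 85, 88, 90])       # final course grade (%)

slope, intercept = np.polyfit(attendance, grade, 1)
print(round(slope, 2))  # estimated grade points per extra Workshop
```

A real analysis would of course control for prior preparation and course, but the per-Workshop effect the abstract cites is this kind of coefficient.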
Form finding nodal connections in grid structure
Nodes for grid structures are often manufactured in a rather material-intensive and inefficient way, increasing the weight of the structure and thus the load. Recent developments in additive manufacturing techniques have resulted in a rising interest in large-scale metal 3D printing. Topology optimization has become the obvious companion in the design of structural parts for 3D printing, and rightfully so. The technique is demonstrably able to provide material-efficient solutions and is well suited to a manufacturing technique with few formal restrictions. However, from a designer's perspective one could argue that topology optimization has some limitations. Like other "automated processes", it tends to take over and does not leave much room for other form drivers.
This paper presents an alternative method for designing material-efficient nodes in grid structures that builds on conventional form-finding techniques, usually applied to create minimal-surface tensile structures or gravity shell-like structures. The technique works by modelling the node as a hollow shell with a mesh, applying a set of tensile forces derived from the structural action of the elements adjacent to the node (where compression is converted to tension), and running a form-finding simulation. After the simulation, the shell is thickened and analysed for the real load case (which considers both tension and compression) using FE analysis.
The benefit of such a technique is that the designer retains control over the topology of the design, which enables more creative control and free exploration of a range of design variations. The form finding is done using dynamic relaxation and introduces spline elements with bending capability to control deviation from the pure spring-network solution.
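The dynamic relaxation step can be sketched minimally: iterate damped pseudo-dynamics on a spring network until it settles into equilibrium under the applied loads. The geometry, stiffness, load, and damping values below are arbitrary assumptions for a toy 2D example, and the authors' spline elements with bending capability are omitted; this shows only the pure spring-network baseline they deviate from.

```python
import numpy as np

# Toy dynamic-relaxation sketch (illustrative values, not the authors'
# solver): a line of nodes joined by pre-tensioned springs, ends fixed,
# relaxed under a downward load until the free nodes settle.
nodes = np.array([[x, 0.0] for x in range(5)], dtype=float)  # 5 nodes
springs = [(i, i + 1) for i in range(4)]                     # 4 springs
fixed = {0, 4}                                               # supports
stiffness, rest_len = 10.0, 0.8          # rest < spacing => pre-tension
load = np.array([0.0, -1.0])             # per-node load
vel = np.zeros_like(nodes)
dt, damping = 0.05, 0.9                  # explicit step with viscous damping

for _ in range(2000):
    force = np.tile(load, (len(nodes), 1))
    for i, j in springs:
        d = nodes[j] - nodes[i]
        length = np.linalg.norm(d)
        f = stiffness * (length - rest_len) * d / length  # axial force
        force[i] += f
        force[j] -= f
    for k in fixed:
        force[k] = 0.0                   # supports do not move
    vel = damping * (vel + dt * force)
    nodes += dt * vel

print(nodes[2])  # the midspan node sags below the fixed supports
```

In the paper's method the same relaxation runs on the node's shell mesh, with member forces (compression flipped to tension) applied at the connection points instead of a uniform gravity load.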
Quick assessment of hopelessness: a cross-sectional study
BACKGROUND: Lengthy questionnaires reduce data quality and impose a burden on respondents. Previous researchers proposed that a single item ("My future seems dark to me") and a 4-item component of the Beck Hopelessness Scale (BHS) can summarise most of the information the BHS provides. There is no clear indication of which BHS cutoff values are useful in identifying people with suicidal tendencies. METHODS: In a population-based study of Chinese people aged between 15 and 59 in Hong Kong, the Chinese version of the BHS and the Centre for Epidemiologic Studies – Depression scale were administered by trained interviewers, and suicidal ideation and suicide attempts were self-reported. Receiver operating characteristic curve analysis and regression analysis were used to compare the performance of the BHS and its components in identifying people with suicidality and depression. The smoothed level of suicidal tendency was assessed in relation to scores on the BHS and its components to identify thresholds. RESULTS: The 4-item component and, to a lesser extent, the single item of the BHS perform in ways similar to the BHS. There are non-linear relationships between suicidality and scores on the BHS and the 4-item component; cutoff values identified accordingly have sensitivity and specificity of about 65%. CONCLUSION: The 4-item component is a useful alternative to the BHS. Shortening of psycho-social measurement scales should be considered in order to reduce the burden on patients or respondents and to improve response rates.
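The cutoff selection the abstract describes amounts to scanning candidate thresholds and reading off sensitivity and specificity at each. The score distributions below are synthetic (normal draws, not BHS data), chosen only to illustrate how a cutoff balancing the two measures is found.

```python
import numpy as np

# Synthetic sketch (not the BHS data): pick the score cutoff at which
# sensitivity and specificity are approximately balanced.
rng = np.random.default_rng(1)
scores_neg = rng.normal(3.0, 1.5, 500)   # respondents without suicidality
scores_pos = rng.normal(5.0, 1.5, 100)   # respondents with suicidality

def sens_spec(cutoff):
    sensitivity = np.mean(scores_pos >= cutoff)  # true positives caught
    specificity = np.mean(scores_neg < cutoff)   # true negatives excluded
    return sensitivity, specificity

# Scan cutoffs; keep the one where the two measures are closest.
cutoffs = np.arange(0.0, 8.0, 0.1)
best = min(cutoffs, key=lambda c: abs(sens_spec(c)[0] - sens_spec(c)[1]))
print(best, sens_spec(best))
```

With heavily overlapping score distributions, as the abstract reports for the BHS and suicidality, this balanced operating point ends up in the mid-range (around 65% in the study), which is why the authors stress it as a screening aid rather than a diagnostic rule.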
Multiple dimensions of health locus of control in a representative population sample: ordinal factor analysis and cross-validation of an existing three and a new four factor model
<p>Abstract</p> <p>Background</p> <p>Based on the general approach of locus of control, health locus of control (HLOC) concerns control beliefs relating to illness, sickness and health. HLOC research results provide an improved understanding of health-related behaviour and patients' compliance in medical care. HLOC research distinguishes between beliefs concerning Internality, Externality powerful Others (POs) and Externality Chance. However, evidence for differentiating the POs dimension has been found. Previous factor analyses used selected and predominantly clinical samples, while non-clinical studies are rare. The present study is the first analysis of the HLOC structure based on a large representative general population sample, providing important information for non-clinical research and public health care.</p> <p>Methods</p> <p>The standardised German questionnaire which assesses HLOC was used in a representative adult general population sample for a region in Northern Germany (N = 4,075). Data analyses used ordinal factor analyses in LISREL and Mplus. Alternative theory-driven models with one to four latent variables were compared using confirmatory factor analysis. Fit indices, chi-square difference tests, residuals and factor loadings were considered for model comparison. Exploratory factor analysis was used for further model development. Results were cross-validated by splitting the total sample randomly and using the cross-validation index.</p> <p>Results</p> <p>A model with four latent variables (Internality, Formal Help, Informal Help and Chance) best represented the HLOC construct (three-dimensional model: normed chi-square = 9.55; RMSEA = 0.066; CFI = 0.931; SRMR = 0.075; four-dimensional model: normed chi-square = 8.65; RMSEA = 0.062; CFI = 0.940; SRMR = 0.071; chi-square difference test: p < 0.001).
After excluding one item, the superiority of the four- over the three-dimensional HLOC construct became very clear (three-dimensional model: normed chi-square = 7.74; RMSEA = 0.059; CFI = 0.950; SRMR = 0.079; four-dimensional model: normed chi-square = 5.75; RMSEA = 0.049; CFI = 0.965; SRMR = 0.065; chi-square difference test: p < 0.001). Results were confirmed by cross-validation. Results based on our large community sample indicated that western general populations separate health-related control beliefs concerning formal and informal assistance.</p> <p>Conclusions</p> <p>Future non-clinical HLOC studies in western cultures should consider four dimensions of HLOC: Internality, Formal Help, Informal Help and Chance. However, the standardised German instrument needs modification. Therefore, confirmation of our results may be useful. Future research should compare the HLOC structure between clinical and non-clinical samples as well as cross-culturally.</p>
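The chi-square difference test used for the nested model comparison has a simple mechanical form: the difference of the two models' chi-square statistics is itself chi-square distributed with the difference in degrees of freedom. The raw chi-square and df values below are hypothetical stand-ins (the abstract reports only normed chi-square, i.e. chi-square divided by df), chosen so the normed values roughly match those reported.

```python
import math

# Chi-square difference test for nested CFA models. Raw chi-square and
# df values are assumed for illustration; only the normed ratios
# (7.74 and ~5.75) mirror the abstract.
chisq_3f, df_3f = 774.0, 100   # 3-factor model: 774 / 100 = 7.74
chisq_4f, df_4f = 558.0, 97    # 4-factor model: 558 / 97 ≈ 5.75

diff = chisq_3f - chisq_4f     # test statistic
ddf = df_3f - df_4f            # its degrees of freedom (here 3)

def chi2_sf_df3(x):
    """Survival function of the chi-square distribution with 3 df."""
    return math.erfc(math.sqrt(x / 2)) + math.sqrt(2 * x / math.pi) * math.exp(-x / 2)

p = chi2_sf_df3(diff)
print(p < 0.001)   # the less constrained 4-factor model fits significantly better
```

A significant difference, as here, means the extra parameters of the four-factor model buy a genuine improvement in fit rather than capitalising on chance.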
Antiinflammatory Therapy with Canakinumab for Atherosclerotic Disease
Background: Experimental and clinical data suggest that reducing inflammation without affecting lipid levels may reduce the risk of cardiovascular disease. Yet the inflammatory hypothesis of atherothrombosis has remained unproved. Methods: We conducted a randomized, double-blind trial of canakinumab, a therapeutic monoclonal antibody targeting interleukin-1β, involving 10,061 patients with previous myocardial infarction and a high-sensitivity C-reactive protein level of 2 mg or more per liter. The trial compared three doses of canakinumab (50 mg, 150 mg, and 300 mg, administered subcutaneously every 3 months) with placebo. The primary efficacy end point was nonfatal myocardial infarction, nonfatal stroke, or cardiovascular death. Results: At 48 months, the median reduction from baseline in the high-sensitivity C-reactive protein level was 26 percentage points greater in the group that received the 50-mg dose of canakinumab, 37 percentage points greater in the 150-mg group, and 41 percentage points greater in the 300-mg group than in the placebo group. Canakinumab did not reduce lipid levels from baseline. At a median follow-up of 3.7 years, the incidence rate for the primary end point was 4.50 events per 100 person-years in the placebo group, 4.11 events per 100 person-years in the 50-mg group, 3.86 events per 100 person-years in the 150-mg group, and 3.90 events per 100 person-years in the 300-mg group. The hazard ratios as compared with placebo were as follows: in the 50-mg group, 0.93 (95% confidence interval [CI], 0.80 to 1.07; P = 0.30); in the 150-mg group, 0.85 (95% CI, 0.74 to 0.98; P = 0.021); and in the 300-mg group, 0.86 (95% CI, 0.75 to 0.99; P = 0.031). The 150-mg dose, but not the other doses, met the prespecified multiplicity-adjusted threshold for statistical significance for the primary end point and for the secondary end point that additionally included hospitalization for unstable angina leading to urgent revascularization (hazard ratio vs. placebo, 0.83; 95% CI, 0.73 to 0.95; P = 0.005). Canakinumab was associated with a higher incidence of fatal infection than was placebo. There was no significant difference in all-cause mortality (hazard ratio for all canakinumab doses vs. placebo, 0.94; 95% CI, 0.83 to 1.06; P = 0.31). Conclusions: Antiinflammatory therapy targeting the interleukin-1β innate immunity pathway with canakinumab at a dose of 150 mg every 3 months led to a significantly lower rate of recurrent cardiovascular events than placebo, independent of lipid-level lowering. (Funded by Novartis; CANTOS ClinicalTrials.gov number, NCT01327846.)
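The incidence rates above are events divided by person-time. As a back-of-envelope check, the event counts and person-years below are hypothetical, chosen only so the resulting rates resemble those in the abstract; the ratio of two such rates approximates a hazard ratio only under the assumption of roughly constant hazards, whereas the trial itself used proper time-to-event (Cox) modelling.

```python
# Back-of-envelope incidence-rate sketch. Event counts and person-time
# are hypothetical stand-ins, chosen to reproduce rates like 4.50 and
# 3.86 per 100 person-years; they are not the CANTOS counts.
def rate_per_100py(events, person_years):
    return 100 * events / person_years

placebo = rate_per_100py(535, 11890)   # ≈ 4.50 per 100 person-years
dose150 = rate_per_100py(459, 11890)   # ≈ 3.86 per 100 person-years

# Under (assumed) constant hazards, the rate ratio approximates the
# hazard ratio; here it lands near the reported 0.85 for the 150-mg dose.
rate_ratio = dose150 / placebo
print(round(rate_ratio, 2))
```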