2,003 research outputs found
African-American patients with cancer Talking About Clinical Trials (TACT) with oncologists during consultations: evaluating the efficacy of tailored health messages in a randomised controlled trial—the TACT study protocol
Introduction Low rates of accrual of African-American (AA) patients with cancer to therapeutic clinical trials (CTs) represent a serious and modifiable racial disparity in healthcare that impedes the development of promising cancer therapies. Suboptimal physician–patient consultation communication is a barrier to the accrual of patients with cancer of any race, but communication difficulties are compounded with AA patients. Providing tailored health messages (THM) to AA patients and their physicians about CTs has the potential to improve communication, lower barriers to accrual and ameliorate health disparities. Objective (1) Demonstrate the efficacy of THM to increase patient activation as measured by direct observation. (2) Demonstrate the efficacy of THM to improve patient outcomes associated with barriers to AA participation. (3) Explore associations among preconsultation levels of (A) trust in medical researchers, (B) knowledge of and attitudes towards CTs, (C) patient-family member congruence in decision-making, and (D) involvement/information preferences, and group assignment. Methods and analysis First, using established methods, we will develop THM materials. Second, the efficacy of the intervention will be determined in a 2 by 2 factorial randomised controlled trial testing the effectiveness of (1) providing 357 AA patients with cancer with THM at 2 different 'depths' of tailoring and (2) either providing feedback to oncologists about the patients' trial THM or not. The primary analysis compares patient engaged communication in the 4 groups preconsultation and postconsultation. Ethics and dissemination This study was approved by the Virginia Commonwealth University Institutional Review Board. To facilitate use of the THM intervention in diverse settings, we will convene 'user groups' at 3 major US cancer centres. To facilitate dissemination, we will post all materials and the implementation guide in publicly available locations.
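The 2 by 2 factorial allocation described in the protocol can be sketched as follows. This is a minimal illustration only: the arm labels, seed, and use of simple (rather than blocked or stratified) randomisation are assumptions, not details from the study.

```python
import random

def assign_factorial_groups(n_participants, seed=0):
    """Randomly assign participants to the 4 arms of a 2x2 factorial
    design: tailoring depth (basic vs deep) x physician feedback (yes/no).
    Arm labels are illustrative, not the study's actual arm names."""
    rng = random.Random(seed)
    arms = [(depth, feedback)
            for depth in ("basic_tailoring", "deep_tailoring")
            for feedback in ("feedback", "no_feedback")]
    return {pid: rng.choice(arms) for pid in range(n_participants)}

# 357 participants, as in the protocol, spread over the 4 factorial arms
assignments = assign_factorial_groups(357)
print(len(assignments), len(set(assignments.values())))  # 357 4
```

A real trial would typically use permuted-block or stratified randomisation to guarantee balanced arm sizes; `random.choice` only balances in expectation.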
Toward a Better Benchmark: Assessing the Utility of Not-at-Fault Traffic Crash Data in Racial Profiling Research
As studies on racial profiling and biased policing have begun to proliferate, researchers are debating which benchmark is most appropriate for comparison with police traffic stop data. Existing benchmark populations, which include populations estimated from census figures, licensed drivers, arrestees, reported crime suspects, and observed drivers and traffic violators, all have significant limitations. This article offers a new, alternative benchmark for police traffic stops, a benchmark that has not been previously applied or tested in a racial profiling research setting. The analysis presented compares traffic observation data, gathered at selected high-volume intersections during an ongoing racial profiling study in Miami-Dade County, Florida, to not-at-fault driver demographic data from two-vehicle crashes at those same intersections. Findings indicate that non-responsible drivers in two-vehicle crashes appear to represent a reasonably accurate estimate of the racial composition of drivers on the roadways at selected intersections and within areas of varying racial composition. The implications of this finding for racial profiling research are discussed, and suggested areas for future inquiry are identified.
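Benchmark comparisons of this kind reduce to testing whether two count distributions differ. A minimal sketch, with invented counts (not data from the study), computes the Pearson chi-square statistic for stop data versus a crash-derived benchmark:

```python
def chi_square_stat(stops, benchmark):
    """Pearson chi-square statistic comparing the racial composition of
    traffic stops against a benchmark (e.g., not-at-fault drivers in
    two-vehicle crashes). Both inputs are dicts of group -> count."""
    groups = sorted(set(stops) | set(benchmark))
    obs = [[stops.get(g, 0) for g in groups],
           [benchmark.get(g, 0) for g in groups]]
    row_tot = [sum(r) for r in obs]
    col_tot = [sum(obs[i][j] for i in range(2)) for j in range(len(groups))]
    grand = sum(row_tot)
    stat = 0.0
    for i in range(2):
        for j in range(len(groups)):
            expected = row_tot[i] * col_tot[j] / grand
            stat += (obs[i][j] - expected) ** 2 / expected
    return stat

# Invented illustrative counts:
stops = {"white": 480, "black": 350, "hispanic": 170}
crashes = {"white": 500, "black": 330, "hispanic": 170}
print(round(chi_square_stat(stops, crashes), 3))  # 0.996
```

In practice the statistic would be compared against a chi-square distribution with (groups − 1) degrees of freedom (e.g. via `scipy.stats.chi2_contingency`); the hand-rolled version above just makes the expected-count arithmetic explicit.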
Train-the-trainer: Methodology to learn the cognitive interview
Research has indicated that police may not receive enough training in interviewing cooperative witnesses, specifically in use of the cognitive interview (CI). Practically, for the CI to be effective in real-world investigations, police investigators must be trained by law enforcement trainers. We conducted a three-phase experiment to examine the feasibility of training experienced law enforcement trainers who would then train others to conduct the CI. We instructed Federal Bureau of Investigation and local law enforcement trainers about the CI (Phase I); law enforcement trainers from both agencies (n = 4, 100% male, mean age = 50 years) instructed university students (n = 25, 59% female, mean age = 21 years) to conduct either the CI or a standard law enforcement interview (Phase II); the student interviewers then interviewed other student witnesses (n = 50, 73% female, mean age = 22 years), who had watched a simulated crime (Phase III). Compared with standard training, interviews conducted by those trained by CI-trained instructors contained more information at a higher accuracy rate and with fewer suggestive questions.
The Peculiar Phase Structure of Random Graph Bisection
The mincut graph bisection problem involves partitioning the n vertices of a
graph into disjoint subsets, each containing exactly n/2 vertices, while
minimizing the number of "cut" edges with an endpoint in each subset. When
considered over sparse random graphs, the phase structure of the graph
bisection problem displays certain familiar properties, but also some
surprises. It is known that when the mean degree is below the critical value of
2 log 2, the cutsize is zero with high probability. We study how the minimum
cutsize increases with mean degree above this critical threshold, finding a new
analytical upper bound that improves considerably upon previous bounds.
Combined with recent results on expander graphs, our bound suggests the unusual
scenario that random graph bisection is replica symmetric up to and beyond the
critical threshold, with a replica symmetry breaking transition possibly taking
place above the threshold. An intriguing algorithmic consequence is that
although the problem is NP-hard, we can find near-optimal cutsizes (whose ratio
to the optimal value approaches 1 asymptotically) in polynomial time for
typical instances near the phase transition.
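For intuition about the quantity being bounded, the minimum bisection cutsize can be computed exactly by brute force on a tiny instance. This toy check is illustrative only (the instance is invented, and the enumeration is exponential, nothing like the analytical and polynomial-time methods discussed in the abstract):

```python
from itertools import combinations

def min_bisection_cutsize(n, edges):
    """Exact minimum bisection by brute force: try every split of the
    n vertices (n even) into two equal halves and count cut edges.
    Exponential in n, so only usable on tiny illustrative instances."""
    best = len(edges)
    for half in combinations(range(n), n // 2):
        side = set(half)
        # an edge is cut iff its endpoints land in different halves
        cut = sum((u in side) != (v in side) for u, v in edges)
        best = min(best, cut)
    return best

# A 6-cycle 0-1-2-3-4-5-0: splitting it into two paths of 3 cuts 2 edges.
cycle = [(i, (i + 1) % 6) for i in range(6)]
print(min_bisection_cutsize(6, cycle))  # 2
```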
The ubiquity of phenotypic plasticity in plants: A synthesis
Ecology and Evolution, published by John Wiley & Sons Ltd. Adaptation to heterogeneous environments can occur via phenotypic plasticity, but how often this occurs is unknown. Reciprocal transplant studies provide a rich dataset to address this issue in plant populations because they allow for a determination of the prevalence of plastic versus canalized responses. From 31 reciprocal transplant studies, we quantified the frequency of five possible evolutionary patterns: (1) canalized response-no differentiation: no plasticity, the mean phenotypes of the populations are not different; (2) canalized response-population differentiation: no plasticity, the mean phenotypes of the populations are different; (3) perfect adaptive plasticity: plastic responses with similar reaction norms between populations; (4) adaptive plasticity: plastic responses with parallel, but not congruent reaction norms between populations; and (5) nonadaptive plasticity: plastic responses with differences in the slope of the reaction norms. The analysis included 362 records: 50.8% life-history traits, 43.6% morphological traits, and 5.5% physiological traits. Across all traits, 52% of the trait records were not plastic, and either showed no difference in means across sites (17%) or differed among sites (83%). Among the 48% of trait records that showed some sort of plasticity, 49.4% showed perfect adaptive plasticity, 19.5% adaptive plasticity, and 31% nonadaptive plasticity. These results suggest that canalized responses are more common than adaptive plasticity as an evolutionary response to environmental heterogeneity.
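The five-way classification above can be made concrete as a decision rule on each population's mean phenotype at the two transplant sites. This sketch is a simplification of any real analysis: the tolerance threshold stands in for proper statistical tests, and differentiation is judged at a single site for brevity.

```python
def classify_response(pop1_site1, pop1_site2, pop2_site1, pop2_site2, tol=1e-6):
    """Classify a reciprocal-transplant trait record into the five patterns:
    inputs are each population's mean phenotype at the two sites."""
    slope1 = pop1_site2 - pop1_site1   # reaction norm slope, population 1
    slope2 = pop2_site2 - pop2_site1   # reaction norm slope, population 2
    plastic = abs(slope1) > tol or abs(slope2) > tol
    if not plastic:
        # compared at site 1 only, for simplicity
        if abs(pop1_site1 - pop2_site1) <= tol:
            return "canalized, no differentiation"
        return "canalized, population differentiation"
    if abs(slope1 - slope2) > tol:
        return "nonadaptive plasticity"       # reaction norms differ in slope
    if abs(pop1_site1 - pop2_site1) <= tol:
        return "perfect adaptive plasticity"  # congruent reaction norms
    return "adaptive plasticity"              # parallel but offset norms

print(classify_response(1.0, 2.0, 1.5, 2.5))  # adaptive plasticity
```

Both populations respond to the site (slope 1.0) with identical slopes but offset means, so the record falls in category (4).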
Efficient Resolution of Anisotropic Structures
We highlight some recent developments concerning the sparse
representation of possibly high-dimensional functions exhibiting strong
anisotropic features and low regularity in isotropic Sobolev or Besov scales.
Specifically, we focus on the solution of transport equations which exhibit
propagation of singularities where, additionally, high-dimensionality enters
when the convection field, and hence the solutions, depend on parameters
varying over some compact set. Important constituents of our approach are
directionally adaptive discretization concepts motivated by compactly supported
shearlet systems, and well-conditioned stable variational formulations that
support trial spaces with anisotropic refinements with arbitrary
directionalities. We prove that they provide tight error-residual relations
which are used to contrive rigorously founded adaptive refinement schemes which
converge. Moreover, in the context of parameter-dependent problems we
discuss two approaches serving different purposes and working under different
regularity assumptions. For frequent query problems, making essential use of
the novel well-conditioned variational formulations, a new Reduced Basis Method
is outlined which exhibits a certain rate-optimal performance for indefinite,
unsymmetric or singularly perturbed problems. For the radiative transfer
problem with scattering a sparse tensor method is presented which mitigates or
even overcomes the curse of dimensionality under suitable (so far still
isotropic) regularity assumptions. Numerical examples for both methods
illustrate the theoretical findings
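The greedy flavour of a Reduced Basis Method can be illustrated on toy Euclidean snapshots. This sketch substitutes plain projection error for the residual-based error estimators enabled by the paper's well-conditioned variational formulations, and all data are invented:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return dot(u, u) ** 0.5

def greedy_reduced_basis(snapshots, tol=1e-8, max_dim=None):
    """Toy greedy reduced-basis construction: repeatedly add the snapshot
    worst approximated by the current basis, orthonormalized via its
    projection residual. Pure-Python vectors; illustrative only."""
    basis = []

    def proj_error(v):
        r = list(v)
        for b in basis:  # subtract projection onto each basis vector
            c = dot(r, b)
            r = [x - c * y for x, y in zip(r, b)]
        return norm(r), r

    while True:
        errs = [proj_error(s) for s in snapshots]
        worst, (e, r) = max(enumerate(errs), key=lambda t: t[1][0])
        if e <= tol or (max_dim and len(basis) >= max_dim):
            return basis
        basis.append([x / e for x in r])  # normalize residual, add to basis

# Three snapshots spanning a 2-dimensional subspace of R^3:
snaps = [[1, 0, 0], [0, 1, 0], [1, 1, 0]]
print(len(greedy_reduced_basis(snaps)))  # 2
```

The greedy loop stops once every snapshot is reproduced to tolerance, here after two basis vectors since the three snapshots span only a plane; in an actual RBM the "snapshots" are high-fidelity PDE solutions and the error indicator is computable without solving for each parameter.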
Extremal Optimization for Graph Partitioning
Extremal optimization is a new general-purpose method for approximating
solutions to hard optimization problems. We study the method in detail by way
of the NP-hard graph partitioning problem. We discuss the scaling behavior of
extremal optimization, focusing on the convergence of the average run as a
function of runtime and system size. The method has a single free parameter,
which we determine numerically and justify using a simple argument. Our
numerical results demonstrate that on random graphs, extremal optimization
maintains consistent accuracy for increasing system sizes, with an
approximation error decreasing over runtime roughly as a power law t^(-0.4). On
geometrically structured graphs, the scaling of results from the average run
suggests that these are far from optimal, with large fluctuations between
individual trials. But when only the best runs are considered, results
consistent with theoretical arguments are recovered.
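A tau-EO run for bisection can be sketched as below. This is a simplified variant, not the paper's implementation: vertex fitness is the fraction of a vertex's neighbours on its own side, the update move is a random cross-side swap, and the step count, tau value, and test graph are invented for illustration.

```python
import random

def extremal_optimization_bisect(n, edges, steps=2000, tau=1.4, seed=1):
    """Toy tau-EO for graph bisection: rank vertices by fitness, pick one
    with power-law rank probability (favoring the worst), swap it with a
    random vertex from the other side, and track the best cut seen."""
    rng = random.Random(seed)
    adj = {v: [] for v in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    side = [i % 2 for i in range(n)]  # start from an arbitrary bisection

    def cutsize():
        return sum(side[u] != side[v] for u, v in edges)

    def fitness(v):  # fraction of v's edges staying within its own side
        deg = len(adj[v]) or 1
        return sum(side[v] == side[u] for u in adj[v]) / deg

    best, best_side = cutsize(), side[:]
    rank_p = [1 / (r + 1) ** tau for r in range(n)]  # P(rank r) ~ r^-tau
    total_p = sum(rank_p)
    for _ in range(steps):
        order = sorted(range(n), key=fitness)  # worst fitness first
        x = rng.random() * total_p
        r = 0
        while x > rank_p[r]:
            x -= rank_p[r]
            r += 1
        v = order[r]
        w = rng.choice([u for u in range(n) if side[u] != side[v]])
        side[v], side[w] = side[w], side[v]  # swap preserves the balance
        c = cutsize()
        if c < best:
            best, best_side = c, side[:]
    return best, best_side

# Two 4-cliques joined by one edge: the optimal bisection cuts 1 edge.
edges = ([(i, j) for i in range(4) for j in range(i + 1, 4)] +
         [(i, j) for i in range(4, 8) for j in range(i + 1, 8)] +
         [(0, 4)])
best, part = extremal_optimization_bisect(8, edges)
print(best)
```

Unlike simulated annealing, EO has no temperature schedule; tau is its single free parameter, matching the abstract's description, and accepting every move (while remembering the best) is what lets it escape local optima.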
- …