
    Plain English or Plain Confusing

    A presumption of perfection attaches to the pattern instructions that Missouri judges read to jurors in every civil case. Missouri law presumes that these instructions, as set forth in Missouri Approved Jury Instructions (MAI), Fifth Edition, are not only infallible statements of the law but also perfectly comprehensible to the average juror. Even if jurors in a given case complain that they do not understand a particular pattern instruction, the trial judge is without recourse, required to leave the instructions undisturbed even if rewriting them would make them more understandable. In contrast to the emphasis in many states on legal accuracy, and to the MAI drafters' apparent equal emphasis on both accuracy and comprehensibility, a wealth of empirical research and scholarly commentary over the past twenty years indicates that simplicity and comprehensibility need far more attention. Numerous studies have identified an ongoing nationwide problem of juror miscomprehension of pattern instructions. Other research efforts have taken the next step: searching for ways to make currently incomprehensible instructions understandable to jurors. Most of these studies have found significant improvement in juror comprehension when traditional pattern instructions are rewritten according to psycholinguistic principles of simple English.

    Post-Transcriptional Mechanisms of Neuronal Translational Control in Synaptic Plasticity

    The dynamic complexity of synaptic function is matched by extensive multidimensional regulation of neuronal mRNA translation, which is achieved by a number of post‐transcriptional mechanisms. The first key aspect of this regulatory capacity is mRNA distal trafficking through RNA‐binding proteins, which governs the transcriptomic composition of post‐synaptic compartments. Small non‐coding microRNA and associated machinery have the capacity to precisely coordinate neural gene networks in space and time by providing a flexible specificity dimension to translational regulation. This RNA‐guided subcellular fine‐tuning of protein synthesis is an exquisite mechanism used in neurons to exert control over synaptic properties. Emerging evidence also implicates brain‐enriched long non‐coding RNA and novel circular RNA in post‐transcriptional regulation of gene expression through the modulation of both mRNA and miRNA functions, thereby exemplifying the complex nature of neuronal translation. Herein, we review current knowledge of these regulatory systems and analyse how these mechanisms of transcriptomic regulation may be linked together to achieve high‐order spatiotemporal control of post‐synaptic translation.

    Personality differentiation by cognitive ability: An application of the moderated factor model

    The personality differentiation hypothesis holds that at higher levels of intellectual ability, personality structure is more differentiated. We tested differentiation at the primary and global factor levels in the US standardisation sample of the 16PF5 (n = 10,261; 5,124 male; mean age = 32.69 years, SD = 12.83 years). We used a novel combined item response theory and moderated factor model approach that overcomes many of the limitations of previous tests. We found moderation of latent factor variances in five of the fifteen primary personality traits of the 16PF. At the domain level, we found no evidence of personality differentiation in Extraversion, Self-Control, or Independence. We found evidence of moderated factor loadings consistent with personality differentiation for Anxiety, and moderated factor loadings consistent with anti-differentiation for Tough-Mindedness. As differentiation was restricted to a few personality factors with small effect sizes, we conclude that there is only very limited support for the personality differentiation hypothesis.
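
    For orientation, the following is a minimal sketch of a single-factor moderated factor model in the spirit of the approach described above; the linear loading moderation and log-linear variance moderation are illustrative assumptions, not the authors' exact specification (which additionally incorporates an item response theory measurement model):

    \[
    y_{ij} = \tau_j + \lambda_j(z_i)\,\eta_i + \varepsilon_{ij}, \qquad
    \lambda_j(z_i) = \lambda_{0j} + \lambda_{1j} z_i, \qquad
    \operatorname{Var}(\eta_i \mid z_i) = \exp(\alpha_0 + \alpha_1 z_i)
    \]

    Here y_{ij} is person i's response to indicator j, z_i is cognitive ability, and \eta_i is the latent personality factor; differentiation at higher ability corresponds to loadings or latent factor variance decreasing in z_i (e.g. \lambda_{1j} < 0 or \alpha_1 < 0), and anti-differentiation to the reverse.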

    Relations and Equivalences Between Circuit Lower Bounds and Karp-Lipton Theorems

    A frontier open problem in circuit complexity is to prove P^{NP} is not in SIZE[n^k] for all k; this is a necessary intermediate step towards NP is not in P_{/poly}. Previously, for several classes containing P^{NP}, including NP^{NP}, ZPP^{NP}, and S_2 P, such lower bounds have been proved via Karp-Lipton-style theorems: to prove C is not in SIZE[n^k] for all k, we show that C subset P_{/poly} implies a "collapse" D = C for some larger class D, where we already know D is not in SIZE[n^k] for all k. It seems obvious that one could take a different approach to prove circuit lower bounds for P^{NP} that does not require proving any Karp-Lipton-style theorems along the way. We show this intuition is wrong: (weak) Karp-Lipton-style theorems for P^{NP} are equivalent to fixed-polynomial size circuit lower bounds for P^{NP}. That is, P^{NP} is not in SIZE[n^k] for all k if and only if (NP subset P_{/poly} implies PH subset i.o.-P^{NP}_{/n}). Next, we present new consequences of the assumption NP subset P_{/poly}, towards proving similar results for NP circuit lower bounds. We show that under the assumption, fixed-polynomial circuit lower bounds for NP, nondeterministic polynomial-time derandomizations, and various fixed-polynomial time simulations of NP are all equivalent. Applying this equivalence, we show that circuit lower bounds for NP imply better Karp-Lipton collapses. That is, if NP is not in SIZE[n^k] for all k, then for all C in {Parity-P, PP, PSPACE, EXP}, C subset P_{/poly} implies C subset i.o.-NP_{/n^epsilon} for all epsilon > 0. Note that unconditionally, the collapses are only to MA and not to NP. We also explore consequences of circuit lower bounds for a sparse language in NP. Among other results, we show that if a polynomially-sparse NP language does not have n^{1+epsilon}-size circuits, then MA subset i.o.-NP_{/O(log n)}, MA subset i.o.-P^{NP[O(log n)]}, and NEXP is not in SIZE[2^{o(m)}]. Finally, we observe connections between these results and the "hardness magnification" phenomena described in recent works.
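
    Restated in LaTeX notation (a direct transcription of the plain-text claim above, not an additional result), the central equivalence is:

    \[
    \mathrm{P}^{\mathrm{NP}} \not\subseteq \mathrm{SIZE}[n^k] \ \text{for all } k
    \iff
    \bigl(\mathrm{NP} \subseteq \mathrm{P}_{/\mathrm{poly}} \implies \mathrm{PH} \subseteq \text{i.o.-}\mathrm{P}^{\mathrm{NP}}_{/n}\bigr)
    \]

    In words: fixed-polynomial circuit lower bounds for P^{NP} hold exactly when the weak, infinitely-often, n-bits-of-advice Karp-Lipton-style collapse on the right-hand side holds.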

    Experiments on causal exclusion

    Intuitions play an important role in the debate on the causal status of high-level properties. For instance, Kim has claimed that his “exclusion argument” relies on “a perfectly intuitive ... understanding of the causal relation.” We report the results of three experiments examining whether laypeople really have the relevant intuitions. We find little support for Kim's view and the principles on which it relies. Instead, we find that laypeople are willing to count both a multiply realized property and its realizers as causes, and regard the systematic overdetermination implied by this view as unproblematic.

    Hearing, Cognitive Decline, and the Value of Hearing Interventions

    The term “dementia” includes a wide array of diseases. Millions of Americans are affected by these diseases, especially with aging. This prevalence makes dementia a candidate for exploratory research into its various etiologies and cause-effect relationships, in the hope of developing treatments. Numerous studies have attempted to discern whether a causal relationship exists between hearing loss and dementia, as hearing loss frequently precedes dementia. Some publications have reported a correlation between hearing loss treatment and a decreased dementia incidence rate. This review investigates the associations between hearing loss and dementia, the efficacy of hearing interventions as a preventative measure, and the potential for using these measures as treatment for dementia.

    Adapting to climate change: Combining economics and geomorphology in coastal policy

    How is climate change affecting our coastal environment? How can coastal communities adapt to sea level rise and increased storm risk? These questions have garnered tremendous interest from scientists and policy makers alike, as the dynamic coastal environment is particularly vulnerable to the impacts of climate change. Over half the world's population lives and works in a coastal zone less than 120 miles wide and is therefore continuously affected by changes in the coastal environment [6]. Housing markets are directly influenced by the physical processes that govern coastal systems. Beach towns like Oak Island in North Carolina (NC) face severe erosion, and the tax-assessed value of one coastal property fell by 93% in 2007 [9]. With almost ninety percent of the sandy beaches in the US facing moderate to severe erosion [8], coastal communities often intervene to stabilize the shoreline and hold back the sea in order to protect coastal property and infrastructure. Beach nourishment, the process of rebuilding a beach by periodically replacing an eroding section with sand dredged from another location, is a policy for erosion control in many parts of the US Atlantic and Pacific coasts [3]. Beach nourishment projects in the United States are primarily federally funded and are implemented by the Army Corps of Engineers (ACE) after a benefit-cost analysis. Benefits from beach nourishment include reduction in storm damage and recreational benefits from a wider beach. Costs include the expected cost of construction, the present value of periodic maintenance, and any external costs, such as the environmental costs associated with a nourishment project (NOAA). Federal appropriations for nourishment totaled $787 million from 1995 to 2002 [10]. Human interventions to stabilize shorelines and physical coastal dynamics are strongly coupled. The value of the beach, in the form of storm protection and recreation amenities, is at least partly capitalized into property values. These beach values ultimately influence the benefit-cost analysis in support of shoreline stabilization policy, which, in turn, affects shoreline dynamics. This paper explores the policy implications of this circularity. With a better understanding of the physical-economic feedbacks, policy makers can more effectively design climate change adaptation strategies. (PDF contains 4 pages.)
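
    As a hedged illustration of the benefit-cost comparison described above (the symbols, discount rate, and horizon are generic placeholders, not figures from the paper), the net present value of a nourishment project takes the familiar form:

    \[
    \mathrm{NPV} = \sum_{t=0}^{T} \frac{B_t - C_t}{(1+r)^t}
    \]

    where B_t collects storm-damage reduction and recreation benefits in year t, C_t collects construction, periodic re-nourishment, and external (environmental) costs, r is the discount rate, and T is the project horizon; a project passes the benefit-cost test when NPV > 0, i.e. when the present value of benefits exceeds the present value of costs.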

    Dependence of gene-by-environment interactions (GxE) on scaling: Comparing the use of sum scores, transformed sum scores and IRT scores for the phenotype in tests of GxE interaction

    Estimates of gene–environment interactions (GxE) in behavior genetic models depend on how a phenotype is scaled. Inappropriately scaled phenotypes result in biased estimates of GxE and can sometimes even suggest GxE in the direction opposite to its true direction. Previously proposed solutions are mathematically complex, computationally demanding, and may prove impractical for the substantive researcher. We therefore evaluated two simple-to-use alternatives: (1) straightforward non-linear transformation of sum scores and (2) factor scores from an appropriate item response theory (IRT) model. Within Purcell’s (2002) GxM framework, both alternatives provided less biased parameter estimates and better false-positive and true-positive rates than a raw sum score. These approaches are therefore recommended over raw sum scores in tests of GxE. Circumstances under which IRT factor scores versus transformed sum scores should be preferred are discussed.
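
    For context, Purcell-style moderation models of the kind referenced above decompose the phenotype roughly as follows; this is a generic sketch of the GxM idea under standard ACE assumptions, not the authors' exact parameterisation:

    \[
    P_i = \mu + \beta_M M_i + (a + \beta_a M_i)\,A_i + (c + \beta_c M_i)\,C_i + (e + \beta_e M_i)\,E_i
    \]

    where A_i, C_i, and E_i are latent additive-genetic, shared-environmental, and non-shared-environmental factors, M_i is the measured moderator, and \beta_a is the GxE parameter of interest. The scaling problem arises because a non-linear relation between the observed sum score and the underlying latent phenotype can itself produce spurious non-zero moderation estimates, an artefact that transformed sum scores or IRT factor scores mitigate.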