242 research outputs found

    Currencies and Prices in 3rd and 4th Century Palestine and Their Implications for Roman Economic History.

    The following study is an attempt to throw further light on Roman economic history of the III and IV cents. by drawing upon the Palestinian source-material of the period. Clearly, it is no more than a beginning in this direction, and makes no claims to being exhaustive either in the collection of material or in the analysis thereof. In the first section lists of Palestinian prices of different commodities are set out in chronological order, and compared with their Egyptian parallels. Babylonian material (analysed in Appendix C) is likewise presented. There follows a discussion of the monetary terminology of the period, and there certain semantic changes are noted and inferences of economic significance drawn. With the clarification of these terms, some observations are made on the patterns of III cent. monetary developments, and the nature of its price-levels. A series of legal texts is next analysed, and it is shown that they reflect the change from a silver to a gold standard, via a transitional period of economic instability and confusion. Thereafter follows an analysis of IV cent. Palestinian price-levels, and these are compared with the Egyptian evidence. It is suggested that internal discrepancies and apparent differences are to be explained on terminological grounds. In the final section, certain questions are raised concerning the chronological pattern of the III cent. economic developments, and some pointers to the answers hazarded. To end, a very brief and concentrated description of the social conditions of the times (viewed partially as implications of the economic developments) is given, primarily to indicate the possible range of the sources, their ability to illumine dark periods, and the embryonic state of these studies.

    Stroke lesion size: Still a useful biomarker for stroke severity and outcome in times of high-dimensional models

    BACKGROUND The volumetric size of a brain lesion is a frequently used stroke biomarker. It stands out among most imaging biomarkers for being a one-dimensional variable that is applicable in simple statistical models. In times of machine learning algorithms, the question arises of whether such a simple variable is still useful, or whether high-dimensional models on spatial lesion information are superior. METHODS We included 753 first-ever anterior circulation ischemic stroke patients (age 68.4±15.2 years; NIHSS at 24 h 4.4±5.1; modified Rankin Scale (mRS) at 3 months, median [IQR] 1 [0.75; 3]) and traced lesions on diffusion-weighted MRI. In an out-of-sample model validation scheme, we predicted stroke severity as measured by the NIHSS at 24 h and functional stroke outcome as measured by the mRS at 3 months, either from spatial lesion features or from lesion size. RESULTS For stroke severity, the best regression model based on lesion size performed significantly above chance (p < 0.0001) with R2 = 0.322, but models with spatial lesion features performed significantly better, with R2 = 0.363 (t(752) = 2.889; p = 0.004). For stroke outcome, the best classification model based on lesion size again performed significantly above chance (p < 0.0001) with an accuracy of 62.8%, which was not different from the best model with spatial lesion features (62.6%, p = 0.80). With smaller training data sets of only 150 or 50 patients, the performance of high-dimensional models with spatial lesion features decreased, up to the point of being equivalent or even inferior to models trained on lesion size. The combination of lesion size and spatial lesion features in one model did not improve predictions. CONCLUSIONS Lesion size is a decent biomarker for stroke outcome and severity that is slightly inferior to spatial lesion features but is particularly suited to studies with small samples. When low-dimensional models are desired, lesion size provides a viable proxy biomarker for spatial lesion features, whereas high-precision prediction models in personalised prognostic medicine should operate with high-dimensional spatial imaging features in large samples.
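
    The comparison the abstract describes can be pictured with a minimal, hypothetical sketch: a one-dimensional lesion-size model against a high-dimensional spatial-feature model, both scored out-of-sample. The synthetic data, the ridge regressor, and the dimensions are illustrative assumptions, not the study's pipeline.

        # Minimal sketch with synthetic data; not the study's pipeline.
        import numpy as np
        from sklearn.linear_model import Ridge
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_patients, n_voxels = 753, 2000                     # assumed dimensions
        # Binary lesion maps: 1 = voxel lesioned (invented data)
        spatial = rng.integers(0, 2, (n_patients, n_voxels)).astype(float)
        size = spatial.sum(axis=1, keepdims=True)            # lesion size = lesioned-voxel count
        # Synthetic severity outcome loosely driven by lesion size plus noise
        nihss = 0.05 * size.ravel() + rng.normal(0, 1, n_patients)

        # Out-of-sample R2 for the one-dimensional vs high-dimensional model
        r2_size = cross_val_score(Ridge(), size, nihss, cv=5, scoring="r2").mean()
        r2_spatial = cross_val_score(Ridge(), spatial, nihss, cv=5, scoring="r2").mean()
        print(f"R2 lesion size: {r2_size:.3f}  R2 spatial features: {r2_spatial:.3f}")

    With real lesion maps the spatial model can pick up location information the scalar size discards, which is the gap the study quantifies; shrinking the training set is what erodes that advantage.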

    Automated System Identification for Satellite Attitude Control

    A novel approach to on-orbit system identification of satellite attitude control dynamics is presented. The approach is fully automated and will thus enable a variety of satellite applications, including high-performance proliferated constellations and modular payloads. The key enabling feature of the approach is the ability to estimate the uncertainty in the model and then perform additional data collections specifically to reduce that uncertainty. A prototype software implementation of the algorithm accurately estimated multiple structural modes in a CubeSat simulation and a CubeSat reaction wheel testbed, in preparation for an on-orbit demonstration as part of The Aerospace Corporation's Slingshot 1 mission.
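
    The uncertainty-driven loop can be illustrated with a rough sketch under assumed dynamics (one lightly damped AR(2) mode; this is not The Aerospace Corporation's algorithm): fit the mode by least squares, derive a parameter covariance from the residuals, and collect more data whenever the uncertainty is still too large.

        # Hedged sketch: iterative identification of one structural mode,
        # collecting additional data until parameter uncertainty shrinks.
        import numpy as np

        rng = np.random.default_rng(1)
        a_true = np.array([1.8, -0.9])   # assumed AR(2) coefficients (lightly damped mode)

        def collect(n):
            """Simulate n samples of the noise-excited mode (stand-in for telemetry)."""
            y = np.zeros(n)
            for k in range(2, n):
                y[k] = a_true[0] * y[k - 1] + a_true[1] * y[k - 2] + rng.normal(0, 0.1)
            return y

        y = collect(500)
        while True:
            # Least-squares fit of y[k] ~ a1*y[k-1] + a2*y[k-2]
            Phi = np.column_stack([y[1:-1], y[:-2]])
            a_hat, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
            resid = y[2:] - Phi @ a_hat
            sigma2 = resid @ resid / (len(resid) - 2)
            cov = sigma2 * np.linalg.inv(Phi.T @ Phi)    # parameter covariance
            if np.sqrt(np.diag(cov)).max() < 1e-2:       # uncertainty small enough?
                break
            # Otherwise schedule another data collection
            # (batch-boundary effects ignored in this sketch)
            y = np.concatenate([y, collect(500)])
        print("estimated mode coefficients:", a_hat)

    The design point is the stopping rule: data collection is driven by the estimated covariance rather than by a fixed schedule, which is what makes the identification automatable.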

    Some, And Possibly All, Scalar Inferences Are Not Delayed: Evidence For Immediate Pragmatic Enrichment

    Scalar inferences are commonly generated when a speaker uses a weaker expression rather than a stronger alternative, e.g., John ate some of the apples implies that he did not eat them all. This article describes a visual-world study investigating how and when perceivers compute these inferences. Participants followed spoken instructions containing the scalar quantifier some directing them to interact with one of several referential targets (e.g., Click on the girl who has some of the balloons). Participants fixated on the target compatible with the implicated meaning of some and avoided a competitor compatible with the literal meaning prior to a disambiguating noun. Further, convergence on the target was as fast for some as for the non-scalar quantifiers none and all. These findings indicate that the scalar inference is computed immediately and is not delayed relative to the literal interpretation of some. It is argued that previous demonstrations that scalar inferences increase processing time are not necessarily due to delays in generating the inference itself, but rather arise because integrating the interpretation of the inference with relevant information in the context may require additional time. With sufficient contextual support, processing delays disappear.

    Raising argument strength using negative evidence: A constraint on models of induction

    Both intuitively, and according to similarity-based theories of induction, relevant evidence raises argument strength when it is positive and lowers it when it is negative. In three experiments, we tested the hypothesis that argument strength can actually increase when negative evidence is introduced. Two kinds of argument were compared through forced choice or sequential evaluation: single positive arguments (e.g., “Shostakovich’s music causes alpha waves in the brain; therefore, Bach’s music causes alpha waves in the brain”) and double mixed arguments (e.g., “Shostakovich’s music causes alpha waves in the brain, X’s music DOES NOT; therefore, Bach’s music causes alpha waves in the brain”). Negative evidence in the second premise lowered credence when it applied to an item X from the same subcategory (e.g., Haydn) and raised it when it applied to a different subcategory (e.g., AC/DC). The results constitute a new constraint on models of induction.
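
    Why this is a constraint can be made concrete with a toy, entirely hypothetical version of a similarity-based strength model (the similarity values below are invented): if negative evidence only ever subtracts similarity, no negative premise can raise strength, which is exactly the pattern the reported results contradict.

        # Toy illustration of the class of model the results rule out:
        # strength = support from the positive premise minus a penalty
        # proportional to similarity to the negative item.
        def strength(sim_positive, sim_negative=0.0):
            return sim_positive - sim_negative

        base = strength(0.7)        # "Shostakovich -> Bach" alone
        near = strength(0.7, 0.6)   # negative item from the same subcategory (Haydn)
        far  = strength(0.7, 0.1)   # negative item from a distant subcategory (AC/DC)

        # In this model base >= far >= near: adding negative evidence can
        # never increase strength, contrary to the experimental finding.
        print(base, near, far)      # 0.7, ~0.1, 0.6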

    Optimal Relevance in Imperfect Information Games

    To help incorporate natural language into economic theory, this paper does two things. First, the paper extends to imperfect information games an equilibrium concept developed for incomplete information games, so natural language can be formalized as a vehicle to convey information about actions as well as types. This equilibrium concept is specific to language games, because information is conveyed by the sender through the message's literal meaning. Second, the paper proposes an equilibrium refinement which selects the sender's most preferred equilibrium. The refinement captures the notion that the speaker seeks to improve its status quo, aiming at optimal relevance. Explicit coordination through verbal communication parallels the idea of implicit coordination through focal points.

    Distinguishing Speed From Accuracy In Scalar Implicatures

    Scalar implicatures are inferences that arise when a weak expression is used instead of a stronger alternative. For example, when a speaker says, “Some of the children are in the classroom,” she often implies that not all of them are. Recent processing studies of scalar implicatures have argued that generating an implicature carries a cost. In this study we investigated this cost using a sentence verification task similar to that of Bott and Noveck (2004) combined with a response deadline procedure to estimate speed and accuracy independently. Experiment 1 compared implicit upper-bound interpretations (some [but not all]) with lower-bound interpretations (some [and possibly all]). Experiment 2 compared an implicit upper-bound meaning of some with the explicit upper-bound meaning of only some. Experiment 3 compared an implicit lower-bound meaning of some with the explicit lower-bound meaning of at least some. Sentences with implicatures required additional processing time that could not be attributed to retrieval probabilities or factors relating to semantic complexity. Our results provide evidence against several different types of processing models, including verification and nonverification default implicature models and cost-free contextual models. More generally, our data are the first to provide evidence of the costs associated with deriving implicatures per se.
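
    The response-deadline logic can be sketched as follows, using the standard shifted-exponential speed-accuracy trade-off function and invented data (not the study's actual fits): accuracy at each deadline is modelled as d'(t) = λ(1 − e^(−β(t−δ))) for t > δ, so asymptotic accuracy (λ) and processing dynamics (β, δ) are estimated separately.

        # Hedged sketch with invented data: fitting a speed-accuracy trade-off
        # curve so speed and asymptotic accuracy can be read off independently.
        import numpy as np
        from scipy.optimize import curve_fit

        def sat_curve(t, lam, beta, delta):
            """d'(t) = lam * (1 - exp(-beta * (t - delta))) for t > delta, else 0."""
            return lam * (1.0 - np.exp(-beta * np.clip(t - delta, 0.0, None)))

        deadlines = np.array([0.3, 0.5, 0.8, 1.2, 2.0, 3.0])   # seconds after stimulus
        dprime    = np.array([0.1, 0.9, 1.8, 2.3, 2.6, 2.7])   # accuracy at each deadline

        (lam, beta, delta), _ = curve_fit(sat_curve, deadlines, dprime, p0=[2.5, 2.0, 0.2])
        print(f"asymptote={lam:.2f}, rate={beta:.2f}/s, intercept={delta:.2f}s")

    A cost in deriving the implicature itself would show up in the dynamics parameters (β, δ), while a difference in final interpretation quality would show up in the asymptote (λ), which is why the procedure can tease the two apart.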

    Healthy Annapolis

    Final project for Urban Studies and Planning Studio (Summer 2017), University of Maryland, College Park. Annapolis, Maryland, located in Anne Arundel County, is home to the United States Naval Academy and Saint John’s College. The small waterfront capital city is also a popular tourist destination for sailors and history buffs drawn to the nationally recognized historic district. While continuing to focus on preserving the City’s historic and natural resources and strong local economy, Annapolis is taking steps to become a healthier city by participating in the Let’s Move! Cities, Towns, and Counties (LMCTC) initiative, a national campaign to end childhood obesity by providing guidance to elected officials, parents, schools, community leaders, and other stakeholders in order to make healthy living accessible for everyone. Annapolis has successfully met the five initial program goals for LMCTC and has achieved All-Star status. This report will help the City pursue three of the four All-Star strategies it is now eligible for. The report highlights disadvantaged communities, as they are more likely to suffer from poor health. In addition to an increased likelihood of health issues, these communities are also less likely to have resources such as education and community support to improve certain aspects of their health. This University of Maryland PALS summer studio project is meant to help guide the City of Annapolis in creating a healthier city for all residents and in reaching its LMCTC All-Star strategies. Four chapters were written by groups that focused on health-related aspects of the city that relate directly to areas of focus for achieving All-Star status: 1) updates to incorporate health into the Comprehensive Plan, 2) parks and open space, 3) bicycle infrastructure, and 4) urban agriculture and community gardens. We hope that by providing recommendations for integrating health into the planning process and city design, and by suggesting strategies to make the most effective use of existing tools, Annapolis will be better situated to achieve its LMCTC All-Star strategies.

    Comparative effectiveness of less commonly used systemic monotherapies and common combination therapies for moderate to severe psoriasis in the clinical setting.

    BACKGROUND: The effectiveness of psoriasis therapies in real-world settings remains relatively unknown. OBJECTIVE: We sought to compare the effectiveness of less commonly used systemic therapies and commonly used combination therapies for psoriasis. METHODS: This was a multicenter cross-sectional study of 203 patients with plaque psoriasis receiving less common systemic monotherapy (acitretin, cyclosporine, or infliximab) or common combination therapies (adalimumab, etanercept, or infliximab and methotrexate) compared with 168 patients receiving methotrexate, evaluated at 1 of 10 US outpatient dermatology sites participating in the Dermatology Clinical Effectiveness Research Network. RESULTS: In adjusted analyses, patients on acitretin (relative response rate 2.01; 95% confidence interval [CI] 1.18-3.41), infliximab (relative response rate 1.93; 95% CI 1.26-2.98), adalimumab and methotrexate (relative response rate 3.04; 95% CI 2.12-4.36), etanercept and methotrexate (relative response rate 2.22; 95% CI 1.25-3.94), and infliximab and methotrexate (relative response rate 1.72; 95% CI 1.10-2.70) were more likely to have clear or almost clear skin compared with patients on methotrexate. There were no differences among treatments when response rate was defined by health-related quality of life. LIMITATIONS: Single time point assessment may result in overestimation of effectiveness. CONCLUSIONS: The efficacy of therapies in clinical trials may overestimate their effectiveness as used in clinical practice. Although physician-reported relative response rates were different among therapies, absolute differences were small and did not correspond to differences in patient-reported outcomes.
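
    For readers unfamiliar with the headline statistic, a relative response rate with a 95% CI can be computed in its simple unadjusted form as below (the counts are invented, and the study's estimates were adjusted, so this is only a sketch of the arithmetic).

        # Unadjusted sketch of a relative response rate with a 95% CI via the
        # standard normal approximation on the log scale; counts are invented.
        import math

        def relative_response_rate(events_tx, n_tx, events_ref, n_ref):
            """Risk ratio of response rates with a Katz log-scale 95% CI."""
            rr = (events_tx / n_tx) / (events_ref / n_ref)
            se = math.sqrt(1/events_tx - 1/n_tx + 1/events_ref - 1/n_ref)
            lo, hi = rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se)
            return rr, lo, hi

        # e.g., 30/60 clear or almost clear on a combination vs 25/168 on methotrexate
        rr, lo, hi = relative_response_rate(30, 60, 25, 168)
        print(f"relative response rate {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")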