Complexity Results for Modal Dependence Logic
Modal dependence logic was introduced recently by Väänänen. It enhances
the basic modal language by an operator =(). For propositional variables
p_1,...,p_n, =(p_1,...,p_(n-1);p_n) intuitively states that the value of p_n is
determined by those of p_1,...,p_(n-1). Sevenster (J. Logic and Computation,
2009) showed that satisfiability for modal dependence logic is complete for
nondeterministic exponential time. In this paper we consider fragments of modal
dependence logic obtained by restricting the set of allowed propositional
connectives. We show that satisfiability for poor man's dependence logic, the
language consisting of formulas built from literals and dependence atoms using
conjunction, necessity and possibility (i.e., disallowing disjunction), remains
NEXPTIME-complete. If we only allow monotone formulas (without negation, but
with disjunction), the complexity drops to PSPACE-completeness. We also extend
Väänänen's language by allowing classical disjunction besides dependence
disjunction and show that the satisfiability problem remains NEXPTIME-complete.
If we then disallow both negation and dependence disjunction, satisfiability is
complete for the second level of the polynomial hierarchy. In this way we
completely classify the computational complexity of the satisfiability problem
for all restrictions of propositional and dependence operators considered by
Väänänen and Sevenster. Comment: 22 pages, full version of CSL 2010 paper.
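The dependence atom described above has a simple team-semantics reading that can be sketched in code: a team is a set of assignments, and =(p_1,...,p_(n-1);p_n) holds in a team iff any two assignments that agree on p_1,...,p_(n-1) also agree on p_n. This is a minimal illustrative sketch only; the function and variable names are not from the paper:

```python
from itertools import combinations

# A "team" is a set of assignments, here modelled as dicts mapping
# propositional variables to truth values (0/1).

def dependence_atom_holds(team, determiners, determined):
    """Check =(determiners; determined): any two assignments in the team
    that agree on all determiners must also agree on the determined variable."""
    for s, t in combinations(team, 2):
        if all(s[x] == t[x] for x in determiners) and s[determined] != t[determined]:
            return False
    return True

# In this team, p3 is the XOR of p1 and p2, so =(p1, p2; p3) holds:
team = [
    {"p1": 0, "p2": 0, "p3": 0},
    {"p1": 0, "p2": 1, "p3": 1},
    {"p1": 1, "p2": 0, "p3": 1},
]
```

Note that the atom is evaluated against a whole team at once, not a single assignment; this is what distinguishes dependence logic from ordinary modal logic.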
Dependence Logic with Generalized Quantifiers: Axiomatizations
We prove two completeness results, one for the extension of dependence logic
by a monotone generalized quantifier Q with weak interpretation, weak in the
meaning that the interpretation of Q varies with the structures. The second
result considers the extension of dependence logic where Q is interpreted as
"there exist uncountably many." Both of the axiomatizations are shown to be
sound and complete for FO(Q) consequences. Comment: 17 pages.
Changing a semantics: opportunism or courage?
The generalized models for higher-order logics introduced by Leon Henkin, and
their multiple offspring over the years, have become a standard tool in many
areas of logic. Even so, discussion has persisted about their technical status,
and perhaps even their conceptual legitimacy. This paper gives a systematic
view of generalized model techniques, discusses what they mean in mathematical
and philosophical terms, and presents a few technical themes and results about
their role in algebraic representation, calibrating provability, lowering
complexity, understanding fixed-point logics, and achieving set-theoretic
absoluteness. We also show how thinking about Henkin's approach to semantics of
logical systems in this generality can yield new results, dispelling the
impression of adhocness. This paper is dedicated to Leon Henkin, a deep
logician who has changed the way we all work, while also being an always open,
modest, and encouraging colleague and friend. Comment: 27 pages. To appear in: The life and work of Leon Henkin: Essays on his contributions (Studies in Universal Logic), eds: Manzano, M., Sain, I. and Alonso, E., 201
Twin Paradox and the logical foundation of relativity theory
We study the foundation of space-time theory in the framework of first-order
logic (FOL). Since the foundation of mathematics has been successfully carried
through (via set theory) in FOL, it is not entirely impossible to do the same
for space-time theory (or relativity). First we recall a simple and streamlined
FOL-axiomatization SpecRel of special relativity from the literature. SpecRel
is complete with respect to questions about inertial motion. Then we ask
ourselves whether we can prove usual relativistic properties of accelerated
motion (e.g., clocks in acceleration) in SpecRel. As it turns out, this is
practically equivalent to asking whether SpecRel is strong enough to "handle"
(or treat) accelerated observers. We show that there is a mathematical
principle called induction (IND) coming from real analysis which needs to be
added to SpecRel in order to handle situations involving relativistic
acceleration. We present an extended version AccRel of SpecRel which is strong
enough to handle accelerated motion, in particular, accelerated observers.
Among others, we show that the Twin Paradox becomes provable in AccRel, but it
is not provable without IND. Comment: 24 pages, 6 figures.
Home Is Where the Smart Is: Development and Validation of the Cybersecurity Self-Efficacy in Smart Homes (CySESH) Scale
The ubiquity of devices connected to the internet raises concerns about the security and privacy of smart homes. The effectiveness of interventions to support secure user behaviors is limited by a lack of validated instruments to measure relevant psychological constructs, such as self-efficacy - the belief that one is able to perform certain behaviors. We developed and validated the Cybersecurity Self-Efficacy in Smart Homes (CySESH) scale, a 12-item unidimensional measure of domain-specific self-efficacy beliefs, across five studies (N = 1247). Three pilot studies generated and refined an item pool. We report evidence from one initial and one major, preregistered validation study for (1) excellent reliability (α = 0.90), (2) convergent validity with self-efficacy in information security (r_SEIS = 0.64, p < .001), and (3) discriminant validity with outcome expectations (r_OE = 0.26, p < .001), self-esteem (r_RSE = 0.17, p < .001), and optimism (r_LOT-R = 0.18, p < .001). We discuss CySESH's potential to advance future HCI research on cybersecurity, practitioner user assessments, and implications for consumer protection policy.
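The reliability figure reported above (Cronbach's α = 0.90) comes from the standard internal-consistency formula α = k/(k-1) · (1 - Σ item variances / variance of total scores). A minimal Python sketch of this calculation, with an illustrative function name and toy data (neither is from the paper):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    items: list of k item-score lists, each of length n (one score per respondent).
    Returns k/(k-1) * (1 - sum of item variances / variance of total scores).
    """
    k, n = len(items), len(items[0])

    def var(xs):  # sample variance with n-1 denominator
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return (k / (k - 1)) * (1 - sum(var(item) for item in items) / var(totals))
```

For perfectly correlated items the formula yields α = 1; uncorrelated or constant items pull α toward 0, which is why a value of 0.90 indicates high internal consistency for a 12-item scale.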
A World Full of Privacy and Security (Mis)conceptions? Findings of a Representative Survey in 12 Countries
Misconceptions about digital security and privacy topics in the general public frequently lead to insecure behavior. However, little is known about the prevalence and extent of such misconceptions in a global context. In this work, we present the results of the first large-scale survey of a global population on misconceptions: We conducted an online survey with n = 12,351 participants in 12 countries on four continents. By investigating influencing factors of misconceptions around eight common security and privacy topics (including E2EE, Wi-Fi, VPN, and malware), we find the country of residence to be the strongest estimate for holding misconceptions. We also identify differences between non-Western and Western countries, demonstrating the need for region-specific research on user security knowledge, perceptions, and behavior. While we did not observe many outright misconceptions, we did identify a lack of understanding and uncertainty about several fundamental privacy and security topics.
Self-medication with antibiotics in rural population in Greece: a cross-sectional multicenter study
Background: Self-medication is an important driver of antimicrobial overuse and a worldwide problem. The aim of the present study was to estimate the use of antibiotics, without medical prescription, in a sample of the rural population presenting in primary care in southern Greece.
Methods: The study included data from 1,139 randomly selected adults (545 men/594 women, mean age ± SD: 56.2 ± 19.8 years) who visited the 6 rural Health Centres of southern Greece between November 2009 and January 2010. Eligible participants were approached on a one-to-one basis and asked to complete an anonymous questionnaire.
Results: Use of antibiotics within the past 12 months was reported by 888 participants (77.9%). 508 individuals (44.6%) reported that they had received antibiotics without medical prescription at least once. The major source of self-medication was the pharmacy without prescription (76.2%). The antibiotics most frequently used for self-medication were amoxicillin (18.3%), amoxicillin/clavulanic acid (15.4%), cefaclor (9.7%), cefuroxime (7.9%), cefprozil (4.7%) and ciprofloxacin (2.3%). Fever (41.2%), common cold (32.0%) and sore throat (20.6%) were the most frequent indications for the use of self-medicated antibiotics.
Conclusion: In Greece, despite open and rapid access to primary care services, a high proportion of the rural adult population appears to use antibiotics without medical prescription, mainly for fever and the common cold.
Vertical profiles of sub-3nm particles over the boreal forest
This work presents airborne observations of sub-3 nm particles in the lower troposphere and investigates new particle formation (NPF) within an evolving boundary layer (BL). We studied particle concentrations together with supporting gas and meteorological data inside the planetary BL over a boreal forest site in Hyytiälä, southern Finland. The analysed data were collected during three flight measurement campaigns: May-June 2015, August 2015 and April-May 2017, including 27 morning and 26 afternoon vertical profiles. As a platform for the instrumentation, we used a Cessna 172 aircraft. The analysed flight data were collected horizontally within a 30 km distance from SMEAR II in Hyytiälä and vertically from 100 m above ground level up to 2700 m. The number concentration of 1.5-3 nm particles was observed to be, on average, highest near the forest canopy top and to decrease with increasing altitude during the mornings of NPF event days. This indicates that the precursor vapours emitted by the forest play a key role in NPF in Hyytiälä. During daytime, newly formed particles were observed to grow in size, and the particle population became more homogeneous within the well-mixed BL in the afternoon. During undefined days with respect to NPF, we also detected an increase in the concentration of 1.5-3 nm particles in the morning but not their growth in size, which indicates an interrupted NPF process during these undefined days. Vertical mixing was typically stronger during the NPF event days than during the undefined or non-event days. The results shed light on the connection between boundary layer dynamics and NPF.
BAECC: a field campaign to elucidate the impact of Biogenic Aerosols on Clouds and Climate
Observations obtained during an 8-month deployment of AMF2 in a boreal environment in Hyytiälä, Finland, and the 20-year comprehensive in-situ data from the SMEAR-II station enable the characterization of biogenic aerosol, clouds and precipitation, and their interactions. During "Biogenic Aerosols - Effects on Clouds and Climate (BAECC)", the U.S. Department of Energy's Atmospheric Radiation Measurement (ARM) Program deployed the ARM 2nd Mobile Facility (AMF2) to Hyytiälä, Finland, for an 8-month intensive measurement campaign from February to September 2014. The primary research goal is to understand the role of biogenic aerosols in cloud formation. Hyytiälä is host to SMEAR-II (Station for Measuring Forest Ecosystem-Atmosphere Relations), one of the world's most comprehensive surface in-situ observation sites in a boreal forest environment. The station has been measuring atmospheric aerosols, biogenic emissions and an extensive suite of parameters relevant to atmosphere-biosphere interactions continuously since 1996. Combining vertical profiles from AMF2 with surface-based in-situ SMEAR-II observations allows processes at the surface to be directly related to processes occurring throughout the entire tropospheric column. Together with the inclusion of extensive surface precipitation measurements, and intensive observation periods involving aircraft flights and novel radiosonde launches, the complementary observations provide a unique opportunity for investigating aerosol-cloud interactions, and cloud-to-precipitation processes, in a boreal environment. The BAECC dataset provides opportunities for evaluating and improving models of aerosol sources and transport, cloud microphysical processes, and boundary-layer structures. In addition, numerical models are being used to bridge the gap between surface-based and tropospheric observations.
Cognitive stimulation in the workplace, plasma proteins, and risk of dementia: three analyses of population cohort studies
Objectives: To examine the association between cognitively stimulating work and subsequent risk of dementia and to identify protein pathways for this association.
Design: Multicohort study with three sets of analyses.
Setting: United Kingdom, Europe, and the United States.
Participants: Three associations were examined: cognitive stimulation and dementia risk in 107 896 participants from seven population based prospective cohort studies from the IPD-Work consortium (individual participant data meta-analysis in working populations); cognitive stimulation and proteins in a random sample of 2261 participants from one cohort study; and proteins and dementia risk in 13 656 participants from two cohort studies.
Main outcome measures: Cognitive stimulation was measured at baseline using standard questionnaire instruments on active versus passive jobs and at baseline and over time using a job exposure matrix indicator. 4953 proteins in plasma samples were scanned. Follow-up for incident dementia varied between 13.7 and 30.1 years depending on the cohort. People with dementia were identified through linked electronic health records and repeated clinical examinations.
Results: During 1.8 million person years at risk, 1143 people with dementia were recorded. The risk of dementia was found to be lower for participants with high compared with low cognitive stimulation at work (crude incidence of dementia per 10 000 person years 4.8 in the high stimulation group and 7.3 in the low stimulation group, age and sex adjusted hazard ratio 0.77, 95% confidence interval 0.65 to 0.92, heterogeneity in cohort specific estimates I² = 0%, P=0.99). This association was robust to additional adjustment for education, risk factors for dementia in adulthood (smoking, heavy alcohol consumption, physical inactivity, job strain, obesity, hypertension, and prevalent diabetes at baseline), and cardiometabolic diseases (diabetes, coronary heart disease, stroke) before dementia diagnosis (fully adjusted hazard ratio 0.82, 95% confidence interval 0.68 to 0.98). The lower risk of dementia was also observed during the first 10 years of follow-up (hazard ratio 0.60, 95% confidence interval 0.37 to 0.95) and from year 10 onwards (0.79, 0.66 to 0.95) and replicated using a repeated job exposure matrix indicator of cognitive stimulation (hazard ratio per 1 standard deviation increase 0.77, 95% confidence interval 0.69 to 0.86). In analysis controlling for multiple testing, higher cognitive stimulation at work was associated with lower levels of proteins that inhibit central nervous system axonogenesis and synaptogenesis: slit homologue 2 (SLIT2, fully adjusted β -0.34, P<0.001), carbohydrate sulfotransferase 12 (CHSTC, fully adjusted β -0.33, P<0.001), and peptidyl-glycine α-amidating monooxygenase (AMD, fully adjusted β -0.32, P<0.001). These proteins were associated with increased dementia risk, with the fully adjusted hazard ratio per 1 SD being 1.16 (95% confidence interval 1.05 to 1.28) for SLIT2, 1.13 (1.00 to 1.27) for CHSTC, and 1.04 (0.97 to 1.13) for AMD.
Conclusions: The risk of dementia in old age was found to be lower in people with cognitively stimulating jobs than in those with non-stimulating jobs. The findings that cognitive stimulation is associated with lower levels of plasma proteins that potentially inhibit axonogenesis and synaptogenesis and increase the risk of dementia might provide clues to underlying biological mechanisms.
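The crude incidence figures quoted in the results (4.8 and 7.3 per 10 000 person years) follow from the standard rate calculation, events divided by person-years of follow-up. A minimal sketch with a hypothetical helper name, applied to the overall counts reported above:

```python
def crude_incidence(cases, person_years, per=10_000):
    """Crude incidence rate: number of events per `per` person-years of follow-up."""
    return cases / person_years * per

# Across all cohorts: 1143 dementia cases over 1.8 million person years at risk.
overall_rate = crude_incidence(1143, 1_800_000)
```

The group-specific rates (4.8 and 7.3) use each group's own case count and person-years in the same way; the hazard ratios in the abstract are then modelled estimates adjusted for age, sex, and other covariates, not simple ratios of these crude rates.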