Ankylosing spondylitis and sarcoidosis – coincidence or association?
We report a 25-year-old woman presenting with sarcoidosis and bilateral sacroiliitis. Her sarcoidosis-related symptoms (malaise, cough and dyspnoea) improved dramatically under treatment with steroids, but severe back pain persisted. Only seven similar cases have been described over the last 40 years, and the question of a possible association between the two diseases has been raised. However, prevalence data from the literature and the apparent lack of genetic links argue for coincidence rather than association.
The Luminosity Distribution of Local Group Galaxies
From a rediscussion of Local Group membership, and of distances to individual
galaxies, we obtain luminosities for 35 probable and possible Local Group
members. The luminosity function of these objects is well fitted by a Schechter function. The probability that the luminosity distribution of the Local Group is a single Schechter function with faint-end slope steeper than -1.3 is less than 1 per cent. However, more complicated
luminosity functions, such as multi-component Schechter functions with steep
faint-end slopes, cannot be ruled out. There is some evidence that the
luminosity distribution of dwarf spheroidal galaxies in the Local Group is
steeper than that of dwarf irregular galaxies.
Comment: 13 pages, 2 figures, accepted for publication in The Astronomical Journal. Figure 2 replaced; the conclusion based on this figure has changed.
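For reference, the fit described above uses the standard Schechter form (the general definition, not numbers from this paper), in which α denotes the faint-end slope discussed in the abstract:

```latex
% Schechter (1976) luminosity function; \alpha is the faint-end slope.
\phi(L)\,\mathrm{d}L \;=\; \phi^{*}\left(\frac{L}{L^{*}}\right)^{\alpha}
\exp\!\left(-\frac{L}{L^{*}}\right)\frac{\mathrm{d}L}{L^{*}}
```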
Blazar synchrotron emission of instantaneously power-law injected electrons under linear synchrotron, non-linear SSC, and combined synchrotron-SSC cooling
The broadband SEDs of blazars show two distinct components which in leptonic
models are associated with synchrotron and SSC emission of highly relativistic
electrons. In some sources the SSC component dominates the synchrotron peak by
one or more orders of magnitude implying that the electrons mainly cool by
inverse Compton collisions with their self-made synchrotron photons. Therefore,
the linear synchrotron loss of electrons, which is normally invoked in emission
models, has to be replaced by a nonlinear loss rate depending on an energy
integral of the electron distribution. This modified electron cooling changes
significantly the emerging radiation spectra. It is the purpose of this work to
apply this new cooling scenario to relativistic power-law distributed
electrons, which are injected instantaneously into the jet. We will first solve
the differential equation of the volume-averaged differential number density of
the electrons, and then discuss their temporal evolution. Since any non-linear
cooling will turn into linear cooling after some time, we also calculated the
electron number density for a combined cooling scenario consisting of both the
linear and non-linear cooling. For all cases, we will also calculate
analytically the emerging optically thin synchrotron fluence spectrum which
will be compared to a numerical solution. For small normalized frequencies f <
1 the fluence spectra show constant spectral indices. We find for linear
cooling a_SYN = 1/2, and for non-linear cooling a_SSC = 3/2. In the combined cooling scenario we obtain b_1 = 1/2 for the small injection parameter and b_2 = 3/2 for the large injection parameter, the latter becoming b_1 = 1/2 again at very small frequencies. This is the same behaviour as for monoenergetically injected electrons.
Comment: 24 pages, 25 figures, submitted to A&A
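Schematically, the distinction drawn above between linear and non-linear cooling can be written as follows (a generic sketch of the standard leptonic cooling terms, with normalization constants D_0 and A_0 left unspecified and not taken from this paper); the non-linear SSC rate depends on an energy integral over the evolving electron distribution n(γ, t):

```latex
% Linear synchrotron cooling (fixed magnetic energy density):
\left.\dot{\gamma}\right|_{\mathrm{syn}} = -D_{0}\,\gamma^{2}
% Non-linear SSC cooling (depends on the instantaneous electron spectrum):
\left.\dot{\gamma}\right|_{\mathrm{SSC}} = -A_{0}\,\gamma^{2}
\int_{0}^{\infty}\mathrm{d}\tilde{\gamma}\;\tilde{\gamma}^{2}\,n(\tilde{\gamma},t)
```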
POS0758 DOES EXPERIENCE IN SYSTEMIC LUPUS ERYTHEMATOSUS INFLUENCE THE PHYSICIAN GLOBAL ASSESSMENT SCORING? A CROSS-SECTIONAL STUDY ON TWO EUROPEAN COHORTS
Background: The Physician Global Assessment (PGA) is an outcome instrument based on physician judgement of disease activity in patients with Systemic Lupus Erythematosus (SLE). Owing to the subjectivity of the score and the lack of standardization, the PGA can be a source of heterogeneity, because the same manifestations may be rated differently by physicians with different backgrounds (1).

Objectives: The purpose of this study was to evaluate the inter-rater reliability of the PGA between a rheumatology trainee and rheumatologists expert in SLE from two European countries.

Methods: SLE patients classified according to the SLICC 2012 criteria were enrolled between May 2019 and December 2019 during a SLEuro traineeship program. Demographic, clinical (SLEDAI-2K, PGA), serological and ongoing medication data were collected. The PGA was evaluated before (pre-lab) and after (post-lab) knowledge of laboratory exams, using a Visual Analogue Scale (VAS) ranging from 0 to 3, anchored at 1 (mild), 2 (moderate) and 3 (severe activity). A trainee in rheumatology (EC) and three rheumatologists expert in SLE (LA, MP, FS) independently scored the PGA for each patient. The trainee first received standardization training with her tutor (MP), consisting of a shared discussion of 10 consecutive SLE outpatients to increase reliability in PGA scoring. Inter-rater reliability was analysed using the intraclass correlation coefficient (ICC) with a two-way single-rating model, ICC(2,1); 95% confidence intervals (CI) were calculated.

Results: Fifty-seven patients (86% female) with SLE (29 belonging to a French cohort and 28 to an Italian cohort), with a mean (SD) age of 43.2 (15.9) years and a median [IQR] disease duration of 6.4 [2.0-15.4] years, were enrolled. Clinical features are presented in Table 1. Pre-lab PGA scores were obtained for all patients and ranged from 0 to 2.3; post-lab PGA scores were obtained for 51 patients and ranged from 0 to 2.9. Inter-rater reliability of the PGA between the trainee and each lupus expert was good to excellent: a) pre-lab PGA ICC 0.94, 95% CI 0.87-0.97; post-lab PGA ICC 0.94, 95% CI 0.87-0.97 (MP); b) pre-lab PGA ICC 0.84, 95% CI 0.63-0.93; post-lab PGA ICC 0.96, 95% CI 0.88-0.99 (LA); c) pre-lab PGA ICC 0.91, 95% CI 0.65-0.98; post-lab PGA ICC 0.91, 95% CI 0.65-0.98 (FS).

Conclusion: After adequate standardization, PGA scoring reaches good to excellent reliability between trainee and experts.

References:
[1] Chessa E, Piga M, Floris A, Devilliers H, Cauli A, Arnaud L. Use of Physician Global Assessment in systemic lupus erythematosus: a systematic review of its psychometric properties. Rheumatology (Oxford). 2020 Dec 1;59(12):3622-3632.

Table 1. Clinical data
Caucasian (n, %): 44 (77.2%)
Anti-dsDNA titre (median, IQR): 14 (0-75)
Hypocomplementemia (n, %): 30 (54%)
SLEDAI ≥ 6 (n, %): 18 (31.6%)
SLEDAI (median, IQR): 4 (2-6)
Flares (n, %): 18 (31.6%)
Ongoing prednisone treatment (n, %): 41 (71.9%)
Prednisone dose, mg (mean ± SD): 5 (0-8.9)
Hydroxychloroquine (n, %): 44 (77.2%)
Immunosuppressant (n, %): 35 (61.4%)

Acknowledgements: Elisabetta Chessa gratefully acknowledges the SLEuro European Lupus Society for its financial support of her traineeship in Strasbourg.

Disclosure of Interests: None declared.
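The reliability analysis above is based on ICC(2,1), the two-way random-effects, absolute-agreement, single-rating intraclass correlation. A minimal sketch of how such an estimate is computed from a subjects-by-raters score matrix, following the Shrout and Fleiss ANOVA decomposition (the PGA values below are made-up illustrative numbers, not the study's data):

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single rating
    (Shrout & Fleiss, 1979). `scores` has shape (n_subjects, k_raters)."""
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()

    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()    # between subjects
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()    # between raters
    ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols  # residual

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))

    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Hypothetical PGA scores (0-3 VAS) for 6 patients rated by 2 physicians.
pga = np.array([[0.0, 0.1],
                [1.2, 1.0],
                [0.5, 0.6],
                [2.3, 2.1],
                [1.8, 1.9],
                [0.9, 1.1]])
print(f"ICC(2,1) = {icc_2_1(pga):.2f}")
```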
The Age-Redshift Relation for Standard Cosmology
We present compact, analytic expressions for the age-redshift relation
for standard Friedmann-Lemaître-Robertson-Walker (FLRW)
cosmology. The new expressions are given in terms of incomplete Legendre
elliptic integrals and evaluate much faster than by direct numerical
integration.
Comment: 13 pages, 3 figures
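The abstract does not reproduce its closed-form elliptic-integral expressions; for orientation, the quantity being accelerated is the standard FLRW age-redshift integral, shown here by direct numerical integration for an illustrative flat ΛCDM parameter set (H0 and the density parameters below are assumed example values, not the paper's):

```python
import numpy as np
from scipy.integrate import quad

# Illustrative flat LambdaCDM parameters (assumed, not taken from the paper).
H0 = 70.0          # Hubble constant in km/s/Mpc
OM, OL = 0.3, 0.7  # matter and cosmological-constant density parameters

KM_PER_MPC = 3.0857e19
GYR_IN_S = 3.156e16
H0_INV_GYR = KM_PER_MPC / H0 / GYR_IN_S  # Hubble time 1/H0 in Gyr

def age_at_z(z):
    """Cosmic age t(z) = int_z^inf dz' / [(1+z') H(z')] for flat LCDM, in Gyr."""
    integrand = lambda zp: 1.0 / ((1.0 + zp) * np.sqrt(OM * (1.0 + zp) ** 3 + OL))
    val, _ = quad(integrand, z, np.inf)
    return H0_INV_GYR * val

for z in (0.0, 1.0, 3.0, 10.0):
    print(f"z = {z:5.1f}  ->  t = {age_at_z(z):6.2f} Gyr")
```

The analytic expressions of the paper replace this quadrature with incomplete Legendre elliptic integrals, which is why they evaluate much faster.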
Gamma-Ray Background from Structure Formation in the Intergalactic Medium
The universe is filled with a diffuse and isotropic extragalactic background
of gamma-ray radiation, containing roughly equal energy flux per decade in
photon energy between 3 MeV-100 GeV. The origin of this background is one of
the unsolved puzzles in cosmology. Less than a quarter of the gamma-ray flux
can be attributed to unresolved discrete sources, but the remainder appears to
constitute a truly diffuse background whose origin has hitherto been
mysterious. Here we show that the shock waves induced by gravity during the
formation of large-scale structure in the intergalactic medium produce a
population of highly-relativistic electrons with a maximum Lorentz factor above
10^7. These electrons scatter a small fraction of the microwave background
photons in the present-day universe up to gamma-ray energies, thereby providing
the gamma-ray background. The predicted diffuse flux agrees with the observed
background over more than four decades in photon energy, and implies a mean
cosmological density of baryons which is consistent with Big-Bang
nucleosynthesis.
Comment: 7 pages, 1 figure. Accepted for publication in Nature. (Press embargo until published.)
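As a rough consistency check on the quoted numbers (standard inverse-Compton kinematics, not a calculation from the paper), an electron of Lorentz factor γ upscatters a typical CMB photon to roughly (4/3)γ² times its energy:

```python
K_B_EV = 8.617e-5   # Boltzmann constant in eV/K
T_CMB = 2.725       # present-day CMB temperature in K

# Mean CMB photon energy ~ 2.70 k_B T
e_cmb_ev = 2.70 * K_B_EV * T_CMB

# Inverse-Compton upscattering by an electron of Lorentz factor gamma
gamma = 1e7
e_gamma_ev = (4.0 / 3.0) * gamma**2 * e_cmb_ev

print(f"mean CMB photon energy ~ {e_cmb_ev:.2e} eV")
print(f"upscattered energy for gamma = 1e7 ~ {e_gamma_ev / 1e9:.0f} GeV")
```

For γ ~ 10^7 this lands near 10^2 GeV, consistent with the electrons quoted above producing photons up to the top of the observed 3 MeV-100 GeV band.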
The Seyfert Population in the Local Universe
The magnitude-limited catalog of the Southern Sky Redshift Survey (SSRS2) is
used to characterize the properties of galaxies hosting Active Galactic Nuclei.
Using emission-line ratios, we identify a total of 162 (3%) Seyfert galaxies
out of the parent sample with 5399 galaxies. The sample contains 121 Seyfert 2
galaxies and 41 Seyfert 1. The SSRS2 Seyfert galaxies are predominantly in
spirals of types Sb and earlier, or in galaxies with perturbed appearance as
the result of strong interactions or mergers. Seyfert galaxies in this sample
are twice as common in barred hosts as the non-Seyferts. By assigning
galaxies to groups using a percolation algorithm we find that the Seyfert
galaxies in the SSRS2 are more likely to be found in binary systems, when
compared to galaxies in the SSRS2 parent sample. However, there is no
statistically significant difference between the Seyfert and SSRS2 parent
sample when systems with more than 2 galaxies are considered. The analysis of
the present sample suggests that there is a stronger correlation between the
presence of the AGN phenomenon and internal properties of galaxies (morphology, presence of a bar, luminosity) than with environmental effects (local galaxy density, group velocity dispersion, nearest-neighbor distance).
Comment: 35 pages, 13 figures, accepted to be published in the Astronomical Journal
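The group assignment mentioned above uses a percolation (friends-of-friends) algorithm; the survey's actual linking parameters are not given here, so the following is only a generic sketch of the idea, using made-up 3-D positions and an arbitrary linking length:

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

rng = np.random.default_rng(0)
positions = rng.uniform(0.0, 100.0, size=(500, 3))  # made-up galaxy positions (Mpc)
linking_length = 2.0                                 # arbitrary illustrative value

# Friends-of-friends: link every pair closer than the linking length,
# then take connected components of the resulting graph as groups.
tree = cKDTree(positions)
pairs = np.array(list(tree.query_pairs(r=linking_length)))
n = len(positions)
if len(pairs):
    adj = csr_matrix((np.ones(len(pairs)), (pairs[:, 0], pairs[:, 1])), shape=(n, n))
else:
    adj = csr_matrix((n, n))
n_groups, labels = connected_components(adj, directed=False)

sizes = np.bincount(labels)
print(f"{(sizes == 2).sum()} binary systems, {(sizes > 2).sum()} groups with >2 members")
```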
Non-Fermi liquid normal state of the Heavy Fermion superconductor UBe13
Non-Fermi liquid (NFL) behavior in the normal state of the heavy-fermion
superconductor UBe13 is studied by means of low-temperature measurements of the
specific heat, C, and electrical resistivity, \rho, on a high-quality single
crystal in magnetic fields up to 15.5 T. At B=0, unconventional
superconductivity forms at Tc=0.9 K out of an incoherent state, characterized
by a large and strongly temperature dependent \rho(T). In the magnetic field
interval 4 T \leq B \leq 10 T, \rho(T) follows a T^3/2 behavior for Tc(B)\leq T
\leq 1 K, while \rho is proportional to T at higher temperatures. Corresponding
non-Fermi liquid behavior is observed in C/T as well and hints at a nearby
antiferromagnetic (AF) quantum critical point (QCP) covered by the
superconducting state. We speculate that the suppression of short-range AF
correlations observed by thermal expansion and specific heat measurements below
T_L \simeq 0.7 K (B=0) yields a field-induced QCP, T_L \to 0, at B=4.5 T.
Comment: Presented at the M2S-2003 conference in Rio, Brazil
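The T^{3/2} behaviour quoted above is established by fitting the low-temperature resistivity to a power law; a minimal sketch of such a fit on synthetic data (the values below are invented for illustration, not the measured UBe13 data):

```python
import numpy as np
from scipy.optimize import curve_fit

def rho_model(T, rho0, A, n):
    """Generic low-temperature resistivity model rho(T) = rho0 + A*T^n."""
    return rho0 + A * T**n

# Synthetic data mimicking non-Fermi-liquid T^(3/2) behaviour (not measured values).
T = np.linspace(0.1, 1.0, 40)  # temperature in K
rho = 10.0 + 5.0 * T**1.5 + np.random.default_rng(1).normal(0, 0.05, T.size)

popt, _ = curve_fit(rho_model, T, rho, p0=(10.0, 5.0, 1.0))
print(f"fitted exponent n = {popt[2]:.2f}  (a Fermi liquid would give n = 2)")
```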
Squeezing MOND into a Cosmological Scenario
Explaining the effects of dark matter using modified gravitational dynamics
(MOND) has for decades been both an intriguing and controversial possibility.
By insisting that the gravitational interaction that accounts for the Newtonian
force also drives cosmic expansion, one may kinematically identify which
cosmologies are compatible with MOND, without explicit reference to the
underlying theory so long as the theory obeys Birkhoff's law. Using this
technique, we are able to self-consistently compute a number of quantities of
cosmological interest. We find that the critical acceleration a_0 must have a
slight source-mass dependence (a_0 ~ M^(1/3)) and that MOND cosmologies are
naturally compatible with observed late-time expansion history and the
contemporary cosmic acceleration. However, cosmologies that can produce enough
density perturbations to account for structure formation are contrived and
fine-tuned. Even then, they may be marginally ruled out by evidence of early (z ~ 20) reionization.
Comment: 11 pages, RevTeX, 2 figures
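For context (standard MOND phenomenology, not a result specific to this paper), the role of the critical acceleration a_0 discussed above is set by the deep-MOND limit, in which the observed acceleration g relates to the Newtonian one g_N as g = sqrt(g_N a_0), giving asymptotically flat rotation curves around a point mass:

```latex
% Deep-MOND limit (g << a_0), standard phenomenology:
g = \sqrt{g_{N}\, a_{0}}, \qquad g_{N} = \frac{G M}{r^{2}}
\;\;\Longrightarrow\;\;
\frac{v^{2}}{r} = \frac{\sqrt{G M a_{0}}}{r}
\;\;\Longrightarrow\;\;
v^{4} = G M a_{0}.
```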
Accountable Algorithms
Many important decisions historically made by people are now made by computers. Algorithms count votes, approve loan and credit card applications, target citizens or neighborhoods for police scrutiny, select taxpayers for IRS audit, grant or deny immigration visas, and more. The accountability mechanisms and legal standards that govern such decision processes have not kept pace with technology. The tools currently available to policymakers, legislators, and courts were developed to oversee human decisionmakers and often fail when applied to computers instead. For example, how do you judge the intent of a piece of software? Because automated decision systems can return potentially incorrect, unjustified, or unfair results, additional approaches are needed to make such systems accountable and governable.

This Article reveals a new technological toolkit to verify that automated decisions comply with key standards of legal fairness. We challenge the dominant position in the legal literature that transparency will solve these problems. Disclosure of source code is often neither necessary (because of alternative techniques from computer science) nor sufficient (because of the difficulty of analyzing code) to demonstrate the fairness of a process. Furthermore, transparency may be undesirable, such as when it discloses private information or permits tax cheats or terrorists to game the systems determining audits or security screening. The central issue is how to assure the interests of citizens, and society as a whole, in making these processes more accountable. This Article argues that technology is creating new opportunities, subtler and more flexible than total transparency, to design decisionmaking algorithms so that they better align with legal and policy objectives. Doing so will improve not only the current governance of automated decisions, but also, in certain cases, the governance of decisionmaking in general. The implicit (or explicit) biases of human decisionmakers can be difficult to find and root out, but we can peer into the "brain" of an algorithm: computational processes and purpose specifications can be declared prior to use and verified afterward.

The technological tools introduced in this Article apply widely. They can be used in designing decisionmaking processes from both the private and public sectors, and they can be tailored to verify different characteristics as desired by decisionmakers, regulators, or the public. By forcing a more careful consideration of the effects of decision rules, they also engender policy discussions and closer looks at legal standards. As such, these tools have far-reaching implications throughout law and society.

Part I of this Article provides an accessible and concise introduction to foundational computer science techniques that can be used to verify and demonstrate compliance with key standards of legal fairness for automated decisions without revealing key attributes of the decisions or the processes by which the decisions were reached. Part II then describes how these techniques can assure that decisions are made with the key governance attribute of procedural regularity, meaning that decisions are made under an announced set of rules consistently applied in each case. We demonstrate how this approach could be used to redesign and resolve issues with the State Department's diversity visa lottery.
In Part III, we go further and explore how other computational techniques can assure that automated decisions preserve fidelity to substantive legal and policy choices. We show how these tools may be used to assure that certain kinds of unjust discrimination are avoided and that automated decision processes behave in ways that comport with the social or legal standards that govern the decision. We also show how automated decisionmaking may even complicate existing doctrines of disparate treatment and disparate impact, and we discuss some recent computer science work on detecting and removing discrimination in algorithms, especially in the context of big data and machine learning. And lastly, in Part IV, we propose an agenda to further synergistic collaboration between computer science, law, and policy to advance the design of automated decision processes for accountability.
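One illustrative building block behind the kind of verification described above is the cryptographic commitment, which lets a decisionmaker fix a decision rule before applying it and later prove the rule was not changed, without disclosing it up front. The following is a generic hash-based commit-and-reveal sketch (the rule text, function names, and flow are invented for illustration and are not the Article's specific protocol):

```python
import hashlib
import secrets

def commit(rule_source: str) -> tuple[str, str]:
    """Commit to a decision rule: publish the digest, keep the nonce secret."""
    nonce = secrets.token_hex(16)
    digest = hashlib.sha256((nonce + rule_source).encode()).hexdigest()
    return digest, nonce

def verify(rule_source: str, nonce: str, digest: str) -> bool:
    """Later, anyone can check that the revealed rule matches the prior commitment."""
    return hashlib.sha256((nonce + rule_source).encode()).hexdigest() == digest

# Hypothetical decision rule (illustrative only).
rule = "approve if score >= 0.7 and no_disqualifying_flags"
published_digest, secret_nonce = commit(rule)

# At audit time the rule and nonce are revealed and checked against the digest.
assert verify(rule, secret_nonce, published_digest)
print("commitment verified:", published_digest[:16], "...")
```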