Ocean acidification and the loss of phenolic substances in marine plants.
Rising atmospheric CO2 often triggers the production of plant phenolics, including many that serve as herbivore deterrents, digestion reducers, antimicrobials, or ultraviolet sunscreens. Such responses are predicted by popular models of plant defense, especially resource availability models which link carbon availability to phenolic biosynthesis. CO2 availability is also increasing in the oceans, where anthropogenic emissions cause ocean acidification, decreasing seawater pH and shifting the carbonate system towards further CO2 enrichment. Such conditions tend to increase seagrass productivity but may also increase rates of grazing on these marine plants. Here we show that the high CO2 / low pH conditions of ocean acidification decrease, rather than increase, concentrations of phenolic protective substances in seagrasses and eurysaline marine plants. We observed a loss of simple and polymeric phenolics in the seagrass Cymodocea nodosa near a volcanic CO2 vent on the Island of Vulcano, Italy, where pH values decreased from 8.1 to 7.3 and pCO2 concentrations increased ten-fold. We observed similar responses in two estuarine species, Ruppia maritima and Potamogeton perfoliatus, in in situ Free-Ocean-Carbon-Enrichment experiments conducted in tributaries of the Chesapeake Bay, USA. These responses are strikingly different from those exhibited by terrestrial plants. The loss of phenolic substances may explain the higher-than-usual rates of grazing observed near undersea CO2 vents and suggests that ocean acidification may alter coastal carbon fluxes by affecting rates of decomposition, grazing, and disease. Our observations temper recent predictions that seagrasses would necessarily be "winners" in a high CO2 world.
Dietary glycaemic index and glycaemic load among Australian adults – results from the 2011–2012 Australian Health Survey
Thermodynamics of an ideal generalized gas: II. Means of order
The property that power means are monotonically increasing functions of their order is shown to be the basis of the second laws not only for processes involving heat conduction but also for processes involving deformations. In a potential equilibration the final state will be one of maximum entropy, while in an entropy equilibration the final state will be one of minimum potential. A metric space is connected with the power means, and the distance between means of different order is related to the Carnot efficiency. In the ideal classical gas limit, the average change in the entropy is shown to be proportional to the difference between the Shannon and Rényi entropies for nonextensive systems that are multifractal in nature. The potential, like the internal energy, is a Schur-convex function of the empirical temperature, which satisfies Jensen's inequality, and serves as a measure of the tendency to uniformity in processes involving pure thermal conduction.

Comment: 8 pages
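The monotonicity property this abstract builds on can be checked numerically. A minimal sketch (an illustration, not taken from the paper; the sample values are arbitrary):

```python
import math

def power_mean(x, p):
    """Unweighted power mean M_p(x) = (mean(x_i^p))^(1/p); p = 0 is the geometric mean."""
    if p == 0:
        return math.exp(sum(math.log(v) for v in x) / len(x))
    return (sum(v ** p for v in x) / len(x)) ** (1.0 / p)

x = [1.0, 2.0, 4.0, 8.0]
orders = [-2, -1, 0, 1, 2, 3]
means = [power_mean(x, p) for p in orders]

# The power mean is a monotonically increasing function of its order:
# M_{-2} <= M_{-1} <= M_0 <= M_1 <= M_2 <= M_3 for any fixed positive x.
assert all(a <= b for a, b in zip(means, means[1:]))
```

For distinct positive values the inequality is strict, with equality only when all the x_i coincide.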
Clinical assessment of hand oedema: A systematic review
Introduction: Assessment of oedema after trauma or surgery is important to determine whether treatment is effective and to detect change over time. Volumetry is referred to as the ‘gold standard’ method of measuring volume. However, this has practical limitations and other methods are available. The aim of this systematic review was to evaluate the psychometric properties of alternative methods used to assess hand oedema. Methods: A search of electronic bibliographic databases was undertaken for any studies published in English reporting the psychometric evaluation of a method for measuring hand oedema, in an adult population with hand swelling from surgery, trauma or stroke. The Consensus‐based Standards for the Selection of health Measurement Instruments (COSMIN) checklist was used to evaluate the methodological quality. Results: Six studies met the inclusion criteria. Three methods of assessing hand oedema were identified: perometry, visual inspection and the figure-of-eight tape measure; all were compared to volumetry. Four different psychometric properties were assessed. Studies scored fair or poor on COSMIN criteria. There is low-quality evidence supporting the use of the figure-of-eight tape measure to assess hand volume. The perometer systematically overestimated volume, and visual estimation had poor sensitivity and specificity. Discussion: The figure-of-eight tape measure is the best alternative to volumetry for hand oedema. It is cheaper and quicker while offering reliability comparable to the ‘gold standard’. Further research is needed to compare methods in patients with greater variability of conditions and with isolated digit oedema. Visual estimation of hand oedema is not recommended.
Considering the Case for Biodiversity Cycles: Reexamining the Evidence for Periodicity in the Fossil Record
Medvedev and Melott (2007) have suggested that periodicity in fossil
biodiversity may be induced by cosmic rays which vary as the Solar System
oscillates normal to the galactic disk. We re-examine the evidence for a 62
million year (Myr) periodicity in biodiversity throughout the Phanerozoic
history of animal life reported by Rohde & Muller (2005), as well as related
questions of periodicity in origination and extinction. We find that the signal
is robust against variations in methods of analysis, and is based on
fluctuations in the Paleozoic and a substantial part of the Mesozoic.
Examination of origination and extinction is somewhat ambiguous, with results
depending upon procedure. Origination and extinction intensity as defined by RM
may be affected by an artifact at 27 Myr in the duration of stratigraphic
intervals. Nevertheless, when a procedure free of this artifact is implemented,
the 27 Myr periodicity appears in origination, suggesting that the artifact may
ultimately be based on a signal in the data. A 62 Myr feature appears in
extinction, when this same procedure is used. We conclude that evidence for a
periodicity at 62 Myr is robust, and evidence for periodicity at approximately
27 Myr is also present, albeit more ambiguous.

Comment: Minor modifications to reflect final published version
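The kind of periodicity search described here can be illustrated with a simple spectral analysis. A hypothetical sketch on synthetic data (not the paper's fossil record; the sampling interval, noise level, and injected period are assumptions):

```python
import numpy as np

# Synthetic, noisy "diversity" series sampled every 1 Myr over a
# Phanerozoic-like span, with a 62 Myr sinusoid injected.
rng = np.random.default_rng(0)
t = np.arange(0.0, 542.0, 1.0)             # time, Myr
series = np.sin(2 * np.pi * t / 62.0) + 0.5 * rng.standard_normal(t.size)

series -= series.mean()                     # suppress the zero-frequency term
power = np.abs(np.fft.rfft(series)) ** 2    # periodogram
freqs = np.fft.rfftfreq(t.size, d=1.0)      # cycles per Myr

peak_period = 1.0 / freqs[np.argmax(power[1:]) + 1]
# The strongest spectral peak sits near the injected 62 Myr period.
```

Real analyses must additionally handle unevenly spaced stage boundaries, detrending choices, and significance testing against red-noise backgrounds, which is where the robustness questions discussed in the abstract arise.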
Human Computation and Convergence
Humans are the most effective integrators and producers of information,
directly and through the use of information-processing inventions. As these
inventions become increasingly sophisticated, the substantive role of humans in
processing information will tend toward capabilities that derive from our most
complex cognitive processes, e.g., abstraction, creativity, and applied world
knowledge. Through the advancement of human computation - methods that leverage
the respective strengths of humans and machines in distributed
information-processing systems - formerly discrete processes will combine
synergistically into increasingly integrated and complex information processing
systems. These new, collective systems will exhibit an unprecedented degree of predictive accuracy in modeling physical and techno-social processes, and may ultimately coalesce into a single unified predictive organism, with the capacity to address society's most wicked problems and achieve planetary homeostasis.

Comment: Pre-publication draft of chapter. 24 pages, 3 figures; added references to pages 1 and 3, and corrected typos
Engaging with assessment: increasing student engagement through continuous assessment
Student engagement is intrinsically linked to two important metrics in learning: student satisfaction and the quality of the student experience. One of the ways that engagement can be influenced is through careful curriculum design. Using the knowledge that many students are ‘assessment-driven’, a low-stakes continuous weekly summative e-assessment was introduced to a module. The impact this had on student engagement was measured by studying student activity within the module virtual learning environment (VLE). It was found that introduction of the e-assessments led to a significant increase in VLE activity compared to the VLE activity in that module the previous year, and also compared to the VLE activity of two other modules studied by the same student cohort. As many institutions move towards greater blended or online delivery it will become more important to ensure that VLEs encourage high levels of student engagement in order to maintain or enhance the student experience.
Keywords: continuous assessment, learning analytics, student engagement, virtual learning environment
The Pioneer Anomaly
Radio-metric Doppler tracking data received from the Pioneer 10 and 11 spacecraft at heliocentric distances of 20-70 AU have consistently indicated the presence of a small, anomalous, blue-shifted frequency drift changing uniformly at a rate of ~6 x 10^{-9} Hz/s. Ultimately, the drift was interpreted as a constant sunward deceleration of each particular spacecraft at the level of a_P = (8.74 +/- 1.33) x 10^{-10} m/s^2. This apparent violation of Newton's gravitational inverse-square law has become known as the Pioneer anomaly; the nature of this anomaly remains unexplained. In this review, we
summarize the current knowledge of the physical properties of the anomaly and
the conditions that led to its detection and characterization. We review
various mechanisms proposed to explain the anomaly and discuss the current
state of efforts to determine its nature. A comprehensive new investigation of
the anomalous behavior of the two Pioneers has begun recently. The new efforts
rely on the much-extended set of radio-metric Doppler data for both spacecraft
in conjunction with the newly available complete record of their telemetry
files and a large archive of original project documentation. As the new study
is yet to report its findings, this review provides the necessary background
for the new results to appear in the near future. In particular, we provide a
significant amount of information on the design, operations and behavior of the
two Pioneers during their entire missions, including descriptions of various
data formats and techniques used for their navigation and radio-science data
analysis. As most of this information was recovered relatively recently, it was
not used in the previous studies of the Pioneer anomaly, but it is critical for
the new investigation.

Comment: 165 pages, 40 figures, 16 tables; accepted for publication in Living Reviews in Relativity
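The quoted numbers can be related by a back-of-the-envelope Doppler conversion. A minimal sketch, assuming a two-way S-band link near 2.29 GHz (the carrier frequency and the factor-of-two convention are assumptions, not taken from the review):

```python
# Convert the observed frequency drift into an equivalent line-of-sight
# acceleration via the Doppler relation for a two-way link:
#   a = c * (df/dt) / (2 * f0)
c = 2.998e8      # speed of light, m/s
f0 = 2.29e9      # assumed S-band carrier frequency, Hz
drift = 6e-9     # observed frequency drift from the abstract, Hz/s

a = c * drift / (2 * f0)
# a comes out at a few 1e-10 m/s^2, the same order of magnitude as
# the reported a_P = 8.74e-10 m/s^2.
```

The exact value depends on which reference frequency (uplink, downlink, or a weighted combination) and sign convention the navigation software uses, which is part of what the detailed telemetry-based reanalysis described above must pin down.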
Madness decolonized?: Madness as transnational identity in Gail Hornstein’s Agnes’s Jacket
The US psychologist Gail Hornstein’s monograph Agnes’s Jacket: A Psychologist’s Search for the Meanings of Madness (2009) is an important intervention in the identity politics of the mad movement. Hornstein offers a resignified vision of mad identity that embroiders the central trope of an “anti-colonial” struggle to reclaim the experiential world “colonized” by psychiatry. A series of literal and figurative appeals make recourse to the inner world and (corresponding) cultural world of the mad, as well as to the ethno-symbolic cultural materials of dormant nationhood. This rhetoric is augmented by a model in which the mad comprise a diaspora without an origin, coalescing into a single transnational community. The mad are also depicted as persons displaced from their metaphorical homeland, the “inner” world “colonized” by the psychiatric regime. There are a number of difficulties with Hornstein’s rhetoric, however. Her “ethnicity-and-rights” response to the oppression of the mad is symptomatic of Western parochialism, while her proposed transmutation of putative psychopathology from limit upon identity to parameter of successful identity is open to contestation. Moreover, unless one accepts Hornstein’s porous vision of mad identity, her self-ascribed insider status in relation to the mad community may present a problematic “re-colonization” of mad experience.
Theory of Star Formation
We review current understanding of star formation, outlining an overall
theoretical framework and the observations that motivate it. A conception of
star formation has emerged in which turbulence plays a dual role, both creating
overdensities to initiate gravitational contraction or collapse, and countering
the effects of gravity in these overdense regions. The key dynamical processes
involved in star formation -- turbulence, magnetic fields, and self-gravity --
are highly nonlinear and multidimensional. Physical arguments are used to
identify and explain the features and scalings involved in star formation, and
results from numerical simulations are used to quantify these effects. We
divide star formation into large-scale and small-scale regimes and review each
in turn. Large scales range from galaxies to giant molecular clouds (GMCs) and
their substructures. Important problems include how GMCs form and evolve, what
determines the star formation rate (SFR), and what determines the initial mass
function (IMF). Small scales range from dense cores to the protostellar systems
they beget. We discuss formation of both low- and high-mass stars, including
ongoing accretion. The development of winds and outflows is increasingly well
understood, as are the mechanisms governing angular momentum transport in
disks. Although outstanding questions remain, the framework is now in place to
build a comprehensive theory of star formation that will be tested by the next
generation of telescopes.

Comment: 120 pages, to appear in ARAA. No changes from v1 text; permission statement added
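One of the basic scalings behind the "gravitational contraction or collapse" discussed above is the free-fall time. A hypothetical worked example (the density, mean molecular weight, and unit conversions are assumptions for illustration, not figures from the review):

```python
import math

# Gravitational free-fall time t_ff = sqrt(3*pi / (32*G*rho)),
# evaluated for a dense-core-like density.
G = 6.674e-11                # gravitational constant, m^3 kg^-1 s^-2
m_H = 1.67e-27               # hydrogen atom mass, kg
n = 1e4                      # assumed number density, cm^-3 (dense core)
mu = 2.33                    # assumed mean molecular weight (molecular gas)
rho = n * 1e6 * mu * m_H     # mass density, kg/m^3 (1 cm^-3 = 1e6 m^-3)

t_ff = math.sqrt(3 * math.pi / (32 * G * rho))   # seconds
t_ff_myr = t_ff / 3.156e13                       # seconds -> Myr
# For n ~ 1e4 cm^-3 this gives t_ff on the order of a few 1e5 yr.
```

Since t_ff scales as rho^(-1/2), the much lower mean densities of GMCs give free-fall times of several Myr, which is one reason the large-scale and small-scale regimes reviewed here behave so differently.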
