
    Research-Informed Models for Communicating the Value of Court-Connected Alternative Dispute Resolution for Public Funding

    The purpose of framing the research in the following models is to assist the court and its court-connected mediation programs in their ongoing dialogue with funding decision-makers in addressing the question: Is state funding of community mediation centers and court ADR generally a worthwhile investment? As a means of setting forth components of an analytic framework, the following simplified financial models are offered to draw out salient aspects of the nature of the investment. The simplified models are employed primarily to illustrate the investment in terms of classic financial models familiar to a budget analyst. These models are intended to provide guidance in framing the funding decision. They are presented from a conservative stance – that is, guiding where there are good levels of certainty, low levels of risk, and low downside regarding return on investment.
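
    As a concrete illustration of the kind of classic financial model a budget analyst might apply here, a minimal net-present-value and return-on-investment sketch follows; all dollar figures and the discount rate are hypothetical placeholders, not values from the report:

        # Hypothetical NPV/ROI framing for a court ADR funding decision.
        # All figures are illustrative placeholders, not data from the report.

        def npv(rate, cashflows):
            """Discount a series of annual net cashflows back to year 0."""
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

        annual_funding = 250_000   # hypothetical state grant per year
        annual_savings = 400_000   # hypothetical avoided court costs per year
        years = 5
        discount_rate = 0.03       # conservative public-sector discount rate

        net = [annual_savings - annual_funding] * years
        print(f"NPV over {years} years: ${npv(discount_rate, net):,.0f}")
        print(f"Simple ROI: {(annual_savings - annual_funding) / annual_funding:.0%}")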

    Democracy in Practice: Lessons from New England

    Political decision-making by elites requires some form of civilian participation to regain legitimacy. Increasingly, groups of citizens do not trust political elites and are frustrated by their behavior. Even established democracies face problems of managing diversity. In the global context, differences of opinion, culture, religion, and the like have defined many of the New Wars (Kaldor 1999). In the United States, many non-state and semi-governmental organizations have developed programs to increase public knowledge of the legislature and its decision-making processes. The ultimate purpose of this is to exercise some control over state power. Legislators are also increasingly convening dialogue processes with their constituencies in order to create the best possible problem-solving mechanisms. Before the United States' model of public deliberation, many indigenous communities throughout the world practiced a form of joint problem-solving in their villages. But the history of New England is rich with a particular form of public deliberation that has continually demonstrated a capacity to increase civic participation and control of state power: New England Town Meetings are a model for direct democracy. The United States, which is also exporting democracy as a political and economic theory to countries facing violent conflict, must improve its process domestically before contemplating its possible replication elsewhere. New England's public forums have faced certain challenges that must be overcome. These include theoretical and practical challenges with regard to their overall impact on legitimacy through increased citizen participation in decision-making. Deliberative democracy must prove that citizens can arrive at decisions that affect the community in a positive way and that these decisions can be implemented by law-makers for the good of the people. While engaged in this process, the public must also grapple with established forms of decision-making, lack of capacity and interest among its members, elite behavior, and other practical and theoretical limitations.

    Dissociation, enrichment, and the in vitro formation of gonocyte colonies from cryopreserved neonatal bovine testicular tissues

    Gonocytes play an important role in the early development of spermatogonial stem cells and in fertility preservation. To acquire more high-quality gonocytes in vitro for further germ cell-related research and applications, it is necessary to enrich and propagate gonocytes in vitro from cryopreserved bovine testicular tissues. This study aimed to investigate the isolation, enrichment, and colony formation of gonocytes in vitro for germ cell expansion from cryopreserved neonatal bovine testicular tissues. The effects of several different in vitro culture conditions, including seeding density, temperature, serum replacement and extracellular matrices, were investigated for the maintenance, proliferation and formation of gonocyte colonies in vitro. Frozen/thawed two-week-old neonatal bovine testicular tissues were digested and gonocytes were enriched using a Percoll density gradient. Cell viability was assessed by trypan blue staining and cell apoptosis was evaluated by TUNEL assays. Gonocytes were identified and confirmed by immunofluorescence with the PGP9.5 germ cell marker and the OCT4 pluripotency marker, while Sertoli cells were stained with vimentin. We found that neonatal bovine gonocytes were efficiently enriched by a 30%–40% Percoll density gradient (p < 0.05). No significant differences were detected between neonatal bovine testicular cells cultured at 34 °C or 37 °C. The formation of gonocyte colonies was observed in culture medium supplemented with knockout serum replacement (KSR), but not fetal bovine serum (FBS), at a seeding density higher than 5.0 × 10⁴ cells/well. A greater number of gonocyte colonies was observed in culture plates coated with laminin (38.00 ± 6.24/well) and Matrigel (38.67 ± 3.78/well) than in plates coated with collagen IV or fibronectin (p < 0.05). In conclusion, bovine neonatal gonocytes were able to be efficiently isolated, enriched and maintained in gonocyte colonies in vitro; the development of this protocol provides vital information for the clinical translation of this technology and the future restoration of human fertility.

    Temperature

    KEY HEADLINES:
    • The first MCCIP ARC in 2006 reported following what was then the warmest year globally, 2005 (0.26 °C higher than the 1981-2010 average).
    • Since 2005, new global record temperatures have been set in 2010 and then in each successive year 2014, 2015 and 2016. In these last three record years the global average temperature anomaly was 0.31, 0.44 and 0.56 °C higher than the 1981-2010 average, respectively.
    • 2014 was a record warm year for coastal air and sea temperatures around the UK. Between 1984 and 2014, coastal water temperatures rose around the UK at an average rate of 0.28 °C/decade. The rate varies between regions: the slowest warming was in the Celtic Sea at 0.17 °C/decade and the fastest in the Southern North Sea at 0.45 °C/decade.
    • There is also variability over shorter time periods. In all regions of UK seas there was a negative trend over the 10-year period between 2003 and 2013, due to natural variability within the ocean/atmosphere system.
    • There is a trend towards fewer in-situ observations, and this will ultimately reduce confidence in future assessments.
    • Some gridded datasets can offer alternatives to single-point observations, but to understand the patterns of ocean variability, the high-quality information from ocean time series cannot yet be replaced by surface observations or autonomous data collection.
    • The first MCCIP report card in 2006 used the UKCIP projections from 2002, which had a very limited representation of SST.
    • The latest updates to the UK Climate Projections shelf-seas models were published in 2016 and projected increases in sea surface temperature for 2069-89 relative to 1960-89 of over 3 °C for most of the North Sea, English Channel, Irish and Celtic Seas. For the deeper areas to the north and west of Scotland, out towards Rockall and in the Faroe-Shetland Channel, the increase in temperature is projected to be closer to 2 °C.
    • Over the last 10 years there has been a steady improvement in the scientific basis underlying centennial sea temperature projections for the seas around the UK, and significant progress in the field of seasonal and decadal projections.
    • The scientific basis for such projections and predictions will continue to improve over the next 10 years, with increasing resolution and better treatment of climate uncertainties and methodology. On the centennial scale, the differences between emissions scenarios are still the source of the largest uncertainties.
    • Development of North West European Shelf (NWS) modelling systems driven by seasonal forecasting systems may allow NWS temperature prediction on monthly to decadal timescales.
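
    The per-decade warming rates quoted in these headlines are linear trends fitted to annual temperature series. A minimal sketch of that calculation, using a synthetic series in place of the MCCIP observational records (the data and trend value below are illustrative assumptions):

        import numpy as np

        # Synthetic stand-in for an annual coastal SST series (deg C), 1984-2014;
        # the real assessment uses observed station and gridded data.
        rng = np.random.default_rng(0)
        years = np.arange(1984, 2015)
        sst = 10.0 + 0.028 * (years - years[0]) + rng.normal(0, 0.3, years.size)

        # Least-squares linear trend, converted from deg C/year to deg C/decade.
        slope_per_year = np.polyfit(years, sst, 1)[0]
        print(f"Trend: {slope_per_year * 10:.2f} deg C/decade")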

    Paleohydrological Context for Recent Floods and Droughts in the Fraser River Basin, British Columbia, Canada

    The recent intensification of floods and droughts in the Fraser River Basin (FRB) of British Columbia has had profound cultural, ecological, and economic impacts that are expected to be exacerbated further by anthropogenic climate change. In part due to short instrumental runoff records, the long-term stationarity of hydroclimatic extremes in this major North American watershed remains poorly understood, highlighting the need for high-resolution paleoenvironmental proxies that can inform on past streamflow. Here we use a network of tree-ring proxy records to develop 11 subbasin-scale, complementary flood- and drought-season reconstructions, the first of their kind. The reconstructions explicitly target management-relevant flood and drought seasons within each basin, and are examined in tandem to provide an expanded assessment of extreme events across the FRB with immediate implications for water management. We find that past high flood-season flows have been of greater magnitude and occurred in more consecutive years than during the observational record alone. Early 20th-century low flows in the drought season were especially severe in both duration and magnitude in some subbasins relative to recent dry periods. Our Fraser subbasin-scale reconstructions provide long-term benchmarks for the natural flood and drought variability prior to anthropogenic forcing. These reconstructions demonstrate that the instrumental streamflow records upon which current management is based likely underestimate the full natural magnitude, duration, and frequency of extreme seasonal flows in the FRB, as well as the potential severity of future anthropogenically forced events.
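
    Reconstructions of this kind are commonly built by calibrating a transfer function between gauged streamflow and ring-width chronologies over the instrumental overlap period, then applying it to the full proxy record. A schematic sketch under that assumption, with synthetic data rather than the authors' chronologies or calibration method:

        import numpy as np

        rng = np.random.default_rng(1)
        n_proxy, n_overlap = 500, 80      # proxy years vs. gauged overlap years
        chronology = rng.normal(1.0, 0.2, n_proxy)   # standardized ring widths
        flow_obs = 800 + 600 * chronology[-n_overlap:] \
                   + rng.normal(0, 50, n_overlap)    # gauged seasonal flow

        # Calibrate a linear transfer function on the overlap period...
        b, a = np.polyfit(chronology[-n_overlap:], flow_obs, 1)
        # ...then reconstruct flood/drought-season flow over the full proxy period.
        flow_recon = a + b * chronology
        print(f"Reconstructed mean flow: {flow_recon.mean():.0f} (arbitrary units)")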

    Simulations of the Microwave Sky

    We create realistic, full-sky, half-arcminute resolution simulations of the microwave sky matched to the most recent astrophysical observations. The primary purpose of these simulations is to test the data reduction pipeline for the Atacama Cosmology Telescope (ACT) experiment; however, we have widened the frequency coverage beyond the ACT bands to make these simulations applicable to other microwave background experiments. Some of the novel features of these simulations are that the radio and infrared galaxy populations are correlated with the galaxy cluster populations, the CMB is lensed by the dark matter structure in the simulation via a ray-tracing code, the contribution to the thermal and kinetic Sunyaev-Zel'dovich (SZ) signals from galaxy clusters, groups, and the IGM has been included, and the gas prescription to model the SZ signals matches the most recent X-ray observations. Regarding the contamination of cluster SZ flux by radio galaxies, we find that at 148 GHz (90 GHz) only 3% (4%) of halos have their SZ decrements contaminated at a level of 20% or more. We find the contamination levels higher for infrared galaxies. However, at 90 GHz, less than 20% of clusters with M_{200} > 2.5 x 10^{14} Msun and z < 1.2 have their SZ decrements filled in at a level of 20% or more. At 148 GHz, less than 20% of clusters with M_{200} > 2.5 x 10^{14} Msun and z < 0.8 have their SZ decrements filled in at a level of 50% or more. Our models also suggest that a population of very high flux infrared galaxies, which are likely lensed sources, contribute most to the SZ contamination of very massive clusters at 90 and 148 GHz. These simulations are publicly available and should serve as a useful tool for microwave surveys to cross-check SZ cluster detection, power spectrum, and cross-correlation analyses.

    Comment: Sims are now public at http://lambda.gsfc.nasa.gov/toolbox/tb_cmbsim_ov.cfm; expanded discussion of N-body sim and IGM; version accepted by ApJ
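
    The contamination statistics quoted above reduce to counting halos whose radio- or infrared-source flux exceeds a given fraction of their SZ decrement. A toy version of that bookkeeping, with synthetic halo and source fluxes standing in for the simulation catalogs:

        import numpy as np

        rng = np.random.default_rng(2)
        n_halos = 10_000
        # Synthetic |SZ decrement| and radio-source flux per halo (mJy);
        # the distributions here are illustrative, not the paper's model.
        sz_decrement = rng.lognormal(mean=1.0, sigma=0.5, size=n_halos)
        radio_flux = rng.lognormal(mean=-2.0, sigma=1.5, size=n_halos)

        # Fraction of halos whose source flux fills in >= 20% of the decrement.
        contaminated = radio_flux >= 0.2 * sz_decrement
        print(f"Contaminated at >=20%: {contaminated.mean():.1%}")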

    The Long-Baseline Neutrino Experiment: Exploring Fundamental Symmetries of the Universe

    The preponderance of matter over antimatter in the early Universe, the dynamics of the supernova bursts that produced the heavy elements necessary for life and whether protons eventually decay --- these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our Universe, its current state and its eventual fate. The Long-Baseline Neutrino Experiment (LBNE) represents an extensively developed plan for a world-class experiment dedicated to addressing these questions. LBNE is conceived around three central components: (1) a new, high-intensity neutrino source generated from a megawatt-class proton accelerator at Fermi National Accelerator Laboratory, (2) a near neutrino detector just downstream of the source, and (3) a massive liquid argon time-projection chamber deployed as a far detector deep underground at the Sanford Underground Research Facility. This facility, located at the site of the former Homestake Mine in Lead, South Dakota, is approximately 1,300 km from the neutrino source at Fermilab -- a distance (baseline) that delivers optimal sensitivity to neutrino charge-parity symmetry violation and mass ordering effects. This ambitious yet cost-effective design incorporates scalability and flexibility and can accommodate a variety of upgrades and contributions. With its exceptional combination of experimental configuration, technical capabilities, and potential for transformative discoveries, LBNE promises to be a vital facility for the field of particle physics worldwide, providing physicists from around the globe with opportunities to collaborate in a twenty- to thirty-year program of exciting science. In this document we provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess.

    Comment: Major update of previous version. This is the reference document for the LBNE science program and current status. Chapters 1, 3, and 9 provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess. 288 pages, 116 figures
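
    As an illustration of why the 1,300 km baseline matters, a minimal vacuum-oscillation sketch of the leading-order nu_mu -> nu_e appearance probability follows; matter effects and the CP-violating terms that LBNE is designed to probe are omitted, and the oscillation parameters are typical global-fit values, not taken from this document:

        import numpy as np

        # Leading-order vacuum appearance probability,
        # P(nu_mu -> nu_e) ~ sin^2(theta_23) * sin^2(2*theta_13)
        #                    * sin^2(1.27 * dm31^2 * L / E).
        L = 1300.0          # km, Fermilab -> Sanford Underground baseline
        dm31_sq = 2.5e-3    # eV^2, assumed atmospheric mass splitting
        s2_23 = 0.5         # sin^2(theta_23), assumed
        s2_2x13 = 0.085     # sin^2(2*theta_13), assumed

        E = np.linspace(0.5, 5.0, 10)   # neutrino energy, GeV
        P = s2_23 * s2_2x13 * np.sin(1.27 * dm31_sq * L / E) ** 2
        for e, p in zip(E, P):
            print(f"E = {e:4.1f} GeV  P(nu_mu -> nu_e) ~ {p:.4f}")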

    Lateralization of Simulated Sources and Echoes on the Basis of Interaural Differences of Level

    This experiment assessed the relative weights given to source and echo pulses lateralized on the basis of interaural differences of level (IDLs). Separate conditions were run in which the to-be-judged target was the first (source) or second (echo) pulse. Each trial consisted of two intervals; the first presented a 3000-Hz diotic pulse that marked the intracranial midline and the pitch of the target frequency. The second presented the sequence of a source followed by an echo. Target frequency was always 3000 Hz, while the non-target pulse was presented at 1500, 3000, or 5000 Hz. Delays between the source and echo were varied from 8 to 128 ms. IDLs were chosen for both pulses from Gaussian distributions with μ = 0 dB and σ = 4 dB. Dependent variables included normalized target weight, proportion correct, and the proportion of responses predicted from the weights. Although target weight and proportion correct generally increased with increasing non-target frequency and echo delay for both target conditions, the effects were always larger when the echo served as the target. The superiority of performance when judging echoes vs. sources will be discussed in terms of recency effects in binaural hearing.
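
    Normalized target weights in such studies are typically estimated by regressing trial-by-trial responses on the source and echo IDLs and normalizing the coefficients. A synthetic sketch of that analysis, in which the observer's decision model and true weights are assumptions rather than the authors' results:

        import numpy as np

        rng = np.random.default_rng(3)
        n_trials = 5000
        # Per-trial IDLs (dB) for source and echo, drawn as in the experiment:
        # Gaussian with mu = 0 dB, sigma = 4 dB.
        idl = rng.normal(0.0, 4.0, size=(n_trials, 2))   # columns: source, echo

        # Simulated observer: a weighted sum of the two IDLs plus internal
        # noise decides "right" (1) vs. "left" (0). Weights are hypothetical.
        true_w = np.array([0.3, 0.7])                    # echo-as-target case
        resp = (idl @ true_w + rng.normal(0, 2.0, n_trials) > 0).astype(float)

        # Recover relative weights with a linear-probability regression,
        # then normalize so the two weights sum to 1.
        X = np.column_stack([idl, np.ones(n_trials)])
        b = np.linalg.lstsq(X, resp, rcond=None)[0][:2]
        w = b / b.sum()
        print(f"Normalized weights  source: {w[0]:.2f}  echo: {w[1]:.2f}")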

    Galaxy and Mass Assembly: FUV, NUV, ugrizYJHK Petrosian, Kron and Sérsic photometry

    In order to generate credible 0.1-2 μm spectral energy distributions, the Galaxy and Mass Assembly (GAMA) project requires many gigabytes of imaging data from a number of instruments to be reprocessed into a standard format. In this paper, we discuss the software infrastructure we use, and create self-consistent ugrizYJHK photometry for all sources within the GAMA sample. Using UKIDSS and SDSS archive data, we outline the pre-processing necessary to standardize all images to a common zero-point, the steps taken to correct for the seeing bias across the data set and the creation of gigapixel-scale mosaics of the three 4 × 12 deg² GAMA regions in each filter. From these mosaics, we extract source catalogues for the GAMA regions using elliptical Kron and Petrosian matched apertures. We also calculate Sérsic magnitudes for all galaxies within the GAMA sample using sigma, a galaxy component modelling wrapper for galfit 3. We compare the resultant photometry directly and also calculate the r-band galaxy luminosity function for all photometric data sets to highlight the uncertainty introduced by the photometric method. We find that (1) changing the object detection threshold has a minor effect on the best-fitting Schechter parameters of the overall population (M* ± 0.055 mag, α ± 0.014, φ* ± 0.0005 h³ Mpc⁻³); (2) there is an offset between data sets that use Kron or Petrosian photometry, regardless of the filter; (3) the decision to use circular or elliptical apertures causes an offset in M* of 0.20 mag; (4) the best-fitting Schechter parameters from total-magnitude photometric systems (such as SDSS modelmag or Sérsic magnitudes) have a steeper faint-end slope than photometric systems based upon Kron or Petrosian measurements; and (5) our Universe's total luminosity density, when calculated using Kron or Petrosian r-band photometry, is underestimated by at least 15 per cent.
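
    For reference, the Schechter parameters compared above (M*, α, φ*) define the galaxy luminosity function. A minimal sketch of its magnitude form, evaluated with illustrative parameter values rather than the GAMA fits:

        import numpy as np

        # Schechter luminosity function in absolute magnitude:
        # phi(M) = 0.4 ln(10) phi* x^(alpha+1) exp(-x), x = 10^(0.4 (M* - M)).
        def schechter_mag(M, M_star, alpha, phi_star):
            x = 10.0 ** (0.4 * (M_star - M))
            return 0.4 * np.log(10) * phi_star * x ** (alpha + 1) * np.exp(-x)

        # Illustrative r-band parameter values (not the fits from this paper).
        M = np.linspace(-24, -16, 5)
        phi = schechter_mag(M, M_star=-20.7, alpha=-1.26, phi_star=0.009)
        for m, p in zip(M, phi):
            print(f"M = {m:5.1f}  phi = {p:.2e} h^3 Mpc^-3 mag^-1")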