Greater Jacksonville's Response to the Florida Land Boom of the 1920s
The Florida land boom was an orgy of real estate speculation and development that swept the state during the period 1924 through 1926. The few books and articles that deal with that event rarely mention Jacksonville, although it was Florida's largest city and its chief commercial and transportation center. This could lead one to conclude that the North Florida city did not become caught up in the boom. Yet scattered throughout the Jacksonville area are the remains of a number of real estate projects that date from that period.
Therefore, this thesis examines the effects of the boom on greater Jacksonville during the 1920s. During the years immediately following World War I, Jacksonville's leaders concentrated on expanding industry and commerce to promote their city's growth, rather than building tourism. Jacksonville had not been a major winter resort since the railroads were built southward in the late 1800s, and this made the North Florida city different from its downstate rivals. The increasing prosperity of the 1920s brought growing numbers of tourists, new residents, and land speculators to resort centers in South and Central Florida, but few to Jacksonville.
As interest in Florida grew, the expanding numbers of land buyers created a frenzy of real estate sales and development downstate. The most immediate effect of the boom for Jacksonville was tremendous expansion of the city's industries, as they provisioned the state. However, many local residents became interested in siphoning off some of the tourists and land buyers for their own community. This resulted in civic promotion of Jacksonville as a resort, and in the construction of a number of new real estate projects primarily for winter residents, including San Jose, Venetia, Florida Beach, and San Marco. Local expansion of business and real estate also resulted in the construction of several major buildings in downtown Jacksonville.
Early in 1926, real estate prices broke downstate and many of the speculators and other newcomers went home. This created a statewide economic decline during the late 1920s that resulted in the failure of many real estate developments throughout Florida, including some in greater Jacksonville. With its extensive commercial and transportation complex, however, the North Florida city fared better than its tourist-dependent rivals downstate. Throughout the late 1920s, percentages of economic decline for Jacksonville were much smaller than in cities such as Miami and St. Petersburg.
Sidechain control of porosity closure in multiple peptide-based porous materials by cooperative folding
Porous materials find application in separation, storage and catalysis. We report a crystalline porous solid formed by coordination of metal centres with a glycylserine dipeptide. We prove experimentally that the structure evolves from a solvated porous state into a non-porous state as a result of ordered displacive and conformational changes of the peptide that suppress the void space in response to environmental pressure. This cooperative closure, which recalls the folding of proteins, retains order in three dimensions and is driven by the hydroxyl groups of the serine residue acting as H-bond donors in the peptide sequence. This ordered closure is also displayed by multipeptide solid solutions in which the combination of different sequences of amino acids controls their guest response in a non-linear way. This functional control can be compared to the effect of single point mutations in proteins, where the exchange of single amino acids can radically alter structure and function.
A Bayesian approach to star-galaxy classification
Star-galaxy classification is one of the most fundamental data-processing
tasks in survey astronomy, and a critical starting point for the scientific
exploitation of survey data. For bright sources this classification can be done
with almost complete reliability, but for the numerous sources close to a
survey's detection limit each image encodes only limited morphological
information. In this regime, from which many of the new scientific discoveries
are likely to come, it is vital to utilise all the available information about
a source, both from multiple measurements and also prior knowledge about the
star and galaxy populations. It is also more useful and realistic to provide
classification probabilities than decisive classifications. All these
desiderata can be met by adopting a Bayesian approach to star-galaxy
classification, and we develop a very general formalism for doing so. An
immediate implication of applying Bayes's theorem to this problem is that it is
formally impossible to combine morphological measurements in different bands
without using colour information as well; however we develop several
approximations that disregard colour information as much as possible. The
resultant scheme is applied to data from the UKIRT Infrared Deep Sky Survey
(UKIDSS), and tested by comparing the results to deep Sloan Digital Sky Survey
(SDSS) Stripe 82 measurements of the same sources. The Bayesian classification
probabilities obtained from the UKIDSS data agree well with the deep SDSS
classifications both overall (a mismatch rate of 0.022, compared to 0.044 for
the UKIDSS pipeline classifier) and close to the UKIDSS detection limit (a
mismatch rate of 0.068 compared to 0.075 for the UKIDSS pipeline classifier).
The Bayesian formalism developed here can be applied to improve the reliability
of any star-galaxy classification schemes based on the measured values of
morphology statistics alone.
Comment: Accepted 22 November 2010, 19 pages, 17 figures.
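The core of such a Bayesian scheme can be illustrated with a minimal sketch: Bayes's theorem converts class-conditional likelihoods of a morphology statistic, weighted by prior star/galaxy fractions, into a classification probability. The Gaussian likelihoods, parameter names, and numbers below are illustrative assumptions, not the formalism of the paper itself.

```python
import math

def star_posterior(m, mu_star, sigma_star, mu_gal, sigma_gal, p_star):
    """Posterior P(star | morphology statistic m) via Bayes' theorem,
    assuming Gaussian class-conditional likelihoods (illustrative only)."""
    def gauss(x, mu, sigma):
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))
    # Weight each class's likelihood by its prior population fraction.
    w_star = gauss(m, mu_star, sigma_star) * p_star
    w_gal = gauss(m, mu_gal, sigma_gal) * (1.0 - p_star)
    return w_star / (w_star + w_gal)

# A point-like measurement (m near the stellar locus at 0) yields a high star probability.
p = star_posterior(m=0.1, mu_star=0.0, sigma_star=0.3, mu_gal=2.0, sigma_gal=1.0, p_star=0.5)
```

Combining independent measurements of the same source (e.g. from several bands) would multiply the likelihood factors before normalising, which is where the colour-coupling issue discussed in the abstract arises.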
Numerical Simulations of Gravity-Driven Fingering in Unsaturated Porous Media Using a Non-Equilibrium Model
This is a computational study of gravity-driven fingering instabilities in
unsaturated porous media. The governing equations and corresponding numerical
scheme are based on the work of Nieber et al. [Ch. 23 in Soil Water Repellency,
eds. C. J. Ritsema and L. W. Dekker, Elsevier, 2003] in which non-monotonic
saturation profiles are obtained by supplementing the Richards equation with a
non-equilibrium capillary pressure-saturation relationship, as well as
including hysteretic effects. The first part of the study takes an extensive
look at the sensitivity of the finger solutions to certain key parameters in
the model such as capillary shape parameter, initial saturation, and capillary
relaxation coefficient. The second part is a comparison to published
experimental results that demonstrates the ability of the model to capture
realistic fingering behaviour.
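The model class described above can be written schematically as the Richards equation supplemented with a rate-dependent pressure–saturation relationship. The sketch below uses generic notation rather than the exact formulation of Nieber et al., and assumes the common Hassanizadeh–Gray form of the dynamic capillary term:

```latex
\begin{aligned}
  \frac{\partial \theta}{\partial t}
    &= \nabla \cdot \bigl[\, K(\theta)\, \nabla (\psi - z) \,\bigr], \\
  \psi &= \psi_{\mathrm{eq}}(\theta) \;-\; \tau\, \frac{\partial \theta}{\partial t},
\end{aligned}
```

where \(\theta\) is the water content, \(K(\theta)\) the unsaturated hydraulic conductivity, \(\psi\) the capillary pressure head, \(z\) the vertical coordinate, \(\psi_{\mathrm{eq}}(\theta)\) the equilibrium (possibly hysteretic) retention curve, and \(\tau\) the capillary relaxation coefficient studied in the sensitivity analysis; setting \(\tau = 0\) recovers the classical equilibrium Richards equation, which cannot produce the non-monotonic saturation profiles behind fingering.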
Observing the Evolution of the Universe
How did the universe evolve? The fine angular scale (l>1000) temperature and
polarization anisotropies in the CMB are a Rosetta stone for understanding the
evolution of the universe. Through detailed measurements one may address
everything from the physics of the birth of the universe to the history of star
formation and the process by which galaxies formed. One may in addition track
the evolution of the dark energy and discover the net neutrino mass.
We are at the dawn of a new era in which hundreds of square degrees of sky
can be mapped with arcminute resolution and sensitivities measured in
microKelvin. Acquiring these data requires the use of special purpose
telescopes such as the Atacama Cosmology Telescope (ACT), located in Chile, and
the South Pole Telescope (SPT). These new telescopes are outfitted with a new
generation of custom mm-wave kilo-pixel arrays. Additional instruments are in
the planning stages.
Comment: Science White Paper submitted to the US Astro2010 Decadal Survey.
Full list of 177 authors available at http://cmbpol.uchicago.ed
Party identification and party closeness in comparative perspective
The present analysis uses data from 1974 and 1981 U.S. cross sections, which incorporate a panel, to compare the standard NES measure of party identification (ID) with a measure of partisanship derived from a party closeness question widely employed in cross-national research. Important features of the two scales are examined by transforming the closeness measure into a scale of very close, fairly close, not very close, and no preference corresponding to the seven-point ID scale. The scales are highly correlated and are similar in their reliability. More than 75% of the "independents" in the ID scale choose a party in the closeness version, and over half of these select the "fairly close" category. Respondents do not volunteer that they are independents when that alternative is not stated in the question.
LSST: from Science Drivers to Reference Design and Anticipated Data Products
(Abridged) We describe here the most ambitious survey currently planned in
the optical, the Large Synoptic Survey Telescope (LSST). A vast array of
science will be enabled by a single wide-deep-fast sky survey, and LSST will
have unique survey capability in the faint time domain. The LSST design is
driven by four main science themes: probing dark energy and dark matter, taking
an inventory of the Solar System, exploring the transient optical sky, and
mapping the Milky Way. LSST will be a wide-field ground-based system sited at
Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m
effective) primary mirror, a 9.6 deg² field of view, and a 3.2 Gigapixel
camera. The standard observing sequence will consist of pairs of 15-second
exposures in a given field, with two such visits in each pointing in a given
night. With these repeats, the LSST system is capable of imaging about 10,000
square degrees of sky in a single filter in three nights. The typical 5σ
point-source depth in a single visit in r will be ∼24.5 (AB). The
project is in the construction phase and will begin regular survey operations
by 2022. The survey area will be contained within 30,000 deg² with
δ < +34.5°, and will be imaged multiple times in six bands, ugrizy,
covering the wavelength range 320--1050 nm. About 90% of the observing time
will be devoted to a deep-wide-fast survey mode which will uniformly observe an
18,000 deg² region about 800 times (summed over all six bands) during the
anticipated 10 years of operations, and yield a coadded map to r ∼ 27.5. The
remaining 10% of the observing time will be allocated to projects such as a
Very Deep and Fast time domain survey. The goal is to make LSST data products,
including a relational database of about 32 trillion observations of 40 billion
objects, available to the public and scientists around the world.
Comment: 57 pages, 32 color figures, version with high-resolution figures
available from https://www.lsst.org/overvie
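The relation between the single-visit and coadded depths quoted in the abstract follows from simple √N photon statistics. A minimal sketch, where the ∼800 total visits come from the abstract but the even split across the six bands is an assumption of this example:

```python
import math

def coadd_gain(n_visits):
    """Depth gain in magnitudes from stacking n_visits exposures,
    assuming sky-limited photometry: m_coadd = m_single + 2.5*log10(sqrt(N))."""
    return 2.5 * math.log10(math.sqrt(n_visits))

# ~800 visits split evenly over six bands gives roughly 133 per band,
# deepening each band's coadd by about 2.7 mag over a single visit.
gain = coadd_gain(800 / 6)
```

In practice the gain per band differs from this idealised estimate, since visits are not split evenly across bands and per-visit depths vary with conditions.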
Molecular diagnosis of Burkitt's lymphoma.
BACKGROUND: The distinction between Burkitt's lymphoma and diffuse large-B-cell lymphoma is crucial because these two types of lymphoma require different treatments. We examined whether gene-expression profiling could reliably distinguish Burkitt's lymphoma from diffuse large-B-cell lymphoma.
METHODS: Tumor-biopsy specimens from 303 patients with aggressive lymphomas were profiled for gene expression and were also classified according to morphology, immunohistochemistry, and detection of the t(8;14) c-myc translocation.
RESULTS: A classifier based on gene expression correctly identified all 25 pathologically verified cases of classic Burkitt's lymphoma. Burkitt's lymphoma was readily distinguished from diffuse large-B-cell lymphoma by the high level of expression of c-myc target genes, the expression of a subgroup of germinal-center B-cell genes, and the low level of expression of major-histocompatibility-complex class I genes and nuclear factor-kappaB target genes. Eight specimens with a pathological diagnosis of diffuse large-B-cell lymphoma had the typical gene-expression profile of Burkitt's lymphoma, suggesting that they represent cases of Burkitt's lymphoma that are difficult to diagnose by current methods. Among 28 of the patients with a molecular diagnosis of Burkitt's lymphoma, the overall survival was superior among those who had received intensive chemotherapy regimens instead of lower-dose regimens.
CONCLUSIONS: Gene-expression profiling is an accurate, quantitative method for distinguishing Burkitt's lymphoma from diffuse large-B-cell lymphoma.
The National COVID Cohort Collaborative (N3C): Rationale, design, infrastructure, and deployment.
OBJECTIVE: Coronavirus disease 2019 (COVID-19) poses societal challenges that require expeditious data and knowledge sharing. Though organizational clinical data are abundant, these are largely inaccessible to outside researchers. Statistical, machine learning, and causal analyses are most successful with large-scale data beyond what is available in any given organization. Here, we introduce the National COVID Cohort Collaborative (N3C), an open science community focused on analyzing patient-level data from many centers.
MATERIALS AND METHODS: The Clinical and Translational Science Award Program and scientific community created N3C to overcome technical, regulatory, policy, and governance barriers to sharing and harmonizing individual-level clinical data. We developed solutions to extract, aggregate, and harmonize data across organizations and data models, and created a secure data enclave to enable efficient, transparent, and reproducible collaborative analytics.
RESULTS: Organized in inclusive workstreams, we created legal agreements and governance for organizations and researchers; data extraction scripts to identify and ingest positive, negative, and possible COVID-19 cases; a data quality assurance and harmonization pipeline to create a single harmonized dataset; population of the secure data enclave with data, machine learning, and statistical analytics tools; dissemination mechanisms; and a synthetic data pilot to democratize data access.
CONCLUSIONS: The N3C has demonstrated that a multisite collaborative learning health network can overcome barriers to rapidly build a scalable infrastructure incorporating multiorganizational clinical data for COVID-19 analytics. We expect this effort to save lives by enabling rapid collaboration among clinicians, researchers, and data scientists to identify treatments and specialized care and thereby reduce the immediate and long-term impacts of COVID-19.