Perspectives on Innovation in the Fashion Industry
No matter the industry, innovation focuses on turning a creative idea into a well-executed, valuable product. In just the past 100 years, the fashion industry has demonstrated that creativity is the heart of innovation. Innovations ranging from the simple, such as improving the fit of a garment, to the complex, such as developments in sustainable dyeing practices, have all aided the development of the industry today. Two major focuses for the future of fashion are technology and sustainability. By thinking ahead, innovators can implement solutions to improve the future of fashion.
Reconstruction of interacting dark energy models from parameterizations
Models with interacting dark energy can alleviate the cosmic coincidence problem by allowing dark matter and dark energy to evolve in a similar fashion. At a fundamental level, these models are specified by choosing a functional form for the scalar potential and for the interaction term. However, in order to compare to observational data it is usually more convenient to use parameterizations of the dark energy equation of state and the evolution of the dark matter energy density. Once the relevant parameters are fitted, it is important to obtain the shape of the fundamental functions. In this paper I show how to reconstruct the scalar potential and the scalar interaction with dark matter from general parameterizations. I give a few examples and show that it is possible for the effective equation of state for the scalar field to cross the phantom barrier when interactions are allowed. I analyze the uncertainties in the reconstructed potential arising from foreseen errors in the estimation of fit parameters and point out that a Yukawa-like linear interaction results from a simple parameterization of the coupling. Comment: 6 pages, 8 figures
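The reconstruction starts from the parameterized background evolution, which can be sketched numerically. The sketch below Euler-integrates the coupled continuity equations for a simple linear coupling Q = delta * rho_dm and a constant equation of state; the values of w and delta are illustrative assumptions, not the parameterizations fitted in the paper:

```python
import math

def evolve_densities(w=-0.9, delta=0.1, n_steps=10000, n_folds=3.0):
    """Euler-integrate the coupled continuity equations in e-folds N = ln a:

        drho_dm/dN = -3*rho_dm + Q          (dark matter gains energy)
        drho_de/dN = -3*(1 + w)*rho_de - Q  (dark energy pays for it)

    with the toy linear coupling Q = delta * rho_dm. Values of w and
    delta are illustrative, not fitted.
    """
    rho_dm = rho_de = 1.0  # arbitrary common initial units
    dn = n_folds / n_steps
    for _ in range(n_steps):
        q = delta * rho_dm
        rho_dm, rho_de = (rho_dm + (-3.0 * rho_dm + q) * dn,
                          rho_de + (-3.0 * (1.0 + w) * rho_de - q) * dn)
    return rho_dm, rho_de
```

With delta > 0, dark matter effectively dilutes as a^-(3-delta), so the dark-energy-to-dark-matter ratio grows more slowly than in the uncoupled case — the sense in which interacting models let the two components evolve in a similar fashion.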
Neuropsychological development in adolescents: Longitudinal associations with white matter microstructure
Important neuropsychological changes during adolescence coincide with the maturation of white matter microstructure. Few studies have investigated the association between neuropsychological development and white matter maturation longitudinally. We aimed to characterize developmental trajectories of inhibition, planning, emotion recognition and risk-taking and examine whether white matter microstructural characteristics were associated with neuropsychological development above and beyond age. In an accelerated longitudinal cohort design, n = 112 healthy adolescents between ages 9 and 16 underwent cognitive assessment and diffusion MRI over three years. Fractional anisotropy (FA) and mean diffusivity (MD) were extracted for major white matter pathways using an automatic probabilistic reconstruction technique and mixed models were used for statistical analyses. Inhibition, planning and emotion recognition performance improved linearly across adolescence. Risk-taking developed in a quadratic fashion, with stable performance between 9 and 12 and an increase between ages 12 and 16. Including cingulum and superior longitudinal fasciculus FA slightly improved model fit for emotion recognition across age. We found no evidence that FA or MD were related to inhibition, planning or risk-taking across age. Our results challenge the additional value of white matter microstructure to explain neuropsychological development in healthy adolescents, but more longitudinal research with large datasets is needed to identify the potential role of white matter microstructure in cognitive development.
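A toy illustration of how a quadratic age term captures the risk-taking trajectory described above, using synthetic scores (flat from 9 to 12, rising from 12 to 16) as a stand-in for the real data:

```python
import numpy as np

rng = np.random.default_rng(0)
age = np.linspace(9, 16, 200)
# Hypothetical risk-taking scores: stable between ages 9 and 12,
# increasing between 12 and 16, plus measurement noise.
score = 10.0 + 1.5 * np.clip(age - 12.0, 0.0, None) + rng.normal(0.0, 0.5, age.size)

def rss(degree):
    """Residual sum of squares of a polynomial age trajectory."""
    coef = np.polyfit(age, score, degree)
    return float(np.sum((np.polyval(coef, age) - score) ** 2))

linear_rss, quadratic_rss = rss(1), rss(2)
```

Because the quadratic model nests the linear one, its residuals can only shrink; a formal comparison would penalize the extra parameter (e.g. AIC) or, as in the study, use mixed models with subject-level random effects.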
A long, hard look at MCG-6-30-15 with XMM-Newton II: detailed EPIC analysis and modelling
The bright Seyfert 1 galaxy MCG-6-30-15 has provided some of the best evidence to date for the existence of supermassive black holes in active galactic nuclei. Observations with ASCA revealed an X-ray iron line profile shaped by strong Doppler and gravitational effects. In this paper the shape of the iron line, its variability characteristics and the robustness of this spectral interpretation are examined using the long XMM-Newton observation taken in 2001. A variety of spectral models, both including and excluding the effects of strong gravity, are compared to the data in a uniform fashion. The results strongly favour models in which the spectrum is shaped by emission from a relativistic accretion disc. It is far more difficult to explain the 3-10 keV spectrum using models dominated by absorption (either by warm or partially covering cold matter), emission line blends, curved continua or additional continuum components. These provide a substantially worse fit to the data and fail to explain other observations (such as the simultaneous BeppoSAX spectrum). This reaffirms the veracity of the relativistic `disc line' interpretation. The short term variability in the shape of the energy spectrum is investigated and explained in terms of a two-component emission model. Using a combination of spectral variability analyses the spectrum is successfully decomposed into a variable power-law component (PLC) and a reflection dominated component (RDC). The former is highly variable while the latter is approximately constant throughout the observation, leading to the well-known spectral variability patterns. (Abridged) Comment: 25 pages, 24 figures. Accepted for publication in MNRAS
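The two-component decomposition can be illustrated with a toy model: simulate snapshots as a power-law component with varying normalisation plus a constant reflection hump, then recover both shapes from the linear flux-flux relation in each energy bin. The spectral shapes and parameter values below are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
energy = np.linspace(3.0, 10.0, 50)     # keV grid
plc_shape = energy ** -2.0              # toy power law, photon index 2
rdc_shape = 0.1 * np.exp(-0.5 * ((energy - 6.4) / 0.5) ** 2)  # toy Fe K hump

# Each snapshot: variable PLC normalisation, constant RDC.
norms = rng.uniform(0.5, 2.0, size=100)
spectra = np.outer(norms, plc_shape) + rdc_shape

# Flux in each bin is linear in the PLC normalisation, so a straight-line
# fit per bin returns the PLC shape (slope) and the RDC (intercept).
slopes, intercepts = np.polyfit(norms, spectra, 1)
```

The recovered intercepts carry the (approximately constant) reflection spectrum, which is the essence of the flux-correlation style of spectral variability analysis.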
Constraining quasar host halo masses with the strength of nearby Lyman-alpha forest absorption
Using cosmological hydrodynamic simulations we measure the mean transmitted flux in the Lyman alpha forest for quasar sightlines that pass near a foreground quasar. We find that the trend of absorption with pixel-quasar separation distance can be fitted using a simple power law form including the usual correlation function parameters r_{0} and \gamma, so that <F> = exp(-tau_eff*(1+(r/r_{0})^(-\gamma))). From the simulations we find the relation between r_{0} and quasar mass and formulate this as a way to estimate quasar host dark matter halo masses, quantifying uncertainties due to cosmological and IGM parameters, and redshift errors. With this method, we examine data for ~3000 quasars from the Sloan Digital Sky Survey (SDSS) Data Release 3, assuming that the effect of ionizing radiation from quasars (the so-called transverse proximity effect) is unimportant (no evidence for it is seen in the data). We find that the best fit host halo mass for SDSS quasars with mean redshift z=3 and absolute G band magnitude -27.5 is log10(M/M_sun) = 12.48^{+0.53}_{-0.89}. We also use the Lyman-Break Galaxy (LBG) and Lyman alpha forest data of Adelberger et al in a similar fashion to constrain the halo mass of LBGs to be log10(M/M_sun) = 11.13^{+0.39}_{-0.55}, a factor of ~20 lower than the bright quasars. In addition, we study the redshift distortions of the Lyman alpha forest around quasars, using the simulations. We use the quadrupole to monopole ratio of the quasar-Lyman alpha forest correlation function as a measure of the squashing effect. We find that this does not have a measurable dependence on halo mass, but may be useful for constraining cosmic geometry. Comment: 10 pages, 11 figures, submitted to MNRAS
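As a sketch, the power-law form for the mean transmitted flux can be evaluated directly; the values of r0, gamma and tau_eff below are illustrative placeholders, not the paper's fitted values:

```python
import math

def mean_transmitted_flux(r, r0=5.0, gamma=1.6, tau_eff=0.32):
    """Mean Ly-alpha forest flux at comoving separation r from the quasar:

        <F>(r) = exp(-tau_eff * (1 + (r / r0) ** -gamma))

    Far from the quasar the excess term vanishes and <F> approaches the
    cosmic mean exp(-tau_eff); near the quasar absorption is enhanced.
    Parameter values here are illustrative, not the paper's fits.
    """
    return math.exp(-tau_eff * (1.0 + (r / r0) ** -gamma))
```

Fitting r_{0} to observed sightlines and then mapping r_{0} to halo mass via the simulations is the essence of the mass estimate described above.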
Natural priors, CMSSM fits and LHC weather forecasts
Previous LHC forecasts for the constrained minimal supersymmetric standard model (CMSSM), based on current astrophysical and laboratory measurements, have used priors that are flat in the parameter tan beta, while being constrained to postdict the central experimental value of MZ. We construct a different, new and more natural prior with a measure in mu and B (the more fundamental MSSM parameters from which tan beta and MZ are actually derived). We find that as a consequence this choice leads to a well defined fine-tuning measure in the parameter space. We investigate the effect of this choice on global CMSSM fits to indirect constraints, providing posterior probability distributions for Large Hadron Collider (LHC) sparticle production cross sections. The change in priors has a significant effect, strongly suppressing the pseudoscalar Higgs boson dark matter annihilation region, and diminishing the probable values of sparticle masses. We also show how to interpret fit information from a Markov Chain Monte Carlo in a frequentist fashion, namely by using the profile likelihood. Bayesian and frequentist interpretations of CMSSM fits are compared and contrasted.
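The profile-likelihood idea — keep the best likelihood found in each bin of one parameter, rather than the posterior sample count — can be sketched on a chain as follows; the Gaussian toy chain is an illustrative stand-in for a real CMSSM chain:

```python
import numpy as np

def profile_likelihood(param, loglike, n_bins=20):
    """Profile a chain over one parameter: in each bin keep the maximum
    log-likelihood seen, not the number of samples (counting samples
    would instead give the marginal Bayesian posterior)."""
    edges = np.linspace(param.min(), param.max(), n_bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    profile = np.full(n_bins, -np.inf)
    idx = np.clip(np.digitize(param, edges) - 1, 0, n_bins - 1)
    for i, ll in zip(idx, loglike):
        profile[i] = max(profile[i], ll)
    return centers, profile

# Toy chain: parameter ~ N(0, 1), log-likelihood peaked at 0.
rng = np.random.default_rng(2)
samples = rng.normal(0.0, 1.0, 5000)
logl = -0.5 * samples ** 2
centers, profile = profile_likelihood(samples, logl)
```

Because the maximum, unlike the sample count, is insensitive to how densely a region is sampled, the profile likelihood is far less dependent on the choice of prior than the marginal posterior.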
Claremont I and II - Were They Rightly Decided, and Where Have They Left Us?
[Excerpt] “Our children embody the enduring wonder of life. They hold our hopes for the future. We want them to be happy, to succeed in whatever they do both in work and in play. We want them to contribute to our country and the world in constructive ways.
But for these hopes to be realized our children must be educated; they must possess the requisite skills and knowledge to function well in this ever changing world. Yet, are we, as a society, meeting our responsibility to educate our children? What do we expect of our public schools? How important are these schools to us? Is a public education fit for the times guaranteed as a constitutional matter?
These questions loomed large in the New Hampshire Supreme Court's decisions in Claremont I and Claremont II, issued respectively in 1993 and 1997. Constituting New Hampshire's core education rulings, they are among the Court's most controversial exercises of constitutional jurisprudence.
[…]
This article concludes that the New Hampshire Supreme Court correctly determined in Claremont I that Article 83 established enforceable positive constitutional rights for the provision and funding of an adequate public education. The Court acted properly in recognizing that the judiciary had an important role to play to assure these important constitutional rights. Claremont I properly upheld the State's constitutional obligation to accord the State's public school children access to an education that would at all times enable them to be good citizens productive in their work. The decision also reflected proper regard for the prerogatives of the elected branches by leaving to them, at least initially, the development of an operational definition of adequacy in education, along with the responsibility to fashion the appropriate means to provide for it.
The Claremont II decision, however, does not earn like approbation. It fails to stand up strongly as a tax ruling. It does not constitute a good appellate review of the other Superior Court rulings against the petitioners. The Court majority, after issuing its decision, deferred to the elected branches to give them time to fashion a remedy. Its decision, however, was not well received, or easily accepted, by many in the Legislature. Only after much resistance and much delay did the elected branches manage to put in place certain educational adequacy /funding reforms.
Whatever their merits or flaws, this article sees these two decisions as having importantly and positively impacted New Hampshire's public education system. The decisions had a good deal to do with ushering in needed reforms, so that the education system now operates with a specific definition for a constitutionally adequate education, regular assessment and accountability tools, and a costing out of adequacy linked to associated funding. The decisions have thus better positioned the public education system to meet the challenges of the future.
Development of the edible blend films with good mechanical and barrier properties from pea starch and guar gum
The individual and interactive impacts of guar gum and glycerol on the pea starch-based edible film characteristics were examined using a three-factor, three-level Box–Behnken response surface design. The results showed that density and elongation at break were only significantly (p < 0.05) affected by pea starch and guar gum in a positive linear fashion. The quadratic regression coefficient of pea starch showed a significant effect (p < 0.05) on thickness, density, puncture force, water vapour permeability, and tensile strength, while tensile strength and Young's modulus were affected by the quadratic regression coefficients of glycerol and guar gum, respectively. The results were analysed using Pareto analysis of variance (ANOVA), and the developed predictive equations for each response variable showed a reliable and satisfactory fit with high coefficient of determination (R2) values (≥ 0.96). The optimized conditions with the goal of maximizing mechanical properties and minimizing water vapour permeability were 2.5 g pea starch, 0.3 g guar gum and 25 % (w/w) glycerol based on the dry film matter in 100 ml of distilled water. Generally, changes in the concentrations of pea starch, guar gum and glycerol resulted in changes in the functional properties of the films.
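A three-factor, three-level Box–Behnken design can be generated in coded units as a quick sketch; mapping -1/0/+1 back to actual factor levels (e.g. the pea starch, guar gum and glycerol amounts) is left to the experimenter:

```python
from itertools import combinations

def box_behnken_3(center_runs=1):
    """Three-factor Box-Behnken design in coded units (-1, 0, +1):
    every (+/-1, +/-1) combination for each pair of factors with the
    third factor held at its mid level (0), giving 12 edge-midpoint
    runs, plus replicated center points."""
    runs = []
    for i, j in combinations(range(3), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0, 0, 0]
                row[i], row[j] = a, b
                runs.append(row)
    runs += [[0, 0, 0] for _ in range(center_runs)]
    return runs
```

Each run supplies a row of the design matrix against which the quadratic response-surface model (linear, interaction and squared terms) is regressed.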
Legitimate Interpretation – Or Legitimate Adjudication?
Current debate about the legitimacy of lawmaking by courts focuses on what constitutes legitimate interpretation. The debate has reached an impasse in that originalism and textualism appear to have the stronger case as a matter of theory, while living constitutionalism and dynamic interpretation provide a better account of actual practice. This Article argues that if we refocus the debate by asking what constitutes legitimate adjudication, as determined by the social practice of the parties and their lawyers who take part in adjudication, it is possible to develop an account of legitimacy that produces a much better fit between theory and practice. The decisional norms employed by adjudicators include faithful agent arguments about governing texts, arguments from precedent, and arguments from settled practice, but also, in a more qualified fashion, considerations of morality and social consequences. Adjudicators mix and match these norms in reaching outcomes but do so in a way that is regarded as legitimate by the losers as well as the winners in contested adjudications. A general normative implication of this refocused account of legitimacy is that adjudicators, including high-level appeals courts, should not stray far from their basic function of dispute resolution, as opposed to law declaration.