Bedding down the embedding: IL reality in a teacher education programme
Queensland University of Technology (QUT) is one of Australia's largest universities, enrolling 30,000 students. Our Information Literacy Framework and Syllabus was endorsed as university policy in February 2001. QUT Library uses the Australian Information Literacy Standards as the basis and entry point for our syllabus. The university-wide information literacy programme promotes critical thinking and equips individuals for lifelong learning (Peacock, 2002a). Information literacy has developed as a premium agenda within the university community, as documented by Judith Peacock, the university's Information Literacy Coordinator (Peacock, 2002b).
The faculties at QUT have, over the last few years, started to work through how the information literacy syllabus will be enacted in their curricula and within the orientations of their subject areas. Attitudinal change is happening alongside a realisation that discipline content must be taught within a broader framework. Curricular and pedagogical reforms are a characteristic of the teaching environment. Phrases such as lifelong learning, generic skills, information revolution, learning outcomes and information literacy standards are now commonplace in faculty discussion. Liaison librarians are strategically placed to see the "big picture" of curricula across large-scale faculties in a large-scale university. We work with faculty in collaborative and consultative partnerships in order to implement reform.
QUT librarians offer three levels of information literacy curriculum to the university. The generic programme is characterised by free classes offered around the start of each semester. The next level is integrated teaching, developed to answer the specific needs of classes of students. The third level is embedding information literacy throughout a programme. This involves liaison librarians working to ensure that information literacy is a developmental and assessed part of the curriculum, sequenced through a programme in a similar way to traditional discipline knowledge, and utilising the IL syllabus. This paper gives a glimpse of what is happening as we attempt the process of embedding information literacy into the Bachelor of Education programme.
Gaia reference frame amid quasar variability and proper motion patterns in the data
Gaia's very accurate astrometric measurements will allow the International Celestial Reference Frame (ICRF) to be improved by a few orders of magnitude in the optical. Several sets of quasars are used to define a kinematically stable, non-rotating reference frame with the barycentre of the Solar System as its origin. Gaia will also observe a large number of galaxies, for which accurate positions and proper motions can be obtained even though they are not point-like. The optical stability of the quasars is critical, and we investigate how accurately the reference frame can be recovered. Various proper motion patterns are also present in the data; the best known is caused by the acceleration of the Solar System barycentre, presumably towards the Galactic centre. We review some other less well-known effects that are not part of standard astrometric models. We model quasars and galaxies using realistic sky distributions, magnitudes and redshifts. Position variability is introduced using a Markov chain model. The reference frame is determined using the algorithm developed for the Gaia mission, which also determines the acceleration of the Solar System. We also test a method to measure the velocity of the Solar System barycentre in a cosmological frame. We simulate the recovery of the reference frame and the acceleration of the Solar System and conclude that they are not significantly disturbed by quasar variability, which is statistically averaged out. However, a non-uniform sky distribution of the quasars can result in a correlation between the reference frame and the acceleration, which degrades the solution. Our results suggest that an attempt should be made to astrometrically determine the redshift-dependent apparent drift of galaxies due to our velocity relative to the CMB, which in principle could allow the determination of the Hubble parameter.
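The Markov-chain variability model is easy to illustrate. The following is a minimal Python sketch, not the actual Gaia (AGIS) pipeline: photocentre jitter of one quasar is simulated as a discrete-state Markov chain and then averaged over epochs, showing why the variability is statistically averaged in the frame solution. The state count, step size and transition probabilities are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(42)

    def quasar_jitter(n_epochs=70, n_states=5, step_uas=20.0):
        """Photocentre jitter of one quasar as a discrete Markov chain.

        States are offsets (micro-arcsec) along one coordinate; at each
        epoch the chain stays put (p=0.5) or steps to a neighbouring
        state (p=0.25 each way), clamped at the outermost states.
        """
        offsets = (np.arange(n_states) - n_states // 2) * step_uas
        state = n_states // 2                     # start at zero offset
        track = np.empty(n_epochs)
        for t in range(n_epochs):
            track[t] = offsets[state]
            step = rng.choice([-1, 0, 1], p=[0.25, 0.5, 0.25])
            state = int(np.clip(state + step, 0, n_states - 1))
        return track

    # A least-squares position estimate averages over epochs, so the
    # variability largely cancels: the mean offset is much smaller
    # than the single-epoch scatter.
    jitter = quasar_jitter()
    print(f"epoch mean {jitter.mean():+.1f} uas, scatter {jitter.std():.1f} uas")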
The interstellar D1 line at high resolution
High-resolution observations of the interstellar D1 line of Na I are reported in the spectra of gamma Cas, delta Ori, epsilon Ori, pi Sco, delta Cyg, and alpha Cyg. An echelle grating was used in a double-pass configuration with a CCD detector in the coude spectrograph of the 2.7 m reflector at McDonald Observatory. At least 42 kinematically distinct clouds are detected along the light paths to the five more distant stars, in addition to a single cloud seen toward delta Cyg. The absorption lines arising in 13 of the clouds are sufficiently narrow and unblended to reveal clearly resolved hyperfine-structure components split by 1.05 km/s. An additional 13 clouds apparently show comparably narrow, but more strongly blended, lines. For each individual cloud, upper limits T_max and (v_t)_max on the temperature and the turbulent velocity, respectively, are derived by fitting the observed lines with theoretical absorption profiles.
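For context, the quoted upper limits follow from a standard decomposition of the Gaussian line width; the convention below, with v_t the rms line-of-sight turbulent velocity, is an assumption, since the abstract does not fix its notation.

    % Doppler parameter b of a Gaussian profile, decomposed into
    % thermal and turbulent broadening added in quadrature:
    \begin{equation}
      b^{2} = \frac{2kT}{m_{\mathrm{Na}}} + 2 v_t^{2},
    \end{equation}
    % so setting each term in turn to zero gives the upper limits
    \begin{equation}
      T_{\max} = \frac{m_{\mathrm{Na}}\, b^{2}}{2k}, \qquad
      v_{t,\max} = \frac{b}{\sqrt{2}} .
    \end{equation}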
Electroweak Constraints from Atomic Parity Violation and Neutrino Scattering
Precision electroweak physics can provide fertile ground for uncovering new physics beyond the Standard Model (SM). One area in which new physics can appear is in so-called "oblique corrections", i.e., next-to-leading-order expansions of bosonic propagators corresponding to vacuum polarization. One may parametrize their effects in terms of quantities S and T that discriminate between conservation and non-conservation of isospin. This provides a means of comparing the relative contributions of precision electroweak experiments to constraints on new physics. Given the prevalence of strongly T-sensitive experiments, there is an acute need for further constraints on S, such as those provided by atomic parity-violating experiments on heavy atoms. We evaluate constraints on S arising from recently improved calculations in the Cs atom. We show that the top quark mass provides stringent constraints on T within the context of the Standard Model. We also consider the potential contributions of next-generation neutrino scattering experiments to improved constraints.
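For reference, and assuming the symbols dropped from this abstract are the Peskin-Takeuchi oblique parameters S and T (the standard reading of "oblique corrections"), their definitions in terms of the gauge-boson self-energies Pi are:

    \begin{equation}
      S = 16\pi \left[ \Pi_{33}'(0) - \Pi_{3Q}'(0) \right], \qquad
      T = \frac{4\pi}{s_W^{2} c_W^{2} M_Z^{2}}
          \left[ \Pi_{11}(0) - \Pi_{33}(0) \right].
    \end{equation}
    % T vanishes when weak isospin is conserved; S does not. The
    % dominant SM top-quark contribution,
    \begin{equation}
      \alpha\, T_{\mathrm{top}} \simeq \frac{3 G_F m_t^{2}}{8\sqrt{2}\,\pi^{2}},
    \end{equation}
    % is why a measured m_t tightly constrains T within the SM.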
Assessing Consumer Preferences for Organically Grown Fresh Fruit and Vegetables in Eastern New Brunswick
Keywords: consumer preferences, organic fresh produce, willingness-to-pay, Consumer/Household Economics, Crop Production/Industries, Demand and Price Analysis, Q1.
Casino Hosting: Back to the Basics
The heart and soul of Las Vegas has always been the casino industry. People from all over the world have traveled many miles to put their money on the line in the hopes of hitting the jackpot or just to see the “adult playground” so many have spoken of and viewed in multiple movies. In the 21st century the Las Vegas casino industry has competition popping up across the country and all over the world. These new casinos put intense pressure on Las Vegas to attract the “whales” and high rollers to the casinos on the Las Vegas strip. Many foreign gamblers now have casinos close to their home making it an easier, and possibly a shorter, trip for them to successfully scratch their gambling itch.
The weapon that casinos use to draw business from customers is the casino host. The host not only entertains gamblers and their entourages but must also understand the ins and outs of the casino and make high-dollar decisions on a daily basis. Many casino hosts work their way up to this position through different departments on the casino floor, starting as dealers, cage clerks, floor supervisors, etc., and have thus learned a vast amount about the casino industry and its games. But every casino customer is different, requiring a wide range of personalities throughout the casino host department. Some casino hosts are brand new to the casino industry upon hiring. How does a new employee grasp all of the important information needed to appropriately assist their customers?
The purpose of this manual is to guide new and experienced casino hosts through important decisions that affect their daily job.
The extent of the local HI halo
Forty-five high-latitude OB stars have been observed in the Ly alpha and 21 cm lines of HI in an effort to map out the vertical distribution and extent of the local HI halo. The 25 stars for which a reliable HI column density can be obtained from Ly alpha lie between 60 and 3100 pc from the plane. The principal result is that the total column density of HI at |z| > 1 kpc is, on the average, 5 ± 3 × 10^19 per sq cm, or 15% of the total N(HI). At relatively low z the data toward some stars suggest a low effective scale height and fairly high average foreground density, while toward others the effective scale height is large and the average density is low. This can be understood as the result of irregularities in the interstellar medium. A model with half of the HI mass in clouds having radii of a few pc and a Gaussian vertical distribution with sigma_z = 135 pc, and half of the mass in an exponential component with a scale height of 500 pc, gives a satisfactory fit to the data. The technique of comparing Ly alpha and 21 cm column densities is also used to discuss the problem of estimating the distance to several possibly subluminous stars.
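In symbols, the two-component model corresponds to a vertical density law of the following form; the midplane normalisations n_1 and n_2 are notation introduced here, not the paper's.

    % Two-component vertical HI distribution of the quoted model:
    \begin{equation}
      n_{\mathrm{HI}}(z) = n_1 \exp\!\left(-\frac{z^{2}}{2\sigma_z^{2}}\right)
        + n_2 \exp\!\left(-\frac{|z|}{h}\right),
      \qquad \sigma_z = 135~\mathrm{pc}, \quad h = 500~\mathrm{pc},
    \end{equation}
    % with n_1 and n_2 set so each component carries half the HI mass.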
Hobbs, William
Co. B Hope Case Det. Camp Pataduck
https://dh.howard.edu/prom_members/1042/thumbnail.jp
Asymptotic Neutronic Solutions for Fast Burst Reactor Design
Deterministic numerical methodologies for solving time-eigenvalue problems are valuable in characterizing the inherent rapid transient neutron behavior of a Fast Burst Reactor (FBR). New nonlinear solution techniques for eigenvalue problems show great promise in modeling reactor neutronics. This research utilizes nonlinear solution techniques to solve for the dominant time-eigenvalue associated with the asymptotic (exponential) solution to the neutron diffusion equation and the even-parity form of the neutron transport equation, and it lays the foundation for coupling with other physics phenomena associated with FBRs.
High security costs and proliferation risks associated with Highly Enriched Uranium (HEU) fueled FBRs are the motivation for this research. Use of Low Enriched Uranium (LEU) as fuel reduces these risks to acceptable levels. However, LEU fuel introduces complexities such as increased core volume and longer neutron lifetimes. Numerical techniques are sought to explore these complexities and determine the limitations and potential of an LEU-fueled FBR.
A combination of deterministic and stochastic computational modeling techniques is used to investigate the effects these complexities have on reactor design and performance. The Monte Carlo N-Particle (MCNP) code is used to determine criticality and to calculate reactor kinetics parameters of current and proposed designs. New deterministic methods are developed to directly calculate the fundamental time-eigenvalue in a way that will support multi-physics coupling. The methods incorporate Jacobian-Free Newton-Krylov (JFNK) solution techniques to address the nonlinear nature of the neutronics equations.
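As a concrete illustration of the nonlinear formulation, here is a minimal Python sketch of a JFNK-style solve for the dominant time-eigenvalue of a one-group, one-dimensional diffusion slab using SciPy's newton_krylov; the slab width, cross sections, neutron speed and normalisation closure are illustrative placeholders, not the models or data of this work.

    import numpy as np
    from scipy.optimize import newton_krylov

    # One-group, 1-D diffusion slab with zero-flux boundaries. The
    # asymptotic ansatz phi(x,t) = phi(x) exp(alpha*t) turns the
    # time-dependent diffusion equation into the eigenvalue problem
    #   D phi'' - Sigma_a phi + nu*Sigma_f phi = (alpha/v) phi.
    # All numbers below are illustrative placeholders, not FBR data.
    L, N = 50.0, 200                                # slab width (cm), nodes
    h = L / (N + 1)
    D, Sig_a, nuSig_f, v = 1.2, 0.10, 0.11, 2.2e9   # cm, 1/cm, 1/cm, cm/s

    def residual(u):
        phi, lam = u[:-1], u[-1]        # lam = alpha/v keeps scaling sane
        lap = np.roll(phi, -1) - 2 * phi + np.roll(phi, 1)
        lap[0] = phi[1] - 2 * phi[0]    # ghost flux = 0 at both walls
        lap[-1] = phi[-2] - 2 * phi[-1]
        F = np.empty_like(u)
        F[:-1] = D * lap / h**2 + (nuSig_f - Sig_a) * phi - lam * phi
        F[-1] = phi @ phi * h - 1.0     # normalization closes the system
        return F

    # Initial guess: normalized fundamental-mode shape, lam = 0.
    x = np.linspace(h, L - h, N)
    phi0 = np.sin(np.pi * x / L)
    phi0 /= np.sqrt(phi0 @ phi0 * h)
    u0 = np.append(phi0, 0.0)

    sol = newton_krylov(residual, u0, f_tol=1e-9)   # Jacobian-free Newton
    print(f"dominant time-eigenvalue alpha = {v * sol[-1]:.4e} 1/s")

Treating the eigenvalue as an extra unknown closed by a normalisation equation is what makes the system nonlinear, and hence a natural fit for Newton-Krylov methods.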
These new deterministic models produce data to identify LEU designs that may meet the performance requirements of proven HEU FBRs in terms of neutron burst yield and burst duration (pulse width) based on the Nordheim-Fuchs model. These computational data, together with measured performance characteristics of historical LEU FBRs, show that LEU designs can generate pulses that meet Research and Development (R&D) requirements. These modern computational neutronic results indicate that an LEU-fueled FBR is a plausible alternative to current HEU-fueled reactors.
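For context, the Nordheim-Fuchs model invoked for burst yield and pulse width is, in its standard textbook form:

    % Nordheim-Fuchs prompt-burst model: delayed neutrons neglected,
    % step prompt reactivity rho_p quenched by energy feedback -b*E(t),
    % Lambda = prompt neutron generation time.
    \begin{equation}
      \frac{dP}{dt} = \frac{\rho_p - b\,E(t)}{\Lambda}\,P ,
      \qquad E(t) = \int_0^t P(t')\,dt' ,
    \end{equation}
    % which integrates to the classic burst characteristics
    \begin{equation}
      E_\infty = \frac{2\rho_p}{b}, \qquad
      P_{\max} = \frac{\rho_p^{2}}{2\,b\,\Lambda}, \qquad
      \Delta t_{\mathrm{FWHM}} \approx \frac{3.52\,\Lambda}{\rho_p}.
    \end{equation}

The width relation makes the LEU trade-off explicit: the longer neutron lifetimes noted above enter through Lambda and directly broaden the pulse for a given prompt reactivity insertion.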
The Architectonics of Information: Ancient Topical Thought and Postmodern Information
This paper examines the usefulness of thought patterns from ancient rhetoric as they have been appropriated historically and as potentially applicable concepts for the present and future in today's interlinked electronic environment.
An earlier version of this paper, first delivered as part of a panel at the Rhetoric Society of America meeting in Tucson in May 1996, was published as "The Architectonics of Information: Ancient Topical Thought and Postmodern Cognition" in Proceedings of the Mid-America Symposium on Emerging Computer Technologies, October 1996. (The published papers are available in Information Problems at http://www.ou.edu/cas/english/agora/.) I would like to thank Jana Moring and Dianne Juby for their collaboration on the panel.