
    Peripheral cues and gaze direction jointly focus attention and inhibition of return

    Centrally presented gaze cues typically elicit a delayed inhibition of return (IOR) effect compared with peripheral exogenous cues. We investigated whether gaze cues elicit early-onset IOR when presented peripherally. Faces were presented in the left or right peripheral hemifield and then gazed upward or downward. A target appeared in one of four oblique spatial locations, giving the cue and target horizontal congruency, vertical congruency, both, or neither. After establishing that peripheral movement and gaze direction jointly facilitate target processing at a short cue-target interval (200 ms: Experiment 1), we found that IOR was evident for peripheral motion at longer intervals (800 and 2400 ms: Experiment 2). Only after 2400 ms did gaze direction additionally contribute to IOR for the specific gazed-at location, showing the inverse pattern of response times to Experiment 1. The onset of IOR for gaze cues is independent of peripheral exogenous cueing but nevertheless contributes to the allocation of attention.

    Non-Serious Serious Games

    Serious games have been shown to promote behavioural change and impart skills to players, and non-serious games have proven to have numerous benefits. This paper argues that non-serious digital games played in a ‘clan’ or online community setting can lead to real-world benefits similar to those of serious games. This paper reports the outcomes of an ethnographic study and the analysis of user-generated data from an online gaming clan. The outcomes support previous research showing that non-serious games can be a setting for improved social well-being, second-language learning, and self-esteem/confidence building. In addition, this paper presents the novel result that play within online game communities can impart benefits to players, such as treating a fear of public speaking. This paper ultimately argues that communities of gamers impart ‘serious’ benefits to their members.

    Semiotics & Syringe Pumps

    Semiotics can be a useful paradigm in HCI research, yet the cognitive process of semiosis is difficult to uncover and study empirically. Thus in the domain of HCI semiotics has largely remained a descriptive theory, able to provide a theoretical basis for the study of interfaces and interaction but unable to produce empirical data and generative research. This work, made up of several studies, aims to investigate human errors motivated by the problems of medical interfaces. It takes an empirical approach to investigate the interplay between semiotic signs and human error, attempting to uncover how signs in the interface may affect the use of interactive devices. Interfaces are created from signs: collections of symbols, icons, and indices which form a semiotic scene, a meaningful whole through which the user may interact with the underlying system. Interaction with an interface therefore relies heavily on the process of semiosis. The first study in this thesis was a questionnaire study looking at number pads as indurate signs for calculators and telephones. The questionnaire was designed to ascertain how users interpreted number pads and what features of the number pad influenced this interpretation. We found that the layout of the numerical buttons on a number pad had little to do with how the number pad was perceived, and that users based their assumptions about the use of the interface entirely upon the extra contextualizing non-numerical buttons. The wish to use a semiotic paradigm in an empirical study demanded the exploration of a novel experimental methodology. The next set of studies comprised experiments to see whether the interpretation of indurate signs could be overcome under pressure. We used computer-game-based experiments, as it was thought that they would allow complete control and manipulation of signs within the experimental environment and encourage more natural semiosis than one might expect from participants in a real-life, task-based explicit experiment. In these studies it was found that under pressure participants fell back upon the culturally fossilized meanings of the indurate signs they encountered, suggesting that indurate signs may cause misinterpretation in human-machine interaction if used ineffectively. Overall this thesis makes a contribution to semiotics by exploring the notion of indurate signs and how they are interpreted, by investigating what features of common interfaces affect semiosis, and by attempting to further the course of empirical semiotic studies. This thesis also contributes towards the use of computer games as a research tool by charting the evolution of the game-based experimental methodology over the course of this thesis.

    Warren J. Baker Center for Science and Mathematics California Polytechnic State University, San Luis Obispo Campus

    The purpose of this report is to analyze the Warren J. Baker Center for Science and Math on the California Polytechnic State University, San Luis Obispo campus. Baker Science is primarily used for classroom and laboratory instruction of college students, with study spaces, faculty offices, and assembly spaces. There are mechanical, electrical, and storage spaces ancillary to the main building functions. The building comprises six floors that are between 23,000 and 44,000 ft². The first floor contains classrooms, faculty offices, mechanical/electrical spaces, and an auditorium with fixed seating. An atrium connects floors two through five. There are walkways running down the center of the atrium with openings on both sides. These floors all contain a mix of classrooms, laboratories, faculty offices, study spaces, and supporting mechanical, electrical, and storage rooms. The prescriptive analysis of the Baker Science building determines whether the building construction complies with the applicable codes and standards. These codes and standards cover life safety, fire suppression, fire alarm and detection, and structural requirements. The life safety section analyzes the ability of the building to safely evacuate occupants in a timely manner. This is accomplished with code-specified stair and door widths, exit locations, and exit fire rating requirements. The building is protected throughout with an automatic wet pipe sprinkler system, fed from a fire pump on the first floor. The fire pump is supplied from a city water loop to the north of the building. All sprinklers in the building are quick response K-5.6 sprinklers. The fire suppression system activates the fire alarm system in the event of a fire. The alarm system can also be activated with smoke alarms, heat alarms, and manual pull stations. Activation of any part of the fire alarm system in the atrium will trigger the passive smoke control system. This system opens roof vents that allow smoke to escape, and also opens doors at the bottom of the atrium to provide makeup air. Doors to the wings of the building are released and closed to reduce the travel of smoke to the east and west wings. The structural fire protection codes provide occupancy separation requirements, and limit building height and area based on construction type. The performance-based analysis seeks to determine how well the building systems can handle a real-life fire scenario, with a focus on building occupants being able to safely evacuate the building in the event of a fire. The ability to safely exit the building is based upon the requirement for tenability to be maintained in the egress route for the entire time it takes for evacuation to be completed. The performance analysis in this report centers on two design fires: one in the atrium, and one in the lobby outside the assembly space. The design fires represent scenarios that would challenge the fire protection capabilities of the building while still having a reasonable probability of occurring. Small fires or fires in unoccupied spaces were not analyzed since they would be unlikely to test the limits of the building’s fire protection systems. The design fire in the atrium exposed occupants to smoke and combustion products, and activated the atrium smoke control system. The smoke venting was inadequate to remove the smoke created by this design fire, resulting in smoke accumulation that limited the visibility of occupants egressing on the sixth floor. The available safe egress time is 3.23 min after ignition, based on the minimum visibility of 4 m being lost on the sixth floor, while that floor has a required safe egress time of 3.96 min. The design fire in the lobby outside the auditorium also exposed occupants to smoke, limiting visibility. The available safe egress time in the lobby is 2.00 min, at which point the 13 m visibility limit is no longer maintained. The required safe egress time is 8.64 min from the time of ignition, significantly longer than the time available to occupants. Prescriptive analysis of the Baker Science building determined that the building was adequately built to the relevant life safety and fire codes. The performance-based analysis discovered some shortcomings in the building design. These faults were primarily centered on tenability times for evacuating occupants in the building. On the sixth floor in the atrium, occupants experienced reduced visibility that could impede their ability to find the exits and escape safely. The cause of the reduced visibility was the buildup of smoke from the fire, which in turn was caused by inadequate smoke removal by the smoke control system. Because the system is passive and has no fans, smoke can only be removed at a limited rate. One solution would be to install additional passive smoke vents. A better solution would be to install powered smoke vents with a rating capable of removing sufficient smoke during a challenging design fire. The second design fire likewise resulted in inadequate visibility. The tall ceiling outside the auditorium filled with smoke, limiting the visibility of evacuating occupants. Installing smoke control capability in the lobby could remedy this deficiency. An easier solution would be to install doors in the auditorium that do not egress into the lobby. This would allow auditorium occupants to avoid the smoke entirely, while reducing the congestion in the lobby for people evacuating from other parts of the first floor. The Baker Science building serves as an example of properly executed code implementation, with failings that can be exposed with demanding design fires.
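    The tenability argument above reduces to comparing the available safe egress time (ASET) against the required safe egress time (RSET) for each design fire: egress is acceptable only when ASET exceeds RSET. Below is a minimal sketch of that comparison using the times reported in the analysis; the data structure, variable names, and margin calculation are illustrative and not part of the original report.

```python
# ASET/RSET comparison for the two design fires described above.
# The times come from the report; everything else here is an
# illustrative assumption, not the report's own calculation.

design_fires = {
    "atrium (sixth floor)": {"aset_min": 3.23, "rset_min": 3.96},
    "lobby outside the auditorium": {"aset_min": 2.00, "rset_min": 8.64},
}

for scenario, times in design_fires.items():
    aset, rset = times["aset_min"], times["rset_min"]
    margin = aset - rset  # positive margin: tenability outlasts evacuation
    verdict = "tenable" if margin > 0 else "not tenable"
    print(f"{scenario}: ASET {aset:.2f} min, RSET {rset:.2f} min, "
          f"margin {margin:+.2f} min -> {verdict}")
```

    Both scenarios yield a negative margin, consistent with the shortcomings identified in the performance-based analysis.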

    Review of: Understanding and Dismantling Racism: Crowdsourcing a Pathway Model in Appalachia

    The Journal of Appalachian Health is committed to reviewing published media that relates to contemporary concepts affecting the health of Appalachia. Examining institutional racism’s impact on health, career advancement, and outcomes in Appalachian communities shapes our ability to identify and address solutions that inform the fundamental framing of health equity. Dr. Matthew F. Hudson critiques the website Understanding and Dismantling Racism: Crowdsourcing a Pathway Model in Appalachia.

    Specular and diffuse X-ray scattering studies of surfaces and interfaces

    The behaviour of thin film semiconducting and magnetic devices depends upon the chemical and physical status of the as-grown structure. Since the dimensions of many devices can be in the Angstrom and nanometre region, characterisation techniques capable of measuring chemical and physical parameters in this regime are necessary if an understanding of the effect of specimen structure on observed properties is to be achieved. This thesis uses high resolution x-ray scattering techniques to characterise sub-micron layered structures of semiconducting and magnetic materials. Double crystal diffraction is routinely employed in the semiconductor industry for the on-line inspection of sample quality. While material parameters such as sample perfection and layer composition may be rapidly deduced, the non-destructive measurement of layer thickness is more difficult (particularly for multilayered samples) and lengthy simulation procedures are often necessary to extract the thickness information from a double crystal diffraction profile. However, for semiconductor structures which act as Bragg case interferometers, oscillations (known as thickness fringes) appear in the diffracted profile. The period of these fringes can be directly related to layer thickness. Attempts to Fourier transform diffraction data, in order to automatically extract the "frequency" of thickness fringes, have previously been only partially successful. It is shown that the relatively weak intensity of the thickness fringes and the presence of the substrate peak in the analysed diffraction data drastically reduce the quality of the subsequent Fourier transform. A procedure for the manipulation of diffraction data is suggested, where an "average" envelope is fitted to the thickness fringes and used to normalise the data. The application of an auto-correlation is shown to further increase the quality of the Fourier transform of the normalised data. The application of Fourier transform techniques to the routine analysis of double crystal diffraction data is discussed. A novel technique for the measurement of absolute lattice parameters of single crystals is presented, which is capable of determining lattice constants with an absolute accuracy of around 2 parts in 10^5. The technique requires only the use of a conventional triple crystal diffractometer with motorised 2θ circle movement and the provision for a fine, precise rocking motion of the analyser. To demonstrate the technique, exemplary measurements on GaAs and InAs crystals are presented. Triple crystal diffraction analysis has been performed on three material systems of current technological interest: the Hg(_1-x)Mn(_x)Te on GaAs, the Cd(_1-x)Hg(_x)Te on CdTe/Cd(_1-x)Zn(_x)Te, and the low temperature grown GaAs systems. Studies on the Hg(_1-x)Mn(_x)Te on GaAs system reveal that the principal contribution to the rocking curve widths of layers grown using the direct alloy growth (DAG) method arises from the tilt (i.e., mosaicity) of layer sub-grains. This finding is confirmed by double crystal topography, which shows that the layers are highly mosaic with a typical grain size of (130±5)µm. Topographic studies of Hg(_1-x)Mn(_x)Te on GaAs, grown using the interdiffused multilayer process (IMP), show that sample quality is significantly improved, with single crystal material being produced using this growth method. Triple crystal diffraction studies of the Cd(_1-x)Hg(_x)Te on CdTe/Cd(_0.96)Zn(_0.04)Te systems reveal several findings. These are that the main contribution to rocking curve widths is from lattice tilts and that the tilt distribution increases as the layer thickness decreases. Further, the quality of the Cd(_0.96)Zn(_0.04)Te substrate analysed is superior to that of the CdTe, and Cd(_1-x)Hg(_x)Te layers grown on Cd(_0.96)Zn(_0.04)Te substrates are generally of higher quality than those grown on CdTe. Triple crystal analysis of MBE- and ALE-grown GaAs films, deposited at low growth temperatures, shows that, at equivalent temperatures, superior quality films are grown by the ALE technique. Narrow lattice dilation and tilt distributions are reported for GaAs films grown at temperatures as low as 300°C by the ALE method. While diffraction techniques are highly suitable for the study of relatively perfect crystalline material, they are not appropriate for the analysis of heavily dislocated or even amorphous specimens. This is not the case for the Grazing Incidence X-Ray Reflectivity (GIXR) technique, whose sensitivity is not dependent upon sample structure. The GIXR technique is currently attracting increasing interest following the development of commercial instruments. In this thesis, GIXR has been used to probe the layer thickness and interfacial roughness of a series of magnetic multilayer samples and Si/Si(_x)Ge(_1-x) superlattices. The technique is shown to be capable of measuring layer thickness to an accuracy of one monolayer. Modelling of specular GIXR data for the Si/Si(_x)Ge(_1-x) superlattices has shown that the magnitude of interfacial roughness is different for the two types of interface within the high Ge content superlattice samples, the Si(_x)Ge(_1-x)→Si interface possessing a long range sinusoidal roughness of (0.9±0.3)nm in addition to the short range roughness of (0.5±0.2)nm present at all interfaces. By collecting the diffuse scatter from a GIXR experiment, conformal, or correlated, roughness has been observed in both the multilayer and superlattice samples.
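    The envelope-normalisation and Fourier-transform procedure described above for extracting the fringe "frequency" can be sketched as follows. This is a minimal illustration under assumptions: the synthetic rocking curve, the moving-average envelope estimate, and the peak-picking details are not the thesis's actual implementation, and the dynamical-theory relation that converts the fringe period into a layer thickness is not reproduced here.

```python
import numpy as np

# Sketch of the fringe-frequency extraction sequence described above:
# (1) estimate an "average" envelope of the rocking curve, (2) normalise the
# data by it to emphasise the weak thickness fringes, (3) auto-correlate,
# (4) Fourier transform and pick the dominant fringe frequency.
# The synthetic data and parameter choices are illustrative assumptions.

theta = np.linspace(-0.1, 0.1, 4001)             # angular offset (degrees)
d_theta = theta[1] - theta[0]
true_period = 0.004                              # degrees (unknown in practice)
envelope_true = np.exp(-(theta / 0.03) ** 2)     # broad diffraction profile
curve = envelope_true * (1.0 + 0.05 * np.cos(2 * np.pi * theta / true_period))

# (1) "average" envelope: moving average over a window ~2 fringe periods wide
window = 161
envelope = np.convolve(curve, np.ones(window) / window, mode="same")

# (2) normalise, keeping only the central region where the envelope is significant
core = envelope > 0.05 * envelope.max()
fringes = curve[core] / envelope[core] - 1.0
fringes -= fringes.mean()

# (3) auto-correlation reinforces the periodic fringe component
acf = np.correlate(fringes, fringes, mode="full")[fringes.size - 1:]

# (4) Fourier transform; the strongest peak away from zero frequency gives the
# fringe frequency (cycles/degree), whose inverse is the fringe period
freqs = np.fft.rfftfreq(acf.size, d=d_theta)
spectrum = np.abs(np.fft.rfft(acf * np.hanning(acf.size)))
search = freqs > 20.0                            # ignore slowly varying residual background
peak_freq = freqs[search][np.argmax(spectrum[search])]
print(f"recovered fringe period ~ {1.0 / peak_freq:.4f} degrees (true {true_period})")
```

    The recovered period would then be converted to a layer thickness via the appropriate dynamical-theory relation for the reflection used, which is outside the scope of this sketch.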

    Mining a MOOC to examine international views of the “Smart City”

    Increasing numbers of cities are focussed on using technology to become “Smart”. Many of these Smart City programmes are starting to go beyond a technological focus to also explore the value of a more inclusive approach that values the input of citizens. However, the insights gained from working with citizens are typically focused on a single town or city. In this paper we explore whether it is possible to understand people’s opinions and views on the Smart City topics of Open Data, privacy and leadership by examining comments left on a Smart City MOOC that has been delivered internationally. In doing so, we start to explore whether MOOCs can provide a lens for examining views on different facets of the Smart City agenda from a global audience, albeit limited to the demographic of the typical MOOC user.

    FINANCIAL PERFORMANCE IN MEAT AND POULTRY MANUFACTURING AND WHOLESALING: AN HISTORICAL PERSPECTIVE

    The financial performance of meat and poultry manufacturing and wholesaling firms is examined for the period from 1970 to 1986. Measures of liquidity, solvency, profitability, cash generation, and efficiency reported in the Robert Morris Associates Annual Statement Studies are used to examine relative performance across the different industries. The results suggest a similar performance in the wholesaling and manufacturing industries across the period in terms of liquidity. Profitability levels are similar for meat and poultry firms, although the poultry firms show a higher level of variability across the period. It appears that poultry firms leveraged themselves relatively more than did meat firms during the period. In terms of cash generation and efficiency the meat manufacturing industry performs slightly better than the other industries.
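    For illustration, the categories of measures named above (liquidity, solvency, profitability, cash generation, and efficiency) can each be represented by a common financial ratio. The specific ratios and sample figures below are assumptions chosen for clarity; they are not necessarily the measures or data reported in the Robert Morris Associates Annual Statement Studies.

```python
# Illustrative ratios for the five performance categories named in the abstract.
# The ratio choices and the sample figures are hypothetical, for illustration only.

firm = {                      # hypothetical balance-sheet / income figures ($000)
    "current_assets": 1200, "current_liabilities": 800,
    "total_debt": 2500, "total_assets": 5000,
    "net_income": 300, "sales": 9000,
    "cash_from_operations": 450,
}

ratios = {
    "current ratio (liquidity)":          firm["current_assets"] / firm["current_liabilities"],
    "debt-to-assets (solvency)":          firm["total_debt"] / firm["total_assets"],
    "return on assets (profitability)":   firm["net_income"] / firm["total_assets"],
    "cash flow margin (cash generation)": firm["cash_from_operations"] / firm["sales"],
    "asset turnover (efficiency)":        firm["sales"] / firm["total_assets"],
}

for name, value in ratios.items():
    print(f"{name}: {value:.2f}")
```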