
    The Campaign: a case study in identity construction through performance

    This article undertakes a detailed case study of The Campaign, a teaching and learning innovation in media and communications that uses an online educational role-play. The case study draws on the qualitative analysis of classroom observations, online communications and semi-structured interviews, employing an interpretive approach informed by models drawn from social theory and sociotechnical theory. Educational authors argue that online educational role-plays engage students in authentic learning, and represent an improvement over didactic teaching strategies. According to this literature, online role-play systems afford students the opportunity of acting and doing instead of only reading and listening. Literature in social theory and social studies of technology takes a different view of certain concepts such as performance, identity and reality. Models such as performative self constitution and actor network theory ask us to consider the constructed nature of identity and the roles of all of the actors, including the system itself. This article examines these concepts by addressing a series of research questions relating to identity formation and mediation, and suggests certain limitations of the situationist perspective in explaining the educational value of role-play systems.

    Accuracy of vertical velocity determination

    Typical wind spectra taken at Poker Flat, Alaska, using the vertically oriented antenna show velocities of tens of centimeters to meters per second and spectral widths of 0.5 to 1 m/s. The potential errors in such measurements can be broken down into three categories: (1) those due to instrumental parameters and data processing, (2) those due to specular returns from non-horizontal surfaces, and (3) those due to other physical effects. Error analysis in vertical velocity measurement is further discussed.

    Software development environments: Present and future, appendix D

    Computerized environments which facilitate the development of appropriately functioning software systems are discussed. Their current status is reviewed and several trends exhibited by their history are identified. A number of principles, some at (slight) variance with the historical trends, are suggested, and it is argued that observance of these principles is critical to achieving truly effective and efficient software development support environments.

    LaRC local area networks to support distributed computing

    The Langley Research Center's (LaRC) Local Area Network (LAN) effort is discussed. LaRC initiated the development of a LAN to support a growing distributed computing environment at the Center. The purpose of the network is to provide an improved capability (over interactive and RJE terminal access) for sharing multivendor computer resources. Specifically, the network will provide a data highway for the transfer of files between mainframe computers, minicomputers, work stations, and personal computers. An important influence on the overall network design was the vital need of LaRC researchers to efficiently utilize the large CDC mainframe computers in the central scientific computing facility. Although there was a steady migration from a centralized to a distributed computing environment at LaRC in recent years, the workload on the central resources increased. Major emphasis in the network design was on communication with the central resources within the distributed environment. The network to be implemented will allow researchers to utilize the central resources, distributed minicomputers, work stations, and personal computers to obtain the proper level of computing power to efficiently perform their jobs.

    Texture Modelling with Nested High-order Markov-Gibbs Random Fields

    Currently, Markov-Gibbs random field (MGRF) image models which include high-order interactions are almost always built by modelling responses of a stack of local linear filters. Actual interaction structure is specified implicitly by the filter coefficients. In contrast, we learn an explicit high-order MGRF structure by considering the learning process in terms of general exponential family distributions nested over base models, so that potentials added later can build on previous ones. We add new features relatively rapidly by skipping the costly optimisation of parameters. We introduce the use of local binary patterns as features in MGRF texture models, and generalise them by learning offsets to the surrounding pixels. These prove effective as high-order features, and are fast to compute. Several schemes for selecting high-order features by composition or search of a small subclass are compared. Additionally we present a simple modification of the maximum likelihood as a texture modelling-specific objective function which aims to improve generalisation by local windowing of statistics. The proposed method was experimentally evaluated by learning high-order MGRF models for a broad selection of complex textures and then performing texture synthesis, and succeeded on much of the continuum from stochastic through irregularly structured to near-regular textures. Learning interaction structure is very beneficial for textures with large-scale structure, although those with complex irregular structure still provide difficulties. The texture models were also quantitatively evaluated on two tasks and found to be competitive with other works: grading of synthesised textures by a panel of observers; and comparison against several recent MGRF models by evaluation on a constrained inpainting task.
    Comment: Submitted to Computer Vision and Image Understanding
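    The abstract's core feature, the local binary pattern with learnable neighbour offsets, can be sketched compactly. The code below is an illustrative NumPy implementation under the standard LBP definition (one bit per neighbour, set when the neighbour is at least as bright as the centre); the `offsets` parameter stands in for the learned offsets the paper describes, and the default 8-connected neighbourhood is an assumption, not the paper's learned configuration.

```python
import numpy as np

def local_binary_pattern(img, offsets=None):
    """Per-pixel local binary pattern code.

    Each bit records whether the neighbour at a given (dy, dx) offset
    is at least as bright as the centre pixel. Making the offsets a
    parameter mimics (loosely) the paper's generalisation, where the
    offsets themselves are learned rather than fixed.
    """
    if offsets is None:  # conventional 3x3 neighbourhood (assumed default)
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                   (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    pad = max(max(abs(dy), abs(dx)) for dy, dx in offsets)
    padded = np.pad(img, pad, mode='edge')   # replicate borders
    centre = padded[pad:pad + h, pad:pad + w]
    codes = np.zeros((h, w), dtype=int)
    for bit, (dy, dx) in enumerate(offsets):
        neigh = padded[pad + dy:pad + dy + h, pad + dx:pad + dx + w]
        codes |= (neigh >= centre).astype(int) << bit
    return codes
```

    Because the comparisons are pure array slicing, the codes are cheap to compute, which matches the abstract's claim that these features are fast; in an MGRF the histogram of such codes would serve as the sufficient statistics of a potential.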

    Optimization of an optically implemented on-board FDMA demultiplexer

    Performance of a 30 GHz frequency division multiple access (FDMA) uplink to a processing satellite is modelled for the case where the onboard demultiplexer is implemented optically. Included in the performance model are the effects of adjacent channel interference, intersymbol interference, and spurious signals associated with the optical implementation. Demultiplexer parameters are optimized to provide the minimum bit error probability at a given bandwidth efficiency when filtered QPSK modulation is employed.
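    The optimisation target here is bit error probability for QPSK. As a point of reference, the ideal (interference-free) QPSK bit error rate over an AWGN channel is the textbook expression BER = Q(√(2·Eb/N0)) = ½·erfc(√(Eb/N0)); the paper's model adds adjacent-channel interference, intersymbol interference, and optical spurious terms on top of this baseline. A minimal sketch of the ideal curve, assuming only the standard formula:

```python
import math

def qpsk_ber(ebn0_db):
    """Ideal QPSK bit error probability over AWGN.

    BER = Q(sqrt(2*Eb/N0)) = 0.5 * erfc(sqrt(Eb/N0)),
    with Eb/N0 given in dB. Degradations specific to the optical
    demultiplexer would raise this floor.
    """
    ebn0 = 10 ** (ebn0_db / 10)          # dB -> linear ratio
    return 0.5 * math.erfc(math.sqrt(ebn0))
```

    For example, the familiar engineering rule of thumb that roughly 9.6 dB of Eb/N0 yields a BER near 1e-5 falls out of this formula directly.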

    Cap and Dividend: How to Curb Global Warming while Protecting the Incomes of American Families

    This essay examines the distributional effects of a “cap-and-dividend” policy for reducing carbon emissions in the United States: a policy that auctions carbon permits and distributes the revenue to the public on an equal per capita basis. The aim of the policy is to reduce U.S. emissions of carbon dioxide, the main pollutant causing global warming, while at the same time protecting the real incomes of middle-income and lower-income American families. The number of permits is set by a statutory cap on carbon emissions that gradually diminishes over time. The sale of carbon permits will generate very large revenues, posing the critical question of who will get the money. The introduction of carbon permits – or, for that matter, any policy to curb emissions – will raise prices of fossil fuels and have a regressive impact on income distribution, since fuel expenditures represent a larger fraction of income for lower-income households than for upper-income households. The net effect of carbon emission-reduction policies depends on who gets the money that households pay in higher prices. We find that a cap-and-dividend policy would have a strongly progressive net effect. Moreover, the majority of U.S. households would be net winners in purely monetary terms: that is, their real incomes, after paying higher fuel prices and receiving their dividends, would rise. From the standpoints of both distributional equity and political feasibility, a cap-and-dividend policy is therefore an attractive way to curb carbon emissions.
    Keywords: global warming; fossil fuels; climate change; carbon permits; cap-and-rebate; cap-and-auction; cap-and-trade
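    The progressivity argument rests on simple arithmetic: every recipient gets the same dividend, but extra fuel costs scale with spending, so low spenders come out ahead. The toy calculation below illustrates that mechanism only; the revenue, population, and price-increase figures are invented for illustration and are not the paper's estimates.

```python
def net_benefit(fuel_spending, price_increase_rate, total_revenue, population):
    """Net monetary effect of cap-and-dividend on one household.

    Toy model (all numbers illustrative, not from the paper): the
    household pays `fuel_spending * price_increase_rate` more for
    carbon-intensive goods, and receives an equal per-capita share of
    the permit-auction revenue (treating a household as one recipient
    for simplicity).
    """
    dividend = total_revenue / population
    extra_cost = fuel_spending * price_increase_rate
    return dividend - extra_cost
```

    With hypothetical figures of $200 billion in auction revenue split among 100 million recipients ($2,000 each) and a 15% carbon price pass-through, a household spending $10,000 on fuel-related goods nets +$500, while one spending $20,000 nets -$1,000: the same mechanism, progressive by construction.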

    Biodiversity, Ecosystem Services, and Land Use: Comparing Three Federal Policies

    Natural ecosystems provide a variety of benefits to society, known as “ecosystem services.” Fundamental to the provision of ecosystem services in a region is its underlying biodiversity, i.e., the wealth and variety of plants, animals, and microorganisms. Because the benefits from ecosystem services and biodiversity are not valued in market exchanges, private landowners tend to undersupply them. We compare and contrast the different approaches taken to providing ecosystem services on private land in three federal programs—the Endangered Species Act, the Conservation Reserve Program, and Section 404 of the Clean Water Act. The Endangered Species Act (ESA) places restrictions on land uses for private landowners if endangered species, or critical habitats for endangered species, are found on their properties. The Conservation Reserve Program (CRP) compensates farmers for removing valuable property from agricultural production to preserve wildlife habitat, water and soil quality, and other ecosystem values. Section 404 of the Clean Water Act prohibits destruction or damage to wetlands, unless individuals buy credits for equivalent wetlands created by third parties—so-called “wetlands mitigation banks.” These three policies run the gamut from a command-and-control regulatory approach to a “payment for ecosystem services” option. We summarize the economics literature on key findings from these programs.
    Keywords: biodiversity, critical habitat, conservation, green infrastructure, payment for ecosystem services, public goods, wetlands mitigation