3,678 research outputs found

    Jesuit Colleges and Universities: Uniquely Suited to Tackle the Environmental Crisis


    Experiments to Establish Current-carrying Capacity of Thermionic-emitting Cathodes Final Report, 11 Jan. 1966 - 30 Jan. 1967

    Current-carrying capabilities of thermionic cathodes in nitrogen at pressure levels above one atmosphere



    Falling Below the Line: Minimum Subgroup Size and Special Education Enrollment

    The No Child Left Behind Act of 2001 (NCLB) brought high-stakes accountability testing into every American public school with the goal of 100 percent proficiency for all students. Making adequate yearly progress (AYP) toward this proficiency goal, for the total student population as well as for at-risk subgroups, was required for schools to avoid sanctions such as school restructuring. In implementing NCLB, states had flexibility to set the minimum size of these subgroups so as to provide statistical reliability while holding as many schools as possible accountable. If a subgroup at a school did not meet the state’s minimum size, the proficiency of the students in that group was not counted toward AYP. The subjectivity of identification, along with the unreliability of test score results, makes manipulating the subgroup of students with disabilities both possible and advantageous for schools. Using data from over 1,000 Arkansas schools for the years 2004-05 to 2013-14, school-level fixed-effects analyses show that falling below the minimum subgroup cutoff of 40 is associated with a 1.5 percentage-point decrease in the share of students with disabilities at the school. For every student a school is above the cutoff, special education enrollment increases by 0.09 percentage points. Possible implications are discussed.
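
    As a rough sketch only (not the paper's actual specification), the school-level fixed-effects regression described above could take the form below; the variable names are illustrative assumptions.

        % Hypothetical school-level fixed-effects specification; variable names are
        % illustrative, not taken from the paper.
        %   SpEd_{st}  : percent of students with disabilities in school s, year t
        %   Below_{st} : 1 if the disability subgroup falls below the cutoff of 40
        %   Above_{st} : number of students by which the subgroup exceeds the cutoff
        \begin{equation}
          \mathrm{SpEd}_{st}
            = \alpha_s + \gamma_t
            + \beta_1\,\mathrm{Below}_{st}
            + \beta_2\,\mathrm{Above}_{st}
            + \varepsilon_{st}
        \end{equation}
        % \alpha_s are school fixed effects and \gamma_t are year effects; on this
        % reading, the reported -1.5 and 0.09 percentage-point estimates would
        % correspond to \beta_1 and \beta_2 respectively.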

    Television News and the Metaphor of Myth


    Using C to build a satellite scheduling expert system: Examples from the Explorer platform planning system

    Recently, many expert systems have been developed in a LISP environment and then ported to the real-world C environment before the final system is delivered. This situation may require that the entire system be completely rewritten in C, and it may actually result in a system that is put together as quickly as possible with little regard for maintainability and further evolution. With the introduction of high-performance UNIX and X-Windows-based workstations, many of the advantages of developing a first system in the LISP environment have become questionable. A C-based AI development effort is described which is based on a software tools approach, with emphasis on reusability and maintainability of code. The discussion starts with simple examples of how list processing can easily be implemented in C and then proceeds to implementations of frames and objects that use dynamic memory allocation. Implementations of procedures that use depth-first search, constraint propagation, context switching, and a blackboard-like simulation environment are described. Techniques for managing the complexity of C-based AI software are noted, especially the object-oriented techniques of data encapsulation and incremental development. Finally, all these concepts are brought together in a description of the components of planning software called the Planning And Resource Reasoning (PARR) shell. This shell has been used successfully since May 1987 to schedule services of the Tracking and Data Relay Satellite System for the Earth Radiation Budget Satellite, and it will be used for operations scheduling of the Explorer Platform in November 1991.
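
    A minimal sketch of the kind of C-based list processing the abstract alludes to; this is not the PARR code, and the cons/car/cdr naming is borrowed from LISP purely for illustration.

        /* Illustrative LISP-style list cell in C; not taken from the PARR sources. */
        #include <stdio.h>
        #include <stdlib.h>

        typedef struct cell {
            const char  *car;   /* element payload (a string here for simplicity) */
            struct cell *cdr;   /* rest of the list, or NULL at the end */
        } cell;

        /* Allocate a new cell whose head is car and whose tail is cdr. */
        static cell *cons(const char *car, cell *cdr)
        {
            cell *c = malloc(sizeof *c);
            if (c == NULL) {
                fprintf(stderr, "out of memory\n");
                exit(EXIT_FAILURE);
            }
            c->car = car;
            c->cdr = cdr;
            return c;
        }

        int main(void)
        {
            /* Build the list (ERBS TDRSS EP) by consing onto the front. */
            cell *list = cons("ERBS", cons("TDRSS", cons("EP", NULL)));

            for (cell *p = list; p != NULL; p = p->cdr)   /* walk the list */
                printf("%s\n", p->car);

            while (list != NULL) {                        /* free every cell */
                cell *next = list->cdr;
                free(list);
                list = next;
            }
            return 0;
        }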

    Using C to build a satellite scheduling expert system: Examples from the Explorer Platform planning system

    A C-based artificial intelligence (AI) development effort based on a software tools approach is discussed, with emphasis on reusability and maintainability of code. The discussion starts with simple examples of how list processing can easily be implemented in C and then proceeds to implementations of frames and objects that use dynamic memory allocation. Implementations of procedures that use depth-first search, constraint propagation, context switching, and a blackboard-like simulation environment are described. Techniques for managing the complexity of C-based AI software are noted, especially the object-oriented techniques of data encapsulation and incremental development. Finally, all these concepts are brought together in a description of the components of planning software called the Planning And Resource Reasoning (PARR) Shell. This shell has been used successfully since May 1987 to schedule services of the Tracking and Data Relay Satellite System for the Earth Radiation Budget Satellite and will be used for operations scheduling of the Explorer Platform in November 1991.
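
    A minimal sketch of the frame-and-slot representation with dynamic memory allocation mentioned above; the structure and the names frame_new, slot_put, and slot_get are assumptions for illustration, not the PARR implementation.

        /* Hypothetical frame/slot structure; allocation checks and final cleanup
         * are trimmed for brevity. */
        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        typedef struct slot {
            char        *name;    /* slot name, e.g. "spacecraft" */
            char        *value;   /* slot value, stored as a string */
            struct slot *next;
        } slot;

        typedef struct frame {
            char *name;    /* frame name, e.g. "tdrss-contact" */
            slot *slots;   /* linked list of name/value slots */
        } frame;

        static char *copy(const char *s)          /* small strdup replacement */
        {
            char *p = malloc(strlen(s) + 1);
            return p ? strcpy(p, s) : NULL;
        }

        static frame *frame_new(const char *name)
        {
            frame *f = malloc(sizeof *f);
            f->name  = copy(name);
            f->slots = NULL;
            return f;
        }

        /* Add a slot, or overwrite it if a slot of that name already exists. */
        static void slot_put(frame *f, const char *name, const char *value)
        {
            for (slot *s = f->slots; s != NULL; s = s->next)
                if (strcmp(s->name, name) == 0) {
                    free(s->value);
                    s->value = copy(value);
                    return;
                }
            slot *s = malloc(sizeof *s);
            s->name  = copy(name);
            s->value = copy(value);
            s->next  = f->slots;
            f->slots = s;
        }

        static const char *slot_get(const frame *f, const char *name)
        {
            for (const slot *s = f->slots; s != NULL; s = s->next)
                if (strcmp(s->name, name) == 0)
                    return s->value;
            return NULL;
        }

        int main(void)
        {
            frame *contact = frame_new("tdrss-contact");
            slot_put(contact, "spacecraft", "ERBS");
            slot_put(contact, "duration", "20 minutes");
            printf("%s uses %s\n", contact->name, slot_get(contact, "spacecraft"));
            return 0;
        }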

    Non-equilibrium coherence dynamics of a soft boson lattice

    We study the non-equilibrium evolution of the phase coherence of a Bose-Einstein condensate (BEC) in a one-dimensional optical lattice as the lattice is suddenly quenched from an insulating to a superfluid state. We observe slowly damped phase-coherence oscillations in the regime of large filling factor (~100 bosons per site) at a frequency proportional to the generalized Josephson frequency. The truncated Wigner approximation (TWA) predicts the frequency of the observed oscillations. Comment: 10 pages, 4 figures.