12,934 research outputs found

    The Calculation of Chemical and Ionization Equilibrium in a Conventional Shock Tube, Scientific Report No. 12

    Get PDF
    Calculating chemical and ionization equilibrium of plasma in a conventional shock tube.

    The Disciplined Use of Simplifying Assumptions

    Get PDF
    Submitted to the ACM SIGSOFT Second Software Engineering Symposium: Workshop on Rapid Prototyping, Columbia, Maryland, April 19-21, 1982. Simplifying assumptions: everyone uses them, but no one's programming tool explicitly supports them. In programming, as in other kinds of engineering design, simplifying assumptions are an important method for dealing with complexity. Given a complex programming problem, expert programmers typically choose simplifying assumptions which, though false, allow them to arrive rapidly at a program that addresses the important features of the problem without being distracted by all of its details. The simplifying assumptions are then incrementally retracted, with corresponding modifications to the initial program. This methodology is particularly applicable to rapid prototyping because the main questions of interest can often be answered using only the initial program. Simplifying assumptions can easily be misused, however. To use them effectively, two key issues must be addressed. First, the chosen assumptions should simplify the design problem significantly without changing the essential character of the program that needs to be implemented. Second, the designer must keep track of all the assumptions he is making so that he can later retract them in an orderly manner. By explicitly dealing with these issues, a programming assistant system could directly support the use of simplifying assumptions as a disciplined part of the software development process. (MIT Artificial Intelligence Laboratory)
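
    As a rough illustration of the kind of tool support the abstract calls for, the sketch below keeps an explicit registry of simplifying assumptions so they can later be retracted in an orderly manner. Everything here (the `AssumptionRegistry` class and its methods) is hypothetical and does not come from the report itself.

```python
# Hypothetical sketch of an assumption registry of the sort the paper
# envisions; none of these names come from the MIT AI Lab report.
from dataclasses import dataclass


@dataclass
class Assumption:
    name: str          # short label, e.g. "input fits in memory"
    rationale: str     # why the simplification is acceptable for now
    retracted: bool = False


class AssumptionRegistry:
    """Tracks the simplifying assumptions behind a prototype so they
    can later be retracted in an orderly manner."""

    def __init__(self):
        self._assumptions: dict[str, Assumption] = {}

    def assume(self, name: str, rationale: str) -> None:
        self._assumptions[name] = Assumption(name, rationale)

    def retract(self, name: str) -> None:
        # Marking an assumption retracted signals that the code
        # depending on it must now be generalized.
        self._assumptions[name].retracted = True

    def outstanding(self) -> list[str]:
        return [a.name for a in self._assumptions.values() if not a.retracted]


registry = AssumptionRegistry()
registry.assume("ASCII input", "prototype only needs English test data")
registry.assume("single user", "concurrency deferred until the design settles")
registry.retract("ASCII input")   # now handle full Unicode
print(registry.outstanding())     # -> ['single user']
```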

    A History of Miranda and Why It Remains Vital Today

    Get PDF

    Structural and lithologic study of northern California Coast Range and Sacramento Valley, California

    Get PDF
    The author has identified the following significant results. Photogeologic examination of repetitive multispectral ERTS-1 imagery of Northern California has disclosed several systems of linear features which may be important for the interpretation of the structural history of California. They are separated from an orthogonal system of linears in the Klamath Mts. by a set of discontinuous southeast-trending linear features (the Mendocino system), which is traceable from the Pacific Coast, at Cape Mendocino, into the eastern foothills of the Sierra Nevada. Within the Sierra Nevada, the Mendocino system separates the north-trending Sierran system from a set of linears characteristic of the Modoc Plateau. With minor exceptions, little overlap exists among the systems, which suggests a decipherable chronology and evolutionary history for the region. The San Andreas system of linears appears to truncate or coexist with most of the other systems in the northern Coast Ranges. The Mendocino system truncates the Klamath, Sierran, and Modoc systems. The Sierran system may represent fundamental and long-persisting pre-late-Paleozoic zones of crustal weakness which have been reactivated from time to time. The Mendocino system possibly developed in the early Mesozoic and is important to the structural framework of Northern California.

    Scalable Population Synthesis with Deep Generative Modeling

    Full text link
    Population synthesis is concerned with the generation of synthetic yet realistic representations of populations. It is a fundamental problem in the modeling of transport, where synthetic populations of micro-agents represent a key input to most agent-based models. In this paper, a new methodological framework for how to 'grow' pools of micro-agents is presented. The framework adopts a deep generative modeling approach from machine learning based on a Variational Autoencoder (VAE). Compared to previous population synthesis approaches, including Iterative Proportional Fitting (IPF), Gibbs sampling, and traditional generative models such as Bayesian Networks or Hidden Markov Models, the proposed method allows fitting the full joint distribution in high dimensions. The proposed methodology is compared with a conventional Gibbs sampler and a Bayesian Network using a large-scale Danish trip diary. It is shown that, while these two methods outperform the VAE in the low-dimensional case, they both suffer from scalability issues when the number of modeled attributes increases. It is also shown that the Gibbs sampler essentially replicates the agents from the original sample when the required conditional distributions are estimated as frequency tables. In contrast, the VAE addresses the problem of sampling zeros by generating agents that differ from those in the original data but have similar statistical properties. The presented approach can support agent-based modeling at all levels by enabling richer synthetic populations with smaller zones and more detailed individual characteristics. Comment: 27 pages, 15 figures, 4 tables.
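
    As a minimal sketch of this approach, assuming PyTorch and one-hot-encoded categorical person attributes (the architecture sizes and all names below are illustrative, not taken from the paper), a VAE for population synthesis might look like this:

```python
# Minimal VAE sketch for categorical population synthesis (PyTorch assumed;
# this is not the paper's published code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class PopulationVAE(nn.Module):
    def __init__(self, cardinalities, latent_dim=8, hidden=64):
        super().__init__()
        self.cardinalities = cardinalities   # e.g. [2, 5, 9] categories per attribute
        in_dim = sum(cardinalities)          # one-hot width of a person record
        self.enc = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, latent_dim)
        self.logvar = nn.Linear(hidden, latent_dim)
        self.dec = nn.Sequential(nn.Linear(latent_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, in_dim))

    def forward(self, x_onehot):
        h = self.enc(x_onehot)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: sample z while keeping gradients.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.dec(z), mu, logvar


def loss_fn(logits, x_labels, mu, logvar, cardinalities):
    # One cross-entropy term per categorical attribute, plus the KL term.
    recon, offset = 0.0, 0
    for i, k in enumerate(cardinalities):
        recon = recon + F.cross_entropy(logits[:, offset:offset + k],
                                        x_labels[:, i], reduction="sum")
        offset += k
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl


def sample_agents(model, n):
    # New synthetic agents: decode latent draws, then sample each attribute.
    with torch.no_grad():
        logits = model.dec(torch.randn(n, model.mu.out_features))
        agents, offset = [], 0
        for k in model.cardinalities:
            probs = F.softmax(logits[:, offset:offset + k], dim=1)
            agents.append(torch.multinomial(probs, 1))
            offset += k
        return torch.cat(agents, dim=1)   # (n, num_attributes) integer labels
```

    Sampling agents from fresh latent draws, rather than resampling the training records, is what lets the model produce attribute combinations absent from the original sample, addressing the sampling-zeros problem the abstract describes.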

    Introduction: The Challenge of Risk Communication in a Democratic Society

    Get PDF
    The symposium editors review key issues concerning the relationship between risk communication and public participation.

    THE RECOMMENDATIONS FOR OPEN HARBOR INITIATIVE

    Get PDF
    The efficiency of harbor management plays a significant economic role for a nation in various respects, including trade, logistics, and manufacturing. The visibility of harbor activities and management determines the performance of the whole logistics chain. Harbor agencies continuously strive to provide better operating models to stakeholders by collecting and analyzing the data generated by these activities. To expedite this improvement, an Open Innovation Model is called for to encourage more special-interest groups to contribute their work; the harbor agencies disclose the datasets derived from those service activities through government Open Data platforms. Since there is no clear picture of how these contributors would use the datasets in their research, a considerable requirements gap exists between the dataset provider (the harbor agencies) and the consumers (the interest groups). This paper surveys the open datasets provided by advanced harbors, using textual analysis and text mining to surface potential requirements for followers of the Open Harbor initiative such as Taiwan. Taking the example of Taiwan's Open Harbor initiative, it reexamines the potential meaning of the already-opened datasets and identifies concretely where they could be enhanced to bring more value to the interest groups. Based on these findings, the paper presents initiative realization models through Enterprise Architecture (a methodology for defining information systems from strategic planning through realization) as recommendations to harbor agencies pursuing operational excellence.
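
    As a minimal sketch of the kind of textual analysis the abstract describes, assuming scikit-learn and invented example descriptions (nothing below comes from the survey itself), one could cluster open-dataset descriptions to surface recurring requirement themes:

```python
# Hypothetical sketch: cluster harbor open-dataset descriptions with TF-IDF
# to make recurring themes visible. The strings below are invented examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

descriptions = [
    "vessel arrival and departure schedules by berth",
    "container throughput statistics per terminal",
    "berth occupancy and crane utilization records",
    "pilotage and tug service activity logs",
]

# Represent each dataset description as a TF-IDF vector, then group
# similar descriptions so common requirement themes emerge.
vectors = TfidfVectorizer(stop_words="english").fit_transform(descriptions)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for text, label in zip(descriptions, labels):
    print(label, text)
```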

    Putting Black Kids into a Trick Bag: Anatomizing the Inner-City Public School Reform

    Get PDF
    Part I of this Article discusses the history of Brown and the legal and political barriers that prevented the nation from fulfilling Brown's promise. Part II examines the phenomenon of White flight, which resulted from efforts to implement the court-ordered desegregation of public schools; the political and economic effects of White flight on school reform efforts are also examined. Part III provides the reader with possible explanations for why school desegregation failed. The author argues that the unexpected complexity of the task of desegregation, the lack of unified direction among the judiciary and local political entities, and beliefs about the effects that school desegregation would have on White children prevented desegregation efforts from being successful. Part IV analyzes the various alternatives to court-ordered school desegregation that developed as a result of the legal, social, and political barriers that prevented court-ordered desegregation from taking place. Part V briefly surveys the school reform efforts of four cities. Part VI discusses the role of school finance in relation to student achievement; the property tax, as the major source of funding for public schools, is examined, as are the effects of funding disparities between affluent and poor school districts. Part VII follows with a discussion of the use of testing as a method of school reform.