452 research outputs found

    Studies in the design and implementation of programming languages for symbol manipulation

    Get PDF
    Compared with the development of computing hardware, the development of programming languages has followed a different course. Hardware innovations such as the use of transistors and integrated circuitry have resulted in machines with very substantially improved capabilities, making older machines and even comparatively modern machines obsolescent. The programming languages currently in most widespread use, however, remain those which were already in use as many as ten years ago, namely FORTRAN, ALGOL 60, and COBOL. Nevertheless, considerable improvements can be made to these languages. The reasons why such improvements have not been made appear to be twofold. Firstly, they are regarded as 'standard' languages, a status which, in order to facilitate the transferability of programs, has made them virtually immutable. Secondly, they can be employed in almost all programming situations without the need for change. Instead, very many other languages have been designed and implemented with particular objectives in view, but these almost invariably limit their application to a narrow field. Only recently have attempts been made to unify some of these developments under the cloak of a single language (PL/1 and ALGOL 68). Data structures are a particular example of the features that have been incorporated. There are still considerable omissions, however. For instance, neither language has incorporated list processing or symbol manipulation facilities within its basic framework. The latter omission is the most surprising. With the increased capabilities of modern computers and the consequent broadening of their range of application, techniques involving symbol manipulation are becoming increasingly important. Natural language processing, such as the analysis of texts for authorship and mechanical translation, and formal manipulations, such as those involved in mechanical theorem-proving and algebraic formula manipulation, are some obvious applications. The last mentioned, the algebraic manipulation of formulae, is one of the most important applications, and several systems, notably FORMAC, have been developed for this purpose. With the advent of multi-access computing systems, much greater interaction between man and machine is becoming possible, and it is here that the advantages of algebraic manipulation and mathematical assistance packages are felt most strongly. This further demonstrates the need for symbol manipulation facilities to be available alongside normal arithmetic facilities in a programming language, for the formulae must not only be manipulated but also evaluated in normal arithmetic terms. This combination has not been achieved completely satisfactorily in any language developed in the past. The present investigation is an attempt to overcome this deficiency, and a language called ASTRA has been the result. Before discussing the design and implementation of ASTRA, several existing languages are examined in order to discern the desirable properties of a language for symbol manipulation. It is the belief of the present author that the features of ASTRA described herein represent an advance on previous languages. The methods used in the ASTRA compiler are also described.
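
    The abstract does not give ASTRA's syntax, but the combination it argues for, symbolic manipulation of formulae alongside ordinary arithmetic evaluation, can be illustrated with a minimal modern sketch in Python using SymPy (purely as an analogy; SymPy postdates ASTRA and is not its implementation):

```python
# A minimal sketch (Python/SymPy, not ASTRA) of combining symbolic
# manipulation with ordinary arithmetic evaluation in one language.
import sympy as sp

x = sp.symbols('x')
f = (x**2 - 1) / (x - 1)

simplified = sp.simplify(f)          # symbolic manipulation: -> x + 1
derivative = sp.diff(simplified, x)  # further manipulation:  -> 1

# ...and normal arithmetic evaluation of the same object:
value = simplified.subs(x, 3.5)      # -> 4.5

print(simplified, derivative, value)
```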

    Exploring barriers of m-commerce adoption in SMEs in the UK: Developing a framework using ISM

    Get PDF
    In the modern business era, mobile commerce (m-commerce) is changing the way business is conducted using the Internet. However, the prominence of m-commerce among small and medium-sized enterprises (SMEs) in the UK is minimal. The purpose of this study is to evaluate the existing literature and to extend the research surrounding the barriers that prevent the adoption of m-commerce amongst SMEs. The study uses an Interpretive Structural Modelling (ISM) and MICMAC approach for guiding and helping managers of SMEs. Data were collected from an expert participant group, each of whom had extensive knowledge of m-commerce. The findings reveal the unstable nature of the variables in terms of their impact on each other and on themselves. The factors listed in the proposed framework, and the interrelationships between them, highlight the multi-dimensional nature of m-commerce adoption prevention. This observation demonstrates the importance of analysing the data as a collective entity rather than viewing the barriers in isolation. The findings also identified ‘perceived risk’ as a key barrier, showing how personal perceptions of adoption can significantly affect the outcome and whether other variables come into effect.
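
    For readers unfamiliar with the method, the core ISM/MICMAC computation can be sketched in a few lines of Python. The barrier matrix below is hypothetical; the study's actual expert-derived matrix is not reproduced in this abstract:

```python
# A hypothetical sketch of the core ISM/MICMAC computation:
# transitive closure of an expert-judged reachability matrix,
# then driving and dependence powers for MICMAC classification.
import numpy as np

# Hypothetical initial reachability matrix for four barriers:
# entry [i, j] = 1 if barrier i is judged to influence barrier j
# (diagonal set to 1 by convention).
A = np.array([
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
], dtype=int)

# Final reachability matrix: transitive closure (Warshall's algorithm).
R = A.copy()
n = len(R)
for k in range(n):
    for i in range(n):
        for j in range(n):
            R[i, j] = R[i, j] | (R[i, k] & R[k, j])

# MICMAC inputs: driving power = row sums, dependence power = column sums.
driving = R.sum(axis=1)
dependence = R.sum(axis=0)
print("driving:", driving)
print("dependence:", dependence)
```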

    Validation of Observed Bedload Transport Pathways Using Morphodynamic Modeling

    Get PDF
    Phenomena related to braiding, including local scour and fill, channel bar development, migration, and avulsion, make numerical morphodynamic modeling of braided rivers challenging. This paper investigates the performance of a Delft3D model, in a 2D depth-averaged formulation, in simulating the morphodynamics of an anabranch of the Rees River (New Zealand). Model performance is evaluated using data from field surveys collected on the falling limb of a major high-flow event, and using several sediment transport formulas. Initial model results suggest generally good agreement between observed and modeled bed levels. However, some discrepancies in the bed level estimations were noticed, leading to errors in the estimated bed levels, water depths, and water velocities.
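
    The abstract does not name the paper's error metric, but a common way to score modeled bed levels against surveyed ones, sketched below with hypothetical elevations, is root-mean-square error together with the Brier Skill Score relative to the initial bathymetry:

```python
# A hedged illustration (not necessarily the paper's exact procedure)
# of scoring modeled bed levels against observed ones.
import numpy as np

def bed_level_skill(z_obs, z_mod, z_init):
    """z_obs, z_mod, z_init: bed elevations (m) on a common grid."""
    rmse = np.sqrt(np.mean((z_mod - z_obs) ** 2))
    # BSS = 1 - MSE(model vs obs) / MSE(no-change baseline vs obs);
    # 1 is perfect, 0 is no better than assuming the bed never moved.
    mse_model = np.mean((z_mod - z_obs) ** 2)
    mse_baseline = np.mean((z_init - z_obs) ** 2)
    bss = 1.0 - mse_model / mse_baseline
    return rmse, bss

# Hypothetical elevations along one cross-section (metres):
z_init = np.array([1.00, 0.80, 0.60, 0.80, 1.00])  # pre-event bed
z_obs  = np.array([1.05, 0.70, 0.50, 0.85, 1.00])  # surveyed bed
z_mod  = np.array([1.02, 0.75, 0.55, 0.80, 0.98])  # Delft3D output
print(bed_level_skill(z_obs, z_mod, z_init))
```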

    Conditions for spontaneous homogenization of the Universe

    Full text link
    The present-day Universe appears to be homogeneous on very large scales. Yet when the causal structure of the early Universe is considered, it becomes apparent that the early Universe must have been highly inhomogeneous. The current paradigm attempts to answer this problem by postulating the inflation mechanism. However, in order to start, inflation requires a homogeneous patch of at least the horizon size. This paper examines whether dynamical processes of the early Universe could lead to homogenization. In the past, similar studies seemed to imply that the set of initial conditions that leads to homogenization is of measure zero. This essay proves the contrary: the set of initial conditions for spontaneous homogenization of cosmological models can form a set of non-zero measure. Comment: 7 pages. Fifth Award in the 2010 Gravity Research Foundation essay competition.

    Spin Transfer Measurements for (p,n) Reactions at Intermediate Energy

    Get PDF
    This research was sponsored by the National Science Foundation Grant NSF PHY 87-1440

    Greenhouse gas and ammonia emission mitigation priorities for UK policy targets

    Get PDF
    Acknowledgements: Many thanks to the Association of Applied Biologists for organising and hosting the ‘Agricultural greenhouse gases and ammonia mitigation: Solutions, challenges, and opportunities’ workshop. This work was supported with funding from the Scottish Government’s Strategic Research Programme (2022-2027, C2-1 SRUC) and BBSRC (BBS/E/C/000I0320 and BBS/E/C/000I0330). We also acknowledge support from UKRI-BBSRC (United Kingdom Research and Innovation, Biotechnology and Biological Sciences Research Council; United Kingdom) via grants BBS/E/C/000I0320 and BBS/E/C/000I0330, and Rothamsted Research's Science Initiative Catalyst Award (SICA) supported by BBSRC. Peer reviewed. Publisher PDF.

    Gamma-Ray Bursts: The Underlying Model

    Full text link
    A pedagogical derivation is presented of the "fireball" model of gamma-ray bursts, according to which the observable effects are due to the dissipation of the kinetic energy of a relativistically expanding wind, a "fireball." The main open questions are emphasized, and key afterglow observations that provide support for this model are briefly discussed. The relativistic outflow is, most likely, driven by the accretion of a fraction of a solar mass onto a newly born (few) solar mass black hole. The observed radiation is produced once the plasma has expanded to a scale much larger than that of the underlying "engine," and is therefore largely independent of the details of the progenitor, whose gravitational collapse leads to fireball formation. Several progenitor scenarios, and the prospects for discriminating among them using future observations, are discussed. The production in gamma-ray burst fireballs of high-energy protons and neutrinos, and the implications of burst neutrino detection by kilometer-scale telescopes under construction, are briefly discussed. Comment: In "Supernovae and Gamma Ray Bursters", ed. K. W. Weiler, Lecture Notes in Physics, Springer-Verlag (in press); 26 pages, 2 figures.
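
    For orientation, a standard scaling from the fireball literature (well established, though not quoted in this abstract itself) relates the wind's terminal bulk Lorentz factor to the fireball's energy E and baryon load M:

```latex
% The radiation-dominated outflow accelerates as \Gamma(r) \propto r
% until its internal energy is exhausted, saturating at the
% dimensionless entropy:
\[
  \Gamma_{\max} \simeq \eta \equiv \frac{E}{M c^{2}}
  \sim 5 \times 10^{2}
  \left(\frac{E}{10^{52}\,\mathrm{erg}}\right)
  \left(\frac{M}{10^{-5}\,M_{\odot}}\right)^{-1},
\]
% i.e. only a very small baryon load permits the highly relativistic
% expansion that gamma-ray burst observations require.
```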

    Dimensionless cosmology

    Full text link
    Although it is well known that any consideration of the variations of fundamental constants should be restricted to their dimensionless combinations, the literature on variations of the gravitational constant G is entirely dimensionful. To illustrate applications of this to cosmology, we explicitly give a dimensionless version of the parameters of the standard cosmological model, and describe the physics of Big Bang Nucleosynthesis and recombination in a dimensionless manner. The issue that appears to have been missed in many studies is that in cosmology the strength of gravity is bound up in the cosmological equations, and the epoch at which we live is a crucial part of the model. We argue that it is useful to consider the hypothetical situation of communicating with another civilization (with entirely different units), comparing only dimensionless constants, in order to decide whether we live in a Universe governed by precisely the same physical laws. In this thought experiment, we would also have to compare epochs, which can be defined by giving the value of any one of the evolving cosmological parameters. By setting things up carefully in this way, one can avoid the inconsistent results that arise, when considering variable constants, from effectively fixing more than one parameter today. We show examples of this effect by considering microwave background anisotropies, being careful to maintain dimensionlessness throughout. We present Fisher matrix calculations to estimate how well the fine structure constants for electromagnetism and gravity can be determined with future microwave background experiments. We highlight how one can be misled by simply adding G to the usual cosmological parameter set.
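
    The Fisher matrix forecasts mentioned at the end can be illustrated generically. The toy model and noise below are hypothetical, not the paper's actual CMB computation:

```python
# A generic toy sketch of Fisher matrix forecasting: for observables
# d_i(theta) with Gaussian noise sigma_i, the Fisher matrix is
#   F_ab = sum_i (1/sigma_i^2) (dd_i/dtheta_a)(dd_i/dtheta_b),
# and sqrt(diag(F^{-1})) forecasts the 1-sigma parameter errors.
import numpy as np

def fisher_matrix(model, theta0, sigmas, eps=1e-6):
    """Numerical Fisher matrix for model(theta) -> data vector."""
    theta0 = np.asarray(theta0, dtype=float)
    n = len(theta0)
    derivs = []
    for a in range(n):
        dt = np.zeros(n)
        dt[a] = eps
        # Central finite difference of the data vector w.r.t. theta_a.
        derivs.append((model(theta0 + dt) - model(theta0 - dt)) / (2 * eps))
    F = np.empty((n, n))
    for a in range(n):
        for b in range(n):
            F[a, b] = np.sum(derivs[a] * derivs[b] / sigmas**2)
    return F

# Hypothetical two-parameter model (amplitude and tilt) on a toy grid:
x = np.linspace(1.0, 2.0, 20)
model = lambda th: th[0] * x ** th[1]
F = fisher_matrix(model, theta0=[1.0, 0.5], sigmas=np.full_like(x, 0.01))
forecast = np.sqrt(np.diag(np.linalg.inv(F)))  # 1-sigma errors
print(forecast)
```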