
    2D Graphical Arrangement Tool incorporating L-System

    Many arrangement and drawing tools have been developed, and they have made drawing much easier: anybody can draw whatever they want with them. However, none of the drawing tools currently on the market integrates the L-System. An L-System is a model consisting of an initial string, called the axiom, and a set of production rules that are iteratively applied to that string. The string is examined for single characters that have associated production rules; when a rule is found, the string defined by that rule is substituted for the character. This project is a drawing tool that integrates the L-System, applied in a way that lets the user play around with the models; by using the tools provided, the user can see the usefulness of the L-System. The objectives of this project are to create a user interface that integrates a 2D L-System with the functions of an arrangement tool, to understand the L-System, to understand the OpenGL User Interface library (GLUI), and to show the usefulness of the 2D L-System. The project is developed using OpenGL and C++, and the engine for the L-System tree uses LParser.
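    To make the rewriting mechanism described above concrete, a minimal illustrative L-System rewriter is sketched below. It is a sketch only: the branching rule, the function names and the choice of Haskell are assumptions for illustration, not the grammars or the C++/LParser code used in the project.

        -- Minimal L-System rewriter: an axiom plus production rules applied iteratively.
        import Data.Maybe (fromMaybe)

        type Rules = [(Char, String)]

        -- Rewrite one generation: each character that has a production rule is
        -- replaced by the rule's right-hand side; other characters are copied as-is.
        step :: Rules -> String -> String
        step rules = concatMap (\c -> fromMaybe [c] (lookup c rules))

        -- Apply n generations of rewriting, starting from the axiom.
        generate :: Rules -> String -> Int -> String
        generate rules axiom n = iterate (step rules) axiom !! n

        main :: IO ()
        main = do
          let rules = [('F', "F[+F]F[-F]F")]   -- hypothetical branching rule
              axiom = "F"
          putStrLn (generate rules axiom 2)

    A drawing tool would then interpret the resulting string as turtle-graphics commands, for example 'F' as a forward line segment, '+' and '-' as turns, and '[' and ']' as pushing and popping a branch.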

    Lessons learned in the transition to Ada from FORTRAN at NASA/Goddard

    Two dynamics satellite simulators were developed from the same requirements, one in Ada and the other in FORTRAN. The purpose of the research was to find out how well the prescriptive Ada development model worked for developing the Ada simulator. The FORTRAN simulator development, as well as past FORTRAN developments, provided a baseline for comparison. Since this was the first simulator developed, the prescriptive Ada development model had many similarities to the usual FORTRAN development model. However, it was modified to include longer design and shorter testing phases, which is generally expected with Ada developments. One result was that the percentage of time the Ada project spent in the various development activities was very similar to the percentage of time spent in these activities when doing a FORTRAN project. Another finding was the difficulty the Ada team had with unit testing as well as with integration. It was realized that adding additional steps to the design phase, such as an abstract data type analysis, and certain guidelines to the implementation phase, such as to use primarily library units and to nest sparingly, would have made development easier. These are among the recommendations to be incorporated into a new Ada development model for future projects.

    Parallel programming using functional languages

    It has been argued for many years that functional programs are well suited to parallel evaluation. This thesis investigates this claim from a programming perspective; that is, it investigates parallel programming using functional languages. The approach taken has been to determine the minimum programming which is necessary in order to write efficient parallel programs. This has been attempted without the aid of clever compile-time analyses. It is argued that parallel evaluation should be explicitly expressed, by the programmer, in programs. To achieve this, a lazy functional language is extended with parallel and sequential combinators. The mathematical nature of functional languages means that programs can be formally derived by program transformation. To date, most work on program derivation has concerned sequential programs. In this thesis Squigol has been used to derive three parallel algorithms. Squigol is a functional calculus for program derivation, which is becoming increasingly popular. It is shown that some aspects of Squigol are suitable for parallel program derivation, while other aspects are specifically oriented towards sequential algorithm derivation. In order to write efficient parallel programs, parallelism must be controlled, in particular to limit storage usage, the number of tasks and the minimum size of tasks. Over-eager evaluation or generating excessive numbers of tasks can consume too much storage, and tasks can be too small to be worth evaluating in parallel. Several programming techniques for parallelism control were tried. These were compared with a run-time system heuristic for parallelism control. It was discovered that the best control was effected by a combination of run-time system and programmer control of parallelism. One of the problems with parallel programming using functional languages is that non-deterministic algorithms cannot be expressed. A bag (multiset) data type is proposed to allow a limited form of non-determinism to be expressed. Bags can be given a non-deterministic parallel implementation. However, provided the operations used to combine bag elements are associative and commutative, the result of bag operations will be deterministic. The onus is on the programmer to prove this, but usually this is not difficult. Also, bags' insensitivity to ordering means that more transformations are directly applicable than if, say, lists were used instead. It is necessary to be able to reason about and measure the performance of parallel programs. For example, sometimes algorithms which intuitively seem to be good parallel ones are not. For some higher-order functions it is possible to devise parameterised formulae describing their performance. This is done for divide-and-conquer functions, which enables constraints to be formulated which guarantee good performance. Pipelined parallelism is difficult to analyse. Therefore a formal semantics for calculating the performance of pipelined programs is devised. This is used to analyse the performance of a pipelined Quicksort. By treating the performance semantics as a set of transformation rules, the simulation of parallel programs may be achieved by transforming programs. Some parallel programs perform poorly due to programming errors. A pragmatic method of debugging such programming errors is illustrated by some examples.
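    As a concrete illustration of the explicit parallel and sequential combinators the thesis argues for, the sketch below uses GHC's `par` and `pseq` (from the parallel package's Control.Parallel module) to express a divide-and-conquer sum with a grain-size threshold, one simple form of programmer-controlled parallelism. The function name and threshold value are illustrative assumptions, not code from the thesis.

        import Control.Parallel (par, pseq)

        -- Divide-and-conquer sum with explicit parallelism.  Sub-problems below
        -- the threshold are evaluated sequentially, so that tasks are never too
        -- small to be worth evaluating in parallel.
        psum :: Int -> [Int] -> Int
        psum threshold xs
          | length xs <= threshold = sum xs
          | otherwise =
              let (ls, rs) = splitAt (length xs `div` 2) xs
                  l = psum threshold ls
                  r = psum threshold rs
              in l `par` (r `pseq` (l + r))   -- spark l, evaluate r, then combine

        main :: IO ()
        main = print (psum 1000 [1 .. 1000000])

    Compiled with GHC's -threaded flag and run with +RTS -N, the two halves can be evaluated on separate cores; lowering the threshold generates more, smaller tasks, which is exactly the storage and task-size trade-off that the thesis's parallelism-control techniques address.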

    NASA Tech Briefs, January 1989

    Topics include: Electronic Components & Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Mechanics, Machinery, Fabrication Technology, Mathematics and Information Sciences, and Life Sciences.

    Migration of an operating system to an object-oriented paradigm

    Operating system design has moved from monolithic systems such as UNIX, where all system services are implemented in a single kernel, to microkernel designs where the majority of system services are conducted in user space. A recent trend in operating system design has been to use architectural models based upon the object-oriented paradigm. This approach promotes the modelling of system resources and resource management as an organized collection of objects, in such a way that the mechanisms, policies, algorithms, and data representations of the operating system are suitably encapsulated by the objects. Much of the research in this area to date has concentrated on the uses and benefits of object-oriented operating systems in the distributed systems arena. Similarly, almost all of these systems have been designed from the ground up. I believe that the progression towards object-oriented operating systems is likely to involve current operating systems incorporating and assimilating object-oriented features into their existing designs in a gradual manner, rather than an overnight switch to a new technology. In this light, the purpose of my thesis is to take an existing operating system and to propose a design which would migrate the original operating system's facilities and features to an object-oriented paradigm. This thesis also evaluates the advantages and disadvantages of such a design over the existing one. Finally, future enhancements and directions are proposed based on the new operating system design.

    Relative-fuzzy: a novel approach for handling complex ambiguity for software engineering of data mining models

    There are two main defined classes of uncertainty, namely fuzziness and ambiguity, where ambiguity is a 'one-to-many' relationship between the syntax and semantics of a proposition. This definition seems to ignore the 'many-to-many' relationship ambiguity type of uncertainty. In this thesis, we use the term complex uncertainty for the many-to-many relationship ambiguity type of uncertainty. This research proposes a new approach for handling the complex ambiguity type of uncertainty that may exist in data, for the software engineering of predictive Data Mining (DM) classification models. The proposed approach is based on Relative-Fuzzy Logic (RFL), a novel type of fuzzy logic. RFL defines a new formulation of the problem of the ambiguity type of uncertainty in terms of States Of Proposition (SOP). RFL describes its membership (semantic) value by using the new definition of Domain of Proposition (DOP), which is based on the relativity principle as defined by possible-worlds logic. To achieve the goal of proposing RFL, a question needs to be answered: how can these two approaches, fuzzy logic and possible-worlds logic, be combined to produce a new membership value set (and later a logic) that is able to handle fuzziness and multiple viewpoints at the same time? Achieving this goal comes via giving possible-worlds logic the ability to quantify multiple viewpoints and also to model fuzziness in each of these viewpoints, expressing that in a new set of membership values. Furthermore, a new architecture of Hierarchical Neural Network (HNN) called ML/RFL-Based Net has been developed in this research, along with a new learning algorithm and a new recalling algorithm. The architecture, learning algorithm and recalling algorithm of ML/RFL-Based Net follow the principles of RFL. This new type of HNN is considered to be an RFL computation machine. The ability of the Relative-Fuzzy-based DM prediction model to tackle the problem of the complex ambiguity type of uncertainty has been tested. Special-purpose Integrated Development Environment (IDE) software, called RFL4ASR, which generates a DM prediction model for speech recognition, has also been developed in this research. This special-purpose IDE is an extension of the definition of the traditional IDE. Using multiple sets of TIMIT speech data, the prediction model of type ML/RFL-Based Net achieved a classification accuracy of 69.2308%. This accuracy is higher than the best achievements of the WEKA data mining machines given the same speech data.

    Quality-of-service management in IP networks

    Quality of Service (QoS) in Internet Protocol (IP) networks has been the subject of active research over the past two decades. The Integrated Services (IntServ) and Differentiated Services (DiffServ) QoS architectures have emerged as proposed standards for resource allocation in IP networks. These two QoS architectures support the need for multiple traffic queuing systems to allow for resource partitioning for heterogeneous applications making use of the networks. There have been a number of specifications or proposals for the number of traffic queuing classes (Class of Service (CoS)) that will support integrated services in IP networks, but none has provided verification in the form of analytical or empirical investigation to prove that its specification or proposal will be optimum. Despite the existence of the two standard QoS architectures and the large volume of research work that has been carried out on IP QoS, its deployment still remains elusive in the Internet. This is not unconnected with the complexities associated with some aspects of the standard QoS architectures. [Continues.]

    A model for systematically investigating relationships between variables that affect the performance of novice programmers

    This research was motivated by an interest in novices learning to program and a desire to understand the factors that affect their learning. The traditional approach to performing such an investigation has been to select factors which may be important and then perform statistical tests on a few potential relationships. A new research model is proposed and tested to ensure that a thorough and systematic investigation of the data is performed. This thesis describes the data, defines the model and explains the application and validation of the model. The research process is managed by a control algorithm that is the heart of the model. This algorithm is seeded by a hypothesis that connects two variables of interest and dictates the testing of a series of hypotheses; as it does this, it also delves deeper into the data to identify additional relationships. In this research the model was applied to investigate the relationships between: learning style and achievement; programming behaviour and achievement; and learning style and programming behaviour. Learning style was assessed using Kolb's Learning Style Inventory, achievement was based on exam score, and programming behaviour was extracted from a log of student activities using a programming tool. The largest number of significant relationships was found between aspects of behaviour and achievement. The model was validated by classifying the significant hypotheses based on the research model's tree structure, the section of the programming tool in use and the literature. These three classification schemes provided a structure to explore their similarities and differences. The model was thus demonstrated to be robust and repeatable by comparing the results with those obtained both from using a programming tool and from expert opinion. This research has revealed several attributes of learning behaviour that affected the students' results within this group, including aspects of timeliness and overall volume of activity. These are suitable targets for future investigations. The research model could be applied to other data sets where an in-depth investigation into pairwise data is required.

    Renewal of a linear electrical network simulator into Ada

    A dissertation submitted to the Faculty of Engineering, University of the Witwatersrand, Johannesburg, in fulfilment of the requirements for the degree of Master of Science in Engineering. Johannesburg, 1993. Renewal is the extraction of the intellectual content (algorithms, data structures) from an existing program, and then building a new, more maintainable program using more modern programming methods and languages. A survey of the effect of software structure on maintenance highlighted the different hierarchies produced by functional and object-oriented design methods. Elecsim, a linear circuit simulator written in Pascal, was chosen as the existing program to be renewed. The new version follows the approach of decoupling the user interface and introducing an explicit scheduler. The object-oriented design technique is used extensively. Other issues addressed include online help and documentation for the program. Conclusions are drawn which are generally applicable from the specific lessons learnt from the Elecsim/Elector case study.