APT: A principled design for an animated view of program execution for novice programmers
This thesis is concerned with the principled design of a computational environment which depicts an animated view of program execution for novice programmers. We assert that a principled animated view of program execution should benefit novice programmers by: (i) helping students conceptualize what is happening when programs are executed; (ii) simplifying debugging through the presentation of bugs in a manner which the novice will understand; (iii) reducing program development time.
The design is based on principles which have been extracted from three areas: (i) the problems that novices encounter when learning a programming language; (ii) the general design principles for computer systems; and (iii) systems which present a view of program execution.
The design principles have been embodied in three 'canned' stepper displays for Prolog, Lisp and 6502 Assembler. These prototypes, called APT-0 (Animated Program Tracer), demonstrate that the design principles can be applied broadly to procedural and declarative, low- and high-level languages. Protocol data was collected from subjects using the prototypes in order to check the direction of the research and to suggest improvements in the design. These improvements have been incorporated in a real implementation of APT for Prolog.
This principled approach embodied by APT provides two important facilities which have previously not been available, firstly a means of demonstrating dynamic programming concepts such as variable binding, recursion, and backtracking, and secondly a debugging tool which allows novices to step through their own code watching the virtual machine in action. This moves towards simplifying the novice's debugging environment by supplying program execution information in a form that the novice can easily assimilate.
An experiment into the misconceptions novices hold concerning the execution of Prolog programs shows that the order of database search, and the concepts of variable binding, unification and backtracking are poorly understood. A further experiment was conducted which looked at the effect that APT had on the ability of novice Prolog programmers to understand the execution of Prolog programs. This demonstrated that the performance of subjects significantly increased after being shown demonstrations of the execution of Prolog programs on APT, while the control group who saw no demonstration showed no improvement.
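The dynamic concepts probed here, variable binding and unification, can be illustrated with a minimal sketch. The toy Python unifier below is purely illustrative (it is not part of APT, its term representation is invented, and it omits the occurs check): variables bind by extending a substitution, and unification fails on a structural mismatch.

```python
# Toy unifier illustrating Prolog-style variable binding.
# Variables are strings starting with an uppercase letter;
# compound terms are tuples like ("likes", "X", "mary").

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    # Follow bindings until reaching an unbound variable or a non-variable.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    """Return an extended substitution, or None if unification fails."""
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        return {**subst, a: b}
    if is_var(b):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None  # structural mismatch: unification fails

s = unify(("likes", "X", "mary"), ("likes", "john", "Y"), {})
print(s)  # X is bound to john, Y to mary
```

Stepping through such a substitution as it grows is precisely the kind of run-time information the experiments suggest novices lack.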
The experimental evidence demonstrates the potential of APT, and the principled approach which it embodies, to communicate run-time information to novice programmers, increasing their understanding of the dynamic aspects of the Prolog interpreter.
APT uses an object-centred representation, is built on top of a Prolog interpreter and environment, and is implemented in Common Lisp and Zeta Lisp; it runs on the Symbolics 3600 range of machines.
Embodiment in computer science learning: how space, metaphor, gesture, and sketching support student learning
Recently, correlational studies have found that psychometrically assessed spatial skills may be influential in learning computer science (CS). Correlation does not necessarily mean causation, and these correlations could arise for several reasons unrelated to spatial skills; nonetheless, the results are intriguing when considering how students learn to program and what supports their learning. The results are also hard to explain: there is no obvious match between the logic of computer programming and the logic of thinking spatially, and CS is not imagistic or visual in the same way as other STEM disciplines, since students cannot see bits or loops. Spatial abilities and STEM performance are highly correlated, but that is unsurprising given how visual most STEM work is. In this thesis, I used qualitative methods to document how space influences and appears in CS learning. My work is naturalistic and inductive, as little is known about this question, and I draw on constructivist, situative, and distributed learning theories to frame the investigation. I investigated CS learning through two avenues: first as a sense-making, problem-solving activity, and second as a meaning-making and social process between teachers and students. I wanted to understand what was actually happening in these classrooms, how students were actually learning, and what supported that learning. While looking for space, I discovered the surprising role embodiment and metaphor played as students made sense of computation and teachers expressed computational ideas. The implication is that people make meaning from their body-based, lived experiences and not just through their minds, even in a discipline such as computing, which is virtual in nature. For example, teachers use spatial language when describing a code trace: "then, it goes up here before going back down to the if-statement."
The code is not actually going anywhere; metaphor and embodiment are used to explain the abstract concept. This dissertation makes three main contributions to computing education research. First, I conducted some of the first studies on embodiment and space in CS learning. Second, I present a conceptual framework for the kinds of embodiment in CS learning. Lastly, I present evidence on the importance of metaphor for learning CS.
Empirical Studies of Novices Learning Programming
This thesis is concerned with the problems that novices have in learning to program: in particular, it is concerned with the difficulties experienced by novices learning at a distance, using instructional materials which have been designed especially for novices. One of the major problems for novices is how to link the new information which they encounter with their existing knowledge. Du Boulay, O'Shea and Monk (1981) suggest helping novices to bridge the gap between their existing knowledge and new information by teaching via a conceptual model, which serves to explain the new information in familiar terms. In this thesis, the difficulties which novices have when learning to program with the help of a conceptual model were investigated. The curricula and conceptual models of four different programming languages are examined, all of which were designed to teach novices. Du Boulay, O'Shea and Monk (1981) have suggested criteria for analysing conceptual models. It is argued, however, that these criteria do not address the presentation of the conceptual model, and so are insufficient to evaluate it. An additional form of analysis was therefore proposed and used alongside the criteria offered by Du Boulay et al. This is a way of describing the conceptual model which distinguishes three views of it: state, procedure and function, and which highlights the aspects which are important for the novice learner by identifying the different kinds of knowledge which are necessary to understand the conceptual model.
This analysis of the conceptual models showed that the environments are not as exemplary as du Boulay et al.'s criteria suggest. It indicated that three of the environments, SOLO, PT501 and DESMOND, lack a functional representation, and that the fourth, Open Logo, has problems of a different kind. An empirical study was carried out to examine the transfer effects of learning two of the languages, a high-level and a low-level language, sequentially. There was no evidence for such transfer effects. The difficulties novices have in learning the four different languages were also investigated. These studies show that even though the novices were studying environments designed for novices learning at a distance, they did not develop good levels of competence, and the problems they had fall into two main categories: programming and pedagogical. Although the different languages had different aims and curricula, novices had some problems which were common to all or most of the languages. These included understanding flow of control, developing and using programming plans, developing accurate mental models, and, in the high-level languages, understanding recursion. It is argued that some of these problems are related to the conceptual models. In particular, the difficulties novices had in developing and using plan knowledge, which is one of their main problems, can be explained by the lack of an appropriate functional description in the languages. The subjects' pedagogical problems arose from the relationship between the style and structure of the curriculum, its content, and the subjects themselves.
In all four texts the teaching material is very carefully structured, and it is suggested that this may encourage the learner to adopt an over-dependent attitude towards the text and, in some cases, to work at an inappropriate syntactic level. The relationship between the distance learning situation and the novice programmer is discussed, and recommendations are made for improving the curricula used for teaching novices programming.
Faster Than Real-Time GPGPU Radiation Pressure Modeling Methods
Solar radiation pressure (SRP) is a significant contributing dynamic force on spacecraft in all orbit regimes. Predicting, accommodating, and either leveraging or canceling its effect is paramount to effective orbit determination, maneuver and mission design. As a result, spacecraft numerical simulation requires computational models which can represent SRP with sufficient accuracy. However, the computationally intense nature of high-fidelity SRP evaluation has typically limited it to an offline computation which generates lookup data. Precomputation limits the ability of a spacecraft dynamic simulation to accommodate the myriad time-varying changes which occur to the spacecraft state during a mission.
In the past decade the computer graphics industry has driven the development of highly parallel graphics processing units (GPUs) capable of performing many thousands of floating-point operations concurrently. General-purpose GPU programming (GPGPU) has been leveraged particularly in engineering and the sciences, where the high computational power of parallel GPU hardware presents the opportunity for significant increases in the size and dimension of computational problems now manageable on personal computers.
This dissertation presents two modeling approaches which take advantage of the GPGPU capability of commodity GPU hardware. The first contribution is a modeling approach which utilizes the vector graphics application programming interface (API) Open Graphics Library (OpenGL) and the GPGPU computing API Open Computing Language (OpenCL) to develop a high-geometric-fidelity SRP modeling approach. The OpenGL-CL modeling approach computes SRP-induced force and torque across a detailed spacecraft mesh model. The method utilizes the shared OpenGL-OpenCL context to pass modeling data between the two APIs. The OpenGL render pipeline is manipulated to render the sun-frame projected surface of the spacecraft into OpenGL texture data objects. A custom OpenCL parallel reduction kernel then computes the SRP force and torque across the spacecraft geometry rendered into the textures. The method achieves faster-than-real-time computation speeds while accommodating spacecraft meshes with many thousands of vertices, arbitrary articulated components and detailed spacecraft material optical parameters.
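As a rough illustration of the reduction step, here is a minimal NumPy sketch that sums per-pixel force contributions into a single body force and torque. The pixel data, body frame, and scales are invented stand-ins for the OpenGL texture contents; this is not the dissertation's kernel, only the arithmetic it parallelizes:

```python
import numpy as np

# Sketch of the reduction step only: each rendered pixel contributes a
# small SRP force vector at a body-frame position; the kernel sums these
# into one net force and one net torque about the centre of mass.

def srp_reduce(pixel_forces, pixel_positions, com):
    """Sum per-pixel forces, and torques about the centre of mass `com`."""
    force = pixel_forces.sum(axis=0)
    torque = np.cross(pixel_positions - com, pixel_forces).sum(axis=0)
    return force, torque

rng = np.random.default_rng(0)
forces = rng.normal(scale=1e-9, size=(1024, 3))   # N per pixel (made up)
positions = rng.uniform(-1, 1, size=(1024, 3))    # m, body frame (made up)
F, T = srp_reduce(forces, positions, com=np.zeros(3))
print(F.shape, T.shape)  # (3,) (3,)
```

On the GPU, the same sum is computed as a tree-structured parallel reduction across work-groups rather than a serial loop, which is what makes the per-frame evaluation fast enough for real-time use.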
The second contribution is a GPU-based parallel ray tracing modeling approach which exhibits faster-than-real-time evaluation speeds. Techniques and algorithms from the computer graphics discipline are used to develop and implement a method which computes SRP force and torque across a detailed spacecraft triangulated mesh model. Efficient data structures such as a bounding volume hierarchy (BVH) reduce the computational burden by shrinking the ray-surface intersection search space. Accurate ray reflections are computed for complex materials by applying a quasi-Monte Carlo integration method and importance sampling. Complex material bidirectional reflectance distribution functions (BRDFs) are implemented both as ideal mirror-like specular and Lambertian diffuse models and as microfacet BRDF models. Arbitrary spacecraft articulation is accommodated at run time with no appreciable reduction in computational speed.
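The Lambertian-diffuse part of the sampling can be sketched in a few lines. The cosine-weighted hemisphere sampler below is the standard graphics technique for importance-sampling a diffuse BRDF; the local-frame convention (+z is the surface normal) and the use of pseudo-random rather than quasi-random points are assumptions for the sketch, not details taken from the dissertation:

```python
import numpy as np

# Cosine-weighted importance sampling of reflected-ray directions for a
# Lambertian surface, in a local frame where +z is the surface normal.
# Sampling with pdf proportional to cos(theta) concentrates rays where
# the diffuse BRDF contributes most, reducing Monte Carlo variance.

def cosine_weighted_samples(n, rng):
    """Draw n unit directions on the +z hemisphere with pdf cos(theta)/pi."""
    u1 = rng.random(n)
    u2 = rng.random(n)
    r = np.sqrt(u1)              # radius of the projected disk sample
    phi = 2.0 * np.pi * u2
    x = r * np.cos(phi)
    y = r * np.sin(phi)
    z = np.sqrt(1.0 - u1)        # cos(theta); x^2 + y^2 + z^2 = 1
    return np.stack([x, y, z], axis=1)

dirs = cosine_weighted_samples(10_000, np.random.default_rng(1))
print(dirs[:, 2].mean())  # mean cos(theta), analytically 2/3 for this pdf
```

A quasi-Monte Carlo variant would replace `rng.random` with a low-discrepancy sequence (e.g. Sobol or Halton points) while keeping the same mapping to the hemisphere.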
Both SRP models utilize the latent computing power of the GPU, which exists in the large majority of consumer-grade personal computing systems. Further access to latent computing power is enabled by the development of a software simulation communication middleware called Black Lion (BL). The third contribution of this thesis is the description of a novel software architecture and the design principles applied to the development of the BL software. Black Lion enables the integration of multiple local or distributed heterogeneous applications never intended to run in a cooperative setting. It is shown that BL enables access to more powerful latent personal computing resources by providing a means to transparently facilitate distributed simulation across multiple simulation nodes and computers.
Finally, this dissertation demonstrates the utility of both modeling methods through their application in two case studies. First, high-fidelity SRP effects are computed for an ongoing asteroid sample return mission, and agreement between the methods is demonstrated. Both SRP modeling approaches make significant use of pre- and post-launch engineering data, and the utility of direct access to a model's physical parameters is demonstrated in an analysis of contributors to possible error between modeled and estimated SRP accelerations. Second, the capability of fast computational speed paired with high geometric resolution is demonstrated for both the OpenGL-CL and ray tracing methods: each is employed in the simulation and long-term propagation of realistic multi-layer insulation (MLI) debris object mesh models, and the effect of departing from the typical flat-plate MLI model is investigated.
Seeing things as people: anthropomorphism and common-sense psychology
This thesis is about common-sense psychology and its role in cognitive science. Put simply, the argument is that common-sense psychology is important because it offers clues to some complex problems in cognitive science, and because common-sense psychology has significant effects on our intuitions, both in science and on an everyday level.
The thesis develops a theory of anthropomorphism in common-sense psychology. Anthropomorphism, the natural human tendency to ascribe human characteristics (and especially human mental characteristics) to things that aren't human, is an important theme in the thesis: it reveals an endemic anthropocentricity that deeply influences our thinking about other minds. The thesis then constructs a descriptive model of anthropomorphism in common-sense psychology, and uses it to analyse two studies of the ascription of mental states. The first, Baron-Cohen et al.'s (1985) false belief test, shows how cognitive modelling can be used to compare different theories of common-sense psychology. The second study, Searle's (1980) 'Chinese Room', shows that this same model can reproduce the patterns of scientific intuitions about systems which pass the Turing test (Turing, 1950), suggesting that it is best seen as a common-sense test for a mind, not a scientific one. Finally, the thesis argues that scientific theories involving the ascription of mentality through a model or a metaphor are partly dependent on each individual scientist's common-sense psychology.
To conclude, this thesis develops an interdisciplinary study of common-sense psychology and shows that its effects are more wide-ranging than is commonly thought. This means that it affects science more than might be expected, but also that careful study can help us to become mindful of these effects. Within this new framework, a proper understanding of common-sense psychology could lay important new foundations for the future of cognitive science.
ITSY: an automated programming adviser
This thesis presents an automated programming adviser. The system (called ITSY) tutors students in Lisp from the viewpoint of automated debugging of novice programs. Work within HCRL [Eisenstadt et al, Hasemer, Lewis] has shown that novice programming students can benefit from relatively small changes to the environment and from help via (intelligent) debugging tools. This thesis investigates the use of these debugging techniques in tutoring. The debugging techniques described here rely entirely on detecting patterns in the student's code which represent erroneous concepts the student may hold.
The thesis is divided into three parts. Each part describes a separate area of investigation.
The first part provides a detailed description of the types of errors that professional programmers make when using a 'traditional' (i.e. glass teletype) Lisp environment.
In the second part, the concept of a programming cliché is inverted and used as the basis for a system designed to help overcome the difficulties described in the first part of the thesis. This approach can be used in the design of computing systems built to help novices in certain domains. The constraint on the domain is that students' answers must be complex enough to contain patterns of errors (so one-word answers would not suffice). This would include domains where students are learning procedural skills, such as arithmetic, algebra or mechanics.
The third part describes a study involving professional programmers using the system.
Algorithms for pre-microRNA classification and a GPU program for whole genome comparison
MicroRNAs (miRNAs) are non-coding RNAs of approximately 22 nucleotides that are derived from precursor molecules. These precursor molecules, or pre-miRNAs, often fold into stem-loop hairpin structures. However, a large number of sequences with pre-miRNA-like hairpins can be found in genomes, and it is a challenge to distinguish real pre-miRNAs from other hairpin sequences with similar stem-loops (referred to as pseudo pre-miRNAs). The first part of this dissertation presents a new method, called MirID, for identifying and classifying microRNA precursors. MirID comprises three steps. Initially, a combinatorial feature mining algorithm is developed to identify suitable feature sets. Then, the feature sets are used to train support vector machines to obtain classification models, from which a classifier ensemble is constructed. Finally, an AdaBoost algorithm is adopted to further enhance the accuracy of the classifier ensemble. Experimental results on a variety of species demonstrate the good performance of the proposed approach, and its superiority over existing methods.
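The boosting step can be illustrated with a self-contained sketch. The toy AdaBoost below uses single-threshold decision stumps on an invented one-dimensional dataset standing in for pre-miRNA feature vectors; it is not MirID's implementation (which boosts SVM-based ensemble members), only the weight-update scheme it adopts:

```python
import numpy as np

# Minimal AdaBoost over one-feature threshold stumps. Each round fits the
# stump with lowest weighted error, computes its vote weight alpha, then
# up-weights the examples that stump misclassified.

def train_adaboost(x, y, rounds=5):
    n = len(x)
    w = np.full(n, 1.0 / n)           # example weights, kept normalized
    model = []                        # list of (threshold, sign, alpha)
    for _ in range(rounds):
        best = None
        for thr in x:                 # candidate thresholds from the data
            for sign in (1, -1):
                pred = np.where(x > thr, sign, -sign)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, thr, sign, pred)
        err, thr, sign, pred = best
        err = min(max(err, 1e-10), 1 - 1e-10)   # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)          # boost misclassified points
        w /= w.sum()
        model.append((thr, sign, alpha))
    return model

def predict(model, x):
    score = sum(a * np.where(x > t, s, -s) for t, s, a in model)
    return np.sign(score)

x = np.array([0.1, 0.3, 0.4, 0.6, 0.8, 0.9])   # toy feature values
y = np.array([-1, -1, -1, 1, 1, 1])            # pseudo (-1) vs real (+1)
m = train_adaboost(x, y)
print(predict(m, x))  # matches y on this separable toy set
```

In MirID the weak learners are far stronger (SVM ensembles over mined feature sets), but the reweighting loop is the same idea.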
In the second part of this dissertation, a GPU (Graphics Processing Unit) program is developed for whole genome comparison. The goal of the research is to identify the commonalities and differences of two genomes from closely related organisms via multiple sequence alignments, using a seed-and-extend technique to choose reliable subsets of exact or near-exact matches, called anchors. A rigorous method, Smith-Waterman search, is applied for the anchor seeking, but it takes days or months to map millions of bases for mammalian genome sequences. With GPU programming, which runs hundreds of short functions called threads in parallel, up to a 100X speedup is achieved over comparable CPU executions.
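The seed-and-extend idea behind anchor selection can be sketched as follows. The k-mer size, the sequences, and the extension rule (exact matches only, no scoring) are illustrative assumptions; the dissertation's GPU version distributes this work across threads and refines anchors with Smith-Waterman:

```python
# Seed-and-extend sketch: index every k-mer of the reference, look up
# exact seed matches in the query, then extend each seed outward while
# the bases still agree. Each maximal run becomes a candidate anchor.

def find_anchors(ref, qry, k=4):
    index = {}
    for i in range(len(ref) - k + 1):
        index.setdefault(ref[i:i + k], []).append(i)
    anchors = []
    for j in range(len(qry) - k + 1):
        for i in index.get(qry[j:j + k], []):
            # extend left while the preceding characters match
            li, lj = i, j
            while li > 0 and lj > 0 and ref[li - 1] == qry[lj - 1]:
                li, lj = li - 1, lj - 1
            # extend right while the following characters match
            ri, rj = i + k, j + k
            while ri < len(ref) and rj < len(qry) and ref[ri] == qry[rj]:
                ri, rj = ri + 1, rj + 1
            anchors.append((li, lj, ri - li))  # (ref pos, qry pos, length)
    return sorted(set(anchors))

print(find_anchors("ACGTACGTTT", "GGACGTACGA", k=4))
```

Each query position is independent of the others, which is what makes the seeding stage map naturally onto thousands of GPU threads.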