418 research outputs found

    Algorithmic Debugging of Real-World Haskell Programs: Deriving Dependencies from the Cost Centre Stack

    Existing algorithmic debuggers for Haskell require a transformation of all modules in a program, even libraries that the user does not want to debug and which may use language features not supported by the debugger. This is a pity, because a promising approach to debugging is therefore not applicable to many real-world programs. We use the cost centre stack from the Glasgow Haskell Compiler profiling environment together with runtime value observations as provided by the Haskell Object Observation Debugger (HOOD) to collect enough information for algorithmic debugging. Program annotations are in suspected modules only. With this technique algorithmic debugging is applicable to a much larger set of Haskell programs. This demonstrates that for functional languages in general a simple stack trace extension is useful to support tasks such as profiling and debugging.
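
    To make the annotation workflow concrete, here is a minimal illustrative sketch (assuming the observe/runO combinators of the hood package; the cost centre stack itself would come from compiling with GHC profiling options such as -prof -fprof-auto) in which only one suspected function is annotated and the rest of the program is left untouched:

        module Main where

        import Debug.Hood.Observe  -- HOOD: runtime value observation

        -- Only the suspected function is annotated; libraries and the
        -- rest of the program are not transformed at all.
        insert :: Int -> [Int] -> [Int]
        insert = observe "insert" go
          where
            go x [] = [x]
            go x (y:ys)
              | x <= y    = x : y : ys
              | otherwise = y : go x ys

        main :: IO ()
        main = runO (print (foldr insert [] [3, 1, 2]))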

    Lightweight Computation Tree Tracing for Lazy Functional Languages

    A computation tree of a program execution describes computations of functions and their dependencies. A computation tree describes how a program works and is at the heart of algorithmic debugging. To generate a computation tree, existing algorithmic debuggers either use a complex implementation or yield a less informative approximation. We present a method for lazy functional languages that requires only a simple tracing library to generate a detailed computation tree. With our algorithmic debugger a programmer can debug any Haskell program by only importing our library and annotating suspected functions.
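
    As background on the terms used above, the following is an illustrative sketch, not the authors' implementation or data types, of a computation tree of recorded calls and the algorithmic-debugging question loop over it: a node judged wrong whose children are all judged correct is reported as the location of the fault.

        -- Each node records one function call (rendered as text) together
        -- with the calls its computation depended on.
        data CompTree = Node String [CompTree]

        -- Ask the user to judge a computation; answering 'n' marks it wrong.
        judge :: CompTree -> IO Bool
        judge (Node repr _) = do
          putStrLn ("Is this correct?  " ++ repr ++ "  (y/n)")
          answer <- getLine
          return (answer == "n")

        -- Top-down algorithmic debugging: a wrong node with no wrong
        -- children is reported as faulty.
        locate :: CompTree -> IO ()
        locate (Node repr children) = do
          judged <- mapM (\c -> fmap ((,) c) (judge c)) children
          case [c | (c, isWrong) <- judged, isWrong] of
            []      -> putStrLn ("Fault located in: " ++ repr)
            (c : _) -> locate c

        main :: IO ()
        main = do
          let tree = Node "sort [2,1] = [2,1]"
                       [ Node "insert 2 [1] = [2,1]" []
                       , Node "insert 1 [] = [1]" [] ]
          wrong <- judge tree
          if wrong then locate tree else putStrLn "No fault observed."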

    A Survey of Algorithmic Debugging

    "© ACM, 2017. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in ACM Computing Surveys, {50, 4, 2017} https://dl.acm.org/doi/10.1145/3106740"[EN] Algorithmic debugging is a technique proposed in 1982 by E. Y. Shapiro in the context of logic programming. This survey shows how the initial ideas have been developed to become a widespread debugging schema ftting many diferent programming paradigms and with applications out of the program debugging feld. We describe the general framework and the main issues related to the implementations in diferent programming paradigms and discuss several proposed improvements and optimizations. We also review the main algorithmic debugger tools that have been implemented so far and compare their features. From this comparison, we elaborate a summary of desirable characteristics that should be considered when implementing future algorithmic debuggers.This work has been partially supported by the EU (FEDER) and the Spanish Ministerio de Economia y Competitividad under grant TIN2013-44742-C4-1-R, TIN2016-76843-C4-1-R, StrongSoft (TIN2012-39391-C04-04), and TRACES (TIN2015-67522-C3-3-R) by the Generalitat Valenciana under grant PROMETEO-II/2015/013 (SmartLogic) and by the Comunidad de Madrid project N-Greens Software-CM (S2013/ICE-2731).Caballero, R.; Riesco, A.; Silva, J. (2017). A Survey of Algorithmic Debugging. ACM Computing Surveys. 50(4):1-35. https://doi.org/10.1145/3106740S135504Abramson, D., Foster, I., Michalakes, J., & SosiÄŤ, R. (1996). Relative debugging. Communications of the ACM, 39(11), 69-77. doi:10.1145/240455.240475K. R. Apt H. A. Blair and A. Walker. 1988. Towards a theory of declarative knowledge. In Foundations of Deductive Databases and Logic Programming J. Minker (Ed.). Morgan Kaufmann Publishers Inc. San Francisco CA 89--148. 10.1016/B978-0-934613-40-8.50006-3 K. R. Apt H. A. Blair and A. Walker. 1988. Towards a theory of declarative knowledge. In Foundations of Deductive Databases and Logic Programming J. Minker (Ed.). Morgan Kaufmann Publishers Inc. San Francisco CA 89--148. 10.1016/B978-0-934613-40-8.50006-3Arora, T., Ramakrishnan, R., Roth, W. G., Seshadri, P., & Srivastava, D. (1993). Explaining program execution in deductive systems. Lecture Notes in Computer Science, 101-119. doi:10.1007/3-540-57530-8_7E. Av-Ron. 1984. Top-Down Diagnosis of Prolog Programs. Ph.D. Dissertation. Weizmann Institute. E. Av-Ron. 1984. Top-Down Diagnosis of Prolog Programs. Ph.D. Dissertation. Weizmann Institute.A. Beaulieu. 2005. Learning SQL. O’Reilly Farnham UK. A. Beaulieu. 2005. Learning SQL. O’Reilly Farnham UK.D. Binks. 1995. Declarative Debugging in Gödel. Ph.D. Dissertation. University of Bristol. D. Binks. 1995. Declarative Debugging in Gödel. Ph.D. Dissertation. University of Bristol.B. BraĂźel and H. Siegel. 2008. Debugging Lazy Functional Programs by Asking the Oracle. Springer-Verlag Berlin 183--200. DOI:http://dx.doi.org/10.1007/978-3-540-85373-2_11 10.1007/978-3-540-85373-2_11 B. BraĂźel and H. Siegel. 2008. Debugging Lazy Functional Programs by Asking the Oracle. Springer-Verlag Berlin 183--200. DOI:http://dx.doi.org/10.1007/978-3-540-85373-2_11 10.1007/978-3-540-85373-2_11Caballero, R. (2005). A declarative debugger of incorrect answers for constraint functional-logic programs. Proceedings of the 2005 ACM SIGPLAN workshop on Curry and functional logic programming - WCFLP ’05. 

    A Lightweight Hat: Simple Type-Preserving Instrumentation for Self-Tracing Lazy Functional Programs

    Existing methods for generating a detailed trace of a computation of a lazy functional program are complex. These complications limit the use of tracing in practice. However, such a detailed trace is desirable for understanding and debugging a lazy functional program. Here we present a lightweight method that instruments a program to generate such a trace, namely the augmented redex trail introduced by the Haskell tracer Hat. The new method is a major step towards an omniscient debugger for real-world Haskell programs.
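
    The phrase "type-preserving instrumentation for self-tracing" can be illustrated with a drastically simplified sketch: a wrapped expression keeps its original type, but when it is demanded a description of the reduction is recorded in a global trail. Hat's augmented redex trail records far richer structure (parent redexes, sharing), so this only conveys the wrap-and-record idea:

        import Data.IORef
        import System.IO.Unsafe (unsafePerformIO)

        {-# NOINLINE trail #-}
        trail :: IORef [String]
        trail = unsafePerformIO (newIORef [])

        -- 'traced label x' has the same type as 'x'; when x is demanded,
        -- a note about the reduction is appended to the trail.
        -- (Sketch only: sharing and inlining concerns are ignored.)
        traced :: String -> a -> a
        traced label x = unsafePerformIO $ do
          modifyIORef trail (label :)
          return x

        -- Original definition:      square n = n * n
        -- Instrumented definition:  the right-hand side is wrapped.
        square :: Int -> Int
        square n = traced ("square " ++ show n ++ " = " ++ show (n * n)) (n * n)

        main :: IO ()
        main = do
          print (square 3 + square 4)
          readIORef trail >>= mapM_ putStrLn . reverse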

    How can the teaching of programming be used to enhance computational thinking skills?

    The use of the term computational thinking, introduced in 2006 by Jeannette Wing, is having repercussions in the field of education. The term brings into sharp focus the concept of thinking about problems in a way that can lead to solutions that may be implemented in a computing device. Implementation of these solutions may involve the use of programming languages. This study explores ways in which programming can be employed as a tool to teach computational thinking and problem solving. Data is collected from teachers, academics, and professionals, purposively selected because of their knowledge of the topics of problem solving, computational thinking, or the teaching of programming. This data is analysed following a grounded theory approach. A Computational Thinking Taxonomy is developed. The relationships between cognitive processes, the pedagogy of programming, and the perceived levels of difficulty of computational thinking skills are illustrated by a model. Specifically, a definition for computational thinking is presented. The skills identified are mapped to Bloom’s Taxonomy: Cognitive Domain. This mapping concentrates computational skills at the application, analysis, synthesis, and evaluation levels. Analysis of the data indicates that the less difficult computational thinking skills for beginner programmers are generalisation, evaluation, and algorithm design. Abstraction of functionality is less difficult than abstraction of data, but both are perceived as difficult. The most difficult computational thinking skill is reported as decomposition. This ordering of difficulty for learners is a reversal of the cognitive complexity predicted by Bloom’s model. The plausibility of this inconsistency is explored. The taxonomy, model, and the other results of this study may be used by educators to focus learning onto the computational thinking skills acquired by the learners, while using programming as a tool. They may also be employed in the design of curriculum subjects, such as ICT, computing, or computer science.

    An overview of decision table literature 1982-1995.

    This report gives an overview of the literature on decision tables over the past 15 years. As much as possible, for each reference, an author-supplied abstract, a number of keywords and a classification are provided. In some cases our own comments are added. The purpose of these comments is to show where, how and why decision tables are used. The literature is classified according to application area, theoretical versus practical character, year of publication, country of origin (not necessarily country of publication) and the language of the document. After a description of the scope of the review, classification results and the classification by topic are presented. The main body of the paper is the ordered list of publications with abstract, classification and comments.

    Debugging Type Errors with a Blackbox Compiler

    Type error debugging can be a laborious yet necessary process for programmers of statically typed functional programming languages. Often a compiler compounds this by inaccurately reporting the location of a type error, a problem that has been a subject of research for over thirty years. However, despite this long history, the solutions proposed often rely on direct modifications to the compiler, typically distributed in the form of patches. These patches add another level of arduous activity to the task of debugging, as they must be kept up to date with the ever-changing programming language they support. This thesis investigates an additional option: the blackbox compiler. Split into three central parts, it shows the individual solutions involved in using a blackbox compiler to debug type errors in functional programming languages. First is a demonstration of how the combination of a blackbox compiler and a generic debugging algorithm can successfully locate type errors. Next, a side effect of this new combination is tackled: the introduction of extra errors, which is combated with a new speed-boosted algorithm and evaluated with a proposed framework, based on data science techniques, for quantifying the quality of a type error debugger. Lastly, the algorithms employed throughout this thesis, along with the blackbox compiler, are agnostic: they do not need language-specific knowledge. Thus, the final part presents how these agnostic abilities can be used to build an agnostic debugger that locates type errors.
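
    To make "blackbox compiler" concrete: the debugger never inspects the compiler's internals, it only asks whether candidate variants of the program still compile and uses the yes/no answers to narrow down suspect definitions. The sketch below assumes GHC as the blackbox and a caller-supplied, hypothetical helper that writes a variant with one definition masked out; the generic debugging algorithm of the thesis is more refined than this:

        import System.Exit (ExitCode (..))
        import System.Process (readProcessWithExitCode)

        -- Ask the unmodified compiler whether a candidate program type
        -- checks; only the exit code of the blackbox is inspected.
        typeChecks :: FilePath -> IO Bool
        typeChecks file = do
          (code, _out, _err) <- readProcessWithExitCode "ghc" ["-fno-code", file] ""
          return (code == ExitSuccess)

        -- Narrow a type error down to top-level definitions: re-check the
        -- program with each suspect definition masked (e.g. replaced by
        -- 'undefined') and report those whose masking makes it compile.
        -- 'writeVariant' is a hypothetical helper supplied by the caller.
        locateSuspects :: (Int -> IO FilePath) -> Int -> IO [Int]
        locateSuspects writeVariant numDefs = do
          verdicts <- mapM (\i -> writeVariant i >>= typeChecks) [0 .. numDefs - 1]
          return [i | (i, ok) <- zip [0 ..] verdicts, ok]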

    A Program Visualization System That Supports the Program Understanding Process.

    The goal of this research is to provide a graphical system that supports the program understanding process by representing the program's control flow, the code and the identifiers local to a specific point within the program. By having more information local to the point of interest, the programmer can maintain continuity in developing program understanding. The programmer can see loops, procedure calls, and other structures with respect to their execution order and can view them in the environment or the context in which they will execute. The Peec system supplies a graphical representation of the program's control flow in which the control structures are represented as tiers. The tiers are arranged in a three-dimensional space representing the program's operational flow. The body of the procedure or function is nested within the reference tier so that the programmer views the routine local to its reference point. Also, a list of live identifiers is displayable for the current tier element. The advantage is that the routine's text and the identifier list are local to the area of study and the programmer does not have to look elsewhere for the program text and the identifier definition. The programmer can maintain continuity in developing program understanding using information local to the point of interest. The Peec system consists of the Peec compiler, which transforms a Pascal program into tier and identifier information, and the Peec environment for modeling the program's operational flow image. The Peec environment provides the programmer many interactive capabilities. These capabilities consist of browsing the flow model, displaying text, displaying identifiers and transforming the three-dimensional flow model into appropriate views. These features are aimed at assisting the programmer in the process of developing program understanding.
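
    As a rough illustration of the tier information described above, a small data type (field names are hypothetical, not taken from the Peec system) might look like:

        -- One tier element in the three-dimensional control-flow model:
        -- the construct it stands for, the source text shown locally, the
        -- identifiers live at that point, and the tiers nested inside it.
        data Tier = Tier
          { construct   :: String    -- e.g. "while loop", "procedure call"
          , sourceText  :: String    -- code displayed for this tier element
          , liveIdents  :: [String]  -- identifiers local to this point
          , nestedTiers :: [Tier]    -- body nested within the reference tier
          }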

    Developer-Centric Automated Debugging

    Software debugging is an expensive activity that is responsible for a significant part of software maintenance cost. In particular, locating faulty code (i.e., fault localization) is one of the most challenging parts of software debugging. In the past years, researchers have proposed many techniques that aim at fully automating the task of fault localization. Although these techniques have been shown to be effective in reducing the amount of code developers need to inspect to locate faults, there is growing evidence that they provide developers with limited help in realistic debugging scenarios. I believe that a practical automated debugging technique should have human developers at the center of the debugging process rather than trying to completely replace them. In this dissertation, I present three of my techniques that support developer-centric automated debugging. First, I present ENLIGHTEN, an interactive, feedback-driven fault localization technique. ENLIGHTEN supports and automates developers’ debugging workflow as follows. It 1) uses traditional statistical fault localization (SFL) to formulate an initial hypothesis of where the fault may be; 2) identifies a relevant subset of execution that can help support or refute the formulated hypothesis; 3) presents the developer with a query about the identified execution subset in the form of a correctness question about the input-output relation of the partial execution; 4) refines its hypothesis of the fault by using the developer’s feedback; and 5) repeats these steps until the fault is found. Second, I discuss my work on improving the accuracy of dynamic dependence analysis, which is a powerful tool for developers to investigate program behavior in an interactive debugging setting and a foundation that many automated debugging techniques leverage to model dynamic execution semantics. I present my finding that existing dynamic dependence analysis techniques could miss the cause-effect relations between faults and the observed failures if the faulty program states propagate via incorrect computation of memory addresses. To address this limitation, I define the concept of potential memory-address dependence, which explicitly represents this type of causal relations, and describe an algorithm that computes it. Third, I present TESSERACT, a technique that improves the scalability of dynamic dependency analysis in the context of interactive debugging. Many existing dependency-based debugging techniques are shown to work well on short executions, but fail to scale to larger ones. TESSERACT has the potential to address this limitation by utilizing a record-and-replay system to efficiently recreate the failing execution, break it down into small time slices, and analyze these slices in a parallelized, on-demand fashion.
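
    For context on step 1 of the ENLIGHTEN workflow, the sketch below shows a conventional statistical fault localization step, ranking program entities by an Ochiai-style suspiciousness score computed from per-test coverage; ENLIGHTEN's contribution, refining such a ranking interactively with the developer's answers, is not shown:

        import Data.List (sortBy)
        import Data.Ord (Down (..), comparing)

        -- One test execution: the entities (e.g. statements) it covered
        -- and whether it passed.
        data TestRun = TestRun { covered :: [String], passed :: Bool }

        -- Ochiai suspiciousness: ef / sqrt(totalFailed * (ef + ep)), where
        -- ef/ep count failing/passing runs that cover the entity.
        ochiai :: [TestRun] -> String -> Double
        ochiai runs entity =
          let count p     = fromIntegral (length (filter p runs))
              coveredBy r = entity `elem` covered r
              ef          = count (\r -> not (passed r) && coveredBy r)
              ep          = count (\r -> passed r && coveredBy r)
              totalFailed = count (not . passed)
              denom       = sqrt (totalFailed * (ef + ep))
          in if denom == 0 then 0 else ef / denom

        -- Rank entities from most to least suspicious.
        rankEntities :: [TestRun] -> [String] -> [(String, Double)]
        rankEntities runs entities =
          sortBy (comparing (Down . snd)) [(e, ochiai runs e) | e <- entities]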