The development of a Java image processing framework : a thesis presented in partial fulfillment of the requirements for the degree of Master of Technology in Computer Systems Engineering at Massey University
Practical computer-based teaching methods are often used in conjunction with theory-based lecture sessions and textbooks when teaching image processing. Similarly, electronic or online image processing courses commonly provide both theoretical and interactive components; however, these are often disparate in that the software used to provide each component is independent rather than integrated. It is less common to find electronic instructional resources for image processing that integrate theoretical textual content and practical interactive content into one seamless package. An integrated approach has the advantage that concepts are more easily conveyed and reinforced when taught 'side-by-side' in this way. The World Wide Web offers an attractive medium for delivering an integrated instructional resource on image processing: applets written in Java may be seamlessly integrated into a hypertext environment, providing practical demonstrations of image processing concepts alongside the relevant hypertext-based theoretical content. One of the major barriers to realising this kind of resource is the development effort required to create the necessary applets. This research demonstrates that the provision of a software framework can significantly reduce the burden of developing these applets. Such a framework provides a common code base that can be drawn upon during applet development, thereby avoiding the need to start from scratch each time a new applet is needed. The framework's design is modelled on a dataflow view of image processing, allowing applets to be built in terms of interconnections between operations. This design is intended to provide the developer with an intuitive and easy-to-use application programming interface (API) for developing applets. The framework also provides APIs for the programmer to implement new operations and data types, thereby extending the capabilities of the framework.
Further, the framework's design is general enough to allow it to be used for developing general-purpose image processing programs, or other programs that lend themselves to development in a dataflow style. This thesis shows that the proposed framework achieves its aims through an example application: the development of an applet that demonstrates a thresholding operation.
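The thesis's actual API is not reproduced in this abstract; as a purely illustrative sketch of the dataflow style it describes, the snippet below models an operation (here, thresholding) as a function node and a pipeline as a composition of such nodes. All names (`DataflowSketch`, `threshold`, `run`) are hypothetical, not the framework's.

```java
import java.util.function.Function;

public class DataflowSketch {
    // A "node" in the dataflow graph: a function from input pixels to output pixels.
    // Hypothetical illustration only; not the thesis's API.
    static Function<int[], int[]> threshold(int t) {
        return pixels -> {
            int[] out = new int[pixels.length];
            for (int i = 0; i < pixels.length; i++)
                out[i] = pixels[i] >= t ? 255 : 0;  // binarise each pixel
            return out;
        };
    }

    // Connecting nodes amounts to composing functions, mirroring the
    // interconnections-between-operations design the abstract describes.
    static int[] run(int[] source, Function<int[], int[]> pipeline) {
        return pipeline.apply(source);
    }

    public static void main(String[] args) {
        int[] img = {12, 200, 90, 255};
        int[] bin = run(img, threshold(128));
        System.out.println(java.util.Arrays.toString(bin)); // [0, 255, 0, 255]
    }
}
```

A longer pipeline would be built the same way, e.g. `threshold(128).andThen(invert)`, which is what makes the composition style attractive for applet authors.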
An Innovative Framework of Integrating ERP into IS 2010 Model Curriculum
The widespread adoption of Enterprise Resource Planning (ERP) technology has shifted the focus of information systems (IS) education from functional applications development to enterprise software implementation and configuration. The latest model curriculum for undergraduate IS programs, IS 2010, has made teaching the large and complex ERP software system an important issue. This paper presents a framework for innovatively integrating ERP into four core and three elective courses proposed in IS 2010. The paper illustrates the integrated ERP curriculum by discussing the design, content, and teaching methods for the seven courses, using SAP as the software tool. The purpose of the paper is to provide a useful guideline for those who seek to teach ERP technology in the undergraduate IS curriculum.
Tools to Facilitate Event-Driven Program Design in Introductory Courses
Widespread acceptance of the Windows environment has increased the popularity of application development tools that facilitate the creation of Windows programs. In response, many universities are starting to teach introductory programming courses using these new software development tools. However, the event-driven nature of these new tools requires a change to the traditional methods of teaching introductory program design. Unfortunately, most programming textbooks that employ the new tools neglect to provide a suitable framework for designing programs in this event-driven software paradigm. This paper presents the key differences between event-driven and conventional programming, particularly as they affect teaching program development concepts to beginning students. It also describes how a new design tool (the Object-Event Diagram) can be used to promote student understanding of event-driven programs.
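The key difference the paper names, inverted control flow, can be sketched in a few lines: instead of a program dictating the order of operations, handlers are registered and invoked when events occur. This is a minimal hypothetical illustration (the `ClickListener` and `Button` names are invented, not from the paper or any GUI toolkit):

```java
import java.util.ArrayList;
import java.util.List;

public class EventDrivenSketch {
    // The handler interface: code the student writes, called by the framework.
    interface ClickListener { void onClick(String source); }

    // A stand-in for a GUI widget that dispatches events to registered handlers.
    static class Button {
        private final String name;
        private final List<ClickListener> listeners = new ArrayList<>();
        Button(String name) { this.name = name; }
        void addListener(ClickListener l) { listeners.add(l); }
        void click() {                       // simulates a user-generated event
            for (ClickListener l : listeners) l.onClick(name);
        }
    }

    public static void main(String[] args) {
        Button ok = new Button("OK");
        // Control flow is inverted: the handler, not a main loop, reacts.
        ok.addListener(src -> System.out.println(src + " pressed"));
        ok.click();
    }
}
```

In a conventional program the `main` method would poll for input and branch on it; here `main` only wires up handlers, which is the design shift the Object-Event Diagram is meant to help students visualise.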
Addressing challenges to teach traditional and agile project management in academia
In order to prepare students for a professional IT career, most universities attempt to provide their students with a current educational curriculum in the Project Management (PM) area. This is usually based on the most promising methodologies used by the software industry. As instructors, we need to balance traditional methodologies, focused on proven project planning and control processes leveraging widely accepted methods and tools, with the newer agile methodologies. These newer frameworks emphasize that software delivery should be done in a flexible and iterative manner, with significant collaboration with product owners and customers. In our experience, agile methodologies have witnessed exponential growth in many diverse software organizations, and the various agile PM tools and techniques will continue to see increased adoption in the software development sector. Reflecting on these changes, there is a critical need to accommodate best practices and current methodologies in our courses that deliver Project Management content. In this paper we analyse two of the most widely used methodologies for traditional and agile software development: the widely used ISO/PMBOK standard provided by the Project Management Institute and the well-accepted Scrum framework. We discuss how to overcome curriculum challenges and deliver a quality undergraduate PM course for Computer Science and Information Systems curricula. Based on our teaching experience in Europe and North America, we present a comprehensive comparison of the two approaches. Our research covers the main concepts, processes, and roles associated with the two PM frameworks and recommended learning outcomes. The paper should be of value to instructors who are keen to see their computing students graduate with a sound understanding of current PM methodologies and the ability to deliver real-world software products.
Practicing Scrum in Institute Course
Scrum is one of the most popular agile methods following the Manifesto for Agile Software Development, and is a value-driven software development approach that focuses on maximizing value for the customer. Many top software companies, such as Amazon.com, Apple, and Microsoft, directly apply Scrum and other agile methods to develop great software products. To develop the talent required by industry, teaching agile methods at university is necessary. However, given the limits of time, space, and expertise in agile development, it can be difficult for students to learn the practices of agile methods in college. In this paper, we describe an experimental course at Feng Chia University that practices Scrum for term projects among five teams composed of 34 students. To the best of our knowledge, this course is one of the few attempts to practically apply all the elements described in the Scrum framework, such as sprint planning, daily scrum, review and retrospective meetings, the product owner, and the scrum master, in institutional agile education. The design and the process of the Scrum-based term projects are described, and the lessons learned from practicing Scrum in college are presented in the discussion.
Computer-based collaborative concept mapping : motivating Indian secondary students to learn science : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Education at Massey University, Manawatu, New Zealand
This is a study of the design, development, implementation and evaluation of a teaching and learning intervention. The overarching aim of the study was to investigate the effectiveness of the intervention 'Computer-based Collaborative Concept Mapping' (CCCM) on Indian secondary students' conceptual learning and motivation towards science learning. CCCM was designed based on constructivist and cognitive theories of learning and reinforced by recent motivation theories. The study followed a Design-based Research (DBR) methodology. CCCM was implemented in two selected Indian secondary grade 9 classrooms. A quasi-experimental Solomon Four-Group research design was adopted to carry out the teaching experiment, and mixed methods of data collection were used to generate and collect data from 241 secondary students and the two science teachers. The intervention was designed and piloted to check its feasibility for further implementation. The actual implementation of CCCM followed the pilot testing and ran for 10 weeks. Students studied science concepts in small groups using the computer software Inspiration, constructing concept maps on various topics after discussing the concepts in their groups. The achievement test ATS9 was designed and administered as a pre-post-test to examine conceptual learning and science achievement. Students' responses were analysed to examine their individual conceptual learning, whereas group concept maps were analysed to assess group learning. The motivation questionnaire SMTSL was also administered as a pre-post-test to investigate students' initial and final motivation to learn science. At the end of the teaching experiment, the science teachers and two groups of students were interviewed. Analyses of the quantitative data suggested a statistically significant enhancement of science achievement, conceptual learning and motivation towards science learning. The qualitative data findings revealed positive attitudes of students and teachers towards the use of CCCM. Students and teachers believed that CCCM use could promote conceptual learning and motivate students to learn science, and both preferred CCCM over ongoing traditional didactic methods of teaching and learning. Some enablers and barriers identified by teachers and students in the Indian science classroom context are also explored and discussed. A framework for enhancing secondary school students' motivation towards science learning and conceptual learning is proposed based on the findings. The findings of the study also contribute to addressing the prevailing learning crisis in Indian secondary school science classrooms by offering CCCM as an active and participatory instructional strategy, as envisioned by the Indian National Curriculum Framework 2005.
Agile in Teaching and Learning: Conceptual Framework and Research Agenda
Agile software development methods are widespread in industry, and there is a wealth of academic research and practitioner publications currently available from this perspective. With the rise of Agile within companies worldwide, it is increasingly important for information systems education to keep up with this trend to ensure curriculum and courses are up-to-date. Students in the computing disciplines must be prepared to enter a job market where Agile is commonplace. As such, the topic of Agile in teaching and learning is critically important. The current special issue includes a rich collection of articles providing information systems educators with research-based, practical approaches for both teaching Agile ('the what') and using Agile as a pedagogical approach ('the how'). In an effort to assist information systems educators in categorizing the growing amount of literature related to Agile in teaching and learning, a conceptual framework is provided which places the literature along the two axes of pedagogy ('the how') and content ('the what'), each ranging from non-Agile to Agile. Finally, the authors present a call for future research integrating Agile on a meta-level in the course development process. We hope that this special issue inspires educators and researchers to consider integrating Agile into their teaching and learning.
Big Data as a Technology-to-think-with for Scientific Literacy
This research aimed to identify indications of scientific literacy resulting from a didactic and investigative interaction with the Google Trends Big Data software by first-year students from a high school in Novo Hamburgo, Southern Brazil. Both the teaching strategies and the research interpretations rest on four theoretical backgrounds. Firstly, Bunge's epistemology, which provides a thorough characterization of Science that was central to our study. Secondly, the conceptual framework of scientific literacy of Fives et al., which makes our teaching focus precise and concise, and supports one of our methodological tools: the SLA (scientific literacy assessment). Thirdly, the "crowdledge" construct from dos Santos, which gives meaning to our study by making the development of scientific literacy attentive to contemporary sociotechnological and epistemological phenomena. Finally, the learning principles of Papert's Constructionism inspired our educational activities. Our educational actions consisted of students, divided into two classes, investigating phenomena they chose themselves. A triangulation process was used to integrate quantitative and qualitative methods on the assessment results. The experimental design consisted of post-tests only, and the experimental variable was the way of accessing the world. The experimental group interacted with the world by analysing temporal and regional plots of interest in terms or topics searched on Google. The control class did 'placebo' interactions with the world through on-site observations of bryophytes, fungi and other organisms in the schoolyard. As a general result of our research, a constructionist environment based on Big Data analysis showed itself to be a richer strategy for developing scientific literacy than free schoolyard exploration.
A holistic method for improving software product and process quality
The concept of quality in general is elusive, multi-faceted and is perceived differently by different stakeholders. Quality is difficult to define and extremely difficult to measure. Deficient software systems regularly result in failures which often lead to significant financial losses but more importantly to loss of human lives. Such systems need to be either scrapped and replaced by new ones or corrected/improved through maintenance. One of the most serious challenges is how to deal with legacy systems which, even when not failing, inevitably require upgrades, maintenance and improvement because of malfunctioning or changing requirements, or because of changing technologies, languages, or platforms. In such cases, the dilemma is whether to develop solutions from scratch or to re-engineer a legacy system. This research addresses this dilemma and seeks to establish a rigorous method for the derivation of indicators which, together with management criteria, can help decide whether restructuring of legacy systems is advisable.
As the software engineering community has been moving from corrective methods to preventive methods, concentrating on both product quality improvement and process quality improvement has become imperative. This research investigation combines Product Quality Improvement, primarily through the re-engineering of legacy systems, with Process Improvement methods, models and practices, and uses a holistic approach to study the interplay of Product and Process Improvement. The re-engineering factor rho, a composite metric, was proposed and validated.
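The abstract does not give the formula for the re-engineering factor rho, so none is reproduced here. Purely as an illustration of what a composite metric is, the sketch below combines several normalised quality indicators into a single weighted score; the metric names and weights are invented for the example and are not the thesis's.

```java
public class CompositeMetricSketch {
    // Hypothetical composite indicator: a weighted sum of metrics that are
    // each assumed to be normalised to [0, 1] beforehand. Not the actual rho.
    static double composite(double[] normalised, double[] weights) {
        double sum = 0.0;
        for (int i = 0; i < normalised.length; i++)
            sum += weights[i] * normalised[i];
        return sum;
    }

    public static void main(String[] args) {
        // Invented example inputs: e.g. complexity, defect density, coupling.
        double[] metrics = {0.8, 0.5, 0.2};
        double[] weights = {0.5, 0.3, 0.2};  // weights chosen to sum to 1
        System.out.printf("indicator = %.2f%n", composite(metrics, weights));
        // 0.5*0.8 + 0.3*0.5 + 0.2*0.2 = 0.59
    }
}
```

A decision rule such as "re-engineer when the indicator exceeds a management-set threshold" is the kind of use the thesis describes for rho, though the actual derivation and criteria are given in the thesis itself.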
The design and execution of formal experiments tested hypotheses on the relationship of internal (code-based) and external (behavioural) metrics. In addition to proving the hypotheses, the insights gained on logistics challenges resulted in the development of a framework for the design and execution of controlled experiments in Software Engineering.
The next part of the research resulted in the development of the novel, generic and, hence, customisable Quality Model GEQUAMO, which observes the principle of orthogonality, and combines a top-down analysis of the identification, classification and visualisation of software quality characteristics, and a bottom-up method for measurement and evaluation. GEQUAMO II addressed weaknesses that were identified during various GEQUAMO implementations and expert validation by academics and practitioners.
Further work on Process Improvement investigated Process Maturity and its relationship to Knowledge Sharing, and resulted in the development of the I5P Visualisation Framework for Performance Estimation through the Alignment of Process Maturity and Knowledge Sharing. I5P was used in industry and was validated by experts from academia and industry. Using the principles that guided the creation of the GEQUAMO model, the CoFeD visualisation framework was developed for comparative quality evaluation and selection of methods, tools, models and other software artifacts. CoFeD is very useful because the selection of wrong methods, tools or even personnel is detrimental to the survival and success of projects and organisations, and even to individuals.
Finally, throughout many years of research and teaching in Software Engineering, Information Systems and Methodologies, I observed the ambiguities of terminology: the use of one term to mean different concepts, and of one concept expressed in different terms. These practices result in a lack of clarity. Thus my final contribution comes in my reflections on terminology disambiguation for the achievement of clarity, and the development of a framework for achieving disambiguation of terms as a necessary step towards gaining maturity and justifying the use of the term 'Engineering' 50 years since the term Software Engineering was coined.
This research resulted in the creation of new knowledge in the form of novel indicators, models and frameworks which can aid quantification and decision making, primarily on the re-engineering of legacy code and on the management of process and its improvement. The thesis also contributes to the broader debate and understanding of problems relating to Software Quality, and establishes the need for a holistic approach to software quality improvement from both the product and the process perspectives.