
    The World at our Doorstep

    The Onondaga Citizens League studied the issues of refugee resettlement in Central New York. The purposes of the study were, first, to develop a clearer picture and understanding of the refugee dynamic in Onondaga County -- the needs, the service continuum and the opportunities new refugee populations offer -- and then to recommend programming and policies to help the county be a more welcoming community. The lessons learned crossed sectors from literacy to public safety, and offer information both in process and potential. The community has a long history of welcoming people from around the world and has seen an increase in New Americans in the last four to five years, as global unrest has grown. The higher numbers, coinciding as they did with an economic downturn that hurt all residents of the community, made the refugee presence more noticeable and, for some, more problematic. Underlying the study was an unspoken question -- does Onondaga County have the resources and the willingness to welcome this population in a way that helps them without negatively affecting others with human service needs? The actions recommended might be targeted towards helping this new population, but were built on the premise that by helping them, Onondaga County (and other communities) help themselves.

    A heuristic-based approach to code-smell detection

    Encapsulation and data hiding are central tenets of the object oriented paradigm. Deciding what data and behaviour to form into a class, and where to draw the line between its public and private details, can make the difference between a class that is an understandable, flexible and reusable abstraction and one which is not. This decision is a difficult one and may easily result in poor encapsulation, which can then have serious implications for a number of system qualities. It is often hard to identify such encapsulation problems within large software systems until they cause a maintenance problem (which is usually too late), and attempting to perform such analysis manually can be tedious and error prone. Two of the common encapsulation problems that can arise as a consequence of this decomposition process are data classes and god classes. Typically, these two problems occur together – data classes are lacking in functionality that has typically been sucked into an over-complicated and domineering god class. This paper describes the architecture of a tool, developed as a plug-in for the Eclipse IDE, which automatically detects data and god classes. The technique has been evaluated in a controlled study on two large open source systems, comparing the tool's results to similar work by Marinescu, who employs a metrics-based approach to detecting such features. The study provides some valuable insights into the strengths and weaknesses of the two approaches.
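    To make the metrics-based idea concrete, below is a minimal sketch of a data-class/god-class heuristic. The metric names and thresholds are illustrative assumptions only; they are not the rules implemented by the Eclipse plug-in described above, nor Marinescu's exact detection strategies.

```python
# Minimal metrics-based heuristic for flagging data classes and god classes.
# All thresholds below are illustrative assumptions, not values from the paper.
from dataclasses import dataclass

@dataclass
class ClassMetrics:
    name: str
    num_methods: int            # total methods in the class
    num_accessors: int          # getters/setters
    weighted_method_count: int  # sum of method complexities (WMC-style)
    foreign_data_accesses: int  # attributes of other classes used directly
    cohesion: float             # 0..1, higher means more cohesive

def is_data_class(m: ClassMetrics) -> bool:
    # Mostly accessors and very little behaviour of its own.
    functional_methods = m.num_methods - m.num_accessors
    return m.num_methods > 0 and functional_methods <= 2 and m.weighted_method_count < 10

def is_god_class(m: ClassMetrics) -> bool:
    # Complex, uncohesive, and reaching into other classes' data.
    return (m.weighted_method_count >= 47
            and m.foreign_data_accesses > 5
            and m.cohesion < 0.33)

metrics = [
    ClassMetrics("Order", 12, 10, 8, 0, 0.9),
    ClassMetrics("OrderManager", 40, 2, 120, 9, 0.2),
]
for m in metrics:
    if is_data_class(m):
        print(f"{m.name}: possible data class")
    if is_god_class(m):
        print(f"{m.name}: possible god class")
```

    In practice such heuristics are tuned per codebase; a real tool would compute the metrics from the source rather than take them as inputs.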

    ITACS Annual Accountability Report: FY2004 Accomplishments and Challenges


    Construct by Contract: An Approach for Developing Reliable Software

    This research introduces “Construct by Contract” as a proposal for a general methodology to develop dependable software systems. It describes an ideal process to construct systems by propagating requirements as contracts from the client’s desires to the correctness proof in the verification stage, especially in everyday software such as web, mobile and desktop applications. Such a methodology can be converted into a single integrated workspace, a standalone tool for developing software. To achieve this goal, the methodology puts together a collection of software engineering tools and techniques used throughout the software’s lifecycle, from requirements gathering to the testing phase, in order to ensure a contract-based flow. Construct by Contract is inclusive regarding the roles of the people involved in the software construction process, including for instance customers, users, project managers, designers, developers and testers, all of them interacting in one common software development environment and sharing information in a presentation understandable at each stage. It is worth mentioning that we focus on the verification phase as the key to achieving the reliability sought. Although at this point we have only completed the definition and the specification of this methodology, we evaluate the implementation by analysing, measuring and comparing different existing tools that could fit at any of the stages of the software’s lifecycle and that could be applied to a piece of commercial software. These insights are provided in a proof-of-concept case study involving a productive Java web application using the Struts framework.
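    As a rough illustration of the general idea of carrying a requirement forward as a checkable contract, here is a minimal design-by-contract-style sketch. The decorator and the withdrawal example are assumptions made for illustration; they are not the paper's own tooling or its Struts-based case study.

```python
# Illustrative sketch: a requirement expressed as an executable contract
# (precondition + postcondition) attached to the code that must honour it.
import functools

def contract(pre=None, post=None):
    """Attach a precondition and a postcondition to a function."""
    def decorate(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if pre is not None:
                assert pre(*args, **kwargs), f"precondition violated in {fn.__name__}"
            result = fn(*args, **kwargs)
            if post is not None:
                assert post(result, *args, **kwargs), f"postcondition violated in {fn.__name__}"
            return result
        return wrapper
    return decorate

# Requirement (the client's desire): a withdrawal never overdraws the account.
@contract(pre=lambda balance, amount: 0 < amount <= balance,
          post=lambda new_balance, balance, amount: new_balance == balance - amount >= 0)
def withdraw(balance: float, amount: float) -> float:
    return balance - amount

print(withdraw(100.0, 30.0))  # 70.0; a violating call raises AssertionError
```

    A full contract-based flow would carry such conditions from the requirements documents through design artefacts into the verification stage, rather than only checking them at run time.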

    Online collaborative learning in tertiary ICT education to enhance students' learning in Malaysia

    This study investigated the nature of students’ and student group interactions through the incorporation of an online collaborative learning (OCL) initiative, with the aim of enhancing students’ learning in a Malaysian tertiary classroom. In order to contribute to knowledge and understanding about the nature and quality of OCL, the learning processes and outcomes were drawn predominantly from Harasim’s model, with the inclusion of a socio-cultural framework aimed at enhancing learning outcomes for undergraduate science and ICT education students. Harasim’s model of OCL, as used in the intervention, includes steps for setting the stage and a system for Idea Generating (IG), modeling and guiding the OCL discussions for Idea Organizing (IO), and evaluating and reflecting on the OCL discussions for Intellectual Convergence (IC). The interactions in OCL were analysed through four dimensions: participative, interactive, social, and cognitive, in support of the students’ cognitive, social and emotional development. The OCL intervention in this study was conducted through an ICT education course in a Malaysian university that required OCL discussions for 13 weeks: the first four weeks were intra-group work discussions (Task 1), followed by four/five weeks of inter-group work discussions (Task 2), and the remaining four weeks were for the final intra-group work discussions (Task 3). The OCL intervention was aimed at facilitating interdisciplinary collaboration and interaction between students from Chemistry, Physics and Mathematics majors through the university’s Learning Management System (Moodle), which provided the shared space for the OCL discourse and tools for collaboration. A total of nine groups of four to six students (N=46) were involved in this study. In order to evaluate the OCL intervention from a holistic view, an interpretive approach that included the collection of quantitative and qualitative data was adopted to frame the collection and analysis of the data. Quantitative data were obtained from online questionnaires, together with online data based on the frequency of students’ posts in the participative, interactive, social, and cognitive dimensions. Qualitative data were gathered via interviews with students (group and post-course interviews) and lecturers, and online transcripts that included online postings and students’ online journal entries. These data were collected and analysed in order to triangulate the findings and to help the researcher assess the extent to which the intervention was successful in enhancing students’ learning. The findings from the study revealed that the nature of students’ interactions in OCL corresponds with particular socio-cultural views: students’ interactions are characterised by the participative, interactive, social and cognitive dimensions, in support of the students’ cognitive, social and emotional development. From a socio-cultural perspective, the outcomes that arose from the study included:
    • The socio-cultural learning constructs were useful as a framework for the analysis of the OCL intervention based on the participative, interactive, social and cognitive dimensions.
    • The affordances of the OCL group work helped the students in their group work.
    • The constraints of OCL influenced the communication methods and interaction styles used by students in achieving task goals through group work in the OCL intervention.
The findings also show that students’ interactions and student group interactions were an important part of the learning process. The implementation of the OCL intervention in the course can facilitate the student group learning process as well as support students’ cognitive, social and emotional development, while potential constraints from the technology (e.g. Internet connection) or the lack of social and verbal cues (e.g. facial expression) can lead to different methods of communication for achieving task goals and different styles of interaction. Overall, the findings of the study indicate the value of OCL in a tertiary classroom to enhance learning.

    To enhance collaborative learning and practice network knowledge with a virtualization laboratory and online synchronous discussion

    This work is licensed under a Creative Commons Attribution 4.0 International License. Recently, various computer networking courses have included additional laboratory classes in order to enhance students' learning achievement. However, these classes need to establish a suitable laboratory where each student can connect network devices to configure and test functions within different network topologies. In this case, the Linux operating system can be used to operate network devices, and the virtualization technique can host multiple OSs to support a significant number of students. In previous research, the virtualization application was successfully applied in a laboratory, but focused only on individual assignments. The present study extends previous research by designing the Networking Virtualization-Based Laboratory (NVBLab), which requires collaborative learning among the experimental students. The students were divided into an experimental group and a control group for the experiment. The experimental group performed their laboratory assignments using NVBLab, whereas the control group completed them on virtual machines (VMs) that were installed on their personal computers. Moreover, students using NVBLab were provided with an online synchronous discussion (OSD) feature that enabled them to communicate with others. The laboratory assignments were divided into two parts: Basic Labs and Advanced Labs. The results show that the experimental group significantly outperformed the control group in two Advanced Labs and in the post-test after the Advanced Labs. Furthermore, the experimental group's activities were better than those of the control group based on the total average of the command count per laboratory. Finally, the findings of the interviews and questionnaires with the experimental group reveal that NVBLab was helpful during and after laboratory classes.

    A distributed solution to software reuse

    Reuse can be applied to all stages of the software lifecycle to enhance quality and to shorten the time to completion of a project. The design and implementation phases are examples of where reuse can be applied, but one frequent obstruction to development is building and identifying desirable components. This can be costly in the short term, but an organisation can reap the benefits of applying this scheme if it is pursuing long-term goals. Web services are a recent development in distributed computing. This thesis combines the two research areas to produce a distributed solution to software reuse that displays the advantages of distributed computing within a reuse system. The result is a web application, with access to web services, that allows two different formats of component to be inserted into a reuse repository. These components are searchable by keywords, and the results can be adjusted by the popularity of a component’s extraction from the system and by users’ ratings of it, which improves the accuracy of the search. This work demonstrates the accuracy, usability, and speed of the system when tested with five undergraduate and five postgraduate students.
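    A minimal sketch of that kind of retrieval is shown below: keyword relevance adjusted by how often a component has been extracted and by its average user rating. The Component fields, the scoring weights and the example data are assumptions for illustration, not the thesis's actual scheme.

```python
# Illustrative ranking of reusable components: keyword match, adjusted by
# extraction popularity and user ratings. Weights and fields are assumed.
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    keywords: set
    extractions: int = 0                           # times retrieved from the repository
    ratings: list = field(default_factory=list)    # user ratings, e.g. 1..5

    def avg_rating(self) -> float:
        return sum(self.ratings) / len(self.ratings) if self.ratings else 0.0

def score(component: Component, query: set) -> float:
    keyword_match = len(component.keywords & query) / max(len(query), 1)
    popularity = component.extractions / (component.extractions + 10)  # saturating boost
    rating = component.avg_rating() / 5.0
    return 0.6 * keyword_match + 0.25 * popularity + 0.15 * rating

repo = [
    Component("csv-parser", {"csv", "parser", "io"}, extractions=40, ratings=[5, 4]),
    Component("xml-reader", {"xml", "parser"}, extractions=5, ratings=[3]),
]
query = {"parser", "csv"}
for c in sorted(repo, key=lambda c: score(c, query), reverse=True):
    print(f"{c.name}: {score(c, query):.2f}")
```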

    An Information Systems Design Theory for Service Network Effects

    Service platforms make software applications available as a service to end users. Platforms enable noticeable economic benefits for scaling and transforming a business. Their long-term competitiveness is ensured through controlled cooperation with channel intermediaries and network partners. Hence, service platforms must be designed to harness self-reinforcing effects of value generation, so-called network effects. In an exaptation of existing knowledge, we present an information systems design theory to inform the design of methods that analyze, describe, and guide the design of service platforms through the means of causal loops and control methods. We describe the theory’s purpose and scope as well as the justificatory knowledge behind the constructs and principles of form and function. The design theory covers the design of all service platform participants and activities, as well as their transactions and influences, in areas of staged platform authority, using enforcing and incentivizing control methods. We demonstrate the principles of implementation with an expository instantiation and apply it to the M-Engineering service platform, which offers surveillance, control, and data acquisition solutions. Furthermore, we present and discuss testable propositions and a study design to evaluate our design principles.
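    To give a feel for what a reinforcing causal loop on a platform looks like, here is a toy simulation: more end users attract more partner services, which raises the value created and attracts further users, dampened by churn. All coefficients are illustrative assumptions and are not taken from the design theory or the M-Engineering case.

```python
# Toy reinforcing causal loop on a two-sided service platform.
# Coefficients below are arbitrary illustrative assumptions.
users, partners = 100.0, 5.0
for period in range(1, 11):
    value = 0.01 * users * partners   # cross-side value created this period
    new_users = 0.05 * value          # value attracts additional end users
    new_partners = 0.002 * users      # a large user base attracts partners
    churned_users = 0.02 * users      # a balancing (dampening) influence
    users += new_users - churned_users
    partners += new_partners
    print(f"period {period:2d}: users={users:7.1f} partners={partners:5.1f} value={value:7.1f}")
```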

    A Review on Software Performance Analysis for Early Detection of Latent Faults in Design Models

    Organizations and society could face major breakdowns if IT strategies do not comply with performance requirements, all the more so in an era of globalization in which emerging technologies raise further issues. Software design models may contain latent issues that affect the performance of the software, yet performance is often a neglected area in industry. Identifying performance issues in the design phase can save time, money and effort. Software engineers need to know the performance requirements so as to ensure that quality software is developed. Software performance engineering is a quantitative approach to building software systems that can meet performance requirements. There are many design models based on UML, Petri Nets and Product-Forms. These models can be used to derive performance models that make use of LQN, MSC, QNM and so on. The design models are to be mapped to performance models in order to predict the performance of a system early and provide valuable feedback for improving its quality. Due to emerging distributed technologies such as EJB, CORBA, DCOM and SOA, applications have become very complex and collaborate with other software. Component-based, embedded and distributed software systems are likely to need more systematic performance models that can raise the quality of such systems. Towards this end, many techniques have come into existence. This paper throws light on software performance analysis and its present state of the art. It reviews different design models and performance models that provide valuable insights for making well-informed decisions.
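    As a small example of the kind of prediction a queueing-network-style performance model (QNM) makes at design time, here is a single M/M/1 server response-time estimate. The arrival and service rates are illustrative assumptions, not figures from any design model discussed in the review.

```python
# Minimal QNM-style calculation: mean response time of one M/M/1 server.
def mm1_response_time(arrival_rate: float, service_rate: float) -> float:
    """Mean response time R = 1 / (mu - lambda), valid only when lambda < mu."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable system: arrival rate must be below service rate")
    return 1.0 / (service_rate - arrival_rate)

# E.g. 80 requests/s offered to a component that can serve 100 requests/s:
lam, mu = 80.0, 100.0
print(f"utilisation = {lam / mu:.0%}, "
      f"predicted mean response time = {mm1_response_time(lam, mu) * 1000:.0f} ms")
```

    Comparing such a prediction against the performance requirement early in design is what allows latent bottlenecks to be caught before implementation.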