
    Complex adaptive systems based data integration : theory and applications

    Data Definition Languages (DDLs) have been created and used to represent data in programming languages and in database dictionaries. This representation includes descriptions in the form of data fields and relations in the form of a hierarchy, with the common exception of relational databases, where relations are flat. Network computing created an environment that enables relatively easy and inexpensive exchange of data. What followed was the creation of new DDLs claiming better support for automatic data integration. It is uncertain from the literature whether any real progress has been made toward achieving an ideal state or limit condition of automatic data integration. This research asserts that difficulties in accomplishing integration are indicative of socio-cultural systems in general and are caused by some measurable attributes common to DDLs. This research’s main contributions are: (1) a theory of data integration requirements to fully support automatic data integration from autonomous heterogeneous data sources; (2) the identification of measurable related abstract attributes (Variety, Tension, and Entropy); (3) the development of tools to measure them. The research uses a multi-theoretic lens to define and articulate these attributes and their measurements. The proposed theory is founded on the Law of Requisite Variety, Information Theory, Complex Adaptive Systems (CAS) theory, Sowa’s Meaning Preservation framework, and Zipf distributions of words and meanings. Using the theory, the attributes, and their measures, this research proposes a framework for objectively evaluating the suitability of any data definition language with respect to degrees of automatic data integration. This research uses thirteen data structures constructed with various DDLs from the 1960s to date. No DDL examined (and therefore no DDL similar to those examined) is designed to satisfy the law of requisite variety. No DDL examined is designed to support CAS evolutionary processes that could result in fully automated integration of heterogeneous data sources. There is no significant difference in measures of Variety, Tension, and Entropy among the DDLs investigated in this research. A direction for overcoming the common limitations discovered in this research is suggested and tested by proposing GlossoMote, a theoretical, mathematically sound description language that satisfies the data integration theory requirements. GlossoMote is not merely a new syntax; it is a drastic departure from existing DDL constructs. The feasibility of the approach is demonstrated with a small-scale experiment and evaluated using the proposed assessment framework and other means. The promising results warrant additional research to evaluate the commercial potential of GlossoMote’s approach.
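    The Entropy attribute named above suggests an information-theoretic measurement, but the abstract does not spell out the procedure. The following is only a minimal sketch of one plausible reading, assuming Entropy is Shannon entropy over a DDL's token-frequency distribution; the `shannon_entropy` helper and the sample DDL tokens are hypothetical, not taken from the thesis.

```python
import math
from collections import Counter

def shannon_entropy(tokens):
    """Shannon entropy, in bits per token, of a token stream."""
    counts = Counter(tokens)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical DDL fragment, tokenized into keywords and field names.
ddl_tokens = ("CREATE TABLE employee id INTEGER name VARCHAR dept INTEGER "
              "CREATE TABLE dept id INTEGER name VARCHAR").split()

entropy_bits = shannon_entropy(ddl_tokens)
```

    Under the Zipf-distribution framing cited in the abstract, a DDL whose token frequencies are more heavily skewed toward a few keywords would score lower on a measure like this one.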

    1957-2007: 50 Years of Higher Order Programming Languages

    Fifty years ago, one of the greatest breakthroughs in computer programming and in the history of computers happened: the appearance of FORTRAN, the first higher-order programming language. From that time until now, hundreds of programming languages have been invented and different programming paradigms defined, all with the main goal of making computer programming easier and accessible to as many people as possible. Many battles were fought among scientists as well as among developers around concepts of programming, programming languages, and paradigms. It can be said that programming paradigms and programming languages were very often a trigger for changes and improvements in computer science as well as in the computer industry. Computer programming is undoubtedly one of the cornerstones of computer science. Today there are many tools that help in the process of programming, but there are still programming tasks that can be solved only manually. Therefore, programming remains one of the most creative parts of interaction with computers. Programmers should choose a programming language according to the task they have to solve, but very often they choose it according to their personal preferences, beliefs, and many other subjective reasons. Nevertheless, the market of programming languages can be as merciless to languages as history has been to some people, even whole nations. Programming languages and developers are born, live, and die, leaving more or fewer traces and successors, and it is not always the best that survives. The history of programming languages is closely connected to the history of computers and of computer science itself; every development in one is reflected in the other. This paper gives a short overview of the last fifty years of computer programming and programming languages, and also presents many ideas that influenced other aspects of computer science. In particular, programming paradigms are described, with their intentions and goals, as well as most of the significant languages of each paradigm.

    Enterprise Resource Planning Systems and Their Impact on Development and Training: A Study of Instructional Methods in North America

    One of the most important issues facing the modern world today is the Year 2000 problem. One of its greatest impacts is experienced by legacy computer systems, the database systems that run our businesses. Legacy systems are operated by segregated software packages that may or may not be able to communicate with each other. With the globalization of the economy, business computer systems need to be able to communicate with each other. This is the type of situation that enterprise resource planning software is designed to solve. Enterprise Resource Planning (ERP) systems provide a common, consistent system to capture data organization-wide without redundancies. ERP is defined as a software management system that integrates all facets of the business, including planning, manufacturing, sales, and marketing. In addition to integrating information across functions, an ERP system also provides a set of tools for planning and monitoring the organizational functions and ensuring progress towards a common organizational goal (Sudhakar, 1998). As ERP methodology has become more popular, software applications have emerged to help business managers implement ERP. The purpose of this pilot study is to identify development times and delivery methods for high-level computer technology training. This has become an issue for Human Resource Development (HRD) because of the Year 2000 situation and its resulting issues, which have compelled businesses around North America to implement these ERP systems. After circulating a survey for almost seven months, the data collection period ended in March 1999. Of 63 surveys distributed, 47 were returned, a response rate of 74.6%. A total of 34 respondents, or 72%, developed training programs for high-level computer technology.

    Programming language trends : an empirical study

    Predicting the evolution of software engineering technology trends is a dubious proposition. The recent evolution of software technology is a prime example; it is fast-paced and affected by many factors, which are themselves driven by a wide range of sources. This dissertation is part of a long-term project intended to analyze software engineering technology trends and how they evolve. Basically, the following questions will be answered: how can we watch, predict, adapt to, and affect software engineering trends? In this dissertation, one field of software engineering, programming languages, is discussed. A review of the history of a group of programming languages shows that two kinds of factors, intrinsic factors and extrinsic factors, can affect the evolution of a programming language. Intrinsic factors are those that describe the general design criteria of programming languages. Extrinsic factors are those that are not directly related to the general attributes of programming languages but can still affect their evolution. In order to describe the relationship of these factors and how they affect programming language trends, the factors need to be quantified. A score has been assigned to each factor for every programming language. By collecting historical data, a data warehouse has been established, which stores the value of each factor for every programming language. The programming language trends are described and evaluated using these data. Empirical research attempts to capture observed behaviors in empirical laws. In this dissertation, statistical methods are used to describe historical programming language trends and predict their future evolution. Several statistical models are constructed to describe the relationships among these factors. Canonical correlation is used for the factor analysis. Multivariate multiple regression is used to construct the statistical models for programming language trends. After the statistical models are constructed to describe the historical programming language trends, they are extended to make tentative predictions of future trends. The models are validated by comparing the predicted data with the actual data.
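    The regression step can be sketched with ordinary least squares. The factor scores and trend values below are invented stand-ins (the dissertation's actual scores and data warehouse are not reproduced here), and `numpy` is assumed in place of whatever statistics package the author used:

```python
import numpy as np

# Hypothetical design matrix: each row is one language at one point in time;
# columns are an intercept, an intrinsic-factor score, and an extrinsic-factor score.
X = np.array([
    [1.0, 0.8, 0.2],
    [1.0, 0.6, 0.9],
    [1.0, 0.4, 0.5],
    [1.0, 0.9, 0.7],
    [1.0, 0.3, 0.1],
])
# Hypothetical observed trend measure (e.g. a popularity index) per row.
y = np.array([0.75, 0.95, 0.55, 1.00, 0.30])

# Ordinary least squares: beta minimizes ||X @ beta - y||.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Tentative prediction for a new, hypothetical factor profile.
prediction = np.array([1.0, 0.7, 0.6]) @ beta
```

    Validation in the same spirit as the dissertation's would then compare such predictions against held-out historical observations.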

    Comparative Study of Needs-Assessment Methodologies as They Apply to the Development of a University Computer Science Curriculum in a Central African Country

    A Central African society consisting of two local ethnic groups, as well as Europeans, Asians (Pakistanis and Indians), and North Americans, provided a milieu for the evaluation of five needs-assessment methodologies as to their appropriateness in a multi-ethnic environment. Five methodologies (a questionnaire, a job-analysis log, audio interviews, video interviews, and an informal indigenous contact) were used in a needs assessment for the computer science department of the Adventist University of Central Africa, a private church-operated university located at Mudende, Gisenyi, Rwanda, Central Africa. The five methodologies were evaluated in two different manners: 1. An evaluation of the appropriateness of each methodology based on 12 modified standards selected from those suggested by The Joint Committee on Standards for Educational Evaluation (1994), designed to assess the complete evaluation process. (In this dissertation, the 12 selected standards were chosen and adapted to examine the impact of culture on the methodologies and to determine the appropriateness of their use in a needs assessment in the Fourth World.) 2. A comparison of the number of recommendations provided by each methodology. The count included the total number of recommendations provided by each methodology, the number of nonunique recommendations, and the number of unique recommendations. The recommendations were classified into new, consider, and improve categories, and were also examined as to how they were implemented. The results of the study indicate that all five methodologies had their own set of unique strengths and weaknesses when used in a Fourth World setting. In fact, no one methodology would have been appropriate if used by itself. In this study, reducing the number of methodologies would have resulted in a loss of vital information needed for decision making. The greatest amount of information came from methods that allowed the researcher to develop a researcher-respondent relationship prior to the collection of information. The single most productive methodology was the audio interview, with its greater use of affective communication.

    Acquiring data designs from existing data-intensive programs

    The problem area addressed in this thesis is the extraction of a data design from existing data-intensive program code. The purpose of this is to help a software maintainer understand a software system more easily, because a view of the system at a high level of abstraction can be obtained. Acquiring a data design from existing data-intensive program code is an important part of reverse engineering in software maintenance, and a large proportion of software systems currently needing maintenance is data intensive. The research results in this thesis can be used directly in a reverse engineering tool. A method has been developed for acquiring data designs from existing data-intensive programs, COBOL programs in particular. Program transformation is used as the main tool. Abstraction techniques and the method of crossing levels of abstraction are also studied for acquiring data designs. A prototype system has been implemented based on the method developed. This involved implementing a number of program transformations for data abstraction, thus contributing to the production of a tool. Several case studies are presented, including one using a real program with 7,000 lines of source code. The experimental results show that the Entity-Relationship Attribute Diagrams derived by the prototype can represent the data designs of the original data-intensive programs. The original contribution of the thesis is that the approach presented can identify and extract data relationships from existing code by combining analysis of data with analysis of code. The approach is believed to provide better capabilities than other work in the field. The method has shown that acquiring a data design from existing data-intensive program code by program transformation with human assistance is an effective method in software maintenance. Future work, including extending the method to build an industrial-strength tool, is suggested at the end of the thesis.
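    The flavor of such an extraction can be hinted at with a toy sketch. This is not the thesis's transformation-based method, which works on full COBOL programs with human assistance; it merely parses level numbers from an invented record declaration and treats the level-01 item as a candidate entity with the nested items as its attributes.

```python
import re

# A toy COBOL record declaration (invented): level numbers define nesting.
COBOL_SRC = """\
01 EMPLOYEE-REC.
   05 EMP-ID      PIC 9(5).
   05 EMP-NAME    PIC X(30).
   05 EMP-DEPT.
      10 DEPT-ID   PIC 9(3).
      10 DEPT-NAME PIC X(20).
"""

def parse_record(src):
    """Return (level, name) pairs for each data item in the record."""
    items = []
    for line in src.splitlines():
        m = re.match(r"\s*(\d+)\s+([A-Z0-9-]+)", line)
        if m:
            items.append((int(m.group(1)), m.group(2)))
    return items

items = parse_record(COBOL_SRC)
# A crude abstraction step: the level-01 item becomes an entity candidate,
# and the subordinate items become its candidate attributes.
entity = items[0][1]
attributes = [name for lvl, name in items if lvl > 1]
```

    A real data-design recovery, as described in the thesis, would additionally analyze the code that manipulates these fields to find relationships between records.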

    Genesis of an Expert System For UMR Degree Auditing

    This paper describes the features, design, and development of an expert system for degree auditing at the University of Missouri--Rolla. It summarizes artificial intelligence as it is known today, while specifically addressing expert systems, and describes selected expert systems currently in existence. The present audit procedure used at the University of Missouri--Rolla is discussed. A description is given of the design and development of an expert system, written in LISP, to conduct a degree audit. Finally, concluding remarks include an analysis of the system and a discussion of possible system enhancements.
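    The paper's system was written in LISP, and its actual rules are not given here. A rough Python sketch of the rule-based idea, with entirely hypothetical requirements and course names, might look like this:

```python
# Hypothetical degree rules (not UMR's actual requirements): each rule pairs
# a requirement name with a predicate over a student record.
RULES = [
    ("core courses", lambda r: {"CS101", "CS201"} <= set(r["courses"])),
    ("credit hours", lambda r: r["hours"] >= 120),
    ("GPA",          lambda r: r["gpa"] >= 2.0),
]

def audit(record):
    """Return the names of the requirements the record fails to satisfy."""
    return [name for name, check in RULES if not check(record)]

student = {"courses": {"CS101", "CS201", "MA203"}, "hours": 118, "gpa": 3.1}
failures = audit(student)   # ["credit hours"]
```

    An expert system proper would also carry explanation facilities (why a requirement failed), which a bare rule table like this one omits.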