
    A Comparative Analysis Of C# And Java As An Introductory Programming Language For Information Systems Students

    Since Microsoft introduced the C# programming language, the decision to teach C#, Java, or both as the introductory language for Information Systems (IS) majors has been an ongoing dilemma. The purpose of this paper is to present a comparative analysis of which language is better suited as the primary introductory language for IS majors, answering the question: “C# or Java, that is the question!” Each language has pros and cons for use in an introductory programming course; we nevertheless suggest which is the better choice.

    Anomaly Detection in RFID Networks

    Available security standards for RFID networks (e.g. ISO/IEC 29167) are designed to secure individual tag-reader sessions and do not protect against active attacks that could compromise the system as a whole (e.g. tag cloning or replay attacks). Proper traffic characterization models of the communication within an RFID network can lead to a better understanding of operation under “normal” system conditions and can consequently help identify security breaches not addressed by current standards. This study of RFID traffic characterization considers two piecewise-constant data smoothing techniques over time-tagged events, namely the Bayesian blocks and Knuth's algorithms, and compares them in the context of rate-based anomaly detection. This was accomplished using data from experimental RFID readings, comparing (1) event counts over time computed from the smoothed curves against empirical histograms of the raw data, and (2) threshold-dependent alert rates based on inter-arrival times obtained from the smoothed curves against those obtained from the raw data itself. Results indicate that both algorithms adequately model RFID traffic in which inter-event time statistics are stationary, but that Bayesian blocks becomes superior for traffic in which such statistics experience abrupt changes.
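
    As a rough illustration of this rate-based approach (a sketch with synthetic data, not the paper's own code or readings), the following smooths a stream of time-tagged reads with both techniques, using the implementations available in the astropy library, and flags blocks whose event rate exceeds an arbitrary threshold.

        # Rate-based anomaly detection over piecewise-constant smoothing of
        # time-tagged events, comparing Bayesian blocks with Knuth's rule.
        # Requires numpy and astropy; all data here is synthetic.
        import numpy as np
        from astropy.stats import bayesian_blocks, knuth_bin_width

        rng = np.random.default_rng(0)

        # Synthetic RFID read timestamps: a steady background rate plus a
        # burst (e.g. a replay attack flooding the reader) starting at t = 50.
        background = rng.exponential(scale=1.0, size=200).cumsum()
        burst = 50.0 + rng.exponential(scale=0.05, size=100).cumsum()
        t = np.sort(np.concatenate([background, burst]))

        # Bayesian blocks: adaptive change-point bins for event data.
        bb_edges = bayesian_blocks(t, fitness='events', p0=0.01)

        # Knuth's rule: optimal fixed-width bins for the same events.
        _, knuth_edges = knuth_bin_width(t, return_bins=True)

        def block_rates(times, edges):
            """Event rate (reads per unit time) inside each bin."""
            counts, _ = np.histogram(times, bins=edges)
            return counts / np.diff(edges)

        # Alert on any bin whose smoothed rate exceeds a threshold
        # (the threshold is arbitrary, chosen only for illustration).
        THRESHOLD = 3.0
        for name, edges in [("Bayesian blocks", bb_edges), ("Knuth", knuth_edges)]:
            rates = block_rates(t, edges)
            flagged = np.flatnonzero(rates > THRESHOLD)
            print(f"{name}: {len(edges) - 1} bins, "
                  f"{flagged.size} above {THRESHOLD} reads/unit time")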

    Data Mining

    Data mining has recently become more popular in the information industry, owing to the availability of huge amounts of data and the need to turn such data into useful information and knowledge. This information and knowledge can be used in applications ranging from business management, production control, and market analysis to engineering design and science exploration. Database and information technology have evolved systematically from primitive file-processing systems to sophisticated and powerful database systems. Research and development in database systems has produced relational database systems, data modeling tools, and indexing and data organization techniques. In relational database systems, data are stored in relational tables, and users have convenient and flexible access to them through query languages, optimized query processing, user interfaces, transaction management, and optimized methods for On-Line Transaction Processing (OLTP). This abundance of data, combined with the need for powerful analysis tools, has been described as a data-rich but information-poor situation: the fast-growing, tremendous amount of data collected and stored in large and numerous databases far exceeds the human capacity to analyze it without powerful tools. As a result, data collected in large databases become data tombs, archives that are seldom visited, and important decisions are often based not on the information-rich data stored in databases but on a decision maker's intuition, simply because the decision maker lacks the tools to extract the valuable knowledge embedded in the vast amounts of data. Data mining tools that perform data analysis can uncover important data patterns, contributing greatly to business strategies, knowledge bases, and scientific and medical research, and so turn data tombs into golden nuggets of knowledge.
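
    As a small illustration of the kind of pattern such tools uncover, the following sketch counts frequent itemsets, the first step of classic association-rule mining, over a handful of invented market-basket transactions; the data and support threshold are made up for the example.

        # Frequent-itemset counting over invented market-basket data: the
        # first step of classic association-rule mining. The transactions
        # and support threshold are made up for illustration.
        from collections import Counter
        from itertools import combinations

        transactions = [
            {"bread", "milk"},
            {"bread", "diapers", "beer", "eggs"},
            {"milk", "diapers", "beer", "cola"},
            {"bread", "milk", "diapers", "beer"},
            {"bread", "milk", "diapers", "cola"},
        ]
        MIN_SUPPORT = 3  # an itemset must appear in at least 3 baskets

        # Count every 1- and 2-item combination across all baskets.
        counts = Counter()
        for basket in transactions:
            for size in (1, 2):
                for itemset in combinations(sorted(basket), size):
                    counts[itemset] += 1

        frequent = {s: c for s, c in counts.items() if c >= MIN_SUPPORT}
        for itemset, count in sorted(frequent.items(), key=lambda kv: -kv[1]):
            print(itemset, count)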

    Issues in the Subtitling and Dubbing of English-Language Films into Arabic: Problems and Solutions

    This study investigates the problems that translators tend to face in the subtitling and dubbing of English-language films and television programmes into Arabic and suggests solutions for these problems. In the light of an examination of the generic features of audiovisual translation and of the particular cultural constraints inherent in translation for Arabic-speaking audiences, it is proposed that certain elements of translation theory can be useful in overcoming the technical and cultural barriers identified. This proposition is tested through analysis of the translation of three feature films, one television sitcom and an animation series that have been subtitled and dubbed into Arabic, with a particular focus on the translation of dialect, swear words, and humour. Technical, linguistic and cultural issues constitute a challenge for Arabic translators, who need to deal with: 1) limitations on screen, such as space, time, and lip and character synchronization; 2) the rendering of English dialects into Modern Standard Arabic (MSA); and 3) the cultural restrictions they face when translating taboo expressions. These constraints can result in a partial or complete loss of the source film’s message. Each of the audiovisual works mentioned above was treated as a case study and analysed using both qualitative and quantitative methods. Interviews, experiments and a questionnaire were conducted to answer the research questions. The interviews gathered evidence of how professionals translate, what problems they face and what solutions they may suggest. The experiments and the questionnaire, on the other hand, were audience-focused tools: sample audiences watched and judged the ability of a translation, in both subtitled and dubbed forms, to deliver the message of a movie to them, thereby providing evidence on the relative effectiveness of different translation procedures. On this basis, solutions were both suggested and tested for their viability in overcoming the barriers that emerge during the subtitling and dubbing of dialect, swear words and humour into Arabic. The findings show that translators have significant scope for improving the quality of their output, especially by adopting a more functional translation approach that can help them deal successfully with the difficulties inherent in this type of translation and give the translated dialogue an effect on the target audience similar to that which the source text has on its own audience.

    Restructuring Object-Oriented Designs Using a Metric-Driven Approach

    The benefits of object-oriented software are now widely recognized. However, the methodologies used to develop object-oriented software are still in their infancy, and there is a lack of methods for assessing the quality of the various components derived during the development process. The design of a system is a crucial component of that process, yet little attention has been given to assessing object-oriented designs to determine their quality. Metrics can provide guidance for such assessment. The objective of this research is to develop a system that evaluates object-oriented designs and provides guidance for restructuring them based on the results of the evaluation. We identify a basic set of metrics that reflects the benefits of the object-oriented paradigm, such as inheritance, encapsulation, and method interactions. Specifically, we include metrics that measure depth of inheritance, method usage, cardinality of subclasses, coupling, class responses, and cohesion. We define techniques to evaluate these metrics on existing object-oriented designs, and then techniques that use the metric values to help restructure designs so that they conform to predetermined design criteria. These methods and techniques are implemented as part of a Design Evaluation Assistant that automates much of the evaluation and restructuring process.
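
    As a minimal illustration of metric-driven evaluation (a sketch, not the paper's Design Evaluation Assistant), the following computes two of the metrics named above, depth of inheritance and cardinality of subclasses, for an invented class hierarchy and flags classes that violate an arbitrary design criterion.

        # Two of the metrics named above, computed by introspection over an
        # invented class hierarchy; the threshold is an arbitrary stand-in
        # for a predetermined design criterion.

        def depth_of_inheritance(cls):
            """Longest path from cls up to the root of the hierarchy."""
            if cls is object:
                return 0
            return 1 + max(depth_of_inheritance(b) for b in cls.__bases__)

        def number_of_children(cls):
            """Cardinality of the set of immediate subclasses."""
            return len(cls.__subclasses__())

        # Toy design to evaluate.
        class Shape: ...
        class Polygon(Shape): ...
        class Triangle(Polygon): ...
        class Rectangle(Polygon): ...
        class Square(Rectangle): ...

        MAX_DIT = 3  # arbitrary criterion: deeper hierarchies get flagged
        for cls in (Shape, Polygon, Triangle, Rectangle, Square):
            dit, noc = depth_of_inheritance(cls), number_of_children(cls)
            verdict = "restructure?" if dit > MAX_DIT else "ok"
            print(f"{cls.__name__}: DIT={dit} NOC={noc} -> {verdict}")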

    English-Speakers' Errors in Arabic as L2 Writing System: A Teacher Perspective

    Errors are significant for understanding the acquisition, competence, difficulties, and development of L2 writing. Within the framework of second language writing systems (L2WS), this study investigates teacher perspectives on Arabic writing errors made by English-speaking learners in L2 classes. The results suggest that English-speaking learners of Arabic as an L2WS face numerous difficulties, as mentioned by the interviewees, such as the move from a fairly dot-free writing system to a dot-full one. Nine categories emerged and were described by the interviewees as the common writing errors, namely: 1) letter shape, including teeth and size; 2) direction; 3) dots; 4) phonological issues; 5) spelling issues; 6) letter connecting; 7) letter doubling; 8) letterforms; and 9) other errors. According to the interviewees, the reasons for these errors are a mixture of phonological differences, orthographic differences, spelling-error causes, and other factors. The interviewees also offered several suggestions for developing the teaching of Arabic as an L2WS. Acknowledgment: the author is grateful to the research centre at the Arabic Linguistics Institute for its generous support. Thanks are also extended to the researchers who reviewed this article and responded with valuable feedback. Keywords: Writing Systems; Arabic Writing System; L2 Writing; L2 Writing Systems; Error Analysis; Phonological Errors; Orthographic Errors

    Enhancing the Process of Testing Object-Oriented Systems

    Testing is a crucial step in the overall system development process. Testing techniques that support the features of the underlying software paradigm test programs more effectively than techniques designed for other paradigms. Systems developed with the object-oriented paradigm therefore require techniques that support object-oriented features such as inheritance, data abstraction, encapsulation, and dynamic binding; many techniques used to test systems developed with the structured paradigm are not sufficient for testing object-oriented systems. The goal of this research is to develop methods that improve the process of testing object-oriented systems. Emphasis is given to improving the testing of methods, because the level of method testing is generally considered inadequate. Algorithms are included that identify the set of methods, both interobject and intraobject, that should be tested for a given system. These algorithms are implemented as part of an automated testing system that derives a framework for the testing of methods, automatically generates test drivers to facilitate that testing, and captures test results for reuse during future system maintenance. This framework gives the software engineer who is testing a system a mechanism to determine the level of method coverage achieved in the testing process.
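
    The following sketch conveys the flavor of this approach (it is not the paper's system): it identifies the methods a subclass defines or redefines, and which therefore need testing, and generates a skeleton test driver for them; the classes are invented.

        # Identify the methods a subclass defines or redefines (and so must
        # be tested or retested), then emit a unittest driver skeleton for
        # them. The Account classes are invented for the example.
        import inspect

        class Account:
            def deposit(self, amount): ...
            def withdraw(self, amount): ...

        class SavingsAccount(Account):
            def withdraw(self, amount): ...    # redefined: retest
            def add_interest(self, rate): ...  # new: test

        def methods_to_test(cls):
            """Methods defined or redefined in cls itself (not inherited as-is)."""
            return [name for name, obj in inspect.getmembers(cls, inspect.isfunction)
                    if name in cls.__dict__]

        def emit_test_driver(cls):
            """Return source code for a unittest skeleton covering those methods."""
            lines = ["import unittest", "",
                     f"class Test{cls.__name__}(unittest.TestCase):"]
            for name in methods_to_test(cls):
                lines += [f"    def test_{name}(self):",
                          f"        self.fail('TODO: exercise {cls.__name__}.{name}')",
                          ""]
            return "\n".join(lines)

        print(methods_to_test(SavingsAccount))  # ['add_interest', 'withdraw']
        print(emit_test_driver(SavingsAccount))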

    Grid Computing: The Trend Of The Millennium

    A grid can be simply defined as a combination of different components that function collectively as part of one large electrical or electronic circuit. The term “Grid Computing” similarly applies to a large number of computers that connect to collectively solve a problem, often one of scientific interest, of very high complexity and magnitude. The fundamental idea behind any computer-based grid is to utilize idle processor cycles: a processor that would otherwise sit idle teams up with other idle processors to tackle pieces of the problem. The role each processor plays is carefully defined, and there is full transparency in the working of each processor/computer in the grid; this is the “division of labor” of intelligent computing. In layman's terms, it is equivalent to a student and a group of friends collectively solving a single assignment that contains more than one problem: each part is simple, but the effort is collective.

    A grid computing environment may take many forms: it could be built as a cluster-based system, a distributed computing environment, or a peer-to-peer system. In a cluster-based environment, a central computer, often called the “cluster head,” distributes or maintains a job schedule for the other computers in the grid. A distributed environment is often seen on the web: when a user requests a popular web page and the web server is experiencing traffic congestion, the user is re-routed to the same page on a different server, and the transition takes place so rapidly that the momentary delay due to server bottlenecks is hardly felt. Peer-to-peer computing is best explained through music download engines: if a user decides to share a file via the web, other users needing the same file copy it through their own download engines.

    With great computing power comes great responsibility, and security is of utmost importance in a grid environment. Since a grid performs large computations, data is assumed to be available at every node in the processing cycle, which increases the risk of data manipulation in various forms; we must also consider what happens to the data when a node fails. An ideal grid has a short convergence time and a low recovery time after a complete grid failure. By convergence we mean that every processor node has complete information about every other processor node in the grid; recovery time is the time the grid takes to restart from scratch after a major breakdown. To put things in perspective, a good grid computing environment has an intelligent grid administrator who monitors user logs and scheduled jobs, and a grid operating system tailor-made to suit the grid's application.

    We propose an analogy between the OSI model and a grid model to elaborate the functions and utilities of a grid, and we also propose a queuing theory for grid computing. There have been numerous previous comparisons, each valuable in its own right, but an analogy with a network model carries extra weight: physically, a grid is nothing but an interconnection, and interconnection is best defined in relation to a computer network. How users access networked computers, how files are shared, and what the levels of security are can all be explained through a networked computer system.
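
    As a toy illustration of the cluster-head arrangement described above (an invented sketch, not part of the paper), the following uses worker processes on a single machine to stand in for grid nodes: a central scheduler splits a job into pieces, hands them to idle workers, and combines the results.

        # A cluster head on one machine: split a job into chunks, hand each
        # chunk to an idle worker process (standing in for a grid node), and
        # combine the results. The job itself, summing squares, is a
        # stand-in for a real computation.
        from concurrent.futures import ProcessPoolExecutor, as_completed

        def job(chunk):
            """The unit of work each 'node' performs on its share of the data."""
            return sum(x * x for x in chunk)

        def cluster_head(data, n_nodes=4):
            """Split the problem, schedule the pieces, and combine the results."""
            size = len(data) // n_nodes + 1
            chunks = [data[i:i + size] for i in range(0, len(data), size)]
            with ProcessPoolExecutor(max_workers=n_nodes) as pool:
                futures = [pool.submit(job, c) for c in chunks]
                return sum(f.result() for f in as_completed(futures))

        if __name__ == "__main__":
            print(cluster_head(list(range(1_000_000))))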

    Information Technology Issues Faced By The Business World

    As the “information age” becomes more centered on technology, the disparity in the use and spread of information technology between developed and developing countries becomes more obvious. This digital divide further alienates developing nations from the global economy. However, rapid changes and advances in technology cannot be ignored. Developed and technologically advanced nations, such as the United States, use technology to their best advantage, especially in the business world, and many businesses are consequently turning to the practice of outsourcing. With so many people using technology, many potential problems exist: a growing number of users creates a need for security, which many individuals and businesses obtain through the use of firewalls. Information technology and the need to protect information are important issues in the business world today.

    IT: In Search Of Security Post Katrina/Rita Disaster Preparedness

    What has been done to safeguard the IT infrastructure and the mountains of data from future disasters, both natural and man-made? Upgrading existing systems, building safe houses, and duplicating existing systems are some of the methods being used by Gulf Coast companies and businesses. Another option may be to relocate inland, away from natural-disaster factors, though still not protected from terrorists, whether from within or abroad. This study addresses many of the problems that linger: unsuitable structures, costly relocation, escalating utility costs, security risks, weak disaster-preparedness design, and conventional shortsightedness. Within the study, one can see that IT management is moving toward security, treating future security as part of today’s operational norm. The disasters hit many American companies squarely in the pocketbook, yet not a single company surveyed relied on Federal funds to offset losses. Companies that lost their physical structures were able to rebuild despite the odds and, two years later, remain solvent; this speaks to the management and the spirit of the companies themselves. The survey is ongoing, gathering information into the future as well as the perspective of the past. What will IT security look like in 2010? We will revisit this issue in real time as the survey results continue.