On the Cost of Transitive Closures in Relational Databases
We consider the question of computing transitive closures on top of pure relational systems (Sybase and Ingres in this case). We developed three kinds of transitive closure programs: one using a stored procedure to simulate a built-in transitive closure operator, one using C with embedded SQL statements to simulate iterated execution of the transitive closure operation, and one using Floyd's matrix algorithm to compute the transitive closure of an input graph. By comparing and analyzing the performance of these versions in terms of elapsed time spent computing the transitive closure, we identify some of the bottlenecks that arise when defining a transitive closure operator on top of existing relational systems. The main purpose of the work is to estimate the costs of taking transitive closures on top of relational systems, isolate the different cost factors (such as logging and network transmission cost), and identify enhancements that existing relational systems need in order to support the transitive closure operation efficiently. We argue that relational databases should be augmented with efficient transitive closure operators if such queries are made frequently.
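The matrix-algorithm variant mentioned in the abstract can be sketched as follows: a minimal Warshall-style closure over an edge list. The edge-set representation is an illustrative assumption, not the paper's actual schema.

```python
# Sketch of a Floyd/Warshall-style transitive closure over a directed graph,
# mirroring the third approach described above. The (src, dst) pair
# representation is an assumption for illustration.

def transitive_closure(edges):
    """Return the transitive closure of a directed graph as a set of pairs."""
    nodes = {n for edge in edges for n in edge}
    reach = set(edges)
    # Warshall's algorithm: for each intermediate node k, connect every
    # pair (i, j) that is reachable via k.
    for k in nodes:
        for i in nodes:
            if (i, k) in reach:
                for j in nodes:
                    if (k, j) in reach:
                        reach.add((i, j))
    return reach

# Example: a chain a -> b -> c also yields the derived edge a -> c.
closure = transitive_closure({("a", "b"), ("b", "c")})
```

The same fixpoint can be reached with iterated SQL joins, which is essentially what the stored-procedure and embedded-C versions in the paper simulate.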
A general framework for positioning, evaluating and selecting the new generation of development tools.
This paper focuses on the evaluation and positioning of a new generation of development tools: collections of subtools (report generators, browsers, debuggers, GUI-builders, ...) and programming languages that are designed to work together under a common graphical user interface and are therefore called environments. Several trends in IT have led to a pluriform range of development tools that can be classified in numerous categories, for example object-oriented tools, GUI tools, upper- and lower-CASE tools, client/server tools, and 4GL environments. This classification does not sufficiently cover the tools discussed in this paper, for the simple reason that only one criterion is used to distinguish them; modern visual development environments often fit several categories because, to a certain extent, several criteria can be applied to evaluate them. In this study, we offer a broad classification scheme with which tools can be positioned and which can be refined through further research.
Building a Data Warehouse and Data Mining for a Strategic Advantage
Technology is fundamentally changing the way companies do business. Consolidation, globalization, and deregulation have put increased pressure on managers to better understand their businesses and take them to the next level. Given today's fast-paced business environment, decision-making cycles have shortened, and managers need accurate, timely information in order to make quality decisions. A properly designed and populated data warehouse can provide the relevant data necessary to make good decisions. Significant advances in computer hardware and end-user software have made it easy to access, analyze, and display information at the desktop. The data companies continue to collect from their current information systems provides a great source of information about their customers and processes. Data mining programs are powerful tools that can interrogate the massive amounts of data contained in a data warehouse in order to uncover relationships. According to Kapstone (1995), to help business leaders and decision makers manage their companies effectively, companies need to make as much information as possible available and give decision makers the tools they need to explore it. By implementing a data warehouse and using data mining tools, companies can uncover relationships that can be used to achieve strategic advantages. First, I explain data warehouses, why they are built, and how to build them. Second, I cover data mining tools and the benefits companies are experiencing by using them. Finally, I focus on the strategic advantages of building a data warehouse and extracting valuable data with sophisticated data mining tools.
The organizational preparation of existing relational databases for the integration of expert systems.
This thesis is a management guide for strategically planning a future integration of relational databases and expert systems. It relates best to an organization with large, established relational databases that is trying to assess the changes required to integrate expert systems with those databases. Technical considerations for such a change are discussed, including the role of database normalization and the requirement to maintain applications that are independent of the database structure. The organizational considerations of such an integration are examined, with a focus on the people skills required within an organization to develop and maintain database and expert system combinations. Three product categories are established to represent an integrated system, and a commercial off-the-shelf product from each category is reviewed to illustrate its specific capabilities. The combination of relational databases and expert systems has the potential to deliver information systems of future strategic importance. This thesis serves to assist the information systems management of military organizations in planning the transition to such a system.
http://archive.org/details/organizationalpr00snow
Major, U.S. Air Force
Approved for public release; distribution is unlimited
Forensic attribution challenges during forensic examinations of databases
An aspect of database forensics that has not yet received much attention in the academic research community is the attribution of actions performed in a database. When forensic attribution is performed for actions executed in computer systems, it is necessary to avoid incorrectly attributing actions to processes or actors, because the outcome of forensic attribution may be used to determine civil or criminal liability. Correctness is therefore extremely important when attributing actions in computer systems, including when performing forensic attribution in databases. Any circumstances that can compromise the correctness of the attribution results need to be identified and addressed. This dissertation explores possible challenges when performing forensic attribution in databases: what can prevent the correct attribution of actions performed in a database? The first identified challenge is the database trigger, which has not yet been studied in the context of forensic examinations. The dissertation therefore investigates the impact of database triggers on forensic examinations by examining two sub-questions. Firstly, could triggers, due to their nature combined with the way databases are forensically acquired and analysed, lead to the contamination of the data that is being analysed? Secondly, can the current attribution process correctly identify which party is responsible for which changes in a database where triggers are used to create and maintain data? The second identified challenge is the lack of access and audit information in NoSQL databases. The dissertation thus investigates how the availability of access control and logging features in databases impacts forensic attribution. Database triggers, as defined in the SQL standard, are studied together with a number of database trigger implementations, in order to establish which aspects of a database trigger may have an impact on digital forensic acquisition, analysis and interpretation.
Forensic examinations of relational and NoSQL databases are evaluated to determine what challenges the presence of database triggers poses. A number of NoSQL databases are then studied to determine the availability of access control and logging features, because these features leave valuable traces for the forensic attribution process. An algorithm is devised that provides a simple test to determine whether database triggers played any part in the generation or manipulation of data in a specific database object. If the test result is positive, the actions performed by the implicated triggers have to be considered in a forensic examination. This dissertation identified a group of database triggers, classified as non-data triggers, which have the potential to contaminate the data in popular relational databases through inconspicuous operations such as connection or shutdown. It also established that database triggers can influence the normal flow of data operations: what the original operation intended to do and what actually happened are not necessarily the same. The attribution of these operations therefore becomes problematic, and incorrect deductions can be made. Accordingly, forensic processes need to be extended to include the handling and analysis of all database triggers. This enables safer acquisition and analysis of databases and more accurate attribution of actions performed in databases. This dissertation also established that popular NoSQL databases either lack sufficient access control and logging capabilities or do not enable them by default, and so cannot support attribution to the same level as relational databases.
Dissertation (MSc), Computer Science--University of Pretoria, 2018
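The trigger test described above can be illustrated with a minimal sketch: given trigger definitions acquired from a database, flag any trigger that fires on, or whose body appears to write to, a target object. The Trigger structure, names, and crude textual matching rule are all illustrative assumptions; the dissertation's actual algorithm works against real catalog metadata, not this simplified model.

```python
# Hedged sketch of a trigger-involvement test for forensic triage.
# All names and the matching heuristic here are hypothetical.

from dataclasses import dataclass

@dataclass
class Trigger:
    name: str
    table: str   # table the trigger is attached to
    body: str    # trigger body (SQL text)

def triggers_implicating(triggers, target_object):
    """Return names of triggers that fire on, or whose body mentions, target_object."""
    target = target_object.lower()
    hits = []
    for t in triggers:
        fires_on_target = t.table.lower() == target
        writes_target = target in t.body.lower()  # crude textual check
        if fires_on_target or writes_target:
            hits.append(t.name)
    return hits

audit = Trigger("audit_orders", "orders",
                "INSERT INTO audit_log SELECT * FROM inserted")
cleanup = Trigger("purge_temp", "temp_data", "DELETE FROM temp_data")

# audit_orders writes into audit_log, so it would have to be considered
# when examining data found in that object.
hits = triggers_implicating([audit, cleanup], "audit_log")
```

A positive result only signals that trigger actions must be considered during the examination; it does not by itself attribute any change.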
Design an Object-Oriented Home Inspection Application for a Portable Device
Recent advancements in personal digital assistant (PDA) Windows application programming methodology have made it easier to develop PDA applications. The release of Microsoft® Visual Studio 2005 .NET incorporated handheld programming support, while the Microsoft® Mobile® 5.0 operating system dramatically improved the PDA's operation and hardware configuration. This paper researches and analyzes object-oriented languages, relational databases, and dynamic report generation technologies for the PDA as they apply to the development of a professional home inspection application. The focus of this paper is on the implementation of the most advanced PDA technologies in the design of a high-end database PDA application.