12 research outputs found

    Perancangan Sistem Informasi Eksekutif (Studi Kasus Di UGM)

    The information-integration initiative at Universitas Gadjah Mada opens an opportunity to develop an executive information system. An executive information system can use data from the primary data owners to present university performance indicators, and knowledge of this performance is expected to support decision making at the university's executive level. Development of the executive information system begins by collecting information from the primary data owners into a data warehouse. The data are then processed to produce university performance measures, and this performance information is presented through a dashboard approach within the executive information system. Keywords: executive information system; data warehouse; MDX query; dashboard
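The KPI pipeline this abstract describes (warehouse facts aggregated into an executive indicator) can be sketched as follows. This is a hypothetical illustration; the table and field names (`faculty`, `admitted`, `graduated`) are invented, not taken from the paper.

```python
# A tiny in-memory stand-in for a fact table collected from primary data owners.
fact_students = [
    {"faculty": "Engineering", "admitted": 500, "graduated": 420},
    {"faculty": "Medicine",    "admitted": 300, "graduated": 285},
]

def graduation_rate(rows):
    """Aggregate one performance indicator across all faculties."""
    admitted = sum(r["admitted"] for r in rows)
    graduated = sum(r["graduated"] for r in rows)
    return graduated / admitted

kpi = graduation_rate(fact_students)
print(f"Graduation rate: {kpi:.1%}")  # the dashboard layer would render this figure
```

In the paper's setting the aggregation would be expressed as an MDX query against the warehouse cube rather than Python, but the shape of the computation is the same.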

    Desain Etl Dengan Contoh Kasus Perguruan Tinggi

    A Data Warehouse for higher education is a paradigm for helping top management make effective and efficient strategic decisions based on reliable, trusted reports produced from the Data Warehouse itself. A Data Warehouse is not software, hardware, or a tool; it is an environment in which the transactional database is remodelled from another view for decision-making purposes. ETL (Extraction, Transformation and Loading) is the bridge that builds the Data Warehouse and transforms data from the transactional database. Every fact and dimension table is given fields that represent the merge-loading construction used by the ETL extraction. ETL requires an ETL table and an ETL process: the ETL table provides the connectivity between tables in the OLTP database and tables in the Data Warehouse, while the ETL process transforms data from OLTP tables into Data Warehouse tables based on the ETL table. The extraction runs with a database table that differentiates ETL processes, and an ETL algorithm runs automatically during idle transactional periods, together with the daily transactional database backup, when the information system is not in use
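The central idea here, a mapping ("ETL") table that connects OLTP columns to warehouse columns and drives the transformation generically, can be sketched like this. All column names are illustrative; the paper's actual schema is not reproduced.

```python
# Hypothetical ETL table: OLTP column -> Data Warehouse column.
etl_table = {
    "nim":      "student_key",
    "nama":     "student_name",
    "fakultas": "faculty_dim",
}

# A sample row from the transactional (OLTP) database.
oltp_rows = [
    {"nim": "21/001", "nama": "Andi", "fakultas": "Teknik"},
]

def etl_process(rows, mapping):
    """Transform OLTP rows into warehouse rows, driven purely by the ETL table."""
    return [{dw_col: row[src_col] for src_col, dw_col in mapping.items()}
            for row in rows]

dw_rows = etl_process(oltp_rows, etl_table)
print(dw_rows[0]["student_key"])
```

Because the mapping lives in a table rather than in code, new source columns can be routed into the warehouse by editing data instead of the ETL program, which is the maintainability argument the abstract makes.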

    Analisis Pengaruh Loyalitas dan Stres Kerja Terhadap Motivasi Kerja Tenaga Kependidikan Di Bagian Akademik STMIK AKAKOM Yogyakarta

    This study examines the extent of the relationship between loyalty and work stress and the work motivation of educational staff in the academic division of STMIK AKAKOM. The study uses primary data collected with a questionnaire, which was tested for validity and reliability. The respondents were 19 educational staff members in the academic division of STMIK AKAKOM Yogyakarta. The data were analysed using descriptive analysis, a normality test, and multiple linear regression. The results show that loyalty and work stress explain 68% of the variation in staff motivation, while the remaining 32% is influenced by variables outside the research model; that loyalty and work stress jointly have a significant effect on work motivation (F = 17.009, significance 0.000); and that loyalty has the dominant influence on the work motivation of academic-division staff
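The analysis step reported here (multiple linear regression of motivation on loyalty and work stress, with the R² giving the 68% explained share) can be sketched as below. The numbers are fabricated toy data, not the study's 19 responses.

```python
import numpy as np

# Invented Likert-style scores for illustration only.
loyalty    = np.array([3.0, 4.0, 5.0, 2.0, 4.5])
stress     = np.array([2.0, 3.0, 1.0, 4.0, 2.5])
motivation = np.array([3.5, 4.2, 5.0, 2.1, 4.4])

# Design matrix: intercept column plus the two predictors.
X = np.column_stack([np.ones_like(loyalty), loyalty, stress])
beta, *_ = np.linalg.lstsq(X, motivation, rcond=None)

predicted = X @ beta
ss_res = np.sum((motivation - predicted) ** 2)
ss_tot = np.sum((motivation - motivation.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot  # share of motivation explained by the predictors
print(round(float(r_squared), 3))
```

With the study's real data this quantity came out at 0.68; an F-test on the regression then gives the joint-significance result quoted above.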

    Examining Quality Factors Influencing the Success of Data Warehouse

    Increased organizational dependence on data warehouse (DW) systems has driven management attention towards making data warehouse systems successful. However, the successful implementation rate of data warehouse systems is low, and many firms do not achieve their intended goals. A recent study shows that improving and evaluating data warehouse success is one of the top concerns facing IT/DW executives. Nevertheless, there is a lack of research addressing the success of data warehouse systems. It is also important for organizations to learn which qualities need to be emphasized before the actual data warehouse is built, and to determine which aspects of data warehouse success are critical, so that IT/DW executives can devise effective improvement strategies. The purpose of this study is therefore to further the understanding of the factors that are critical to evaluating the success of data warehouse systems. The study develops a comprehensive model of data warehouse success by adapting the updated DeLone and McLean IS Success Model, relating the quality factors on one side to the net benefits of the data warehouse on the other. A quantitative method was used to test the research hypotheses with data from a web-based survey. The sample consisted of 244 members of The Data Warehousing Institute (TDWI) working in a variety of industries around the world. The questionnaire measured six independent variables (system quality, information quality, service quality, relationship quality, user quality, and business quality) and one dependent variable, the net benefits of data warehouse systems.
Analysis using descriptive, factor, correlation, and regression techniques supported all hypotheses. The results indicate a statistically significant positive causal relationship between each quality factor and the net benefits of data warehouse systems, implying that the net benefits increase as the overall qualities increase. Yet little thought seems to have been given to what data warehouse success is, what is necessary to achieve it, and what benefits can realistically be expected; it therefore appears plausible that the way data warehouse success is pursued could change in the future

    Data quality maintenance in Data Integration Systems

    A Data Integration System (DIS) is an information system that integrates data from a set of heterogeneous, autonomous information sources and provides it to users. Quality in these systems consists of various factors measured on the data; commonly considered ones are completeness, accuracy, accessibility, freshness, and availability. In a DIS, quality factors are associated with the sources, with the extracted and transformed information, and with the information the DIS delivers to the user. At the same time, the user can pose quality requirements alongside his data requirements, and DIS quality is considered better the closer it is to those requirements. DIS quality depends on source data quality, on the data transformations, and on the quality required by users; it therefore varies as these three properties vary. The general goal of this thesis is to provide mechanisms for maintaining DIS quality at a level that satisfies the user quality requirements while minimizing the modifications to the system generated by quality changes. The proposal allows constructing and maintaining a DIS that is tolerant to quality changes: the DIS is built taking into account predictions of quality behavior, so that changes occurring within these predictions do not affect the system at all. The predictions are provided by models of the quality behavior of the DIS data, which must be kept up to date. With this strategy, the DIS is affected only when the quality behavior models change, instead of each time there is a quality variation in the system.
The thesis takes a probabilistic approach, which models the behavior of the quality factors at the sources and at the DIS, lets users state flexible quality requirements (using probabilities), and provides tools such as certainty and mathematical expectation that help decide which quality changes are relevant to DIS quality. The probabilistic models are monitored in order to detect source quality changes, a strategy that detects changes in quality behavior rather than only punctual quality changes. We also propose monitoring other DIS properties that affect its quality and, for each change, deciding whether it affects the behavior of DIS quality, taking the DIS quality models into account. Finally, the probabilistic approach is also applied when determining actions to take to improve DIS quality; for interpreting the DIS situation we propose using statistics that include, in particular, the history of the quality models
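The core probabilistic idea, a behavior model for a source's quality factor checked against a user's flexible (probabilistic) requirement, can be sketched as below. The quality factor (freshness), thresholds, and probabilities are all invented for illustration; the thesis's actual models are richer.

```python
# Hypothetical behavior model for one source:
# hours -> P(data age <= hours), i.e. a coarse freshness distribution.
freshness_model = {1: 0.50, 6: 0.80, 24: 0.99}

def satisfies(model, max_age_hours, min_probability):
    """User requirement: data younger than max_age with at least this probability."""
    attainable = max(p for age, p in model.items() if age <= max_age_hours)
    return attainable >= min_probability

# "Freshness under 6 hours, at least 75% of the time" holds under this model,
# so a punctual freshness dip within the model need not trigger any DIS change.
print(satisfies(freshness_model, 6, 0.75))
```

Only when monitoring shows that the model itself no longer fits the source's behavior does the system need to react, which is the change-tolerance argument made above.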

    Datenqualität in Sensordatenströmen

    The steady development of intelligent sensor systems enables the automation and improvement of complex process and business decisions in a wide range of application scenarios. Sensors can, for example, be used to determine optimal maintenance dates or to control production lines. A fundamental problem here is sensor data quality, which is limited by environmental influences and sensor failures. The goal of this work is to develop a data quality model that provides applications and data consumers with quality information for a comprehensive assessment of uncertain sensor data. In addition to data structures for efficient data quality management in data streams and databases, a comprehensive data quality algebra for computing the quality of data-processing results is presented. Furthermore, methods for data quality improvement are developed that are specifically adapted to the requirements of sensor data processing. The work is completed by approaches for user-friendly data quality querying and visualization
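A data quality algebra in the spirit of this work attaches a quality score to each reading and defines, for every operator, how the result's quality derives from the inputs' qualities. The sketch below uses a minimum-of-inputs propagation rule for an averaging operator; that rule and all numbers are illustrative assumptions, not the thesis's actual algebra.

```python
# Sensor readings carrying a quality annotation: (value, quality in [0, 1]).
readings = [
    (21.5, 0.95),
    (21.7, 0.90),
    (30.2, 0.40),  # degraded sensor: low confidence
]

def mean_with_quality(items):
    """Aggregate the values; propagate quality as the weakest input's quality."""
    values = [v for v, _ in items]
    qualities = [q for _, q in items]
    return sum(values) / len(values), min(qualities)

value, quality = mean_with_quality(readings)
print(f"mean={value:.2f}, quality={quality:.2f}")
```

A downstream consumer can then judge the aggregated value together with its derived quality instead of treating all stream results as equally trustworthy.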