Updating Data Warehouses with Temporal Data
There has been a growing trend to use temporal data in a data warehouse for making strategic and tactical decisions. The key idea of temporal data management is to make data available at the right time, at different time intervals. Storing temporal data enables this by making all the different time slices of data available to whoever needs them, so users with different data latency needs can all be accommodated. Data can be “frozen” via a view on the proper time slice, and data as of a point in time can be obtained across multiple tables or multiple subject areas, resolving consistency and synchronization issues. This paper discusses implementation topics such as temporal data updates, coexistence of load and query against the same table, performance of load and report queries, and maintenance of views against tables with temporal data.
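A minimal sketch of the “as of a point in time” pattern the abstract refers to, using SQLite through Python’s standard library; the table, column names, and dates (product_dim, valid_from, valid_to) are invented for illustration and are not taken from the paper:

```python
# Point-in-time ("as of") query over temporal data via a view on a time slice.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE product_dim (
    product_id  INTEGER,
    list_price  REAL,
    valid_from  TEXT,   -- inclusive start of the time slice
    valid_to    TEXT    -- exclusive end; '9999-12-31' marks the current row
);
INSERT INTO product_dim VALUES
    (1, 10.0, '2023-01-01', '2023-06-01'),
    (1, 12.0, '2023-06-01', '9999-12-31');

-- "Freeze" the data as of a point in time via a view on the proper slice.
CREATE VIEW product_dim_asof_2023_03_15 AS
SELECT product_id, list_price
FROM product_dim
WHERE valid_from <= '2023-03-15' AND valid_to > '2023-03-15';
""")

for row in conn.execute("SELECT * FROM product_dim_asof_2023_03_15"):
    print(row)   # -> (1, 10.0): the value in effect on 2023-03-15
```

Reports written against such a view see a consistent snapshot even while new time slices are being loaded into the underlying table.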
Toward Understanding PU and PEOU of Technology Acceptance Model
The Technology Acceptance Model (TAM) is considered one of the most popular models used in Information Systems (IS) research. Fred Davis developed this model as part of his doctoral research at MIT in 1986, and since then it has been widely used in IS research and other disciplines. The two main components of TAM are Perceived Usefulness (PU) and Perceived Ease of Use (PEOU). The model allows researchers to plug external factors into these two components, and researchers have used a variety of external factors to draw relationships with these two internal factors of the TAM model. However, most of this research has used the TAM-based internal factors without elaborating on what actually makes a system or technology useful. This research attempts to understand those two internal factors based on a literature review and a comprehensive exploratory study. The author proposes a set of external factors that are technology focused and have practical value. This work is expected to guide future researchers in using the model with proposed external factors that match the definitions of PU and PEOU.
Practical Implications of Real Time Business Intelligence
The primary purpose of business intelligence is to improve the quality of decisions while decreasing the time it takes to make them. Because focus is required on internal as well as external factors, it is critical to decrease data latency, improve report performance, and decrease system resource consumption. This article discusses the successful implementation of a BI reporting project built directly against an OLTP planning solver. The planning solver imports data concerning supply, demand, capacity, bills of materials, inventory, and the like, and then uses linear programming to determine the correct product mix to produce at various factories worldwide. The article discusses the challenges faced and a working model in which real-time BI was achieved by providing data to a separate BI server in an innovative way, resulting in decreased latency, reduced resource consumption, and improved performance. We demonstrated an alternative approach to hosting data for the BI application separately: loading the BI and solver databases at the same time, which resulted in faster access to information.
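As a hedged illustration of the linear-programming product-mix idea mentioned above (not the planning solver’s actual formulation), the sketch below maximizes profit for two hypothetical products under two shared capacity constraints using scipy.optimize.linprog; all numbers are invented:

```python
# Toy product-mix linear program: choose production quantities that maximize
# profit subject to limited shared resources.
from scipy.optimize import linprog

profit = [40.0, 30.0]            # profit per unit of product A and product B
c = [-p for p in profit]         # linprog minimizes, so negate to maximize

A_ub = [[2.0, 1.0],              # machine hours used per unit of A, B
        [1.0, 3.0]]              # labor hours used per unit of A, B
b_ub = [100.0, 90.0]             # available machine hours, labor hours

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x)                     # optimal units of A and B to produce
print(-res.fun)                  # maximized total profit
```

The real solver adds many more variables and constraints (demand, bills of materials, multiple factories), but the optimization structure is the same.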
Socioeconomic Status Measurement: An Analysis of Incorporation of Mixed Variables into Principal Component Approach
The socioeconomic status of a household in Bangladesh changes over time for many reasons, and measuring this change is an important tool in many respects. This paper examines the dynamic nature of wealth status in Bangladesh. In particular, we want to capture the overall wealth transition in rural areas of Bangladesh from 2004 to 2015. To calculate this transition, we construct a wealth index for each of the years 2004, 2009, and 2015 using the ‘poverty analysis survey data’; this survey was conducted on the same households in each of the three years. Nonlinear principal component analysis (PCA) with optimal scaling (the Gifi method) is used as the PCA tool for wealth index construction; this method is designed for data sets that contain both numerical and categorical variables. The transition of wealth is then calculated using these three wealth indices. Based on the transition results, we classify each household into one of four social groups: non-poor, ascending poor, descending non-poor, and chronically poor.
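For illustration only, the sketch below shows the general wealth-index idea using ordinary linear PCA on mixed variables with one-hot-encoded categories; this is a simplified stand-in, not the nonlinear PCA with optimal scaling (Gifi method) used in the paper, and all household variables and values are invented:

```python
# Simplified wealth-index construction: PCA on standardized numerical asset
# variables combined with dummy-coded categorical asset variables.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

households = pd.DataFrame({
    "rooms":      [1, 2, 3, 1, 4],                                   # numerical
    "land_acres": [0.0, 0.5, 1.2, 0.1, 2.0],                         # numerical
    "roof":       ["thatch", "tin", "tin", "thatch", "concrete"],    # categorical
})

num = StandardScaler().fit_transform(households[["rooms", "land_acres"]])
cat = pd.get_dummies(households["roof"]).to_numpy(dtype=float)
X = np.hstack([num, cat])

# The first principal component serves as the wealth index; households can then
# be ranked or split into quantiles, and their movement across the 2004/2009/2015
# indices tabulated as the wealth transition.
index = PCA(n_components=1).fit_transform(X).ravel()
print(index)
```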
Development of Efficient Callus Initiation of Malta (Citrus sinensis) Through Tissue Culture
The effects of different hormone concentrations on shoot formation and callus induction were studied in BARI Malta-1 (Citrus sinensis). Seeds with and without the seed coat were cultured on Murashige and Skoog (MS) media supplemented with 6-benzyl adenine (BA) and kinetin (KIN). Removal of the seed coat produced an earlier response for shoot formation. The highest shoot formation (70%) was obtained from seeds without the seed coat on MS basal media + 1.0 mg/l BA, while KIN showed no response for shoot formation at any of the supplemented concentrations. For callus induction, internodes and apical shoot tips were used as explants and 2,4-dichlorophenoxyacetic acid (2,4-D) was used as the callus-inducing hormone. MS basal media supplemented with 2.0 mg/l 2,4-D showed the highest (68%) callus induction.
DOI: http://dx.doi.org/10.3329/ijarit.v1i1-2.13935 Int. J. Agril. Res. Innov. & Tech. 1 (1&2): 64-68, December, 201
An ETL Metadata Model for Data Warehousing
Metadata is essential for understanding information stored in data warehouses, and it helps increase adoption and usage of data warehouse data by knowledge workers and decision makers. A metadata model is important to the implementation of a data warehouse; the lack of one can lead to quality concerns about the data warehouse, and a highly successful data warehouse implementation depends on consistent metadata. This article proposes adoption of an ETL (extract-transform-load) metadata model for the data warehouse that makes subject area refreshes metadata-driven, loads observation timestamps and other useful parameters, and minimizes consumption of database system resources. The ETL metadata model provides developers with a set of ETL development tools and delivers a user-friendly batch cycle refresh monitoring tool for the production support team.
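A minimal, hypothetical sketch of what a metadata-driven subject area refresh can look like; the metadata fields (subject_area, load_proc, last_observation_ts, enabled) and the placeholder loader functions are assumptions for illustration, not the model defined in the article:

```python
# A tiny batch cycle in which metadata, not hard-coded logic, decides which
# subject areas are refreshed and records their observation timestamps.
from datetime import datetime, timezone

etl_metadata = [
    {"subject_area": "sales",     "load_proc": "load_sales",
     "enabled": True,  "last_observation_ts": None},
    {"subject_area": "inventory", "load_proc": "load_inventory",
     "enabled": False, "last_observation_ts": None},
]

def load_sales(since):      # placeholder for the real extract/transform/load step
    print(f"loading sales rows observed after {since}")

def load_inventory(since):  # placeholder
    print(f"loading inventory rows observed after {since}")

LOADERS = {"load_sales": load_sales, "load_inventory": load_inventory}

def run_batch_cycle():
    """Refresh every enabled subject area and stamp its observation timestamp."""
    for entry in etl_metadata:
        if not entry["enabled"]:
            continue                      # the metadata decides what runs
        LOADERS[entry["load_proc"]](entry["last_observation_ts"])
        entry["last_observation_ts"] = datetime.now(timezone.utc)

run_batch_cycle()
```

In a production setting this metadata would live in database tables rather than in code, which is what makes a separate batch cycle monitoring tool possible.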
Factors Affecting Big Data Technology Adoption
With advances in computer science, hardware and software engineering, and computing power, and later with the advent of the internet, social networking tools, and other sources such as sensors, data growth has increased significantly. These data are called big data: they are mostly unstructured, generated in large volumes, and need to be captured in near real time. A completely new set of tools and technologies is emerging to handle big data. I have studied the big data literature to identify the factors that might influence big data adoption and was able to list quite a few factors or attributes that might be evaluated to understand the acceptance of big data technologies. These factors were identified from the literature review and classified under different dimensions. I have developed a draft research model to understand the factors affecting big data technology acceptance, along with a brief outline of model validation, survey instrument validation, and data collection methods. I would like to present my research proposal at this Symposium to get valuable feedback from the audience.