17 research outputs found

    Data quality enhancement in oil reservoir operations : an application of IPMAP

    Thesis (S.M. in Engineering and Management)--Massachusetts Institute of Technology, Engineering Systems Division, System Design and Management Program, 2012. Cataloged from PDF version of thesis. Includes bibliographical references (p. 70-71). This thesis presents a study of data quality enhancement opportunities in the upstream oil and gas industry. The Information Product MAP (IPMAP) methodology is applied to reservoir pressure and reservoir simulation data to propose data quality recommendations for the company under study. In particular, a new four-step methodology for examining data quality in reservoir pressure management systems is proposed: 1. Trace the data flow and draw the IPMAP; 2. Highlight the cross-system and organizational boundaries; 3. Select analytical data quality questions based on a review of the data quality literature; 4. Apply the analytical questions at each boundary and document the results. This methodology is applied to the three management systems used to collect a pressure survey: a spreadsheet, a standardized database and an automated database. IPMAPs are drawn for each of these three systems and the cross-system and organizational boundaries are highlighted. Next, the systematic data quality questions are applied. As a result, three data quality problems are identified and documented, concerning the well identifier number, wellbore data and the reservoir datum. The second experiment investigates data quality issues in the scope of reservoir simulation and forecasting. A high-level IPMAP and a process flow for reservoir simulation and forecasting are generated; the analysis then elaborates on this high-level process flow and drills into the process flow for simulation. The analytical data quality questions are applied to this second, simulation-level process flow and limited findings are documented. The thesis concludes with lessons learned and directions for future research. by Paul Hong-Yi Lin. S.M. in Engineering and Management
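
    A minimal sketch of the four-step idea, not taken from the thesis: an IPMAP is represented as a directed graph whose nodes carry the hosting system and owning organizational unit, so that cross-system and organizational boundaries can be highlighted (step 2) and illustrative analytical questions applied at each boundary (steps 3-4). All node names, systems and questions below are hypothetical.

    ```python
    # Sketch only: IPMAP as a small directed graph with boundary detection.
    from dataclasses import dataclass, field

    @dataclass
    class Node:
        name: str
        system: str       # e.g. "spreadsheet", "standardized DB", "automated DB"
        org_unit: str     # owning organizational unit

    @dataclass
    class IPMap:
        nodes: dict = field(default_factory=dict)
        edges: list = field(default_factory=list)   # (source, target) pairs

        def add(self, node: Node):
            self.nodes[node.name] = node

        def connect(self, src: str, dst: str):
            self.edges.append((src, dst))

        def boundaries(self):
            """Step 2: yield edges that cross a system or organizational boundary."""
            for src, dst in self.edges:
                a, b = self.nodes[src], self.nodes[dst]
                if a.system != b.system or a.org_unit != b.org_unit:
                    yield src, dst, a.system != b.system, a.org_unit != b.org_unit

    # Steps 3-4: analytical questions applied at each boundary (illustrative only).
    QUESTIONS = [
        "Is the well identifier preserved unchanged across the boundary?",
        "Is the wellbore reference carried along with the pressure value?",
        "Is the reservoir datum converted consistently on both sides?",
    ]

    if __name__ == "__main__":
        ipmap = IPMap()
        ipmap.add(Node("field gauge reading", "spreadsheet", "operations"))
        ipmap.add(Node("pressure survey record", "standardized DB", "reservoir engineering"))
        ipmap.connect("field gauge reading", "pressure survey record")
        for src, dst, sys_x, org_x in ipmap.boundaries():
            print(f"{src} -> {dst}: cross-system={sys_x}, cross-organization={org_x}")
            for q in QUESTIONS:
                print("  ?", q)
    ```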

    Acceptance model of SaaS cloud computing at northern Malaysian main campus public universities

    Technology advancement has side effects, although it has moved at a fast pace that has facilitated life and increased business revenue. To cope with these negative aspects while looking for friendlier technology, Software as a Service (SaaS) cloud computing emerged to preserve natural resources and to utilize computing and power consumption effectively, while achieving performance, decreasing cost and increasing revenue. Yet there is a paucity of empirical studies investigating the salient factors affecting the usage, acceptance or adoption of SaaS services from the individual perspective, specifically in the higher education sector. The main objective of this study is to investigate these salient factors with a model that includes technical, social and control characteristics, as well as user security predisposition. Educational level has also been shown to influence the adoption of innovations, so probing its role is another objective. The last objective is to investigate differences between the student and lecturer groups in the relationships postulated in the model. A questionnaire survey was conducted among students and lecturers at four public universities in Northern Malaysia. The scope of the acceptance study is personal-level use of SaaS services. The Decomposed Theory of Planned Behaviour (DTPB) and the Diffusion of Innovation (DOI) theory were applied. Results revealed the appropriateness of the model, although the roles of Trialability and Subjective Norms were not significant. The findings contribute to the body of knowledge and the literature by highlighting the role of these factors, which SaaS providers could benefit from when planning new services and promoting SaaS usage to universities.
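
    A minimal illustrative sketch, not the study's analysis: fitting a simple linear path from a few DTPB/DOI-style construct scores to an intention-to-use outcome, separately for the two respondent groups, to show the kind of group-wise comparison the model implies. The construct names, effect sizes and data below are entirely hypothetical.

    ```python
    # Sketch only: group-wise path coefficients on synthetic construct scores.
    import numpy as np

    rng = np.random.default_rng(0)

    def fit_paths(X, y):
        """Ordinary least squares with an intercept; returns coefficients."""
        X1 = np.column_stack([np.ones(len(X)), X])
        beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
        return beta

    constructs = ["relative_advantage", "trialability", "subjective_norm", "security_predisposition"]

    for group in ("students", "lecturers"):
        n = 200
        X = rng.normal(size=(n, len(constructs)))   # standardized construct scores
        # Hypothetical generating model: trialability and subjective norm weak,
        # echoing the non-significant roles reported in the abstract.
        y = (0.5 * X[:, 0] + 0.05 * X[:, 1] + 0.05 * X[:, 2] + 0.3 * X[:, 3]
             + rng.normal(scale=0.5, size=n))
        beta = fit_paths(X, y)
        print(group, dict(zip(constructs, np.round(beta[1:], 2))))
    ```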

    A framework for managing timetable data quality within the NMMU

    This dissertation investigates the factors influencing timetable quality, not only from a data quality perspective, but also from an information quality perspective which takes into account the quality of the business processes involved in creating the timetable. The Nelson Mandela Metropolitan University was used as a case study for assessing the quality of the timetabling process, the quality of the source data, and the quality of the final timetable produced. A framework for managing data quality during the timetabling process is proposed, based on reviews of data quality management best practices and data quality aspects. Chapter 1 introduces the current Nelson Mandela Metropolitan University timetable, motivates why data quality management is essential to its success, and presents the scope and research objectives of the dissertation. Chapter 2 covers a literature study of business process and data quality management best practices; the common thread through all the management methodologies investigated was top management involvement in, and commitment to, continuously improving the quality of data. Chapter 3 discusses various characteristics of data quality. Quality is determined by whether the end result meets the quality requirements for which it was intended; hence each system can have quality aspects that are unique to it. Chapter 4 explains various research designs and which of them were followed for this dissertation: a combination of literature studies, a questionnaire and a case study. Chapter 5 is a case study of the data quality and timetabling processes used at the Nelson Mandela Metropolitan University, based on the research design described in Chapter 4. The business processes currently followed in setting up the timetable are presented, as well as a proposed timetabling process that should produce a better quality timetable for the Nelson Mandela Metropolitan University. The data quality aspects most pertinent to the university are determined to be timeliness, accountability, integrity and consistency, and the most probable causes of poor timetable quality relate to uniform technology, processes, ownership and the use of a common terminology. Chapter 6 presents a framework for managing timetable data quality at the Nelson Mandela Metropolitan University using an Information Product Map approach that will ensure a better quality timetable; future research is also proposed. It is evident from this dissertation that the quality of the source data as well as the quality of the business processes involved is essential for producing a timetable that satisfies the requirements for which it was intended. The management framework proposed for the Nelson Mandela Metropolitan University timetabling process can potentially be used at other institutions as well.
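
    A minimal sketch, not the NMMU framework itself: rule-based checks for the four data quality dimensions the case study singles out (timeliness, accountability, integrity and consistency) applied to hypothetical timetable records. The record layout, field names and rules are assumptions for illustration only.

    ```python
    # Sketch only: dimension-by-dimension checks on hypothetical timetable records.
    from datetime import date

    timetable = [
        {"module": "WRAV301", "venue": "E123", "owner": "Faculty of Science",
         "submitted": date(2010, 11, 1), "deadline": date(2010, 11, 15),
         "capacity": 80, "enrolled": 95},
        {"module": "ACCT101", "venue": None, "owner": None,
         "submitted": date(2010, 12, 2), "deadline": date(2010, 11, 15),
         "capacity": 200, "enrolled": 180},
    ]

    def check(record):
        issues = []
        if record["submitted"] > record["deadline"]:
            issues.append("timeliness: submitted after deadline")
        if not record["owner"]:
            issues.append("accountability: no owning department recorded")
        if record["venue"] is None:
            issues.append("integrity: mandatory venue field missing")
        if record["enrolled"] > record["capacity"]:
            issues.append("consistency: enrolment exceeds venue capacity")
        return issues

    for rec in timetable:
        print(rec["module"], check(rec) or "ok")
    ```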

    Process-driven data and information quality management in the financial service sector

    Highly regulated sectors face challenges in data and information quality management (DIQM) as they conform to increasing regulation. As the financial service sector is the most highly regulated industry, we are interested in its current and future DIQM challenges. For a sustained improvement, data quality should be managed in a process-driven way. Process-driven data quality management (PDDQM) provides continuous improvement of data quality by redesigning the processes that create or modify data; business process management (BPM) is therefore a basis for PDDQM. In an information systems context, enterprise resource planning (ERP) systems offer a platform for integrating processes and data. We examine market developments and IT trends by conducting semi-structured expert interviews with participants in IT-strategic decision making. We present current trends in the insurance sector and identify three main DIQM challenges: the IT-independent management of data, an increasing need to engage in PDDQM, and the guidance of existing and future measures by a data governance framework.

    A Neural Network Approach to Border Gateway Protocol Peer Failure Detection and Prediction

    The size and speed of computer networks continue to expand at a rapid pace, as do the corresponding errors, failures, and faults inherent within such extensive networks. This thesis introduces a novel approach that interfaces Border Gateway Protocol (BGP) computer networks with neural networks to learn the precursor connectivity patterns that emerge prior to a node failure. Details of the design and construction of a framework that utilizes neural networks to learn and monitor BGP connection states, as a means of detecting and predicting BGP peer node failure, are presented. Moreover, this framework is used to monitor a BGP network, and a suite of tests is conducted to establish this neural network approach as a viable strategy for predicting BGP peer node failure. In all of the experiments performed, both of the proposed neural network architectures succeed in memorizing and utilizing the network connectivity patterns. Lastly, a discussion of the framework's generic design is presented to show how other types of networks and alternate machine learning techniques can be accommodated with relative ease.
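
    A minimal sketch of the general idea, not the thesis framework or its architectures: a tiny feed-forward classifier trained on a window of BGP session features (for example update rate, withdrawal count, keepalive gaps) to flag windows that precede a peer failure. The features, labels and data below are synthetic assumptions.

    ```python
    # Sketch only: small MLP on synthetic BGP session-feature windows.
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic training data: 3 features per window; label 1 = failure follows.
    X = rng.normal(size=(500, 3))
    y = (X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.3, size=500) > 1.0).astype(float)

    # One hidden layer, trained with plain gradient descent on cross-entropy loss.
    W1, b1 = rng.normal(scale=0.1, size=(3, 8)), np.zeros(8)
    W2, b2 = rng.normal(scale=0.1, size=(8, 1)), np.zeros(1)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    for _ in range(2000):
        h = np.tanh(X @ W1 + b1)
        p = sigmoid(h @ W2 + b2).ravel()
        grad_out = (p - y)[:, None] / len(X)     # dLoss/dlogit for cross-entropy
        dW2, db2 = h.T @ grad_out, grad_out.sum(0)
        dh = grad_out @ W2.T * (1 - h ** 2)      # backprop through tanh layer
        dW1, db1 = X.T @ dh, dh.sum(0)
        for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
            param -= 0.5 * grad                  # in-place parameter update

    pred = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2).ravel() > 0.5
    print("training accuracy:", (pred == y.astype(bool)).mean())
    ```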

    BGP-Multipath Routing in the Internet

    BGP-Multipath, or BGP-M, is a routing technique for balancing traffic load in the Internet. It enables a Border Gateway Protocol (BGP) border router to install multiple ‘equally-good’ paths to a destination prefix. While other multipath routing techniques are deployed at internal routers, BGP-M is deployed at border routers, where traffic is shared across multiple border links between Autonomous Systems (ASes). Although there is a considerable body of research on multipath routing, there is so far no dedicated measurement or study of BGP-M in the literature. This thesis presents the first systematic study of BGP-M. I proposed a novel approach to inferring the deployment of BGP-M by querying Looking Glass (LG) servers, conducted a detailed investigation of BGP-M deployment in the Internet, and analysed BGP-M's routing properties based on traceroute measurements using RIPE Atlas probes. My research reveals that BGP-M is already in use in the Internet. In particular, Hurricane Electric (AS6939), a Tier-1 network operator, has deployed BGP-M at border routers across its global network to hundreds of its neighbour ASes on both the IPv4 and IPv6 Internet. This research provides state-of-the-art knowledge and insights into the deployment, configuration and operation of BGP-M. The data, methods and analysis introduced in this thesis can be immensely valuable to researchers, network operators and regulators interested in improving the performance and security of Internet routing. This work raises awareness of BGP-M and may promote wider deployment in the future, because BGP-M not only provides all the benefits of multipath routing but also has distinct advantages in terms of flexibility, compatibility and transparency.
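
    A minimal sketch of the inference idea, not the thesis tooling: count how many paths to a prefix a Looking Glass response reports as installed with a multipath flag; two or more suggests BGP-M is active at that border router. The response text below is a hypothetical, simplified vendor-style output, not real LG data.

    ```python
    # Sketch only: detect multiple 'equally-good' installed paths in LG-style output.
    LG_OUTPUT = """\
    BGP routing table entry for 203.0.113.0/24
    Paths: (3 available, best #1)
      6939 64500, from 192.0.2.1  (multipath)
      6939 64501, from 192.0.2.2  (multipath)
      3356 64500, from 198.51.100.1
    """

    def installed_multipath_lines(lg_text: str):
        """Return the path lines flagged as multipath in the LG response."""
        return [line.strip() for line in lg_text.splitlines() if "(multipath)" in line]

    paths = installed_multipath_lines(LG_OUTPUT)
    print(f"BGP-M inferred: {len(paths) >= 2} ({len(paths)} multipath-flagged paths installed)")
    ```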

    The Role of Expansion Movement in the Establishment of New Region in Indonesia: a Study of Parigi Moutong Regency

    The study explains the resource-mobilization dimension of the political movement for establishing new regions in Indonesia. The establishment of new regions has usually been viewed only in terms of the use of formal structures; in fact, the involvement of non-formal organizations also contributes significantly and can determine a region's expansion. The study employed a qualitative approach supported by primary and secondary data related to the establishment of Parigi Moutong Regency. The primary data were obtained through in-depth interviews with figures of the expansion movement; the secondary data were obtained from mass media and government agencies as well as personal documentation. The theoretical lens was the resource-mobilization dimension of political opportunity structure theory (POST). The study reveals that the success of the expansion movement in Parigi Moutong Regency rested on the mobilization of structural resources by civil society organizations, moving from non-formal to formal institutions and building up pressure through lobbying based on personal, professional and primordial networks. The influence of national political reform motivated the mobilization of movement resources, repeating the pattern of the earlier expansion movement in Parigi Moutong Regency.

    TINTE - A two-dimensional code for reactor dynamics

    The TINTE main documentation consists of three parts, the first two of which /1/, /2/ were published (in German) in the late eighties. The first part discusses in detail the problems of modelling the nuclear and thermo-gas-dynamic behaviour of the primary circuit of a high temperature gas cooled reactor (HTGR) and explains how the multi-connected system can be decomposed into single tasks to be solved separately; the solution of the total system is then found by iteration of the partial results. The second part of the documentation demonstrates some major applications of TINTE. Among them, the analyses of dynamic reactor experiments performed at the AVR reactor /4/ are of special interest: these results play a major role in the TINTE validation process, and the very good agreement obtained with the experimental data validates the TINTE calculations to a considerable extent. Earlier post-calculations of the AVR experiments with a previous version of the TINTE code are described in /5/, and the basic algorithms used in TINTE, together with some applications, are shown in /4/. Since not all of the capabilities of TINTE could be addressed in these analyses, the validation process was continued, e.g. with the evaluation of the VELUNA corrosion experiments /6/. An addendum to the principal considerations of /1/ has been added as a supplement to /2/. It describes the gas flow in an optional 1-D component and flow network, which may be used to enhance the 2-D reactor model for special situations. This flow network was necessary to model non-central pipes and other three-dimensional gas flow paths. It allows the description of co-axial pipes and a lumped-parameter simulation of the primary side of heat exchangers or steam generators. One example in /4/ shows that, under certain limitations, even a simulation of a gas-gas heat exchanger and the incorporation of the secondary loop is possible with the aid of that flow network. The addendum also introduces the possibility of calculating the pressure inside the reactor from a given (fixed or time-variable) gas inventory. This is relevant for accident analyses in which a failure of the pressure enclosure is assumed; if the pressure increases significantly, gas may be removed from the system by burst discs or safety valves. This document starts with a description of the TINTE code structure (Section 4), while Section 5 is dedicated to the description and interpretation of the main input data. Section 6 deals with the preparation of the nuclear data base, the generation of the cross-section polynomial expansions and the necessary interface codes. Aspects included here are the evaluation of nuclide vectors (prepared by burn-up codes) and the preparation of spectrum calculations with variation of temperatures, buckling and concentrations for spectrum-relevant nuclides. User notes on code installation and calculational procedures are presented in Section 7, while Sections 8 and 9 discuss the TINTE control options and output data options, respectively. Section 10 lists the changes made in the TINTE source code over the years. The report also includes in the Appendices some newer algorithms for the treatment of special situations, and a description of the correlations used for the heat capacities and thermal conductivities is also given. Of special note is Appendix E, which details the ROMO model newly implemented in TINTE in 2004.
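
    A minimal illustrative sketch, not TINTE's algorithms or data: the decomposition-and-iteration idea shown as a Picard iteration between a toy power solve and a toy temperature solve, followed by an ideal-gas estimate of pressure from a given helium inventory, echoing the addendum's pressure-from-inventory option. All models, coefficients and numbers are assumptions for illustration.

    ```python
    # Sketch only: iterate two decoupled tasks until the coupled solution converges,
    # then compute pressure from a gas inventory with the ideal gas law.
    R_SPECIFIC_HE = 2077.0   # J/(kg K), specific gas constant of helium

    def power_from_temperature(T):
        # Toy reactivity feedback: power falls as temperature rises.
        return 100.0 / (1.0 + 0.002 * (T - 900.0))

    def temperature_from_power(P):
        # Toy heat balance: temperature rises with power.
        return 900.0 + 1.5 * (P - 100.0)

    def coupled_solution(T=1000.0, tol=1e-8, max_iter=100):
        for _ in range(max_iter):
            P = power_from_temperature(T)      # task 1: nuclear part
            T_new = temperature_from_power(P)  # task 2: thermo-gas-dynamic part
            if abs(T_new - T) < tol:
                return P, T_new
            T = T_new
        return P, T

    def pressure_from_inventory(mass_kg, volume_m3, temperature_k):
        """Ideal-gas pressure (Pa) for a given gas inventory."""
        return mass_kg * R_SPECIFIC_HE * temperature_k / volume_m3

    P, T = coupled_solution()
    print(f"converged power {P:.2f} %, temperature {T:.1f} K")
    print(f"pressure for 1000 kg He in 500 m^3: "
          f"{pressure_from_inventory(1000, 500, T) / 1e5:.1f} bar")
    ```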

    Globalization and the Texas metropolises: competition and complementarity in the Texas Urban Triangle

    This dissertation examines relationships between cities, and more specifically the largest Texas cities, and the global economy. Data on headquarters location and corporation sales over a 20-year period (1984-2004) supported the hypothesis that globalization is not homogeneous, regular or unidirectional, but instead shows contrasting phases. Texas cities have been rising in the global rankings, due to corporate relocations and, to a lesser extent, the growth of local activities. By 2004, Dallas and Houston ranked among the top-20 headquarters cities measured by corporation sales. The Texas Urban Triangle held one of the major global concentrations of oil- and computer-related corporate headquarters; conversely, key sectors such as banking, insurance and automotive were not significant. Standardized employment data for major U.S. metropolitan areas were examined through principal components analyses. Overall, larger places showed higher degrees of diversity and no trend toward economic convergence. The TUT also presented a degree of intra-regional diversity comparable to other urban regions. The findings confirmed the relevance of oil- and information-related activities, along with construction, and the weakness of activities linked to finance and corporate management. Traffic and air linkages in Texas cities were contrasted with other American gateways. Dallas and Houston have been major nodes in global air transportation, with very important roles as transit hubs for domestic (the former) and short international (the latter) flights. For long-haul international traffic both cities were second-level American gateways, with Houston better connected to Western Europe and Mesoamerica, and Dallas to South America and East Asia. Dallas's central location strengthened its role in the domestic market, as the center of one of the five major subsystems in the country and a top gateway in enplanements, number of linkages and connectivity measures. The hierarchical organization of the Texas air travel network was relatively unbalanced, with two strong nodes at the top, three middle nodes of little relevance, and several very poorly interconnected gateways at the bottom. Finally, the high supply of regional flights between the primary destinations, namely Dallas and Houston, resulted in significant time-space convergence effects. Such effects were only found between highly-connected major gateways and completely bypassed other places, independently of their size and relative location.
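
    A minimal sketch of the analytical step, not the dissertation's data or results: a principal components analysis of a standardized metro-by-sector employment matrix, the kind of decomposition used to compare metropolitan economic profiles. The metros, sectors and employment shares below are hypothetical.

    ```python
    # Sketch only: PCA of a standardized employment matrix via SVD.
    import numpy as np

    metros = ["Dallas", "Houston", "San Antonio", "Austin"]
    sectors = ["oil_gas", "information", "construction", "finance", "manufacturing"]
    employment_share = np.array([
        [0.05, 0.12, 0.08, 0.09, 0.10],
        [0.15, 0.06, 0.09, 0.07, 0.11],
        [0.03, 0.05, 0.10, 0.08, 0.09],
        [0.02, 0.16, 0.09, 0.06, 0.08],
    ])

    # Standardize each sector, then take principal components.
    Z = (employment_share - employment_share.mean(0)) / employment_share.std(0)
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = U * S                      # metro coordinates on the components
    explained = S**2 / (S**2).sum()     # share of variance per component

    print("variance explained:", np.round(explained, 2))
    for metro, sc in zip(metros, scores[:, :2]):
        print(f"{metro:12s} PC1={sc[0]:+.2f}  PC2={sc[1]:+.2f}")
    ```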