10 research outputs found

    DATA MIGRATION IN THE CONTEXT OF ERP SYSTEMS

    At the heart of every ERP system is a single database that allows employees of the organization to rely on the same consistent set of information. Data migration is an important component of ERP upgrade, implementation, and integration projects. At the same time, a migration scenario can be complex and lengthy, requiring substantial resources and a high level of competence from management staff. Underestimating the required time and effort can lead to a significant increase in costs and delay the commissioning of the ERP. The accuracy and completeness of the migrated data are also of great importance, since many aspects of the business (customer satisfaction, decision-making, the supply chain, and relationships with partners) depend on data quality. Despite this, the complexity of data migration scenarios is traditionally underestimated. Most existing studies consider data migration mainly from the technical side. Aspects related to the conceptual content of data migration, its relationship with business processes and company management, and the specific role of data migration in ERP upgrade, implementation, and integration projects remain insufficiently developed. The aim of the study. The aim of this study is to extend theoretical understanding of the content, diversity, problems, and strategies of data migration in the context of ERP systems. Materials and methods. The article summarizes and systematizes the types, project stages, key strategies, and most significant problems of data migration. For the purposes of this study, material from 23 sources on related topics was reviewed, revised, and supplemented. Results. The paper describes classifications of data migration types and provides examples related to ERP. The stages of a data migration project are described in detail and supplemented. Two key data migration strategies are compared, their advantages and disadvantages are highlighted, and recommendations for applying each strategy are formulated. The main problems of data migration in the context of ERP systems, and the consequences of these problems for the entire migration project, are considered. Conclusion. The results obtained suggest that data migration is a complex and labor-intensive process that requires serious competence from managers and practitioners. The migration strategy should be developed rigorously, taking into account the full variety of influencing factors.
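    To make the project stages concrete, the following is a minimal sketch of the phases such a migration typically passes through (extract, transform, validate, load, reconcile). It is an illustration only, not the article's method: the table names, field mapping, and validation rule are hypothetical, and SQLite stands in for the legacy and target ERP databases.

```python
import sqlite3

# SQLite stands in for the legacy and target ERP databases in this sketch.
legacy = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

legacy.execute("CREATE TABLE customers (id INTEGER, name TEXT, country TEXT)")
legacy.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                   [(1, "Acme", "de"), (2, "", "fi")])
target.execute("CREATE TABLE partners "
               "(partner_id INTEGER, partner_name TEXT, country_code TEXT)")

# Extract: pull the records out of the legacy schema.
rows = legacy.execute("SELECT id, name, country FROM customers").fetchall()

# Transform: apply the field mapping and normalize values.
transformed = [(pk, name.strip(), country.upper()) for pk, name, country in rows]

# Validate: records that break a business rule go to a rejects list for review.
valid, rejects = [], []
for record in transformed:
    (valid if record[1] else rejects).append(record)

# Load: only validated records enter the target schema.
target.executemany("INSERT INTO partners VALUES (?, ?, ?)", valid)

# Reconcile: compare counts so data loss is detected, not discovered later.
loaded = target.execute("SELECT COUNT(*) FROM partners").fetchone()[0]
print(f"extracted={len(rows)} loaded={loaded} rejected={len(rejects)}")
```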

    Re-engineering a database driven software tool: Rebuilding, automating processes and data migration

    This thesis aims to re-engineer a database-driven software tool that is used to enter engine-related data and generate an EIAPP Technical File, which is needed to certify that marine engines comply with IMO's emission regulations specified in MARPOL Annex VI and NTC 2008. The need for an updated tool has emerged because the way of working is changing from document management to content management. The current tool is also divided into two different tools, one for engines built in Italy and one for engines built in Finland, which leads to another objective: to merge these tools into one. The tools are built in Microsoft Access, which no longer suits the needs. Therefore, the last purpose of the research is to conduct a data migration from Microsoft Access to SQL Server. The research was divided into theoretical and empirical parts. The theoretical part first presented the theory behind software engineering and software re-engineering, then explored the theory behind databases and data migration, and finally reviewed the emission regulations and certification of marine diesel engines to better understand why the tool is needed. In the empirical part, the existing tool and the certification process were first inspected. The research method, the constructive research approach, which focuses on producing a construction (a solution) to a real-world problem in practice, was then discussed. Finally, a more in-depth analysis of the tool was made to propose a plan for re-engineering it, including an implementation process plan. The main result of this research is a re-engineered EIAPP tool with its front-end in Microsoft Access and its back-end in SQL Server. The tables have been restructured to comply with the change to use only one document number for the whole Technical File. The forms have been redesigned and processes have been automated to make the tool more reliable and efficient. The new re-engineered tool has more than 50% fewer objects and fewer lines of code than the two existing tools. In addition, the research provides suggestions on how to further develop the certification process and the tool.
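    As an illustration of the Access-to-SQL Server migration step, the following is a minimal sketch using the pyodbc driver. The file path, server, and the engine table and columns are hypothetical placeholders, not taken from the thesis.

```python
import pyodbc

# Hypothetical legacy Access file and target SQL Server database.
access = pyodbc.connect(
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"DBQ=C:\tools\eiapp_legacy.accdb")
sqlserver = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=EIAPP;Trusted_Connection=yes")

# Read the legacy engine records from Access.
rows = access.execute(
    "SELECT EngineSerial, EngineType, RatedPower FROM Engines").fetchall()

# Insert them into the restructured SQL Server back-end in batches.
cur = sqlserver.cursor()
cur.fast_executemany = True  # batch the inserts instead of sending row by row
cur.executemany(
    "INSERT INTO dbo.Engines (EngineSerial, EngineType, RatedPower) "
    "VALUES (?, ?, ?)",
    [tuple(r) for r in rows])
sqlserver.commit()
print(f"migrated {len(rows)} engine records")
```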

    Database migration processes and optimization using BSMS (bank staff management system)

    Databases are fundamentally a storage technology designed to carry out tasks that depend on complex data, and data integrity is essential to carrying out those tasks. For many companies, databases are literally an electronic representation of the company's business, and losing any piece of data during migration is unacceptable. There are various business reasons for migrating data, among them archiving, data warehousing, and moving to a new environment, platform, or technology. Database migration is a complex, multi-phase process that usually includes assessment, database schema conversion, data migration, and functional testing. Online Transaction Processing (OLTP) databases are typically highly normalized for efficiency, serving goals such as ensuring data integrity, eliminating data redundancy, and reducing record locking. However, this design approach yields a large number of tables, and each of these tables and their foreign-key constraints must be taken into account at the point of data migration. Moreover, unlike traditional tasks, the acceptance criterion for data migration work is a full 100%, because errors are not tolerated in databases and quality is essential. This thesis presents the challenges and concerns that arose while transferring data from a slow, inefficient, and outdated database platform called Paradox to a far more advanced database called Oracle, migrating the data successfully. An indexing technique was used to improve query performance, retrieving the data quickly without any inconsistency or data loss.
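    As an illustration of the indexing technique the abstract mentions, the following is a minimal sketch using the python-oracledb driver. The connection details and the bank staff table and columns are hypothetical placeholders, not taken from the thesis.

```python
import oracledb

# Hypothetical connection to the migrated Oracle database.
conn = oracledb.connect(user="bsms", password="secret",
                        dsn="localhost/FREEPDB1")
cur = conn.cursor()

# Without an index, a lookup by staff number scans the whole table.
# A B-tree index lets Oracle locate matching rows directly.
cur.execute("CREATE INDEX idx_staff_no ON bank_staff (staff_no)")

# The same query now resolves through the index instead of a full scan.
cur.execute("SELECT name, branch FROM bank_staff WHERE staff_no = :n", n=1042)
print(cur.fetchall())
```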

    Large Software Implementation Project: A study of software development and project management literature

    This study focuses on large-scale software delivery, where development is done on top of an existing system or in parallel to it. The thesis aims to answer the question: how should a large-scale custom solution be implemented? Large-scale projects take longer than smaller projects to implement and are usually done in more than one release. The application's life-cycle is also planned to last up to decades. Large projects also need special project management skills, executive support, internal investments, strategic vision, and alignment between IT and business. Large projects are usually complex and have several dependencies. This study also explains what issues such projects usually have and what the constraints of legacy systems and data migration are. Different eras of IT systems are presented, as well as reasons why companies should invest in IT solutions. The fundamentals and background of the Waterfall model and Agile methodology are presented briefly. From the Agile methodology, the Scrum and SAFe frameworks are presented as examples. Keywords: Legacy system, Data Migration, Software implementation, project management, COTS, Agile development, Waterfall

    Assessing the influence of additive manufacturing and digital inventories on an Oil & Gas company's inventory management strategy

    The oil and gas (O&G) industry is increasingly considering additive manufacturing (AM) for spare part management. As O&G companies seek to align their inventory management (IM) strategies, adopting digital inventories (DI) in combination with AM has emerged as a promising solution to current IM challenges in the industry. O&G operators may leverage AM and DI to reduce physical inventories, supply chain risk, inventory-related costs, and lead times. However, approaches that describe how companies may implement DI in operations and existing IM strategies are lacking. As such, this thesis investigates how Norwegian O&G operators may integrate DI into existing IM strategies. Seven companies that have experience with AM and currently participate in a DI ecosystem were interviewed as part of the research for the thesis. The aim of the interviews was to explore the current challenges, best practices, and essential aspects of enabling the successful implementation of DIs. The interview findings are compared against relevant literature to understand the DI ecosystem comprehensively. Key challenges in the ecosystem are unsuitable procurement processes, a lack of commercial models, AM spare part (SP) qualification, and a lack of demand for AM projects. ERP integration and assessments of the AM suitability of stored SP are discussed as key enablers of DI implementation. The benefits and challenges associated with ERP integration are discussed, providing insight into how O&G companies may utilise existing ERP systems to maximise the value of DI. A set of frameworks for identifying SP with characteristics suitable for AM is presented. The frameworks may be used to reduce physical inventories. Two decision models visualising critical decisions and stages for an SP ordered through a DI are presented. The first decision model considers decisions and stages for a given SP ordered through a DI. The second model may be adopted by O&G operators aiming to reduce physical inventories. In conclusion, the thesis's main contribution is expanding knowledge of implementing DI as part of IM strategies for Norwegian O&G operators. The findings highlight a need to address current procurement processes and develop commercial models attractive to all actors in the DI ecosystem. ERP integration and assessment of stored SP suitable for AM are identified as essential enablers of DI implementation.

    Website development project with Joomla 3 Content Management System

    Content management systems (CMS) are probably the most popular frameworks for building content-rich websites that require extensive editorial tools. Joomla is currently the second most popular CMS, and over the years it has become an extremely robust and secure platform thanks to its active and thriving developer community. This thesis describes the re-development process of a website called "Freedom for Sale". The website was built with Joomla, and the implementation covered the entire development life-cycle, including some special activities such as data cleansing and data migration. The project was commissioned by Art Films production Oy and carried out between September 2013 and April 2014. The goal of Freedom for Sale is to promote human rights and freedom of speech by highlighting grievances and monitoring governments and multinational corporations whose actions support or ignore violations of human rights and free speech. By providing videos, articles, and reports, the website gives the mistreated a medium to get their voices heard and thereby supports the world's development. The website has a troublesome past. It was initially launched in 2007, but it was shut down after becoming a victim of harmful cyber-attacks, which corrupted the majority of its data and content; no valid backups were available. Besides describing the development process, this report describes the core concepts of CMSs, Joomla, information architecture, and data cleansing. The thesis also suggests a method for applying data cleansing and migration to Joomla with Excel or a similar spreadsheet application, as sketched below.
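    As an illustration of such a spreadsheet-based cleansing step, the following is a minimal sketch that reads exported content as CSV, drops corrupted and duplicate rows, and writes a clean file ready for import. The file and column names are hypothetical, not taken from the thesis.

```python
import csv

# Read the content exported from the damaged site as a spreadsheet-style CSV.
with open("articles_export.csv", newline="", encoding="utf-8") as src:
    rows = list(csv.DictReader(src))

clean, dropped = [], 0
seen_titles = set()
for row in rows:
    title = row["title"].strip()
    # Drop rows damaged by the attack: empty titles and duplicated entries.
    if not title or title.lower() in seen_titles:
        dropped += 1
        continue
    seen_titles.add(title.lower())
    row["title"] = title
    clean.append(row)

# Write the cleansed rows to a new file for import into Joomla.
with open("articles_clean.csv", "w", newline="", encoding="utf-8") as dst:
    writer = csv.DictWriter(dst, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(clean)

print(f"kept {len(clean)} rows, dropped {dropped}")
```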

    Multi-office system's data migration into centralized database

    Data migration is a task associated with information system renewals in which the data of the old system is transferred into the new system. In a data migration the data is not merely copied into the new system; its structure must also be transformed into the target data structures. Companies undertake information system renewals for business reasons, so delivering a data migration also involves business perspectives. The goal of an information system renewal is usually to improve the availability and quality of the company's data. To achieve this goal, the data of the source application must be analyzed and, where necessary, enriched. This thesis builds an overall picture of the work phases involved in data migration and thereby helps construct a working data migration process for a production data migration. The thesis presents the data migration process model compiled by Matthes in the article "Towards an integrated data migration process model" and a data migration project delivered by Atostek. From an operational perspective, the most important thing in a data migration project is to establish a documented, tested, and rehearsed data migration process that is followed when the data migration is executed in production. From a technical perspective, a data migration project builds a data migration platform that contains the necessary staging data stores as well as the data extraction, transformation, and loading programs. In addition, a separate orchestration component can be produced to manage the execution of the programs involved in the data migration. The data migration project delivered by Atostek deliberately applied no frameworks for data migration projects, yet the data migration process produced in the project corresponded to Matthes' process model. In the project, a thoroughly documented, tested, and rehearsed data migration process was established, with which the production data migration was carried out successfully. Tools were produced to support the data migration process, and they can be reused in future data migration projects.
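    As an illustration of the orchestration component described above, the following is a minimal sketch that runs the extraction, transformation, and loading programs in order, passing data through staging stores, and stops the production run on the first failure. The program names and staging paths are hypothetical, not taken from the thesis.

```python
import subprocess
import sys

# Each step reads from and writes to staging data stores between phases.
steps = [
    ["python", "extract.py", "--out", "staging/raw"],
    ["python", "transform.py", "--in", "staging/raw", "--out", "staging/clean"],
    ["python", "load.py", "--in", "staging/clean"],
]

for step in steps:
    print("running:", " ".join(step))
    result = subprocess.run(step)
    if result.returncode != 0:
        # A rehearsed process fails loudly rather than loading partial data.
        sys.exit(f"step failed: {' '.join(step)}")

print("data migration completed")
```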

    E-commerce System Database Migration

    The object of this master's thesis is the design and creation of a finished tool for migrating an e-commerce system database from the ZenCart platform to the PrestaShop platform. Both system databases will be described and analysed, and, based on the information gained, a migration tool meeting the customer's requirements will be created and the final data migration from the original database to the new one executed.
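    As an illustration of one mapping such a ZenCart-to-PrestaShop tool must perform, the following is a minimal sketch that copies product rows between the two MySQL schemas with pymysql. The connection details are placeholders, and the column mapping is a simplified assumption; real ZenCart and PrestaShop schemas differ by version and involve many more tables.

```python
import pymysql

# Placeholder connections to the source and target MySQL databases.
zencart = pymysql.connect(host="localhost", user="zc", password="secret",
                          database="zencart")
presta = pymysql.connect(host="localhost", user="ps", password="secret",
                         database="prestashop")

# Read the product rows from the ZenCart schema.
with zencart.cursor() as src:
    src.execute(
        "SELECT products_id, products_model, products_price FROM products")
    products = src.fetchall()

# Write them into the PrestaShop schema; products_model is mapped onto
# PrestaShop's reference field in this simplified example.
with presta.cursor() as dst:
    for pid, model, price in products:
        dst.execute(
            "INSERT INTO ps_product (id_product, reference, price) "
            "VALUES (%s, %s, %s)",
            (pid, model, price))
presta.commit()
print(f"migrated {len(products)} products")
```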

    Assessing and improving data quality with business impact in a data migration process between ERPs

    Despite all the published literature on data quality improvement, data quality problems continue to affect companies' operations and their decision systems. In recognition of that, RetailPC, an IT equipment retail company, agreed to take part in the present research, which aimed to assess and improve the data quality of its Customer entity during a data migration process between ERPs (Primavera Professional to SAP Business One). For this purpose, the Action Research methodology was used, as it allows the researcher to assume an interventionist role in resolving the data quality problem. In the specific case of this research, data quality was assessed and improved during the migration, data collection processes were changed, and diagnostic methods were made available for future cycles of Action Research. In the end, it was clear that data quality had improved significantly. It was possible to correct all the errors detected in the attributes ShipType (delivery mode), PymCode (payment method), Currency (currency), and LangCode (language of documents sent to the customer); 98.53% of the errors detected in collective taxpayers with respect to the LicTradNum attribute (tax ID); 56.67% of the addresses with errors in the ZipCode attribute (postal code); and 99.65% of the tuples containing values in the IntrntSite attribute, which was being used for a purpose different from the one intended by the ERP, so those values were migrated to the E_Mail attribute for further processing. In addition, 323 duplicate entity tuples were detected and eliminated.
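    As an illustration of the rule-based checks such a data quality assessment can apply, the following is a minimal sketch for two of the attributes named above: LicTradNum, validated here with the standard Portuguese NIF check-digit rule, and ZipCode, validated against the Portuguese NNNN-NNN postal-code format. The sample records are hypothetical, and the study itself does not specify these particular rules.

```python
import re

def nif_is_valid(nif: str) -> bool:
    """Check a Portuguese NIF: 9 digits with a mod-11 check digit."""
    if not re.fullmatch(r"\d{9}", nif):
        return False
    # Weights 9..2 over the first eight digits, then the mod-11 check digit.
    total = sum(int(d) * w for d, w in zip(nif[:8], range(9, 1, -1)))
    remainder = total % 11
    check = 0 if remainder < 2 else 11 - remainder
    return check == int(nif[8])

def zipcode_is_valid(code: str) -> bool:
    """Check the Portuguese NNNN-NNN postal-code format."""
    return re.fullmatch(r"\d{4}-\d{3}", code) is not None

# Hypothetical customer records: the first passes both checks, the second fails both.
customers = [
    {"LicTradNum": "123456789", "ZipCode": "1000-001"},
    {"LicTradNum": "123456780", "ZipCode": "1000"},
]
for c in customers:
    print(c, "NIF ok:", nif_is_valid(c["LicTradNum"]),
          "ZipCode ok:", zipcode_is_valid(c["ZipCode"]))
```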

    Improving master data quality in the data migration of an ERP system implementation

    ERP systems constitute the information system backbone of most organizations across all industries. An ERP implementation is a huge commitment for organizations, and too often such projects fail or run over schedule and budget. Poor data is claimed to be the number-one reason for the high failure rate of new computer system implementations: the data migrated from a legacy system to a target system is of poor quality. The aim of this thesis was to study how master data quality can be improved in the data migration process of a SAP ERP implementation. The research was conducted at a general level; specific cases or organizations were not examined. The objective was to compile a list of methods for improving master data quality in the data migration process. To achieve this goal, the main barriers to master data quality in the data migration process were also identified. The research was conducted in two parts, theoretical and empirical. The theoretical section was based on scientific literature on the research topic and formed the foundation for the empirical part. The empirical research was conducted as a case study in which qualitative data was collected by interviewing nine SAP consultants. Furthermore, an additional questionnaire was conducted in order to single out the most essential results. The results show that the data migration process includes several data quality barriers which need to be taken into consideration. The barriers were divided into three groups: data-, people-, and process-related barriers. The methods for improving master data quality in the data migration process were derived from the data quality barriers and divided into the same three groups: data-, people-, and process-related methods. According to the results, the most essential methods for improving master data quality in the data migration of a SAP implementation are ensuring good engagement with the client; defining and communicating the data-related roles and responsibilities unambiguously; analyzing the status of the data in the legacy systems at the starting point; planning and executing data cleansing carefully, already in the legacy system; creating unambiguous data collection templates and carrying out walkthroughs of them; and defining SAP data rules that correspond to the business rules.
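    As an illustration of the recommended starting-point analysis of the legacy data, the following is a minimal sketch that profiles field completeness and duplicate keys before any cleansing or collection templates are built. The material-master records and fields are hypothetical, not taken from the thesis.

```python
from collections import Counter

# Hypothetical legacy material-master extract.
legacy_materials = [
    {"material": "M-100", "description": "Bearing", "uom": "PC"},
    {"material": "M-101", "description": "", "uom": "PC"},
    {"material": "M-100", "description": "Bearing 6204", "uom": ""},
]

# Completeness: how many records actually populate each field.
fields = ["material", "description", "uom"]
total = len(legacy_materials)
for field in fields:
    filled = sum(1 for r in legacy_materials if r[field].strip())
    print(f"{field}: {filled}/{total} populated")

# Duplicate keys are a classic migration barrier: the target system
# will reject them or merge records unpredictably.
dupes = [k for k, n in
         Counter(r["material"] for r in legacy_materials).items() if n > 1]
print("duplicate material numbers:", dupes)
```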