1,129 research outputs found

    Success factors in the implementation of the ERP Microsoft Dynamics AX – Case study: A manufacturing company

    Over the years, information systems have become tools that allow companies to make timely decisions, provide good service to their customers, and face the challenges of an increasingly competitive economy. Enterprise resource planning (ERP) systems are solutions that integrate the information generated by production, administrative, and commercial processes, avoiding rework and data duplication. The purpose of this work is to present the results of an analysis conducted in a Colombian manufacturing company to identify the critical success factors in projects implementing the ERP system Microsoft Dynamics AX. The work is organized into four sections. The first introduces the concepts of information systems and ERP. The second describes the methodology used for the study: the selection of the model from the literature review, the construction of the instrument, and the collection of data through two perception surveys. The third presents the analysis of the information, and the final section offers the conclusions and recommendations of the work.

    Digitalization and New Buyer Behavior is Changing B2B Relationship Marketing

    Problem definition: The increasingly informed customer will lead to an even greater demand for expertise and knowledge from marketers. Firms need to find new ways to use the informed customer as a co-creator of value by analyzing behavior more proficiently, both online and offline. It is uncertain to what extent the operational standards of KIBS firms translate to their marketing and sales efforts, or to what degree they use potential customers to help shape their value propositions. The research question is how the relationships seen in knowledge-intensive B2B marketing are affected by the digitalization of society and the change in buyer behavior that results from these societal changes.

    Purpose: The purpose of this thesis is to identify the effects of digitalization and changing buyer behavior on relationships across marketing and sales of knowledge-intensive services in a B2B context. This leads to a recommendation for the future direction of the case company's marketing and sales functions.

    Methodology: The research approach of this master's thesis combines a descriptive and an explorative study. The descriptive approach describes the overall areas of the problem formulation, while the explorative approach collects as much information as possible about those areas; the greater weight is placed on exploration. The research is conducted as a case study of a company that is both B2B and in a sector that offers many interesting angles on the problem.

    Case company: The choice of Company X as the case company was rooted in three overall observations. First, digital marketing is currently seeing increased urgency in B2B, and the IT and business consultancy sector is interesting because its companies often combine many different ways of working within the same firm. Second, the choice of a B2B company is motivated by the fact that the new wave of digital marketing has advanced further in B2C; B2B is historically stronger in relationship marketing basics such as close network relationships, and its digital advancements deviate from those in B2C and are probably less standardized. Lastly, Company X offers business units on opposite ends of the spectrum of overall digital advancement.

    Conclusions: Information is the common denominator for everything that pertains to the power balance between supplier and customer. The authors believe that the presented framework provides a good intersection for assessing relational strength in B2B, grading strengths and weaknesses as well as opportunities and threats in digitalization, and gauging the current level of buyer insight. The models are secondary and may be modified, but the choice to observe relationships, digitalization, and in-depth buyer behavior should provide a holistic view for similar studies.

    GPU Accelerated Approach to Numerical Linear Algebra and Matrix Analysis with CFD Applications

    A GPU accelerated approach to numerical linear algebra and matrix analysis with CFD applications is presented. The work's objectives are to (1) develop stable and efficient algorithms utilizing multiple NVIDIA GPUs with CUDA to accelerate common matrix computations, (2) optimize these algorithms through CPU/GPU memory allocation, GPU kernel development, CPU/GPU communication, data transfer, and bandwidth control, and (3) develop parallel CFD applications for Navier-Stokes and lattice Boltzmann analysis methods. Special consideration is given to performing the linear algebra algorithms on particular matrix types (banded, dense, diagonal, sparse, symmetric, and triangular). Benchmarks are performed for all analyses, with baseline CPU times determined to find speed-up factors and measure the computational capability of the GPU accelerated algorithms. The GPU implementations and the optimization techniques are measured against preexisting work and test matrices available in the NIST Matrix Market. The CFD analysis strengthens the assessment of this work by providing a direct engineering application that benefits from the matrix optimization techniques and accelerated algorithms. Overall, this work develops optimizations for selected linear algebra and matrix computations on modern GPU architectures with the CUDA toolkit, applied directly to mathematical and engineering problems through CFD analysis
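
    To make the speed-up methodology concrete, the sketch below times the same dense matrix product on the CPU with NumPy and on the GPU with CuPy, then reports the resulting speed-up factor. This is a minimal illustration of the CPU-baseline benchmarking described above, not the thesis's CUDA code: the use of CuPy instead of hand-written kernels, the matrix size, and the timing details are all assumptions.

```python
# Minimal speed-up benchmark: dense matrix multiply on CPU (NumPy)
# vs. GPU (CuPy). Library choice, matrix size, and repetition scheme
# are illustrative, not taken from the thesis.
import time
import numpy as np
import cupy as cp  # requires an NVIDIA GPU with CUDA

n = 4096
a = np.random.rand(n, n)
b = np.random.rand(n, n)

# CPU baseline time.
t0 = time.perf_counter()
c_cpu = a @ b
cpu_time = time.perf_counter() - t0

# GPU run: warm up first so kernel compilation is not timed, and
# synchronize so the asynchronous GPU work is fully counted.
a_gpu, b_gpu = cp.asarray(a), cp.asarray(b)
_ = a_gpu @ b_gpu
cp.cuda.Device().synchronize()
t0 = time.perf_counter()
c_gpu = a_gpu @ b_gpu
cp.cuda.Device().synchronize()
gpu_time = time.perf_counter() - t0

print(f"CPU {cpu_time:.3f}s  GPU {gpu_time:.3f}s  "
      f"speed-up {cpu_time / gpu_time:.1f}x")
```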

    Effective Resource and Workload Management in Data Centers

    The increasing demand for storage, computation, and business continuity has driven the growth of data centers. Managing data centers efficiently is a difficult task because of the wide variety of data center applications, their ever-changing intensities, and the fact that application performance targets may differ widely. Server virtualization has been a game-changing technology for IT, providing the possibility to support multiple virtual machines (VMs) simultaneously. This dissertation focuses on how virtualization technologies can be used to develop new tools for maintaining high resource utilization, achieving high application performance, and reducing the cost of data center management.

    For multi-tiered applications, bursty workload traffic can significantly deteriorate performance. This dissertation proposes an admission control algorithm, AWAIT, for handling overload conditions in multi-tier web services. AWAIT places requests of accepted sessions on hold and refuses to admit new sessions when the system is in a sudden workload surge. To meet the service-level objective, AWAIT serves the requests in the blocking queue with high priority, and the size of the queue is determined dynamically according to the workload burstiness.

    Many admission control policies are triggered by instantaneous measurements of system resource usage, e.g., CPU utilization. This dissertation first demonstrates that directly measuring virtual machine resource utilizations with standard tools cannot always produce accurate estimates. A directed factor graph (DFG) model is defined to capture the dependencies among multiple types of resources across physical and virtual layers.

    Virtualized data centers enable sharing of resources among hosted applications to achieve high resource utilization. However, it is difficult to satisfy application SLOs on a shared infrastructure because application workload patterns change over time. AppRM, an automated management system, not only allocates the right amount of resources to applications to meet their performance targets but also adjusts to dynamic workloads using an adaptive model.

    Server consolidation is one of the key applications of server virtualization. This dissertation proposes a VM consolidation mechanism, first by extending the fair load balancing scheme to multi-dimensional vector scheduling, and then by using a queueing network model to capture the service contention for a particular virtual machine placement
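
    A minimal sketch of the admission-control idea summarized above: requests from accepted sessions are held in a blocking queue whose capacity grows with workload burstiness, while new sessions are refused during a surge. The burstiness measure (coefficient of variation of recent arrival rates), the thresholds, and all names are illustrative assumptions, not the dissertation's design.

```python
# Sketch of an AWAIT-style admission controller. All parameters and
# the burstiness estimate are illustrative assumptions.
from collections import deque
import statistics


class AdmissionController:
    def __init__(self, base_queue_size=100, surge_threshold=1.5):
        self.base_queue_size = base_queue_size
        self.surge_threshold = surge_threshold
        self.blocking_queue = deque()         # held requests of accepted sessions
        self.recent_rates = deque(maxlen=20)  # sliding window of arrival rates

    def observe_rate(self, requests_per_sec):
        self.recent_rates.append(requests_per_sec)

    def burstiness(self):
        # Coefficient of variation of recent arrival rates.
        if len(self.recent_rates) < 2:
            return 0.0
        mean = statistics.mean(self.recent_rates)
        return statistics.stdev(self.recent_rates) / mean if mean else 0.0

    def queue_capacity(self):
        # Grow the blocking queue under bursty traffic so accepted
        # sessions are held rather than dropped.
        return int(self.base_queue_size * (1 + self.burstiness()))

    def admit(self, request, is_new_session):
        surge = self.burstiness() > self.surge_threshold
        if is_new_session and surge:
            return False                      # refuse new sessions in a surge
        if len(self.blocking_queue) < self.queue_capacity():
            self.blocking_queue.append(request)  # held, served with priority
            return True
        return False
```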

    Scalable Scientific Computing Algorithms Using MapReduce

    Cloud computing systems, like MapReduce and Pregel, provide a scalable and fault tolerant environment for running computations at massive scale. However, these systems are designed primarily for data intensive computational tasks, while a large class of problems in scientific computing and business analytics are computationally intensive (i.e., they require a lot of CPU in addition to I/O). In this thesis, we investigate the use of cloud computing systems, in particular MapReduce, for computationally intensive problems, focusing on two classic problems that arise in scientific computing and also in analytics: maximum clique and matrix inversion. The key contribution that enables us to effectively use MapReduce to solve the maximum clique problem on dense graphs is a recursive partitioning method that partitions the graph into several subgraphs of similar size and running time complexity. After partitioning, the maximum cliques of the different partitions can be computed independently, and the computation is sped up using a branch and bound method. Our experiments show that our approach leads to good scalability, which is unachievable by other partitioning methods since they result in partitions of different sizes and hence lead to load imbalance. Our method is more scalable than an MPI algorithm, and is simpler and more fault tolerant. For the matrix inversion problem, we show that a recursive block LU decomposition allows us to effectively compute in parallel both the lower triangular (L) and upper triangular (U) matrices using MapReduce. After computing the L and U matrices, their inverses are computed using MapReduce. The inverse of the original matrix, which is the product of the inverses of the L and U matrices, is also obtained using MapReduce. Our technique is the first matrix inversion technique that uses MapReduce. We show experimentally that our technique has good scalability, and it is simpler and more fault tolerant than MPI implementations such as ScaLAPACK
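
    The inversion pipeline described above can be illustrated serially: factor A = PLU, invert the two triangular factors, and multiply. The sketch below demonstrates only this algebraic identity with NumPy/SciPy on a single node; the thesis's contribution is running each stage as MapReduce jobs over matrix blocks, which this sketch does not attempt.

```python
# Serial illustration of the inversion pipeline from the abstract:
# factor A = P L U, invert the triangular factors, then multiply.
import numpy as np
from scipy.linalg import lu, solve_triangular

rng = np.random.default_rng(0)
n = 6
A = rng.random((n, n)) + n * np.eye(n)   # well-conditioned test matrix

P, L, U = lu(A)                          # A = P @ L @ U

# Invert each triangular factor by solving against the identity.
I = np.eye(n)
L_inv = solve_triangular(L, I, lower=True)
U_inv = solve_triangular(U, I, lower=False)

# inv(A) = inv(U) @ inv(L) @ P.T, since A = P L U and P is a
# permutation matrix (so its inverse is its transpose).
A_inv = U_inv @ L_inv @ P.T
assert np.allclose(A_inv @ A, I)
```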

    Multigrid with FFT smoother for a simplified 2D frictional contact problem

    This paper aims to develop a fast multigrid (MG) solver for a Fredholm integral equation of the first kind, arising from the 2D elastic frictional contact problem. After discretization on a rectangular contact area, the integral equation gives rise to a linear system whose coefficient matrix is dense, symmetric positive definite, and Toeplitz. A so-called fast Fourier transform (FFT) smoother is proposed. It is based on a preconditioner M that approximates the inverse of the original coefficient matrix and is determined using the FFT technique. The iterates are then updated by Richardson iteration: adding the current residual preconditioned with the Toeplitz preconditioner M. The FFT smoother significantly reduces most components of the error but amplifies several smooth components, which causes divergence of the MG method. Two approaches are studied to remedy this: subdomain deflation (SD) and row sum modification (RSM). MG with the FFT + RSM smoother appears to be more efficient than with the FFT + SD smoother. Moreover, the FFT + RSM smoother can be applied as an efficient iterative solver by itself. The two RSM-based methods also show rapid convergence in a test with a wavy surface, where the Toeplitz structure is lost
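
    A minimal sketch of the FFT-preconditioned Richardson update, assuming Strang's classical circulant approximation as a stand-in for M (the paper constructs its preconditioner differently). It shows how the preconditioned residual can be applied in O(n log n) by diagonalizing the circulant with the FFT; the damping factor and test matrix are illustrative choices.

```python
# Richardson iteration x <- x + w*(M r) with a preconditioner
# M ~ A^{-1} applied via the FFT. M here is Strang's circulant
# approximation of a symmetric Toeplitz matrix, standing in for the
# paper's own construction.
import numpy as np
from scipy.linalg import toeplitz

n = 256
t = 1.0 / (1.0 + np.arange(n)) ** 2   # first column of an SPD Toeplitz matrix
A = toeplitz(t)
b = np.ones(n)

# Strang's circulant: keep the central diagonals, wrap the far ones.
c = t.copy()
c[n // 2 + 1:] = t[1: n - n // 2][::-1]
eig = np.fft.fft(c).real              # eigenvalues of the circulant

x = np.zeros(n)
w = 0.5                               # damping keeps the iteration contractive
for _ in range(100):
    r = b - A @ x
    # Apply M ~ A^{-1} in O(n log n): diagonalize by the FFT.
    x += w * np.fft.ifft(np.fft.fft(r) / eig).real

print("residual norm:", np.linalg.norm(b - A @ x))
```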

    Probabilistic models for structured sparsity
