695 research outputs found

    Resources-Events-Agents Design Theory: A Revolutionary Approach to Enterprise System Design

    Enterprise systems typically include constructs such as ledgers and journals with debit and credit entries as central pillars of the systems’ architecture, due in part to accountants and auditors who demand those constructs. At best, structuring systems with such constructs as base objects results in storing the same data at multiple levels of aggregation, which creates inefficiencies in the database. At worst, basing systems on such constructs destroys details that are unnecessary for accounting but that may facilitate decision making by other enterprise functional areas. McCarthy (1982) proposed the resources-events-agents (REA) framework as an alternative structure for a shared data environment more than thirty years ago, and scholars have since developed it into a robust design theory. Despite this legacy, the broad IS community has not widely researched REA. In this paper, we discuss REA’s genesis and primary constructs, provide a history of REA research, discuss REA’s impact on practice, and speculate as to what the future may hold for REA-based enterprise systems. We invite IS researchers to consider integrating REA constructs with other theories and various emerging technologies to help advance the future of information systems and business research.
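    The core idea of the abstract above can be illustrated with a minimal sketch of REA's base objects. This is a hypothetical illustration, not McCarthy's formal model: resources, events, and agents are the stored constructs, while a ledger-style balance is derived on demand rather than kept as a base object.

```python
from dataclasses import dataclass

# Hypothetical REA sketch: economic events link resources to agents.
@dataclass
class Resource:
    name: str

@dataclass
class Agent:
    name: str

@dataclass
class Event:
    kind: str            # e.g. "sale" (decrement) or "cash_receipt" (increment)
    resource: Resource
    provider: Agent
    receiver: Agent
    amount: float

@dataclass
class Duality:
    """REA's duality pairs a decrement event with its compensating increment."""
    decrement: Event
    increment: Event

def net_cash(events, cash):
    """A journal-style balance is a derived report, not a stored construct."""
    return sum(e.amount for e in events if e.resource is cash)
```

    For example, a sale of goods paired with a cash receipt forms one duality, and the traditional debit/credit view can be computed from the event detail whenever accounting needs it, without discarding detail useful to other functional areas.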

    Exploring the Long-Term Effects of Domestic Violence in Art Therapy Treatment

    This qualitative case study explores the long-term effects of domestic violence through the lens of art therapy treatment. The study is based on a twelve-week art therapy treatment group for women who have experienced domestic violence. The study includes a literature review and a qualitative analysis of the participants’ artwork and details of their experiences of domestic violence related trauma. The research focuses on two participants and utilizes textual and visual analysis to identify four emergent themes: family and identity, hope in moving forward, support and connection, and freedom. The findings discuss the value of art therapy in revealing coping skills, strengths, and internalized fears related to domestic violence trauma. The women in the study show an increased awareness of internal resources and hopeful narratives for healing. The study demonstrates the potential of art therapy to make visible the long-term effects of domestic violence and to assist in the treatment of survivors.

    A new formal and analytical process to product modeling (PPM) method and its application to the precast concrete industry

    The current standard product (data) modeling process relies on the experience and subjectivity of data modelers, who use their experience to eliminate redundancies and identify omissions. As a result, product modeling becomes a social activity that involves iterative review processes of committees. This study aims to develop a new, formal method for deriving product models from data collected in process models of companies within an industry sector. The theoretical goals of this study are to provide a scientific foundation to bridge the requirements collection phase and the logical modeling phase of product modeling, and to formalize the derivation and normalization of a product model from the processes it supports. To achieve these goals, a new and formal method, Georgia Tech Process to Product Modeling (GTPPM), has been proposed. GTPPM consists of two modules. The first module is called the Requirements Collection and Modeling (RCM) module. It provides semantics and a mechanism to define a process model, the information items used by each activity, and the information flow between activities. The logic to dynamically check the consistency of information flow within a process has also been developed. The second module is called the Logical Product Modeling (LPM) module. It integrates, decomposes, and normalizes information constructs collected from a process model into a preliminary product model. Nine design patterns are defined to resolve conflicts between information constructs (ICs) and to normalize the resultant model. These two modules have been implemented as a Microsoft Visio™ add-on. The tool has been registered and is also called GTPPM™. The method has been tested and evaluated in the precast concrete sector of the construction industry through several GTPPM modeling efforts.
By using GTPPM, a complete set of information items required for product modeling for a medium or large industry can be collected without generalizing each company's unique process into one unified high-level model. However, the use of GTPPM is not limited to product modeling. It can be deployed in several other areas, including workflow management system or MIS (Management Information System) development, software specification development, and business process re-engineering.
Ph.D. Committee Chair: Eastman, Charles M.; Committee Co-Chair: Augenbroe, Godfried; Committee Co-Chair: Navathe, Shamkant B.; Committee Member: Hardwick, Martin; Committee Member: Sacks, Rafael
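    The information-flow consistency check mentioned in the abstract can be sketched as a simple reachability rule: an activity may only consume an information item that some earlier activity has produced. The function and item names below are hypothetical illustrations, not GTPPM's actual implementation.

```python
# Hypothetical sketch of an information-flow consistency check in the
# spirit of GTPPM's RCM module. Each activity consumes and produces named
# information items; an item may only be consumed after an earlier
# activity (in process order) has produced it.
def check_flow(activities):
    """activities: ordered list of (name, consumes, produces) triples.

    Returns a list of (activity, missing_item) inconsistencies.
    """
    available = set()    # items produced by activities seen so far
    problems = []
    for name, consumes, produces in activities:
        for item in sorted(consumes):
            if item not in available:
                problems.append((name, item))
        available.update(produces)
    return problems
```

    For instance, a precast process in which "production" consumes an "erection sequence" that no upstream activity produces would be flagged as inconsistent.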

    Culturally-driven agency in value cocreation within the Portuguese business ecosystem - a multiple case study approach

    Doctorate in Management. Underpinned by Service-dominant Logic (SdL), Neo-institutional Theory (NiT) and the Theory of Structuration, the present research intends to deepen the understanding of how meta-layer, culturally-driven institutions shape the agency of decision-makers in service exchange engagement. Since there is a lack of work in SdL addressing how institutions coordinate value cocreation, the present research aims to fill this gap by focusing on how individuals conciliate institutions, as this has significant implications for how they engage in service exchange. From the articulation of the three aforementioned distinct bodies of literature, a conceptual framework addressing the phenomenon under study emerged. In order to meet the research purposes, an interpretivist, naturalistic and mainly deductive perspective was taken and a multiple embedded case study design was adopted. The primary unit of analysis is the individual, in its two dimensions (i.e. agency and social structure), and two additional units of analysis (embedded units) are considered: the firm and the organizational field in which the individual is embedded, in order to represent the phenomenon’s complexity. The research population comprises Portuguese individuals performing decision-making and troubleshooting roles as service beneficiaries in Portuguese firms. From the population, eight cases meeting literal and theoretical replication criteria were initially selected, and four additional cases were added to reach theoretical saturation. Interviews were conducted, and document analyses and observation were performed to collect data. A framework based on empirical data regarding how culturally-driven institutions shape the agency of decision-makers is proposed. It considers not only which actions are affected by institutions but also how different institutions interact inside the individual’s structure. This research also brings the concept of cultural resource to the forefront of the SdL literature by stressing its empirical relevance. These are two of the main contributions made by the current research to SdL, NiT and management practice.

    Extending a methodology for migration of the database layer to the cloud considering relational database schema migration to NoSQL

    The advances in Cloud computing and in modern Web applications have raised the need for highly available and scalable distributed databases to accommodate the big data being created and consumed. Along with the explosion in data growth comes the necessity to rapidly evolve databases and schemas to meet user demands for new functionality. Special attention is being paid to the vast amounts of semi-structured and unstructured data, and data management tools should reflect the support for these needs. This has led to the development of new Cloud serving systems such as "Not Only" SQL (NoSQL) databases. NoSQL databases were driven by the scalability needs of big companies such as Google, Facebook, Amazon, and Yahoo. While the demands of these key players differ from those of small and medium enterprises in terms of scalability, the core problem is the same: storage arrays are not scalable and force expensive, forklift upgrades. Combined with changes in how IT resources are delivered and consumed through the Cloud computing paradigm, these facts mean that projects adopting NoSQL solutions are no longer mere hype. NoSQL databases are being offered as a service by the big Cloud providers, such as Google, Amazon, and Microsoft, but by smaller vendors as well. In this master's thesis we investigate the possibilities and limitations of mapping relational database schemas to NoSQL schemas when migrating the database layer to the Cloud. Based on literature research, we provide recommendations and guidelines with regard to schema transformation and discuss the implications at other application architecture layers, such as the business logic and data access layers. We extend an existing data migration tool and methodology to incorporate the migration guidelines and hints. Moreover, we validate our work on a chosen subset of relational and NoSQL databases using example data from the established TPC-H benchmark.
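    A common schema transformation of the kind the abstract discusses is denormalizing a relational one-to-many join into nested documents for a document store. The sketch below is an illustration only, not the thesis's tool; it assumes TPC-H-style orders and lineitem rows represented as dictionaries.

```python
# Illustrative relational-to-document mapping: embed each order's line
# items inside the order, producing one self-contained document per order
# (the shape a document store such as MongoDB would typically hold).
def to_documents(orders, lineitems):
    """orders: rows keyed by 'o_orderkey'; lineitems: rows with 'l_orderkey'.

    Returns one nested document per order, embedding its line items.
    """
    by_order = {}
    for li in lineitems:
        by_order.setdefault(li["l_orderkey"], []).append(li)
    return [
        {**o, "lineitems": by_order.get(o["o_orderkey"], [])}
        for o in orders
    ]
```

    The trade-off this illustrates is the one the migration guidelines must weigh: embedding removes the join at read time but duplicates structure and complicates updates that span many documents.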

    TLAD 2010 Proceedings: 8th international workshop on teaching, learning and assessment of databases (TLAD)

    This is the eighth in the series of highly successful international workshops on the Teaching, Learning and Assessment of Databases (TLAD 2010), which is once again held as a workshop of BNCOD 2010 - the 27th International Information Systems Conference. TLAD 2010 is held on 28th June at the beautiful Dudhope Castle at Abertay University, just before BNCOD, and hopes to be just as successful as its predecessors. The teaching of databases is central to all Computing Science, Software Engineering, Information Systems and Information Technology courses, and this year the workshop aims to continue the tradition of bringing together both database teachers and researchers in order to share good learning, teaching and assessment practice and experience, and to further the growing community amongst database academics. As well as attracting academics from the UK community, the workshop has also been successful in attracting academics from the wider international community, through serving on the programme committee, and attending and presenting papers. This year, the workshop includes an invited talk given by Richard Cooper (of the University of Glasgow), who will present a discussion and some results from the Database Disciplinary Commons which was held in the UK over the academic year. Due to the healthy number of high-quality submissions this year, the workshop will also present seven peer-reviewed papers and six refereed poster papers. Of the seven presented papers, three will be presented as full papers and four as short papers. These papers and posters cover a number of themes, including: approaches to teaching databases, e.g. group-centered and problem-based learning; use of novel case studies, e.g. forensics and XML data; techniques and approaches for improving teaching and student learning processes; assessment techniques, e.g. peer review; methods for improving students' abilities to develop database queries and E-R diagrams; and e-learning platforms for supporting teaching and learning.

    A Methodology for Evaluating Relational and NoSQL Databases for Small-Scale Storage and Retrieval

    Modern systems record large quantities of electronic data capturing time-ordered events, system state information, and behavior. Subsequent analysis enables historic and current system status reporting, supports fault investigations, and may provide insight into emerging system trends. Unfortunately, the management of log data requires ever more efficient and complex storage tools to access, manipulate, and retrieve these records. Truly effective solutions also require a well-planned architecture supporting the needs of multiple stakeholders. Historically, database requirements were well served by relational data models; however, modern non-relational (NoSQL) databases, initially intended for "big data" distributed systems, may also provide value for smaller-scale problems such as log data. However, no evaluation method currently exists to adequately compare the capabilities of traditional (relational database) and modern NoSQL solutions for small-scale problems. This research proposes a methodology to evaluate modern data storage and retrieval systems. While the methodology is intended to be generalizable to many data sources, a commercially produced unmanned aircraft system served as a representative use case to test the methodology on aircraft log data. The research first defined the key characteristics of database technologies and used those characteristics to inform laboratory simulations emulating representative examples of modern database technologies (relational, key-value, columnar, document, and graph). Based on those results, twelve evaluation criteria were proposed to compare the relational and NoSQL database types. The Analytic Hierarchy Process was then used to combine literature findings, laboratory simulations, and user inputs to determine the most suitable database type for the log data use case. The study results demonstrate the efficacy of the proposed methodology.
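    The Analytic Hierarchy Process step mentioned above reduces pairwise importance judgments over criteria to a priority vector. A minimal sketch, using the common column-normalization approximation of the principal eigenvector (the matrix values below are made-up examples, not the study's data):

```python
# AHP weighting sketch: matrix[i][j] states how much more important
# criterion i is than criterion j (reciprocal matrix, diagonal of 1s).
# The priority vector is approximated by averaging the normalized columns.
def ahp_weights(matrix):
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    return [
        sum(matrix[i][j] / col_sums[j] for j in range(n)) / n
        for i in range(n)
    ]
```

    For a perfectly consistent two-criterion matrix such as [[1, 3], [1/3, 1]], this yields weights of 0.75 and 0.25, which then scale the per-criterion scores from the laboratory simulations and user inputs.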

    Corporate Social Responsibility in the Information Technology Industry

    Information technology companies contribute to greenhouse gas emissions via data servers, office administrative operations, and the manufacturing of hardware components. Corporate Social Responsibility programs are a voluntary method of tracking and reducing negative environmental impacts. The 2019 Corporate Social Responsibility data from four Bay Area (California, USA) information technology companies (Adobe, Cisco, Salesforce, and Nvidia), all listed in sustainable index funds, were compared to determine best reporting standards. The data analysis also allowed for research into how carbon emissions are categorized and reported, and which emissions companies pay attention to through Corporate Social Responsibility projects. Physical ownership responsibility for CO2 emissions is categorized as Scope 1 (a company's own emissions), Scope 2 (energy used in operations), and Scope 3 (suppliers' and stakeholders' emissions). Most of the companies' emissions come from Scope 3 operations, but most of the Corporate Social Responsibility projects address emissions in Scope 1. Without industry-wide key performance indicators or standard reporting frameworks, sustainable actions cannot be easily compared. Adding regulatory requirements, such as those proposed by the US Securities and Exchange Commission, will improve corporate CO2 reporting standardization and transparency, allowing for easier comparisons and highlighting scalable environmental improvements. Companies in the information technology industry can reduce their carbon emissions by improving energy efficiency at data centers, implementing sustainable software design, and prolonging the lifespan of hardware components by making products modular and repairable.