
    Conceptual Model of Digital Storytelling (DST)

    Digital storytelling (DST) is an evolution of age-old traditional storytelling that augments the power of storytelling with the latest technology. To construct a digital story, a digital storyteller follows sets of guided elements. However, these expert-proposed elements vary; some are repetitive while others do not cater for interactivity. Therefore, the main aim of this study is to identify the commonality among the diverse elements used by different experts and to eliminate their redundancy. By doing so, this study can identify the DST core elements and present them in the form of a conceptual model. To achieve the main aim, three sub-objectives were constructed: (1) to identify the core elements of digital storytelling that represent interactive and non-interactive forms, (2) to construct a conceptual model of the identified DST core elements, and (3) to evaluate the proposed conceptual model with DST experts and potential users. To keep the study guided and focused, four methodological phases were followed: (1) groundwork, (2) induction, (3) iteration, and (4) conclusion. Eventually, the conceptual model was reviewed by five international experts and evaluated by 62 potential users. The evaluation of the model's quality encompassed the following constructs: Perceived Ease of Understanding, Perceived Usefulness, User Satisfaction, and Perceived Semantic Quality. The findings indicated that the respondents perceived the conceptual model as having quality (mean score of 4.936 on a 7.000 scale). A t-test also revealed no significant difference between the perceptions of those with experience in developing DST and those without. This suggests that the conceptual model consisting of the DST core elements, which is the main contribution of the study, could guide digital storytellers in developing digital stories.
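
    The between-group comparison reported above can be illustrated with a minimal sketch. The ratings below are hypothetical (not the study's data), and the t statistic is computed from first principles rather than taken from the study's analysis:

```python
import math
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    standard_error = math.sqrt(var_a / len(a) + var_b / len(b))
    return (mean_a - mean_b) / standard_error

# Hypothetical 7-point quality ratings from respondents with and
# without DST development experience (invented for illustration).
experienced = [5.1, 4.8, 5.0, 4.9, 5.2, 4.7]
inexperienced = [4.9, 5.0, 4.8, 5.1, 4.7, 5.0]

t = welch_t(experienced, inexperienced)
# A |t| near zero is consistent with "no significant difference".
```

    The statistic would then be compared against the t distribution's critical value for the chosen significance level to decide whether the group difference is significant.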

    Data Modeling Patterns: A Method and Evaluation

    Patterns capture abstractions of situations that occur frequently in data modeling. Effective use of data modeling patterns can lead to high-quality designs and productivity gains. Data modeling patterns are widely available in the public domain, yet there is a lack of studies on the usability of such patterns. In this exploratory study we examine the usability of data modeling patterns. Effective use of patterns presupposes the users' ability to find similarities between task and pattern. We present and evaluate some heuristics for finding these similarities. The results of the empirical evaluation indicate that the heuristics are useful and can lead to accurate solutions. Future research directions, as well as implications for researchers and practitioners, are also discussed.

    AN EXPLORATORY EVALUATION OF THREE I.S. PROJECT PERFORMANCE MEASUREMENT METHODS

    Information systems projects play an important strategic role in organisations and are key drivers of the delivery of change. Given this prominence, it is essential to find measurement methods that effectively analyse and communicate performance to stakeholders. Further, to assure contribution to both research and practice, it is essential to verify the utility of the artifacts (i.e. methods) developed, to help validate or justify that the solutions are suitable for practice and fit the needs and contexts for which they are created. Grounded in the design science paradigm, this paper reports an exploratory evaluation of the perceived qualities of two recently developed measurement methods (the Project Performance Scorecard and the Project Objectives Measurement Model) against the traditional Triple Constraint method. An analytic scenario-based survey of fifty-one (51) participants, comprising three (3) independent samples of seventeen (17) respondents each, was used. The study analysed dimensions of task performance, ease of use, perceived usefulness, perceived semantic quality, and user satisfaction from the perspective of the participants. The preliminary study revealed encouraging results for the new methods and the general design process, which can help guide current use and further refinements. The limitations of the study and future research directions are discussed.

    A Conceptual Model of an Information Security Domain Knowledge Base

    Information security breaches and threats continue to grow worldwide. Issues in securing information systems persist despite the development of several information security standards, and the low adoption rate of these standards is one of the main contributing factors to this growing problem. As emerging economies seek to be part of the digital economy, it is prudent that they make information security a priority. The lack of effective information security strategies in developing countries has left these countries as targets for cyber criminals. In this research we present a conceptual model and a design of an Information Security Domain Knowledge Base (InfoSec DKB) that can assist in developing and managing information security strategies. The design is based on a combination of decision-making, security, and auditing frameworks, namely concepts of the Value Focused Thinking (VFT) approach used in decision making, the Guidelines for Management of IT Security (ISO/IEC 27001), and Control Objectives for Information and Related Technologies (COBIT).

    Conceptual design model of interactive television advertising: towards influencing impulse purchase tendency

    Previous research indicates the importance of content creation in interactive television (iTV) advertising, which creates opportunities for advertisers to increase the effectiveness and interactivity of iTV advertising. Impulse purchase is one of the important factors that influence consumers to purchase products. Previous studies have examined impulse purchase behaviour in different media such as websites, traditional television, and retail stores. However, those studies do not offer design models for increasing impulse purchase tendency in iTV advertising. Hence, this study focuses on the development of a Conceptual Design Model of Interactive Television Advertising, short-named iTVAdIP, that could influence impulse purchase tendency. Four (4) specific objectives were formulated: (i) to identify relevant impulse purchase components for iTV advertising, (ii) to develop a conceptual design model and a conformity tool of the iTVAdIP that embed impulse purchase tendency elements, (iii) to validate the proposed conceptual design model, and (iv) to measure the perceived influence of the conceptual design model elements on impulse purchase tendency. This study followed the design science research methodology. The conceptual design model was validated through expert review. Then, an instrument was developed to measure the perceived influence of the conceptual design model. Eight dimensions were elicited from relevant studies to form the instrument: perceived ease of use, perceived usefulness, clarity, flexibility, visibility, applicability, satisfaction, and motivation. A total of 37 potential advertising designers participated in this study. The results show that all dimensions are significantly correlated with the overall perceived influence, and the mean score of the overall perceived influence is high. Therefore, it is concluded that the iTVAdIP conceptual design model with its proposed elements is perceived as able to influence impulse purchase tendency. The iTVAdIP conceptual design model and the conformity tool are the main contributions of this study. Both can be adopted as impulse purchase design guidelines for advertising designers, particularly novice ones.
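
    The dimension-to-overall correlation analysis described above can be sketched as follows. The ratings are invented for illustration, and the Pearson coefficient is computed from first principles:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Hypothetical designer ratings: one instrument dimension
# (e.g. perceived usefulness) against overall perceived influence.
usefulness = [6, 5, 7, 4, 6, 5, 7]
overall = [6, 5, 6, 4, 7, 5, 7]

r = pearson(usefulness, overall)
# r close to 1 indicates a strong positive correlation between
# the dimension and the overall perceived influence.
```

    In the study itself, each such coefficient would also be tested for statistical significance before concluding that a dimension is related to the overall perception.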

    Minimum Viable Model (MVM) Methodology for Integration of Agile Methods into Operational Simulation of Logistics

    Background: Logistics problems involve a large number of complexities, which makes the development of models challenging. While computer simulation models are developed to address these complexities, it is essential to ensure that the necessary operational behaviours are captured and that the architecture of the model is suitable to represent them. The early stage of simulation modelling, known as conceptual modelling (CM), thus depends on successfully extracting tacit operational knowledge and avoiding misunderstanding between the client (the customer of the model) and the simulation analyst. Objective: This paper develops a methodology for managing the knowledge-acquisition process needed to create a sufficient simulation model at the early, or CM, stage to ensure the correctness of the operational representation. Methods: A minimum viable model (MVM) methodology was proposed with five principles relevant to CM: iterative development, embedded communication, soliciting tacit knowledge, interactive face validity, and a sufficient model. The method was validated by a case study of freight operations, and the results were encouraging. Conclusions: The MVM method improved the architecture of the simulation model by eliciting tacit knowledge and clearing up communication misunderstandings. It also helped shape the architecture of the model towards the features most appreciated by the client and away from features not needed in the model. Originality: The novel contribution of this work is a method for eliciting tacit information from industrial clients and building a minimally sufficient simulation model at the early modelling stage. The framework is demonstrated for logistics operations, though the principles may benefit simulation practitioners more generally.

    A method for the unified definition and treatment of conceptual schema quality issues

    The modern world is software-intensive. National infrastructures, smartphones and computers, health-care systems, e-commerce... everything is run by software. Therefore, developing high-quality software solutions is essential for our society. Conceptual modeling is an early activity of the software development process whose aim is to define the conceptual schema of a domain. As the role played by conceptual schemas in software development becomes more relevant (because of, for example, the emergence of model-driven approaches), their quality becomes crucial too. The quality of a conceptual schema can be analyzed in terms of "quality properties". All conceptual schemas should have the fundamental properties of syntactic and semantic correctness, relevance, and completeness, as well as any other quality property that has been proposed in the literature and that may be required or recommended in particular projects. Only a few quality properties have been integrated into the development environments used by professionals and students, and thus enforced in the conceptual schemas they develop. A possible explanation for this unfortunate fact is that the proposals have been defined in the literature in disparate ways, which makes it difficult to integrate them into those environments. The goal of this thesis is to ease the integration of those quality properties that can be evaluated using the conceptual schema itself. We propose a method that permits the unified definition and treatment of conceptual schema quality issues, which we understand as "important quality topics or problems for debate or discussion". Our work includes, on the one hand, a characterization and formalization of conceptual schema quality issues and, on the other hand, the creation of a catalog of quality issues obtained from the literature and defined using the aforementioned formalization.
    We also provide a prototype implementation of our method, which integrates the catalog of quality issues on top of a real modeling tool. This implementation provides assistance to conceptual modelers during the development of a conceptual schema in a non-disruptive manner. Moreover, our thesis discusses incremental methods for the efficient evaluation of OCL expressions in the context of quality issues and integrates one of them into our prototype tool.
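
    Quality issues of the kind catalogued by the thesis are checks evaluated over the schema itself. As a minimal illustration (using a toy schema representation invented here, not the thesis's formalization), an issue that flags entity types declaring no attributes might look like:

```python
from dataclasses import dataclass, field

@dataclass
class EntityType:
    """Toy stand-in for an entity type in a conceptual schema."""
    name: str
    attributes: list = field(default_factory=list)

def attributeless_entity_types(schema):
    """A toy 'quality issue': entity types without any attribute
    are often a sign of an incomplete schema."""
    return [e.name for e in schema if not e.attributes]

schema = [
    EntityType("Person", ["name", "birthDate"]),
    EntityType("Address"),  # no attributes: flagged by the issue
]
flagged = attributeless_entity_types(schema)
```

    A modeling-tool integration of this kind of check could re-evaluate only the issues affected by each edit, which is where the incremental evaluation methods discussed above become relevant.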

    NEW ARTIFACTS FOR THE KNOWLEDGE DISCOVERY VIA DATA ANALYTICS (KDDA) PROCESS

    Recently, interest in the business application of analytics and data science has increased significantly. The popularity of data analytics and data science comes from the clear articulation of business problem solving as an end goal. To address limitations in the existing literature, this dissertation provides four novel design artifacts for Knowledge Discovery via Data Analytics (KDDA). The first artifact is a Snail Shell KDDA process model that extends existing knowledge discovery process models while addressing many of their limitations. At the top level, the KDDA process model highlights the iterative nature of KDDA projects and adds two new phases, namely Problem Formulation and Maintenance. At the second level, the generic tasks of the KDDA process model are presented in a comparative manner, highlighting the differences between the new KDDA process model and traditional knowledge discovery process models. Two case studies demonstrate how to use the KDDA process model to guide real-world KDDA projects. The second artifact, a methodology for theory building based on quantitative data, is a novel application of the KDDA process model. The methodology is evaluated using a theory-building case from the public health domain. It is not only an instantiation of the Snail Shell KDDA process model but also makes theoretical contributions to theory building, demonstrating how analytical techniques can be used as quantitative gauges to assess important construct relationships during the formative phase of theory building. The third artifact is a data mining ontology, the DM3 ontology, which bridges the semantic gap between business users and KDDA experts and facilitates analytical model maintenance and reuse. The DM3 ontology is evaluated using both a criteria-based approach and a task-based approach. The fourth artifact is a decision support framework for MCDA software selection. The framework enables users to choose relevant MCDA software based on a specific decision-making situation (DMS). A DMS modeling framework is developed to structure the DMS based on the decision problem and the users' decision preferences. The framework is implemented in a decision support system and evaluated using application examples from the real-estate domain.
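
    A common MCDA technique that such a selection framework could build on is simple additive weighting. The criteria, weights, and scores below are invented for illustration and are not taken from the dissertation:

```python
def weighted_sum(scores, weights):
    """Simple additive weighting: rank alternatives by the
    weight-normalized sum of their criterion scores."""
    total = sum(weights.values())
    norm = {c: w / total for c, w in weights.items()}
    return {alt: sum(norm[c] * s[c] for c in norm)
            for alt, s in scores.items()}

# Hypothetical MCDA-software alternatives scored 0-10 on criteria
# elicited from a decision-making situation (DMS).
weights = {"usability": 3, "method_coverage": 2, "cost": 1}
scores = {
    "ToolA": {"usability": 8, "method_coverage": 6, "cost": 4},
    "ToolB": {"usability": 5, "method_coverage": 9, "cost": 9},
}
ranking = weighted_sum(scores, weights)
best = max(ranking, key=ranking.get)  # alternative with the top score
```

    A full MCDA selection framework would also structure how the criteria and weights themselves are derived from the decision problem and the user's preferences, which is what the DMS modeling framework above addresses.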