38 research outputs found

    A Conceptual Framework for Mobile Learning

    Several technology projects have been launched to explore the opportunities that mobile technologies bring about when tackling issues of democratic participation and social inclusion through mobile learning. Mobile devices are cheaper than, for instance, a PC, and their affordance, usability and accessibility are such that they can potentially complement or even replace traditional computer technology. The importance of the communication and collaboration features of mobile technologies has been stressed in the framework of ICT-mediated learning. In this paper, a theoretical framework for mobile learning and e-inclusion is developed for people outside the conventional education system. The framework draws upon the fields of pedagogy (constructivist learning in particular), mobile learning objects and sociology.
    Keywords: Mobile Learning, Digital Divide, Constructivist Pedagogy, Forms of Capital

    A PROCESS-BASED APPROACH TO KNOWLEDGE MANAGEMENT

    This paper analyses the relationship between business process modelling, knowledge management and information systems development projects. The paper’s main objective is to present business rules as the encoded knowledge of corporate business practices. Further, it introduces a rule-based business activity meta-model as a repository in which business knowledge can be captured and traced from its origin in the business environment through to its implementation in information systems. A case study of the Croatian Ministry of Finance is presented, discussing the practical experience of integrating a business process repository and organisational knowledge as the foundation for information system development.
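    As a hypothetical illustration of the idea of capturing rules with traceability from origin to implementation (the class and field names below are invented for this sketch and are not the paper's meta-model), a minimal rule repository might look like:

```python
from dataclasses import dataclass, field

@dataclass
class BusinessRule:
    # All names here are illustrative, not taken from the paper.
    rule_id: str
    statement: str   # natural-language rule as stated in the business
    origin: str      # source in the business environment (law, policy, contract)
    implemented_in: list = field(default_factory=list)  # IS artefacts realizing it

class RuleRepository:
    """Minimal sketch of a repository in which business knowledge is
    captured and traced from its origin to its implementation."""

    def __init__(self):
        self._rules = {}

    def add(self, rule: BusinessRule):
        self._rules[rule.rule_id] = rule

    def trace(self, rule_id: str):
        # Return the traceability chain for one rule:
        # business origin -> encoded statement -> implementing artefacts.
        r = self._rules[rule_id]
        return r.origin, r.statement, r.implemented_in
```

The point of the sketch is only that each rule carries both its business origin and the system artefacts that implement it, so the knowledge remains traceable in both directions.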

    The formulation of competitive actions in practice

    This is a study of what managers do in relation to the formulation of competitive actions. The study started with Project 1 (P1), a literature review that looked at managers’ cognitions in respect of competitive positioning and competitive strategy. A gap was found in how individual competitive actions are formulated and executed. A gap was also found concerning what managers do in response to interpretations of their competitive environments. Following the literature review, a series of semi-structured interviews was undertaken with managers, and 26 individual competitive actions were recorded and analysed in Project 2 (P2). A structure to the formulation of competitive actions was discerned and developed into a processual model that is triggered by a stimulus, followed by the manager envisaging a desired outcome and setting objectives, then deciding which levers to use, developing the action and refining it. Its application to practice was developed in Project 3 (P3) through an aide-mémoire tool to assist managers. The study makes a contribution to theory by providing a framework that captures the way in which managers construe and formulate competitive actions. In P2 it was found that managers tend to follow a largely homogeneous process and that the tools and techniques offered in the extant literature are seldom used. The managers interviewed in mature industries were far more aware of who their competitors were than those in nascent industries. This had a bearing on the formulation of competitive actions insofar as companies operating in mature industries formulated competitive actions to fend off or compete with their competitors more effectively, while companies operating in nascent industries tended to formulate competitive actions with the aim of exploiting gaps in the market.
It was found in P2 that managers’ backgrounds, including their functional and educational backgrounds as well as their national and cultural backgrounds, had a bearing on how they construed their competitors and the competitive actions they formulated. It was also found that competitive actions were formulated and executed in an iterative process, whereby managers would refine their actions by applying the lessons from previous actions. Managers, particularly those with more experience, relied heavily on intuition and tacit knowledge, as well as input from colleagues and customers, when formulating competitive actions. Contrary to assertions in much of the extant literature that companies do not deviate from industry norms when formulating competitive actions, the study found that managers would often do so in search of abnormal profits. The study makes a contribution to practice by providing a guide to assist in formulating competitive actions. The guide is based on the processual model developed in P2 and is summarised in five key steps, comprising Stimulus, Objectives, Levers, Actions and Refinement, abbreviated ‘SOLAR’.

    Local Binary Patterns in Focal-Plane Processing. Analysis and Applications

    Feature extraction is the part of pattern recognition where the sensor data is transformed into a form more suitable for the machine to interpret. The purpose of this step is also to reduce the amount of information passed to the next stages of the system, while preserving the information essential for discriminating the data into different classes. For instance, in the case of image analysis the actual image intensities are vulnerable to various environmental effects, such as lighting changes, and feature extraction can be used as a means of detecting features which are invariant to certain types of illumination change. Finally, classification tries to make decisions based on the previously transformed data. The main focus of this thesis is on developing new methods for embedded feature extraction based on local non-parametric image descriptors. Feature analysis is also carried out for the selected image features. Low-level Local Binary Pattern (LBP) based features play a central role in the analysis. In the embedded domain, the pattern recognition system must usually meet strict performance constraints, such as high speed, compact size and low power consumption. The characteristics of the final system can be seen as a trade-off between these metrics, which is largely affected by the decisions made during the implementation phase. The implementation alternatives of LBP-based feature extraction are explored in the embedded domain in the context of focal-plane vision processors. In particular, the thesis demonstrates LBP extraction with the MIPA4k massively parallel focal-plane processor IC. Higher-level processing is also incorporated into this framework, by means of a framework for implementing a single-chip face recognition system. Furthermore, a new method for determining optical flow based on LBPs, designed in particular for the embedded domain, is presented.
Inspired by some of the principles observed through the feature analysis of Local Binary Patterns, an extension to the well-known non-parametric rank transform is proposed, and its performance is evaluated in face recognition experiments with a standard dataset. Finally, an a priori model in which LBPs are seen as combinations of n-tuples is also presented.
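    As a sketch of the basic operator only (not the MIPA4k focal-plane implementation described in the thesis), the standard 8-neighbour LBP code for a single pixel thresholds each neighbour against the centre intensity and packs the resulting bits into one byte:

```python
import numpy as np

def lbp_8(image, r, c):
    """Basic 8-neighbour Local Binary Pattern code for pixel (r, c).

    Each neighbour is thresholded against the centre intensity; the
    resulting bits are packed clockwise into a single byte (0..255).
    """
    center = image[r, c]
    # Clockwise neighbour offsets starting at the top-left pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if image[r + dr, c + dc] >= center:
            code |= 1 << bit
    return code
```

    Because the code depends only on intensity orderings, it is unchanged by any monotonic illumination shift, which is the invariance property the abstract refers to.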

    Algorithmic Techniques in Gene Expression Processing. From Imputation to Visualization

    The amount of biological data has grown exponentially in recent decades. Modern biotechnologies, such as microarrays and next-generation sequencing, are capable of producing massive amounts of biomedical data in a single experiment. As the amount of data is rapidly growing, there is an urgent need for reliable computational methods for analyzing and visualizing it. This thesis addresses this need by studying how to efficiently and reliably analyze and visualize high-dimensional data, especially that obtained from gene expression microarray experiments. First, we study ways to improve the quality of microarray data by replacing (imputing) the missing data entries with estimated values. Missing value imputation is a method commonly used to make incomplete data complete, thus making it easier to analyze with statistical and computational methods. Our novel approach was to use curated external biological information as a guide for the missing value imputation. Secondly, we studied the effect of missing value imputation on downstream data analysis methods such as clustering. We compared multiple recent imputation algorithms on 8 publicly available microarray data sets. It was observed that missing value imputation is indeed a rational way to improve the quality of biological data. The research revealed differences between the clustering results obtained with different imputation methods. On most data sets the simple and fast k-NN imputation was good enough, but there was also a need for more advanced imputation methods, such as Bayesian Principal Component Analysis (BPCA). Finally, we studied the visualization of biological network data. Biological interaction networks are examples of the outcome of multiple biological experiments, such as those using gene microarray techniques.
Such networks are typically very large and highly connected, so there is a need for fast algorithms that produce visually pleasing layouts. A computationally efficient way to produce layouts of large biological interaction networks was developed. The algorithm uses multilevel optimization within the regular force-directed graph layout algorithm.
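    To make the k-NN imputation baseline mentioned above concrete, here is a minimal sketch (not the thesis implementation, which additionally uses curated biological information as a guide): each row with missing entries is completed from the mean of its k nearest fully observed rows, with distance measured over the observed columns.

```python
import numpy as np

def knn_impute(data, k=2):
    """Impute NaN entries row-wise from the k nearest complete rows.

    Distance between rows is the Euclidean distance over the columns
    observed in the incomplete row. A minimal k-NN imputation sketch.
    """
    data = data.astype(float).copy()
    for i, row in enumerate(data):
        missing = np.isnan(row)
        if not missing.any():
            continue
        # Candidate donor rows: those with no missing values at all.
        donors = [j for j in range(len(data))
                  if j != i and not np.isnan(data[j]).any()]
        if not donors:
            continue  # nothing to impute from
        obs = ~missing
        dists = [np.linalg.norm(data[j][obs] - row[obs]) for j in donors]
        nearest = [donors[j] for j in np.argsort(dists)[:k]]
        # Fill each missing entry with the mean over the k nearest donors.
        data[i, missing] = data[nearest][:, missing].mean(axis=0)
    return data
```

    The appeal of this baseline is its simplicity and speed; methods such as BPCA instead fit a probabilistic low-rank model of the whole matrix, which the thesis found necessary on some data sets.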

    Sparse Predictive Modeling : A Cost-Effective Perspective

    Many real-life problems encountered in industry, economics or engineering are complex and difficult to model by conventional mathematical methods. Machine learning provides a wide variety of methods and tools for solving such problems by learning mathematical models from data. Methods from the field have found their way into applications such as medical diagnosis, financial forecasting, and web search engines. The predictions made by a learned model are based on a vector of feature values describing the input to the model. However, predictions do not come for free in real-world applications, since the feature values of the input have to be bought, measured or produced before the model can be used. Feature selection is the process of eliminating irrelevant and redundant features from the model. Traditionally, it has been applied to achieve interpretable and more accurate models, while the possibility of lowering prediction costs has received much less attention in the literature. In this thesis we consider novel feature selection techniques for reducing prediction costs. The contributions of this thesis are as follows. First, we propose several cost types characterizing the cost of performing prediction with a trained model. In particular, we consider costs emerging from multi-target prediction problems as well as a number of cost types arising when the feature extraction process is structured. Second, we develop greedy regularized least-squares methods to maximize the predictive performance of the models under given budget constraints. Empirical evaluations are performed on numerous benchmark data sets as well as on a novel water quality analysis application. The results demonstrate that in settings where the considered cost types apply, the proposed methods lead to substantial cost savings compared to conventional methods.
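    The general shape of greedy selection under a budget constraint can be sketched as follows. This is a generic illustration, not the thesis's greedy regularized least-squares algorithm: `score` stands in for any evaluation of predictive performance on a candidate feature subset, and the gain-per-cost criterion is one common heuristic choice.

```python
def greedy_budget_selection(features, costs, score, budget):
    """Greedily add the feature with the best score improvement per unit
    cost while staying within the budget.

    features : list of feature names
    costs    : dict mapping feature name -> acquisition cost
    score    : callable(subset) -> predictive performance estimate
    budget   : maximum total cost of the selected subset
    """
    selected, spent = [], 0.0
    remaining = list(features)
    current = score(selected)
    while remaining:
        best, best_gain = None, 0.0
        for f in remaining:
            if spent + costs[f] > budget:
                continue  # feature no longer affordable
            gain = (score(selected + [f]) - current) / costs[f]
            if gain > best_gain:
                best, best_gain = f, gain
        if best is None:
            break  # no affordable feature improves the score
        selected.append(best)
        spent += costs[best]
        current = score(selected)
        remaining.remove(best)
    return selected
```

    In the thesis's setting, `score` would be the regularized least-squares performance estimate, which makes each candidate evaluation cheap enough for the greedy loop to be practical.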

    Selective Outsourcing in Global IT Services : Operational Level Challenges and Opportunities

    Companies need to respond to their customers’ perceptions in a timely and efficient manner in order to stay in business. Companies are finding ways to control and reduce costs. Increasingly, internal IT development and service delivery activities are outsourced to external suppliers. The most common outsourcing forms are total and selective outsourcing, which are produced in nearshore and/or offshore mode. In this dissertation, the case units are two global units in Nokia Devices: the IT unit and the Delivery Quality and Corrective Action Preventive Action (DQ and CAPA) unit. This dissertation consists of five publications and five research questions. The motives for the research questions originate from the case units’ real-life needs and challenges. The research approach used is qualitative. Action research was conducted during the years 2009–2013. This research focuses on global IT service delivery, although the case company’s core competence was to produce end-consumer products. The target was to gain operational-level knowledge of the case units’ outsourcing operations and practices in a Global Selective Outsourcing Environment (GSOE). This dissertation addresses the opportunities and challenges of outsourcing faced by operational-level personnel. In the GSOE, the service-purchasing company’s personnel and the supplier’s personnel jointly cooperate to produce the expected outcomes and IT services. This research found that GSOE-based operation includes multi-level customer- and supplier-ships. In order to respond to customers’ perceptions, the operation included quality- and customer-centric practices. This research found that defining and implementing customer centricity is challenging. Unclear definitions, requirements, roles, responsibilities, and activities can negatively affect the operational-level implementation. The GSOE-based operation also includes contract negotiations among the GSOE parties.
Successful IT outsourcing is not built only on formal contracts. Focus is also needed on building trust, commitment, communication, and mutual cooperation and dependence. This study found that retaining operational-level progress and information visibility inside the service-purchasing company made it possible to hold the ownership and avoid falling into a “supplier trap.” The operational-level cooperation, interaction and quality management practices affected the service-purchasing company’s trust and satisfaction. The trust in the case units was found to exist among people, and this trust was formed based on an individual’s knowledge, capabilities, behavior, and performance. Quality management practices played a significant role in building trust, which added to the credibility of the operation.

    “Becoming Ahuman: making it desirable to abandon certainty, including certainty of the self, and play in this chaotic situation”

    Title: “Becoming Ahuman: making it desirable to abandon certainty, including certainty of the self, and play in this chaotic situation”. Ralph Dorey, Northumbria University, 2020. This research brings together resonating creative processes from feminist literature, game design, queer gender politics, post-structuralist philosophy, and horror cinema. It uses these to articulate an art practice which is unstable and generative, both for the artist during the process of production and again for the audience. The PhD output, as combined thesis and practice, consists of three books, each approaching the question, “How to negotiate art practice as involving processes which are unstable, affective, and resistant to structures?” Each book takes a different position regarding this question and in doing so reshapes it into a sub-question. The book “Ahuman Desire” explores the question “How to negotiate art practice as involving affects which are at some times indescribable, or overwhelming?” The book “Ahuman Use” explores the question “How to negotiate art practice as involving salvaged or stolen systems, which are always already breaking down?” The book “Unknown Lacuna” explores the question “How to negotiate art practice as involving unstable things which can only be seen through what they do?” Each engages the same question, but with a different emphasis. They are three different attempts, and the obvious implication is that these are three of many more potential attempts. I have undertaken an extensive literature review across fields which border on art practice. The three books bring together a vast matrix of research sources and make these visible and accessible as an act of care, in keeping with the feminist writing practices which underpin the work.
I have developed original methodologies which are used in the different documents across the three books and include the use of speculative fiction, plagiarism, formalist writing strategies, drawing, performance, games, and screenplays as research, as well as the use of artworks as a site to examine the relationships between different theories of creative process. The rigour of the PhD Output exists not just in the scale of the sources processed and responded to, but in its infrastructural approach, which departs from academic norms to resist a cataloguing or hierarchical envelope for the knowledges within. The PhD Output addresses one of its returning processes, Excess, through its form. It is large in scope and shifts responsibility to the reader to navigate this Excess. This demonstrates the affects of anxiety addressed in many of its documents, before the aforementioned attention to acts of care re-frames this disorder as generative. This mirrors the repeated conceptual and narrative refrain in many documents whereby the horror of the unknown is reorientated to become a creative and dynamic approach to knowledge which does not need to be fixed or enveloped. The PhD Output aims to support reader engagement based on the reader’s desire, rather than through an external economy that ascribes or denies a degree of value based on adherence to pre-existing parameters. This approach is a departure from the common structures of academic research, while still demonstrating critical judgement and original contributions to knowledge. The departure is necessary firstly because of the research questions above, and secondly because of the commonality of destabilisation in the source materials, from feminist writing practices and philosophy to collaborative games and horror media. Thirdly, the departure enables the specificity of the practice-based PhD Output not just to describe processes but to enact them at the reader’s point of encounter with the research. The primary findings of the research are as follows.
The potential for the form of Tabletop Role-Playing Game Manuals to inform an art practice when combined with the philosophy of Gilles Deleuze and FĂ©lix Guattari. The mutual illumination offered when these are combined with feminist writing practices, or Écriture FĂ©minine. The potential for Écriture FĂ©minine to inform contemporary queer feminist art practices which incorporate the forms of video games, as well as recognising the event of audience encounter with such artworks as a creative one. The use of horror cinema as a means to articulate art practice concerned with affect. The potential of practice-based art research to produce new ways of producing and delivering original research in a dynamic rather than fixed structure. This research is of value due to its relevance to contemporary practice. This relevance is evidenced by the recent attention to queer indie game design (‘Beyond the Console’, n.d.; Faber, 2019; Humphreys, n.d.; Thaddeus-Johns, 2019; Wallace, 2019), experimental feminist writing practices incorporating speculative fiction (Hedva, 2018; Hval, 2018; Jackson & Leslie, 2018; Waidner, 2019), the divisive concept of “elevated horror” (Carrol, 2019; Crump, 2019; Ehrlich, 2019; Gardner, 2019; Taylor, 2019), and the folding of these into art practice. The research includes in-depth analyses of artworks by two artists who have relatively recently received a high international profile (Apexart, 2019; ‘Dark Continent: Semiramis Performance | Arts Council Collection’, n.d.; ‘Porpentine Charity Heartscape’, n.d.; Tate, n.d.) and have not yet been the subject of monographs or a large amount of academic study, particularly within the field of art.
The relevance of this research is further supported by recent publications and events in overlapping fields (Brazil, 2019; Burrows & O’Sullivan, 2019; Editorial Staff, 2019; Fisher, 2018; ‘Flickering Monstrosities Hyperfiction Reading Group’, 2019; ‘ICA | I, I, I, I, I, I, I, Kathy Acker’, n.d.; Lewis, n.d.; Little, 2019; Pyrne, 2019; Shaw & Reeves-Evison, 2017).