
    A review of methodologies to assess urban freight initiatives

    Only a few urban freight initiatives expand their scale of application beyond the initial pilot experimentation. To overcome existing barriers to larger-scale optimization of urban freight distribution activities, it is necessary to develop and test proper methodologies that assess all aspects relevant to this context. In this paper we propose a classification of existing assessment methodologies in order to underline their advantages and disadvantages, along with possible research gaps and future trends. For this review we adopt a framework constructed on two dimensions of an assessment methodology, namely the method used and the scope. As for the method used, methodologies can be either quantitative, if they aim at simulating or evaluating outcomes in terms of vehicle flows, pollutant emissions, or monetary outcomes, or qualitative, if they are directed towards elucidating the subjective assessment of stakeholders. Concerning the scope, existing methodologies can cover three main aspects of urban freight distribution systems, namely the measures to be assessed, the stakeholders, and the impact areas.

    A novel reusable learning object development (RLO) for supporting engineering laboratory education

    Advances on Time Series Analysis using Elastic Measures of Similarity

    A sequence is a collection of data instances arranged in a structured manner. When this arrangement is held in the time domain, sequences are instead referred to as time series. As such, each observation in a time series is drawn from an underlying process and produced at a specific time instant. However, other types of data indexing structures, such as space- or threshold-based arrangements, are possible. Data points that compose a time series are often correlated with each other. To account for this correlation in data mining tasks, time series are usually studied as whole data objects rather than as collections of independent observations. In this context, techniques for time series analysis aim at analyzing this type of data structure by applying specific approaches developed to leverage intrinsic properties of the time series for a wide range of problems, such as classification, clustering and other related tasks. The development of monitoring and storage devices has made time series analysis proliferate in numerous application fields, including medicine, economics, manufacturing and telecommunications, among others. Over the years, the community has gathered efforts towards the development of new data-based techniques for time series analysis suited to address the problems and needs of such application fields. In the related literature, such techniques can be divided into three main groups: feature-, model- and distance-based methods. The first group (feature-based) transforms time series into a collection of features, which are then used by conventional learning algorithms to provide solutions to the task under consideration. In contrast, methods belonging to the second group (model-based) assume that each time series is drawn from a generative model, which is then harnessed to elicit knowledge from data. Finally, distance-based techniques operate directly on raw time series. To this end, these methods resort to specially defined measures of distance or similarity for comparing time series, without requiring any further processing. Among them, elastic similarity measures (e.g., dynamic time warping and edit distance) compute the closeness between two sequences by finding the best alignment between them, disregarding differences in time and thus focusing exclusively on shape differences. This Thesis presents several contributions to the field of distance-based techniques for time series analysis, namely: i) a novel multi-dimensional elastic similarity learning method for time series classification; ii) an adaptation of elastic measures to streaming time series scenarios; and iii) the use of distance-based time series analysis to make machine learning methods for image classification robust against adversarial attacks. Throughout the Thesis, each contribution is framed within its related state of the art, explained in detail and empirically evaluated. The obtained results lead to new insights on the application of distance-based time series methods for the considered scenarios and motivate research directions that highlight the vibrant momentum of this research area.
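    To make the alignment idea behind these elastic measures concrete, the following is a minimal sketch of dynamic time warping; the toy sequences and the absolute-difference local cost are illustrative assumptions rather than material from the thesis itself.

```python
import numpy as np

def dtw_distance(x, y):
    """Minimal dynamic time warping distance between two 1-D sequences.

    Fills the classic cumulative-cost matrix: each cell holds the local
    cost |x[i] - y[j]| plus the cheapest of the three admissible
    predecessor alignments (match, insertion, deletion).
    """
    n, m = len(x), len(y)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(x[i - 1] - y[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Two series with the same shape but different lengths still align:
a = [0, 1, 2, 3, 2, 1, 0]
b = [0, 0, 1, 2, 3, 2, 1, 0]
print(dtw_distance(a, b))  # 0.0: identical shapes despite the time shift
```

    The point the sketch illustrates is that the warping path absorbs differences in timing, so only shape mismatches contribute to the final distance.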

    Case-based medical informatics

    BACKGROUND: The "applied" nature distinguishes applied sciences from theoretical sciences. To emphasize this distinction, we begin with a general, meta-level overview of the scientific endeavor. We introduce the knowledge spectrum and four interconnected modalities of knowledge. In addition to the traditional differentiation between implicit and explicit knowledge, we outline the concepts of general and individual knowledge. We connect general knowledge with the "frame problem," a fundamental issue of artificial intelligence, and individual knowledge with another important paradigm of artificial intelligence, case-based reasoning, a method of individual knowledge processing that aims at solving new problems based on the solutions to similar past problems. We outline the fundamental differences between Medical Informatics and theoretical sciences and propose that Medical Informatics research should advance individual knowledge processing (case-based reasoning), and that natural language processing research is an important step towards this goal that may have ethical implications for patient-centered medicine. DISCUSSION: We focus on fundamental aspects of decision-making, which connect human expertise with individual knowledge processing. We continue with a knowledge spectrum perspective on biomedical knowledge and conclude that case-based reasoning is the paradigm that can advance towards personalized healthcare and enable the education of patients and providers. We center the discussion of formal methods of knowledge representation around the frame problem. We propose a context-dependent view of the notion of "meaning" and advocate the need for case-based reasoning and natural language processing research. In the context of memory-based knowledge processing, pattern recognition, comparison and analogy-making, we conclude that while humans seem to naturally support the case-based reasoning paradigm (memory of past problem-solving experiences and powerful case-matching mechanisms), technical solutions remain challenging. Finally, we discuss the major challenges for a technical solution: case record comprehensiveness, organization of information on similarity principles, development of pattern recognition, and resolution of ethical issues. SUMMARY: Medical Informatics is an applied science that should be committed to advancing patient-centered medicine through individual knowledge processing. Case-based reasoning is the technical solution that enables continuous individual knowledge processing and could be applied, provided that the challenges and ethical issues that arise are addressed appropriately.
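    As a small illustration of the "solving new problems based on the solutions to similar past problems" idea described above, here is a minimal retrieve-and-reuse sketch; the case base, feature vectors, and Euclidean similarity measure are hypothetical stand-ins, not details from the paper.

```python
from math import dist

# Hypothetical case base: each past case pairs a feature vector
# (a few numeric observations) with the solution that worked for it.
case_base = [
    ((37.0, 120, 0.0), "treatment A"),
    ((39.5, 95, 1.0), "treatment B"),
    ((38.2, 110, 1.0), "treatment C"),
]

def solve_by_cbr(new_case):
    """Retrieve the most similar stored case (Euclidean distance here)
    and reuse its solution -- the core retrieve/reuse loop of CBR."""
    _, best_solution = min(case_base, key=lambda c: dist(c[0], new_case))
    return best_solution

print(solve_by_cbr((38.0, 112, 1.0)))  # -> "treatment C"
```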

    Fostering Improvement in Occupational Performance Through Environment Modification in Skilled Nursing Facilities

    First-year occupational therapy students at the University of North Dakota School of Medicine and Health Sciences completed this CAT. The general topic assigned was the completion of activities of daily living (ADLs) and instrumental activities of daily living (IADLs) by older adults in a skilled nursing facility (SNF). Students formed a focus question and a case scenario, and then researched and presented key findings. Students then presented implications for practice in occupational therapy, supported by evidence from scholarly research in the areas of theory, environment, population, and interventions. The intervention chosen for this CAT was the Green House Project, an intervention strategy whose proponents suggest remodeling skilled nursing facilities into small homes (10 or fewer residents per home) designed to be similar to the clients’ home environments (Cutler & Kane, 2009). The population of older adults living in SNFs in urban areas was further explored to gain a better understanding of their unique characteristics. The Person-Environment-Occupation (PEO) model (Law et al., 1996) was the theory chosen to guide scholarly research for this CAT. The founders of PEO emphasized the importance of goodness of fit (Law et al., 1996). Law et al. (1996) suggested that the person, environment, and occupation should be interdependent, creating optimal performance for the individual in whichever context he or she is performing his or her occupation.

    Pedagogical interventions and their influences on university-level students learning pharmacology-a realist review

    Introduction: The knowledge complexity and varied delivery formats in pharmacology education can leave students unprepared in essential pharmacotherapy skills. This significantly influences their ways of thinking and working in clinical environments, resulting in a challenging clinical transition. This need demands pedagogical innovations to strengthen pharmacology education and improve learners’ skills and competencies in pharmacotherapy. This evidence-based realist review aimed to examine the contextual factors and program theories or causal mechanisms crucial for effective pedagogical interventions in pharmacology, seeking to answer the question of ‘what works for whom, under what circumstances, how, and why’. Method: The realist synthesis was initiated after retrieving data from Medline (OVID), Cochrane, EBSCO-hosted ERIC, SCOPUS, and Embase (OVID), including other sources for additional records. The preliminary analysis enabled the establishment of context-mechanism-outcome configurations (CMOC) and the formulation and refinement of the initial program theory regarding pedagogical interventions in pharmacology. Iterative data synthesis helped to identify the relevant contexts and unravel their relationships with the underlying causal mechanisms through which said interventions generate the outcomes of interest. Results: The realist review analyzed 1,217 records and identified 75 articles examining a range of educational interventions, from individual efforts to faculty-wide curriculum changes, in pharmacology education. The key contexts for pharmacology education were troublesome content, traditional delivery methods, and inadequate and limited opportunities for knowledge integration and application. Active participation in interactive learning, along with enjoyment and motivation, was proposed as a causal mechanism for optimizing cognitive load and achieving positive outcomes. The outcomes of the review include subjective perceptions of improved confidence and satisfaction, and objective measurements of high post-test scores. Discussion: Pedagogical scaffolding in constructivist learning environments helps students overcome challenges in learning troublesome pharmacology knowledge. Considering the human cognitive system’s processing capacity, these interventions improve learning by effectively using cognitive resources. Innovations that focus on optimizing cognitive load through task construction can also promote positive emotional experiences in students, such as engagement and enjoyment, as explained by flow theory. A constructive learning environment, where cognitive load is optimized and high flow is achieved, can maximize the impact of pedagogical interventions in pharmacology. Systematic Review Registration: https://www.crd.york.ac.uk/prospero/display_record.php?RecordID=160441, PROSPERO (CRD42020160441).

    IMMERSIVE VIRTUAL REALITY FOR EXPERIENTIAL LEARNING

    Immersive virtual reality is any computer-generated environment capable of fooling the user’s senses with a feeling of presence (being there). Two different types of hardware are usually used to access immersive virtual reality: Head Mounted Displays (HMD) or the Cave Automatic Virtual Environment (CAVE). Due to its ability to generate any kind of environment, either real or imaginary, immersive virtual reality can be used as a tool to deliver experiential learning, as described by Kolb (1984) in his experiential learning cycle model. This model identifies four steps that, as part of a cycle, describe the process of learning by experiencing something: (1) concrete experience, (2) observations and reflections, (3) formulation of abstract concepts and generalizations, and (4) testing implications of concepts in new situations. Immersive virtual reality has been around for decades, but in spite of the big buzz around it, widespread adoption of the technology has not yet occurred. One of the main barriers to adoption is the high cost of the required hardware. However, recent developments in technology are pushing prices down. For instance, Google Cardboard offers a very inexpensive way to experience virtual reality through smartphones. Moreover, the prices of HMDs and of the powerful computers needed to run virtual reality software are expected to fall, as has already happened with desktop computers. The Technology Acceptance Model (TAM), as introduced by Davis (1989), is an attempt to understand the factors behind the adoption of new technologies. In particular, this model introduces the two key concepts of (1) perceived usefulness and (2) perceived ease of use. Looking at these, the manuscript attempts to shed some light on the current state of adoption. The findings of this study have both theoretical and managerial implications, useful to both schools and vendors. The main finding of this study is that more research is needed to understand how people learn in immersive virtual reality and how to develop software capable of delivering experiential learning. A tighter collaboration between schools, students, manufacturers, and software developers seems to be the most viable way forward.