72 research outputs found

    The Economic Impacts of Port Activity in Antwerp: A Disaggregated Analysis

    The importance of ports is usually measured by indicators such as added value, employment and investment at a highly aggregated level. This paper tries to define the importance of the port of Antwerp for the regional and national economy at a disaggregated level. It attempts to identify, quantify and locate the mutual relationships between the different players in the port, and between these players and other industries. Finally, it proposes a method to calculate the effects of changes in port activity at a detailed level. A sector analysis is carried out by means of a reduced regional input-output table, built through a bottom-up approach. The most important customers and suppliers of the port's key players or stakeholders are identified. A geographical analysis is feasible because the data are available at a disaggregated level: each customer or supplier can be located by means of its postcode. In this way, the extent of the economic impact of the port of Antwerp is quantified.
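
    For readers unfamiliar with the technique, the computation underlying such an input-output analysis is the Leontief quantity model: given a technical-coefficient matrix A and a final-demand vector f, total sectoral output solves x = (I - A)^{-1} f, and the impact of a change in port activity follows from re-solving with a perturbed f. The C++ sketch below illustrates this mechanic on a hypothetical three-sector economy; the coefficients and sector labels are invented for illustration and are not the paper's Antwerp data.

        #include <array>
        #include <cstdio>

        // Minimal sketch of the Leontief quantity model used in input-output
        // analysis: total output x satisfies x = A*x + f, so x = (I - A)^{-1} f.
        // The 3-sector coefficient matrix below is hypothetical.

        constexpr int N = 3;

        // Solve (I - A) x = f by Gauss-Jordan elimination.
        std::array<double, N> leontief_output(const double A[N][N],
                                              const std::array<double, N>& f) {
            double M[N][N + 1];
            for (int i = 0; i < N; ++i) {
                for (int j = 0; j < N; ++j)
                    M[i][j] = (i == j ? 1.0 : 0.0) - A[i][j];  // I - A
                M[i][N] = f[i];                                // final demand
            }
            for (int p = 0; p < N; ++p) {                      // eliminate
                double piv = M[p][p];
                for (int j = p; j <= N; ++j) M[p][j] /= piv;
                for (int i = 0; i < N; ++i) {
                    if (i == p) continue;
                    double c = M[i][p];
                    for (int j = p; j <= N; ++j) M[i][j] -= c * M[p][j];
                }
            }
            std::array<double, N> x{};
            for (int i = 0; i < N; ++i) x[i] = M[i][N];
            return x;
        }

        int main() {
            // Hypothetical technical coefficients: A[i][j] = input from sector i
            // needed per unit of output of sector j (port, industry, services).
            const double A[N][N] = {{0.10, 0.20, 0.05},
                                    {0.15, 0.25, 0.10},
                                    {0.05, 0.10, 0.15}};
            const std::array<double, N> f = {100.0, 200.0, 150.0};  // final demand

            std::array<double, N> x = leontief_output(A, f);
            std::printf("total output per sector: %.1f %.1f %.1f\n", x[0], x[1], x[2]);

            // Impact of a change in port activity: re-solve with perturbed demand.
            std::array<double, N> f2 = f;
            f2[0] += 10.0;  // +10 units of final demand for port services
            std::array<double, N> x2 = leontief_output(A, f2);
            std::printf("extra output induced:    %.2f %.2f %.2f\n",
                        x2[0] - x[0], x2[1] - x[1], x2[2] - x[2]);
        }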

    Physical play - How do we inspire and motivate young children to be physically active through play? An international analysis of twelve countries’ national early years curriculum policies and practices for physical activity and physical play

    Lifelong movement and physical activity (PA) patterns develop during early childhood. Therefore, educators (teachers and practitioners) in early childhood education and care (ECEC) should provide opportunities to support children’s play, PA, and movement development. The World Health Organization (2019) offers new recommendations for PA for children under five years of age. The guidelines do not specify the ways ECEC staff can support PA through play. Therefore, this paper investigates how physical play (PP) is enacted globally. An international policy and practice analysis of twelve countries (Australia [Victoria], Belgium [Flanders], Canada [Alberta], China, Finland, Ireland, Italy, Portugal, Spain, Sweden, UK [England] and USA) was completed by analyzing the ECEC curricula and their implementation in different cultural contexts. A content analysis undertaken by AIESEP Early Years SIG experts revealed that PP was not clearly defined. Where it was defined, it was described as PA and as important for children’s holistic development. The majority of curricula did not state the length/time for PP. Three main strategies for implementing PP were found: a) pedagogical framework; b) active learning methods; and c) motor development. This international analysis highlights the global need for better ECEC staff support in acknowledging and implementing PP to aid children’s overall development, PA and wellbeing.

    Guidelines for the use and interpretation of assays for monitoring autophagy (4th edition)

    In 2008, we published the first set of guidelines for standardizing research in autophagy. Since then, this topic has received increasing attention, and many scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Thus, it is important to update these guidelines for monitoring autophagy on a regular basis and across different organisms. Despite numerous reviews, there continues to be confusion regarding acceptable methods to evaluate autophagy, especially in multicellular eukaryotes. Here, we present a set of guidelines for investigators to select and interpret methods to examine autophagy and related processes, and for reviewers to provide realistic and reasonable critiques of reports that are focused on these processes. These guidelines are not meant to be a dogmatic set of rules, because the appropriateness of any assay largely depends on the question being asked and the system being used. Moreover, no individual assay is perfect for every situation, calling for the use of multiple techniques to properly monitor autophagy in each experimental setting. Finally, several core components of the autophagy machinery have been implicated in distinct autophagic processes (canonical and noncanonical autophagy), implying that genetic approaches to block autophagy should rely on targeting two or more autophagy-related genes that ideally participate in distinct steps of the pathway. Along similar lines, because multiple proteins involved in autophagy also regulate other cellular pathways, including apoptosis, not all of them can be used as specific markers for bona fide autophagic responses. Here, we critically discuss current methods of assessing autophagy and the information they can, or cannot, provide. Our ultimate goal is to encourage intellectual and technical innovation in the field.

    Guidelines for the use and interpretation of assays for monitoring autophagy (3rd edition)

    In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process and those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to delete or knock down more than one autophagy-related gene. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways, so not all Atg proteins can be used as specific markers for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.

    Contributions of object-oriented programming to the design and implementation of medical imaging applications

    A survey of existing tools for quantitative image analysis in PET shows that they are not really suited to our needs and offer little room for extension. These facts led us to develop our own software. We decided to use object-oriented programming (OOP) techniques for this development. These techniques are based on domain-specific data rather than on domain-specific functions, as in procedural programming. OOP brings several concepts that led us to construct robust software that is easy to maintain and to extend. These software qualities are obtained by decomposition of the initial problem, and the class concept achieves this decomposition. Each relevant type of domain-specific data leads to a class definition. A class contains a data structure and several methods that access the data structure. An object is an instance of a class that has a state, a behaviour and a unique identity. Objects and classes are defined by identifying abstractions based on an understanding of the problem; they can also be defined by common behaviours or through scenarios, leading to a use-case-driven design. OOP provides object encapsulation, hiding the details of an object that do not contribute to its essential characteristics. Encapsulation also means protection of the data, which can only be accessed through class methods. OOP provides inheritance, a parental relationship between classes: the derived class inherits part or all of the data and methods of its parent class. Inheritance provides programming by extension; it implies a class hierarchy and allows code and structural reuse. We use OOP techniques to provide a complete bundle of applications that fulfil our needs in quantitative medical image analysis. Object and class design is the most important part of OOP. By analyzing the domain of application, we determine the relevant objects. We define a class Study that contains all the images of a given patient. The sorting of these images according to time, space or modality criteria is performed by the class ImageList. Each image of a Study is an instance of the class Image. The ColorScale class provides each image with an appropriate color lookup table. We also define an Administrative-Part class that contains image information, such as image size and correction factor, as well as patient information such as name, age, … Quantitative analysis is achieved by image segmentation. The outlining of regions of interest is performed using instances of a class Object-of-Interest (OOI), from which we derive different object shapes. Polygons are used to estimate regional activity; when used at different times, they produce time-activity curves, and when assembled in space, they produce volumes of interest. A line of interest is used as a fiducial marker. A group of interest contains several OOI instances and allows a parental structure of the OOIs defined on an image. A signature and a time stamp are associated with each OOI instance, since image segmentation may lead to medical decisions. The different classes include methods to access their data structures; these methods also follow from the analysis of the application domain. For instance, the class Image contains methods for mathematical operations that are notified, time-stamped and signed, as well as methods for color scale manipulation and for OOI definition and quantification. The class Study contains import and export methods and several methods connecting the images. The OOI class defines creation, manipulation and quantification methods.
    Each class contains creation and destruction methods together with read and write methods. All of our objects are defined as persistent, which means that they survive in time and in space. We create a superclass from which all our classes are derived. This superclass implements a computer-independent data representation (XDR) and an activation/passivation mechanism that reduces data file sizes by writing an object with multiple references only once. We also introduce a version number in our objects to ensure backward compatibility. Once the class design was done, we investigated different object-oriented languages and chose the C++ language, an extension of the C language, for its availability and its widespread use. The visualisation of our objects is designed separately: we define a visualisation class for each of the classes described above, so that our objects and their views can evolve independently. Moreover, this separation allows us to build different interfaces on the same objects, such as graphical or alphanumeric interfaces. These two types of interface are made equivalent through a command language interpreter, which also allows macro command execution. A brief analysis of some graphical user interface builders led us to choose the InterViews graphical library, which is written in C++, encapsulates the basic functions of the X Window System in a few classes, and ensures the portability of our software. On this basis we have developed applications to perform 2D image segmentation (Mediman), mathematical operations on images (Calculator) and kinetic modelling (Models). These applications fulfil our needs concerning quantitative medical image analysis. The defined objects will be further used to perform image coregistration, volume of interest visualisation and quantification. Doctoral thesis in biomedical sciences (informatics) -- UCL, 199
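
    As a companion to the abstract, here is a minimal C++ sketch of the class relationships it describes (a persistent superclass, Study, Image and Object-of-Interest). All names and interfaces are illustrative reconstructions, not the thesis's actual code.

        #include <iostream>
        #include <memory>
        #include <string>
        #include <vector>

        // Illustrative reconstruction of the class design described above;
        // every name and interface here is hypothetical.

        // Superclass providing persistence: every object can write and read
        // itself in a machine-independent form and carries a format version.
        class Persistent {
        public:
            virtual ~Persistent() = default;
            virtual void write(std::ostream& out) const = 0;
            virtual void read(std::istream& in) = 0;
            int version() const { return version_; }
        protected:
            int version_ = 1;  // allows files to stay backward compatible
        };

        // Region of interest: polygons, lines and points derive from it.
        class ObjectOfInterest : public Persistent {
        public:
            virtual double quantify() const = 0;  // e.g. area, length, activity
        protected:
            std::string signature_;  // who outlined the region
            long timestamp_ = 0;     // when; segmentation may drive decisions
        };

        // A polygonal region used to estimate regional activity.
        class Polygon : public ObjectOfInterest {
        public:
            double quantify() const override { return 0.0; /* area from vertices */ }
            void write(std::ostream&) const override { /* passivation goes here */ }
            void read(std::istream&) override { /* activation goes here */ }
        };

        // One image, with its color lookup table and attached regions.
        class Image : public Persistent {
        public:
            void addRegion(std::shared_ptr<ObjectOfInterest> ooi) {
                regions_.push_back(std::move(ooi));
            }
            void write(std::ostream&) const override {}
            void read(std::istream&) override {}
        private:
            std::vector<std::shared_ptr<ObjectOfInterest>> regions_;
        };

        // All images of one patient; an ImageList would sort them by
        // time, space or modality.
        class Study : public Persistent {
        public:
            void addImage(std::shared_ptr<Image> img) {
                images_.push_back(std::move(img));
            }
            void write(std::ostream&) const override {}
            void read(std::istream&) override {}
        private:
            std::vector<std::shared_ptr<Image>> images_;
        };

        int main() {
            Study study;
            auto img = std::make_shared<Image>();
            img->addRegion(std::make_shared<Polygon>());
            study.addImage(img);
        }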

    A Virtual Maze Game to Explain Reinforcement Learning

    We demonstrate how Virtual Reality can explain the basic concepts of Reinforcement Learning through an interactive maze game. A player takes the role of an autonomous learning agent and must learn the shortest path to a hidden treasure through experience. This application visualises the learning process of Watkins' Q(λ), one of the fundamental algorithms in the field. A video can be found at https://youtu.be/sLJRiUBhQqM.
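
    For reference, Watkins' Q(λ) is tabular Q-learning augmented with eligibility traces that are cut after exploratory actions. Below is a minimal C++ sketch on a hypothetical 5x5 grid maze; it is illustrative only, not the demo's actual implementation.

        #include <algorithm>
        #include <array>
        #include <cstdio>
        #include <random>

        // Tabular Watkins' Q(lambda) on a hypothetical 5x5 grid maze.

        constexpr int W = 5, H = 5, NA = 4;       // grid size, 4 move actions
        constexpr int GOAL = W * H - 1;           // treasure in the far corner
        constexpr double ALPHA = 0.1, GAMMA = 0.95, LAMBDA = 0.8, EPS = 0.1;

        int step(int s, int a) {                  // deterministic grid moves
            int x = s % W, y = s / W;
            if (a == 0 && y > 0)     --y;         // up
            if (a == 1 && y < H - 1) ++y;         // down
            if (a == 2 && x > 0)     --x;         // left
            if (a == 3 && x < W - 1) ++x;         // right
            return y * W + x;
        }

        int main() {
            std::mt19937 rng(42);
            std::uniform_real_distribution<double> unif(0.0, 1.0);
            std::uniform_int_distribution<int> randa(0, NA - 1);

            std::array<std::array<double, NA>, W * H> Q{};     // action values
            for (int ep = 0; ep < 500; ++ep) {
                std::array<std::array<double, NA>, W * H> e{}; // eligibility traces
                int s = 0;                                     // start in a corner
                while (s != GOAL) {
                    bool explore = unif(rng) < EPS;
                    int greedy = static_cast<int>(
                        std::max_element(Q[s].begin(), Q[s].end()) - Q[s].begin());
                    int a = explore ? randa(rng) : greedy;

                    // Watkins' rule: an exploratory action cuts all traces, so
                    // its error is not propagated back to earlier greedy steps.
                    if (explore)
                        for (auto& row : e) row.fill(0.0);
                    e[s][a] += 1.0;                            // accumulating trace

                    int s2 = step(s, a);
                    double r = (s2 == GOAL) ? 1.0 : -0.01;     // small step cost
                    double best2 = *std::max_element(Q[s2].begin(), Q[s2].end());
                    double delta = r + GAMMA * best2 - Q[s][a]; // one-step TD error

                    for (int i = 0; i < W * H; ++i)
                        for (int j = 0; j < NA; ++j) {
                            Q[i][j] += ALPHA * delta * e[i][j];
                            e[i][j] *= GAMMA * LAMBDA;         // decay traces
                        }
                    s = s2;
                }
            }
            std::printf("Q at start (up/down/left/right): %.2f %.2f %.2f %.2f\n",
                        Q[0][0], Q[0][1], Q[0][2], Q[0][3]);
        }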

    Distilling Deep Reinforcement Learning Policies in Soft Decision Trees

    An important step in Reinforcement Learning (RL) research is to create mechanisms that give higher-level insights into the black-box policy models used nowadays, and that provide explanations for these learned behaviors or motivate the choices behind certain decision steps. In this paper, we illustrate how Soft Decision Tree (SDT) distillation can be used to make policies that are learned through RL more interpretable. A Soft Decision Tree is a binary tree of predetermined depth in which each branching node applies a hierarchical filter that influences the classification of the input data. We distill SDTs from a deep neural network RL policy for the Mario AI benchmark and inspect the learned hierarchy of filters, showing which input features lead to specific action distributions in an episode. We take preliminary steps towards interpreting the learned behavior of the policy and discuss future improvements.
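
    To make the distillation target concrete, the sketch below shows inference in a depth-2 soft decision tree: each inner node applies a sigmoid gate to the input, and the output action distribution is the path-probability-weighted mixture of the leaf distributions. All weights here are hypothetical; in the paper's setting they would be fitted to mimic the RL policy's action distribution on observed states. This is not the authors' code.

        #include <cmath>
        #include <cstdio>
        #include <vector>

        // Minimal inference sketch for a depth-2 Soft Decision Tree over a
        // 3-feature input and 2 actions. All weights are hypothetical.

        static double sigmoid(double z) { return 1.0 / (1.0 + std::exp(-z)); }

        struct InnerNode {               // soft branching: p(go right | x)
            std::vector<double> w;       // learned filter over input features
            double b;                    // bias
            double pRight(const std::vector<double>& x) const {
                double z = b;
                for (size_t i = 0; i < w.size(); ++i) z += w[i] * x[i];
                return sigmoid(z);
            }
        };

        int main() {
            // Depth-2 tree: root, 2 inner children, 4 leaves.
            InnerNode root  = {{ 1.5, -0.7,  0.2},  0.1};
            InnerNode left  = {{-0.4,  2.0,  0.3}, -0.2};
            InnerNode right = {{ 0.8,  0.1, -1.1},  0.5};
            // Each leaf holds a distribution over 2 actions (e.g. jump / run).
            double leaf[4][2] = {{0.9, 0.1}, {0.3, 0.7}, {0.6, 0.4}, {0.1, 0.9}};

            std::vector<double> x = {0.2, -1.0, 0.5};  // an observed state

            double pr = root.pRight(x);
            double pathProb[4] = {                     // prob. of each root-leaf path
                (1 - pr) * (1 - left.pRight(x)), (1 - pr) * left.pRight(x),
                pr * (1 - right.pRight(x)),      pr * right.pRight(x)};

            double action[2] = {0.0, 0.0};             // mixture of leaf distributions
            for (int l = 0; l < 4; ++l)
                for (int a = 0; a < 2; ++a) action[a] += pathProb[l] * leaf[l][a];

            std::printf("p(action 0) = %.3f, p(action 1) = %.3f\n",
                        action[0], action[1]);
        }

    Because each inner node is a single linear filter passed through a sigmoid, inspecting its weight vector shows directly which input features push the policy toward one subtree or the other, which is what makes the distilled tree easier to interpret than the original deep network.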

    Instability problems for functional differential equations

    BSE B226786M / UCL - Université Catholique de Louvain