
    Anuário científico da Escola Superior de Tecnologia da Saúde de Lisboa - 2021

    It is with great pleasure that we present the most recent edition (the 11th) of the Anuário Científico da Escola Superior de Tecnologia da Saúde de Lisboa. As a higher education institution, we are committed to promoting and encouraging scientific research in all areas of knowledge encompassed by our mission. This publication aims to disseminate all the scientific output produced by the teachers, researchers, students and non-teaching staff of ESTeSL during 2021. This yearbook is thus a reflection of the hard and dedicated work of our community, which committed itself to producing high-quality scientific content shared with society in the form of books, book chapters, articles published in national and international journals, abstracts of oral communications and posters, as well as the results of first- and second-cycle degree work. The content of this publication therefore covers a wide variety of topics, from more fundamental themes to studies of practical application in specific health contexts, reflecting the plurality and diversity of areas that define, and make unique, ESTeSL. We believe that scientific research is a fundamental axis for the development of society, which is why we encourage our students to become involved in research activities and evidence-based practice from the beginning of their studies at ESTeSL. This publication is an example of the success of those efforts, being the largest ever, and we are very proud to share the results and discoveries of our researchers with the scientific community and the general public. We hope this yearbook inspires and motivates other students, health professionals, teachers and other collaborators to continue exploring new ideas and contributing to the advancement of science and technology in the body of knowledge of the areas that make up ESTeSL. We thank everyone involved in the production of this yearbook and wish you an inspiring and enjoyable read.

    Multiscale structural optimisation with concurrent coupling between scales

    A robust three-dimensional multiscale topology optimisation framework with concurrent coupling between scales is presented. Concurrent coupling ensures that only the microscale data required to evaluate the macroscale model during each iteration of optimisation is collected, which results in considerable computational savings. This represents the principal novelty of the framework and permits a previously intractable number of design variables to be used in the parametrisation of the microscale geometry, which in turn enables accessibility to a greater range of mechanical point properties during optimisation. Additionally, the microscale data collected during optimisation is stored in a re-usable database, further reducing the computational expense of subsequent iterations or entirely new optimisation problems. Application of this methodology enables structures with precise functionally graded mechanical properties over two scales to be derived, which satisfy one or multiple functional objectives. For all applications of the framework presented within this thesis, only a small fraction of the microstructure database is required to derive the optimised multiscale solutions, which demonstrates a significant reduction in the computational expense of optimisation in comparison to contemporary sequential frameworks. The derivation and integration of novel additive manufacturing constraints for open-walled microstructures within the concurrently coupled multiscale topology optimisation framework is also presented. Problematic fabrication features are discouraged through the application of an augmented projection filter and two relaxed binary integral constraints, which prohibit the formation of unsupported members, isolated assemblies of overhanging members and slender members during optimisation. Through the application of these constraints, it is possible to derive self-supporting, hierarchical structures with varying topology, suitable for fabrication through additive manufacturing processes.
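    A minimal sketch of the caching idea behind concurrent coupling and the re-usable microstructure database follows. All names and the toy interpolation are hypothetical; the thesis framework evaluates full 3D microscale models, not this placeholder.

```python
# Minimal sketch (hypothetical names): concurrent coupling with a
# re-usable microstructure database. The expensive microscale evaluation
# is cached, so only the distinct microstructures actually referenced by
# the current macroscale iterate are ever computed.
from functools import lru_cache

@lru_cache(maxsize=None)  # the re-usable database: entries persist across iterations
def homogenise(micro_design: tuple) -> float:
    """Stand-in for the expensive microscale evaluation: map microscale
    design variables to an effective (homogenised) stiffness."""
    volume_fraction = sum(micro_design) / len(micro_design)
    return volume_fraction ** 3  # placeholder SIMP-like interpolation

def macro_compliance(macro_elements: list) -> float:
    """Evaluate the macroscale model; homogenise() is only invoked for
    microscale designs present in this iterate (concurrent coupling)."""
    return sum(1.0 / max(homogenise(e), 1e-9) for e in macro_elements)

# Three macro elements but only two distinct microstructures, so only two
# database entries are ever computed.
design = [(0.2, 0.4, 0.6), (0.8, 0.8, 0.8), (0.2, 0.4, 0.6)]
print(macro_compliance(design), homogenise.cache_info().misses)  # misses == 2
```

    Because entries persist, subsequent iterations or entirely new optimisation problems can reuse earlier evaluations, which is the source of the computational savings described above.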

    The Future of Work and Digital Skills

    The theme for the events was "The Future of Work and Digital Skills". The Fourth Industrial Revolution (4IR) caused a hollowing out of middle-income jobs (Frey & Osborne, 2017), but COVID-19 exposed the digital gap, as survival depended mainly on digital infrastructure and connectivity. Almost overnight, organizations that had not invested in a digital strategy realized the need for such a strategy and the associated digital skills. The effects have been profound for those who struggled to adapt, while those who stepped up have reaped considerable rewards. There are therefore no longer certainties about what the world will look like a few years from now. However, there are ways to anticipate the changes that are occurring and to plan how to continually adapt to an increasingly changing world. Certain jobs will soon be lost and will not come back; other new jobs will, however, be created. Using data science and other predictive sciences, it is possible to anticipate, to the extent possible, the rate at which certain jobs will be replaced and new jobs created in different industries. Accordingly, the collocated events sought to bring together government, international organizations, academia, industry, organized labour and civil society to deliberate on how these changes are occurring in South Africa, how fast they are occurring, and what needs to change in order to prepare society for them.

    Foundations for programming and implementing effect handlers

    First-class control operators provide programmers with an expressive and efficient means for manipulating control through reification of the current control state as a first-class object, enabling programmers to implement their own computational effects and control idioms as shareable libraries. Effect handlers provide a particularly structured approach to programming with first-class control by naming control-reifying operations and separating them from their handling. This thesis is composed of three strands of work in which I develop operational foundations for programming and implementing effect handlers as well as exploring the expressive power of effect handlers.

    The first strand develops a fine-grain call-by-value core calculus of a statically typed programming language with a structural notion of effect types, as opposed to the nominal notion of effect types that dominates the literature. With the structural approach, effects need not be declared before use. The usual safety properties of statically typed programming are retained by making crucial use of row polymorphism to build and track effect signatures. The calculus features three forms of handlers: deep, shallow, and parameterised. Each offers a different approach to manipulating the control state of programs. Traditional deep handlers are defined by folds over computation trees, and are the original construct proposed by Plotkin and Pretnar. Shallow handlers are defined by case splits (rather than folds) over computation trees. Parameterised handlers are deep handlers extended with a state value that is threaded through the folds over computation trees. To demonstrate the usefulness of effects and handlers as a practical programming abstraction, I implement the essence of a small UNIX-style operating system complete with multi-user environment, time-sharing, and file I/O.

    The second strand studies continuation-passing style (CPS) and abstract machine semantics, which are foundational techniques that admit a unified basis for implementing deep, shallow, and parameterised effect handlers in the same environment. The CPS translation is obtained through a series of refinements of a basic first-order CPS translation for a fine-grain call-by-value language into an untyped language. Each refinement moves toward a more intensional representation of continuations, eventually arriving at the notion of generalised continuation, which admits simultaneous support for deep, shallow, and parameterised handlers. The initial refinement adds support for deep handlers by representing stacks of continuations and handlers as a curried sequence of arguments. The image of the resulting translation is not properly tail-recursive, meaning some function application terms do not appear in tail position. To rectify this, the CPS translation is refined once more to obtain an uncurried representation of stacks of continuations and handlers. Finally, the translation is made higher-order in order to contract administrative redexes at translation time. The generalised continuation representation is used to construct an abstract machine that provides simultaneous support for deep, shallow, and parameterised effect handlers.

    The third strand explores the expressiveness of effect handlers. First, I show that the deep, shallow, and parameterised notions of handlers are interdefinable by way of typed macro-expressiveness, which provides a syntactic notion of expressiveness that affirms the existence of encodings between handlers but provides no information about the computational content of the encodings. Second, using a semantic notion of expressiveness, I show that for a class of programs a programming language with first-class control (e.g. effect handlers) admits asymptotically faster implementations than are possible in a language without first-class control.
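    As a loose illustration of the deep-handler idea (an invented analogy, not the thesis's calculus), effectful computations can be mimicked in Python with generators: the computation yields named operations, and a deep handler folds over the entire computation, interpreting every operation and resuming the continuation with a result.

```python
# Hypothetical analogy for effect handlers using Python generators.
# The computation performs operations by yielding them; the handler
# interprets each one and resumes via send().

def computation():
    x = yield ("Get",)       # perform the Get operation
    yield ("Put", x + 1)     # perform the Put operation
    return x

def deep_handle(gen, state):
    """Deep handler: every operation the generator performs is interpreted
    by this same handler, a fold over the whole computation."""
    try:
        op = next(gen)
        while True:
            if op[0] == "Get":
                op = gen.send(state)   # resume the continuation with the state
            elif op[0] == "Put":
                state = op[1]          # update the threaded state
                op = next(gen)
    except StopIteration as done:
        return done.value, state       # final return value plus final state

print(deep_handle(computation(), 41))  # (41, 42)
```

    In this analogy, a shallow handler would interpret only the first operation and then hand the remaining generator to another (possibly different) handler, while the explicit state argument threaded through each resumption loosely mirrors a parameterised handler.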

    BECOMEBECOME - A TRANSDISCIPLINARY METHODOLOGY BASED ON INFORMATION ABOUT THE OBSERVER

    The present research dissertation has been developed with the intention of providing practical strategies and discovering new intellectual operations which can be used to generate Transdisciplinary insight. For this reason, this thesis creates access to new knowledge at different scales. Firstly, at the scale of new knowledge generated by those who attend Becomebecome events: the open-source nature of the Becomebecome methodology makes it possible for participants in Becomebecome workshops, training programmes and residencies to generate new insight about the specific project they are working on, which then reinforces and expands the foundational principles of the theoretical background. Secondly, at the scale of the Becomebecome framework, which remains independent of location and moment in time: the method proposed to access Transdisciplinary knowledge constitutes new knowledge in itself, because the sequence of activities, described as physical and mental procedures and listed as essential criteria, has never been found organised in such a specific order before. It is indeed the order in time, i.e. the sequence of the ideas and activities proposed, which allows one to transform Disciplinary knowledge via a new Transdisciplinary frame of reference. Lastly, new knowledge about Transdisciplinarity as a field of study is created as a consequence of the two processes listed heretofore. The first part of the thesis is designated ‘Becomebecome Theory’ and focuses on the theoretical background and the intellectual operations necessary to support the creation of new Transdisciplinary knowledge. The second part of the thesis is designated ‘Becomebecome Practice’ and provides practical examples of the application of such operations. Crucially, the theoretical model described as the foundation for the Becomebecome methodology (Becomebecome Theory) is process-based and constantly checked against the insight generated through Becomebecome Practice. To this effect, ‘information about the observer’ is proposed as a key notion which binds together Transdisciplinary resources from several studies in the hard sciences and humanities. It is a concept that enables understanding of why and how information generated through Becomebecome Practice is considered of paramount importance for establishing the reference parameters necessary to access Transdisciplinary insight which is meaningful to a specific project, a specific person, or a specific moment in time.

    A productive response to legacy system petrification

    Requirements change. The requirements of a legacy information system change, often in unanticipated ways, and at a more rapid pace than the rate at which the information system itself can be evolved to support them. The capabilities of a legacy system progressively fall further and further behind their evolving requirements, in a degrading process termed petrification. As systems petrify, they deliver diminishing business value, hamper business effectiveness, and drain organisational resources. To address legacy systems, the first challenge is to understand how to shed their resistance to tracking requirements change. The second challenge is to ensure that a newly adaptable system never again petrifies into a change-resistant legacy system. This thesis addresses both challenges. The approach outlined herein is underpinned by an agile migration process, termed Productive Migration, that homes in upon the specific causes of petrification within each particular legacy system and provides guidance upon how to address them. That guidance comes in part from a personalised catalogue of petrifying patterns, which capture recurring themes underlying petrification. These steer us to the problems actually present in a given legacy system, and lead us to suitable antidote productive patterns via which we can deal with those problems one by one. To prevent newly adaptable systems from again degrading into legacy systems, we appeal to a follow-on process, termed Productive Evolution, which embraces and keeps pace with change rather than resisting and falling behind it. Productive Evolution teaches us to be vigilant against signs of system petrification and helps us to nip them in the bud. The aim is to nurture systems that remain supportive of the business, that are adaptable in step with ongoing requirements change, and that continue to retain their value as significant business assets.

    Graphical scaffolding for the learning of data wrangling APIs

    In order for students across the sciences to avail themselves of modern data streams, they must first know how to wrangle data: how to reshape ill-organised, tabular data into another format, and how to do this programmatically, in languages such as Python and R. Despite the cross-departmental demand and the ubiquity of data wrangling in analytical workflows, research on how to optimise its instruction has been minimal. Although data wrangling as a programming domain presents distinctive challenges - characterised by on-the-fly syntax lookup and code example integration - it also presents opportunities. One such opportunity is how easily tabular data structures are visualised. To leverage the inherent visualisability of data wrangling, this dissertation evaluates three types of graphics that could be employed as scaffolding for novices: subgoal graphics, thumbnail graphics, and parameter graphics. Using a specially built e-learning platform, this dissertation documents a multi-institutional, randomised, and controlled experiment that investigates the pedagogical effects of these graphics. Our results indicate that the graphics are well received, that subgoal graphics boost the completion rate, and that thumbnail graphics improve navigability within a command menu. We also obtained several non-significant results, and indications that parameter graphics are counter-productive. We discuss these findings in the context of general scaffolding dilemmas, and how they fit into a wider research programme on data wrangling instruction.
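    To make the kind of task concrete, here is a small example of the sort of programmatic data wrangling the dissertation studies: reshaping a wide table into tidy long format with pandas. The data and column names are invented for illustration.

```python
# Invented example: reshape an ill-organised, wide table into long format.
import pandas as pd

wide = pd.DataFrame({
    "student": ["ana", "ben"],
    "quiz1": [7, 9],
    "quiz2": [8, 6],
})

# melt() reshapes wide -> long: one row per (student, quiz) observation.
long_form = wide.melt(id_vars="student", var_name="quiz", value_name="score")
print(long_form)
```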

    Industry 4.0: product digital twins for remanufacturing decision-making

    There is currently a desire to reduce natural resource consumption and expand circular business principles, whilst Industry 4.0 (I4.0) is regarded as the evolutionary and potentially disruptive movement of technology, automation, digitalisation, and data manipulation into the industrial sector. The remanufacturing industry is recognised as being vital to the circular economy (CE), as it extends the in-use life of products, but its synergy with I4.0 has had little attention thus far. This thesis documents the first investigation into I4.0 in remanufacturing for a CE, contributing the design and demonstration of a model that optimises remanufacturing planning using data from different instances in a product’s life cycle. The initial aim of this work was to identify the I4.0 technology that would enhance stability in remanufacturing with a view to reducing resource consumption. As the project progressed, it narrowed to focus on the development of a product digital twin (DT) model to support data-driven decision making for operations planning. The model’s architecture was derived using a bottom-up approach in which requirements were extracted from the identified complications in production planning and control that differentiate remanufacturing from manufacturing. Simultaneously, the benefits of enabling visibility of an asset’s through-life health were obtained using a DT as the modus operandi. A product simulator and DT prototype were designed to use Internet of Things (IoT) components, a neural network for remaining-life estimations, and a search algorithm for operational planning optimisation. The DT was iteratively developed using case studies to validate and examine the real opportunities that exist in deploying a business model that harnesses, and commodifies, early-life product data for end-of-life processing optimisation. Findings suggest that, using intelligent programming networks and algorithms, a DT can enhance decision-making if it has visibility of the product and access to reliable remanufacturing process information; existing IoT components provide rudimentary “smart” capabilities, but their integration is complex, and the durability of the systems over extended product life cycles needs to be further explored.
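    A minimal sketch of the decision-support idea follows: a product digital twin accumulates through-life usage data, estimates remaining life, and routes the product at end of life. The class, thresholds, and linear degradation rule are invented for illustration; the thesis uses a neural network for remaining-life estimation and a search algorithm for planning.

```python
# Hypothetical sketch of a product digital twin gating a remanufacturing
# decision. The degradation model and thresholds are invented; a real DT
# would draw on IoT sensor streams and a learned remaining-life model.
from dataclasses import dataclass, field

@dataclass
class ProductDigitalTwin:
    design_life_hours: float
    usage_log: list = field(default_factory=list)  # (hours, load_factor) records

    def record(self, hours: float, load_factor: float) -> None:
        """Append a through-life usage record (early-life product data)."""
        self.usage_log.append((hours, load_factor))

    def remaining_life(self) -> float:
        """Crude stand-in for the remaining-life model: heavier loads
        consume design life faster."""
        consumed = sum(h * lf for h, lf in self.usage_log)
        return max(self.design_life_hours - consumed, 0.0)

    def end_of_life_route(self) -> str:
        """Route the returned product based on estimated remaining life."""
        ratio = self.remaining_life() / self.design_life_hours
        if ratio > 0.5:
            return "reuse"
        if ratio > 0.1:
            return "remanufacture"
        return "recycle"

twin = ProductDigitalTwin(design_life_hours=10_000)
twin.record(3_000, 1.0)
twin.record(2_000, 1.5)
print(twin.remaining_life(), twin.end_of_life_route())  # 4000.0 remanufacture
```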

    Investigative Methods: An NCRM Innovation Collection

    This Innovation Collection on investigative methods brings together investigators working in different domains and sectors, and on different topics of interest, to help capture the breadth, scope and relevance of investigative practices over 10 substantive chapters. Each of the papers presents a different investigative method or set of methods and, through case studies, attempts to demonstrate their value. All the contributions, in different ways and for different purposes, seek to reconstruct acts, events, practices, biographies and/or milieux, to which the researchers in question lack direct access, but which they want to reconstruct via the traces those phenomena leave behind, traces themselves often produced as part of the phenomena under investigation. These include reports of methods used in investigations on:
    - the use of force by state actors, including police violence, military decisions to attack civilians, the provenance of munitions used to attack civilians, and the use and abuse of tear gas;
    - networks of far-right discourse, and their links to criminal attacks and state-leveraged misinformation campaigns;
    - archives, to establish the penal biographies of convicts and the historical practices of democratic petitioning;
    - corporate structures and processes that enable tax avoidance and the avoidance of legal responsibilities to workers and the environment.
    A working principle of the collection is that investigative methods may be considered, alongside creative, qualitative, quantitative, digital, participatory and mixed methods, a distinct yet complementary style of research.