
    Concept Maps as Sites of Rhetorical Invention: Teaching the Creative Act of Synthesis as a Cognitive Process

    Synthesis is one of the most cognitively demanding practices novice writers must undertake, and research demonstrates that first-year students' synthesis writing practices result in knowledge telling rather than knowledge creation and transformation. Pedagogies used to teach synthesis often focus on developing text-building strategies but lack explicit instruction on the more cognitively demanding conceptualizing behavior. To explore alternative pedagogies and heuristics, this study looks beyond composition scholarship and incorporates studies in neuroeducation and rhetoric to define synthesis as an ongoing, generative act of cognitive invention, effectively shifting pedagogical focus from a text-centered product to the student-centered cognitive processes that inform the development of synthesized texts (a product). The methods were designed to explore any effects a visual intervention might have on developing students' conceptual awareness and reflective practice over time, and whether these effects transferred into a final researched essay as knowledge transforming. This small-scale exploratory study applies a mixed-methods, design-based methodology to a semester-long intervention in first-year writing classrooms using digital concept maps (DCMAPs) as an ongoing, student-designed space of visualized concept construction. A Control group applied traditional reading-to-write, text-based synthesis instruction and practice, while the Intervention group used DCMAPs to enact a prolonged, visualized, and reflective practice of actively constructing associations, relationships, and structural knowledge. The affordances of the DCMAP platform positioned students as knowledge designers enacting creative and constructive processes, an approach based on neuroscience research on patterning and visualization. Intervention data include reflective journals, narrated mapping-process reflections, digital concept map images and construction processes, and a final researched essay that required synthesis of source ideas. Because of the exploratory nature of the study, results are framed not as cause and effect but as correlational possibilities suggesting that the inventional acts of visually creating connections and labeling them with rhetorically based associational concepts lead to generative learning behaviors. The results suggest a number of possibilities for future iterations and research, as well as implications for our field's approach to the teaching of synthesis.

    Ambiguities in decision-oriented Life Cycle Inventories: The Role of mental models

    If the complexity of real, socio-economic systems is acknowledged, life cycle inventory analysis (LCI) in life cycle assessment (LCA) cannot be considered an unambiguous, objective, and exclusively data- and science-based attribution of material and energy flows to a product. The paper therefore suggests a set of criteria for LCI derived from different scientific disciplines, the practice of product design, and the modelling characteristics of LCI and LCA. A product system with its respective LCI supporting the process of effective and efficient decision-making should ideally be: a) complete, operational, decomposable, non-redundant, minimal, and comparable; b) efficient, i.e., as simple, manageable, transparent, cheap, and quick, but still as 'adequate' as possible under a functionalistic perspective that takes given economic constraints, material and market characteristics, and the goal and scope of the study into account; c) actor-based, reflecting the decision-makers' action space, risk level, values, and knowledge (i.e., mental model) in view of the management rules of sustainable development; d) as site- and case-specific as possible, i.e., using as much site-specific information as possible. This rationale stresses the significance of considering both (i) material and energy flows within the technosphere with regard to the sustainable management rules and (ii) the environmental consequences of the environmental interventions on the ecosphere. Further, the marginal cost of collecting and computing more and better information about environmental impacts must not exceed the marginal benefits of that information for the natural environment. The ratio of environmental benefits to the economic cost of the tool must be efficient compared to other investment options. In conclusion, in comparative LCAs the application of equal allocation procedures does not lead to LCA results on which products made from different materials can be adequately compared. Each product and material must be modelled according to its specific material and market characteristics as well as its particular management rules for sustainable use. A generic LCA methodology including preferences on methodological options is not definable.
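
    The information-gathering criterion above can be sketched as a simple marginal condition; the symbols MB_i and MC_i below are illustrative shorthand, not notation from the paper.

```latex
% Sketch under assumed notation: MB_i = marginal environmental benefit of the
% i-th additional unit of inventory information, MC_i = its marginal cost of
% collection and computation. Information is gathered only while the benefit
% does not fall below the cost.
\[
  \text{collect unit } i \quad \Longleftrightarrow \quad MB_i \;\ge\; MC_i .
\]
```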

    A multisensor approach for improved protein A load phase monitoring by conductivity-based background subtraction of UV spectra

    Real-time monitoring and control of protein A capture steps by process analytical technologies (PATs) promises significant economic benefits due to the improved usage of the column's binding capacity, the elimination of time-consuming off-line analytics and costly resin lifetime studies, and the enabling of continuous production. The PAT method proposed in this study relies on ultraviolet (UV) spectroscopy with a dynamic background subtraction based on the levelling out of the conductivity signal. This point in time can be used to collect a reference spectrum for removing the majority of spectral contributions by process-related contaminants. The removal of the background spectrum facilitates chemometric model building and improves model accuracy. To demonstrate the benefits of this method, five different feedstocks from our industry partner were used to mix the load material for a case study. To our knowledge, such a large design space, covering possible variations in upstream conditions besides the product concentration, has not been disclosed yet. By applying the conductivity-based background subtraction, the root mean square error of prediction (RMSEP) of the partial least squares (PLS) model improved from 0.2080 to 0.0131 g L⁻¹. Finally, the potential of the background subtraction method was further evaluated for single-wavelength-based predictions to facilitate implementation in production processes. An RMSEP of 0.0890 g L⁻¹ was achieved with univariate linear regression, showing that with background subtraction better prediction accuracy is achieved than without subtraction, even compared with a PLS model. In summary, the developed background subtraction method is versatile, enables accurate prediction results, and is easily implemented into existing chromatography setups with sensors that are typically already integrated.
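
    As a rough illustration of the proposed workflow (a sketch under assumed data shapes, not the authors' implementation; the plateau threshold, window size, and synthetic data below are placeholders), the conductivity-triggered background subtraction and PLS prediction could look like this:

```python
# Illustrative sketch: conductivity-triggered background subtraction of UV
# spectra followed by PLS-based titer prediction (synthetic placeholder data).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error

def find_reference_index(conductivity, window=10, tol=0.01):
    """Return the first index at which the conductivity signal has levelled
    out, i.e. its spread over the preceding `window` samples is below `tol`."""
    for i in range(window, len(conductivity)):
        if np.ptp(conductivity[i - window:i]) < tol:
            return i
    raise ValueError("conductivity signal never levelled out")

def subtract_background(uv_spectra, conductivity):
    """Subtract the UV spectrum recorded at the conductivity plateau from
    every spectrum (rows = time points, columns = wavelengths)."""
    reference = uv_spectra[find_reference_index(conductivity)]
    return uv_spectra - reference

# --- usage with synthetic placeholder data ---
rng = np.random.default_rng(0)
uv = rng.random((200, 150))                       # 200 time points x 150 wavelengths
cond = np.concatenate([np.linspace(30.0, 5.0, 50), np.full(150, 5.0)])
titer = rng.random(200)                           # off-line reference values, g/L

X = subtract_background(uv, cond)
pls = PLSRegression(n_components=3).fit(X, titer)
rmsep = np.sqrt(mean_squared_error(titer, pls.predict(X).ravel()))
print(f"RMSEP on the synthetic data: {rmsep:.4f} g/L")
```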

    Modelling a fluidized wet granulation process

    Integrated Master's final project, Pharmaceutical Sciences, 2020, Universidade de Lisboa, Faculdade de Farmácia. Traditionally, the production of medicines by the pharmaceutical industry is carried out in batches, with the finished product released after verification, an approach also known as quality-by-testing. This is because it is a tightly regulated industry, which, in the past, together with other factors, made the transition to continuous processes and to new quality-assessment paradigms difficult. Currently, the transition to continuous processes is being encouraged by regulatory authorities, as these processes are advantageous not only for the industry, by increasing the efficiency of production processes, but also for consumers, by providing more consistent quality in the manufactured products. The Food and Drug Administration (FDA) and the European Medicines Agency (EMA), as well as the new guidelines of the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), now encourage the development of drug manufacturing processes based on the concept of quality by design (QbD). This allows continuous production processes to be implemented on the basis of in-depth knowledge of the main variables that influence the manufacturing process, so as to design a quality product, with the notion that quality should not be tested into the final product but built in from the first step of production. With the concept of QbD in mind, this study set out to develop a design space (DS) for a granulation process, an important step in the production of various pharmaceutical dosage forms, by studying the combination of formulation variables and process parameters that results in a product within the established specifications. The variables chosen included the formulation and the process parameters identified as critical. To achieve this objective, an experimental design method was used to define the trials to be carried out. The granules were tested against several quality attributes in order to establish the DS. To assess the influence of the granulation process on the final dosage form, tablets were also produced and tested. With the sponsorship of the Faculdade de Farmácia da Universidade de Lisboa.
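
    As a minimal illustration of using an experimental design to map a granulation design space (the factor names, levels, and specification below are hypothetical, not values from the thesis), a full factorial plan could be enumerated as follows:

```python
# Hypothetical sketch: enumerating a full factorial design for a fluidized-bed
# granulation study; factors, levels, and the specification are placeholders.
from itertools import product

factors = {
    "inlet_air_temp_C": [50, 60, 70],
    "spray_rate_g_min": [10, 20, 30],
    "binder_conc_pct": [2, 4, 6],
}

# One dict per experimental run: 3 x 3 x 3 = 27 runs.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(f"{len(runs)} experimental runs planned")

# After executing the runs and measuring granule quality attributes, the
# design space is the region of factor settings whose responses stay within
# specification, e.g. a median granule size (d50) between 150 and 400 um.
def within_spec(d50_um: float) -> bool:
    return 150.0 <= d50_um <= 400.0
```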

    Optimisation of a legacy product with a history of tablet friability failures utilising quality by design

    The concept of Quality by Design (QbD) was introduced as a method of building quality into the product during the initial stages of manufacturing. This study explores the suitability of utilising QbD to optimise a legacy product. With the aid of QbD, a higher level of quality assurance and product knowledge was achieved. Sound scientific and risk-based decisions allowed for a robust manufacturing process with inherent operational quality and flexibility. By establishing a quality target product profile (QTPP) and determining the influence of the critical process parameters (CPPs) on the product's critical quality attributes (CQAs), the process understanding of Product X can be more accurately defined. The relationships between several explanatory variables were explored using a sequence of Design of Experiments (DoE) to obtain an optimal response. The DoE were performed and analysed using Minitab® statistical software version 17.0 (Minitab Inc., United Kingdom). Response Surface Methodology (RSM) with a central composite experimental design (CCD) was utilised to capture the data. The data were analysed using analysis of variance (ANOVA) to evaluate the differences between the means. The input variables investigated were compression machine tooling shape, hardness, and post-drying loss on drying (LOD). A significance level (α) of 0.05 was used to determine whether the null hypothesis would be accepted or rejected. The DoE identified the factors with the highest risk of affecting the output variables and helped to establish the design space. After completion of the DoE, a confirmatory batch was made, which served as a diagnostic tool for evaluating the effectiveness of the generated model. Establishing a strategy to control the variables and responses is critically important in order to appropriately use the flexibility given to products developed or optimised using QbD principles. This study shows that the structured approach used in the Quality by Design methodology can be successfully applied to optimise a commercialised legacy product.
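
    As a sketch of how such a CCD/RSM analysis could be reproduced outside Minitab (the factors, simulated response, and coefficients below are hypothetical, not the study's data), a quadratic response-surface fit with an ANOVA screen at α = 0.05 might look like this:

```python
# Illustrative sketch with hypothetical data: face-centred central composite
# design (CCD), quadratic response-surface fit, and ANOVA screening of terms.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Face-centred CCD in coded units: factorial, axial, and centre points.
factorial = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
axial = [(-1, 0), (1, 0), (0, -1), (0, 1)]
centre = [(0, 0)] * 3
design = pd.DataFrame(factorial + axial + centre,
                      columns=["compression_force", "lod"])

# Placeholder response: simulated tablet friability with curvature and noise.
rng = np.random.default_rng(1)
design["friability"] = (
    0.6
    - 0.15 * design["compression_force"]
    + 0.08 * design["lod"]
    + 0.05 * design["compression_force"] ** 2
    + rng.normal(scale=0.02, size=len(design))
)

# Full quadratic response-surface model.
model = smf.ols(
    "friability ~ compression_force + lod + compression_force:lod"
    " + I(compression_force**2) + I(lod**2)",
    data=design,
).fit()

# Keep the terms that are significant at alpha = 0.05.
anova = sm.stats.anova_lm(model, typ=2)
print(anova[anova["PR(>F)"] < 0.05])
```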

    A foundation for machine learning in design

    This paper presents a formalism for considering the issues of learning in design. A foundation for machine learning in design (MLinD) is defined so as to provide answers to basic questions on learning in design, such as "What types of knowledge can be learnt?", "How does learning occur?", and "When does learning occur?". Five main elements of MLinD are presented: the input knowledge, knowledge transformers, output knowledge, goals/reasons for learning, and learning triggers. Using this foundation, published MLinD systems were reviewed. The systematic review provides a basis for validating the presented foundation. The paper concludes that considerable work remains to be carried out in order to fully formalize the foundation of MLinD.
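
    As a minimal illustration of the five elements (the class and the example entries below are hypothetical, not taken from the paper), the foundation could be expressed as a simple data structure for cataloguing reviewed systems:

```python
# Illustrative sketch: the five MLinD elements as a record used to catalogue
# published learning-in-design systems (names and entries are hypothetical).
from dataclasses import dataclass, field

@dataclass
class MLinDProfile:
    system_name: str
    input_knowledge: list[str] = field(default_factory=list)        # what the system learns from
    knowledge_transformers: list[str] = field(default_factory=list) # how learning occurs
    output_knowledge: list[str] = field(default_factory=list)       # what types of knowledge are learnt
    goals: list[str] = field(default_factory=list)                  # reasons for learning
    triggers: list[str] = field(default_factory=list)               # when learning occurs

# Hypothetical usage: cataloguing one reviewed system.
example = MLinDProfile(
    system_name="hypothetical design-learning system",
    input_knowledge=["past design cases"],
    knowledge_transformers=["induction over case features"],
    output_knowledge=["generalised design rules"],
    goals=["reduce repeated design effort"],
    triggers=["a new case is added to the case base"],
)
print(example.system_name, "->", example.output_knowledge)
```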