1,252,278 research outputs found

    High efficiency cathodes for SOFCs prepared by spray-pyrolysis

    In recent years, lowering the operating temperature of Solid Oxide Fuel Cells (SOFCs) to the intermediate temperature range (500-700 °C) has become the main challenge for this technology. The electrolyte resistance can be substantially reduced by using thin-film electrolytes; however, the cathode polarization resistance is responsible for much of the loss in performance at low temperatures. Thus, the development of cathode materials with high electrocatalytic activity for the oxygen reduction reaction (ORR) is essential for this technology. Lanthanum strontium manganite La1-xSrxMnO3-ÎŽ (LSM) is the cathode material most widely used in SOFCs. However, LSM exhibits a high activation energy for the ORR and poor ionic conductivity, limiting its application to high operating temperatures. Alternative mixed ionic-electronic conductors, such as La1-xSrxCo1-yFeyO3-ÎŽ (LSCF) and GdBaCo2O5+x (GBC), have been investigated and exhibit better performance in the intermediate temperature range [1]. The performance of these electrodes can be improved at reduced temperature by extending the triple phase boundary (TPB) length, where the gas, electrode and electrolyte phases are simultaneously in contact and which serves as the predominant site for the electrochemical reactions. To date, the preparation of electrodes via wet infiltration of a cation solution into a porous electrolyte backbone is one of the most effective methods to increase the TPB area and improve the efficiency of the cathodes, despite the limitations of this process for large-scale manufacturing of SOFCs. In this contribution an alternative preparation method based on spray-pyrolysis deposition onto an electrolyte backbone is proposed, which possesses a series of advantages with respect to the classical wet infiltration process, including easy industrial implementation, preparation in a single deposition/thermal step and low cost [2]. The most widely used cathode compositions in SOFC technology were prepared by this alternative method: the La1-xSrxMnO3-ÎŽ and La0.6Sr0.4Co1-yFeyO3-ÎŽ (y = 0-2) series. The electrodes were deposited on porous Ce0.8Gd0.1O1.95 (CGO) backbones at 250 °C by conventional spray pyrolysis from an aqueous precursor solution of metal nitrates. The structure, microstructure and electrochemical properties of these materials were investigated by X-ray diffraction, field-emission SEM (Fig. 1.a) and impedance spectroscopy on symmetrical cells. The polarization resistance values (Fig. 1.b) are extremely low, ranging from 0.40 Ω cm2 for LSM to 0.07 Ω cm2 for LSCF0.2 at 600 °C in air, compared with values previously reported in the literature for electrodes deposited at high temperature on commercial electrolytes, e.g. 25 Ω cm2 for LSM.
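
    The area-specific polarization resistance quoted above is conventionally obtained from symmetrical-cell impedance spectra: the difference between the low- and high-frequency intercepts of the impedance arc gives the polarization resistance of both electrodes, which is halved and scaled by the electrode area. A minimal Python sketch of that post-processing step, using made-up impedance values purely for illustration (this is not the authors' analysis code):

```python
import numpy as np

def polarization_resistance(z_complex, area_cm2):
    """Estimate the area-specific polarization resistance (ASR) of one
    electrode from symmetrical-cell impedance data.

    z_complex : complex impedance values (ohms), ordered from high to low frequency
    area_cm2  : electrode area in cm^2
    """
    z = np.asarray(z_complex)
    # High-frequency (ohmic) intercept: real part where -Im(Z) is closest to zero
    r_ohmic = z.real[np.argmin(np.abs(z.imag[: len(z) // 2]))]
    # Low-frequency intercept approximated by the largest real part of the arc
    r_total = z.real.max()
    # Two identical electrodes in series -> halve, then scale by area
    return 0.5 * (r_total - r_ohmic) * area_cm2   # ohm*cm^2 per electrode

# Illustrative (made-up) arc: 1 ohm series resistance + 0.3 ohm polarization arc
f = np.logspace(5, -1, 200)                 # frequency sweep, Hz (high to low)
tau = 1e-3                                  # time constant of the arc, s
z = 1.0 + 0.3 / (1 + 1j * 2 * np.pi * f * tau)
print(f"ASR ~ {polarization_resistance(z, area_cm2=0.5):.3f} ohm*cm^2")
```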

    Exploring the Impact of Performance-Based Funding Policy Reform: The Role of Institutional Research in Supporting Data-Driven Decision-Making

    The institutional pressures placed on the Ontario college system, exercised through funding model reform, brought forward organizational challenges difficult for even the most fiscally savvy to navigate. The enrollment corridor mechanism and the expansion of the proportions of the differentiation envelope to create a performance-based grant, implemented via the 2020-25 Strategic Mandate Agreement (SMA3), demonstrate the Provincial Government’s calls for efficiencies and accountability and the alignment of institutional and provincial priorities. Remaining financially sustainable while moving from performance reporting to performance funding and weathering the impacts of the COVID-19 pandemic requires a solid understanding not only of enrollment challenges and opportunities but also of the data and information used to inform decisions. Institutional Research (IR) units are responsible for providing leaders with data and information for this work. However, access to data and information does not imply their effective use (Marsh et al., 2006), pointing to a gap in data literacy skills amongst higher education leaders (Mathies, 2018). The problem of practice examined here is the role of IR in supporting effective data-driven decision-making related to the achievement of College X's enrollment and SMA3 priorities. This Organizational Improvement Plan proposes that an existing Strategic Enrollment Management governance structure be leveraged for the development and implementation of a group-level capacity-building strategy. The planned change is used to inform enhancements to existing data tools and resources that are responsive to stakeholder needs and mindful of organizational context. The Change Path Model (Cawsey et al., 2016) provides the framework to implement this solution using distributed and adaptive leadership approaches.

    Damage identification in structural health monitoring: a brief review from its implementation to the Use of data-driven applications

    The damage identification process provides relevant information about the current state of a structure under inspection, and it can be approached from two different points of view. The first approach uses data-driven algorithms, which are usually associated with the collection of data using sensors; the data are subsequently processed and analyzed. The second approach uses models to analyze information about the structure. In the latter case, the overall performance of the approach depends on the accuracy of the model and the information used to define it. Although both approaches are widely used, data-driven algorithms are preferred in most cases because they afford the ability to analyze data acquired from sensors and to provide a real-time solution for decision making; however, they require high-performance processors because of their high computational cost. As a contribution to researchers working with data-driven algorithms and applications, this work presents a brief review of data-driven algorithms for damage identification in structural health-monitoring applications. The review covers damage detection, localization, classification, extension, and prognosis, as well as the development of smart structures. The literature is systematically reviewed according to the natural steps of a structural health-monitoring system. The review also includes information on the types of sensors used and on the development of data-driven algorithms for damage identification.
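
    As a generic illustration of the data-driven detection step surveyed in such reviews (not a method drawn from any specific paper), a common baseline is to learn a principal-component subspace from features measured on the healthy structure and flag new measurements whose reconstruction error exceeds a threshold. A minimal sketch, assuming vibration features have already been condensed into fixed-length vectors:

```python
import numpy as np

def fit_baseline(healthy_features, n_components=3):
    """Fit a PCA subspace to feature vectors measured on the undamaged structure."""
    mean = healthy_features.mean(axis=0)
    centered = healthy_features - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]                      # principal directions
    residuals = centered - centered @ basis.T @ basis
    scores = np.linalg.norm(residuals, axis=1)     # reconstruction errors
    threshold = scores.mean() + 3 * scores.std()   # simple 3-sigma limit
    return mean, basis, threshold

def is_damaged(x, mean, basis, threshold):
    """Flag a new feature vector whose reconstruction error exceeds the limit."""
    r = (x - mean) - (x - mean) @ basis.T @ basis
    return np.linalg.norm(r) > threshold

# Toy data: 200 healthy feature vectors, plus one artificially shifted "damaged" case
rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, size=(200, 8))
mean, basis, thr = fit_baseline(healthy)
print(is_damaged(healthy[0], mean, basis, thr))        # usually False
print(is_damaged(healthy[0] + 4.0, mean, basis, thr))  # True: large reconstruction error
```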

    Final report on the farmer's aid in plant disease diagnoses

    This report is the final report on the FAD project. The FAD project was initiated in September 1985 to test the expert system shell Babylon by developing a prototype crop disease diagnosis system in it. A short overview of the history of the project and the main problems encountered is given in Chapter 1. Chapter 2 describes the result of an attempt to integrate JSD with modelling techniques such as generalisation and aggregation, and Chapter 3 concentrates on the method we used to elicit phytopathological knowledge from specialists. Chapter 4 gives the result of knowledge acquisition for the 10 wheat diseases most commonly occurring in the Netherlands. The user interface is described briefly in Chapter 5, and Chapter 6 gives an overview of the additions we made to the implementation since the version of FAD reported in our second report. Chapter 7, finally, summarises the conclusions of the project and gives recommendations for follow-up projects.
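
    The diagnostic core of an expert system prototype of this kind is typically a set of if-then rules mapping observed symptoms to candidate diseases. The sketch below is a hypothetical Python illustration of that idea only; the rules and symptom names are invented and are not taken from the FAD knowledge base:

```python
# Hypothetical knowledge base: each disease is linked to the symptoms that support it.
RULES = {
    "yellow rust": {"yellow stripes on leaves", "pustules in lines"},
    "powdery mildew": {"white powdery patches", "stunted growth"},
    "septoria leaf blotch": {"brown lesions", "black fruiting bodies"},
}

def diagnose(observed_symptoms):
    """Rank candidate diseases by the fraction of their supporting symptoms observed."""
    observed = set(observed_symptoms)
    scores = {
        disease: len(symptoms & observed) / len(symptoms)
        for disease, symptoms in RULES.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(diagnose({"yellow stripes on leaves", "pustules in lines"}))
# [('yellow rust', 1.0), ('powdery mildew', 0.0), ('septoria leaf blotch', 0.0)]
```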

    Standardization Framework for Sustainability from Circular Economy 4.0

    The circular economy (CE) is widely known as a way to implement and achieve sustainability, mainly due to its contribution towards the separation of biological and technical nutrients under cyclic industrial metabolism. The incorporation of the principles of the CE in the links of the value chain of the various sectors of the economy strives to ensure circularity, safety, and efficiency. The framework proposed is aligned with the goals of the 2030 Agenda for Sustainable Development regarding the orientation towards the mitigation and regeneration of the metabolic rift by considering a double perspective. Firstly, it strives to conceptualize the CE as a paradigm of sustainability. Its principles are established, and its techniques and tools are organized into two frameworks oriented towards causes (cradle to cradle) and effects (life cycle assessment), and these are structured under the three pillars of sustainability, for their projection within the proposed framework. Secondly, a framework is established to facilitate the implementation of the CE with the use of standards, which constitute the requirements, tools, and indicators to control each life cycle phase, and of key enabling technologies (KETs) that add circular value 4.0 to the socio-ecological transition

    Mixing multi-core CPUs and GPUs for scientific simulation software

    Recent technological and economic developments have led to widespread availability of multi-core CPUs and specialist accelerator processors such as graphical processing units (GPUs). The accelerated computational performance possible from these devices can be very high for some application paradigms. Software languages and systems such as NVIDIA's CUDA and the Khronos consortium's open compute language (OpenCL) support a number of individual parallel application programming paradigms. To scale up the performance of some complex systems simulations, a hybrid of multi-core CPUs for coarse-grained parallelism and very many core GPUs for data parallelism is necessary. We describe our use of hybrid applications using threading approaches and multi-core CPUs to control independent GPU devices. We present speed-up data and discuss multi-threading software issues for the applications-level programmer, and offer some suggested areas for language development and integration between coarse-grained and fine-grained multi-thread systems. We discuss results from three common simulation algorithmic areas, including: partial differential equations; graph cluster metric calculations; and random number generation. We report on programming experiences and selected performance for these algorithms on single and multiple GPUs, multi-core CPUs, a CellBE, and using OpenCL. We discuss programmer usability issues and the outlook and trends in multi-core programming for scientific applications developers.
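
    A generic illustration of the coarse-grained pattern described above is one host thread per accelerator, each thread driving its own device while the CPU handles partitioning and reassembly. In the Python sketch below the per-device kernel is simulated with NumPy so the example runs anywhere; in a real hybrid code each worker would bind to a device (for example via CUDA, OpenCL or a wrapper library) before launching its data-parallel kernels. Names such as run_on_device are illustrative, not from the paper:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

N_DEVICES = 2  # assumed number of accelerators; one host thread per device

def run_on_device(device_id, chunk):
    """Stand-in for the per-device work. A real implementation would select the
    accelerator here and launch data-parallel kernels; we simulate the kernel
    with a NumPy stencil sweep over this device's slab of the problem."""
    grid = chunk.copy()
    for _ in range(100):  # fake iterative PDE-style update
        grid[1:-1] = 0.5 * (grid[:-2] + grid[2:])
    return device_id, grid

# Coarse-grained parallelism: split the domain across devices, one thread each.
data = np.random.rand(1_000_000)
chunks = np.array_split(data, N_DEVICES)
with ThreadPoolExecutor(max_workers=N_DEVICES) as pool:
    results = list(pool.map(run_on_device, range(N_DEVICES), chunks))

# Reassemble in device order on the host; the fine-grained work stays on the devices.
merged = np.concatenate([grid for _, grid in sorted(results)])
print(merged.shape)
```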

    Low Power Implementation of Non Power-of-Two FFTs on Coarse-Grain Reconfigurable Architectures

    The DRM standard for digital radio broadcast in the AM band requires integrated devices for radio receivers at very low power. A System on Chip (SoC) called DiMITRI was developed based on a dual ARM9 RISC core architecture. Analyses showed that most of the computation power is used in the Coded Orthogonal Frequency Division Multiplexing (COFDM) demodulation to compute Fast Fourier Transforms (FFT) and inverse transforms (IFFT) on complex samples. These FFTs have to be computed on non-power-of-two numbers of samples, which is very uncommon in the signal processing world. The results obtained with this chip led to the objective of decreasing the power dissipated by the COFDM demodulation part using a coarse-grain reconfigurable structure as a coprocessor. This paper introduces two different coarse-grain architectures, PACT XPP technology and the Montium, developed by the University of Twente, and presents the implementation of a Fast Fourier Transform on 1920 complex samples. The implementation results on the Montium show a saving of a factor of 35 in processing time and of 14 in power consumption compared with the RISC implementation, as well as a smaller area. As a conclusion, the paper presents the next steps of the development and some development issues.
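
    A transform length of 1920 is not a power of two, but it factors into small primes (2^7 · 3 · 5), so it can be computed with a mixed-radix decomposition rather than the textbook radix-2 algorithm. The sketch below is a generic illustration of that decomposition, not the Montium or XPP implementation: each stage splits an N-point DFT into p interleaved DFTs of length N/p using the smallest prime factor of N.

```python
import cmath

def mixed_radix_fft(x):
    """Recursive mixed-radix DFT for any length whose prime factors are small.
    Illustrative only; a production kernel would use iterative butterflies."""
    n = len(x)
    if n == 1:
        return list(x)
    p = next(d for d in range(2, n + 1) if n % d == 0)  # smallest prime factor of n
    m = n // p
    # p sub-transforms over the decimated-in-time subsequences x[r], x[r+p], ...
    subs = [mixed_radix_fft(x[r::p]) for r in range(p)]
    out = [0j] * n
    for k in range(n):
        acc = 0j
        for r in range(p):
            # twiddle factor combining sub-transform r at output bin k
            acc += subs[r][k % m] * cmath.exp(-2j * cmath.pi * r * k / n)
        out[k] = acc
    return out

# 1920 = 2**7 * 3 * 5, the non-power-of-two COFDM symbol size mentioned above
x = [complex(i % 7, 0) for i in range(1920)]
X = mixed_radix_fft(x)
print(abs(X[0]))  # DC bin; equals sum(x) as for a direct DFT
```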

    The 2007-13 operational programmes: a preliminary assessment: Spring – Autumn 2005

    A preliminary assessment of the 2007-13 operational programmes on EU cohesion policy

    Obvious: a meta-toolkit to encapsulate information visualization toolkits. One toolkit to bind them all

    This article describes “Obvious”: a meta-toolkit that abstracts and encapsulates information visualization toolkits implemented in the Java language. It aims to unify their use and to postpone the choice of which concrete toolkit(s) to use until later in the development of visual analytics applications. We also report on the lessons we have learned when wrapping popular toolkits with Obvious, namely Prefuse, the InfoVis Toolkit, partly Improvise, JUNG and other data management libraries. We show several examples of the use of Obvious and of how the different toolkits can be combined, for instance by sharing their data models. We also show how Weka and RapidMiner, two popular machine-learning toolkits, have been wrapped with Obvious and can be used directly with all the other wrapped toolkits. We expect Obvious to start a co-evolution process: Obvious is meant to evolve as more components of information visualization systems become consensual. It is also designed to help information visualization systems adhere to best practices, providing a higher level of interoperability and leveraging the domain of visual analytics.
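
    Obvious itself is a Java API and its actual classes are not reproduced here; the underlying idea, a neutral data model plus thin adapters so that different toolkits can consume the same tables, can be sketched language-agnostically. A hypothetical Python illustration of that wrapper pattern (none of the class or method names below come from the Obvious API):

```python
class Table:
    """A minimal, toolkit-neutral data model shared by all wrappers."""
    def __init__(self, columns):
        self.columns = list(columns)
        self.rows = []

    def add_row(self, **values):
        self.rows.append({c: values.get(c) for c in self.columns})

class ToolkitAdapter:
    """Base class for wrappers around concrete visualization toolkits."""
    def render(self, table: Table) -> str:
        raise NotImplementedError

class AsciiBarAdapter(ToolkitAdapter):
    """Stand-in for a real toolkit binding; draws bars as plain text."""
    def render(self, table):
        return "\n".join(f"{r['label']:>8} " + "#" * int(r["value"]) for r in table.rows)

# One shared data model, consumed by any number of toolkit adapters.
data = Table(["label", "value"])
data.add_row(label="nodes", value=12)
data.add_row(label="edges", value=7)
print(AsciiBarAdapter().render(data))
```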
    • 
