
    Aligning teaching and learning approach and assessment method to enhance the quality of higher design education

    The design of modern higher education programmes in industrial and product design must strive to meet the career expectations of both employers and students. How educators attempt to match the expectations of both students and industry is of critical importance to the knowledge-driven UK economy. Recent research indicates that skills, knowledge and creativity (SKC) are equally important for seasoned designers and graduates alike (Guo, 2011). These attributes are important, but not sufficient; most employers also require the ability to manage, to liaise with clients and to apply real-world experience (Zerillo, 2005). There is, however, a mismatch between employers' expectations and the abilities of graduate students in a number of areas. In fact, a staggering 21% of UK-based design consultancies said they were 'not at all satisfied' with the skills of the graduates they were employing in the areas of design skills, literacy, mathematical ability and, perhaps most importantly, business awareness. Teaching is the delivery or transmission of information and knowledge to facilitate students' learning, whether through growth or by changing their conceptions of the subject matter, and it draws on a number of established teaching paradigms and learning models (Fox, 1983; Martin, 1990; Dall'Alba, 1990; Samuelowicz and Bain, 1992). The teaching and learning methods employed at university levels 4 and 5 (years 1 and 2) are often the rote and peer-based communication models delivered in most secondary schools. In the final year of a degree, the teaching and learning style is more often aligned with the self-directed method, giving students greater freedom to select their preferred direction and to equip themselves to meet the requirements and needs of industry. The method of assessment plays a significant role in a student's education: many students place great emphasis on passing a module without first seeking to understand its content. Intelligent selection of the assessment method is therefore critical to ensuring that students are strategically steered towards learning the new skills and knowledge laid out in the syllabus. This paper shows that the teaching and learning approach and the method of assessment are of equal importance in curriculum and module design. Furthermore, a roadmap, an idealized "learner's journey", is presented for studies at the higher education level. This model considers student aspirations, universities'/HEIs' curriculum design, the skills students must acquire to fulfil their career goals, and the needs of the product and industrial design industries.

    Teams Frightened of Failure Fail More: Modelling Reward Sensitivity in Teamwork

    According to Gray's Reinforcement Sensitivity Theory (RST), individuals have differing sensitivities to rewards and punishments, which in turn affect their behaviours. The behavioural inhibition system (BIS) is associated with sensitivity to punishment, while the behavioural activation system (BAS) is associated with sensitivity to reward. In this work, we model BIS/BAS by extending an existing agent-based model of team collaboration, in order to explore the effect of this more complex and realistic personality structure on team performance. We investigate the significance of BIS/BAS for team behaviour on tasks with differing levels of uncertainty. Our findings include a prediction that, for tasks with uncertainty, a majority of personality types are significantly influenced by the behavioural activation system, and that all personality types are significantly negatively influenced by the behavioural inhibition system: the more sensitive a team is to punishment, the worse it performs.
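
    The abstract does not reproduce the model itself, but the mechanism it describes is easy to illustrate. Below is a minimal Python sketch of reward/punishment sensitivity inside an agent-based team simulation; the update rule, parameter values and names are our own illustrative assumptions, not the authors' model.

        import numpy as np

        rng = np.random.default_rng(0)

        class Agent:
            """Toy team member whose risk appetite is driven by BIS/BAS sensitivity."""
            def __init__(self, bis, bas):
                self.bis = bis      # sensitivity to punishment (inhibition) -- assumed scale
                self.bas = bas      # sensitivity to reward (activation) -- assumed scale
                self.risk = 0.5     # propensity to attempt an uncertain subtask

            def step(self, uncertainty):
                # an attempt succeeds less often as task uncertainty grows
                success = rng.random() < (1.0 - uncertainty) * self.risk + 0.25
                if success:
                    self.risk = min(1.0, self.risk + 0.05 * self.bas)  # reward activates
                else:
                    self.risk = max(0.0, self.risk - 0.05 * self.bis)  # punishment inhibits
                return success

        def team_performance(bis, bas, uncertainty, n_agents=10, n_steps=200):
            team = [Agent(bis, bas) for _ in range(n_agents)]
            return np.mean([a.step(uncertainty) for _ in range(n_steps) for a in team])

        for bis in (0.2, 1.0):
            print(f"BIS={bis}: performance={team_performance(bis, bas=0.5, uncertainty=0.6):.3f}")

    Even this toy reproduces the headline effect: under uncertainty, punishment-sensitive agents ratchet their risk-taking down after failures and the team's success rate falls with it.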

    Interpersonal relations in China: Expatriates’ perspective on the development and use of guanxi

    The literature on social networks identifies relationship building through guanxi as an effective way for Western organizations to reduce their liability of foreignness in China. Even though it is individuals rather than organizations who build these relationships, the focus in the previous literature has been on organizational outcomes, and only a handful of studies have attempted to explain how expatriates perceive that guanxi relations are built and maintained. To help address this issue, we conducted in-depth, semi-structured interviews with 36 Western expatriates working in China. Our findings suggest that guanxi is perceived as an informal process used to build trust between individuals, which in turn can reduce the uncertainty around contract enforcement in China. We also find that the process of building guanxi between parties is initiated by the individual whose organization has less market power. Finally, the findings suggest that firms should be cautious if they elect to use agents as intermediaries to help connect to, and build relations with, buyers and sellers.

    Analysis of reaction and timing attacks against cryptosystems based on sparse parity-check codes

    In this paper we study reaction and timing attacks against cryptosystems based on sparse parity-check codes, which encompass low-density parity-check (LDPC) codes and moderate-density parity-check (MDPC) codes. We show that the feasibility of these attacks is not strictly tied to the quasi-cyclic (QC) structure of the code, but rather stems from the intrinsically probabilistic decoding of any sparse parity-check code. These attacks therefore not only work against QC codes but can be generalized to broader classes of codes. We provide a novel algorithm that, in the case of a QC code, recovers a larger amount of information than is retrievable through existing attacks, and we use this algorithm to characterize new side-channel information leakages. We devise a theoretical model for the decoder that describes and justifies our results, and provide numerical simulations confirming the effectiveness of our approach.
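
    As background, the side channel at issue arises because a probabilistic decoder's running time, in particular its iteration count, depends on the error pattern being corrected. The toy Python sketch below (a generic parallel bit-flipping decoder on a random sparse code, not the authors' attack algorithm) makes that variability visible:

        import numpy as np

        rng = np.random.default_rng(1)

        def bit_flip_decode(H, y, max_iters=50):
            """Toy parallel bit-flipping decoder; returns (estimate, iterations)."""
            y = y.copy()
            for it in range(max_iters):
                syndrome = H @ y % 2
                if not syndrome.any():
                    return y, it                  # decoding converged
                # count unsatisfied checks touching each bit; flip the worst offenders
                votes = H.T @ syndrome
                y ^= (votes == votes.max()).astype(y.dtype)
            return y, max_iters                   # decoding failure

        # sparse random parity-check matrix (stand-in for an LDPC/MDPC code)
        n, m, row_weight = 60, 30, 6
        H = np.zeros((m, n), dtype=np.uint8)
        for row in H:
            row[rng.choice(n, row_weight, replace=False)] = 1

        # iteration counts vary with the error pattern: the timing side channel
        for trial in range(5):
            e = np.zeros(n, dtype=np.uint8)
            e[rng.choice(n, 4, replace=False)] = 1    # weight-4 error on the zero codeword
            _, iters = bit_flip_decode(H, e)
            print(f"trial {trial}: decoder took {iters} iterations")

    An attacker who can time decapsulation (or observe decoding failures) thus learns something about the error vector, independently of whether the code is quasi-cyclic.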

    Validation of a cationic polyacrylamide flocculant for harvesting fresh and seawater microalgal biomass

    A simple, efficient and fast-settling flocculation technique for harvesting microalgal biomass was demonstrated using a proprietary cationic polyacrylamide flocculant on a freshwater (Chlorella vulgaris) and a marine (Phaeodactylum tricornutum) microalgal culture at their mid-stationary growth phase. The optimal flocculant doses were 18.9 and 13.7 mg/g of dry algal biomass for C. vulgaris and P. tricornutum, respectively (equivalent to 7 g per m³ of algal culture for both species). The optimal dose correlated well with changes in cell surface charge and in the optical density and turbidity of the culture solution. At the optimal dose, charge neutralization of 64% and 86% was observed for C. vulgaris and P. tricornutum cells, respectively. Algae recovery was independent of the culture solution pH in the range of pH 6 to 9. Biomass recovery of 100% and 90% was achieved for C. vulgaris and P. tricornutum, respectively, and over 98% medium recovery was achievable by simple decanting.
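
    As a quick plausibility check, the per-biomass and volumetric doses together imply the dry-biomass density of the cultures. The unit interpretation below is our own assumption (7 g/m³ read as 7 mg per litre of culture), not a calculation taken from the paper:

        # implied dry-biomass density that reconciles 7 g/m^3 with the per-biomass doses
        dose_volumetric_mg_per_L = 7.0   # 7 g/m^3 == 7 mg/L of culture (assumed reading)
        for species, dose_mg_per_g in [("C. vulgaris", 18.9), ("P. tricornutum", 13.7)]:
            density_g_per_L = dose_volumetric_mg_per_L / dose_mg_per_g
            print(f"{species}: ~{density_g_per_L:.2f} g dry biomass per litre")

    The implied densities (roughly 0.37 and 0.51 g/L) are of the order expected for stationary-phase cultures, so the two dose figures are mutually consistent.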

    Running Experiments with Confidence and Sanity

    Analyzing data from large experimental suites is a daily task for anyone doing experimental algorithmics. In this paper we report on several approaches we tried for this seemingly mundane task in a similarity search setting, reflecting on the challenges it poses. We conclude by proposing a workflow, which can be implemented using several tools, that makes it possible to analyze experimental data with confidence. The extended version of this paper and the support code are provided at https://github.com/Cecca/running-experiments.
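
    The abstract defers the concrete workflow to the extended version; one pattern it is compatible with is persisting every run together with its full configuration, so that later analyses are reproducible and self-describing. A minimal Python sketch under that assumption (the schema and field names are ours, not those of the linked support code):

        import json, sqlite3, time

        # one row per experimental run, with the full configuration stored
        # alongside the measurements (assumed schema, for illustration only)
        db = sqlite3.connect("experiments.db")
        db.execute("""CREATE TABLE IF NOT EXISTS runs (
            timestamp REAL, algorithm TEXT, params TEXT, dataset TEXT,
            recall REAL, queries_per_sec REAL)""")

        def record_run(algorithm, params, dataset, recall, qps):
            db.execute("INSERT INTO runs VALUES (?, ?, ?, ?, ?, ?)",
                       (time.time(), algorithm, json.dumps(params), dataset, recall, qps))
            db.commit()

        record_run("lsh", {"tables": 8, "hash_bits": 16}, "glove-100", 0.87, 1450.0)
        for row in db.execute("SELECT algorithm, recall, queries_per_sec FROM runs"):
            print(row)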

    LEDAkem: a post-quantum key encapsulation mechanism based on QC-LDPC codes

    This work presents a new code-based key encapsulation mechanism (KEM) called LEDAkem. It is built on the Niederreiter cryptosystem and relies on quasi-cyclic low-density parity-check codes as secret codes, providing high decoding speed and compact keypairs. LEDAkem uses ephemeral keys to foil known statistical attacks, and takes advantage of a new decoding algorithm that is faster than the classical bit-flipping decoder commonly adopted in systems of this kind. The main attacks against LEDAkem are investigated, taking quantum speedups into account, and instances of LEDAkem are designed to achieve different security levels against classical and quantum computers. Performance figures obtained with an efficient C99 implementation of LEDAkem are provided.
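
    For orientation, a Niederreiter-style KEM encapsulates a shared secret derived from a random low-weight error vector whose syndrome serves as the ciphertext. The toy Python sketch below mirrors only that structure: the tiny parameters, the random parity-check matrix and the brute-force decoder are illustrative stand-ins for LEDAkem's QC-LDPC codes and improved bit-flipping decoder, not the scheme itself.

        import hashlib, itertools
        import numpy as np

        rng = np.random.default_rng(42)
        n, r, t = 24, 20, 2    # toy sizes; real LEDAkem parameters are far larger

        def keygen():
            # a random dense parity-check matrix stands in for the secret
            # QC-LDPC structure (which is what makes real decapsulation fast)
            return rng.integers(0, 2, size=(r, n), dtype=np.uint8)

        def encap(H):
            # sample a random weight-t error vector; the ciphertext is its
            # syndrome, and the shared secret is a hash of the error vector
            e = np.zeros(n, dtype=np.uint8)
            e[rng.choice(n, t, replace=False)] = 1
            return H @ e % 2, hashlib.sha256(e.tobytes()).digest()

        def decap(H, c):
            # toy decoder: brute-force syndrome decoding over weight-t patterns
            # (assumes the syndrome identifies e uniquely, true w.h.p. here)
            for support in itertools.combinations(range(n), t):
                e = np.zeros(n, dtype=np.uint8)
                e[list(support)] = 1
                if np.array_equal(H @ e % 2, c):
                    return hashlib.sha256(e.tobytes()).digest()

        H = keygen()
        ciphertext, secret = encap(H)
        print("shared secrets match:", decap(H, ciphertext) == secret)

    The security of the real scheme rests on the fact that, without the sparse secret structure, syndrome decoding is infeasible at cryptographic sizes, whereas the legitimate party decodes quickly.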

    On the equivalence of Eulerian and Lagrangian variables for the two-component Camassa-Holm system

    The Camassa-Holm equation and its generalization, the two-component Camassa-Holm system, both exhibit wave breaking in finite time. To analyze this, and to obtain solutions past wave breaking, it is common to reformulate the original equation, given in Eulerian coordinates, as a system of ordinary differential equations in Lagrangian coordinates. It is of considerable interest to study the stability of solutions and how it manifests itself in Eulerian and Lagrangian variables. We identify criteria of convergence such that convergence in Eulerian coordinates is equivalent to convergence in Lagrangian coordinates. In addition, we show how one can approximate global conservative solutions of the scalar Camassa-Holm equation by smooth solutions of the two-component Camassa-Holm system that do not experience wave breaking.
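
    For reference, in one common normalization the equations under discussion read as follows (standard forms, reproduced here for context):

        % Camassa-Holm equation (Eulerian coordinates)
        u_t - u_{txx} + 3 u u_x - 2 u_x u_{xx} - u u_{xxx} = 0,

        % two-component Camassa-Holm system: the same equation coupled
        % to a density \rho transported by the flow
        u_t - u_{txx} + 3 u u_x - 2 u_x u_{xx} - u u_{xxx} + \rho \rho_x = 0,
        \qquad \rho_t + (u \rho)_x = 0.

    Setting \rho \equiv 0 recovers the scalar equation, which is why smooth two-component solutions with small but nonvanishing \rho can approximate conservative solutions of the scalar equation while avoiding wave breaking.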

    A transcriptome-driven analysis of epithelial brushings and bronchial biopsies to define asthma phenotypes in U-BIOPRED

    RATIONALE AND OBJECTIVES: Asthma is a heterogeneous disease driven by diverse immunologic and inflammatory mechanisms. We used transcriptomic profiling of airway tissues to help define asthma phenotypes. METHODS: The transcriptomes of bronchial biopsies and epithelial brushings from 107 patients with moderate-to-severe asthma were annotated by gene-set variation analysis (GSVA) using 42 gene signatures relevant to asthma, inflammation and immune function. Topological data analysis (TDA) of clinical and histological data was used to derive clusters, and the nearest shrunken centroid algorithm was used for signature refinement. RESULTS: Nine GSVA signatures expressed in bronchial biopsies and airway epithelial brushings distinguished two distinct asthma subtypes associated with high expression of T-helper type 2 (Th2) cytokines and lack of corticosteroid response (Group 1 and Group 3). Group 1 had the highest submucosal eosinophil counts, high exhaled nitric oxide (FeNO) levels, and high exacerbation rates and oral corticosteroid (OCS) use, whilst Group 3 patients showed the highest levels of sputum eosinophils and had a high BMI. In contrast, Group 2 and Group 4 patients had an 86% and 64% probability, respectively, of having non-eosinophilic inflammation. Using machine-learning tools, we describe an inference scheme using the currently available inflammatory biomarkers sputum eosinophilia and exhaled nitric oxide levels, along with OCS use, that could predict the subtypes of gene expression within bronchial biopsies and epithelial cells with good sensitivity and specificity. CONCLUSION: This analysis demonstrates the usefulness of a transcriptomics-driven approach to phenotyping that segments patients who may benefit the most from specific agents targeting Th2-mediated inflammation and/or corticosteroid insensitivity.
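
    The nearest shrunken centroid algorithm mentioned above is available off the shelf. The Python sketch below shows the shape of such an inference scheme, mapping three routinely available measurements to a subtype label; all feature values and labels are invented stand-ins for illustration, since the actual classifier must be fit on the U-BIOPRED cohort.

        import numpy as np
        from sklearn.neighbors import NearestCentroid

        rng = np.random.default_rng(7)

        # synthetic stand-in data: [sputum eosinophils %, FeNO ppb, on OCS (0/1)]
        X = np.vstack([
            rng.normal([12.0, 45.0, 1.0], [4.0, 10.0, 0.1], size=(30, 3)),  # Th2-high-like
            rng.normal([1.5, 20.0, 0.0], [1.0, 5.0, 0.1], size=(30, 3)),    # non-eosinophilic-like
        ])
        y = np.array([1] * 30 + [2] * 30)

        # nearest shrunken centroid: shrink_threshold pulls uninformative
        # features' class centroids toward the overall mean (the "shrinkage")
        clf = NearestCentroid(shrink_threshold=0.5).fit(X, y)
        print(clf.predict([[10.0, 50.0, 1.0], [1.0, 18.0, 0.0]]))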