
    Managing Intellectual Property to Foster Agricultural Development

    Over the past decades, consideration of IPRs has become increasingly important in many areas of agricultural development, including foreign direct investment, technology transfer, trade, investment in innovation, access to genetic resources, and the protection of traditional knowledge. The widening role of IPRs in governing the ownership of—and access to—innovation, information, and knowledge makes them particularly critical in ensuring that developing countries benefit from the introduction of new technologies that could radically alter the welfare of the poor. Failing to improve IPR policies and practices to support the needs of developing countries will eliminate significant development opportunities. The discussion in this note moves away from policy prescriptions to focus on investments to improve how IPRs are used in practice in agricultural development. These investments must be seen as complementary to other investments in agricultural development. IPRs are woven into the context of innovation and R&D. They can enable entrepreneurship and allow the leveraging of private resources for resolving the problems of poverty. Conversely, IPR issues can delay important scientific advancements, deter investment in products for the poor, and impose crippling transaction costs on organizations if the wrong tools are used or tools are badly applied. The central benefit of pursuing the investments outlined in this note is to build into the system a more robust capacity for strategic and flexible use of IPRs tailored to development goals.

    A Heuristic Neural Network Structure Relying on Fuzzy Logic for Images Scoring

    Traditional deep learning methods are sub-optimal in classifying ambiguous features, which often arise in noisy, hard-to-predict categories, especially when distinguishing semantic scores. Semantic scoring, which depends on semantic logic to implement evaluation, inevitably involves fuzzy descriptions and missing concepts; for example, the ambiguous relationship between normal and probably normal always presents unclear boundaries (normal - more likely normal - probably normal). Thus, human error is common when annotating images. Differing from existing methods that focus on modifying the kernel structure of neural networks, this study proposes a dominant fuzzy fully connected layer (FFCL) for Breast Imaging Reporting and Data System (BI-RADS) scoring and validates the universality of this proposed structure. The proposed model aims to develop complementary properties of scoring for semantic paradigms, constructing fuzzy rules based on an analysis of human thought patterns, and in particular to reduce the influence of semantic conglutination. Specifically, a semantic-sensitive defuzzier layer projects features occupied by relative categories into semantic space, and a fuzzy decoder modifies the probabilities of the last output layer with reference to the global trend. Moreover, the ambiguous semantic space between two relative categories shrinks during the learning phases, as the positive and negative growth trends of one category appearing among its relatives are taken into account. We first used the Euclidean Distance (ED) to measure the distance between the real and predicted scores, and then employed a two-sample t-test to demonstrate the advantage of the FFCL architecture. Extensive experimental results on the CBIS-DDSM dataset show that our FFCL structure achieves superior performance for both triple and multiclass classification in BI-RADS scoring, outperforming state-of-the-art methods.
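The core idea of softening hard boundaries between adjacent ordinal scores can be illustrated with a small sketch. This is not the paper's FFCL implementation; it is a minimal illustration, assuming a triangular membership function and plain NumPy (`fuzzy_rescore` and `width` are invented names), of how fuzzy memberships can redistribute softmax probability mass between neighbouring BI-RADS-style categories.

```python
import numpy as np

def triangular_membership(center: float, width: float, x: np.ndarray) -> np.ndarray:
    """Triangular fuzzy membership centred on an ordinal category index."""
    return np.clip(1.0 - np.abs(x - center) / width, 0.0, 1.0)

def fuzzy_rescore(probs: np.ndarray, width: float = 1.5) -> np.ndarray:
    """Smooth a network's softmax output over ordered categories.

    Each category borrows probability mass from its ordinal neighbours
    according to a triangular membership function, softening the hard
    boundary between adjacent scores (e.g. normal vs probably normal).
    """
    k = probs.shape[-1]
    idx = np.arange(k, dtype=float)
    # membership[i, j] = degree to which category j supports category i
    membership = np.stack([triangular_membership(c, width, idx) for c in idx])
    fuzzy = membership @ probs
    return fuzzy / fuzzy.sum()  # renormalise to a distribution

# A confident score-2 prediction leaks mass to neighbouring scores 1 and 3.
p = np.array([0.05, 0.15, 0.70, 0.10])
print(fuzzy_rescore(p))
```

The reweighting keeps the dominant category but spreads belief toward ordinally adjacent scores, mimicking the unclear boundaries the abstract describes.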

    Computing Fast and Scalable Table Cartograms for Large Tables

    Given an m x n table T of positive weights and a rectangle R with an area equal to the sum of the weights, a table cartogram computes a partition of R into m x n convex quadrilateral faces such that each face has the same adjacencies as its corresponding cell in T, and has an area equal to the cell's weight. In this thesis, we explored different table cartogram algorithms for large tables with thousands of cells and investigated the potential applications of large table cartograms. We implemented Evans et al.'s table cartogram algorithm that guarantees zero area error and adapted a diffusion-based cartographic transformation approach, FastFlow, to produce large table cartograms. We introduced a constraint optimization-based table cartogram generation technique, TCarto, leveraging the concept of force-directed layout. We implemented TCarto with column-based and quadtree-based parallelization to compute table cartograms for tables with thousands of cells. We presented several potential applications of large table cartograms for creating diagrammatic representations in various real-life scenarios, e.g., for analyzing spatial correlations between geospatial variables, understanding clusters and densities in scatterplots, and creating visual effects in images (i.e., expanding illumination, mosaic art effect). We presented an empirical comparison among these three table cartogram techniques with two different real-life datasets: a meteorological weather dataset and a US State-to-State migration flow dataset. FastFlow and TCarto both performed well on the weather data table. However, for the US State-to-State migration flow data, where the table contained many local optima with high value differences among adjacent cells, FastFlow generated concave quadrilateral faces.
We also investigated potential relationships among different measurement metrics such as cartographic error (accuracy), average aspect ratio (the readability of the visualization), computational speed, and the grid size of the table. Furthermore, we augmented our proposed TCarto with an angle constraint to enhance the readability of the visualization at the cost of some cartographic error, and inspected the relationship of the restricted angles with the accuracy and readability of the visualization. In the output of the angle-constrained TCarto algorithm on the US State-to-State migration dataset, the rows and columns of a cell were difficult to identify up to a 20-degree angle constraint, but became identifiable beyond a 40-degree angle constraint.
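The zero-area-error requirement can be checked mechanically: each quadrilateral face's area (via the shoelace formula) is compared against its target weight. The following is a minimal sketch assuming a mean-relative-deviation definition of cartographic error; the thesis may define the metric differently, and the function names are invented for illustration.

```python
import numpy as np

def quad_area(corners: np.ndarray) -> float:
    """Shoelace area of a quadrilateral given 4 (x, y) corners in order."""
    x, y = corners[:, 0], corners[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def cartographic_error(faces: list, weights: np.ndarray) -> float:
    """Mean relative deviation between each face's area and its target
    weight -- one common way to score a cartogram's accuracy."""
    areas = np.array([quad_area(f) for f in faces])
    return float(np.mean(np.abs(areas - weights) / weights))

# Two faces tiling a 2x1 rectangle; target weights 1.2 and 0.8.
faces = [np.array([[0, 0], [1.2, 0], [1.2, 1], [0, 1]]),
         np.array([[1.2, 0], [2, 0], [2, 1], [1.2, 1]])]
print(cartographic_error(faces, np.array([1.2, 0.8])))  # → 0.0
```

A zero-area-error algorithm such as Evans et al.'s drives this metric to zero exactly, whereas diffusion- or force-based methods like FastFlow and TCarto trade residual error for speed.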

    Teachers’ education in GCE: emerging issues in a comparative perspective

    Global Citizenship Education (GCE) has received increasing attention as a means of supporting children and young people in developing their knowledge and understanding of multiple global issues (Bourn, 2015; Fricke & Gathercole, 2015). Despite this increasing prominence, it is apparent that GCE remains a highly contested notion (Marshall, 2005; Hartung, 2017; Jooste & Heleta, 2017). The report draws on an analysis of nine Global Citizenship Education teacher education programmes across four EU countries: Austria, Czech Republic, Ireland and Italy. The methodology adopted is a multiple-site case study design (Yin, 2014), using ethnography as the main method. The research employed a qualitative framework, seeking to describe practices and the meanings attached to behavioural patterns. Within each country, the research focused on an analysis of two typologies of settings: a) a training course for in-service primary school teachers organized by the project partner in the framework of the project, and b) a training course organized by a different organization on themes related to Global Citizenship Education. The selection of the second setting was based on extreme case sampling: this setting was chosen for being as different as possible from the initial one in terms of approach, goals, teacher trainers and organization (but on the same or a related theme). The theoretical assumption behind this choice is that, being organized by a different institution, the ideas behind course implementation are more likely to differ. On this basis, the report argues that the teacher education programmes at the focus of this research present a transformational approach to GCE, in which the concepts of critical thinking and self-reflection are perceived as the foundations for action towards a more just and sustainable world.
This conception echoes aspects of Freirean pedagogy, itself an important GCE theoretical framework (Scheunpflug & Asbrand, 2006). However, less evident was the ‘critical’ approach to GCE illustrated in the work of Andreotti (2006).

    Probing with Noise: Unpicking the Warp and Weft of Taxonomic and Thematic Meaning Representations in Static and Contextual Embeddings

    The semantic relatedness of words has two key dimensions: it can be based on taxonomic information or thematic, co-occurrence-based information. These are captured by different language resources—taxonomies and natural corpora—from which we can build different computational meaning representations that are able to reflect these relationships. Vector representations are arguably the most popular meaning representations in NLP, encoding information in a shared multidimensional semantic space and allowing for distances between points to reflect relatedness between items that populate the space. Improving our understanding of how different types of linguistic information are encoded in vector space can provide valuable insights to the field of model interpretability and can further our understanding of different encoder architectures. Alongside vector dimensions, we argue that information can be encoded in more implicit ways and hypothesise that it is possible for the vector magnitude—the norm—to also carry linguistic information. We develop a method to test this hypothesis and provide a systematic exploration of the role of the vector norm in encoding the different axes of semantic relatedness across a variety of vector representations, including taxonomic, thematic, static and contextual embeddings. The method is an extension of the standard probing framework and allows for relative intrinsic interpretations of probing results. It relies on introducing targeted noise that ablates information encoded in embeddings and is grounded by solid baselines and confidence intervals. We call the method probing with noise and test the method at both the word and sentence level, on a host of established linguistic probing tasks, as well as two new semantic probing tasks: hypernymy and idiomatic usage detection. 
Our experiments show that the method is able to provide geometric insights into embeddings and can demonstrate whether the norm encodes the linguistic information being probed for. This confirms the existence of separate information containers in English word2vec, GloVe and BERT embeddings. The experiments and complementary analyses show that different encoders encode different kinds of linguistic information in the norm: taxonomic vectors store hypernym-hyponym information in the norm, while non-taxonomic vectors do not. Meanwhile, non-taxonomic GloVe embeddings encode syntactic and sentence length information in the vector norm, while the contextual BERT encodes contextual incongruity. Our method can thus reveal where in the embeddings certain information is contained. Furthermore, it can be supplemented by an array of post-hoc analyses that reveal how information is encoded as well, thus offering valuable structural and geometric insights into the different types of embeddings.
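The idea of targeted noise can be sketched as follows. Assuming the goal is to ablate one information container while leaving the other intact (the exact interventions used in the thesis may differ, and the function names here are invented), one operation randomly rescales each vector, destroying norm information while preserving direction, and another replaces the direction with random noise of equal length, destroying dimension-wise information while preserving the norm.

```python
import numpy as np

rng = np.random.default_rng(0)

def ablate_norm(embs: np.ndarray) -> np.ndarray:
    """Destroy the information carried by vector length while keeping
    direction: rescale every vector by a random positive factor."""
    scales = rng.uniform(0.5, 2.0, size=(embs.shape[0], 1))
    return embs * scales

def ablate_dimensions(embs: np.ndarray) -> np.ndarray:
    """Destroy dimension-wise information while keeping each vector's
    norm: replace the direction with random noise of the same length."""
    noise = rng.standard_normal(embs.shape)
    unit_noise = noise / np.linalg.norm(noise, axis=1, keepdims=True)
    norms = np.linalg.norm(embs, axis=1, keepdims=True)
    return unit_noise * norms

embs = rng.standard_normal((5, 8))
print(np.linalg.norm(ablate_dimensions(embs), axis=1))  # norms preserved
```

If a probe still succeeds after one ablation but fails after the other, the probed information can be attributed to the surviving container, which is the comparative logic the probing-with-noise framework relies on.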

    Absorptive Capacity in SMEs: A Comparative Study of the Financial and the Tourism Sectors in Malta

    Many modern economies are largely characterised by knowledge-intensive service industries, constantly battling a ferociously competitive business environment. As a result, the management of a firm’s knowledge has become crucial in determining the sustainable competitive success of an organization. This research compares the knowledge management practices of service sector firms, particularly those that are knowledge intensive, such as the financial services sector, and those that are less knowledge intensive, such as firms in the tourism sector. The study was conducted using a mixed methodology comprising in-depth face-to-face interviews and a qualitative survey. Structural Equation Modelling was used to interpret the data collected from the survey. This study proposes a framework designed specifically to explain absorptive capacity in service sector SMEs. The framework presented (figure 9.2, p.404) shows how, in small service sector firms, power relationships act as driving factors for the internal and external processes and routines of the firm, which, in turn, shape ACAP. This analysis exposes seventeen points of interest, which identify the Knowledge Management (KM) behaviour of firms in the tourism and financial services sectors and reveal eleven convergent practices across both sectors. The study proceeds to identify six divergent KM practices across the industries and a further three points on which the firms in the financial services sector gave evidence of differing practices amongst themselves. The overarching conclusion from this study, however, is that the behaviour of SMEs is greatly influenced by their size, which, in turn, dictates the extent of the influence and control which the owner exercises on the operation.

    GEOBIA 2016: Solutions and Synergies, 14-16 September 2016, University of Twente Faculty of Geo-Information and Earth Observation (ITC): open access e-book


    The 2P-K Framework: A Personal Knowledge Measurement Framework for the Pharmaceutical Industry

    Knowledge is a dynamic human process of justifying personal belief in pursuit of the truth. The intellectual output of any organisation is reliant upon the individual people within that organisation. Despite the eminent role of personal knowledge in organisations, personal knowledge management and measurement have received little attention, particularly in pharmaceutical manufacturing. The pharmaceutical industry is one of the pillars of the global economy and a knowledge-intensive sector where knowledge is described as the second product after medicines. The need for measurement to achieve effective management is not a new concept in the management literature. This study offers an explanatory framework for personal knowledge, its underlying constructs and observed measures in the pharmaceutical manufacturing context. Following a sequential mixed method research (MMR) design, the researcher developed a measurement framework based on the thematic analysis of fifteen semi-structured interviews with industry experts, considering the extant academic and regulatory literature. A survey of 190 practitioners from the pharmaceutical manufacturing sector enabled quantitative testing and validation of the proposed models utilising confirmatory factor analysis. The pharmaceutical personal knowledge framework was the fruit of a comprehensive study to explain and measure the manifestations of personal knowledge in pharmaceutical organisations. The proposed framework identifies 41 personal knowledge measures reflecting six latent factors and the underlying personal knowledge. The hypothesised factors include: regulatory awareness, performance, wisdom, organisational understanding, mastery of product and process, and communication and networking skills. In order to enhance the applicability and flexibility of the measurement framework, an abbreviated 15-item form of the original framework was developed.
The abbreviated pharmaceutical personal knowledge (2P-K) framework demonstrated superior model fit, better accuracy and reliability. The research results reveal that over 80% of the participating pharmaceutical organisations had a form of structured KM system. However, less than 30% integrated KM with corporate strategies, suggesting that KM is still in the early stages of development in the pharmaceutical industry. Also, personal knowledge measurement is still a subjective practice and predominantly an informal process. The 2P-K framework offers researchers and scholars a theoretically grounded original model for measuring personal knowledge. It also offers a basis for a personal knowledge measurement scale (2P-K-S) in the pharmaceutical manufacturing context. Finally, the study had some limitations. The framework survey relied on self-ratings, which might pose a risk of social desirability bias and the Dunning–Kruger effect; consequently, a 360-degree survey was suggested to achieve accurate assessments. Also, the model was developed and tested in an industry-specific context. A comparative study in similar manufacturing industries (e.g. chemical industries) is recommended to assess the validity of the current model, or a modified version of it, in other industries.
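As an illustration of the kind of reliability check that typically accompanies confirmatory factor analysis of a measurement scale (the thesis's own validation procedure is not reproduced here, and the variable names below are invented), Cronbach's alpha for a set of scale items can be computed directly.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal-consistency reliability of a set of scale items.

    `items` is an (n_respondents, k_items) matrix of ratings.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Synthetic 5-point ratings: three items driven by one latent factor.
rng = np.random.default_rng(1)
latent = rng.normal(3, 1, size=200)
items = np.clip(np.round(latent[:, None] + rng.normal(0, 0.5, (200, 3))), 1, 5)
print(round(cronbach_alpha(items), 2))
```

High alpha for items loading on the same latent factor is one piece of evidence that an abbreviated form, such as the 15-item 2P-K, retains the reliability of the full 41-measure instrument.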

    Working Paper Series

    Interest in business management thinking and innovation has continued to grow during recent decades. The Scottish Government identifies that a large proportion of new and start-up businesses fail within the first 2 years. Consequently, there are many avenues for the start-up entrepreneur to obtain information and help; nonetheless, the trend remains. This study offers an alternative method for deciding on intrinsic success factors by outlining the relationship between business start-ups, creativity, and innovation. The focus was on how creativity, as an entrepreneurial characteristic, links to or affects the start-up capability of the entrepreneur. The study used a qualitative method to interpret this complexity, which became more apparent as the study progressed, since creativity and innovation which support a business start-up are holistic, flux-like and complex concepts. Four main themes emerged from the thematic data analysis: Leadership, Ability to Change, Creativeness and Collaboration. Findings from the study indicate that business management thinking and innovation, underpinned by these themes, help the entrepreneur see and appreciate the complex multi-faceted interactions of innovation, perhaps better than an average person. However, a precise definition of the mechanisms needed to support business start-ups drawn from creativity was difficult to establish. In conclusion, while elements of creativity were present in each of the entrepreneurs and were clearly significant to the success of the start-up, it would seem very difficult to identify whether there is such a thing as a guaranteed creativity template for success.