128 research outputs found
Absorptive capacity and internationalization of New Zealand high-tech SMEs in the agro-technology sector
This study investigates the relationships between a firm's technology, its absorptive capacity, and the internationalization process in high-tech SMEs. The research identifies the most influential factors that affect the international activities and expansion decisions of New Zealand high-tech SMEs with core capabilities in agro-technology.
Mixed methods, combining qualitative and quantitative elements in data collection and analysis, were employed because a deeper understanding of complex issues such as the internationalization process and absorptive capacity required methodological variety. The qualitative and quantitative methods were used in parallel to study the same subject, each with its own objectives; together they offered rich empirical data and a more comprehensive understanding of the subject under study.
The findings show that it is absorptive capacity that explains the internationalization process, not the internationalization process that explains absorptive capacity. The practice of internationalizing is as much a reflection of a firm's absorptive capacity as it is its determinant. The research identifies that high-tech SMEs possess technological and non-core absorptive capacity, which influence firms' strategies in different ways. The research suggests that a firm's technological capabilities and the advantage of specialized knowledge, along with its limited non-core absorptive capacity, act as constraints on the development of future international strategy in high-tech SMEs.
The study expands the existing literature on internationalization by developing variables for evaluating absorptive capacity in firms. This helped develop an absorptive capacity model which can be used as a valuable self-assessment tool by firms to gain insight towards further growth and development. The research suggests that if firms were able to measure their absorptive capacity, this could result in improved business activities and an enhanced presence in the world market.
The results of this study should encourage firms to identify, capture and articulate the knowledge achieved by their ventures. Managers must develop and nurture skills that ensure effective integration of learning as their firms expand, particularly internationally. These findings, and the absorptive capacity model offered as a tool, should encourage managers to explore when, where, and how to best use a firm's resources in business operations. This is particularly important with regard to the research context (high-tech SMEs), where scientists are managers as well.
Fuzzy special logic functions and applications
In this thesis, four special logic functions (threshold functions, monotone increasing functions, monotone decreasing functions, and unate functions) are extended to more general functions, which allows the activity of these special functions to be a fuzzy rather than a 1-or-0 process. These generalized functions are called fuzzy special logic functions and are based on the concepts and techniques developed in fuzzy logic and fuzzy languages. Algorithms for determining C(n) and Cmax(n) and for generating the most dissimilar fuzzy special logic functions, as well as important properties and results, are investigated. Examples are given to illustrate these special logic functions. In addition, their applications -- function representation, data compression, error correction, and a monotone flash analog-to-digital converter -- their relationships, and fuzzy classification are also presented. It is shown that fuzzy logic theory can be used successfully on these four special logic functions to normalize the grade of membership function μ in the interval [0, 1]. As a result, the techniques described in this thesis may be of use in the study of other special logic functions, and this fertile field is well worth further research and development.
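As a minimal illustration of the fuzzification idea described above, the sketch below extends a classical threshold function so that it returns a membership grade μ in [0, 1] instead of a hard 0-or-1 output. The weights, threshold, and linear normalization used here are hypothetical choices for demonstration, not the specific constructions developed in the thesis.

```python
# Illustrative sketch: a classical threshold (linearly separable) logic
# function and a fuzzy extension whose output is a membership grade
# mu in [0, 1] rather than a hard 0-or-1 value.
# The weights, threshold, and normalization are invented for this example.

def threshold_function(x, weights, t):
    """Classical threshold function: 1 iff the weighted sum reaches t."""
    return 1 if sum(w * xi for w, xi in zip(weights, x)) >= t else 0

def fuzzy_threshold_function(x, weights, t):
    """Fuzzy version: map the weighted sum's excess over t into [0, 1].

    Inputs xi may themselves be membership grades in [0, 1]."""
    s = sum(w * xi for w, xi in zip(weights, x))
    span = sum(abs(w) for w in weights) or 1.0
    mu = 0.5 + (s - t) / (2 * span)       # linear normalization into [0, 1]
    return max(0.0, min(1.0, mu))

w, t = [2, 1, 1], 2
print(threshold_function([1, 0, 0], w, t))              # hard output: 1
print(threshold_function([0, 1, 0], w, t))              # hard output: 0
print(fuzzy_threshold_function([0.6, 0.5, 0.0], w, t))  # graded membership
```

The fuzzy variant degrades gracefully near the threshold, which is the qualitative behaviour the thesis's fuzzification is after, even though its exact membership functions differ.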
A Formalism for Visual Query Interface Design
The massive volumes and the huge variety of large knowledge bases make information exploration and analysis difficult. An important activity is data filtering and selection, in which both querying and visualization play important roles. Interfaces for data exploration environments normally include both, integrating them as tightly as possible. But many features of information exploration environments, such as visual representation of queries, visualization of query results, and interactive data selection from visualizations, have only been studied separately. The intrinsic connections between them have not been described formally. The lack of formal descriptions inhibits the development of techniques that produce new representations for queries, and natural integration of visual query specification with query result visualization.
This thesis presents a formalism that describes the basic components of information exploration and their relationships in information exploration environments. The key aspect of the formalism is that it unifies querying and visualization within a single framework, providing a foundation for designing and analysing visual query interfaces.
Various innovative designs of visual query representations can be derived from the formalism. Simply comparing them with existing ones is not enough; it is more important to discover why one visual representation is better or worse than another. To do this it is necessary to understand users' cognitive activities, and to know how these cognitive activities are enhanced or inhibited by different presentations of a query, so that novel interfaces can be created and improved based on user testing.
This thesis presents a new experimental methodology for evaluating query representations, which uses stimulus onset asynchrony to separate different aspects of query comprehension. This methodology was used to evaluate a new visual query representation based on Karnaugh maps, showing that there are two qualitatively different approaches to comprehension: deductive and inductive. The Karnaugh map representation scales extremely well with query complexity, and the experiment shows that its good scaling properties occur because it strongly facilitates inductive comprehension.
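To make the Karnaugh-map idea concrete, the sketch below renders a Boolean query as a Karnaugh-map-style grid: rows and columns enumerate Gray-code-ordered variable assignments, and a cell is marked when the query selects that combination, so the rows satisfying the query cluster into contiguous blocks. The query and variable names are invented for illustration; the thesis's actual visual design is far richer than this text rendering.

```python
# Hypothetical text rendering of a Boolean query as a Karnaugh-map grid.
# Adjacent rows/columns differ in exactly one variable (Gray-code order),
# so the cells a query selects form contiguous rectangular clusters.

def gray(n):
    """Gray-code sequence for n variables: successive codes differ by one bit."""
    return [i ^ (i >> 1) for i in range(1 << n)]

def kmap(query, row_vars, col_vars):
    rows, cols = gray(len(row_vars)), gray(len(col_vars))
    def bits(code, nvars):
        return {v: (code >> (len(nvars) - 1 - i)) & 1 for i, v in enumerate(nvars)}
    grid = []
    for r in rows:
        row = []
        for c in cols:
            env = {**bits(r, row_vars), **bits(c, col_vars)}
            row.append('#' if query(env) else '.')
        grid.append(''.join(row))
    return grid

# Example query (arbitrary): (A AND B) OR (NOT C AND D)
q = lambda e: (e['A'] and e['B']) or (not e['C'] and e['D'])
for line in kmap(q, ['A', 'B'], ['C', 'D']):
    print(line)
```

The clustered `#` regions are what give the map representation its good scaling behaviour: a conjunctive clause reads as one rectangle regardless of how many variables the query mentions.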
Enhancing the Quality of Planning of Software Development Projects
As business competition gets tougher, there is much pressure on software development projects to become more productive and efficient. Previous research has shown that quality planning is a key factor in enhancing the success of software development projects. The research method selected for this study was design science research (DSR), and the design science research process (DSRP) model was adopted to conduct the study. This research describes the design and development of the quality of planning (QPLAN) tool and the quality of planning evaluation model (QPEM), two innovative artefacts that evaluate the quality of project planning and introduce best planning practices, such as providing references from historical data, suggesting appropriate management approaches, and including lessons learnt in the software development process. In particular, the QPEM is based on cognitive maps that represent the project manager's know-how, characteristics and technological expertise, as well as top management support, enterprise environmental factors and the quality of methods and tools, in a form that corresponds closely with human perception. Data were collected from 66 projects undertaken in 12 organisations from eight types of industries in six countries. The results show that the QPLAN tool has contributed significantly to enhancing the success rate of projects.
Using Boolean algebra to model the economic decision-making / Usando a álgebra booleana para modelar a tomada de decisão econômica
The study of decision theory in economics has been characterized by normative models rather than the descriptive models typical of other social sciences. This paper presents a theoretical approach to a descriptive model, defining the elements required to take a particular course of action. The use of tools from Boolean algebra is proposed, such as truth tables and their associated functions, to explain the decision process behind one or several actions. Thanks to the reduced functions, one can observe which variables really affect certain decisions of different economic agents such as individuals, firms, and governments, among others. In addition, it is found that high levels of utility come from having the best available information and making satisfactory decisions, rather than from unrealistic maximization processes.
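A minimal sketch of the approach described above, under invented assumptions: model a decision as a Boolean function of condition variables, enumerate its truth table, and detect which variables actually affect the outcome (the analogue of reading the reduced function). The "buy" decision and its conditions are hypothetical examples, not taken from the paper.

```python
# Sketch: a decision as a Boolean function over condition variables.
# A variable is "essential" if flipping it can change the decision;
# inessential variables are the ones a reduced function drops.
from itertools import product

def essential_variables(f, names):
    """Return the variables whose value can change f's output."""
    essential = set()
    for values in product([0, 1], repeat=len(names)):
        env = dict(zip(names, values))
        for v in names:
            flipped = {**env, v: 1 - env[v]}
            if f(env) != f(flipped):
                essential.add(v)
    return essential

names = ['price_ok', 'need', 'fashionable']
# Hypothetical decision rule: buy iff the price is acceptable and the
# good is needed; 'fashionable' is listed but never changes the outcome.
buy = lambda e: e['price_ok'] and e['need']
print(essential_variables(buy, names))   # 'fashionable' drops out
```

This brute-force check is exponential in the number of variables; for the small condition sets of a single decision that is exactly the truth-table enumeration the paper works with.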
Representing archaeological uncertainty in cultural informatics
This thesis sets out to explore, describe, quantify, and visualise uncertainty in a cultural informatics context, with a focus on archaeological reconstructions. For quite some time, archaeologists and heritage experts have been criticising the often too-realistic appearance of three-dimensional reconstructions. They have been highlighting one of the unique features of archaeology: the information we have on our heritage will always be incomplete. This incompleteness should be reflected in digitised reconstructions of the past.
This criticism is the driving force behind this thesis. The research examines archaeological theory and inferential process and provides insight into computer visualisation. It describes how these two areas, of archaeology and computer graphics, have formed a useful, but often tumultuous, relationship through the years.
By examining the uncertainty background of disciplines such as GIS, medicine, and law, the thesis postulates that archaeological visualisation, in order to mature, must move towards archaeological knowledge visualisation. Three sequential areas are proposed in this thesis for the initial exploration of archaeological uncertainty: identification, quantification and modelling. The main contributions of the thesis lie in those three areas.
Firstly, through the innovative design, distribution, and analysis of a questionnaire, the thesis identifies the importance of uncertainty in archaeological interpretation and discovers potential preferences among different evidence types.
Secondly, the thesis uniquely analyses and evaluates, in relation to archaeological uncertainty, three different belief quantification models. The varying ways in which these mathematical models work are also evaluated through simulated experiments. Comparison of results indicates significant convergence between the models.
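As an illustration of what a belief quantification model looks like in this setting, the sketch below combines two sources of archaeological evidence using Dempster-Shafer theory, one standard belief model; whether this is among the three models the thesis compares is not stated here, and the frame, mass values, and evidence sources are invented.

```python
# Illustrative Dempster-Shafer sketch: two mass functions over a frame of
# hypotheses about a reconstruction detail are fused with Dempster's rule.
# The evidence sources and numbers are hypothetical.

def combine(m1, m2):
    """Dempster's rule of combination over frozenset-keyed mass functions."""
    combined, conflict = {}, 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2          # mass on disjoint hypotheses
    norm = 1.0 - conflict                    # renormalize away the conflict
    return {k: v / norm for k, v in combined.items()}, conflict

FLAT = frozenset(['flat'])
GABLED = frozenset(['gabled'])
EITHER = FLAT | GABLED                       # the whole frame = ignorance

# Source 1 (excavated roof tiles) and source 2 (a textual record), each
# leaving some mass on the whole frame to express its own uncertainty.
m_tiles = {GABLED: 0.6, EITHER: 0.4}
m_text = {FLAT: 0.3, EITHER: 0.7}
m, conflict = combine(m_tiles, m_text)
print(m, conflict)
```

The explicit conflict term is what makes this family of models attractive for archaeology: disagreement between evidence types is measured rather than silently averaged away.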
Thirdly, a novel approach to archaeological uncertainty and evidence conflict visualisation is presented, influenced by information visualisation schemes. Lastly, suggestions for future semantic extensions to this research are presented through the design and development of new plugins to a search engine.
The hippocampal formation from a machine learning perspective
Nowadays, sensor devices can generate huge amounts of data in short periods of time. In many situations, the data collected by different sensors reflects a specific phenomenon but is presented in very different types and formats. In these cases, it is hard to determine how these distinct types of data are related to each other, or whether together they indicate a certain condition. In this context, it would be of great importance to develop a system capable of analysing such data in the smallest amount of time to produce valid information.
The brain is a biological organ capable of doing something similar with the information obtained through the senses. Inside the brain there is a structure called the Hippocampus, situated in the temporal lobe. Its main function is to analyse the sensory data encoded by the Entorhinal Cortex to create new memories. Since the Hippocampus has evolved over time to perform these tasks, it is of great importance to try to understand how it works and to model it, i.e. to define a set of computer algorithms that approximates it.
Since the removal of the Hippocampus from a patient suffering from seizures, the scientific community has believed that the Hippocampus is crucial for memory formation and for spatial navigation. Without it, it would not be possible to memorize places and events that happened at a specific time or place. Such functionality is achieved with the help of a set of cells called Grid Cells, located in the Entorhinal Cortex area, together with Place Cells, Head Direction Cells and Boundary Vector Cells. The combined information analysed by those cells allows the unique identification of places or events.
The main objective of the work developed in this Thesis consists of describing the biological mechanisms present in the Hippocampus area and defining potential computer models that allow the simulation of all, or at least the most critical, functions of both the Hippocampus and the Entorhinal Cortex areas.
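As an example of the kind of computational model involved, the sketch below implements one standard idealization of a grid cell: its firing rate at a location is the sum of three cosine gratings rotated 60 degrees apart, producing the characteristic hexagonal firing pattern. This is a textbook model offered for illustration; the spacing and orientation parameters are arbitrary, and the thesis's own models may differ.

```python
# Sketch of an idealized grid-cell firing map: three cosine gratings at
# 60-degree offsets sum to a hexagonal lattice of firing fields.
# Parameters (spacing, orientation) are arbitrary example values.
import math

def grid_cell_rate(x, y, spacing=1.0, orientation=0.0):
    """Normalized firing rate in [0, 1] at position (x, y)."""
    k = 4 * math.pi / (math.sqrt(3) * spacing)    # wave number for the spacing
    total = 0.0
    for i in range(3):
        theta = orientation + i * math.pi / 3     # three gratings, 60 deg apart
        total += math.cos(k * (x * math.cos(theta) + y * math.sin(theta)))
    return (total + 1.5) / 4.5                    # rescale from [-1.5, 3] to [0, 1]

print(grid_cell_rate(0.0, 0.0))   # a firing-field peak at the origin
```

A population of such cells with different spacings and phases, combined with place, head-direction and boundary-vector cells, is the usual computational account of how the circuit encodes position.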