Software-export strategies for developing countries: A Caribbean perspective
The globalization of the software industry is seen to be driven in part by skill shortages in industrialised economies, the movement of software development practices away from centralised to more distributed modes, and the spread of information and communication technologies to less developed economies, where skilled labour is available at lower costs. As such, a software export industry is sometimes seen as a means by which some non-industrialised countries can create competitive advantage. While many studies have explored the software-exporting strategies used by the more successful of these countries, little research has been done in other locations that lack some of the basic resources deemed necessary for success in this area. This paper describes two Caribbean software-outsourcing ventures in order to explore possible software-export strategies available within such atypical contexts. The role of government and the degree of integration of the software outsourcer into the local context are found to feature significantly.
An approach to modeling database activity
Results in the field of data modeling currently suffer from many of the same ills which plagued data management systems in the late 1960's. Advanced semantic modeling systems such as the Semantic Data Model and the Relational Model/Tasmania are extremely complex to understand as well as somewhat ad hoc in design. Such systems capture only static snapshots of activity in the world being modeled. On the other hand, behavioral models which do attempt to model system dynamics typically provide less overall modeling power than comprehensive semantic models. Further, the specifications of behavior which can be expressed with such models are themselves static snapshots which are not integrated with other database objects. This work describes one approach for capturing dynamic relationships by distilling the concepts found in semantic and behavioral data models into a small number of flexible constructs. The resulting Prototype Activity Modeling System (PAMS) captures the containment, feedback, operational, and state dependency roles of entities in the world being modeled. Further, these definitions of database activity are captured as database objects (rather than as a schema) so as to allow dynamic manipulation of entity roles. The key concept of the approach is the bundle - a purposefully designed extension of time-proven relational database modeling concepts which includes support for presentation ordering and complex Cartesian aggregations. By applying the basic nested bundle principle, it is possible to obtain complex hierarchies of static structural information. The static templates so constructed, when used with a non-procedural query language and the value nomination principle, which reduces relations to scalar values when necessary, provide a conventional database modeling system for applications.
By extending these templates with the non-procedural thunk principle, which embeds query specifications within object definitions, variations caused by dependencies within the application can cause the apparent contents of the database description to change. When further extended by the activity monitoring principle, which records the interaction between the application and its environment, these dynamic templates can account for changes outside the scope of the application.
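The nested-bundle and thunk principles can be illustrated with a minimal sketch. The class names (`Bundle`, `Thunk`) and their structure are assumptions for exposition only, not the paper's actual constructs: a bundle is an ordered aggregation whose components may be static values, nested bundles, or embedded queries that are re-evaluated on access, so the apparent contents of the description change with the environment.

```python
# Hypothetical sketch of nested bundles and thunks; names and
# structure are illustrative assumptions, not the PAMS design.

class Thunk:
    """A query embedded in an object definition, evaluated on access."""
    def __init__(self, query):
        self.query = query          # a zero-argument callable

    def value(self):
        return self.query()        # re-evaluated each time, so the
                                   # apparent contents track current state

class Bundle:
    """An ordered aggregation of values, nested bundles, or thunks."""
    def __init__(self, *components):
        self.components = list(components)   # presentation order preserved

    def resolve(self):
        out = []
        for c in self.components:
            if isinstance(c, Bundle):
                out.append(c.resolve())      # nested bundle -> hierarchy
            elif isinstance(c, Thunk):
                out.append(c.value())        # thunk -> dynamic content
            else:
                out.append(c)                # static value
        return out

# Usage: a bundle whose apparent contents change with external state.
stock = {"widgets": 3}
inventory = Bundle("widgets",
                   Thunk(lambda: stock["widgets"]),
                   Bundle("reorder-level", 10))
print(inventory.resolve())   # ['widgets', 3, ['reorder-level', 10]]
stock["widgets"] = 7
print(inventory.resolve())   # ['widgets', 7, ['reorder-level', 10]]
```

The point of the sketch is the last two lines: the same template yields different apparent contents as the modeled environment changes, without rewriting the schema.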
DMFSGD: A Decentralized Matrix Factorization Algorithm for Network Distance Prediction
The knowledge of end-to-end network distances is essential to many Internet applications. As active probing of all pairwise distances is infeasible in large-scale networks, a natural idea is to measure a few pairs and to predict the others without actually measuring them. This paper formulates the distance prediction problem as matrix completion, where unknown entries of an incomplete matrix of pairwise distances are to be predicted. The problem is solvable because strong correlations among network distances exist and cause the constructed distance matrix to be low rank. The new formulation circumvents the well-known drawbacks of existing approaches based on Euclidean embedding. A new algorithm, called Decentralized Matrix Factorization by Stochastic Gradient Descent (DMFSGD), is proposed to solve the network distance prediction problem. By letting network nodes exchange messages with each other, the algorithm is fully decentralized and only requires each node to collect and process local measurements, with neither explicit matrix constructions nor special nodes such as landmarks and central servers. In addition, we comprehensively compare matrix factorization with Euclidean embedding to demonstrate the suitability of the former for network distance prediction. We further study the incorporation of a robust loss function and of non-negativity constraints. Extensive experiments on various publicly available datasets of network delays show not only the scalability and accuracy of our approach but also its usability in real Internet applications.
Comment: submitted to IEEE/ACM Transactions on Networking on Nov. 201
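The core technique, matrix factorization by stochastic gradient descent over only the observed entries, can be sketched as follows. This is a centralized simplification for clarity, not the DMFSGD algorithm itself, which runs equivalent updates with each node holding only its own factor rows; all parameter values are illustrative.

```python
# Matrix factorization by SGD on an incomplete matrix: a centralized,
# illustrative simplification of the decentralized scheme in the paper.
import numpy as np

def mf_sgd(D, mask, rank=3, lr=0.05, reg=0.05, epochs=500, seed=0):
    """Factor D ~= X @ Y.T using only entries where mask is True."""
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    X = rng.standard_normal((n, rank)) * 0.1
    Y = rng.standard_normal((n, rank)) * 0.1
    pairs = np.argwhere(mask)          # observed (i, j) index pairs
    for _ in range(epochs):
        rng.shuffle(pairs)
        for i, j in pairs:
            xi = X[i].copy()
            err = D[i, j] - xi @ Y[j]              # residual on one entry
            X[i] += lr * (err * Y[j] - reg * xi)   # gradient step on row i
            Y[j] += lr * (err * xi - reg * Y[j])   # gradient step on row j
    return X, Y

# Usage on a synthetic low-rank matrix with ~50% of entries observed.
rng = np.random.default_rng(1)
A = rng.random((20, 2))
D = A @ A.T                            # rank-2, hence completable
mask = rng.random(D.shape) < 0.5
X, Y = mf_sgd(D, mask)
print("fit error on observed entries:",
      np.abs(X @ Y.T - D)[mask].mean())
```

Unobserved entries are then predicted as `X[i] @ Y[j]`; the low-rank structure is what lets the observed half of the matrix constrain the missing half.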
Case study in Six Sigma methodology: manufacturing quality improvement and guidance for managers
This article discusses the successful implementation of Six Sigma methodology in a high precision and critical process in the manufacture of automotive products. The Six Sigma define-measure-analyse-improve-control approach resulted in a reduction of tolerance-related problems and improved the first pass yield from 85% to 99.4%. Data were collected on all possible causes, and regression analysis, hypothesis testing, Taguchi methods, classification and regression trees, etc. were used to analyse the data and draw conclusions. Implementation of Six Sigma methodology had a significant financial impact on the profitability of the company. An approximate saving of US$70,000 per annum was reported, which is in addition to the customer-facing benefits of improved quality on returns and sales. The project also had the benefit of allowing the company to learn useful messages that will guide future Six Sigma activities.
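The reported yield improvement can be translated into the usual Six Sigma metrics. This back-of-envelope conversion (DPMO and short-term sigma level with the conventional 1.5-sigma shift) is illustrative and not a calculation taken from the article.

```python
# Convert first pass yield (FPY) to DPMO and approximate sigma level,
# using the conventional 1.5-sigma long-term shift. Illustrative only.
from statistics import NormalDist

def dpmo(yield_fraction):
    """Defects per million opportunities implied by a yield."""
    return (1 - yield_fraction) * 1_000_000

def sigma_level(yield_fraction, shift=1.5):
    """Short-term sigma level: inverse-normal of yield plus the shift."""
    return NormalDist().inv_cdf(yield_fraction) + shift

for fpy in (0.85, 0.994):
    print(f"FPY {fpy:.1%}: DPMO {dpmo(fpy):,.0f}, "
          f"sigma level {sigma_level(fpy):.2f}")
```

On these numbers, the project moved the process from roughly 150,000 DPMO (about 2.5 sigma) to roughly 6,000 DPMO (about 4 sigma).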
The regulation of telecommunication in the United Kingdom of Great Britain & Northern Ireland
This paper reviews the application of national antitrust law and the implementation of the European Union's telecommunications directives to the markets in the United Kingdom, against the declared policy objective of raising national competitiveness. It illustrates the complexity of the systems that have been created over three decades, with complex and interlocking regulatory, self-regulatory, judicial and appellate bodies, interacting with the parliamentary systems to form a regulatory state. Where markets have failed, or were thought likely to fail, the state at different levels (UK, national and municipal) has supported studies and subsidized the provision of broadband Internet access. The regulator, using its sectoral antitrust powers, agreed with British Telecom on functional separation, transferring the enduring bottleneck of local access to a separate subsidiary. While the UK describes itself as a regulatory leader, this claim is difficult to evaluate, given the number and frequency of changes, and seems very difficult to substantiate.
Keywords: Governance, Competitiveness, Regulatory state, Great Britain, United Kingdom
A behavioural analysis of the adoption and use of interactive computer systems by senior managers
The purpose of this research has been to make a contribution to knowledge about those processes and phenomena which influence the use of computer-based decision systems by senior managers for their own decision activities. In the course of the thesis, research questions are addressed which relate to the nature of the role of the directly-accessed computer in the working life of the top manager, and especially to the factors which influence computer adoption and use. A review of relevant literature enabled gaps in existing knowledge about senior managerial computer use to be identified, and indicated the potential value of exploratory research. A programme of interviews was devised and executed which enabled the exploration of the research problem across a sample of senior managers from private and public organizations. It is felt that the methodology of performing intra- and inter-organizational comparisons among computer-exposed managers was fundamental to achieving new insights into managerial behaviours. Following quantitative and qualitative analysis of the research data, a dynamic behavioural model of the computer adoption process in large organizations is proposed, together with a description of salient behavioural features at key points in the process. This theoretical model contributes to an understanding of the nature and circumstances of the senior managerial behaviours associated with direct computer use.
ProXcache: A new cache deployment strategy in information-centric network for mitigating path and content redundancy
One of the promising paradigms for resource sharing while maintaining the basic Internet semantics is Information-Centric Networking (ICN). ICN differs from the current Internet in its ability to refer to contents by name, partly dissociating from the host-to-host practice of Internet Protocol addresses. Moreover, content caching in ICN is the major mechanism for achieving content networking and reducing the amount of server access.
The current caching practice in ICN, Leave Copy Everywhere (LCE), generates problems of over-deposition of contents known as content redundancy, as well as path redundancy, lower cache-hit rates in heterogeneous networks and lower content diversity. This study proposes a new cache deployment strategy, referred to as ProXcache, which acquires node relationships using the hyperedge concept of hypergraphs for cache positioning. The study formulates these relationships through path and distance approximation to mitigate content and path redundancy, and adopted the Design Research Methodology approach to achieve the stated research objectives. ProXcache was investigated by simulation on the Abilene, GEANT and DTelekom network topologies against the LCE and ProbCache caching strategies, with the Zipf distribution used to vary content categorization. The results show that overall content and path redundancy are minimized, with fewer caching operations: six depositions per request, compared to nine and nineteen for ProbCache and LCE respectively. ProXcache also yields a better content diversity ratio of 80%, against 20% and 49% for LCE and ProbCache respectively, as the cache sizes varied, and improves the cache-hit ratio through proxy positions. These results have significant influence on the development of ICN for better management of contents in the Future Internet.
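The over-deposition problem that motivates ProXcache can be shown with a toy model. The sketch below contrasts LCE, which leaves a copy at every router on the delivery path after a miss, with a selective placement that deposits a single copy (here at the path midpoint, a hypothetical stand-in for ProXcache's hypergraph-based positioning, whose actual rule is not reproduced):

```python
# Toy illustration of cache deposition along a delivery path in ICN.
# "midpoint" is a hypothetical selective strategy standing in for
# proximity-based placement; it is NOT the ProXcache algorithm.

def fetch(path_caches, name, place):
    """Fetch `name` over a path of router caches; `place` chooses
    which routers store a copy on a miss. Returns depositions made."""
    if any(name in cache for cache in path_caches):
        return 0                       # cache hit somewhere on the path
    targets = place(path_caches)
    for cache in targets:
        cache.add(name)                # deposit a copy
    return len(targets)

lce = lambda caches: caches                           # copy everywhere
midpoint = lambda caches: [caches[len(caches) // 2]]  # one copy, mid-path

path_a = [set() for _ in range(5)]
path_b = [set() for _ in range(5)]
print(fetch(path_a, "video/1", lce))       # 5 depositions (redundancy)
print(fetch(path_b, "video/1", midpoint))  # 1 deposition
print(fetch(path_b, "video/1", midpoint))  # 0: served from cache
```

Even this crude selective rule cuts depositions per request while still producing later hits, which is the trade-off the study quantifies (six depositions per request for ProXcache versus nineteen for LCE).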