
    The Worsening Shortage of College-Graduate Workers

    The Bureau of Labor Statistics (BLS) projections of occupational employment growth have consistently underpredicted the growth of skilled occupations. BLS currently projects that professional, technical, and managerial jobs will account for 44.5% of employment growth between 1988 and 2000, while we project they will account for 70% of employment growth. Between March 1988 and March 1991 these occupations in fact accounted for 87% of employment growth. The BLS's projections of the supply/demand balance for college graduates have also been off the mark: they predicted a surplus for the 1980s when, in fact, a shortage developed and relative wage ratios for college graduates rose to all-time highs. We project that the supply of college-educated workers will grow more slowly during the 1990s and that wage premiums for college graduates will continue to escalate.

    Issues Related to the Emergence of the Information Superhighway and California Societal Changes, IISTPS Report 96-4

    The Norman Y. Mineta International Institute for Surface Transportation Policy Studies (IISTPS) at San José State University (SJSU) conducted this project to review the continuing development of the Internet and the Information Superhighway. Emphasis was placed on an examination of the impact on commuting and working patterns in California, and an analysis of how public transportation agencies, including Caltrans, might take advantage of the new communications technologies. The document reviews the technology underlying the current Internet "structure" and examines anticipated developments. It is important to note that much of the research for this limited-scope project was conducted during 1995, and the topic is evolving so rapidly that some information is almost automatically "dated." The report also examines how transportation agencies are basically similar in structure and function to other business entities, and how they can continue to utilize the emerging technologies to improve internal and external communications. As part of a detailed discussion of specific transportation agency functions, it is noted that the concept of a "Roundtable Forum," growing out of developments in Concurrent Engineering, can provide an opportunity for representatives from multiple jurisdictions to utilize the Internet for more coordinated decision-making. The report also includes an extensive analysis of demographic trends in California in recent years, such as commute and recreational activities, and identifies how the emerging technologies may affect future changes.

    Innovation, diffusion and catching up in the fifth long wave

    Does the new technological paradigm based on information and communication technologies (ICTs) create new windows of opportunity or further obstacles for catching-up countries? The paper discusses this question by taking neo-Schumpeterian long wave theory as the basic framework of analysis. According to this approach, the current rapid diffusion of the ICT-based paradigm marks the initial phase of a fifth long wave period. The first part of the paper focuses on the major changes that characterize the techno-economic system in the fifth long wave, and points out that the new paradigm is creating several new opportunities for developing economies. If public policies actively foster the development process by investing rapidly in the new technologies and in the related infrastructure and skills, these new opportunities can indeed be successfully exploited. The second part of the paper shifts the focus to the socio-institutional system, and argues that institutional changes driven by some major actors in the industrialized world are creating a new international regime in which the scope and the resources available for State intervention are significantly reduced. The paper concludes by suggesting the existence of a temporary mismatch between the techno-economic and the socio-institutional system, which makes the catching-up process more difficult for large parts of the developing world.
    Keywords: innovation; ICTs; catching up; long waves; global governance

    Research study on high-level skill needs in the NI ICT sector: final report


    Data mining and predictive analytics application on cellular networks to monitor and optimize quality of service and customer experience

    This research study focuses on application models of data mining and machine learning covering cellular network traffic, with the objective of arming Mobile Network Operators (MNOs) with a full view of the performance branches (services, devices, subscribers). The purpose is to optimize and minimize the time taken to detect service and subscriber behaviour patterns. Different data mining techniques and predictive algorithms have been applied to real cellular network datasets to uncover data usage patterns using specific Key Performance Indicators (KPIs) and Key Quality Indicators (KQIs). The following tools were used to develop the concept: RStudio for machine learning and process visualization, Apache Spark and SparkSQL for data and big data processing, and ClicData for service visualization. Two use cases were studied during this research. In the first, data and predictive analytics are applied in the field of telecommunications to efficiently address users' experience, with the goal of increasing customer loyalty and decreasing churn, or customer attrition. Using real cellular network transactions, predictive analytics are used to identify customers who are likely to churn, which can result in revenue loss. Prediction algorithms and models including classification trees, random forests, neural networks and gradient boosting were used together with an exploratory data analysis to determine the relationships between predictor variables. The data is split into two sets: a training set to train the model and a testing set to test it. The best-performing model is selected on the basis of prediction accuracy, sensitivity, specificity and the confusion matrix on the test set.
The second use case analyses Service Quality Management (SQM) using modern data mining techniques and the advantages of in-memory big data processing with Apache Spark and SparkSQL to save on tool investment; a low-cost SQM model is thus proposed and analysed. With the increase in smartphone adoption and access to mobile internet services, applications such as streaming and interactive chat require a certain service level to ensure customer satisfaction. As a result, an SQM framework is developed with a Service Quality Index (SQI) built on KPIs. The research concludes with recommendations and future studies around modern technology applications in telecommunications, including the Internet of Things (IoT), cloud computing and recommender systems.
Cellular networks have evolved and are still evolving: from traditional circuit-switched GSM (Global System for Mobile Communication), which supported only voice services and extremely low data rates, to all-packet LTE networks accommodating high-speed data used for service applications such as video streaming, video conferencing and heavy torrent downloads; and, in the near future, the roll-out of fifth-generation (5G) cellular networks, intended to support complex technologies such as IoT and high-definition video streaming and projected to carry massive amounts of data. With high demand for network services and easy access to mobile phones, billions of transactions are performed by subscribers, in the form of SMSs, handovers, voice calls, web browsing, video and audio streaming, and heavy downloads and uploads. Nevertheless, the rapid growth in data traffic and the high requirements of new services introduce greater challenges for MNOs in analysing the big data traffic flowing through the network, so Quality of Service (QoS) and Quality of Experience (QoE) become a challenge.
Inefficiency in mining and analysing data, and in applying predictive intelligence to network traffic, can produce a high rate of unhappy customers or subscribers, revenue loss and a negative perception of services. Researchers and service providers are therefore investing in data mining, machine learning and AI (artificial intelligence) methods to manage services and experience.
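The churn workflow described in the first use case (train/test split, a random forest classifier, evaluation via accuracy and a confusion matrix) can be sketched as follows. This is an illustrative sketch in Python with scikit-learn, not the thesis's actual R pipeline; the synthetic features standing in for cellular KPIs are invented for demonstration.

```python
# Minimal churn-prediction sketch: split data, fit a random forest,
# evaluate on the held-out test set. All data here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, accuracy_score

rng = np.random.default_rng(0)
n = 1000
# Synthetic stand-ins for KPIs (e.g. dropped calls, data usage, tenure)
X = rng.normal(size=(n, 3))
# Synthetic churn label correlated with the first feature
y = (X[:, 0] + 0.5 * rng.normal(size=n) > 1).astype(int)

# Segment the data into a training set and a testing set
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
pred = model.predict(X_test)

# Evaluate with the confusion matrix and overall accuracy
print(confusion_matrix(y_test, pred))
print("accuracy:", accuracy_score(y_test, pred))
```

The same pattern extends to the other models the abstract mentions (classification trees, gradient boosting) by swapping the estimator class; sensitivity and specificity can be read off the confusion matrix.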

    Evolution of artificial intelligence research in Technological Forecasting and Social Change: Research topics, trends, and future directions

    Artificial intelligence (AI) is a set of rapidly expanding disruptive technologies that are radically transforming various aspects of people, business, society, and the environment. With the proliferation of digital computing devices and the emergence of big data, AI is increasingly offering significant opportunities for society and business organizations. The growing interest of scholars and practitioners in AI has resulted in a diversity of research topics explored across the large body of scholarly literature published in leading research outlets. This study aims to map the intellectual structure and the evolution of the conceptual structure of AI research published in Technological Forecasting and Social Change (TF&SC). It uses machine learning-based structural topic modeling (STM) to extract, report, and visualize the latent topics in the AI research literature. Further, the disciplinary patterns in the intellectual structure of AI research are examined, with the additional objective of assessing the disciplinary impact of AI. The topic modeling reveals eight key topics, of which those concerning healthcare, the circular economy and sustainable supply chains, consumer adoption of AI, and AI for decision-making show a rising trend over the years. AI research has a significant influence on disciplines such as business, management, and accounting, the social sciences, engineering, computer science, and mathematics. The study provides an insightful, evidence-based agenda for future research that should help AI scholars identify contemporary research issues and develop impactful research to solve complex societal problems.

    A comparison of processing techniques for producing prototype injection moulding inserts.

    This project investigates processing techniques for producing low-cost moulding inserts for the particulate injection moulding (PIM) process. Prototype moulds were made by both additive and subtractive processes, as well as a combination of the two. The general motivation was to reduce the entry cost for users considering PIM. PIM cavity inserts were first made by conventional machining from a polymer block using the Pocket NC desktop mill. PIM cavity inserts were also made by fused filament deposition modelling using the Tiertime UP Plus 3D printer. The injection moulding trials revealed surface-finish and part-removal defects. The feedstock was a titanium metal blend, which is brittle in comparison to commodity polymers; this, in combination with the mesoscale features, small cross-sections and complex geometries, was considered the main source of the problems. For both processing methods, fixes were identified and applied to test this explanation, consisting of a blended approach that combined the additive and subtractive processes. The parts produced by the three processing methods are examined, and their respective merits and issues are discussed.

    Reducing risk in pre-production investigations through undergraduate engineering projects.

    This poster is the culmination of final-year Bachelor of Engineering Technology (B.Eng.Tech) student projects in 2017 and 2018. The B.Eng.Tech is a level seven qualification that aligns with the Sydney Accord for a three-year engineering degree and is hence internationally benchmarked. The enabling mechanism of these projects is industry connectivity, which creates real-world projects and highlights the benefits of process investigation at the technologist level. The methodologies we use are basic and transparent, with enough depth of technical knowledge to ensure the industry partners gain from the collaboration. The process minimizes the disconnect between the student and the industry supervisor while maintaining the academic freedom of the student and the commercial sensitivities of the supervisor. The general motivation for this approach is to reduce industry's entry cost in considering new technologies, thereby reducing risk to core business and shareholder profits. The poster presents several images with interpretive dialogue to explain the positive and negative aspects of the student process.

    The Web Engineering Security (WES) methodology

    The World Wide Web has had a significant impact on basic operational economic components in global, information-rich civilizations. This impact is forcing organizations to justify security from a business-case perspective and to focus on security from a web application development perspective. This increased focus on security was the basis of a business-case discussion and led to the acquisition of empirical evidence, gathered from a high-level Web survey and more detailed industry surveys, to analyse security in the Web application development environment. Alongside this information, evidence from the relevant literature was also collected. Individual aspects of the data gathered in these activities contributed to the proposal of the Essential Elements (EE) and the Security Criteria for Web Application Development (SCWAD). The Essential Elements present the idea that there are essential, basic organizational elements that need to be identified, defined and addressed before examining the security aspects of a Web Engineering development process. The Security Criteria for Web Application Development identify the criteria that a secure Web Engineering process needs to address. Both the EE and SCWAD are presented in detail, along with justification of their relevance to Web Engineering. SCWAD is utilized as a framework to evaluate the security of a representative selection of recognized software engineering processes used in Web Engineering application development: the Waterfall Model, the Unified Software Development Process (USDP), the Dynamic Systems Development Method (DSDM) and eXtreme Programming (XP).
SCWAD is also used to assess existing security methodologies, comprising the Orion Strategy; Survivable / Viable IS approaches; the Comprehensive Lightweight Application Security Process (CLASP); and Microsoft's Trustworthy Computing Security Development Lifecycle. The synthesis of information provided by the EE and SCWAD was used to develop the Web Engineering Security (WES) methodology. WES is a proactive, flexible, process-neutral security methodology with customizable components that is based on empirical evidence and used to explicitly integrate security throughout an organization's chosen application development process. To evaluate the practical application of the EE, SCWAD and the WES methodology, two case studies were conducted during the course of this research. The first describes the application of both the EE and SCWAD to the Hunterian Museum and Art Gallery's Online Photo Library (HOPL) Internet application project. The second presents the commercial implementation of the WES methodology within a Global Fortune 500 financial-services organization. The assessment of the WES methodology within the organization consisted of an initial survey establishing current security practices, a follow-up survey after changes were implemented, and an overall analysis of the security conditions assigned to projects throughout the life of the case study.