Weak signal identification with semantic web mining
We investigate the automated identification of weak signals, in Ansoff's sense, to improve strategic planning and technological forecasting. The literature shows that weak signals can be found in an organization's environment and that they appear in different contexts. We use internet information to represent the organization's environment, selecting those websites that are related to a given hypothesis. In contrast to related research, we provide a methodology that uses latent semantic indexing (LSI) to identify weak signals. This improves existing knowledge-based approaches because LSI considers aspects of meaning and is therefore able to identify similar textual patterns in different contexts. A new weak signal maximization approach is introduced that replaces the prediction modeling approach commonly used with LSI. It enables the calculation of the largest number of relevant weak signals represented by singular value decomposition (SVD) dimensions. A case study identifies and analyses weak signals to predict trends in the field of on-site medical oxygen production, supporting the planning of research and development (R&D) for a medical oxygen supplier. The results show that the proposed methodology enables organizations to identify weak signals from the internet for a given hypothesis, helping strategic planners to react ahead of time.
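The core of LSI is a truncated SVD of a term-document matrix: documents are projected into a low-rank latent space where texts sharing topics score as similar even when they share few literal terms. The sketch below illustrates that mechanism only; the matrix, the choice of k, and the similarity step are illustrative assumptions, not the paper's weak signal maximization procedure.

```python
import numpy as np

# Tiny illustrative term-document matrix (rows: terms, columns: documents).
# Values are hypothetical term counts from crawled websites.
A = np.array([
    [2.0, 0.0, 1.0, 0.0],
    [1.0, 1.0, 0.0, 0.0],
    [0.0, 2.0, 0.0, 1.0],
    [0.0, 0.0, 1.0, 2.0],
])

# Latent semantic indexing: keep only the k strongest SVD dimensions.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
docs_k = (np.diag(s[:k]) @ Vt[:k]).T  # documents in the k-dim latent space

def cosine(a, b):
    """Cosine similarity between two latent-space document vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Documents close in the latent space share meaning, not just vocabulary.
sim = cosine(docs_k[0], docs_k[1])
```

In a weak-signal setting, the low-variance SVD dimensions (small singular values) are of particular interest, since faint, emerging patterns by definition contribute little variance to the corpus.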
Auditing the accessibility of Massive Open Online Courses (MOOCs)
The outcome of the research reported in this paper is the design of an accessibility audit to evaluate Massive Open Online Courses (MOOCs) for accessibility and to arrive at solutions and adaptations that can meet user needs. The accessibility audit includes expert-based heuristic evaluations and user-based evaluations of the MOOC platforms and individual courses.
Automated Discovery of Internet Censorship by Web Crawling
Censorship of the Internet is widespread around the world. As access to the
web becomes increasingly ubiquitous, filtering of this resource becomes more
pervasive. Transparency about specific content that citizens are denied access
to is atypical. To counter this, numerous techniques for maintaining URL filter
lists have been proposed by various individuals and organisations that aim to
provide empirical data on censorship for the benefit of the public and the wider
censorship research community.
We present a new approach for discovering filtered domains in different
countries. This method is fully automated and requires no human interaction.
The system uses web crawling techniques to traverse between filtered sites and
implements a robust method for determining if a domain is filtered. We
demonstrate the effectiveness of the approach by running experiments to search
for filtered content in four different censorship regimes. Our results show
that we perform better than the current state of the art and have built domain
filter lists an order of magnitude larger than the most widely available public
lists as of Jan 2018. Further, we build a dataset mapping the interlinking
nature of blocked content between domains and exhibit the tightly networked
nature of censored web resources.
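The crawling strategy the abstract describes amounts to a breadth-first traversal of the link graph, collecting domains that a filter probe flags as blocked. The skeleton below sketches that traversal under stated assumptions: the link graph is a toy in-memory dict, and `is_filtered` is a stand-in for the paper's robust detection method (which in practice would compare responses observed inside and outside the censored network).

```python
from collections import deque

# Hypothetical link graph: domain -> domains it links to.
LINKS = {
    "seed.example":      ["blocked-a.example", "open-b.example"],
    "blocked-a.example": ["blocked-c.example"],
    "open-b.example":    [],
    "blocked-c.example": ["seed.example"],
}

def is_filtered(domain):
    # Stand-in for a real probe of the censorship regime; here a toy rule
    # so the traversal logic can be shown without network access.
    return domain.startswith("blocked-")

def discover_filtered(seed, links):
    """Breadth-first crawl from a seed domain, collecting filtered domains."""
    seen, queue, filtered = {seed}, deque([seed]), set()
    while queue:
        domain = queue.popleft()
        if is_filtered(domain):
            filtered.add(domain)
        for nxt in links.get(domain, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return filtered
```

Because blocked sites tend to interlink, as the abstract's dataset shows, seeding the crawl from a few known filtered domains can expand a filter list far beyond the seeds.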
Marking complex assignments using peer assessment with an electronic voting system and an automated feedback tool
The work described in this paper relates to the development and use of a range of initiatives to mark complex master's-level assignments on the development of computer web applications. In the past, such assignments have proven difficult to mark since they assess a range of skills including programming, human-computer interaction and design. Based on several years' experience of marking such assignments, the module delivery team adopted an approach whereby the students marked each other's practical work using an electronic voting system (EVS). The results are presented in the paper, along with a statistical comparison with the tutors' marking, providing evidence for the efficacy of the approach. The second part of the assignment related to theory and documentation and was marked by the tutors using an automated feedback tool. The time to mark the work was reduced by more than 30% in all cases compared to previous years. More importantly, it was possible to provide good-quality individual feedback to learners rapidly: feedback was delivered to all within three weeks of the submission date.
How the internet changed career: framing the relationship between career development and online technologies
This article examines the inter-relationship between the internet and career development. It asks three inter-linked questions: How does the internet reshape the context within which individuals pursue their careers? What skills and knowledge do people need in order to pursue their careers effectively using the internet? How can careers workers use the internet as a medium for the delivery of career support? The article develops a conceptual architecture for answering these questions and, in particular, highlights the importance of the concept of digital career literacy.
A survey on online monitoring approaches of computer-based systems
This report surveys forms of online data collection that are in current use (and that are the subject of research to adapt them to changing technology and demands). Although not primarily designed for this purpose, they can be used as inputs to the assessment of dependability and resilience.
Construction safety and digital design: a review
As digital technologies become widely used in designing buildings and infrastructure, questions arise about
their impacts on construction safety. This review explores relationships between construction safety and
digital design practices with the aim of fostering and directing further research. It surveys state-of-the-art
research on databases, virtual reality, geographic information systems, 4D CAD, building information
modeling and sensing technologies, finding various digital tools for addressing safety issues in the
construction phase, but few tools to support design for construction safety. It also considers a literature on
safety critical, digital and design practices that raises a general concern about ‘mindlessness’ in the use of
technologies, and has implications for the emerging research agenda around construction safety and digital
design. Bringing these strands of literature together suggests new kinds of interventions, such as the
development of tools and processes for using digital models to promote mindfulness through multi-party
collaboration on safety.
Critical Success Factors for Positive User Experience in Hotel Websites: Applying Herzberg's Two Factor Theory for User Experience Modeling
This research presents the development of a critical success factor matrix
for increasing positive user experience of hotel websites based upon user
ratings. Firstly, a number of critical success factors for web usability have
been identified through the initial literature review. Secondly, hotel websites
were surveyed in terms of critical success factors identified through the
literature review. Thirdly, Herzberg's motivation theory has been applied to
the user rating and the critical success factors were categorized into two
areas. Finally, the critical success factor matrix has been developed using the
two main sets of data.
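Herzberg's two-factor theory distinguishes hygiene factors (whose absence causes dissatisfaction) from motivators (whose presence creates satisfaction). The abstract does not specify how ratings map factors to the two areas, so the sketch below uses a naive stand-in rule, a mean-rating threshold, with entirely hypothetical factor names and scores, purely to illustrate the shape of the categorization step.

```python
# Hypothetical user ratings (1-5) for critical success factors of a
# hotel website. Names and scores are illustrative, not survey data.
ratings = {
    "page load speed":       [2, 2, 3, 2],
    "booking form clarity":  [3, 2, 3, 3],
    "photo gallery quality": [5, 4, 5, 4],
    "virtual room tour":     [4, 5, 4, 5],
}

def herzberg_partition(ratings, threshold=3.5):
    """Naively split factors into hygiene factors (low-rated: users notice
    their absence) and motivators (high-rated: their presence delights)."""
    hygiene, motivators = [], []
    for factor, scores in ratings.items():
        mean = sum(scores) / len(scores)
        (motivators if mean >= threshold else hygiene).append(factor)
    return hygiene, motivators

hygiene, motivators = herzberg_partition(ratings)
```

The resulting two lists populate the rows of a critical success factor matrix: hygiene factors must be satisfied to avoid a negative experience, while motivators are where a site can differentiate itself.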