
    The Analysis of design and manufacturing tasks using haptic and immersive VR - Some case studies

    The use of virtual reality in interactive design and manufacture has been researched extensively, but the practical application of this technology in industry is still very much in its infancy. This is surprising, as one would have expected that, after some 30 years of research, commercial applications of interactive design or manufacturing planning and analysis would be widespread throughout the product design domain. One of the major but less well known advantages of VR technology is that logging the user yields a great deal of rich data, which can be used to automatically generate designs or manufacturing instructions, analyse design and manufacturing tasks, map engineering processes and, tentatively, acquire expert knowledge. The authors feel that the benefits of VR in these areas have not been fully disseminated to the wider industrial community, and - with the advent of cheaper PC-based VR solutions - a wider appreciation of the capabilities of this type of technology may encourage companies to adopt VR solutions for some of their product design processes. With this in mind, this paper describes in detail applications of haptics in assembly, demonstrating how user task logging can lead to the analysis of design and manufacturing tasks at a level of detail not previously possible, as well as giving usable engineering outputs. The haptic 3D VR study uses a Phantom haptic device and a 3D system to analyse this technology and compare it against real-world user performance. This work demonstrates that the detailed logging of tasks in a virtual environment gives considerable potential for understanding how virtual tasks can be mapped onto their real-world equivalents, as well as showing how haptic process plans can be generated in a similar manner to the conduit design and assembly planning HMD VR tool reported in PART A.
The paper concludes with the authors' view of how the use of VR systems in product design and manufacturing should evolve in order to enable the industrial adoption of this technology in the future.
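    As an illustration of the kind of user task logging described above, the sketch below (hypothetical; not the authors' actual logging format) records timestamped grasp/release events from a virtual assembly session and derives per-part handling times, the sort of "rich data" that could feed task analysis or process-plan generation:

```python
# Hypothetical event log from a haptic assembly session:
# (timestamp in seconds, action, part identifier)
events = [
    (0.0, "grasp", "bolt"),
    (2.5, "release", "bolt"),
    (3.0, "grasp", "bracket"),
    (8.0, "release", "bracket"),
]

def handling_times(log):
    """Pair each grasp with its release and accumulate handling time per part."""
    start, times = {}, {}
    for t, action, part in log:
        if action == "grasp":
            start[part] = t
        elif action == "release":
            times[part] = times.get(part, 0.0) + t - start.pop(part)
    return times

print(handling_times(events))  # {'bolt': 2.5, 'bracket': 5.0}
```

A real VR logger would capture far richer state (6-DOF poses, forces, collisions), but even this reduced view shows how a replayable event stream supports the per-task timing analysis the paper reports.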

    Application of Web 2.0 technologies in e-government: A United Kingdom case study

    Electronic government (e-Government) has undergone significant transformation over the last decade and is currently making further leaps by incorporating modern technologies such as second-generation web (Web 2.0) technologies. However, since the development and use of this kind of technology is still at an early stage in the public sector, research about the use of Web 2.0 in this domain is still highly tentative and lacks theoretical underpinning. This paper reports the preliminary findings of an in-depth case study in the United Kingdom (UK) public sector, which explores the application of Web 2.0 technologies in a local government authority (LGA). The findings elicited from the case study offer an insight into information systems (IS) evaluation criteria and impact factors of Web 2.0 from both a practical setting and an internal organisational perspective. This paper concludes that a combined analysis of the evaluation and impact factors, rather than a singular approach, would better assist the decision-making process that leads to effective application of Web 2.0 technologies. It also highlights the significant impact and perceived effect of the adoption of such technologies.

    ASSESSMENT AND COMPARISON OF INJECTION TECHNIQUES USING MANNEQUIN AS A LEARNING TOOL AND OSPE AS AN EVALUATION METHOD

    Objective: The aim of the current study was to compare the effectiveness of demonstration alone and demonstration coupled with the PowerPoint method (intervention) in acquiring knowledge of injection technique, using the objective structured practical examination (OSPE) as an evaluation tool. Methods: The present study was conducted among IInd professional medical undergraduates (N=80). Identification of medical devices, parts of a syringe and intravenous (IV) infusion set, intramuscular (IM) injection and intravenous infusion techniques were taught using the demonstration and intervention methods. Participants were then evaluated for their knowledge by the OSPE method using validated checklists. Participants were also asked to give feedback on the teaching and evaluation methods. Data were analyzed using SPSS 20.0.0 (IBM Corporation). Results: After the intervention method, 100% of participants could identify the needle, cannula, and IV infusion set. A noticeable difference was found in identifying parts of a syringe and the IV infusion set after the intervention method. OSPE evaluation post-intervention showed that more participants could perform the steps of injection correctly and in sequence. OSPE scores post-intervention differed significantly (p<0.001) from the demonstration method. Conclusion: Demonstration coupled with the PowerPoint teaching method was found to be better than the demonstration method alone. This method should be used to impart practical knowledge of injection technique.

    A comparison of CMS Tier0-dataflow scenarios using the Yasper simulation tool

    The CMS experiment at CERN will produce large amounts of data in short time periods. Because the data buffers at the experiment are not large enough, these data need to be transferred to other storage systems. The CMS Tier0 will be an enormous job processing and storage facility at the CERN site. One part of this Tier0, called the Tier0 input buffer, has the task of reading out the experiment data buffers and supplying these data to other tasks that need to be carried out with them (such as storing). It has to make sure that no data are lost. This thesis compares different scenarios for operating a set of disk servers in order to accomplish the Tier0 input buffer tasks. To increase the performance per disk server, write and read actions on the same disk server are separated. To find the optimal moments at which a disk server should change from accepting and writing items to supplying items to other tasks, the combination of various parameters, such as the queuing discipline used (FIFO, LIFO, LPTF or SPTF) and the state of the disk server, has been studied. To make the actual comparisons, a simulation of dataflow models of the different scenarios has been used. These simulations have been performed with the Yasper simulation tool, which uses Petri Net models as its input. To gain more confidence that the models represent the real situation, some model parts have been remodelled in a tool called GPSS. This tool does not use Petri Nets as its input model; instead, it uses queuing models described in a special GPSS language. The results of the simulations show that the best queuing discipline for the Tier0 input buffer is the LPTF discipline, in particular in combination with a change moment as soon as a disk server has been read out completely.
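    To make the four queuing disciplines concrete, the following minimal sketch (hypothetical; unrelated to the actual Yasper Petri-net or GPSS models) selects the next item a disk server would read out from its queue under each discipline:

```python
# Each queued item is (arrival_time, processing_time).
def next_item(queue, discipline):
    """Pick the next item to read out under a given queuing discipline."""
    if discipline == "FIFO":
        return min(queue, key=lambda j: j[0])  # first in: earliest arrival
    if discipline == "LIFO":
        return max(queue, key=lambda j: j[0])  # last in: latest arrival
    if discipline == "SPTF":
        return min(queue, key=lambda j: j[1])  # shortest processing time first
    if discipline == "LPTF":
        return max(queue, key=lambda j: j[1])  # longest processing time first
    raise ValueError(f"unknown discipline: {discipline}")

queue = [(0, 5), (1, 2), (2, 9)]
print(next_item(queue, "FIFO"))  # (0, 5)
print(next_item(queue, "LPTF"))  # (2, 9)
```

Under LPTF the server drains its largest item first; the thesis's simulations identify this as the best-performing choice for the Tier0 input buffer.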

    Cross-Media Alliances to Stop Disinformation: A Real Solution?

    Social networks have surpassed their intermediary role and become gatekeepers of online content and traffic. This transformation has favored the spread of information disorders. The situation is especially alarming in Spain, where 57% of Spaniards have at some point believed false news. Since 2016, First Draft has promoted several collaborative verification projects that brought together newsrooms to fact-check false, misleading and confusing claims circulating online during presidential elections in several countries. The main objective of this article is to study the collaboration forged between newsrooms in Spain in 2019 to debunk disinformation content, under the name Comprobado (Verified), and the impact of this initiative. Applying a methodological approach based on non-participant observation, interviews, and content analysis of reports, scientific articles, books and media archives, we examine the internal uses of this platform, how journalists verified public discourse, the strategies and internal agreements implemented, and the degree of participation of the 16 media outlets involved. Results show that only half of the initiatives begun were transformed into published reports, and the media impact achieved was limited. Finally, we note that the principal reasons for the failure of the project were its improvised implementation, due to the date of the election being brought forward, and the scant culture of collaboration in the sector. In Spain, at least, cross-media alliances are still an exception.

    B2C eCommerce Strategy and Market Structure: The Survey Based Approach

    This paper follows two objectives: (i) it demonstrates the merits of the survey-based approach to B2C eCommerce characteristics and company strategy, and (ii) it presents empirical evidence of the crucial importance of size and marketing investment in B2C eCommerce markets. It presents econometric estimates of the effects of company characteristics and company strategies on the performance of Viennese B2C eCommerce companies in 2001. We provide econometric analysis of three dependent variables in turn: (i) the number of B2C eCommerce customers in 2000, (ii) the number of B2C eCommerce employees in January 2001 and (iii) the revenue growth rate in 2001. The models explain the data quite well: size as well as endogenous sunk costs emerge as the main success factors. Furthermore, the results of nonparametric tests are presented; they mostly confirm the econometric evidence. We also show that the quantitative results are consistent with the qualitative results of the surveys. Finally, we argue that the survey-based approach to B2C eCommerce is a method that provides reliable and consistent data, and that it complements the commonly applied approach based on prices and consumer behavior.
    Keywords: B2C eCommerce, empirical evidence, success factors, endogenous sunk costs, market structure
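    The econometric approach can be illustrated with a minimal OLS sketch. The data below are synthetic and the specification is invented for illustration; the actual study uses survey data on Viennese firms and richer models. It regresses a performance measure (revenue growth) on firm size and a marketing-investment proxy:

```python
import numpy as np

# Synthetic firm-level data (hypothetical, for illustration only).
rng = np.random.default_rng(0)
n = 50
size = rng.uniform(1, 100, n)        # e.g. number of employees
marketing = rng.uniform(0, 10, n)    # proxy for endogenous sunk costs
growth = 0.5 * np.log(size) + 0.3 * marketing + rng.normal(0, 0.5, n)

# OLS via least squares: growth ~ intercept + log(size) + marketing.
X = np.column_stack([np.ones(n), np.log(size), marketing])
beta, *_ = np.linalg.lstsq(X, growth, rcond=None)
print(beta)  # estimated [intercept, log-size, marketing] coefficients
```

With enough observations the estimated coefficients recover the positive effects of size and marketing spend built into the data, mirroring the paper's finding that size and endogenous sunk costs are the main success factors.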

    Efficient practices in railway ballast maintenance and quality assessment using GPR

    The need for effective and efficient railway maintenance is increasingly pressing all over the world, mainly as a consequence of the aging and degradation of infrastructure. In particular, the filling of air voids within a railway ballast track-bed by fine-grained materials, rising from the subballast layers through vibrations and capillarity effects, can heavily affect both the bearing and the draining capacity of the infrastructure, with major impacts on safety. This occurrence is typically referred to as "fouling". When ballast is fouled, especially by clay, its internal friction angle is undermined, with a serious lowering of the strength properties and an increase in the deformation rates of the whole rail track-bed. Thereby, detailed and up-to-date knowledge of the quality of the railway substructure is mandatory for scheduling proper maintenance, with the final goal of optimizing productivity while keeping safety at the highest standard. This paper aims at reviewing a set of maintenance methodologies, spanning from the traditional and most employed ones up to the most innovative approaches available on the market, with a special focus on the Ground Penetrating Radar (GPR) non-destructive testing (NDT) technique. The breakthrough brought by the application of new processing approaches is also analyzed, and a methodological framework is given for some of the most recent and effective maintenance practices.

