
    A study of QoS support for real time multimedia communication over IEEE802.11 WLAN : a thesis presented in partial fulfillment of the requirements for the degree of Master of Engineering in Computer Systems Engineering, Massey University, Albany, New Zealand

    Quality of Service (QoS) is becoming a key problem for Real Time (RT) traffic transmitted over Wireless Local Area Networks (WLANs). In this project, recent proposals for enhanced QoS performance for RT multimedia are evaluated and analysed. Two simulation models, for the EDCF and HCF protocols, are explored using the OPNET and NS-2 simulation packages respectively. From the simulation results, we have studied the limitations of the 802.11e standard for RT multimedia communication, analysed the reasons for these limitations, and proposed solutions for improvement. Since RT multimedia communication involves time-sensitive traffic, the usual measures of quality of service are minimal delay (latency) and delay variation (jitter). The 802.11 WLAN standard focuses on the PHY layer and the MAC layer. PHY-layer data rates are increased in the 802.11b, a, g, j and n standards by different code-mapping technologies, while 802.11e is developed specifically for the QoS performance of RT traffic at the MAC layer. Enhancing the MAC-layer protocols is therefore a significant topic for guaranteeing the QoS performance of RT traffic. The original MAC protocols of 802.11 are DCF (Distributed Coordination Function) and PCF (Point Coordination Function); they cannot achieve the required QoS performance for RT-traffic transmission, so the IEEE 802.11e draft has developed EDCF and HCF instead. Simulation results of the EDCF and HCF models that we explored with OPNET and NS-2 show that minimal latency and jitter can be achieved. However, limitations of EDCF and HCF are also identified from the simulation results: EDCF is not stable under high network loading, channel utilization is low for both protocols, and the fairness index is very poor for HCF, meaning low-priority traffic can starve in the WLAN. All these limitations are due to the priority mechanisms of the protocols.
    We propose, as a practical research direction, future work to develop a dynamic, self-adaptive 802.11e protocol. Because of the instability of EDCF under heavy loading, parameters can be added so that the protocol adapts efficiently to traffic loading and channel conditions; we provide indications for adding such parameters to increase EDCF performance and channel utilization. We have established that channel utilization can be increased and collision time reduced for RT traffic over the EDCF protocol. These parameters can include loading rate, collision rate and total throughput saturation; further simulation should look for optimum values for them. Because all the limitations are due to the priority mechanism, another direction is doing away with the priority rule in favour of reasonable bandwidth allocation. Because of its huge polling-induced overheads, HCF involves an unsatisfactory trade-off, which leads to poor fairness and poor throughput; an enhanced HCF could improve the polling interval and the TXOP allocation mechanism to obtain a better fairness index and channel utilization. From the simulation we also noticed that traffic deployment can affect the overall QoS performance, an indication to explore whether classifying traffic deployments into different categories is worthwhile. With different load-based traffic categories, QoS may be enhanced by an appropriate bandwidth allocation strategy.
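The QoS metrics this abstract relies on, jitter and the fairness index, can be sketched in a few lines. The sketch below assumes Jain's fairness index over per-flow throughputs and mean inter-packet delay variation for jitter; these are standard conventions, not necessarily the exact definitions used in the thesis.

```python
def jains_fairness(throughputs):
    """Jain's fairness index over per-flow throughputs.

    Ranges from 1/n (one flow monopolizes the channel) to 1.0 (perfectly
    fair), so a starved low-priority flow under HCF pulls this value down.
    """
    n = len(throughputs)
    total = sum(throughputs)
    return total * total / (n * sum(x * x for x in throughputs))

def mean_jitter(delays):
    """Mean absolute variation between consecutive packet delays."""
    return sum(abs(b - a) for a, b in zip(delays, delays[1:])) / (len(delays) - 1)
```

For example, `jains_fairness([5.0, 5.0, 5.0, 5.0])` gives 1.0, while an allocation like `[9.0, 9.0, 1.0, 1.0]` drops to roughly 0.61, reflecting the low-priority starvation described above.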

    Forecasting transport mode use with support vector machines based approach

    The paper explores the potential to forecast which transport mode a person will use for his or her next trip. A support vector machine (SVM) based approach learns from an individual's behaviour (validated GPS tracks) to support smart-city transport planning services. The overall success rate in forecasting the transport mode is 82%, with lower confusion for private car, bike and walking.
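As a hedged sketch of this kind of approach (the feature choices and toy data here are illustrative assumptions, not the paper's actual feature set), an SVM can be trained on simple per-trip statistics derived from GPS tracks using scikit-learn:

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical per-trip features derived from GPS tracks:
# [mean speed (km/h), max speed (km/h), trip distance (km)]
X = [
    [4.5, 6.0, 1.2], [5.0, 7.0, 0.8],         # walking
    [15.0, 25.0, 4.0], [18.0, 28.0, 5.5],     # bike
    [45.0, 90.0, 20.0], [50.0, 110.0, 35.0],  # private car
]
y = ["walk", "walk", "bike", "bike", "car", "car"]

# Standardizing features matters for RBF-kernel SVMs
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X, y)

next_trip = [[16.0, 26.0, 4.5]]  # statistics resembling a bike trip
print(model.predict(next_trip))
```

In the paper's setting the model would instead be fitted per individual on validated GPS history, which is what allows personalized mode forecasts.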

    Participation in Public Administration Revisited: Delimiting, Categorizing and Evaluating Administrative Participation

    Participation has been a relevant issue in public administration research and theory for several decades, especially in old democracies. However, recent processes of globalization, Europeanization and digitalization, coupled with citizens’ diminishing trust in public institutions, have again made the concept of public participation topical. The aim of this paper is to provide a theoretical reflection on the concept and substance of participation in public administration and on related research efforts. To do so, administrative participation is first defined and distinguished from other types of participation in modern democracies (political and civil participation). Participation in public administration encompasses the processes through which the public is directly involved in the regulative and implementation functions of administrative organizations, as well as in the oversight of their functioning. Three main categories of participation in public administration are elaborated – regulative, implementing and oversight participation – together with some apparent forms (instruments) within each category. The main principles upon which administrative participation is based are also explained – transparency, openness, responsiveness and trust. The final part of the paper contains an overview of existing research on and evaluation of participation in public administration. The twofold value of participation – intrinsic and instrumental – is explained, its potential benefits and shortcomings are listed, and a distinction between the process and outcome dimensions of participation is elaborated. Although the literature has become rather extensive and refined, one can conclude that unambiguous findings on the practical effects of participation are still lacking, especially with regard to its dependence on different contextual – especially organizational – variables. Therefore, some conceptual and methodological observations for further research are formulated.

    Applying Real Options Thinking to Information Security in Networked Organizations

    An information security strategy of an organization participating in a networked business sets out the plans for designing a variety of actions that ensure the confidentiality, availability, and integrity of the company’s key information assets; the actions are also concerned with authentication and non-repudiation of authorized users of these assets. We assume that the primary objective of security efforts in a company is improving and sustaining resiliency: security contributes to the ability of an organization to withstand discontinuities and disruptive events, to return to its normal operating state, and to adapt to ever-changing risk environments. When companies collaborating in a value web view security as a business issue, risk assessment and cost-benefit analysis techniques are a necessary and explicit part of their process of resource allocation and budgeting, regardless of whether security spending is treated as capital investment or as operating expenditure. This paper contributes to the application of quantitative approaches to assessing the risks, costs, and benefits associated with the various components making up the security strategy of a company participating in value networks. We take a risk-based approach to determining what types of security a strategy should include and how much of each type is enough. We adopt a real-options-based perspective on security and propose valuing the extent to which alternative components in a security strategy contribute to organizational resiliency and protect key information assets from being impeded, disrupted, or destroyed.
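A toy illustration of the real-options idea (the one-period framing and all numbers below are assumptions for illustration, not the paper's model): the option to defer a security investment has value because waiting lets the organization avoid committing funds in the bad state.

```python
def invest_now_value(p_up, v_up, v_down, cost):
    """Expected net value of committing to the security investment today.

    v_up / v_down: value of the protected assets next period in the
    good / bad state; p_up: (risk-adjusted) probability of the good state.
    """
    return p_up * v_up + (1 - p_up) * v_down - cost

def deferral_value(p_up, v_up, v_down, cost, discount=1.0):
    """Value of waiting one period and investing only if it pays off.

    The max(..., 0.0) terms encode the option: in the bad state the firm
    simply does not invest, truncating the downside.
    """
    return discount * (p_up * max(v_up - cost, 0.0)
                       + (1 - p_up) * max(v_down - cost, 0.0))
```

With `p_up=0.5`, state values 150/50 and cost 100, investing now has an expected net value of 0, while the deferral option is worth 25: the flexibility itself carries value, which is the intuition behind valuing security strategy components as options.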

    On the analysis of EEG power, frequency and asymmetry in Parkinson's disease during emotion processing

    Objective: While Parkinson’s disease (PD) has traditionally been described as a movement disorder, there is growing evidence of disruption in emotion information processing associated with the disease. The aim of this study was to investigate whether there are specific electroencephalographic (EEG) characteristics that discriminate PD patients from normal controls during emotion information processing. Method: EEG recordings from 14 scalp sites were collected from 20 PD patients and 30 age-matched normal controls. Multimodal (audio-visual) stimuli were presented to evoke specific targeted emotional states such as happiness, sadness, fear, anger, surprise and disgust. Absolute and relative power, frequency and asymmetry measures derived from spectrally analyzed EEGs were subjected to repeated-measures ANOVA for group comparisons, as well as to discriminant function analysis to examine their utility as classification indices. In addition, subjective ratings were obtained for the emotional stimuli used. Results: Behaviorally, PD patients showed no impairments in emotion recognition as measured by subjective ratings. Compared with normal controls, PD patients evidenced smaller overall relative delta, theta, alpha and beta power, smaller absolute theta, alpha and beta power at bilateral anterior regions, and higher mean total spectrum frequency across different emotional states. Inter-hemispheric theta, alpha and beta power asymmetry index differences were noted, with controls exhibiting greater right- than left-hemisphere activation, whereas patients exhibited reduced intra-hemispheric alpha power asymmetry bilaterally at all regions. Discriminant analysis correctly classified 95.0% of the patients and controls during emotional stimuli. Conclusion: These distributed spectral powers in different frequency bands might provide meaningful information about emotional processing in PD patients.
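The spectral measures described, absolute and relative band power and a hemispheric asymmetry index, can be sketched as follows. The band edges and the asymmetry formula below are common EEG conventions assumed for illustration, not taken from the paper.

```python
import numpy as np

def band_power(psd, freqs, band):
    """Absolute power: integrate the PSD over a band (rectangular rule)."""
    lo, hi = band
    mask = (freqs >= lo) & (freqs < hi)
    df = freqs[1] - freqs[0]  # assumes a uniform frequency grid
    return float(np.sum(psd[mask]) * df)

def relative_power(psd, freqs, band, total=(1.0, 30.0)):
    """Band power as a fraction of total spectral power."""
    return band_power(psd, freqs, band) / band_power(psd, freqs, total)

def asymmetry_index(power_left, power_right):
    """(R - L) / (R + L): positive values mean greater right-hemisphere power."""
    return (power_right - power_left) / (power_right + power_left)
```

Applied per channel and per frequency band (e.g. alpha, 8-13 Hz) across the emotion conditions, such measures form the feature set that a discriminant analysis can then classify.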

    Simulation Genres and Student Uptake: The Patient Health Record in Clinical Nursing Simulations

    Drawing on fieldwork, this article examines nursing students’ design and use of a patient health record during clinical simulations, in which small teams of students provide nursing care for a robotic patient. The student-designed patient health record provides a compelling example of how simulation genres can both authentically coordinate action within a classroom simulation and support professional genre uptake. First, the range of rhetorical choices available to students in designing their simulation health records is discussed. The article then draws on an extended example of how student uptake of the patient health record within a clinical simulation emphasized its intertextual relationship to other genres, its role in mediating social interactions with the patient and other providers, and its coordination of embodied actions. Connections to students’ experiences with professional genres are addressed throughout. The article concludes by considering initial implications of this research for disciplinary and professional writing courses.

    Many-Task Computing and Blue Waters

    This report discusses many-task computing (MTC) generically and in the context of the proposed Blue Waters system, which is planned to be the largest NSF-funded supercomputer when it begins production use in 2012. The aim of this report is to inform the BW project about MTC, including understanding aspects of MTC applications that can be used to characterize the domain and understanding the implications of these aspects for middleware and policies. Many MTC applications do not neatly fit the stereotypes of high-performance computing (HPC) or high-throughput computing (HTC) applications. Like HTC applications, by definition MTC applications are structured as graphs of discrete tasks, with explicit input and output dependencies forming the graph edges. However, MTC applications have significant features that distinguish them from typical HTC applications. In particular, different engineering constraints for hardware and software must be met in order to support these applications. HTC applications have traditionally run on platforms such as grids and clusters, through either workflow systems or parallel programming systems. MTC applications, in contrast, will often demand a short time to solution, may be communication intensive or data intensive, and may comprise very short tasks. Therefore, hardware and software for MTC must be engineered to support the additional communication and I/O and must minimize task dispatch overheads. The hardware of large-scale HPC systems, with its high degree of parallelism and support for intensive communication, is well suited for MTC applications. However, HPC systems often lack a dynamic resource-provisioning feature, are not ideal for task communication via the file system, and have an I/O system that is not optimized for MTC-style applications. Hence, additional software support is likely to be required to gain full benefit from the HPC hardware.
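The task-graph structure described, discrete tasks with explicit input/output dependencies forming the edges, can be sketched with Kahn's algorithm for dependency-driven dispatch. This is a generic illustration of the MTC execution model, not Blue Waters middleware.

```python
from collections import deque

def dispatch_order(tasks):
    """Return a valid execution order for a task graph (Kahn's algorithm).

    tasks: {task_name: set of dependency names}. A task becomes ready
    (dispatchable) once all of its input dependencies have completed,
    which is the core scheduling constraint in MTC-style workloads.
    """
    indegree = {t: len(deps) for t, deps in tasks.items()}
    dependents = {t: [] for t in tasks}
    for t, deps in tasks.items():
        for d in deps:
            dependents[d].append(t)

    ready = deque(t for t, n in indegree.items() if n == 0)
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for nxt in dependents[t]:  # completing t may unblock its dependents
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)

    if len(order) != len(tasks):
        raise ValueError("cycle in task graph")
    return order
```

Because many such tasks are very short, the per-task cost of this ready-queue bookkeeping and of launching the task is exactly the dispatch overhead the report says MTC middleware must minimize.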

    University-wide Entrepreneurship Education: Alternative Models and Current Trends

    WP 2002-02, March 2002. The paper examines the trend towards university-wide programs in entrepreneurship education. We present a conceptual framework for dividing university-wide programs into two categories: “magnet programs,” which draw students into entrepreneurship courses offered in the business school, and “radiant programs,” which feature entrepreneurship courses outside the business school, focused on the specific context of the non-business students. Examining 38 ranked entrepreneurship programs, we found that about 75% now have university-wide programs, most of which follow a magnet model. In interviews with stakeholders at sample institutions (some ranked, others not), we found that magnet and radiant programs differ in terms of program definition, motivation for the university-wide focus, and costs and benefits. Our major findings are: 1) the trend toward university-wide entrepreneurship education is strong and gaining momentum; 2) our conceptual framework clarifies the different pathways for creating a university-wide approach; 3) while the radiant model is extremely appealing to students, parents, and alumni, the magnet model is easier to administer and represents the pathway of least resistance; and 4) while the magnet model is simpler to implement, it may lead to conflicts in the longer term because the benefits (in terms of flow of students and donors) may not be shared equally across the university.

    Teaching children with and without disabilities school readiness skills

    We know that computer-assisted educational curricula are far more attention-capturing and interesting to children than a classic paper-and-pencil approach to teaching. Educational computer games can easily engage students and can captivate and maintain their attention, allowing them both to learn with teachers and to practice on their own time without the teacher’s direct attention. Overall, computer-based instruction increases motivation and results in faster acquisition of skills. Teaching children with developmental disabilities also requires a special set of tools and methods, owing to a decreased level of attention towards presented stimuli and a lessened capability to learn in the ways typical children do. Computer-based instruction therefore seems to be a good match for these diverse learners, because it offers multiple exemplars, interesting and interactive practice with constant feedback, multiplied learning opportunities without direct teacher engagement, and customization to each child’s needs. In this paper we present the expanded LeFCA framework, which was proven successful for teaching children with autism basic skills and concepts; we have now tested it across various levels of learners with and without disabilities and across three different languages: Bosnian-Croatian-Serbian (BHS), Italian and English (US). Within the pilot project, we produced four games for teaching matching, pointing out (based on visual and auditory stimuli) and labeling skills, which are considered to be the primary skills needed for learning. We then expanded the framework by adding four more games that teach sorting, categorizing, sequencing and pattern making. The results of our user study, conducted with 20 participants in three different languages, showed that the created software in native languages was completely clear and user-friendly for children with and without special needs, and that it is systematically and developmentally appropriately sequenced for learning. Additionally, we found that children were able to generalize the learned skills through transfer to new media or environments, and their teachers reported that the children were highly motivated and enjoyed playing the games.