E-university delivery model: handling the evaluation process
Purpose
The setting up of e-universities has been slow-going. Much of this slow progress has been attributed to poor business models, branding, disruptive technologies, a lack of organisational structures that accommodate such challenges, and failure to integrate a blended approach. One of the stumbling blocks, among many, is the handling of the evaluation process. E-university models do not provide much automation compared with the original brick-and-mortar classroom model of delivery. The underlying technologies may not have been supportive; however, conditions are changing, and more evaluation tools are becoming available to academics. The paper aims to discuss these issues.
Design/methodology/approach
This paper identifies the extent to which current online evaluation processes have evolved. In this process, the team reviews the case study of a UK e-university using an Adobe Connect learning model that mirrors much of the physical processes as well as online exams and evaluation tools. Using the Riva model, the paper compares the physical with the online evaluation processes for e-universities to identify differences between these processes and to evaluate the benefits of e-learning. As a result, the models help to identify the processes where improvements can take place, to automate the process, and to evaluate the impact of this change.
Findings
The paper concludes that this process can be significantly shortened and can provide a fairer outcome, but some challenges remain for e-university processes to overcome.
Originality/value
This paper examines the vital quality assurance processes in academia as more universities move towards process automation and blended or e-university business models. Using the case study of Arden University's online distance learning, the paper demonstrates, through modelling and analysis, that online automation of the evaluation process is achieved with significant efficiency.
E-university Lecture Delivery Model: From Classroom to Virtual
The failure of the UK government in setting up the first e-university in the early 2000s is attributed to several reasons, including poor business models, branding, disruptive technologies, a lack of organizational structure that accommodates such challenges, and failure to integrate a blended approach. Key to this failure is the lecture/lesson delivery model, whereby e-university lesson models did not adapt much of the original classroom model of teaching to the virtual environment. A key obstacle is believed to be the lack of technologies at the time to support such processes. The conditions have since changed and are set to continue to change. This paper draws on academic research and technological innovations, and employs process analysis and reflective analysis, to provide a lecture/lesson delivery model for the next generations of e-universities. The aim is to find to what extent current online lecture/lesson deliveries have evolved. In this process, the team reviews the case study of a UK e-university using an Adobe Connect learning model that mirrors much of the physical processes of lecture/lesson delivery. Using the Riva model, the paper compares the physical with the virtual model of lesson/lecture delivery processes. The paper concludes that this key process has shown promising results, but some challenges remain for e-university processes to overcome.
A Probabilistic Data Fusion Modeling Approach for Extracting True Values from Uncertain and Conflicting Attributes
Real-world data obtained from integrating heterogeneous data sources are often multi-valued, uncertain, imprecise, error-prone, and outdated, and have different degrees of accuracy and correctness. It is critical to resolve data uncertainty and conflicts to present quality data that reflect actual real-world values. This task is called data fusion. In this paper, we deal with the problem of data fusion based on probabilistic entity linkage and uncertainty management of conflicting data. Data fusion has been widely explored in the research community. However, concerns such as explicit uncertainty management and on-demand data fusion, which can cope with dynamic data sources, have not been studied well. This paper proposes a new probabilistic data fusion modeling approach that attempts to find true data values under conditions of uncertain or conflicting multi-valued attributes. These attributes are generated from the probabilistic linkage and merging alternatives of multi-corresponding entities. Consequently, the paper identifies and formulates several data fusion cases and sample spaces that require further conditional computation using our computational fusion method. The identification is established to fit a real-world data fusion problem. In the real world, there is always the possibility of heterogeneous data sources, the integration of probabilistic entities, single or multiple truth values for certain attributes, and different combinations of attribute values as alternatives for each generated entity. We validate our probabilistic data fusion approach through mathematical representation based on three data sources with different reliability scores. The validity of the approach was assessed via implementation in our probabilistic integration system to show how it can manage and resolve different cases of data conflicts and inconsistencies. The outcome showed improved accuracy in identifying true values due to the association of constructive evidence.
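As a rough illustration of the kind of reliability-weighted fusion the abstract describes, the sketch below resolves one conflicting attribute claimed by three sources with different reliability scores. The voting rule, source names, and scores are illustrative assumptions, not the paper's exact computational fusion method.

```python
# Minimal sketch of probabilistic fusion of one conflicting attribute.
# Source names, reliability scores, and the voting rule are assumptions
# made for illustration only.

def fuse_attribute(claims, reliability):
    """claims: {source: value}; reliability: {source: score in (0, 1)}.
    Returns each candidate value with a normalised confidence."""
    votes = {}
    for source, value in claims.items():
        votes[value] = votes.get(value, 0.0) + reliability[source]
    total = sum(votes.values())
    return {value: weight / total for value, weight in votes.items()}

# Three hypothetical sources with different reliability scores disagree on a birth year.
claims = {"source_A": 1975, "source_B": 1975, "source_C": 1957}
reliability = {"source_A": 0.9, "source_B": 0.6, "source_C": 0.7}

print(fuse_attribute(claims, reliability))
# {1975: 0.68..., 1957: 0.31...} -> 1975 would be taken as the most likely true value
```

The same idea extends to the multi-attribute, multi-entity cases the paper formulates, where each merged entity contributes its own set of alternative attribute values.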
A best-effort integration framework for imperfect information spaces
Entity resolution (ER) with imperfection management has been accepted as a major aspect of integrating heterogeneous information sources that exhibit entities with varied identifiers, abbreviated names, and multi-valued attributes. Many novel integration applications, such as personal information management and web-scale information management, require the ability to represent and manipulate imperfect data. This requirement spans the issues from starting with imperfect data to producing a probabilistic database. However, the classical data integration (CDI) framework fails to cope with such a requirement for explicit imperfect-information management. This paper introduces an alternative integration framework based on a best-effort perspective to support instance integration automation. The new framework explicitly incorporates probabilistic management into the ER tasks. The probabilistic management includes a new probabilistic global entity, a new pair-wise source-to-target ER process, and probabilistic decision-model logic as alternatives. Together, the paper presents how these processes operate to support current heterogeneous source integration challenges.
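To make the pair-wise source-to-target ER idea more concrete, here is a minimal sketch that links one source record to candidate target records and keeps several probabilistic alternatives instead of forcing a single merge. The similarity measure, threshold, and record fields are assumptions for illustration, not the framework's actual decision-model logic.

```python
# Minimal sketch of a pair-wise source-to-target ER step that keeps
# probabilistic alternatives rather than one hard merge decision.
from difflib import SequenceMatcher

def match_probability(source_rec, target_rec):
    """Crude match score from name similarity; a real system would combine
    several attribute comparators in a probabilistic decision model."""
    return SequenceMatcher(None, source_rec["name"].lower(),
                           target_rec["name"].lower()).ratio()

def link_alternatives(source_rec, target_recs, keep_above=0.5):
    """Return all plausible target matches with normalised probabilities."""
    scored = [(t, match_probability(source_rec, t)) for t in target_recs]
    scored = [(t, s) for t, s in scored if s >= keep_above]
    total = sum(s for _, s in scored) or 1.0
    return [(t, s / total) for t, s in scored]

# Hypothetical records: one source entity, three candidate target entities.
source = {"name": "J. Smith"}
targets = [{"name": "John Smith"}, {"name": "Jane Smyth"}, {"name": "A. Jones"}]
for target, prob in link_alternatives(source, targets):
    print(target["name"], round(prob, 2))
```

Keeping the surviving alternatives, rather than discarding all but the best match, is what allows the downstream fusion step to reason explicitly over uncertain linkages.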
Right-click Authenticate adoption: The impact of authenticating social media postings on information quality
Getting daily news from social media has nowadays become common practice. Unreliable sources of information expose people to a dose of hoaxes, rumours, conspiracy theories, and misleading news. The mixing of reliable and unreliable information on social media has made the truth hard to determine. Academic research indicates an increasing reliance of online users on social media as a main source of news. Researchers have found that young users, in particular, tend to believe what they read on social media without adequate verification. In previous work, we proposed the concept of 'Right-click Authenticate', where we suggested designing an accessible tool to authenticate and verify information online before sharing it. In this paper, we present a review of the problem of sharing misinformation online and extend our work by analysing how 'Right-click Authenticate' reduces these challenges while improving key metrics within the Information Quality field.
Mechanotransduction is required for establishing and maintaining mature inner hair cells and regulating efferent innervation
In the adult auditory organ, mechanoelectrical transducer (MET) channels are essential for transducing acoustic stimuli into electrical signals. In the absence of incoming sound, a fraction of the MET channels on top of the sensory hair cells are open, resulting in a sustained depolarizing current. By genetically manipulating the in vivo expression of molecular components of the MET apparatus, we show that during pre-hearing stages the MET current is essential for establishing the electrophysiological properties of mature inner hair cells (IHCs). If the MET current is abolished in adult IHCs, they revert into cells showing electrical and morphological features characteristic of pre-hearing IHCs, including the re-establishment of cholinergic efferent innervation. The MET current is thus critical for the maintenance of the functional properties of adult IHCs, implying a degree of plasticity in the mature auditory system in response to the absence of normal transduction of acoustic signals
Corporate Social Responsibility and Islamic Financial Institutions (IFIs): Management Perceptions from IFIs in Bahrain
Islamic finance is gaining greater attention in the finance industry, and this paper analyses how Islamic financial institutions (IFIs) are responding to the welfare needs of society. Using interview data from managers and content analysis of the disclosures, this study attempts to understand management perceptions of corporate social responsibility (CSR) in IFIs. A thorough understanding of CSR by managers, as evident in the interviews, has not been translated fully into practice. The partial use of IFIs' potential role in social welfare would add further challenges in the era of financialisation.
Otoferlin acts as a Ca2+ sensor for vesicle fusion and vesicle pool replenishment at auditory hair cell ribbon synapses
Hearing relies on rapid, temporally precise, and sustained neurotransmitter release at the ribbon synapses of sensory cells, the inner hair cells (IHCs). This process requires otoferlin, a six C2-domain, Ca2+-binding transmembrane protein of synaptic vesicles. To decipher the role of otoferlin in the synaptic vesicle cycle, we produced knock-in mice (Otof Ala515,Ala517/Ala515,Ala517) with lower Ca2+-binding affinity of the C2C domain. The IHC ribbon synapse structure, synaptic Ca2+ currents, and otoferlin distribution were unaffected in these mutant mice, but auditory brainstem response wave-I amplitude was reduced. Lower Ca2+ sensitivity and delay of the fast and sustained components of synaptic exocytosis were revealed by membrane capacitance measurement upon modulations of intracellular Ca2+ concentration, by varying Ca2+ influx through voltage-gated Ca2+-channels or Ca2+ uncaging. Otoferlin thus functions as a Ca2+ sensor, setting the rates of primed vesicle fusion with the presynaptic plasma membrane and synaptic vesicle pool replenishment in the IHC active zone
Multi-model evaluation of short-lived pollutant distributions over East Asia during summer 2008
The ability of seven state-of-the-art chemistry-aerosol models to reproduce distributions of tropospheric ozone and its precursors, as well as aerosols, over eastern Asia in summer 2008 is evaluated. The study focuses on the performance of models used to assess impacts of pollutants on climate and air quality as part of the EU ECLIPSE project. Models, run using the same ECLIPSE emissions, are compared over different spatial scales to in-situ surface, vertical profile, and satellite data. Several rather clear biases are found between model results and observations, including overestimation of ozone at rural locations downwind of the main emission regions in China as well as downwind over the Pacific. Several models produce too much ozone over polluted regions, which is then transported downwind. Analysis points to different factors related to the ability of models to simulate VOC-limited regimes over polluted regions and NOx-limited regimes downwind. This may also be linked to biases compared to satellite NO2, indicating overestimation of NO2 over and to the north of the North China Plain emission region. On the other hand, model NO2 is too low to the south and east of this region and over Korea/Japan. Overestimation of ozone is linked to systematic underestimation of CO, particularly at rural sites and downwind of the main Chinese emission regions. This is likely to be due to enhanced destruction of CO by OH. Overestimation of Asian ozone and its transport downwind implies that radiative forcing from this source may be overestimated. Model-observation discrepancies over Beijing do not appear to be due to emission controls linked to the Olympic Games in summer 2008.

With regard to aerosols, most models reproduce the satellite-derived AOD patterns over eastern China. Our study nevertheless reveals an overestimation of ECLIPSE model-mean surface BC and sulphate aerosols in urban China in summer 2008. The effect of the short-term emission mitigation in Beijing is too weak to explain the differences between the models. Our results rather point to an overestimation of SO2 emissions, in particular close to the surface in Chinese urban areas. However, we also identify a clear underestimation of aerosol concentrations over northern India, suggesting that the rapid recent growth of emissions in India, as well as their spatial extension, is underestimated in emission inventories. Model deficiencies in the representation of pollution accumulation due to the Indian monsoon may also be playing a role. Comparison with vertical aerosol lidar measurements highlights a general underestimation of scattering aerosols in the boundary layer, associated with overestimation in the free troposphere, pointing to modeled aerosol lifetimes that are too long. This is likely linked to too strong vertical transport and/or insufficient deposition efficiency during transport or export from the boundary layer, rather than chemical processing (in the case of sulphate aerosols). Underestimation of sulphate in the boundary layer implies potentially large errors in simulated aerosol-cloud interactions, via impacts on boundary-layer clouds.

This evaluation has important implications for accurate assessment of the impacts of air pollutants on regional air quality and global climate based on global model calculations. Ideally, models should be run at higher resolution over source regions to better simulate urban-rural pollutant gradients and chemical regimes, and to better resolve pollutant processing and loss by wet deposition as well as vertical transport. Discrepancies in vertical distributions require further quantification and improvement, since this is a key factor in the determination of radiative forcing from short-lived pollutants.
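For readers unfamiliar with how such model-observation comparisons are typically quantified, the short sketch below computes a mean bias and a normalised mean bias for modelled versus observed surface ozone at a handful of sites; the metric definitions are standard, but the numbers are invented for illustration and are not ECLIPSE results.

```python
# Minimal sketch of standard model-evaluation metrics: mean bias (MB) and
# normalised mean bias (NMB) of modelled vs. observed surface ozone.
# Site values below are hypothetical, for illustration only.

def mean_bias(model, obs):
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def normalised_mean_bias(model, obs):
    return sum(m - o for m, o in zip(model, obs)) / sum(obs)

# Hypothetical summer-mean ozone (ppbv) at five rural sites.
observed = [42.0, 55.0, 48.0, 60.0, 51.0]
modelled = [50.0, 63.0, 55.0, 68.0, 58.0]

print(f"mean bias: {mean_bias(modelled, observed):.1f} ppbv")
print(f"normalised mean bias: {normalised_mean_bias(modelled, observed):.1%}")
# A systematically positive bias at rural sites is the kind of ozone
# overestimation reported in the evaluation above.
```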
…