
    E-university delivery model: handling the evaluation process

    Purpose The setting up of e-universities has been slow. Much of this slow progress has been attributed to poor business models, branding, disruptive technologies, a lack of organisational structure that accommodates such challenges, and failure to integrate a blended approach. One stumbling block, among many, is the handling of the evaluation process. E-university models do not provide much automation compared with the original brick-and-mortar classroom model of delivery. The underlying technologies may not have been supportive; however, the conditions are changing, and more evaluation tools are becoming available to academics. The paper aims to discuss these issues. Design/methodology/approach This paper identifies the extent of current online evaluation processes. In this process, the team reviews the case study of a UK e-university using the Adobe Connect learning model, which mirrors much of the physical processes as well as online exams and evaluation tools. Using the Riva model, the paper compares the physical with the online evaluation processes for e-universities to identify differences between these processes and to evaluate the benefits of e-learning. As a result, the models can help identify where improvements can take place in automating the process and evaluate the impact of this change. Findings The paper concludes that this process can be significantly shortened and can provide a fairer outcome, but there remain some challenges for e-university processes to overcome. Originality/value This paper examines the vital quality assurance processes in academia as more universities move towards process automation and blended or e-university business models. Using the case study of Arden University online distance learning, the paper demonstrates, through modelling and analysis, that online automation of the evaluation process is achieved with significant efficiency.

    E-university Lecture Delivery Model: From Classroom to Virtual

    The failure of the UK government to set up the first e-university in the early 2000s is attributed to several reasons, including poor business models, branding, disruptive technologies, a lack of organizational structure that accommodates such challenges, and failure to integrate a blended approach. Key to this failure is the lecture/lesson delivery model, whereby e-university lesson models did not adapt much of the original classroom model of teaching to the virtual environment. A key obstacle is believed to be the lack of technologies at the time to support such processes. The conditions have since changed and are set to continue to change. This paper draws on academic research and technological innovations, and employs process analysis and reflective analysis, to provide a lecture/lesson delivery model for the next generation of e-universities. The aim is to establish to what extent current online lecture/lesson deliveries have evolved. In this process, the team reviews the case study of a UK e-university using the Adobe Connect learning model, which mirrors much of the physical processes of lecture/lesson delivery. Using the Riva model, the paper compares the physical with the virtual model of lesson/lecture delivery processes. The paper concludes that this key process has shown promising results, but there remain some challenges for e-university processes to overcome.

    A Probabilistic Data Fusion Modeling Approach for Extracting True Values from Uncertain and Conflicting Attributes

    Real-world data obtained from integrating heterogeneous data sources are often multi-valued, uncertain, imprecise, error-prone, outdated, and of varying degrees of accuracy and correctness. It is critical to resolve data uncertainty and conflicts to present quality data that reflect real-world values. This task is called data fusion. In this paper, we deal with the problem of data fusion based on probabilistic entity linkage and uncertainty management in conflicting data. Data fusion has been widely explored in the research community. However, concerns such as explicit uncertainty management and on-demand data fusion, which can cope with dynamic data sources, have not been well studied. This paper proposes a new probabilistic data fusion modeling approach that attempts to find true data values under conditions of uncertain or conflicting multi-valued attributes. These attributes are generated from the probabilistic linkage and merging alternatives of multi-corresponding entities. Consequently, the paper identifies and formulates several data fusion cases and sample spaces that require further conditional computation using our computational fusion method. The identification is established to fit a real-world data fusion problem, in which there is always the possibility of heterogeneous data sources, the integration of probabilistic entities, single or multiple truth values for certain attributes, and different combinations of attribute values as alternatives for each generated entity. We validate our probabilistic data fusion approach through mathematical representation based on three data sources with different reliability scores. The validity of the approach was assessed via implementation in our probabilistic integration system to show how it can manage and resolve different cases of data conflicts and inconsistencies. The outcome showed improved accuracy in identifying true values due to the association of constructive evidence.
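    The idea of weighting conflicting attribute values by source reliability can be sketched roughly as follows. This is a minimal illustration for intuition only, not the paper's actual computational fusion method; the source names, the reliability scores, and the simple additive normalisation are all assumptions:

```python
# Hypothetical sketch: fuse a conflicting multi-valued attribute by
# accumulating the reliability score of every source that reports each
# candidate value, then normalising into a probability per value.
from collections import defaultdict

def fuse_attribute(observations, reliability):
    """observations: list of (source, value); reliability: source -> score in (0, 1]."""
    support = defaultdict(float)
    for source, value in observations:
        support[value] += reliability[source]
    total = sum(support.values())
    # Normalise accumulated support into a probability per candidate value.
    return {value: weight / total for value, weight in support.items()}

# Three sources with different (assumed) reliability scores disagree
# about a city attribute; the fused distribution favours "London".
scores = {"S1": 0.9, "S2": 0.6, "S3": 0.3}
obs = [("S1", "London"), ("S2", "London"), ("S3", "Paris")]
probs = fuse_attribute(obs, scores)
best = max(probs, key=probs.get)  # "London", with probability 1.5/1.8
```

    In this toy setting the value reported by the two more reliable sources wins; the paper's approach additionally handles sample spaces arising from probabilistic entity linkage, which this sketch does not model.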

    A best-effort integration framework for imperfect information spaces

    Entity resolution (ER) with imperfection management has been accepted as a major aspect of integrating heterogeneous information sources that exhibit entities with varied identifiers, abbreviated names, and multi-valued attributes. Many novel integration applications, such as personal information management and web-scale information management, require the ability to represent and manipulate imperfect data. This requirement spans the issues from starting with imperfect data through to producing a probabilistic database. However, the classical data integration (CDI) framework fails to cope with such a requirement for explicit imperfect information management. This paper introduces an alternative integration framework based on a best-effort perspective to support instance integration automation. The new framework explicitly incorporates probabilistic management into the ER tasks. The probabilistic management includes a new probabilistic global entity, a new pair-wise source-to-target ER process, and probabilistic decision model logic as alternatives. Together, the paper presents how these processes operate to address current challenges in integrating heterogeneous sources.
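    A pair-wise source-to-target ER step with a three-way probabilistic decision can be sketched in simplified form. The thresholds, the name-based similarity measure, and the labels below are illustrative assumptions, not the framework's actual decision model logic:

```python
# Hypothetical sketch: classify a source/target record pair as a match,
# a possible match (kept as a probabilistic alternative), or a non-match,
# using a simple string similarity and assumed decision thresholds.
from difflib import SequenceMatcher

def similarity(a, b):
    # Ratio of matching characters between the two lowercased names.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def classify(source_rec, target_rec, upper=0.85, lower=0.5):
    s = similarity(source_rec["name"], target_rec["name"])
    if s >= upper:
        return "match", s
    if s >= lower:
        return "possible", s  # retained as an uncertain alternative
    return "non-match", s

# An abbreviated name scores between the two thresholds, so the pair is
# retained as an uncertain alternative instead of being accepted or
# discarded outright.
label, score = classify({"name": "J. Smith"}, {"name": "John Smith"})
```

    Keeping the "possible" band, rather than forcing a binary decision, is what allows imperfect pairs to flow into a probabilistic global entity instead of being lost.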

    Right-click Authenticate adoption: The impact of authenticating social media postings on information quality

    Getting the daily news from social media has nowadays become common practice. Unreliable sources of information expose people to a dose of hoaxes, rumours, conspiracy theories, and misleading news. The mixing of reliable and unreliable information on social media has made the truth hard to determine. Academic research indicates an increasing reliance of online users on social media as a main source of news. Researchers found that young users, in particular, tend to believe what they read on social media without adequate verification. In previous work, we proposed the concept of `Right-click Authenticate', where we suggested designing an accessible tool to authenticate and verify information online before sharing it. In this paper, we present a review of the problem of sharing misinformation online and extend our work by analysing how `Right-click Authenticate' reduces these challenges while improving key metrics within the Information Quality field.

    Mechanotransduction is required for establishing and maintaining mature inner hair cells and regulating efferent innervation

    In the adult auditory organ, mechanoelectrical transducer (MET) channels are essential for transducing acoustic stimuli into electrical signals. In the absence of incoming sound, a fraction of the MET channels on top of the sensory hair cells are open, resulting in a sustained depolarizing current. By genetically manipulating the in vivo expression of molecular components of the MET apparatus, we show that during pre-hearing stages the MET current is essential for establishing the electrophysiological properties of mature inner hair cells (IHCs). If the MET current is abolished in adult IHCs, they revert into cells showing electrical and morphological features characteristic of pre-hearing IHCs, including the re-establishment of cholinergic efferent innervation. The MET current is thus critical for the maintenance of the functional properties of adult IHCs, implying a degree of plasticity in the mature auditory system in response to the absence of normal transduction of acoustic signals.

    Corporate Social Responsibility and Islamic Financial Institutions (IFIs): Management Perceptions from IFIs in Bahrain

    Islamic finance is gaining greater attention in the finance industry, and this paper analyses how Islamic financial institutions (IFIs) are responding to the welfare needs of society. Using interview data from managers and content analysis of the disclosures, this study attempts to understand management perceptions of corporate social responsibility (CSR) in IFIs. A thorough understanding of CSR by managers, evident in the interviews, has not been translated fully into practice. This only partial use of IFIs' potential role in social welfare adds further challenges in the era of financialisation.

    Otoferlin acts as a Ca2+ sensor for vesicle fusion and vesicle pool replenishment at auditory hair cell ribbon synapses

    Hearing relies on rapid, temporally precise, and sustained neurotransmitter release at the ribbon synapses of sensory cells, the inner hair cells (IHCs). This process requires otoferlin, a six-C2-domain, Ca2+-binding transmembrane protein of synaptic vesicles. To decipher the role of otoferlin in the synaptic vesicle cycle, we produced knock-in mice (Otof Ala515,Ala517/Ala515,Ala517) with lower Ca2+-binding affinity of the C2C domain. The IHC ribbon synapse structure, synaptic Ca2+ currents, and otoferlin distribution were unaffected in these mutant mice, but auditory brainstem response wave-I amplitude was reduced. Lower Ca2+ sensitivity and delay of the fast and sustained components of synaptic exocytosis were revealed by membrane capacitance measurements upon modulation of the intracellular Ca2+ concentration, achieved by varying Ca2+ influx through voltage-gated Ca2+ channels or by Ca2+ uncaging. Otoferlin thus functions as a Ca2+ sensor, setting the rates of primed vesicle fusion with the presynaptic plasma membrane and of synaptic vesicle pool replenishment in the IHC active zone.