27 research outputs found

    Object modelling and tracking in videos via multidimensional features

    We propose to model a tracked object in a video sequence by locating a list of object features that are ranked according to their ability to differentiate the object from the image background. Bayesian inference is utilised to derive the probabilistic location of the object in the current frame, with the prior approximated from the previous frame and the posterior obtained via the current pixel distribution of the object. Consideration has also been given to a number of relevant aspects of object tracking, including multidimensional features and the mixture of colours, textures, and object motion. Experiments with the proposed method on video sequences have shown its effectiveness in capturing the target against a moving background and under non-rigid object motion.
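
    The feature ranking and Bayesian update described above can be illustrated with the minimal sketch below. The variance-of-log-likelihood-ratio score, the histogram settings, and the function names are assumptions made for illustration; they are not necessarily the exact criteria used in the paper.

```python
import numpy as np

def rank_features(obj_samples, bg_samples, bins=16):
    """Rank candidate 1-D features (e.g. colour channels, texture responses)
    by how well their histograms separate the object from the background.
    obj_samples, bg_samples: dicts mapping feature name -> 1-D sample array in [0, 1]."""
    scores = {}
    for name in obj_samples:
        h_obj, _ = np.histogram(obj_samples[name], bins=bins, range=(0.0, 1.0), density=True)
        h_bg, _ = np.histogram(bg_samples[name], bins=bins, range=(0.0, 1.0), density=True)
        llr = np.log((h_obj + 1e-6) / (h_bg + 1e-6))  # log-likelihood ratio per bin
        scores[name] = float(llr.var())               # higher variance = more discriminative
    return sorted(scores, key=scores.get, reverse=True)

def posterior_map(likelihood, prior):
    """Bayesian update over pixel locations: posterior is proportional to
    likelihood (from the current pixel distribution) times the prior
    (approximated from the previous frame)."""
    post = likelihood * prior
    return post / (post.sum() + 1e-12)
```

    In this sketch, each feature is scored by how sharply its object and background histograms disagree, and the normalised posterior map peaks at the most likely object location in the current frame.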

    Error minimized designation of inhomogeneous assessors on repetitive tasks in large numbers

    Assessment consistency is not easy to maintain across many assessors for a unit with a large student population, particularly when a great many of those assessors are not regular staff. This work proposes an assessor reallocation approach, and a variant, to assign assessors to mark new assessment items for different students based on each assessor's earlier marking statistics in comparison with those of their fellow assessors. The aim is to minimize the potential accumulation of marking discrepancies without having to resort to additional staff training, which can often be impossible within the allowed time or budget.

    Object modelling in videos via multidimensional features of colours and textures

    We propose to model a tracked object in a video sequence by locating a list of object features that are ranked according to their ability to differentiate the object from the image background. Bayesian inference is utilised to derive the probabilistic location of the object in the current frame, with the prior approximated from the previous frame and the posterior obtained via the current pixel distribution of the object. Experiments with the proposed method on video sequences have shown its effectiveness in capturing the target against a moving background and under non-rigid object motion.

    A miniature e-learning portal of full teaching and learning capacity

    We developed a component-based miniature e-learning portal aimed at delivering a small set of subjects in an institute of learning, or at supporting a small to medium sized coaching centre. The main thrust is to keep the portal simple, with minimal hardware and software requirements, yet still very competitive in its functionality. Since it does not have to serve many hundreds of subjects simultaneously, the portal supports not only all the major e-learning activities such as student registration, assignment submission, online quizzes, forums, feedback editing, and scheduled surveys, but also newer challenges such as fairer marker reallocation and programming drills with automatic marking. Because it is component-based, concise, and essentially free, instructors can easily add their own dedicated teaching modules. The portal can stand on its own, be embedded into other major teaching systems, or do both concurrently. It has been deployed by one of the authors for the delivery of subjects to cohorts of hundreds of students at a university over the last 10 years, and it continues to be developed further, including modules for knowledge-based training that are yet to be fully implemented.

    Bias reduced designation of inhomogeneous assessors on repetitive tasks in large numbers

    Assessment consistency is not easy to maintain across many assessors for a unit with a large student population, particularly when a great many of those assessors are not regular staff. This work proposes an assessor reallocation approach, with some variants, to assign assessors to mark new assessment items for different students based on each assessor's earlier marking statistics in comparison with those of the other assessors. The aim is to minimize the potential accumulation of marking discrepancies without having to resort to additional staff training, which can often be impossible within the allowed time or budget. More specifically, we first estimate each assessor's marking inclination or tendency, termed “bias” for simplicity, against the average for each particular assessment item; we then profile each assessor by balancing such biases over a number of assessment items already marked, and subsequently predict, for each student, the potential bias they are likely to receive when marked by the different assessors. The proposed algorithm finally selects the assessor for the next assessment item so that, pro rata, it leads to the smallest deviation of the accumulated total marking bias from the average. This approach is objective and independent of the subjects being delivered, and can be readily applied, particularly in the context of e-learning or e-education, to any assessment task that involves multiple assessors working in parallel over a number of assessment items.
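
    A minimal sketch of this reallocation rule follows; the data layout, the greedy selection criterion, and the workload cap are illustrative assumptions rather than the authors' exact algorithm or its variants.

```python
from collections import defaultdict

def assessor_biases(marks):
    """Estimate each assessor's bias as the average deviation of their scores
    from the per-item mean, over the items they have already marked.
    marks: list of (assessor, item, score) tuples."""
    item_scores = defaultdict(list)
    for _, item, score in marks:
        item_scores[item].append(score)
    item_mean = {i: sum(s) / len(s) for i, s in item_scores.items()}
    deviations = defaultdict(list)
    for assessor, item, score in marks:
        deviations[assessor].append(score - item_mean[item])
    return {a: sum(d) / len(d) for a, d in deviations.items()}

def pick_assessor(student_running_bias, biases, workload, max_load):
    """Choose the assessor whose estimated bias keeps this student's accumulated
    marking bias closest to zero, subject to a simple workload cap."""
    candidates = [a for a in biases if workload[a] < max_load]
    return min(candidates, key=lambda a: abs(student_running_bias + biases[a]))
```

    In this sketch, a student who has so far been marked slightly harshly would preferentially be assigned an assessor who tends to mark slightly generously, so that the accumulated discrepancies cancel out rather than compound.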

    Webpage resource protection via obfuscation and auto expiry

    Content delivery via web pages or sites is becoming increasingly popular due to the effectiveness and versatility of the readily available delivery mechanism, especially in e-education and training. While copyright laws protect the ownership and commercial rights of intellectual property, the openness of the web architecture often makes it impossible to prevent the content source from being misappropriated, or incorporated illegitimately elsewhere after some modification of the downloaded source. We propose an obfuscation mechanism for HTML5 to convert a site of raw content into a site of obfuscated pages and images. With the HTML5 canvas and AJAX used to stop certain unauthorized access, the whole site of documents can be rendered meaningless or useless on both the server and the client side if just a small key part is modified or hidden. Several masquerading algorithms have been proposed for this purpose. The obfuscation becomes permanent if a webpage is merely downloaded, or even DOM-saved, without all the necessary intermediate data or keys being tracked by a specialist before the auto-expiry of such a process, at a cost tantamount to or exceeding the reconstruction of the original documents from scratch, hence defeating the purpose of piracy. We applied the scheme to the delivery of a university subject by automating the whole process.
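
    The keying and auto-expiry idea can be sketched very roughly as follows. The paper's actual masquerading algorithms operate on HTML5 canvas rendering and AJAX; the XOR masking, the in-memory key store, and the time-to-live used here are purely illustrative assumptions intended only to show how a page fragment becomes permanently unreadable once its key expires.

```python
import base64
import os
import time

KEY_TTL = 300  # seconds before a delivered key expires (illustrative value)
_keys = {}     # key_id -> (key bytes, issue time); a real system would use a server-side store

def issue_key(key_id, length=32):
    """Create a short-lived masking key that the client must fetch separately
    (e.g. via AJAX) before it expires."""
    _keys[key_id] = (os.urandom(length), time.time())
    return _keys[key_id][0]

def mask_fragment(html_fragment, key):
    """XOR-mask a page fragment so the served source is meaningless on its own."""
    data = html_fragment.encode()
    masked = bytes(b ^ key[i % len(key)] for i, b in enumerate(data))
    return base64.b64encode(masked).decode()

def unmask_fragment(blob, key_id):
    """Recover the fragment only while the key is still valid; afterwards the
    obfuscation is effectively permanent, as described in the abstract."""
    key, issued = _keys.get(key_id, (None, 0))
    if key is None or time.time() - issued > KEY_TTL:
        return None
    data = base64.b64decode(blob)
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data)).decode()
```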

    A new user cooperative protocol based on code-shifted differential chaos shift keying modulation

    In this paper, a new cooperative protocol based on code-shifted differential chaos shift keying (CS-DCSK) modulation is proposed; CS-DCSK is an improved differential chaos shift keying modulation scheme that requires no delay circuits at the autocorrelation receiver (AcR). The relay user transmits its own data while piggybacking the other user's data, received and decoded in the preceding phase, by exploiting the multiple bit-stream transmission property of generalized CS-DCSK (GCS-DCSK). Moreover, the cooperative user forwards the estimated data only when that data is correctly decoded, as verified by a cyclic redundancy check (CRC) code, namely selective decode-and-forward (DAF). Simulation results show that the proposed protocol outperforms the conventional four-phase cooperative protocol in bit error rate (BER) and system throughput.
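
    As background, the sketch below simulates conventional DCSK (a chaotic reference half-frame followed by a data-bearing half-frame, recovered by correlation), not the CS-DCSK variant or the cooperative protocol itself; the logistic map, the spreading factor, and the function names are illustrative assumptions.

```python
import numpy as np

def logistic_chaos(n, x0=0.7, r=3.99):
    """Generate a chaotic chip sequence from the logistic map, centred on zero."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1 - x[i - 1])
    return 2 * x - 1

def dcsk_modulate(bits, beta=64):
    """Conventional DCSK: each bit maps to [reference chips | +/- reference chips]."""
    frames = []
    for b in bits:
        ref = logistic_chaos(beta, x0=np.random.uniform(0.1, 0.9))
        frames.append(np.concatenate([ref, (2 * b - 1) * ref]))
    return np.concatenate(frames)

def dcsk_demodulate(signal, beta=64):
    """Correlation receiver: correlate the two halves of each frame; the sign of
    the correlation decides the bit."""
    bits = []
    for k in range(0, len(signal), 2 * beta):
        ref, data = signal[k:k + beta], signal[k + beta:k + 2 * beta]
        bits.append(int(np.dot(ref, data) > 0))
    return bits
```

    Adding channel noise to the modulated signal and comparing the recovered bits with the transmitted ones would give a rough BER estimate of the kind reported in the paper's simulations.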

    Effective and efficient strategies and their technological implementations to reduce plagiarism and collusion in nonproctored online exams

    Advanced digital technologies and social media have greatly improved both the learning experience and assessment convenience, while inadvertently facilitating potential plagiarism and collaborative cheating at the same time. In this article, we focus on strategies, and their technological implementations, for running exams or in-class tests similar in nature to an exam. Our aim is to defeat potential student plagiarism or collusion as much as possible, while not imposing more than a tolerable amount of time and effort on the teaching team as a whole. For the most vulnerable online, open-book, and non-proctored exams, we therefore propose to limit the reading and answering of each main or section question to a strictly allocated amount of time per question, and to forbid returning to any previous question that has already been completed or has expired, so as to greatly reduce the time window for potential plagiarism and collusion. An equally important design goal is that the relevant implementation, maintenance, deployment, exam management, answer retrieval, and marking should in principle be efficient enough to be handled completely by the relevant subject instructor(s), at least after the initial development.
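
    One possible server-side enforcement of the per-question time window and the no-backtracking rule is sketched below; the class structure and names are assumptions for illustration, not the authors' implementation.

```python
import time

class TimedExamSession:
    """Serve one question at a time; each question has its own time window, and
    previously served questions can never be reopened (illustrative sketch)."""

    def __init__(self, questions, limits):
        self.questions = questions   # list of question ids or texts
        self.limits = limits         # seconds allowed for each question
        self.index = -1
        self.opened_at = None
        self.answers = {}

    def next_question(self):
        """Advance to the next question and start its clock; there is no way back."""
        self.index += 1
        if self.index >= len(self.questions):
            return None
        self.opened_at = time.time()
        return self.questions[self.index]

    def submit(self, answer):
        """Accept an answer only for the current question and only within its window."""
        elapsed = time.time() - self.opened_at
        if elapsed <= self.limits[self.index]:
            self.answers[self.index] = answer
            return True
        return False  # window expired; the question cannot be revisited
```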

    Power quality assessment of different load categories

    In this paper, a fuzzy logic based method is proposed to define and evaluate power quality indices for different categories of electricity consumers. The severity level of each power quality disturbance is first estimated according to the frequency of its identification or other metrics, including the percentage of voltage imbalance, the total harmonic distortion rate, and the durations and varying ranges of certain types of disturbances. With the composition of a regional power load analyzed, a power quality index is then computed for each category of electricity end-users based on the severities of the identified disturbances and their impacts on each particular load type. It is believed that such an index will help utilities assess power quality comprehensively so as to improve their services and meet the varying requirements of different electricity customers. Data based on site measurements are used to demonstrate the proposed method in calculating the defined power quality index.
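
    A simplified illustration of such a weighted severity aggregation follows; the ramp membership function, the thresholds, and the per-category weights are illustrative assumptions rather than the fuzzy memberships actually used in the paper.

```python
def severity(value, low, high):
    """Simple ramp membership: 0 below `low`, 1 above `high`, linear in between."""
    if value <= low:
        return 0.0
    if value >= high:
        return 1.0
    return (value - low) / (high - low)

def power_quality_index(measurements, thresholds, weights):
    """Weighted aggregation of disturbance severities for one load category.
    measurements: {disturbance: measured value}
    thresholds:   {disturbance: (low, high) severity thresholds}
    weights:      {disturbance: impact weight for this load category}"""
    total_weight = sum(weights.values())
    score = sum(weights[d] * severity(measurements[d], *thresholds[d])
                for d in measurements)
    return 1.0 - score / total_weight   # 1.0 = best quality, 0.0 = worst

# Example for a hypothetical "sensitive industrial" load category:
pqi = power_quality_index(
    {"voltage_imbalance_pct": 1.8, "thd_pct": 4.5, "sag_events_per_month": 6},
    {"voltage_imbalance_pct": (0.5, 3.0), "thd_pct": (3.0, 8.0),
     "sag_events_per_month": (1, 12)},
    {"voltage_imbalance_pct": 0.3, "thd_pct": 0.3, "sag_events_per_month": 0.4},
)
```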