
    Purging of untrustworthy recommendations from a grid

    In grid computing, trust has great significance, and much research has proposed models for trusted resource-sharing mechanisms. Trust is a belief or perception that researchers have tried to capture in computational models. Trust in an entity can be direct or indirect. Direct trust arises either from a first impression of the entity or from direct interaction with it. Indirect trust arises from reputation gained or from recommendations received from recommenders, whether within a particular domain of the grid or from outside the grid. Unfortunately, malicious indirect trust leads to the misuse of the grid's valuable resources. This paper proposes a mechanism for identifying and purging untrustworthy recommendations in the grid environment. Through the obtained results, we show how untrustworthy entities are purged.
    Comment: 8 pages, 4 figures, 1 table; published in the International Journal of Next-Generation Networks (IJNGN), Vol.3, No.4, December 201
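As an illustration of the purging idea only (the abstract does not give the paper's actual model), a minimal sketch: recommendations about a target entity are compared against a consensus value, and recommenders whose scores deviate beyond a threshold are flagged as untrustworthy and purged. The median-based consensus and the deviation threshold are assumptions of this sketch.

```python
def purge_untrustworthy(recommendations, deviation_threshold=0.3):
    """Split recommenders into kept and purged sets.

    recommendations: {recommender: trust score in [0, 1]} about one target.
    A recommendation far from the consensus (median, an assumption of this
    sketch) is treated as untrustworthy and purged.
    """
    scores = sorted(recommendations.values())
    mid = len(scores) // 2
    if len(scores) % 2:
        median = scores[mid]
    else:
        median = (scores[mid - 1] + scores[mid]) / 2
    kept = {r: s for r, s in recommendations.items()
            if abs(s - median) <= deviation_threshold}
    purged = set(recommendations) - set(kept)
    return kept, purged
```

For example, with recommenders `a` and `b` near consensus and `c` reporting a wildly different score, `c` is purged while the others are retained.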

    Gravity and Large Extra Dimensions

    The idea that quantum gravity can be realized at the TeV scale is extremely attractive to theorists and experimentalists alike. This proposal leads to extra spatial dimensions that are large compared to the electroweak scale. Here we give a systematic view of the foundations of theories with large extra dimensions and their physical consequences.
    Comment: 26 pages, 3 diagrams

    OLGA: An Ontology and LSTM-based approach for generating Arithmetic Word Problems (AWPs) of transfer type

    Machine generation of Arithmetic Word Problems (AWPs) is challenging: the problems express quantities and mathematical relationships, and these need to be consistent. ML solvers require a large annotated training set of consistent problems with language variations. Domain knowledge is needed for consistency checking, whereas LSTM-based approaches are good at producing text with language variations. Combining these, we propose a system, OLGA, to generate consistent word problems of TC (Transfer-Case) type, involving object transfers among agents. Though we provide a dataset of consistent 2-agent TC-problems for training, only about 36% of the outputs of an LSTM-based generator are found to be consistent. We use an extension of TC-Ontology, proposed by us previously, to determine the consistency of problems. Among the remaining 64%, about 40% have minor errors, which we repair using the same ontology. To check consistency and for the repair process, we construct an instance-specific representation (ABox) of an auto-generated problem. We use a sentence classifier and BERT models for this task. The training set for these LMs is problem-texts in which sentence-parts are annotated with ontology class-names. As three-agent problems are longer, the percentage of consistent problems generated by an LSTM-based approach drops further. Hence, we propose an ontology-based method that extends consistent 2-agent problems into consistent 3-agent problems. Overall, our approach generates a large number of consistent TC-type AWPs involving 2 or 3 agents. As the ABox holds all the information of a problem, any annotations can also be generated. Adapting the proposed approach to generate other types of AWPs is interesting future work.
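The transfer-case consistency notion can be illustrated with a minimal sketch. This is an assumption of ours about what "consistent" means for TC problems, not OLGA's actual ontology-based (ABox) checker: a problem describing object transfers among agents is consistent if every transferred quantity is positive and no transfer drives any agent's holdings negative.

```python
def tc_consistent(initial, transfers):
    """Check a transfer-case word problem for arithmetic consistency.

    initial: {agent: object count}; transfers: list of (giver, receiver, qty)
    applied in story order. Returns False if a quantity is non-positive or
    an agent would give away more objects than it holds.
    """
    state = dict(initial)
    for giver, receiver, qty in transfers:
        state[giver] = state.get(giver, 0) - qty
        state[receiver] = state.get(receiver, 0) + qty
        if qty <= 0 or state[giver] < 0:
            return False
    return True
```

For example, "Tom has 5 apples and gives Jane 3" is consistent, while "Tom has 2 apples and gives Jane 3" is not.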

    Academic Audit and Quality Assurance in Higher Education

    The role of higher education institutions is reflected in their learning outcomes. These outcomes contribute to developing quality professionals by enhancing competency in subject knowledge and intellectual capability and by grooming professionalism and employability skills. They further contribute to emotional and social maturity, sound character, sharp business acumen, a strong scientific temper and strategic thinking among the learners. This can be realized only by imparting comprehensive, continually enhanced, global-quality professional education supported by a sound quality management system. A quality policy contributes to institutionalizing the quality assurance processes. Commitment to providing quality teaching and learning through well-designed and systematic curriculum delivery, using a multitude of learning experiences, is at the core of this policy. A variety of quality assurance processes are institutionalized, focused on teacher quality, curriculum delivery and pedagogy, research and training, skill development of students, orientation programmes for overall personality development, and a broad range of activities that equip students to face challenges and take risks with courage. An academic audit gives feedback on their efficiency, and the observations from the audit are utilised for institutional improvement.

    Author productivity and the application of Lotka’s Law in LIS publications

    The paper examines the authorship pattern of 556 papers published in the Journal of Documentation from 2003 to 2015. In addition to the papers, a sample of 1,550 references was selected, by simple random sampling, from a population of 15,529 unique references given at the end of the papers. It was found that almost half of the publications were written by single authors. Lotka's Law was tested on the resulting 2,106 publications using the Kolmogorov-Smirnov goodness-of-fit test. The K-S test and the author-productivity graph revealed that Lotka's Law was applicable to this set of LIS publications.
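The K-S goodness-of-fit test against Lotka's distribution can be sketched as follows. This is a generic illustration, not the paper's exact computation: the inverse-square exponent and the 1.63/√N critical value at the 0.01 significance level are the conventions commonly used in Lotka's-Law studies, and are assumptions here.

```python
import math

def lotka_expected(max_n, a=2.0):
    """Theoretical proportion of authors with n papers: C / n**a,
    with C normalizing the distribution over n = 1..max_n."""
    c = 1.0 / sum(1.0 / n ** a for n in range(1, max_n + 1))
    return [c / n ** a for n in range(1, max_n + 1)]

def ks_statistic(observed_counts, a=2.0):
    """Max absolute gap between observed and Lotka cumulative distributions.

    observed_counts[n-1] = number of authors with exactly n papers.
    """
    total = sum(observed_counts)
    expected = lotka_expected(len(observed_counts), a)
    d = cum_obs = cum_exp = 0.0
    for count, e in zip(observed_counts, expected):
        cum_obs += count / total
        cum_exp += e
        d = max(d, abs(cum_obs - cum_exp))
    return d

def critical_value(total_authors):
    """1.63 / sqrt(N): the 0.01-level K-S critical value commonly used
    in Lotka's-Law studies (an assumed convention here)."""
    return 1.63 / math.sqrt(total_authors)
```

Lotka's Law holds for the dataset when `ks_statistic(...)` falls below `critical_value(N)`; a frequency profile following the inverse-square pattern exactly (e.g. 36, 9 and 4 authors with 1, 2 and 3 papers) yields a statistic of zero.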

    Supernovae as Probes of Extra Dimensions

    Since the dawn of the new millennium, there has been a revived interest in the concept of extra dimensions. In this scenario, all the standard-model matter and gauge fields are confined to the four dimensions, and only gravity can escape to the higher dimensions of the universe. This idea can be tested using table-top experiments, collider experiments, and astrophysical or cosmological observations. The main astrophysical constraints come from the cooling rates of supernovae, neutron stars, red giants and the Sun. In this article, we consider the energy-loss mechanism of SN1987A and study the constraints it places on the number and size of extra dimensions and on the higher-dimensional Planck scale.
    Comment: 5 pages, no figures, new references are added

    Pulmonary Tumor Detection by virtue of GLCM

    With technical evolution and the latest trends, image processing techniques have become a boon in the medical domain, especially for tumor detection. The presence of tumors in the lungs, which leads to lung cancer, makes it a prominent disease at 18%. Detecting it at an early stage is important, as this decreases the mortality rate: early diagnosis of a lung tumor increases the survival rate, with detection of tumor cells improving it from 14% to 49%. The aim of this research work is to design a lung tumor detection system based on analysis of microscopic images of biopsies using digital image processing. This is done using the Gray Level Co-Occurrence Matrix (GLCM) method, with classification by a back-propagation neural network. The GLCM method is used to extract texture features such as contrast, correlation, energy, and homogeneity from the lung nodule. The microscopic lung biopsy images are classified into either a cancer or a non-cancer class using the artificial neural network algorithm. The proposed system has proven results in lung tumor detection and diagnosis.
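The GLCM and the four texture features named in the abstract can be computed from first principles. The following is a minimal pure-Python sketch for a single pixel offset (the paper's actual feature extraction, offsets and classifier are not reproduced here):

```python
import math

def glcm(image, levels, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for one pixel offset.

    image: 2-D list of integer gray levels in [0, levels).
    P[i][j] = probability that a pixel of level i has a neighbor of
    level j at offset (dx, dy).
    """
    h, w = len(image), len(image[0])
    P = [[0.0] * levels for _ in range(levels)]
    total = 0
    for y in range(h):
        for x in range(w):
            x2, y2 = x + dx, y + dy
            if 0 <= x2 < w and 0 <= y2 < h:
                P[image[y][x]][image[y2][x2]] += 1
                total += 1
    return [[v / total for v in row] for row in P]

def texture_features(P):
    """Contrast, correlation, energy and homogeneity of a normalized GLCM."""
    n = len(P)
    pairs = [(i, j, P[i][j]) for i in range(n) for j in range(n)]
    contrast = sum(p * (i - j) ** 2 for i, j, p in pairs)
    energy = sum(p * p for _, _, p in pairs)
    homogeneity = sum(p / (1 + abs(i - j)) for i, j, p in pairs)
    mu_i = sum(i * p for i, _, p in pairs)
    mu_j = sum(j * p for _, j, p in pairs)
    var_i = sum((i - mu_i) ** 2 * p for i, _, p in pairs)
    var_j = sum((j - mu_j) ** 2 * p for _, j, p in pairs)
    if var_i > 0 and var_j > 0:
        correlation = sum((i - mu_i) * (j - mu_j) * p
                          for i, j, p in pairs) / math.sqrt(var_i * var_j)
    else:
        correlation = 0.0  # constant image: correlation is undefined
    return contrast, correlation, energy, homogeneity
```

On a 2x2 checkerboard with a horizontal offset, every co-occurring pair differs by one level, giving contrast 1.0, correlation -1.0, energy 0.5 and homogeneity 0.5. In practice, such feature vectors are what feed the back-propagation classifier the abstract describes.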