
    A study of the generalizability of self-supervised representations

    Recent advancements in self-supervised learning (SSL) have made it possible to learn generalizable visual representations from unlabeled data. The performance of deep learning models fine-tuned on pretrained SSL representations is on par with that of models fine-tuned on state-of-the-art supervised learning (SL) representations. Despite the progress made in SSL, its generalizability has not been studied extensively. In this article, we perform a deeper analysis of the generalizability of pretrained SSL and SL representations by conducting a domain-based study for transfer learning classification tasks. The representations are learned from the ImageNet source data and then fine-tuned using two types of target datasets: one similar to the source dataset and one significantly different from it. We study the generalizability of the SSL- and SL-based models via their prediction accuracy as well as their prediction confidence. In addition, we analyze the attribution of the final convolutional layer of these models to understand how they reason about the semantic identity of the data. We show that the SSL representations are more generalizable than the SL representations. We explain the generalizability of the SSL representations by investigating their invariance property, which is shown to be better than that observed in the SL representations.
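
    As a rough illustration of the fine-tuning protocol described above (not the study's exact setup), the sketch below loads a ResNet-50 backbone, optionally initializes it from an SSL-pretrained checkpoint, swaps in a new classification head for the target dataset, and fine-tunes it. The checkpoint path, data loaders and hyperparameters are placeholders.

        # Minimal fine-tuning sketch (PyTorch); paths and hyperparameters are illustrative only.
        import torch
        import torch.nn as nn
        from torchvision import models

        def build_finetune_model(ssl_checkpoint, num_classes):
            """Load a ResNet-50 backbone and attach a fresh head for the target task."""
            model = models.resnet50(weights=None)
            if ssl_checkpoint is not None:
                # Hypothetical checkpoint holding SSL-pretrained backbone weights.
                state = torch.load(ssl_checkpoint, map_location="cpu")
                model.load_state_dict(state, strict=False)  # classifier weights are absent by design
            model.fc = nn.Linear(model.fc.in_features, num_classes)
            return model

        def finetune(model, loader, epochs=5, lr=1e-3):
            """Standard cross-entropy fine-tuning loop over a labeled target dataset."""
            optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
            criterion = nn.CrossEntropyLoss()
            model.train()
            for _ in range(epochs):
                for images, labels in loader:
                    optimizer.zero_grad()
                    loss = criterion(model(images), labels)
                    loss.backward()
                    optimizer.step()
            return model

    The same loop can be run twice, once from an SSL checkpoint and once from SL-pretrained weights, to compare target-task accuracy and confidence as the article does.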

    Impact of intellectual capital on profitability : conventional versus Islamic banks

    Intellectual capital has been found to have a significant association with profitability in the financial sector in various parts of the world. This study therefore empirically investigates the relationship between intellectual capital and the financial performance of twenty-seven private commercial banks in Bangladesh for the year 2013. Annual reports of the selected banks for the relevant year were used to gather secondary data for empirical models based on Pulic’s VAIC model. Stepwise regression was performed for the full sample and for conventional and Islamic banks separately. The analysis indicates that both VAIC and its components have a significant association with profitability. The results for conventional and Islamic banks identify different components of VAIC as significant predictors of bank profitability. A future study including all financial institutions could provide a better estimate of the impact of intellectual capital on profitability for the finance sector.
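
    For reference, the standard Pulic formulation underlying such empirical models is typically written as below; the symbols follow the usual VAIC conventions (VA: value added, OUT/IN: outputs and purchased inputs, HC: total employee cost, CE: capital employed) rather than any notation specific to this paper.

        \begin{aligned}
        \mathit{VA}   &= \mathit{OUT} - \mathit{IN} \\
        \mathit{HCE}  &= \mathit{VA}/\mathit{HC}, \qquad
        \mathit{SCE}   = (\mathit{VA}-\mathit{HC})/\mathit{VA}, \qquad
        \mathit{CEE}   = \mathit{VA}/\mathit{CE} \\
        \mathit{VAIC} &= \mathit{HCE} + \mathit{SCE} + \mathit{CEE}
        \end{aligned}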

    Fintech and Islamic Finance: Literature Review and Research Agenda

    The Fintech revolution started with the introduction of credit cards in 1960 and has since been transformed by blockchain technologies. The integration of Fintech-based solutions with Islamic finance has gained interest among academics. However, the lack of literature on this issue motivated us to conduct a systematic literature review on Islamic Fintech. We identified fourteen documents relevant to the context of the study and conducted content and thematic analyses. An extensive review of past literature allows us to identify Shari’ah compliance as one of the major challenges for the growth of Islamic Fintech. In addition, we conclude that Islamic Fintech might pose challenges for Islamic Financial Institutions (IFIs) in terms of operational efficiency, customer retention, transparency and accountability. We contribute by providing insights on the challenges faced by the Islamic finance industry in integrating Fintech-based solutions, with reference to past studies, and by indicating areas for future research that could reduce the gaps in the Islamic Fintech literature.

    TOPOLOGY-AWARE APPROACH FOR THE EMERGENCE OF SOCIAL NORMS IN MULTIAGENT SYSTEMS

    Social norms facilitate agent coordination and conflict resolution without explicit communication. Norms generally restrict a set of actions or behaviors of agents to a particular strategy and can significantly reduce the cost of coordination. There has been recent progress in multiagent systems (MAS) research toward developing a deep understanding of the social norm formation process, including mechanisms to create social norms in an effective and efficient manner. The hypothesis of this dissertation is that equipping agents in networked MAS with “network thinking” capabilities, and using this contextual knowledge to form social norms effectively and efficiently, improves the performance of the MAS. This dissertation investigates the social norm emergence problem for conventional norms (where there is no conflict between individual and collective interests) and essential norms (where agents need to explicitly cooperate to achieve socially efficient behavior) from a game-theoretic perspective. First, a comprehensive investigation of the social norm formation problem is performed in various types of networked MAS, with an emphasis on the effect of topological structure on the process. Based on the insights gained from these network-theoretic investigations, novel topology-aware decentralized mechanisms are developed that facilitate the emergence of social norms suitable for various environments. The dissertation addresses the convention emergence problem in both small and large conventional norm spaces and equips agents to predict the topological structure in order to select suitable convention mechanisms. It addresses the cooperation emergence problem in the essential norm space by harnessing agent commitments and altruism where appropriate. Extensive simulation-based experimentation has been conducted on different network topologies by varying the topological features and agent interaction models. Comparisons with state-of-the-art norm formation techniques show that the proposed mechanisms yield significant performance improvements in a variety of networks.
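
    As an informal illustration of the kind of decentralized convention emergence studied here (not the dissertation's topology-aware mechanisms), the sketch below places agents on a small-world network, lets each agent repeatedly play a two-action coordination game with a random neighbor, and has agents imitate the best-scoring action in their neighborhood. Topology, payoffs and the update rule are assumptions for the example.

        # Toy convention-emergence simulation on a network (illustrative only).
        import random
        import networkx as nx

        def simulate(n=200, k=6, p=0.1, rounds=200, seed=0):
            rng = random.Random(seed)
            g = nx.watts_strogatz_graph(n, k, p, seed=seed)       # small-world topology
            action = {v: rng.choice([0, 1]) for v in g.nodes}      # each agent starts with a random convention
            payoff = {v: 0.0 for v in g.nodes}

            for _ in range(rounds):
                # Each agent plays a pure coordination game with one random neighbor.
                for v in g.nodes:
                    u = rng.choice(list(g.neighbors(v)))
                    reward = 1.0 if action[v] == action[u] else -1.0
                    payoff[v] += reward
                    payoff[u] += reward
                # Imitation dynamics: adopt the action of the best-scoring agent in the neighborhood.
                new_action = {}
                for v in g.nodes:
                    best = max(list(g.neighbors(v)) + [v], key=lambda w: payoff[w])
                    new_action[v] = action[best]
                action = new_action

            share = sum(action.values()) / n
            return max(share, 1.0 - share)  # fraction following the dominant convention

        if __name__ == "__main__":
            print(f"dominant convention adopted by {simulate():.0%} of agents")

    Varying k and p in this toy model is one way to see how topological structure influences how quickly (and whether) a single convention takes over.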

    Requirements for large-scale adoption of rapid manufacturing technologies

    Despite the use of Additive Manufacturing (AM) technologies in many applications, including the production of some high-value products for end use, much of their potential remains untapped. The use of AM technology for the manufacture of end-use products (Rapid Manufacturing, RM) has increased in recent years, but mass use of the technology, at a speed, cost and quality acceptable to the general consumer, is still not widespread today. The concept of RM as a viable production process is still not understood by many businesses and consumers, with thinking still dominated by AM technologies for Rapid Prototyping (RP) applications. A key difference between RM and RP lies in the supply chain: the RM supply chain is much more complicated than the RP supply chain. This research conducted a Delphi study to identify the requirements or prerequisites necessary for the use of RM technologies as a viable means to manufacture end-use products (the RM application of AM) at mass scale. The paper identifies 36 requirements or prerequisites and classifies them into various classes of importance in order to highlight their significance. In addition to supply chain issues, the requirements unearthed concern features of RM technology (equipment), materials and processes that need modification, upgrading or creation.

    A Semantic-Aware Data Management System for Seismic Engineering Research Projects and Experiments

    The invention of the Semantic Web and related technologies is fostering a computing paradigm that entails a shift from databases to Knowledge Bases (KBs), in which the core is an ontology that plays the main role in enabling reasoning power that can make implicit facts explicit, in order to produce better results for users. In addition, KB-based systems provide mechanisms to manage information and its semantics, which can make systems semantically interoperable so that they can exchange and share data. In order to overcome interoperability issues and to exploit the benefits offered by state-of-the-art technologies, we moved to a KB-based system. This paper presents the development of an earthquake engineering ontology with a focus on research project management and experiments. The developed ontology was validated by domain experts, published in RDF and integrated with WordNet. Data originating from scientific experiments, such as cyclic and pseudo-dynamic tests, were also published in RDF. We exploited the power of Semantic Web technologies, namely the Jena, Virtuoso and VirtGraph tools, in order to publish, store and manage RDF data, respectively. Finally, a system was developed with the full integration of the ontology, experimental data and tools to evaluate the effectiveness of the KB-based approach; it yielded favorable outcomes.
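
    To make the RDF publishing and querying step concrete, the sketch below builds a tiny graph for a hypothetical pseudo-dynamic test and queries it with SPARQL. It uses Python's rdflib rather than the Jena/Virtuoso stack named in the paper, and the namespace and property names are invented for illustration, not taken from the earthquake engineering ontology.

        # Illustrative RDF publishing/querying with rdflib; the vocabulary is hypothetical.
        from rdflib import Graph, Literal, Namespace, RDF
        from rdflib.namespace import XSD

        EX = Namespace("http://example.org/eq-engineering#")  # placeholder namespace

        g = Graph()
        g.bind("ex", EX)

        test = EX["pseudo_dynamic_test_01"]
        g.add((test, RDF.type, EX.PseudoDynamicTest))
        g.add((test, EX.partOfProject, EX["retrofit_project_A"]))
        g.add((test, EX.peakDisplacementMm, Literal(42.7, datatype=XSD.double)))

        print(g.serialize(format="turtle"))  # publish the experiment data as Turtle

        # SPARQL query: list tests and their peak displacements.
        results = g.query(
            """
            PREFIX ex: <http://example.org/eq-engineering#>
            SELECT ?test ?disp WHERE {
                ?test a ex:PseudoDynamicTest ;
                      ex:peakDisplacementMm ?disp .
            }
            """
        )
        for row in results:
            print(row.test, row.disp)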

    Low-Carbon Energy Technologies: Potentials of Solar and Nuclear Energy Sources for Sustainable Economic Development in Bangladesh

    Electricity shortage has become a major challenge to continued economic growth in Bangladesh. The country's GDP is growing at a rate of 7% a year, and Bangladesh is expected to move from 31st position globally in 2014 to 23rd by 2050 in terms of GDP at purchasing power parity (PPP). The demand for electricity is forecast to reach 61,164 MW within the same period. Currently, electricity generation in Bangladesh is highly dependent on fossil fuels: nearly 59% is produced from natural gas, followed by furnace oil, diesel and coal, while only 3% comes from renewables. Electricity generation is the largest single source of greenhouse gas (GHG) emissions in Bangladesh, and thus finding alternative energy sources has become imperative for the country. Solar and nuclear energy sources have the potential to support a low-carbon energy sector and thus sustainable economic development in Bangladesh. Barriers to solar and nuclear energy are expected to be reduced significantly in the coming years with technological advancement. However, energy policies need to be revised to facilitate low-carbon energy technologies. In addition, greater international collaboration is required, not only to import new technologies but also to enhance research and development (R&D) capacity and the overall adoption of these technologies.

    Facilitating Decision Choices with Cascading Consequences in Interdependent Program Networks

    Naval Postgraduate School Acquisition Research Program

    Separation and Execution of graphical engine on a cross platform IDE to enhance performance

    “Biosim” is a simulation software package for simulating harvesting systems. The system can design a model for any logistics problem by combining several objects, so that the artificial system can show the performance of an individual model. The system also describes the efficiency of a particular model and its suitability for real-life application. So, when anyone wishes to set up a logistics model such as a harvesting system in real life, he or she can learn in advance about the suitable positioning of plants and factories, as well as the minimum number of objects, the total time to complete the task, the total investment required and the total amount of noise produced by the establishment; it provides an advance overview of the model. However, “Biosim” is quite slow: as it is an object-based system, it takes a long time to reach its decisions. The main task here is to modify the system so that it works faster than before. The main objective of this thesis is therefore to reduce the load of “Biosim” by modifying the original system and increasing its efficiency, so that the whole system is faster than the previous one and performs more efficiently when applied in real life. The concept is to separate the execution part of “Biosim” from its graphical engine and run this separated portion on a third-generation language platform; C++ is chosen here as this external platform. After completing the proposed system, results with different models were observed. The results show that, for any type of plant or field and for any number of trucks, the proposed system is faster than the original system. The proposed system takes at least 15% less time than “Biosim”, and the efficiency gain increases with the complexity of the model: the more complex the model, the more efficient the proposed system is compared to the original “Biosim”. Depending on the complexity of a model, the proposed system can be 56.53% faster than the original “Biosim”.
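
    Biosim's internals are not available here, so the sketch below only illustrates the architectural idea the thesis describes: keep the simulation state and update loop free of any graphics dependency so the execution part can run headless (or be ported to a faster platform such as C++), while an optional front end merely consumes state snapshots for display. All class and method names are invented for illustration and are not Biosim's actual design.

        # Illustrative decoupling of a simulation core from its graphical front end.
        from dataclasses import dataclass

        @dataclass
        class HarvestState:
            time_step: int = 0
            harvested_units: int = 0
            noise_level: float = 0.0

        class SimulationCore:
            """Pure computation: no rendering or GUI imports, so it can run headless."""
            def __init__(self, trucks, yield_per_truck=3):
                self.trucks = trucks
                self.yield_per_truck = yield_per_truck
                self.state = HarvestState()

            def step(self):
                self.state.time_step += 1
                self.state.harvested_units += self.trucks * self.yield_per_truck
                self.state.noise_level = 0.5 * self.trucks
                return self.state

        class TextRenderer:
            """Optional front end: consumes state snapshots, never drives the simulation."""
            def draw(self, state):
                print(f"t={state.time_step} harvested={state.harvested_units} "
                      f"noise={state.noise_level:.1f}")

        if __name__ == "__main__":
            core = SimulationCore(trucks=4)
            renderer = TextRenderer()
            for _ in range(3):
                renderer.draw(core.step())   # rendering is a thin consumer of the core's output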
