
    Social media in collaborative learning in higher education: a qualitative case study of teachers’ and students’ views

    This study investigated how social media are used in collaborative learning in higher education and how, according to students and teachers, they could be used better in teaching and learning. The research questions were: 1) How are social media used in collaborative learning by teachers and students in higher education for educational purposes? 2) How could social media be used in the collaborative learning process in higher education, according to students and teachers? Qualitative interviews were conducted with ten students and five teachers from different faculties of the University of Lapland and Lapland University of Applied Sciences. The study found that students at both institutions made little use of social media for collaboration with teachers. All of the teachers, by contrast, used social media in their collaborative teaching designs and found them a useful tool for delivering their teaching. Most of the students and all of the teachers considered social media useful for teaching and learning, although they also identified challenges and areas for improvement. Higher education institutions should therefore recognize the importance of social media in teaching and learning and take initiatives to overcome the challenges identified by students and teachers.

    Information Physics: The New Frontier

    At this point in time, two major areas of physics, statistical mechanics and quantum mechanics, rest on the foundations of probability and entropy. The last century saw several significant fundamental advances in our understanding of the process of inference, which make it clear that these are inferential theories. That is, rather than being a description of the behavior of the universe, these theories describe how observers can make optimal predictions about the universe. In such a picture, information plays a critical role. What is more, little clues, such as the fact that black holes have entropy, continue to suggest that information is fundamental to physics in general. In the last decade, our fundamental understanding of probability theory has led to a Bayesian revolution. In addition, we have come to recognize that the foundations go far deeper and that Cox's approach of generalizing a Boolean algebra to a probability calculus is the first specific example of the more fundamental idea of assigning valuations to partially-ordered sets. By considering this as a natural way to introduce quantification to the more fundamental notion of ordering, one obtains an entirely new way of deriving physical laws. I will introduce this new way of thinking by demonstrating how one can quantify partially-ordered sets and, in the process, derive physical laws. The implication is that physical law does not reflect the order in the universe; instead, it is derived from the order imposed by our description of the universe. Information physics, which is based on understanding the ways in which we both quantify and process information about the world around us, is a fundamentally new approach to science. Comment: 17 pages, 6 figures. Knuth K.H. 2010. Information physics: The new frontier. In: J.-F. Bercher, P. Bessière, and A. Mohammad-Djafari (eds.), Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2010), Chamonix, France, July 2010.
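    As a pointer to the kind of result the abstract alludes to (a sketch based on Knuth's published derivations, not text from the paper itself): a valuation v assigned to the elements of a lattice, if it is to be consistent with the lattice's join and meet structure, is forced to obey a sum rule that generalizes inclusion-exclusion; on a Boolean lattice of assertions it reduces to the familiar sum rule of probability theory.

```latex
% Sketch: the constraint a lattice-consistent valuation must satisfy.
% Here x \vee y is the join (least upper bound) and x \wedge y the meet
% (greatest lower bound) of lattice elements x and y.
v(x \vee y) \;=\; v(x) + v(y) - v(x \wedge y)
% On the Boolean lattice of logical statements, with v read as a probability
% conditioned on a context I, this becomes the familiar sum rule:
% p(A \vee B \mid I) = p(A \mid I) + p(B \mid I) - p(A \wedge B \mid I)
```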

    Microelectronics Process Engineering at San Jose State University: A Manufacturing-Oriented Interdisciplinary Degree Program

    San Jose State University's new interdisciplinary curriculum in Microelectronics Process Engineering is described. This baccalaureate program emphasizes hands-on thin-film fabrication experience, manufacturing methods such as statistical process control, and the fundamentals of materials science and semiconductor device physics. Each course in the core laboratory sequence integrates fabrication knowledge with process engineering and manufacturing methods. The curriculum development process relies on clearly defined and detailed program and course learning objectives. We also briefly discuss our strategy of making process engineering experiences accessible to all engineering students through both the Lab Module and Statistics Module series.

    A method of classification for multisource data in remote sensing based on interval-valued probabilities

    An axiomatic approach to interval-valued (IV) probabilities is presented, where the IV probability is defined by a pair of set-theoretic functions that satisfy some pre-specified axioms. On the basis of this approach, the representation of statistical evidence and the combination of multiple bodies of evidence are emphasized. Although IV probabilities provide an innovative means for the representation and combination of evidential information, they make the decision process rather complicated and entail more intelligent strategies for making decisions. The development of decision rules over IV probabilities is discussed from the viewpoint of statistical pattern recognition. The proposed method, the so-called evidential reasoning method, is applied to the ground-cover classification of a multisource data set consisting of Multispectral Scanner (MSS) data, Synthetic Aperture Radar (SAR) data, and digital terrain data such as elevation, slope, and aspect. By treating the data sources separately, the method is able to capture both parametric and nonparametric information and to combine them. The method is then applied to two separate cases of classifying multiband data obtained by a single sensor. In each case, a set of multiple sources is obtained by dividing the dimensionally large data into smaller and more manageable pieces based on global statistical correlation information. Through this divide-and-combine process, the method is able to utilize more features than the conventional maximum likelihood method.
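    The abstract does not spell out the axioms, but one standard way to realize an interval-valued probability is as the belief/plausibility pair [Bel(A), Pl(A)] induced by a basic probability assignment, with multiple bodies of evidence combined by Dempster's rule; whether this matches the paper's exact formulation is an assumption. The sketch below, with names and a toy ground-cover example that are ours rather than the paper's, illustrates that reading in Python.

```python
def belief(m, hypothesis):
    """Bel(A): total mass committed to subsets of the hypothesis A."""
    return sum(mass for focal, mass in m.items() if focal <= hypothesis)

def plausibility(m, hypothesis):
    """Pl(A): total mass of focal sets that intersect the hypothesis A."""
    return sum(mass for focal, mass in m.items() if focal & hypothesis)

def dempster_combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule."""
    combined, conflict = {}, 0.0
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = s1 & s2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence; combination undefined")
    return {focal: mass / (1.0 - conflict) for focal, mass in combined.items()}

# Toy ground-cover example (ours, not the paper's data): two sources, e.g.
# MSS and SAR, each assign mass over the frame {forest, water, urban}.
frame = frozenset({"forest", "water", "urban"})
m_mss = {frozenset({"forest"}): 0.6, frozenset({"forest", "water"}): 0.3, frame: 0.1}
m_sar = {frozenset({"forest"}): 0.5, frozenset({"urban"}): 0.2, frame: 0.3}

m = dempster_combine(m_mss, m_sar)
forest = frozenset({"forest"})
print("IV probability of 'forest':", (belief(m, forest), plausibility(m, forest)))
```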

    A distance measure of interval-valued belief structures

    Interval-valued belief structures generalize belief function theory by allowing basic belief assignments to take interval rather than crisp numbers. Distance measures have long been an essential tool in belief function theory, for example in conflicting-evidence combination, clustering analysis, and belief function approximation. Researchers have paid much attention to them and proposed many kinds of distance measures. However, few works have addressed distance measures for interval-valued belief structures. In this paper, we propose a method to measure the distance between interval-valued belief functions. The method is based on an interval-valued one-dimensional Hausdorff distance and the Jaccard similarity coefficient. We show and prove its properties of non-negativity, non-degeneracy, symmetry, and the triangle inequality. Numerical examples illustrate the validity of the proposed distance.
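    The abstract names the two ingredients but not their exact combination, so the sketch below is only illustrative: the one-dimensional Hausdorff distance between closed intervals, max(|a1 - a2|, |b1 - b2|), measures the per-focal-set difference in interval-valued mass, and the Jaccard coefficient |A ∩ B| / |A ∪ B| weights pairs of focal sets in a Jousselme-style quadratic form. The aggregation, the names, and the example are our assumptions, not the paper's definition.

```python
import math

def interval_hausdorff(i1, i2):
    """One-dimensional Hausdorff distance between closed intervals [a1, b1] and [a2, b2]."""
    (a1, b1), (a2, b2) = i1, i2
    return max(abs(a1 - a2), abs(b1 - b2))

def jaccard(set_a, set_b):
    """Jaccard similarity coefficient |A & B| / |A | B| between finite focal sets."""
    set_a, set_b = frozenset(set_a), frozenset(set_b)
    if not set_a and not set_b:
        return 1.0
    return len(set_a & set_b) / len(set_a | set_b)

def iv_belief_distance(s1, s2):
    """Illustrative distance between two interval-valued belief structures.

    Each structure maps a focal set (frozenset) to an interval (lower, upper)
    of basic belief mass. The per-focal-set difference is the interval
    Hausdorff distance; these differences are aggregated in a Jousselme-style
    quadratic form weighted by the Jaccard coefficient. This aggregation is an
    assumption for illustration, not the paper's formula.
    """
    focal_sets = list(set(s1) | set(s2))
    zero = (0.0, 0.0)  # focal sets absent from a structure carry the mass interval [0, 0]
    diffs = [interval_hausdorff(s1.get(a, zero), s2.get(a, zero)) for a in focal_sets]
    total = sum(jaccard(a, b) * da * db
                for a, da in zip(focal_sets, diffs)
                for b, db in zip(focal_sets, diffs))
    return math.sqrt(0.5 * total)

# Example: two interval-valued belief structures over the frame {x, y}.
s1 = {frozenset({"x"}): (0.3, 0.5), frozenset({"x", "y"}): (0.5, 0.7)}
s2 = {frozenset({"x"}): (0.4, 0.6), frozenset({"y"}): (0.2, 0.4)}
print(iv_belief_distance(s1, s2))
```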

    The Organizational Design of Intelligence Failures

    While detection and prevention of the September 11, 2001 plot would have been ideal, I argue that the more consequential intelligence failures occurred after the attacks of September 11. The erroneous intelligence concerning the presence of WMD in Iraq permitted the Bush Administration to order the invasion of Iraq. Systematic underestimates of the budgetary costs and personnel requirements of the war meant that Congress did not give the matter the debate that it warranted. Finally, incorrect (or incomplete) intelligence concerning the extent of the informal opposition to the U.S.-led forces resulted in inadequate numbers of allied forces being deployed and a protracted period of conflict and disruption in Iraq. These facts are all well known to anyone who reads newspapers. I make three arguments in this paper. First, the collection of intelligence data and its evaluation do not occur in a vacuum: there must always be an organizing theory that motivates the collection and evaluation of the data, and this theory is formulated at the highest levels of the decision-making process. Second, it is not possible to construct a truly neutral or objective (analytical) hierarchy. Third, it is impossible to separate the analytical evaluation of the data from the decision that will be based on that evaluation. As an inevitable consequence of these arguments, intelligence analysis and the resulting conclusions are driven by top-down considerations rather than bottom-up, as has been argued by some reviewers of recent intelligence failures. Key words: stable coalitions, self-enforcing agreements, compliance, enforcement, public goods.

    Centralization, Decentralization, and Conflict in the Middle East and North Africa

    This paper examines broadly the intergovernmental structure in the Middle East and North Africa region, which has one of the most centralized government structures in the world. The authors address the reasons behind this centralized structure by looking first at the history of the region's tax systems. They review the Ottoman taxation system, which has been predominantly influential as a model, and discuss its impact on current government structure. They also discuss the current intergovernmental structure by examining the type and degree of decentralization in five countries representative of the region: Egypt, Iran, West Bank/Gaza, Tunisia, and Yemen. Cross-country regression analysis using panel data for a broader set of countries leads to a better understanding of the factors behind heavy centralization in the region. The findings show that external conflicts constitute a major roadblock to decentralization in the region. Keywords: fiscal decentralization; intergovernmental relations; Middle East and North Africa.
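    The abstract only names the method; the sketch below is an illustrative cross-country panel regression with country and year fixed effects, using made-up data and variable names that are not the paper's, just to show the shape of such an analysis.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical country-year panel. Column names and values are illustrative,
# not the paper's data: 'decentralization' could be a subnational expenditure
# share, 'external_conflict' a conflict dummy, 'gdp_pc' a control variable.
panel = pd.DataFrame({
    "country": ["EGY"] * 3 + ["TUN"] * 3 + ["YEM"] * 3,
    "year": [2000, 2005, 2010] * 3,
    "decentralization": [0.08, 0.09, 0.10, 0.12, 0.14, 0.15, 0.06, 0.05, 0.04],
    "external_conflict": [0, 1, 1, 0, 0, 0, 1, 1, 1],
    "gdp_pc": [1500, 1700, 2000, 2100, 2400, 2700, 600, 650, 700],
})

# Two-way fixed effects via country and year dummies -- a common, simple way
# to estimate a panel regression; the paper's actual estimator may differ.
model = smf.ols(
    "decentralization ~ external_conflict + gdp_pc + C(country) + C(year)",
    data=panel,
).fit()
print(model.params)
```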