13 research outputs found

    Need for Intercultural Awareness in Erasmus Mobility - Administrative Point of View

    In recent years, mobility has become one of the most important goals within the European Union (EU). Different projects and programs support the mobility of students, teachers and other employed persons, and the goals differ from program to program and project to project. Companies need various experts and need to work in international teams; they can also offer placements and first jobs for young experts (as in the Leonardo da Vinci program). For students and teachers, the main goals of mobility are learning and teaching in different environments (as in the Erasmus program), learning languages, and benefiting from cross-cultural experiences. Both students and teachers come from different cultural environments, so the host organisation has to take care of intercultural awareness at all levels of activity, from administration to teaching and examinations.

    Inference attacks and control on database structures

    Today's databases store information with sensitivity levels that range from public to highly sensitive; ensuring confidentiality can therefore be highly important, but it also requires costly control. This paper focuses on the inference problem on different database structures. It presents possible threats to privacy related to inference, and control methods for mitigating these threats. The paper shows that using only access control, without any inference control, is inadequate, since such models are unable to protect against indirect data access. Furthermore, it covers new inference problems which arise from the dimensions of new technologies such as XML, semantics, etc.
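    As a hypothetical illustration of indirect data access (not taken from the paper), the sketch below shows how two individually permitted aggregate queries can be combined to disclose a value that access control alone would never return directly; the table, column names and queries are assumptions made for the example.

```python
# Hypothetical illustration of an inference attack: direct access to an
# individual's salary is forbidden, but two permitted aggregate queries
# can be combined to reveal it.

employees = [
    {"name": "Alice", "dept": "R&D", "salary": 5200},
    {"name": "Bob",   "dept": "R&D", "salary": 4800},
    {"name": "Carol", "dept": "R&D", "salary": 5100},
]

def dept_salary_sum(dept):
    """Permitted aggregate: total salary of a department."""
    return sum(e["salary"] for e in employees if e["dept"] == dept)

def dept_salary_sum_excluding(dept, name):
    """Permitted aggregate: department total without one named member."""
    return sum(e["salary"] for e in employees
               if e["dept"] == dept and e["name"] != name)

# Neither query returns an individual record, yet their difference
# discloses Alice's exact salary -- an indirect (inference) channel
# that pure access control does not block.
alice_salary = dept_salary_sum("R&D") - dept_salary_sum_excluding("R&D", "Alice")
print(alice_salary)  # 5200
```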

    Simulation of line scale contamination in calibration uncertainty model

    Precise calibration of quartz line scales is very important for assuring the traceability of microscopic measurements. A very significant contribution to the calibration uncertainty budget comes from the uncertainty of line centre detection. The line centre is usually detected through video signal processing using different types of algorithms. This paper presents the calibration procedure developed in the Laboratory for Production Measurement at the Faculty of Mechanical Engineering in Maribor. It focuses on uncertainty analysis and especially on the influence of line scale contamination on the determination of the line centre position. Different types of line scale contamination, such as dirt spots, scratches, line edge irregularities, and variations in line intensity, were simulated in order to test the ability of the line centre detection algorithm to eliminate such influences from the measurement results.
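    The following Python sketch is a simplified illustration of the idea, not the laboratory's actual procedure: a dark line is modelled as a dip in a 1-D video intensity profile, a dirt spot is added next to it, and a naive intensity-weighted centroid detector shows how the contamination biases the detected line centre. The profile shape, threshold and contamination values are assumptions.

```python
# A minimal sketch of how a dirt spot can bias a simple line-centre detector.
import numpy as np

x = np.arange(400)                                        # pixel positions
clean = 200 - 150 * np.exp(-((x - 200.0) / 8.0) ** 2)     # dark line at pixel 200

contaminated = clean.copy()
contaminated[215:225] -= 100                              # simulated dirt spot near the line
contaminated = np.clip(contaminated, 0, 255)

def line_centre(profile, threshold=120):
    """Centroid of 'darkness' for pixels below the threshold."""
    dark = np.where(profile < threshold, threshold - profile, 0.0)
    return float(np.sum(x * dark) / np.sum(dark))

print(line_centre(clean))          # ~200.0 (true centre)
print(line_centre(contaminated))   # noticeably shifted towards the dirt spot
```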

    Some ideas about intelligent medical system design

    Mechanical systems increase our physical abilities (cranes to lift vast amounts, telescopes to see farther, etc.), but intelligent systems are power tools for heavy lifting in the information world: they complement, extend, and amplify our ability to think and solve problems. In this paper we introduce some thoughts and ideas about the design of intelligent medical systems and the use of MetaMet. A specific design approach constructed with MetaMet is presented and discussed.

    Searching for messages conforming to arbitrary sets of conditions in SHA-256

    Recent progress in hash function analysis has led to collisions on reduced versions of SHA-256. As in other hash functions, differential collision search methods for SHA-256 can be described by means of conditions on and between state and message bits. We describe a tool for the efficient automatic search of message pairs conforming to useful sets of conditions, i.e. those stemming from (interleaved) local collisions. We not only considerably improve upon previous work [7], but also show that our approach extends to larger sets of conditions. Furthermore, we present the performance results of an actual implementation and pose an open problem in this context.
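    The sketch below is a toy illustration (not the authors' tool) of how conditions on and between the bits of a message pair can be written down and checked for conformance; the condition symbols and the 32-bit word width are assumptions chosen for the example.

```python
# Toy conformance check for a pair of 32-bit words (m, m').
# Condition symbols per bit: '-' no constraint, '=' equal bits,
# 'x' differing bits, '0'/'1' fixed value in both words.

def bit(word, i):
    return (word >> i) & 1

def conforms(m, m_prime, conditions, width=32):
    """Check a word pair against a condition string written MSB first."""
    assert len(conditions) == width
    for pos, c in enumerate(conditions):
        i = width - 1 - pos                 # bit index corresponding to this symbol
        b, bp = bit(m, i), bit(m_prime, i)
        if c == '-':
            continue
        if c == '=' and b != bp:
            return False
        if c == 'x' and b == bp:
            return False
        if c in '01' and not (b == bp == int(c)):
            return False
    return True

# Example: the two words must differ in the lowest bit and agree elsewhere.
cond = '=' * 31 + 'x'
print(conforms(0b1010, 0b1011, cond))   # True
print(conforms(0b1010, 0b1110, cond))   # False (extra difference in bit 2)
```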

    Improvement of the Peyravian-Jeffries's user authentication protocol and password change protocol

    Remote authentication of users by means of passwords is a broadly adopted method of authentication within insecure network environments. Such protocols typically rely on pre-established secure cryptographic keys or on a public key infrastructure. Recently, Peyravian and Jeffries [M. Peyravian, C. Jeffries, Secure remote user access over insecure networks, Computer Communications 29 (5-6) (2006) 660-667] proposed a protocol for secure remote user access over insecure networks. Shortly after the protocol was published, Shim [K.A. Shim, Security flaws of remote user access over insecure networks, Computer Communications 30 (1) (2006) 117-121] and Munilla et al. [J. Munilla, A. Peinado, Off-line password-guessing attack to Peyravian-Jeffries's remote user authentication protocol, Computer Communications 30 (1) (2006) 52-54] independently presented an off-line guessing attack on the protocol. Based on their findings, we present an improved secure password-based protocol for remote user authentication, password change, and session key establishment over insecure networks, which is immune to this attack.
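    As a hedged illustration of the underlying weakness (a simplified exchange, not the actual Peyravian-Jeffries messages), the sketch below shows why an eavesdropper who captures a value that depends only on a public nonce and the password can test dictionary guesses off-line, without any further interaction with the server.

```python
# Simplified, hypothetical exchange demonstrating an off-line guessing attack.
import hashlib, os

def h(*parts):
    return hashlib.sha256(b'|'.join(parts)).hexdigest()

password = b'sunshine1'                    # low-entropy user secret
nonce = os.urandom(16)                     # sent in the clear

# Value transmitted over the insecure network.
observed = h(nonce, password)

# The attacker replays dictionary guesses against the captured transcript.
dictionary = [b'123456', b'password', b'sunshine1', b'letmein']
for guess in dictionary:
    if h(nonce, guess) == observed:
        print('password recovered:', guess.decode())
        break
```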

    An algorithm for protecting knowledge discovery data

    In this paper we present an algorithm that can be applied to protect data before a data mining process takes place. Data mining, as part of the knowledge discovery process, is mainly about building models from data. We address the following question: can we protect the data and still allow the data modelling process to take place? We consider the case where the distributions of the original data values are preserved while the values themselves change, so that the resulting model is equivalent to one built with the original data. The presented formal approach is especially useful when the knowledge discovery process is outsourced. The application of the algorithm is demonstrated through an example.
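    One simple way to realise this idea, sketched below purely for illustration and not as the paper's algorithm, is to recode a categorical attribute through a secret bijection over its value domain: the frequency distribution is preserved, the stored values change, and a model built on the recoded data is equivalent to the original one up to relabelling.

```python
# Illustrative distribution-preserving recoding via a secret bijection.
from collections import Counter

records = ['red', 'blue', 'red', 'green', 'blue', 'red']

domain = sorted(set(records))
rotated = domain[1:] + domain[:1]          # stand-in for a secret random bijection
mapping = dict(zip(domain, rotated))       # kept private by the data owner

protected = [mapping[v] for v in records]

print(protected)                            # values have changed
print(sorted(Counter(records).values()))    # [1, 2, 3]
print(sorted(Counter(protected).values()))  # [1, 2, 3] -- same distribution shape
```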

    Data Policy for Increasing the Data Quality in Intelligent Systems

    The use of data in various areas and their electronic availability have raised the importance of data quality to a higher level. In general, data quality has a syntactic and a semantic component. The syntactic component is relatively easy to achieve if supported by tools, while the semantic component requires more research. In many cases the data come from different sources, which are distributed across an enterprise and are at different quality levels. Special attention needs to be paid to data upon which critical decisions are made in or for intelligent systems. In this paper we focus on the semantic component of data quality in a selected domain and on a data policy for increasing the quality of data used and/or acquired in intelligent systems.
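    The difference between the two components can be illustrated with a small sketch (field names and rules are hypothetical): a syntactic check validates that a value is well formed, while a semantic check validates that it makes sense against domain rules or other sources.

```python
# Syntactic vs. semantic data quality on a hypothetical hospital record.
import re
from datetime import date

record = {"admission": "2007-03-14", "discharge": "2007-03-10"}

# Syntactic component: both dates are well-formed ISO strings.
iso = re.compile(r'^\d{4}-\d{2}-\d{2}$')
syntactically_valid = all(iso.match(v) for v in record.values())

# Semantic component: a discharge cannot precede the admission.
adm = date.fromisoformat(record["admission"])
dis = date.fromisoformat(record["discharge"])
semantically_valid = dis >= adm

print(syntactically_valid)   # True  -- the format is fine
print(semantically_valid)    # False -- the data still cannot be correct
```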

    An evaluation of process complexity

    All types of processes can be described as a sequence of activities using a set of required input and output work products and resources of different types. The more detailed the knowledge of the process, the more data are available for evaluating individual projects, so effort estimates conducted on the basis of the described process model become more accurate. The Software Process Complexity Model (SoPCoM) described in this paper provides a mechanism for describing a process in Petri-net notation. Attributes defined in the SoPCoM enable an evaluation of the complexity of each process element as well as of the process as a whole. On the basis of these evaluations, the relative share of effort required for each activity in a given project can be determined. The SoPCoM was developed as part of research work at the University of Maribor.
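    A minimal sketch of the idea, assuming hypothetical activities and a naive scoring rule rather than the actual SoPCoM attributes, is given below: activities play the role of Petri-net transitions, work products the role of places, and a per-activity complexity score is aggregated into a process-level score from which relative effort shares can be derived.

```python
# Hypothetical process description and a naive complexity evaluation.
activities = {
    "specify requirements": {"inputs": ["customer request"],
                             "outputs": ["requirements spec"],
                             "resources": ["analyst"]},
    "design":               {"inputs": ["requirements spec"],
                             "outputs": ["design doc", "test plan"],
                             "resources": ["architect", "analyst"]},
    "implement":            {"inputs": ["design doc"],
                             "outputs": ["source code"],
                             "resources": ["developer"]},
}

def activity_complexity(a):
    """Naive score: the more products and resources an activity touches,
    the more complex (and effort-intensive) it is assumed to be."""
    return len(a["inputs"]) + len(a["outputs"]) + len(a["resources"])

scores = {name: activity_complexity(a) for name, a in activities.items()}
total = sum(scores.values())

# Relative share of effort attributed to each activity.
for name, s in scores.items():
    print(f"{name}: {s / total:.0%}")
```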