TU Delft Repository

    Quantifying Vulnerabilities in an Airport Checkpoint: A study on the role of employee behavior in the emergence of vulnerabilities

    Ever since the attacks on the World Trade Center, airport security has been a topic of interest. The United States was caught by surprise, and the attacks triggered a renewed interest in aviation security. It was well recognized that airport security was one step behind intelligent attackers, which created the need to develop better risk assessment methods. Unfortunately, none of the risk assessment methods developed so far can quantify the vulnerabilities in an airport checkpoint. One of the main challenges in developing a method that can do this is finding a technique that accounts for the complexity of the airport environment from which the vulnerabilities emerge. Furthermore, empirical research has shown that security operators do not necessarily follow protocol, but behave as autonomous agents who regularly bend or break the rules.

    A method that can potentially identify vulnerabilities in such a complex environment is agent-based modeling. This modeling technique has proven to be very powerful in modeling complex systems that emerge from the behaviour of autonomous agents. Some work has been done in this area, but until now the technique has not been used to quantify vulnerabilities in an airport checkpoint. Therefore, the aim of this project is to develop an agent-based model of an airport checkpoint, quantify the vulnerabilities emerging from this checkpoint, and analyze the effect of employee behaviour on these vulnerabilities. To do this, the behaviour of the security operators is modeled using models that are rooted in behavioural psychology and have strong empirical backing: employee decision making is modeled using Decision Field Theory, and employee performance is modeled using the Functional State Model. With the model developed, a set of experiments is performed to calibrate the model and analyze the vulnerabilities. These experiments result in the quantification of vulnerabilities for a predefined set of threat scenarios.

    The analysis of these threat scenarios shows that employee behaviour mainly impacts threat scenarios in which a weapon is hidden in the carry-on baggage. Security operators have a large influence on the outcome of this screening process because it is the process in which employees have to perform multiple activities and make multiple decisions. The vulnerabilities in this screening process are found to depend mainly on the speed/accuracy trade-off made by the employees. The perceived risk of the detected prohibited items plays a limited role, since most prohibited items are only seen as a small risk. The performance of the employees plays a less important role in the outcome of the screening process, and differences in performance are mainly caused by the personality type of the agents: the personality type that puts more effort into a task outperforms the other agents. The skill level of the operators, however, did not significantly affect the outcome of the simulation. This suggests that the effort an operator puts in is more important than the operator's skill level. Finally, it is found to be beneficial to minimize the number of steps in the screening process: adding steps to the screening process means adding opportunities to make mistakes, so fewer steps benefit overall performance.
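
    The abstract does not specify how Decision Field Theory is parameterized in the checkpoint model, but the core of the theory is a preference state that accumulates noisy, attention-weighted evaluations of the options over time until a threshold is reached. The sketch below is a minimal, hypothetical illustration of that accumulation for a security operator choosing between "inspect further" and "wave through"; the matrices, option names and threshold are illustrative assumptions, not values from the thesis.

```python
import numpy as np

# Minimal Decision Field Theory sketch (illustrative parameters, not from the thesis).
# Preference update: P(t+1) = S @ P(t) + C @ M @ W(t+1)
#   S: feedback matrix (self-connections plus lateral inhibition)
#   C: contrast matrix comparing each option against the other
#   M: subjective evaluations of each option on each attribute
#   W: stochastic attention weights over attributes at each time step

rng = np.random.default_rng(0)

options = ["inspect further", "wave through"]   # hypothetical choice set
M = np.array([[0.8, 0.2],                       # "inspect" scored on (safety, speed)
              [0.1, 0.9]])                      # "wave through" scored on (safety, speed)
S = np.array([[0.95, -0.05],
              [-0.05, 0.95]])                   # mild decay and lateral inhibition
C = 0.5 * np.array([[1.0, -1.0],
                    [-1.0, 1.0]])               # contrast each option against the other
attention_prob = np.array([0.6, 0.4])           # P(attend to safety) vs P(attend to speed)
threshold = 1.0                                 # decision criterion

P = np.zeros(2)                                 # neutral initial preference state
for t in range(1000):
    attended = rng.choice(2, p=attention_prob)  # attend to one attribute at a time
    W = np.zeros(2)
    W[attended] = 1.0
    P = S @ P + C @ M @ W
    if np.max(P) >= threshold:
        break

print(f"decision after {t + 1} steps: {options[int(np.argmax(P))]}")
```

    Lowering the threshold in such a model is one natural way to represent the speed/accuracy trade-off that the thesis identifies as the main driver of checkpoint vulnerability: decisions are reached faster but on less accumulated evidence.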

    Mobile Payment: The Hiding Impact of Learning Costs on User Intentions

    This study analyzes how learning costs for technologies that lack de facto standards, such as mobile payment, affect user intentions. In addition, we evaluate how the negative effect of learning costs is mediated by perceived functional value and facilitating conditions. The data used in this research were obtained from a study among 463 consumers. We find support that the negative effects of learning costs are fully mediated by perceived functional value and facilitating conditions. Hence, one important reason for slow user acceptance is that the high diversity of mobile payment services, platforms and technologies increases the learning costs of users. The results pose important implications for managers willing to increase the acceptance of mobile payment.
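
    The abstract reports full mediation but does not state the estimation approach. As a hedged illustration, the sketch below shows one common way to check mediation with ordinary least squares regressions, a Baron-Kenny style comparison of the total effect with the direct effect after the mediators are controlled for. The variable names and the synthetic data are assumptions for illustration, not the study's measurement items or dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the survey data (the real study used 463 consumer responses);
# variable names are assumptions, not the study's actual items.
rng = np.random.default_rng(1)
n = 463
learning_cost = rng.normal(size=n)
functional_value = -0.5 * learning_cost + rng.normal(scale=0.8, size=n)
facilitating_cond = -0.4 * learning_cost + rng.normal(scale=0.8, size=n)
intention = 0.6 * functional_value + 0.5 * facilitating_cond + rng.normal(scale=0.8, size=n)
df = pd.DataFrame({"learning_cost": learning_cost,
                   "functional_value": functional_value,
                   "facilitating_cond": facilitating_cond,
                   "intention": intention})

# Compare the total effect of learning costs with the direct effect
# once the proposed mediators enter the model.
total = smf.ols("intention ~ learning_cost", data=df).fit()
direct = smf.ols("intention ~ learning_cost + functional_value + facilitating_cond",
                 data=df).fit()

print("total effect: ", round(total.params["learning_cost"], 3))
print("direct effect:", round(direct.params["learning_cost"], 3))
# Full mediation is suggested when the total effect is clearly negative while the
# direct effect shrinks toward zero with the mediators included; in practice a
# bootstrap test of the indirect effects (e.g. via SEM) is the preferred evidence.
```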

    An exploratory study into the influence of last-mile home delivery innovations on consumer delivery service choices in the parcel and meal delivery markets

    Innovating the last mile of parcel and meal delivery: logistic service providers (LSPs) are looking for ways to innovate their delivery process in order to optimize their operations, lower operational cost, and meet growing consumer demands in terms of convenience, predictability, accuracy and flexibility in delivery time and place. Ordering products, produce and meals online, to be delivered at your door, is becoming increasingly common, which leads to an increase in demand for home deliveries. From the operational cost perspective of LSPs, the so-called last mile of the delivery is very expensive. The last mile is the very last part of the journey; for parcel delivery, this is typically the trip from the LSP's distribution centre to the consumer's home. LSPs are contracted to deliver parcels to the consumer, so retailers are in fact the customer from an LSP's perspective. Improving the cost and quality of delivery services is important for LSPs for two reasons: firstly, to lower operational cost in order to improve their market position; secondly, to offer retailers a better consumer experience. There are innovations, often involving aspects of automation, which may optimize this last mile, thereby potentially reducing the operational cost of delivery and enhancing flexibility and quality for consumers, in theory making them interesting investments. This research focuses on the consumer side of this topic and looks into consumer preferences by means of discrete choice modelling in order to determine the consumer value of these innovations. Preferences can be specific to a type of commodity; for example, different trade-offs may be made when ordering a book rather than ordering a pizza. To capture these differences to a certain extent, this research considers both parcel delivery and meal delivery.
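
    The abstract mentions discrete choice modelling but not a specific model form; a multinomial logit model is the usual starting point for this kind of stated-preference analysis. The sketch below is a minimal, hypothetical illustration of how choice probabilities follow from a linear utility over delivery attributes; the alternatives, attributes and coefficient values are assumptions for illustration, not estimates from the thesis.

```python
import numpy as np

# Minimal multinomial logit sketch for a delivery-service choice (illustrative only).
# Utility of alternative j: V_j = beta . x_j
# Choice probability:       P_j = exp(V_j) / sum_k exp(V_k)

# Hypothetical attributes per alternative:
# [delivery fee (EUR), time-window width (hours), attended home delivery (0/1)]
alternatives = {
    "home delivery, wide window":   np.array([3.0, 4.0, 1.0]),
    "home delivery, narrow window": np.array([4.5, 1.0, 1.0]),
    "parcel locker pickup":         np.array([1.5, 0.0, 0.0]),
}

# Hypothetical taste parameters: fee and window width are disliked,
# attended delivery is valued positively.
beta = np.array([-0.6, -0.3, 0.8])

utilities = np.array([beta @ x for x in alternatives.values()])
probabilities = np.exp(utilities) / np.exp(utilities).sum()

for name, p in zip(alternatives, probabilities):
    print(f"{name}: {p:.2f}")
```

    In an actual study the coefficients would be estimated from the stated-choice survey data by maximum likelihood, and the consumer value of an innovation would follow from the ratio of its coefficient to the fee coefficient (willingness to pay).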

    Damage progression on fiber reinforced polymer (FRP) adhesively bonded single lap joints (ABSLJ) under quasi-static tension: Implementation of a 3D continuum damage model in UMAT to predict: global behavior, damage initiation and propagation until final failure, for different adherend layup configurations

    Adhesively bonded joints have proven to outperform their mechanically fastened counterparts, as they present a more structurally efficient method of load transfer, lower stress concentrations and better fatigue performance at reduced weight. In the specific case of the Adhesively Bonded Single Lap Joint (ABSLJ), bending-induced stresses that result from the load path eccentricity add up to the adherend in-plane stresses. Moreover, significant peak peel and shear stresses develop at the lap ends of the adhesive, and the associated adherend interlaminar tensile stresses have a detrimental effect on the joint's strength. Such joints, made of Fiber Reinforced Polymer (FRP) adherends bonded with an epoxy adhesive layer, sustain a substantial amount of damage from failure onset to ultimate failure. With the purpose of designing structurally efficient and damage-tolerant composite joints, it is essential to understand the stress distribution and to accurately predict the damage initiation and propagation events in such joints made of composite materials.

    A well-established set of Damage Progression Models (DPMs) in the framework of Continuum Damage Models (CDMs) was developed as a tool to predict the global response, damage initiation load and ultimate load of the specimens. Hashin 3D, Puck and LaRC05 were the failure criteria implemented to detect the initiation of damage in the adherends. After this point, the elastic properties of the damaged elements were reduced according to sudden and gradual material degradation models. As for the adhesive, the von Mises criterion was used to detect the damage onset and a linear softening law modeled the material degradation. For the validation of the DPMs, the numerical results were compared against the data of an already published experimental study. Four different adherend layup sequences, [45/90/-45/0]2s, [90/-45/0/45]2s, [0/45/90/-45]2s and [45/0/-45/0]2s, were studied based on data extracted from the mechanical testing, Digital Image Correlation (DIC) and Acoustic Emission (AE).

    Good correlations between numerical predictions and averaged experimental linear stiffnesses were found, particularly for the two configurations with the outermost ply at 45°, for which the difference was lower than 5%. The initial non-linear stage of the global response seems to be governed by the longitudinal bending stiffness, while the subsequent linear behavior is controlled by the longitudinal membrane stiffness of the adherends. Regarding damage initiation, the numerical predictions were 11.5%, 7.5%, 29.9% and 6.1% more conservative, respectively, for the four analysed configurations when compared to the AE results, whose established criterion should be further developed. With respect to the ultimate load, the relative differences between predictions and tests showed significant variability among the tested configurations; specifically, the deviations were 33.2%, 37.4%, -0.4% and -13.71%. Despite the encouraging results, an inherent shortcoming of CDMs is the representation of damage in a smeared manner due to the homogenization of the anisotropic material in the modeling process. A blended framework using CDMs to model intralaminar failure and discrete crack models to model interlaminar failure and matrix cracking might lead to more realistic damage patterns.
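
    The abstract names Hashin 3D as one of the implemented initiation criteria but does not reproduce the equations. The sketch below is a hedged, standalone illustration of a common formulation of Hashin's 3D criteria for fiber and matrix failure of a unidirectional ply, of the kind typically evaluated per integration point inside a UMAT; the stress state and strength values are illustrative placeholders, not the thesis's material data, and the thesis implementation may differ in detail.

```python
def hashin_3d(stress, XT, XC, YT, YC, S12, S23):
    """Evaluate Hashin-type 3D failure indices for a unidirectional ply.

    stress: (s11, s22, s33, s12, s13, s23) in the ply material frame.
    Returns a dict of failure indices; an index >= 1 indicates damage initiation.
    (Formulation after Hashin 1980; illustrative only.)
    """
    s11, s22, s33, s12, s13, s23 = stress
    idx = {}

    # Fiber failure modes
    if s11 >= 0.0:
        idx["fiber_tension"] = (s11 / XT) ** 2 + (s12**2 + s13**2) / S12**2
    else:
        idx["fiber_compression"] = (s11 / XC) ** 2

    # Matrix failure modes
    s_t = s22 + s33  # sum of transverse normal stresses
    shear_terms = (s23**2 - s22 * s33) / S23**2 + (s12**2 + s13**2) / S12**2
    if s_t >= 0.0:
        idx["matrix_tension"] = (s_t / YT) ** 2 + shear_terms
    else:
        idx["matrix_compression"] = (((YC / (2.0 * S23)) ** 2 - 1.0) * s_t / YC
                                     + (s_t / (2.0 * S23)) ** 2 + shear_terms)
    return idx

# Illustrative ply strengths (MPa) and a trial stress state; placeholder values only.
indices = hashin_3d(stress=(1200.0, 30.0, 5.0, 40.0, 10.0, 15.0),
                    XT=2000.0, XC=1200.0, YT=60.0, YC=200.0, S12=90.0, S23=60.0)
for mode, value in indices.items():
    print(f"{mode}: {value:.2f}")
```

    In a progressive damage analysis, each mode whose index reaches 1 triggers the corresponding stiffness degradation (sudden or gradual), after which the stresses are recomputed and the criteria are re-evaluated in the next load increment.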

    Comparison of new memory surface hardening models for prediction of high cyclic loading (Comparaison de nouveaux modèles de surface de mémoire à durcissement pour la prévision de fortes charges cycliques)

    This paper presents an objective comparison between two recent constitutive models employing the concept of the hardening memory surface to predict the high cyclic loading behaviour of granular soils. The hardening memory surface is applied to the well-known Severn-Trent Sand and SANISAND04 constitutive models. While the addition of the new model surface (the memory surface) leads to enhanced model capabilities, slight differences in the implementation can lead to different model performances and simulations. This paper describes the differences between the two implementations and highlights the most relevant modelling ingredients for predicting particular features of cyclic soil behaviour. It will help the reader select the most suitable model and related ingredients for a particular geotechnical application.

    Exploring Multicore Architectures For Streaming Applications

    The Smith-Waterman algorithm is used to perform local alignment on biological sequences by calculating a similarity matrix. This process is computation-intensive: due to the nature of the dependencies in the algorithm, only the elements along the minor diagonal of the matrix can be calculated in parallel. In the past, CPUs, GPUs and FPGAs have been used to implement the Smith-Waterman algorithm. While GPUs offer better performance than FPGAs and are easier to program, they have higher power consumption. FPGA implementations typically employ systolic arrays, which consist of processing elements connected in a regular manner through which data is streamed; custom designing processing elements for an FPGA implementation entails a lot of effort. In this thesis, we investigate alternative architectures that provide performance with a lower power profile and ease of programmability. We design a systolic array architecture with general purpose processors and map the Smith-Waterman algorithm onto it. The design of the systolic array includes scratchpad memories to store intermediate data. Since employing multiple processors is a common method to extract more performance nowadays, we compare our architecture with a multicore architecture. Simulation results show that the systolic array architecture promises more speedup than the multicore architecture, achieving a performance of up to 1.5 MCUPS for 16 processing elements, which is 4x faster than a 16-processor multicore architecture. Moreover, the performance of the systolic array architecture scales better with an increasing number of processors than that of the multicore architecture. Mapping the Smith-Waterman algorithm to the systolic array architecture was possible with only 100 lines of code, programmed within 2 person-weeks in C, which is a standard, familiar language. Our experience with mapping the algorithm onto the systolic array architecture shows that it could result in a CUDA-like programming paradigm.
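
    For reference, the recurrence that the processing elements evaluate is summarized below. The sketch is a plain, single-threaded Python illustration with a linear gap penalty and an assumed scoring scheme (match +2, mismatch -1, gap -2); it is not the thesis's C implementation, but it makes the dependency structure explicit: H[i][j] depends only on H[i-1][j-1], H[i-1][j] and H[i][j-1], so all cells on the same anti-diagonal are independent and can be computed in parallel.

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Fill the Smith-Waterman similarity matrix and return its maximum score.

    The scoring scheme is an illustrative assumption (linear gap penalty).
    """
    n, m = len(a), len(b)
    H = [[0] * (m + 1) for _ in range(n + 1)]
    best = 0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            H[i][j] = max(0,                    # local alignment score never goes negative
                          H[i - 1][j - 1] + s,  # match / mismatch
                          H[i - 1][j] + gap,    # gap in sequence b
                          H[i][j - 1] + gap)    # gap in sequence a
            best = max(best, H[i][j])
    return best

print(smith_waterman("ACACACTA", "AGCACACA"))   # small example pair of sequences
```

    Cells (i, j) with i + j constant form one anti-diagonal; a systolic array streams one sequence past the other so that each processing element updates one column (or row) and the wavefront of ready cells advances by one anti-diagonal per step.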

    ENETOSH Standard of Competence for Instructors and Trainers in Safety and Health

    The ENETOSH Standard of Competence for Instructors and Trainers in Safety and Health was developed as part of a project funded by the European Commission (LEONARDO DA VINCI, 146 253, 10/2005 - 09/2007). The aim of the project was to set up a "European Network Education and Training in Occupational Safety and Health" (ENETOSH). The development of the standard of competence was one of the work packages of the project, in which 13 partners from 10 countries participated. The project was coordinated by the Institute for Work and Health (IAG), part of the German Social Accident Insurance (DGUV). ENETOSH offers a Europe-wide and international platform via which knowledge and experience in the area of education and training in safety and health can be shared in a systematic manner. It includes, for example, a database with almost 600 examples of good practice from 38 countries, plus international forums for discussion among participants. The network is aimed at all staff engaged in education and training at all levels of the educational system. In 2011, the ENETOSH network consisted of 57 members and partners from 23 European countries plus South Korea and the USA. The IAG remains responsible for its coordination.

    aEEG analog front-end IC for neonatal brain development monitoring

    Every year, the number of prematurely born infants grows. The most underdeveloped organ after birth is the brain. Its monitoring is therefore very important, especially as it can provide indications about the future health state, both short and long term. A non-invasive method of brain monitoring is EEG recording. However, the tedious system set-up and the short recording time discourage doctors from using it in daily care. A good alternative is aEEG measurement: the number of electrodes is reduced to 5, including the ground node, and signal interpretation is much easier. The main advantage of the system is the high correlation between aEEG readouts and the raw EEG signal. Although aEEG is already well known and accepted in neonatology, it is still not used to monitor every patient. The problem is the high price of a device, starting from 30000 euro, and as a result a hospital is not able to provide proper monitoring for each and every patient. For this reason, the main task of this thesis is to propose a cheaper version of such a system.

    The system can be divided into two parts: the analog front end and the digital part providing signal processing. There are plenty of cheap development boards that can be used as the digital part, so the focus of this project is on the analog front end. In order to propose a cheap design, minimal requirements have to be specified, for which two tests were performed. The first test was to identify the interferences disturbing an aEEG recording in the worst-case scenario, when the patient is inside an incubator and all monitoring devices are turned on. For this purpose, a phantom mimicking a neonate's body conductivity and size was constructed. The results showed that a differential method of obtaining the signal is necessary for biosignals whose amplitude is at the µV level. The only registered interference was a 50 Hz spike coming from the mains. The noise floor peak-to-peak amplitude was measured at the 1 µV level, while the magnitude of the 50 Hz spike was at the level of 9 µV with the devices turned off and 25 µV with the devices turned on. The phantom location also has a big influence on the recording: if the phantom was in the direct neighbourhood of devices connected to the power line, the 50 Hz spike rose to the 90 µV level and several harmonics of the main tone appeared; by putting the phantom 2 meters away, the main tone's magnitude dropped to the 2 µV level and all of the harmonics were gone. The second test was a resolution test: a real raw EEG signal was applied to a Simulink model of an analog-to-digital converter in order to identify the minimal number of bits required for proper signal acquisition. The tests showed that, in order to keep the number of bits low, amplification of the signal is required: for the signal applied directly to the ADC, 18-bit resolution was needed for proper signal acquisition, while amplification by a factor of 1000 allowed this to be reduced to 7 bits.

    The proposed system consists of an amplifying stage realising 60 dB of gain with high-pass filtering, followed by an ADC. The amplifying stage is realised by an amplifier providing 35 dB of gain with filtering below 2 Hz and a second amplifier realising 25 dB of gain. The ADC is implemented as a continuous-time second-order Sigma-Delta modulator. The proposed system was designed in a 0.18 µm CMOS (h18a6am) technology. Tests of the full system showed an SNR no lower than 51 dB and a power consumption of 217.5 µW. The input stage has a CMRR of 113 dB and an input impedance above 2.25 GΩ over the 2-15 Hz bandwidth. The system reliability was checked with corner analysis and over a wide range of temperatures; the results showed small variations of the SNR.
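
    The relation between front-end gain and required ADC resolution reported above (18 bits without amplification versus 7 bits after a gain of 1000) can be made concrete with a simple dynamic-range calculation: a gain of 1000 is roughly 2^10, so it removes about 10 bits of required resolution. The sketch below illustrates that reasoning; the full-scale voltage and the smallest signal step to be resolved are assumed example values, not figures from the thesis.

```python
import math

def required_adc_bits(full_scale_v, smallest_step_v):
    """Bits needed so that one LSB is no larger than the smallest step to resolve."""
    return math.ceil(math.log2(full_scale_v / smallest_step_v))

full_scale = 1.8            # assumed ADC full-scale voltage (V)
smallest_step = 7.1e-6      # assumed smallest EEG feature to resolve at the electrode (V)

bits_no_gain = required_adc_bits(full_scale, smallest_step)
gain = 1000.0               # front-end amplification, as in the abstract
bits_with_gain = required_adc_bits(full_scale, smallest_step * gain)

print(f"without amplification: {bits_no_gain} bits")   # 18 bits with these assumed values
print(f"with a gain of 1000:   {bits_with_gain} bits") # 8 bits here; the thesis reports 7,
                                                       # the exact figure depends on the ranges
```

    The saving is essentially log2(gain) ≈ 10 bits; the remaining difference comes from the assumed full-scale range and resolution target.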

    The other city - designing a serious game for crisis training in close protection

    Effective training methods are key to successful crisis management in close protection. This paper discusses the outcomes of a project on the development of a serious game, a virtual training environment for close protection. The aims of the game are to improve situational awareness and communication skills at the individual and team level. Two game designs, developed with two different game engines, are presented and discussed in relation to the project's objectives. Comparison of the two designs shows that several trade-offs are encountered when developing a training game with the available technology. Technological features of the game engines, and differences in the time invested in developing different aspects of the games, mean that the two designs meet different project objectives. Simultaneously reaching all project objectives in a single design seems impossible with the two game engines. This paper discusses the different trade-offs that were encountered in the project and presents the major challenges that lie ahead.