    Changing Effects of Monetary Policy in the U.S.: Evidence from a Time-Varying Coefficient VAR

    We estimate a time-varying coefficient VAR model for the U.S. economy to analyse (i) whether the effect of monetary policy on output has changed systematically over time, and (ii) whether monetary policy has asymmetric effects over the business cycle. We find that the impact of monetary policy shocks has been gradually declining over the sample period (1962-2002), as some theories of the monetary transmission mechanism imply. In addition, our results indicate that the effects of monetary policy are greater in a recession than in a boom.
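    A rough intuition for time-varying coefficients can be given with a rolling-window VAR(1) estimate. The Python sketch below stands in for the Bayesian or Kalman-filter estimation such papers typically use; the data, window size, and all variable names are assumptions for illustration only.

```python
import numpy as np

def rolling_var1(y, window):
    """Estimate a VAR(1) y_t = c + A y_{t-1} + e_t on a sliding window,
    returning one coefficient matrix A per window position."""
    T, k = y.shape
    coeffs = []
    for start in range(T - window):
        Y = y[start + 1 : start + window]             # left-hand side
        X = y[start : start + window - 1]             # first lag
        X = np.hstack([np.ones((X.shape[0], 1)), X])  # add intercept
        B, *_ = np.linalg.lstsq(X, Y, rcond=None)     # OLS, shape (k+1, k)
        coeffs.append(B[1:].T)                        # A matrix, (k, k)
    return np.array(coeffs)

# Synthetic two-variable system whose "policy response" coefficient decays,
# mimicking a gradually declining impact of shocks over the sample
rng = np.random.default_rng(0)
T = 400
y = np.zeros((T, 2))
for t in range(1, T):
    a = 0.8 * (1 - t / T)                 # impact shrinking over time
    A = np.array([[0.5, 0.0], [a, 0.3]])
    y[t] = A @ y[t - 1] + rng.normal(scale=0.1, size=2)

A_t = rolling_var1(y, window=80)
print(A_t[0][1, 0], A_t[-1][1, 0])  # estimated impact early vs. late in the sample
```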

    Theoretical and practical capabilities of probabilistic data fusion with Bayesian networks

    The increasing demand for high-quality data in the context of intelligent transportation systems increasingly motivates the use of data fusion methods to derive as much information as possible from existing sensors and sensor technologies. This paper discusses the application of simple Bayesian networks to this task, with a particular focus on fusing traffic state measurements. Theoretical aspects such as model calibration and the statistical quality of the results are discussed in a mathematically exact way. Moreover, a real-world example based on floating car data from two independent vehicle fleets is described in order to evaluate the Bayesian approach from a practical perspective as well. Opportunities and limitations of the presented model, also with regard to possible modifications, are critically discussed.
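    As a minimal sketch of the kind of fusion a simple Bayesian network performs, the following snippet combines two independent sensor reports about a discrete traffic state by naive-Bayes updating; the states, prior, and confusion matrices are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Discrete traffic states and an assumed prior over them
states = ["free", "dense", "congested"]
prior = np.array([0.6, 0.3, 0.1])

# P(reported state | true state) for two independent sensors (rows: true state).
# These confusion matrices are illustrative, not calibrated values.
lik_fcd  = np.array([[0.80, 0.15, 0.05],
                     [0.20, 0.60, 0.20],
                     [0.05, 0.25, 0.70]])
lik_loop = np.array([[0.70, 0.20, 0.10],
                     [0.25, 0.50, 0.25],
                     [0.10, 0.20, 0.70]])

def fuse(prior, reports):
    """Naive-Bayes fusion: multiply the prior by each sensor's likelihood
    column for its reported state, then renormalise."""
    post = prior.copy()
    for lik, reported in reports:
        post *= lik[:, states.index(reported)]
    return post / post.sum()

# Both fleets report "congested" -> the posterior sharpens on that state
print(fuse(prior, [(lik_fcd, "congested"), (lik_loop, "congested")]))
```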

    Serious Games for Software Refactoring

    Software design issues can severely impede software development and maintenance. Thus, it is important for the success of software projects that developers are aware of bad smells in code artifacts and improve their skills to reduce these issues via refactoring. However, software refactoring is a complex activity and involves multiple tasks and aspects. Therefore, imparting competences for identifying bad smells and refactoring code efficiently is challenging for software engineering education and training. The approaches proposed for teaching software refactoring in recent years mostly concentrate on small and artificial tasks and fall short in terms of higher-level competences, such as analysis and evaluation. In this paper, we investigate the possibilities and challenges of designing serious games for software refactoring on real-world code artifacts. In particular, we propose a game design where students can compete either against a predefined benchmark (technical debt) or against each other. In addition, we describe a lightweight architecture as the technical foundation for the game design that integrates pre-existing analysis tools such as test frameworks and software-quality analyzers. Finally, we provide an exemplary game scenario to illustrate the application of serious games in a learning setting.
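    One conceivable building block of such an architecture is a scoring function that compares analyzer output before and after a student's refactoring, only awarding points when the test suite still passes. The sketch below is a hypothetical illustration with made-up metric names and weights, not the paper's actual design.

```python
from dataclasses import dataclass

@dataclass
class AnalysisResult:
    """Output collected from the integrated tools (field names are illustrative)."""
    tests_passed: bool    # from the test framework
    smell_count: int      # from the software-quality analyzer
    debt_minutes: float   # estimated technical debt

def score(before: AnalysisResult, after: AnalysisResult) -> float:
    """A refactoring only counts if behaviour is preserved (tests pass);
    the score rewards smells removed and technical debt paid down."""
    if not after.tests_passed:
        return 0.0
    smell_gain = max(0, before.smell_count - after.smell_count)
    debt_gain = max(0.0, before.debt_minutes - after.debt_minutes)
    return 10 * smell_gain + debt_gain  # weights are arbitrary for the sketch

baseline = AnalysisResult(True, smell_count=12, debt_minutes=95.0)  # the benchmark
student  = AnalysisResult(True, smell_count=7,  debt_minutes=60.0)  # after refactoring
print(score(baseline, student))  # 85.0
```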

    Integration of the Ordinance on Industrial Safety and Health (Betriebssicherheitsverordnung, BetrSichV) into the Production and Preventive Maintenance of Electrical Working Appliances and Medical Products

    Starting at the end of 2004, selected companies were closely monitored and interviewed with a special view to the Ordinance on Industrial Safety and Health (Betriebssicherheitsverordnung, BetrSichV) and the Labour Protection Law (ArbSchG). The collected results were significant, yet lacked scientific support. It could be shown that two theories had to be merged: a) the influence of internal risks on hazards, and b) the interaction between maintenance and risk assessment. The theory of risk assessment of internal risks is based on the simultaneous judgement of workplace, material and work tools. Legal requirements had to be cast into applicable, clearly traceable rules for the user, with the goal of arriving at a combined risk assessment. Such a procedure was developed and tested. A theory of bidirectional interaction between maintenance and risk assessment was also drafted and confirmed by experiment, together with research on potential control parameters and components, which were identified and investigated. Plenty of literature deals with simple relations between maintenance and risk assessment, yet it offers little beyond the common view that preventive maintenance is a positive criterion in risk calculations; the existing literature does not consider how risk assessment could be used as a control instrument to improve internal processes, in particular preventive maintenance. Both theories were successfully applied in the companies and institutions coached through the test period. Proven results were significant cost savings and not a single notable accident at work during the entire three-year test period. Based on the concept of combined risk assessment, a specification sheet was developed, which in turn is the foundation for software presently under development.
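    As a loose illustration of a combined risk assessment that considers workplace, material and work tools simultaneously, the following sketch scores hazards with a simple risk matrix; the data structure, scales, and values are hypothetical, not the thesis's actual procedure.

```python
from dataclasses import dataclass

@dataclass
class Hazard:
    """One hazard entry in a combined assessment; the three sources mirror
    the simultaneous view of workplace, material and work tool."""
    source: str      # "workplace", "material" or "work tool"
    severity: int    # 1 (minor) .. 5 (fatal)
    likelihood: int  # 1 (rare) .. 5 (frequent)

def combined_assessment(hazards):
    """Risk-matrix score per hazard plus the overall maximum, so that
    preventive maintenance intervals can be steered by the worst entry."""
    scores = {h.source: h.severity * h.likelihood for h in hazards}
    return scores, max(scores.values())

entries = [Hazard("workplace", 2, 3),
           Hazard("material", 4, 2),
           Hazard("work tool", 3, 4)]
print(combined_assessment(entries))  # per-source scores and overall risk
```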

    Privacy in Voice-over-IP: mitigating the risks at SIP intermediaries

    Telephony plays a fundamental role in our society. It enables remote parties to interact and express themselves over great distances. The telephone as a means of communicating has become part of everyday life. Organisations and industry are now looking at Voice over IP (VoIP) technologies. They want to take advantage of new and previously unavailable voice services. Various interested parties are seeking to leverage the emerging VoIP technology for more flexible and efficient communication between staff, clients and partners. VoIP is a recent innovation enabled by Next Generation Networks (NGN). It provides a means of communication over a digital network, specifically the Internet. VoIP is gaining widespread adoption and will ultimately replace traditional telephony. The result of this trend is a ubiquitous, global and digital communication infrastructure. VoIP, however, still faces many challenges. It is not yet as reliable and dependable as the current Public Switched Telephone Network (PSTN). The employed communication protocols are immature, with many security flaws and weaknesses. The Session Initiation Protocol (SIP), a popular VoIP protocol, does not sufficiently protect a user's privacy. A user's information is neither encrypted nor secured when calling a remote party, and there is a lack of control over the information included in the SIP messages. Our specific concern is that private and sensitive information is exchanged over the public Internet. This dissertation concerns itself with the communication path chosen by SIP when establishing a session with a remote party. In SIP, VoIP calls are established over unknown and untrusted intermediaries to reach the desired party. We analyse the SIP headers to determine the information leakage at each chosen intermediary. The findings confirmed our concerns about a possible breach of privacy when using SIP. A user's privacy can be compromised through the extraction of explicit private details reflected in SIP headers. It is further possible to profile the user and determine communication habits from implicit time, location and device information. Our research proposes enhancements to SIP. Each intermediary must digitally sign over the SIP headers, ensuring that the communication path has not been altered. These signatures are added sequentially, creating a chain of certified intermediaries. Our enhancements to SIP do not seek to encrypt the headers, but to use these intermediary signatures to reduce the risk of information leakage. We created a model of our proposed enhancements for attaching signatures at each intermediary. The model also provides a means of identifying unknown or malicious intermediaries prior to establishing a SIP session. Finally, the model was specified in Z notation. The Z specification language was well suited to accurately and precisely represent our model. This formal notation was adopted to specify the types, states and model behaviour. The specification was validated using the Z type-checker ZTC. Dissertation (MSc), University of Pretoria, 2010.
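    The chaining idea can be sketched as follows: each intermediary signs over the headers together with the previous hop's signature, so any later alteration of the path or of the headers invalidates the chain. The snippet uses HMAC as a dependency-free stand-in for the asymmetric digital signatures the dissertation proposes; the keys and header fields are illustrative.

```python
import hmac
import hashlib

def sign_hop(prev_signature: bytes, headers: dict, hop_key: bytes) -> bytes:
    """Each intermediary signs over the SIP headers *and* the previous hop's
    signature, chaining the hops together.  HMAC stands in for a real
    asymmetric signature to keep the sketch dependency-free."""
    canonical = "\n".join(f"{k}: {v}" for k, v in sorted(headers.items()))
    return hmac.new(hop_key, prev_signature + canonical.encode(), hashlib.sha256).digest()

def verify_chain(headers, hop_keys, signatures):
    """Recompute the chain; any altered header or inserted hop breaks it."""
    prev = b""
    for key, sig in zip(hop_keys, signatures):
        prev = sign_hop(prev, headers, key)
        if not hmac.compare_digest(prev, sig):
            return False
    return True

headers = {"From": "alice@a.example", "To": "bob@b.example", "Call-ID": "42"}
hop_keys = [b"proxy-1-key", b"proxy-2-key"]  # illustrative per-hop keys

# Build the chain as the request traverses each intermediary
sigs, prev = [], b""
for key in hop_keys:
    prev = sign_hop(prev, headers, key)
    sigs.append(prev)

print(verify_chain(headers, hop_keys, sigs))  # True
headers["To"] = "eve@evil.example"            # tamper with a header
print(verify_chain(headers, hop_keys, sigs))  # False
```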

    Queue length estimation at traffic signals using floating car data

    Traffic and mobility are important parts of our daily lives. At the same time, people and the environment suffer from their negative effects. As for road traffic in urban areas, the problems are quite obvious. That is why national and local authorities are (re-)discovering traffic management as an integral component of traffic planning and control. Efficient traffic management needs reliable traffic information. Despite that, urban traffic monitoring is still underdeveloped due to financial and technological constraints: because of complex traffic dynamics and close-meshed road networks in urban areas, it is hardly possible to obtain good area-wide traffic information based on conventional detection alone. For this reason, the presented work develops an alternative method based on so-called floating car data (FCD), which can also incorporate additional data sources in a flexible way to enhance the quality of the results. In its current version, it provides a cost-efficient, extensible way to estimate queue lengths at signalized intersections. The basic idea is to compare observed GPS positions of individual vehicles with suitable model-based profiles of the local traffic density. This allows the estimation of a virtual traffic demand, which is then consistently transformed into the desired traffic state variable using a simple traffic flow model. The theory of the new approach is described in detail. Moreover, extensive analyses based on simulated and real traffic data systematically evaluate the potential of the new approach. All in all, it is shown that promising results can be obtained even at very low FCD penetration rates, both in simulation and under real-life conditions.
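    A drastically simplified version of the estimation idea might look like the following: probe-vehicle stop positions bound the queue from below, while the penetration rate scales the observed sample up to a virtual demand. The spacing value and the fusion rule are assumptions for illustration, not the thesis's actual model.

```python
import numpy as np

def estimate_queue_length(stop_positions_m, penetration, spacing_m=7.0):
    """Very simplified queue-length estimator: observed probe-vehicle stop
    positions (metres upstream of the stop line) are combined with the
    fleet's penetration rate.  Two cues are fused:
      * the farthest observed stop is a lower bound on the queue end;
      * n probes at penetration rate p suggest roughly n / p queued
        vehicles, i.e. a queue of about (n / p) * spacing metres."""
    stops = np.asarray(stop_positions_m, dtype=float)
    if stops.size == 0:
        return 0.0
    lower_bound = stops.max() + spacing_m           # queue ends behind the last probe
    demand_based = stops.size / penetration * spacing_m
    return max(lower_bound, demand_based)

# Three probes stopped at 12 m, 33 m and 54 m, with 5 % fleet penetration
print(estimate_queue_length([12, 33, 54], penetration=0.05))  # ~420 m
```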

    A probabilistic framework for traffic data quality

    Regarding the assessment of traffic data in ITS, there is an increasing need for answers to the following questions: (i) What exactly is "traffic data quality"? And, related to that, (ii) there are many ways to define and measure it, and the results of different researchers are often inconsistent or not comparable; how can this situation be overcome? Against this background, an important aim of the ongoing DLR project I.MoVe is to develop a consistent understanding of traffic data quality, together with a unified framework for its assessment. To this end, this paper provides a probabilistic framework for traffic data quality. Real-world examples from I.MoVe demonstrate its application to the assessment of data sources such as induction loops, stationary Bluetooth sensors and floating car data (FCD). A first important point is to distinguish strictly between quality indices, quality requirements, and quality itself. While the present framework develops quality indices based on established quality criteria such as accuracy, completeness, validity, and coverage, the usual understanding of quality is extended to a probabilistic view. This also addresses the problem of information retrieval in the presence of vagueness and uncertainty. The provided examples make full use of the proposed framework and also constitute interesting results for practitioners in their own right. They include the assessment of induction loop count data, and the assessment of temporal coverage both for a stretch of road with stationary Bluetooth data and for the whole city of Berlin, Germany, with FCD.
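    One way to read "quality extended to a probabilistic view" is to report a quality index as a posterior distribution rather than a single ratio. The sketch below does this for a completeness index using a Beta posterior over the coverage probability; the formulation is an assumption for illustration, not necessarily the paper's.

```python
import numpy as np

def completeness_index(expected_slots, observed_slots, rng=None):
    """Completeness as a probabilistic quality index: the unknown probability
    q that a time slot is covered gets a Beta(1, 1) prior, updated with the
    observed hit/miss counts.  Reporting a credible interval instead of a
    single ratio expresses uncertainty about the index itself."""
    hits = len(observed_slots & expected_slots)
    misses = len(expected_slots) - hits
    rng = rng or np.random.default_rng(0)
    samples = rng.beta(1 + hits, 1 + misses, size=100_000)  # posterior draws
    return hits / len(expected_slots), np.quantile(samples, [0.025, 0.975])

# Hourly slots of one day; the sensor delivered data in 18 of the 24 slots
expected = set(range(24))
observed = set(range(6, 24))
point, (lo, hi) = completeness_index(expected, observed)
print(f"completeness {point:.2f}, 95% credible interval [{lo:.2f}, {hi:.2f}]")
```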