
    Artificial intelligence (AI) methods in optical networks: A comprehensive survey

    Artificial intelligence (AI) is a broad scientific discipline that enables computer systems to solve problems by emulating complex biological processes such as learning, reasoning, and self-correction. This paper presents a comprehensive review of the application of AI techniques for improving the performance of optical communication systems and networks. The use of AI-based techniques is first studied in applications related to optical transmission, ranging from the characterization and operation of network components to performance monitoring, mitigation of nonlinearities, and quality of transmission estimation. Then, applications related to optical network control and management are reviewed, including topics such as optical network planning and operation in both transport and access networks. Finally, the paper presents a summary of opportunities and challenges in optical networking where AI is expected to play a key role in the near future. (Funding: Ministerio de Economía, Industria y Competitividad, projects EC2014-53071-C3-2-P and TEC2015-71932-REDT.)

    Attack Aware RWA for Sliding Window Scheduled Traffic Model

    In transparent optical networks (TONs), data signals remain in the optical domain along the entire transmission path. Their ability to handle high data rates, together with features such as transparency, makes TONs susceptible to several physical layer attacks, including high power jamming. Designing TONs capable of containing such attacks is therefore an important network security problem. In this work, we propose an integer linear programming (ILP) formulation to limit the propagation of these physical layer attacks in TONs for demands that require periodic bandwidth at predefined times. There are two approaches to handling such scheduled traffic demands: fixed window and sliding window. Our research addresses the sliding window scheduled traffic model, which is more flexible than the fixed window model because the start and end times of a demand are not fixed in advance but slide within a larger window. We therefore present an ILP that solves the routing and wavelength assignment (RWA) problem for the sliding window scheduled traffic model, with the objective of minimizing the attack radius over all commodities.
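The sliding-window idea above can be illustrated with a small sketch. The following is not the paper's ILP but a hypothetical greedy heuristic with made-up field names: a demand needs `duration` consecutive time slots anywhere inside its window, plus one wavelength free on every link of its path for those slots.

```python
# Hypothetical sketch of sliding-window demand placement (not the paper's ILP).
# A demand needs `duration` consecutive slots inside [win_start, win_end] and a
# single wavelength free on every link of its path during those slots.

def place_demand(demand, usage, num_wavelengths):
    """Return (start_slot, wavelength) for a feasible placement, else None.

    usage[(link, wavelength, slot)] is True if that resource is occupied.
    """
    path, duration = demand["path"], demand["duration"]
    for start in range(demand["win_start"], demand["win_end"] - duration + 1):
        slots = range(start, start + duration)
        for w in range(num_wavelengths):
            if all(not usage.get((link, w, t)) for link in path for t in slots):
                for link in path:          # reserve the chosen resources
                    for t in slots:
                        usage[(link, w, t)] = True
                return start, w
    return None  # blocked: no feasible start time / wavelength in the window

usage = {}
d1 = {"path": [("a", "b"), ("b", "c")], "duration": 2, "win_start": 0, "win_end": 4}
d2 = {"path": [("b", "c")], "duration": 3, "win_start": 0, "win_end": 4}
print(place_demand(d1, usage, num_wavelengths=2))  # (0, 0)
print(place_demand(d2, usage, num_wavelengths=2))  # (0, 1): wavelength 0 is taken on (b, c)
```

An ILP would instead search all placements jointly (with attack-radius terms in the objective); the greedy version only shows why the sliding window adds a start-time dimension to the usual RWA decision.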

    Gated Communities in Gurgaon: Caste and Class on the Urban Frontier

    Senior Project submitted to The Division of Social Studies of Bard College

    Low cost attitude control system scanwheel development

    To satisfy a growing demand for low cost attitude control systems for small spacecraft, development of a low cost scanning horizon sensor coupled to a low cost, low power consumption Reaction Wheel Assembly was initiated. This report details the versatile design resulting from that effort. Trade-off analyses for each of the major components are included, along with test data from an engineering prototype of the hardware.

    Voter Worldview and Presidential Candidate Choice.

    Research has shown a relationship between a strict father upbringing, defined by rules reinforcement and self-discipline beliefs, and high levels of social dominance orientation (SDO) and right-wing authoritarianism (RWA). The relationship between these variables and issue choice has been established, but no study has explored the connection between parental upbringing and moral foundations, and the connection to political candidate choice has not been shown. This study investigated the relationship between people's parental upbringing beliefs, their adult morality, and their rating of ideal presidential candidate characteristics. Based on moral foundations theory, a mixed methods study was conducted to examine the relationships among upbringing, moral foundations, RWA, SDO, socioeconomic status (SES), and candidate selection by surveying 221 adult participants recruited online and in the community. Linear regression analysis examined how levels of SDO, RWA, and the strict father variables predict the five moral foundations. Qualitative analysis of open-ended questions explored presidential candidate choice by rating people's preference for the five moral foundations, the strict father versus nurturing parent worldviews, SDO, RWA, and subjective SES, as expressed in their ideal president. Results indicated that upbringing is related to RWA for conservatives and inversely related to SDO for liberals. Participants also exhibited a rules reinforcement versus self-discipline left-right political dichotomy, and favored a president who is tough-minded on foreign affairs. This study's results will enable voters to understand how their political attitudes may be formed and how those attitudes could be scrutinized and manipulated by those with an interest in doing so.

    Anti-Muslim Backlash and Changing Political Ideologies. The Consequences of Perceived Threat from Islamist Terrorism

    On September 11th, 2001 the world changed when the USA was attacked by Islamist terrorists who killed more than 3000 people and destroyed several symbols of US-American power. These events demonstrated how substantial the threat of Islamist terrorism really was, causing fear and terror among millions of people in the Western world. A decade later, partly due to repeated terror attacks and attempted attacks around the world, this perception of threat is still present. My dissertation therefore deals with the consequences of this perceived terrorist threat: two studies focus on consequences for Muslims in the Western world, while another study analyzes the effect on individuals' political ideologies.

    A Fog Computing Approach for Cognitive, Reliable and Trusted Distributed Systems

    In the Internet of Things (IoT) era, a huge volume of data is generated and gathered every second from billions of connected devices. The current network paradigm, which relies on centralised data centres (i.e., cloud computing), is an impractical solution for IoT data storage and processing because of the long distance between data sources (e.g., sensors) and the designated data centres. It is worth noting that distance here refers both to the physical path and to the time interval between when data is generated and when it gets processed: by the time the data reaches a remote data centre, its value may already have depreciated. Network topologies have therefore evolved to permit data processing and storage at the edge of the network, introducing what is called fog computing, which improves quality of service by responding quickly and efficiently to a variety of data processing requests. Although fog computing is recognized as a promising computing paradigm, it faces challenging issues: i) concrete adoption and management of fogs for decentralized data processing; ii) resource allocation in both the cloud and fog layers; iii) sustaining performance, since a fog has limited capacity compared with the cloud; and iv) providing a secure and trusted networking environment in which fogs can share resources and exchange data securely and efficiently. Hence, this thesis focuses on achieving stable performance for fog nodes by enhancing resource management and allocation, along with safety procedures, to support IoT service delivery and cloud computing in the ever growing industry of smart things.
The main aspects of the performance stability of fog computing involve the development of cognitive fog nodes that aim to provide fast and reliable services, efficient resource management, and trusted networking, and hence ensure the best Quality of Experience, Quality of Service, and Quality of Protection to end users. The contribution of this thesis is, in brief, twofold: a novel Fog Resource manAgeMEnt Scheme (FRAMES), proposed to crystallise fog distribution and resource management with appropriate service load distribution and allocation based on Fog-2-Fog coordination; and a novel COMputIng Trust manageMENT (COMITMENT) scheme, a software-based approach responsible for providing a secure and trusted environment in which fog nodes can share their resources and exchange data packets. Both FRAMES and COMITMENT are encapsulated in the proposed Cognitive Fog (CF) computing paradigm, which aims to make fogs able not only to act on the data but also to interpret the gathered data in a way that mimics the process of cognition in the human mind. FRAMES provides CF with elastic resource management for load balancing and congestion resolution, while COMITMENT employs trust and recommendation models to avoid malicious fog nodes in the Fog-2-Fog coordination environment. In experiments conducted to verify their validity and performance, the proposed FRAMES and COMITMENT algorithms outperformed the competitive benchmark algorithms, namely Random Walks Offloading (RWO) and Nearest Fog Offloading (NFO). The experiments covered performance (in terms of latency), load balancing among fog nodes, and fog trustworthiness, along with the detection of malicious events and attacks in the Fog-2-Fog environment. The proposed FRAMES offloading algorithms achieved the lowest run-time (i.e., latency) against the benchmark algorithms (RWO and NFO) when processing an equal number of packets.
COMITMENT's algorithms were also able to classify collaboration requests as secure, malicious, or anonymous. The proposed work shows potential for achieving a sustainable fog networking paradigm and highlights significant benefits of fog computing in the computing ecosystem.
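The two benchmark policies named above, RWO and NFO, can be sketched in a few lines. The sketch below is illustrative only: the fog-node fields and the `least_loaded` rule (a stand-in for FRAMES-style load-aware offloading, not the thesis's actual algorithm) are assumptions.

```python
import random

# Illustrative offloading policies over a set of fog nodes (hypothetical
# fields `pos` and `load`; not the thesis's actual implementations).

def rwo(fogs, _source):
    """Random Walks Offloading: pick any fog node at random."""
    return random.choice(fogs)

def nfo(fogs, source):
    """Nearest Fog Offloading: pick the fog closest to the request source."""
    return min(fogs, key=lambda f: abs(f["pos"] - source["pos"]))

def least_loaded(fogs, _source):
    """Load-aware choice (FRAMES-like stand-in): pick the least loaded fog."""
    return min(fogs, key=lambda f: f["load"])

fogs = [{"id": 0, "pos": 1, "load": 9},
        {"id": 1, "pos": 5, "load": 2},
        {"id": 2, "pos": 8, "load": 5}]
source = {"pos": 2}
print(nfo(fogs, source)["id"])           # 0: nearest, but heavily loaded
print(least_loaded(fogs, source)["id"])  # 1: lightest queue
```

The toy example shows why distance-only offloading (NFO) can pile work onto an already busy node, which is the kind of congestion a load-aware scheme is meant to resolve.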

    An assessment of UK banking liquidity regulation and supervision

    This thesis assesses UK banking liquidity regulation and supervision and the Basel liquidity requirements, and models banks' liquidity risk. The study reveals that the FSA's risk-assessment framework before 2008 was too general, without specifically considering banks' liquidity risk (as well as its failures on Northern Rock), and lists the limitations of the FSA's banking liquidity regimes before 2008. The thesis reviews whether the FSA's new liquidity regimes after 2008 would have coped with UK banks' liquidity risks had they been applied properly. The fundamental changes in the FSA's liquidity supervision reflect three considerations. First, it introduces a systemic control requirement by measuring an individual firm's liquidity risk under a market-wide stress or a combination of idiosyncratic and market-wide stresses. Second, it emphasizes the monitoring of business model risks and the capability of senior managers. Third, it allows both internal and external managers to access more information by increasing the liquidity reporting frequencies. The thesis also comments on the Basel Liquidity Principles of 2008 and the two Liquidity Standards. The Principles of 2008 represent a substantial revision of the Principles of 2000 and reflect the lessons of the financial market turmoil since 2007. The study argues that the implementation of the sound principles by banks and supervisors should be flexible, but also consistent, to ensure that supervisors understand banks' liquidity positions well. The study also explains the composition of the Basel liquidity ratios as well as the side effects of the Basel liquidity standards; for example, they will reshape interbank deposit markets and bond markets as a result of the increased demand for 'liquid assets' and 'stable funding'.
This thesis uses quantitative balance sheet liquidity analysis, based upon modified versions of the BCBS (2010b) and Moody's (2001) models, to estimate eight UK banks' short- and long-term liquidity positions from 2005 to 2010. The study shows that only Barclays Bank remained liquid on a short-term basis throughout the sample period (2005-2010); HSBC Bank also proved liquid on a short-term basis, although not in 2008 and 2010. On a long-term basis, RBS has remained liquid since 2008 after receiving government support, while Santander UK also proved liquid, except in 2009. The other banks, especially Natwest, are shown to have faced challenging conditions, on both a short-term and a long-term basis, over the sample period. This thesis also uses the Exposure-Based Cash-Flow-at-Risk (CFaR) model to forecast UK banks' liquidity risk. Based on annual data over the period 1997 to 2010, the study predicts that by the end of 2011, the (102) UK banks' average CFaR at the 95% confidence level will be -£5.76 billion, Barclays Bank's (Barclays') CFaR will be -£0.34 billion, the Royal Bank of Scotland's (RBS's) CFaR will be -£40.29 billion, HSBC Bank's (HSBC's) CFaR will be £0.67 billion, Lloyds TSB Bank's (Lloyds TSB's) CFaR will be -£4.90 billion, National Westminster Bank's (Natwest's) CFaR will be -£10.38 billion, and Nationwide Building Society's (Nationwide's) CFaR will be -£0.72 billion. Moreover, Lloyds TSB and Natwest are associated with the largest risk, according to the biggest percentage difference between downside cash flow and expected cash flow (3600% and 816% respectively). Since I summarize a bank's liquidity risk exposure in a single number (CFaR), the maximum shortfall given the targeted probability level, it can be compared directly with the bank's risk tolerance and used to guide corporate risk management decisions.
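The CFaR idea, a single number summarizing the maximum cash-flow shortfall at a targeted probability level, can be shown with a toy Monte-Carlo sketch. This is not the thesis's Exposure-Based model (which fits cash flows to risk exposures via regression), and all figures below are invented; sign conventions for reporting CFaR also vary.

```python
import random
import statistics

# Toy Cash-Flow-at-Risk: shortfall of the worst cash flow not exceeded with
# the given confidence, relative to the expected cash flow. Illustrative
# Monte-Carlo version, not the Exposure-Based CFaR model from the thesis.

def cfar(cash_flows, confidence=0.95):
    flows = sorted(cash_flows)
    expected = statistics.mean(flows)
    # Index of the (1 - confidence) quantile, e.g. the 5th percentile at 95%
    idx = int((1 - confidence) * len(flows))
    downside = flows[idx]
    return expected - downside  # maximum shortfall vs. expectation

random.seed(0)
# Simulated annual cash flows (made-up units): mean 1.0, volatility 3.0
sims = [random.gauss(1.0, 3.0) for _ in range(10_000)]
print(round(cfar(sims), 2))  # roughly 1.645 * 3.0, i.e. about 4.9
```

Because the result is a single shortfall number at a stated confidence level, it can be held up directly against a risk-tolerance threshold, which is exactly the use the passage above describes.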
Finally, this thesis estimates the long-term United Kingdom economic impact of the Basel III capital and liquidity requirements. Using quarterly data over the period 1997:q1 to 2010:q2, the study employs a non-linear-in-factor probit model to show that increases in bank capital and liquidity would significantly reduce the probability of a bank crisis. The study estimates the long-run cost of the Basel III requirements with a Vector Error Correction Model (VECM), which shows that holding higher capital and liquidity would reduce output by a small amount but increase bank profitability in the long run. The maximum temporary net benefit and permanent net benefit are shown to be 1.284% and 35.484% of pre-crisis GDP respectively when the tangible common equity ratio stays at 10%. Assuming all UK banks also meet the Basel III long-term liquidity requirements, the temporary net benefit and permanent net benefit will be 0.347% and 14.318% of pre-crisis GDP respectively. Therefore, the results suggest that, in terms of the impact on output, there is considerable room to tighten capital and liquidity requirements further while still providing positive effects for the United Kingdom economy.