An Event System Architecture for Scaling Scale-Resistant Services
Large organizations are deploying ever-increasing numbers of networked compute devices, from utilities installing smart controllers on electricity distribution cables, to the military giving PDAs to soldiers, to corporations putting PCs on the desks of employees. These computers are often far more capable than is needed to accomplish their primary task, whether it be guarding a circuit breaker, displaying a map, or running a word processor. These devices would be far more useful if they had some awareness of the world around them: a controller that resists tripping a switch, knowing that it would set off a cascade failure; a PDA that warns its owner of imminent danger; a PC that exchanges reports of suspicious network activity with its peers to identify stealthy computer crackers. In order to provide these higher-level services, the devices need a model of their environment. The controller needs a model of the distribution grid, the PDA needs a model of the battlespace, and the PC needs a model of the network and of normal network and user behavior. Unfortunately, not only might models such as these require substantial computational resources, but generating and updating them is even more demanding. Model-building algorithms tend to be bad in three ways: requiring large amounts of CPU and memory to run, needing large amounts of data from the outside to stay up to date, and running so slowly that they can't keep up with any fast changes in the environment that might occur. We can solve these problems by reducing the scope of the model to the immediate locale of the device, since reducing the size of the model makes the problem of model generation much more tractable. But such models are also much less useful, having no knowledge of the wider system. This thesis proposes a better solution to this problem called Level of Detail, after the computer graphics technique of the same name.
Instead of simplifying the representation of distant objects, however, we simplify less-important data. Compute devices in the system receive streams of data that are a mixture of detailed data from devices that directly affect them and data summaries (aggregated data) from less directly influential devices. The degree to which the data is aggregated (i.e., how much it is reduced) is determined by calculating an influence metric between the target device and the remote device. The smart controller thus receives a continuous stream of raw data from the adjacent transformer, but only an occasional small status report summarizing all the equipment in a neighborhood in another part of the city. This thesis describes the data distribution system, the aggregation functions, and the influence metrics that can be used to implement such a system. I also describe my current progress towards establishing a test environment and validating the concepts, and describe the next steps in the research plan.
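The influence-driven choice of aggregation level can be sketched roughly as follows. The metric formula, thresholds, and detail levels below are illustrative assumptions for exposition, not the thesis's actual definitions:

```python
def influence(distance_hops, data_rate_hz):
    """Toy influence metric: nearer, chattier devices matter more.
    (A hypothetical stand-in; the concrete metric is part of the research.)"""
    return data_rate_hz / (1 + distance_hops) ** 2

def detail_level(influence_score, thresholds=(10.0, 1.0)):
    """Map an influence score onto a level of detail for the data stream."""
    if influence_score >= thresholds[0]:
        return "raw"        # full, continuous data stream
    if influence_score >= thresholds[1]:
        return "summary"    # periodic aggregates
    return "status"         # occasional small status report

# An adjacent transformer vs. equipment across the city:
print(detail_level(influence(1, 100.0)))   # raw
print(detail_level(influence(20, 100.0)))  # status
```

As in graphics level-of-detail, the point is that fidelity degrades gracefully with decreasing influence rather than being cut off entirely.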
Response to the DECC Consultation on the siting process for a Geological Disposal Facility, 2013
Several members of SEG (Matt Gross, Phil Johnstone, Florian Kern, Gordon MacKerron, and Andy Stirling) have participated in a written response to the Department of Energy and Climate Change's (DECC) consultation on the siting process for a Geological Disposal Facility (GDF) for nuclear waste. This consultation follows Cumbria County Council's rejection earlier this year of hosting a Geological Disposal Facility. The government have therefore gone back to the national level to find a suitable location, and the issue remains a multifaceted and controversial one. Matt Gross and Phil Johnstone also represented SEG at the one-day consultation on the same issue run by DECC at Central Hall, Westminster, involving several round-table discussions with civil servants, nuclear regulators, and local politicians on the various issues surrounding the siting of a GDF.
Susceptibility Ranking of Electrical Feeders: A Case Study
Ranking problems arise in a wide range of real-world applications where an ordering on a set of examples is preferred to a classification model. These applications include collaborative filtering, information retrieval, and ranking the components of a system by susceptibility to failure. In this paper, we present an ongoing project to rank the feeder cables of a major metropolitan area's electrical grid according to their susceptibility to outages. We describe our framework and the application of machine learning ranking methods, using scores from Support Vector Machines (SVM), RankBoost and Martingale Boosting. Finally, we present our experimental results and the lessons learned from this challenging real-world application.
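The pairwise view of ranking that methods like RankSVM and RankBoost share can be illustrated with a minimal stand-in: learn a scoring function so that every failed feeder scores above every healthy one. The perceptron-style learner and the two feeder features (age, load) below are invented for illustration, not the paper's actual models or data:

```python
def train_pairwise(pairs, n_features, epochs=50, lr=0.1):
    """pairs: list of (x_pos, x_neg) where x_pos should rank above x_neg."""
    w = [0.0] * n_features
    for _ in range(epochs):
        for x_pos, x_neg in pairs:
            diff = [a - b for a, b in zip(x_pos, x_neg)]
            margin = sum(wi * di for wi, di in zip(w, diff))
            if margin <= 0:  # mis-ordered pair: nudge weights toward fixing it
                w = [wi + lr * di for wi, di in zip(w, diff)]
    return w

def score(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

# Toy feeders: (age in years, load fraction) - older, loaded feeders fail first.
failed  = [(30.0, 0.9), (25.0, 0.8)]
healthy = [(5.0, 0.3), (8.0, 0.4)]
pairs = [(f, h) for f in failed for h in healthy]
w = train_pairwise(pairs, n_features=2)
ranked = sorted(failed + healthy, key=lambda x: score(w, x), reverse=True)
```

The real systems optimize pairwise (or listwise) orderings with far more robust machinery, but the objective - get the susceptible items to the top of the list, not classify each in isolation - is the same.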
The Political Economy of Public Debt Management. Institutional Setting and Political Influence
This dissertation investigates the overarching research question of the relationship between politics and PDM. Given that public budgets ultimately collateralize the associated risks involved in actively managing public debt and using innovative financial instruments, the relationship between politics and modern PDM is of major relevance for legitimacy in democratic capitalism. As debt instrument selection is a crucial and strategic aspect of PDM to optimize debt portfolios, chapter 2 examines the question which economic and political factors have an impact on the use and extent of short-term debt instruments on the municipal level. Finding that the municipalities’ budgetary situation represents the key determining factor of their share of short-term debt, the results demonstrate the need for different approaches to grasp the phenomenon of PDM and ultimately raises the questions of whether or not politicians have influence on PDM at all. Consequently, chapter 3 investigates the institutional setting of DMOs on the national level by analyzing the relationship between DMOs and their respective parent ministry. The analysis focusses on the variation of autonomy across countries resulting out of this delegation process. The finding that DMOs have substantial autonomy in decision-making competencies, while especially DMOs separated from the core administration are subject of relatively low reporting obligations, naturally raises doubts concerning political control. Chapter 4 consequently addresses the question whether PDM is subject of parliamentary control. The results show that parliaments have relatively low control of PDM, which indicates a trade-off between expertise and control. Moreover, this chapter underlines the necessity to differentiate between budget and debt policy and subsequently between debt level and debt structure
Evaluating a possible new paradigm for recruitment dynamics: predicting poor recruitment for striped bass (Morone saxatilis) from an environmental variable
Understanding what causes large year classes and predicting them has been called the holy grail of fisheries science, one of the last great unanswered questions. Recruitment prediction, or forecasting, is an important component of setting fishery catch limits. We propose a new approach, called the “poor-recruitment paradigm”, for predicting recruitment using environmental variables. This approach hypothesizes that it is easier to predict poor recruitment than good recruitment because an environmental variable affects recruitment only when its value is extreme (lethal); otherwise, the variable may be benign and not influence recruitment. Thus, good recruitment requires that no environmental condition be harmful and that some be especially favorable; poor recruitment, however, requires only one environmental variable to be extreme. This idea was evaluated using recruitment and river discharge data for striped bass (Morone saxatilis) from seven major spawning tributaries of Chesapeake Bay. Low spring river discharge reliably resulted in poor recruitment of striped bass. Specifically, in all rivers, the median and standard deviation of recruitment were lower when spring river discharge was low than when it was average or high; additionally, the proportion of years with poor recruitment was higher in years of low discharge than in years of average to high discharge. The consistent predictability of poor recruitment has the potential to improve stock projections and, therefore, catch advice.
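The asymmetry the paradigm exploits can be sketched as a one-sided threshold rule: flag a year as likely-poor only when the tracked environmental variable falls in its low tail. The quantile cutoff and discharge values below are illustrative assumptions, not thresholds or data from the study:

```python
def low_threshold(values, q=0.25):
    """Empirical q-quantile via sorting (simple, stdlib-only sketch)."""
    s = sorted(values)
    idx = int(q * (len(s) - 1))
    return s[idx]

def predict_poor(spring_discharge, history):
    """Predict poor recruitment iff discharge falls in the low (extreme) tail.
    No prediction of *good* recruitment is attempted - that is the paradigm."""
    return spring_discharge <= low_threshold(history)

# Hypothetical historical spring discharges for one tributary:
history = [120, 95, 140, 80, 200, 110, 60, 150, 130, 100]
print(predict_poor(55, history))   # True: extreme low discharge
print(predict_poor(130, history))  # False: no poor-recruitment flag
```

Note the rule deliberately says nothing when discharge is average or high; a non-flagged year may still turn out poor for other reasons, which is why only the poor-recruitment side is claimed to be predictable.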
Visualizing the 'invisible'
The ability of scientists to image and manipulate matter at the (sub)atomic scale is a result of stunning advances in microscopy. Foremost amongst these was the invention of the scanning probe microscope, which, despite its classification as a microscope, does not rely on optics to generate images. Instead, images are produced via the interaction of an atomically sharp probe with a surface. Here the author considers to what extent those images represent an accurate picture of ‘reality’ at a size regime where quantum physics holds sway, and where the image data can be acquired and manipulated in a variety of ways
Silicate-analogous borosulfates featuring promising luminescence and frequency-doubling (SHG) properties based on a rich crystal chemistry
Our contribution addresses important features of the emerging compound class of silicate-analogous borosulfates: their rich crystal chemistry, their exciting optical properties and, of course, their syntheses – the chemistry behind them. Silicate-analogous materials comprise tetrahedral anionic basic building units that lack an inversion centre, which enhances the chance of non-centrosymmetric surroundings of metal ions and thus promotes excellent optical properties.
Since the very first characterization of crystalline borosulfates in 2012, over sixty members have been found. Therein, the reaction of boric and sulfuric acid yields supertetrahedral BX4 (X = SO4) moieties, giving rise to a rich crystal chemistry ranging from non-condensed [B(SO4)4]5– anions via band (see Fig.) and layer structures to anionic frameworks [B(SO4)2]–, which can be understood by principles well known from silicates (see Fig.). The selective synthesis of borosulfates can be challenging, but we have meanwhile found some basic principles that help to selectively synthesize new compounds as phase-pure samples. Great impact is ascribed to the nature of the boron source, the metal (salt) employed, and the amount of oleum added.
On the one hand, borosulfates feature a low coordination strength, which is beneficial for the luminescence and UV-Vis properties of compounds containing lanthanide and transition metal ions, such as Ce3+ (see Fig.), Eu3+, Tb3+ or Co2+ and Ni2+. On the other hand, borosulfates frequently adopt non-centrosymmetric structures yielding optical properties like SHG (second harmonic generation), which, in combination with large band gaps, makes them highly promising materials for frequency doubling in the high-energy regime. Ionic conductivity has also been observed recently.
Kinesthetics eXtreme: An External Infrastructure for Monitoring Distributed Legacy Systems
Autonomic computing - self-configuring, self-healing, self-optimizing applications, systems and networks - is widely believed to be a promising solution to ever-increasing system complexity and the spiraling costs of human system management as systems scale to global proportions. Most results to date, however, suggest ways to architect new software constructed from the ground up as autonomic systems, whereas in the real world organizations continue to use stovepipe legacy systems and/or build 'systems of systems' that draw from a gamut of new and legacy components involving disparate technologies from numerous vendors. Our goal is to retrofit autonomic computing onto such systems, externally, without any need to understand or modify the code, and in many cases even when it is impossible to recompile. We present a meta-architecture implemented as active middleware infrastructure to explicitly add autonomic services via an attached feedback loop that provides continual monitoring and, as needed, reconfiguration and/or repair. Our lightweight design and separation of concerns enables easy adoption of individual components, as well as the full infrastructure, for use with a large variety of legacy systems, new systems, and systems of systems. We summarize several experiments spanning multiple domains.
Retrofitting Autonomic Capabilities onto Legacy Systems
Autonomic computing - self-configuring, self-healing, self-optimizing applications, systems and networks - is a promising solution to ever-increasing system complexity and the spiraling costs of human management as systems scale to global proportions. Most results to date, however, suggest ways to architect new software constructed from the ground up as autonomic systems, whereas in the real world organizations continue to use stovepipe legacy systems and/or build 'systems of systems' that draw from a gamut of disparate technologies from numerous vendors. Our goal is to retrofit autonomic computing onto such systems, externally, without any need to understand, modify or even recompile the target system's code. We present an autonomic infrastructure that operates similarly to active middleware, to explicitly add autonomic services to pre-existing systems via continual monitoring and a feedback loop that performs, as needed, reconfiguration and/or repair. Our lightweight design and separation of concerns enables easy adoption of individual components, independent of the rest of the infrastructure, for use with a large variety of target systems. This work has been validated by several case studies spanning multiple application domains.
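The external feedback loop described above - probe the legacy system as a black box, analyze, and repair only when needed - can be sketched minimally as follows. The probe, health condition, and repair action here are hypothetical stand-ins, not the actual Kinesthetics eXtreme infrastructure or its API:

```python
class FeedbackLoop:
    """Externally attached monitor/repair loop: the target system is
    never modified, only observed and acted upon from outside."""

    def __init__(self, probe, is_healthy, repair):
        self.probe = probe            # gather a measurement externally
        self.is_healthy = is_healthy  # analyze the measurement
        self.repair = repair          # reconfigure/repair the target

    def tick(self):
        """One monitoring cycle; returns True if a repair was triggered."""
        measurement = self.probe()
        if not self.is_healthy(measurement):
            self.repair(measurement)
            return True
        return False

# Toy "legacy system": a service whose queue length we watch from outside.
state = {"queue": 150, "restarts": 0}

loop = FeedbackLoop(
    probe=lambda: state["queue"],
    is_healthy=lambda q: q < 100,
    repair=lambda q: (state.update(queue=0),
                      state.update(restarts=state["restarts"] + 1)),
)
loop.tick()  # queue too long: repair fires and drains it
```

The separation of concerns the abstract emphasizes is visible even here: probing, analysis, and effecting are independent pieces, so any one of them can be swapped per target system without touching the others.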