
    Assessing temporal scales and patterns in time series: Comparing methods based on redundancy analysis

    Time-series modelling techniques are powerful tools for studying the temporal scaling structures and dynamics present in ecological and other complex systems, and they are gaining popularity for assessing resilience quantitatively. Among other methods, canonical ordinations based on redundancy analysis are increasingly used to determine temporal scaling patterns inherent in ecological data. However, modelling outcomes, and thus inference about ecological dynamics and resilience, may vary depending on the approach used. In this study, we compare the statistical performance, logical consistency and information content of two approaches: (i) asymmetric eigenvector maps (AEM), which account for linear trends, and (ii) symmetric distance-based Moran's eigenvector maps (MEM), which require detrending of the raw data prior to analysis. Our comparison uses long-term (25-year) water quality data from three Swedish lakes, providing an opportunity to assess how the chosen modelling approach affects performance and inference in time-series modelling. We found that AEM models had consistently more explanatory power than MEM models, and in two out of three lakes AEM extracted one more temporal scale than MEM. The scale-specific patterns detected by AEM and MEM were uncorrelated, and the individual water quality variables explaining these patterns also differed between methods, suggesting that inferences about system dynamics depend on the modelling approach. These findings suggest that AEM may be more suitable than MEM for assessing dynamics in time series when temporal trends are relevant. The AEM approach is logically consistent with temporal autocorrelation, in which earlier conditions can influence later conditions but not vice versa. The symmetric MEM approach, which ignores the asymmetric nature of time, may be suitable for specific questions about correlations in fluctuation patterns where there are no confounding linear trends and no need to assess causality.
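    The abstract describes the two eigenfunction families only verbally. As a rough, hedged illustration, the numpy sketch below constructs both sets of temporal eigenfunctions for evenly spaced sampling dates; the truncation rule and tolerances are assumptions on my part, and published implementations exist in the R package adespatial.

```python
import numpy as np

def dbmem(n, thresh=1.0):
    """Symmetric distance-based MEM for n evenly spaced time points."""
    t = np.arange(n, dtype=float)
    D = np.abs(t[:, None] - t[None, :])        # pairwise temporal distances
    D = np.where(D > thresh, 4.0 * thresh, D)  # truncate beyond the threshold
    A = -0.5 * D ** 2                          # Gower transformation
    A = A - A.mean(0) - A.mean(1, keepdims=True) + A.mean()  # double-centre
    w, V = np.linalg.eigh(A)                   # eigenvalues in ascending order
    return V[:, w > 1e-9][:, ::-1]             # keep positive-eigenvalue axes

def aem(n):
    """Asymmetric eigenvector maps: point i is reached only by earlier links."""
    E = np.tril(np.ones((n, n - 1)), k=-1)     # time points x directed edges
    E = E - E.mean(0)                          # column-centre
    U, s, _ = np.linalg.svd(E, full_matrices=False)
    return U[:, s > 1e-9]                      # AEM eigenfunctions

mem_vars = dbmem(25)   # e.g. 25 yearly samples, as in the lake data
aem_vars = aem(25)
```

    Either set of eigenfunctions would then serve as explanatory variables in a redundancy analysis of the water quality data, which is where the two approaches' explanatory power diverges.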

    Proceedings of the Conference on Human and Economic Resources

    Recent developments in information technologies and telecommunications have given rise to an extraordinary increase in data transactions in financial markets. In large and transparent markets with lower transaction and information costs, financial participants react more rapidly to changes in the profitability of their assets and in their perception of the risks of different financial instruments. If rapidity of reaction is the main feature of globalized markets, then only advanced information technologies that use data resources efficiently can reflect the complex nature of financial markets. The aim of this paper is to show how new information technologies affect the modelling of financial markets and decisions using limited data resources within an intelligent system. Using intelligent information systems, mainly neural networks, the paper shows how limited economic data can be used for efficient economic decisions in global financial markets. Advances in microprocessors and software technologies make it possible to develop increasingly powerful systems at reasonable cost. These technologies have created artificial systems that imitate the human brain for efficient analysis of economic data. According to Hertz, Krogh and Palmer (1991), artificial neural networks, which have a structure similar to that of the brain, consist of nodes passing activation signals to each other. Each node combines the activation signals arriving from other nodes and may produce an activation signal of its own, modified by the connection weight between it and the node to which it is linked. Using financial data from international foreign exchange markets, namely daily time series of the EUR/USD parity, and employing several neural network algorithms, the paper shows that new information technologies offer advantages for the efficient use of limited economic data in modelling. By surveying "artificial" approaches to modelling international financial markets, it demonstrates how limited market information can support efficient economic decisions, and how artificial neural networks have been used for efficient economic modelling and decision-making in global F/X markets. New information technologies have many advantages over statistical methods for efficient data modelling: they can analyse complex patterns quickly and with a high degree of accuracy, and since "artificial" information systems make no assumptions about the distribution of the data, they are not biased in their analysis. Using different neural network algorithms, economic data can be modelled efficiently; especially when markets are non-linear and complex, intelligent systems are more powerful at explaining market behaviour in chaotic environments. With more advanced information technologies it may become possible to model far more of the complexity of economic life, and future research will require stronger interaction between economics and computer science. Keywords: neural networks, knowledge, information technology, communication technology
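    The paper's network architecture is described only in outline. As a hedged sketch of the general technique, the small feed-forward network below learns to predict the next daily return from lagged returns; the synthetic data, layer sizes and learning rate are illustrative assumptions, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0, 0.005, 1000)      # stand-in for daily EUR/USD returns
LAGS = 5
X = np.array([returns[i:i + LAGS] for i in range(len(returns) - LAGS)])
y = returns[LAGS:]

W1 = rng.normal(0, 0.1, (LAGS, 8)); b1 = np.zeros(8)   # one hidden layer
W2 = rng.normal(0, 0.1, 8);         b2 = 0.0

for epoch in range(200):                  # plain batch gradient descent
    h = np.tanh(X @ W1 + b1)              # hidden activations
    pred = h @ W2 + b2
    err = pred - y                        # squared-error gradient terms
    gW2 = h.T @ err / len(y); gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2) # backpropagate through tanh
    gW1 = X.T @ dh / len(y); gb1 = dh.mean(0)
    W1 -= 0.1 * gW1; b1 -= 0.1 * gb1
    W2 -= 0.1 * gW2; b2 -= 0.1 * gb2
```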

    Parametric BIM façade module development for diagrid twisted structures

    Building Information Modelling (BIM) is an epochal phenomenon in the AEC industry. Most developed countries have already adopted BIM through regulation, and the biggest projects around the world are executed with BIM, which provides a more effective project-management process. The whole construction process, including feasibility, design, construction and commissioning, becomes a digital visualisation, and any required analysis can be performed on this model template. In fact, demand for complex and sophisticated structures has been the main catalyst for the development of BIM, as designers increasingly conceive cutting-edge, limit-pushing structures. Twisted towers are a significant instance of such structures. This study addresses the façade design of twisted towers, which exhibits one of the most complex patterns; if the façade system of such towers can be organised and customised without limit, most façade systems can be solved for an optimum result. The study aims to present a method for easily developing alternative façade systems for construction projects, even those with very complex designs. One of the most complex structural types currently being constructed around the world, the twisted diagrid system, was selected, and BIM and computational programming were used to develop the module.
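    The module itself is not reproduced in the abstract. As a hedged sketch of the underlying parametric idea: each floor plate is the ground footprint rotated by a constant increment, and diagrid/façade nodes are read off the rotated corners. All dimensions and the twist rate below are assumed values.

```python
import numpy as np

FLOORS, HEIGHT, TWIST_DEG = 40, 160.0, 90.0     # assumed total twist
footprint = np.array([[15, 15], [-15, 15], [-15, -15], [15, -15]], float)

nodes = []
for k in range(FLOORS + 1):
    a = np.radians(TWIST_DEG * k / FLOORS)      # cumulative rotation at floor k
    R = np.array([[np.cos(a), -np.sin(a)],
                  [np.sin(a),  np.cos(a)]])
    z = HEIGHT * k / FLOORS
    for x, y in footprint @ R.T:                # rotate the footprint corners
        nodes.append((x, y, z))                 # candidate diagrid node
```

    In a BIM environment such node lists would drive parametric façade components, so generating alternative façade systems amounts to regenerating the nodes under different parameter values.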

    Anaerobic digestion: a prime solution for water, energy and food nexus challenges

    We solve the problem of identifying one or more optimal patterns of anaerobic digestion (AD) installation across the UK by considering existing installations, the current feedstock potential, and the projected growth of that potential with population, demography and urbanization. We test several scenarios for the level of adoption of AD operations in the community under varying amounts of feedstock supply, which may arise from changes in food waste or energy-crop generation driven by other policies and incentives. For the most resilient scales of solution, we demonstrate for the UK the net energy production from AD (biogas and electricity, and hence the avoided emissions from grid energy), the mass of bio-waste processed (and the landfill avoided), and the quantity of digestate produced (as a proxy for avoided irrigation and fertilizer production). To simulate AD innovation within the water-energy-food (WEF) nexus we use agent-based modelling (ABM), owing to its bottom-up approach and its capability for modelling complex systems with relatively low-level data and information.
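    The ABM itself is not specified in the abstract. As a loudly hypothetical sketch of the adoption dynamic described, each community agent below installs a digester once its local feedstock exceeds a viability threshold; the threshold, growth rate and biogas yield are invented for illustration only.

```python
import random

random.seed(1)
communities = [{"feedstock_t": random.uniform(0, 5000), "has_ad": False}
               for _ in range(100)]             # tonnes of bio-waste per year
VIABLE_T, BIOGAS_M3_PER_T = 2000, 120           # assumed viability and yield

for year in range(25):
    for c in communities:
        c["feedstock_t"] *= 1.01                # assumed feedstock growth
        if not c["has_ad"] and c["feedstock_t"] > VIABLE_T:
            c["has_ad"] = True                  # community adopts AD

biogas = sum(c["feedstock_t"] * BIOGAS_M3_PER_T
             for c in communities if c["has_ad"])
print(sum(c["has_ad"] for c in communities), "adopters,",
      round(biogas), "m3 biogas/yr")
```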

    Modelling departure time and mode choice

    As a result of increasing road congestion and road pricing, modelling the temporal response of travellers to transport policy interventions has rapidly emerged as a major issue in many practical transport planning studies. A substantial body of research is therefore being carried out to understand the complexities involved in modelling time-of-day choice. These models are contributing substantially to our understanding of how travellers make time-of-day decisions (Hess et al, 2004; de Jong et al, 2003). They tend, however, to be far too complex and too data-intensive for application in large-scale modelling forecasting systems, where socio-economic detail is limited and detailed scheduling information is rarely available. Moreover, model systems making use of some of the latest analytical structures, such as Mixed Logit, are generally inapplicable in practical planning, since they rely on computer-intensive simulation in application as well as in estimation. The aim of this paper, therefore, is to describe the development of time-period choice models that are suitable for application in large-scale modelling forecasting systems. Large-scale practical planning models often rely on systems of nested logit models, which can incorporate many of the most important interactions present in the complex models while keeping run-times low enough for practical planning. In these systems, temporal choice is represented as a choice between a finite set of discrete alternatives: mutually exclusive time-periods obtained by aggregating the observed continuous time values. The issues facing modellers are then:
    - how should the time periods be defined, and in particular how long should they be?
    - how should the choices of time periods be related to each other, e.g. is the elasticity for shorter shifts greater than for longer shifts?
    - how should time-period choice be placed in the model system relative to other choices, such as the choice of mode of travel?
    These questions cannot be answered on a purely theoretical basis but require the analysis of empirical data, and there is not a great deal of data available on the relevant choices. The time-period models described in the paper are developed from three related stated preference (SP) studies undertaken over the past decade in the United Kingdom and the Netherlands. Because of the complications involved in using advanced models in large-scale modelling forecasting systems, the model structures are limited to nested logit models. Two different tree structures are explored in the analysis: nesting mode above time-period choice, or time-period choice above mode. The analysis examines how these structures differ by data set, purpose of travel and time-period specification. Three time-period specifications were tested, dividing the 24-hour day into:
    - twenty-four 1-hour periods;
    - five coarse time-periods;
    - sixteen 15-minute morning-peak periods, plus two coarse pre-peak and post-peak periods.
    In each case, the time periods are used to define both the outbound and the return trip timings. The analysis shows that, with a few exceptions, the nested models outperform the basic Multinomial Logit structures, which assume equal substitution patterns across alternatives. With a single exception, the nested models in turn show higher substitution between alternative time periods than between alternative modes, indicating that, for all the time-period lengths studied, travellers are more sensitive to transport levels of service in their choice of departure time than in their choice of mode. The advantages of the nesting structures are especially pronounced in the 1-hour and 15-minute models, while in the coarse time-period models the MNL model often remains the preferred structure; this is a clear effect of the broader time-periods and the consequently lower substitution between them.
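    The nested logit mechanics behind this comparison are standard and can be made concrete. In the sketch below, time periods are nested under modes; the utilities and the nest parameter lambda are made-up numbers. The paper's central finding, stronger substitution between time periods than between modes, corresponds to lambda being well below 1 in this structure.

```python
import numpy as np

V = np.array([[1.0, 0.6, 0.2],    # car utilities for three time periods
              [0.4, 0.8, 0.1]])   # public-transport utilities
lam = 0.5                         # nest (scale) parameter, 0 < lam <= 1

logsum = lam * np.log(np.exp(V / lam).sum(axis=1))   # I_m for each mode nest
P_mode = np.exp(logsum) / np.exp(logsum).sum()       # upper level: mode choice
P_t_given_m = np.exp(V / lam) / np.exp(V / lam).sum(axis=1, keepdims=True)
P_joint = P_mode[:, None] * P_t_given_m              # P(mode, time period)
```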

    Design patterns for low-carbon buildings: a proposal

    Design patterns, as introduced by Christopher Alexander and colleagues, are proposed in this paper as a means of guiding building designers through the often complex processes of low-carbon building design. The patterns are intended to be integrated into the Building Information Modelling (BIM) environments increasingly used in architectural and building engineering design practice, where they provide relevant information at appropriate times and carry out environmental analyses as required, both at the designer's request and automatically. The paper provides examples of patterns from several of the domains and disciplines that encompass low-carbon design of the built environment, as a means of exploring whether patterns could facilitate communication between those domains and disciplines. The focus is on low-carbon building design and building simulation, but patterns used in computer science and in interface and interaction design are also discussed, as these fit well with the object-oriented environment of contemporary software design and BIM systems.

    Workflow resource pattern modelling and visualization

    Workflow patterns have been recognized as the theoretical basis for modelling recurring problems in workflow systems. One form of workflow pattern, known as resource patterns, characterises the behaviour of resources in workflow systems. Despite the fact that many resource patterns have been discovered, they are still excluded from many workflow system implementations. One reason could be the obscurity of the behaviour of, and interaction between, resources and a workflow management system. We therefore provide a modelling and visualization approach for resource patterns, enabling a resource behaviour modeller to see intuitively which resource patterns are involved in the lifecycle of a workitem. We believe this research can be extended to benefit not only workflow modelling but also other applications, such as model validation, human resource behaviour modelling, and workflow model visualization.
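    As a small illustration of what such a model captures, the transition table below encodes a workitem lifecycle of the kind resource patterns act on; the state and action names follow the resource-pattern literature loosely, but this exact set is an assumption, not the authors' model.

```python
# Workitem lifecycle as a state-transition table (illustrative).
# Terminal states ("completed", "failed") have no outgoing transitions.
LIFECYCLE = {
    "created":   {"offer": "offered", "allocate": "allocated"},
    "offered":   {"allocate": "allocated", "withdraw": "created"},
    "allocated": {"start": "started", "deallocate": "offered"},
    "started":   {"complete": "completed", "fail": "failed",
                  "suspend": "suspended"},
    "suspended": {"resume": "started"},
}

def step(state, action):
    """Advance a workitem, rejecting transitions the lifecycle forbids."""
    if action not in LIFECYCLE.get(state, {}):
        raise ValueError(f"cannot '{action}' from state '{state}'")
    return LIFECYCLE[state][action]

assert step("offered", "allocate") == "allocated"
```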

    Common metrics for cellular automata models of complex systems

    The creation and use of models is critical not only to the scientific process but also to life in general. Selected features of a system are abstracted into a model that can then be used to gain knowledge of the workings of the observed system and even to anticipate its future behaviour. A key feature of the modelling process is the identification of commonality, which allows previous experience with one model to be used in a new or unfamiliar situation. This recognition of commonality between models allows standards to be formed, especially in areas such as measurement: how everyday physical objects are measured is built on an ingrained acceptance of their underlying commonality. Complex systems, often with layers of interwoven interactions, are harder to model and therefore to measure and predict. Indeed, the inability to compute and model a complex system, except at a localised and temporal level, can be seen as one of its defining attributes. Establishing commonality between complex systems provides the opportunity to find common metrics. This work looks at two-dimensional cellular automata, which are widely used as a simple modelling tool for a variety of systems. This has led to a very diverse range of systems sharing a common modelling environment based on a lattice of cells, a common link that could be exploited to find a metric providing information on many kinds of system. An enhancement of a categorisation of cellular automata model types used for biological studies is proposed and expanded to include other disciplines. The thesis then outlines a new metric, the C-Value, created by the author. This metric, based on the connectedness of the active elements on the cellular automata grid, is tested with three models built to represent three of the four categories of cellular automata model types. The results show that the C-Value is a good indicator of the gathering of active cells into a single, compact cluster and, when correlated with the mean density of active cells on the lattice, of whether their distribution is random. Together these provide a range defining the disordered and ordered states of a grid. The use of the C-Value in a localised context also shows potential for identifying patterns of clusters on the grid.
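    The exact C-Value formula is defined in the thesis body rather than in this abstract. As a hedged stand-in for a connectedness-style statistic, the sketch below computes the share of active cells that belong to the largest 4-connected cluster on the grid, which captures the "single, compact cluster" behaviour described.

```python
import numpy as np
from collections import deque

def largest_cluster_share(grid):
    """Fraction of active cells in the biggest 4-connected component."""
    grid = np.asarray(grid, dtype=bool)
    seen = np.zeros_like(grid)
    active = int(grid.sum())
    best = 0
    for sx, sy in zip(*np.nonzero(grid)):
        if seen[sx, sy]:
            continue
        size, queue = 0, deque([(sx, sy)])
        seen[sx, sy] = True
        while queue:                             # breadth-first flood fill
            x, y = queue.popleft()
            size += 1
            for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if (0 <= nx < grid.shape[0] and 0 <= ny < grid.shape[1]
                        and grid[nx, ny] and not seen[nx, ny]):
                    seen[nx, ny] = True
                    queue.append((nx, ny))
        best = max(best, size)
    return best / active if active else 0.0

rng = np.random.default_rng(0)
print(largest_cluster_share(rng.random((50, 50)) < 0.3))
```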

    Intention-Driven Screenography

    The visual design development of Web Information Systems (WIS) is a complex task. At present, the process is mainly based on experience and remains largely an art. Typically, graphical issues are considered late, which results in inflexibility and causes problems for extension and change management. Database and software systems, by contrast, are built on development phases such as requirement acquisition and elicitation and conceptual modelling, and users, their preferences and portfolios are taken into consideration. We show in this preprint that these approaches can be generalised to website presentation. We use methods developed for programming in the large, e.g. patterns, and we map patterns to conceptualisations of web page layout, i.e. grids. Patterns help us to reuse concepts. This paper introduces the concept of pattern and clarifies its structure and task for the whole development. Because the WIS development process is based on six dimensions, we first introduce the development dimensions and show the seamless integration of the pattern-based approach. We call the art of website layout screenography. Screenography extends web application engineering by scenographic and dramaturgic aspects and intends to support the interaction between system and user. It aims at an individualised, decorated playout that takes into account intention, user profiles and portfolios, provider aims, context, equipment, functionality and the storyline progress. The user orientation of WIS requires the deep integration of user concerns, tasks and expectations into screenography; this paper therefore develops concepts of intention-driven screenography.