
    Grooming Connectivity Intents in IP-Optical Networks Using Directed Acyclic Graphs

    During the last few years, there have been concentrated efforts toward intent-driven networking. Building upon Software-Defined Networking (SDN), Intent-Based Networking (IBN) pushes the frontiers of efficient networking by decoupling the intentions of a network operator (i.e., what is desired to be done) from the implementation (i.e., how it is achieved). The advantages of such a paradigm have long been argued and include, but are not limited to, the reduction of human errors, reduced expertise requirements among operator personnel, and faster business plan adaptation. In previous work, we have shown how incorporating IBN in multi-domain networks can have a significantly positive impact, as it enables decentralized operation, accountability, and confidentiality. The pillar of our previous contribution is the compilation of intents using system-generated intent trees. In this work, we extend the architecture to enable grooming among user intents, so that separate intents can end up using the same network resources. While this makes the intent system considerably more complex, it indisputably improves resource allocation. To represent the intent relationships of the enhanced architecture, we use Directed Acyclic Graphs (DAGs). Furthermore, we adapt an advanced, established technique from the literature to solve the Routing, Modulation, and Spectrum Assignment (RMSA) problem during intent compilation. We demonstrate a realistic scenario in which we evaluate our architecture and the intent compilation strategy. Our approach combines the advantages of an intent-driven architecture with the flexibility to choose among advanced resource allocation techniques.
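
    To make the grooming idea concrete, the sketch below models intents and network resources as nodes of a DAG, where grooming means two intents point to the same resource node. This is an illustrative toy under assumed names (IntentDAG, lightpath_1, ...), not the paper's actual intent-compilation structures.

    from collections import defaultdict

    class IntentDAG:
        """Toy DAG of intents and the network resources they compile to."""

        def __init__(self):
            self.children = defaultdict(set)  # node -> set of child nodes

        def add_edge(self, parent, child):
            # Keep the graph acyclic: refuse an edge whose child already
            # reaches the parent.
            if self._reaches(child, parent):
                raise ValueError("edge would create a cycle")
            self.children[parent].add(child)

        def _reaches(self, src, dst):
            stack, seen = [src], set()
            while stack:
                node = stack.pop()
                if node == dst:
                    return True
                if node not in seen:
                    seen.add(node)
                    stack.extend(self.children[node])
            return False

        def resources_of(self, intent):
            # Leaf nodes reachable from an intent are its allocated resources.
            leaves, stack, seen = set(), [intent], set()
            while stack:
                node = stack.pop()
                if node in seen:
                    continue
                seen.add(node)
                kids = self.children[node]
                if kids:
                    stack.extend(kids)
                else:
                    leaves.add(node)
            return leaves

    dag = IntentDAG()
    dag.add_edge("intent_A", "lightpath_1")
    dag.add_edge("intent_B", "lightpath_1")  # grooming: shared resource
    dag.add_edge("intent_B", "lightpath_2")
    shared = dag.resources_of("intent_A") & dag.resources_of("intent_B")
    print(shared)  # {'lightpath_1'}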

    Diffusion with Resetting Inside a Circle

    We study the Brownian motion of a particle in a bounded circular two-dimensional domain, searching for a stationary target on the boundary of the domain. The process switches between two modes: one where it performs two-dimensional diffusion inside the circle and one where it travels along the one-dimensional boundary. During the bulk diffusion, the Brownian particle resets to its initial position at a constant rate r. The Fokker-Planck formalism allows us to calculate the mean time to absorption (MTA) as well as the optimal resetting rate for which the MTA is minimized. From the derived analytical results, the parameter regions in which resetting reduces the search time can be specified. We also provide a numerical method for the verification of our results.
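
    A simplified Monte Carlo sketch of the two-mode search is given below: 2D diffusion inside the disk with resetting to the centre at rate r, a switch to 1D diffusion along the boundary on contact, and absorption near a boundary target. The detachment rate lam and absorption tolerance eps are assumed parameters for illustration, not values from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def mta(r, R=1.0, D=1.0, dt=1e-3, lam=1.0, eps=0.05, n_traj=100):
        """Estimate the mean time to absorption for resetting rate r."""
        target = 0.0                              # target angle on the boundary
        times = []
        for _ in range(n_traj):
            x = y = theta = 0.0                   # start (and reset) at the centre
            on_boundary = False
            t = 0.0
            while True:
                t += dt
                if on_boundary:
                    # 1D diffusion along the boundary arc
                    theta += np.sqrt(2 * D * dt) / R * rng.standard_normal()
                    gap = abs((theta - target + np.pi) % (2 * np.pi) - np.pi)
                    if gap < eps:
                        break                     # absorbed at the target
                    if rng.random() < lam * dt:   # detach back into the bulk
                        on_boundary = False
                        x, y = 0.99 * R * np.cos(theta), 0.99 * R * np.sin(theta)
                else:
                    if rng.random() < r * dt:     # resetting event
                        x = y = 0.0
                    x += np.sqrt(2 * D * dt) * rng.standard_normal()
                    y += np.sqrt(2 * D * dt) * rng.standard_normal()
                    if x * x + y * y >= R * R:    # hit the circle: switch mode
                        on_boundary = True
                        theta = np.arctan2(y, x)
            times.append(t)
        return float(np.mean(times))

    # Scanning r exposes the non-monotonic dependence behind the optimal rate
    # (the small trajectory count keeps this illustrative run fast but noisy):
    for r in (0.0, 1.0, 5.0):
        print(r, mta(r))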

    Derived Demand for Fresh Cheese Products Imported into Japan

    The objective of this article is to estimate the derived demand for fresh cheese products imported into Japan when fresh cheese import data are disaggregated by source country of production. We provide empirical measures of the sensitivity of demand to changes in total imports, own-price, and cross-prices among exporting countries for fresh cheese. Japan's derived demand for U.S. fresh cheese products is perfectly inelastic. Thus, competition for Japan's import demand among exporting countries should be based upon differences in product characteristics.
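
    The elasticities behind such a finding can be illustrated with a generic source-differentiated log-log import demand equation. The sketch below is not the paper's actual econometric specification; the CSV file and column names (p_US, p_AU, p_NZ, ...) are hypothetical.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical data file: quantities, source-country prices and total
    # imports by period; not the paper's actual data.
    df = pd.read_csv("japan_fresh_cheese_imports.csv")

    # ln q_US = a + e_own*ln p_US + e_AU*ln p_AU + e_NZ*ln p_NZ
    #             + e_M*ln(total imports) + error
    X = sm.add_constant(np.log(df[["p_US", "p_AU", "p_NZ", "total_imports"]]))
    y = np.log(df["q_US"])
    fit = sm.OLS(y, X).fit()
    print(fit.params)  # slope coefficients read directly as elasticities; an
                       # own-price elasticity near zero corresponds to the
                       # "perfectly inelastic" finding reported above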

    Adaptive L0 Regularization for Sparse Support Vector Regression

    In this work, we propose a sparse version of the Support Vector Regression (SVR) algorithm that uses L0 regularization to achieve sparsity in function estimation. To this end, we use an adaptive L0 penalty that has a ridge structure and therefore introduces no additional computational complexity into the algorithm. In addition, we examine an alternative approach based on a similar proposal in the Support Vector Machine (SVM) literature. Through numerical studies, we demonstrate the effectiveness of our proposals. To our knowledge, this is the first sparse version of Support Vector Regression in the sense of variable selection rather than support vector selection.
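
    One well-known way to give an L0-style penalty a ridge structure is the adaptive ridge trick, in which the quadratic penalty weights are recomputed from the current coefficients. The sketch below uses squared loss for brevity (the SVR case would substitute the epsilon-insensitive loss), so it illustrates the penalty rather than the paper's full algorithm; all parameter values are illustrative.

    import numpy as np

    def adaptive_l0_ridge(X, y, lam=1.0, delta=1e-4, n_iter=50):
        """Approximate L0-penalized regression by iteratively reweighted ridge."""
        beta = np.zeros(X.shape[1])
        for _ in range(n_iter):
            w = 1.0 / (beta ** 2 + delta)          # adaptive weights ~ 1/beta_j^2
            beta = np.linalg.solve(X.T @ X + lam * np.diag(w), X.T @ y)
        beta[np.abs(beta) < np.sqrt(delta)] = 0.0  # prune negligible coefficients
        return beta

    # Toy example: only the first two of ten features are informative.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 10))
    y = 3 * X[:, 0] - 2 * X[:, 1] + 0.1 * rng.standard_normal(100)
    print(adaptive_l0_ridge(X, y))                 # noise features shrink to zero

    Each update is a plain ridge solve, which is why the penalty adds no computational complexity beyond ordinary ridge regression.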

    Understanding and facilitating the development of intellect

    Information flows continuously in the environment. As we attempt to do something, our senses receive large volumes of information. In any conversation, messages are exchanged rapidly. To understand meaning, we have to focus on, record, choose and process relevant information at every moment, before it is displaced by other information. Often, information is incomplete or masked by other information, or the problems to be solved are new to us. Thus, we must compare different aspects of information or other messages, and use deduction to fill in the gaps in the information, connect it with what we already know or invent solutions to new problems. Children at school learn new concepts every day. Reading, arithmetic or science are very demanding for them. To learn, children must hold information in their heads, use previously acquired concepts to interpret new information and then change their understanding as required. These tasks are possible because we can focus on information and process it before it disappears, alternate between stimuli or concepts according to our goals, and make decisions based on an understanding and evaluation of information through reasoning. At the same time, we adjust our strategies according to what we already know or depending on our strengths and weaknesses. To understand human intelligence, the psychological and cognitive sciences try to specify what cognitive processes are involved in dealing with the above-mentioned tasks, how these processes change during learning, why individuals have different capacities, and how biology and culture may influence them. Any systematic attempt to improve intelligence through education would have to build on the knowledge assembled by research since the end of the nineteenth century. In this booklet we outline how the sciences of the mind view intelligence and suggest a programme for instruction that may build upon its various processes.

    Testing three rainfall interception models and different parameterization methods with data from an open Mediterranean pine forest

    Various models have been developed to simulate rainfall interception by vegetation, but their formulations and applications rely on a number of assumptions and parameter estimation procedures. The aim of this study is to examine the effect of different model assumptions and parameter derivation approaches on the performance of the Rutter, Gash and Liu interception models. The Rutter model, in contrast to the other two daily models, was applied both on an hourly and on a daily basis. Hourly data from a meteorological station and from one automatic and 28 manual throughfall gauges in a semi-arid Pinus brutia forest (Cyprus) for the period between 01/Jul/2016 and 31/May/2020 were used for the analysis. We conducted a sensitivity analysis to assess the model parameters and variables: canopy storage capacity (S), canopy cover fraction (c), the ratio of mean wet-canopy evaporation rate to mean rainfall rate (Ēc/R̄) and potential evaporation (Eo). Three parameter derivation approaches were tested: the widely used regression method, an automatic model parameterization procedure optimizing both S and c, and the same procedure optimizing S alone (with c observed). The parameterized models were run with daily meteorological data and compared with long-term weekly throughfall data (2008–2019). The Gash and Liu models showed low sensitivity to Ēc/R̄. Test runs with different combinations of S, c and Ēc/R̄ revealed strong equifinality. The models showed high performance for both calibration and validation periods, with Kling–Gupta Efficiency (KGE) above 0.90. The Gash and Liu models with the automatic parameterization procedures resulted in higher KGEs than with the regression method. The interception losses computed from the long-term application of the three models ranged between 18% and 20%. The models were all capable of capturing the inherently variable interception process. However, a representative time series of throughfall measurements is needed to parameterize the models.
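
    For illustration, the per-storm calculation of the sparse-canopy Gash model can be written directly from the parameters named above: canopy storage capacity S, canopy cover fraction c and the evaporation-to-rainfall ratio Ēc/R̄. The sketch below omits trunk terms, and the parameter values and storm totals are illustrative rather than the study's calibrated ones.

    import numpy as np

    def gash_interception(P, S=1.0, c=0.6, EcR=0.15):
        """Interception loss (mm) for a single storm of gross rainfall P (mm)."""
        # Gross rainfall needed to saturate the canopy:
        P_sat = -(S / c) / EcR * np.log(1.0 - EcR)
        if P < P_sat:
            return c * P                          # canopy never saturates
        return c * P_sat + c * EcR * (P - P_sat)  # wetting-up + saturated phase

    storms = [0.5, 2.0, 10.0, 30.0]               # illustrative storm totals (mm)
    losses = [gash_interception(P) for P in storms]
    print([round(I, 2) for I in losses])
    print(f"loss fraction: {sum(losses) / sum(storms):.0%}")  # same order as the
                                                              # 18-20% reported above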