
    Sample average approximation with heavier tails II: localization in stochastic convex optimization and persistence results for the Lasso

    We present exponential finite-sample nonasymptotic deviation inequalities for the SAA estimator's near-optimal solution set over the class of stochastic optimization problems with heavy-tailed random convex functions in the objective and constraints. Such a setting is better suited for problems where a sub-Gaussian data-generating distribution is less expected, e.g., in stochastic portfolio optimization. One of our contributions is to exploit convexity of the perturbed objective and the perturbed constraints as a property which entails localized deviation inequalities for joint feasibility and optimality guarantees. This means that our bounds are significantly tighter in terms of diameter and metric entropy, since they depend only on the near-optimal solution set and not on the whole feasible set. As a result, we obtain a much sharper sample complexity estimate than for a general nonconvex problem. In our analysis, we derive some localized deterministic perturbation error bounds for convex optimization problems which are of independent interest. To obtain our results, we only assume a metrically regular convex feasible set, possibly not satisfying the Slater condition and not having a metrically regular solution set. In this general setting, joint near feasibility and near optimality are guaranteed. If in addition the set satisfies the Slater condition, we obtain finite-sample simultaneous exact feasibility and near optimality guarantees (for a sufficiently small tolerance). Another contribution of our work is to present, as a proof of concept of our localized techniques, a persistence result for a variant of the LASSO estimator under very weak assumptions on the data-generating distribution. Comment: 34 pages. Some corrections.
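As background, the sample average approximation scheme itself can be sketched in a few lines (a toy one-dimensional convex problem with heavy-tailed Student-t data; the specific loss and distribution are illustrative assumptions, not the paper's setting):

```python
import numpy as np

def saa_minimize(sample, grid):
    """Sample average approximation: minimize the empirical mean of the
    convex loss f(x, xi) = (x - xi)^2 over a grid of candidate solutions."""
    values = [np.mean((x - sample) ** 2) for x in grid]
    return grid[int(np.argmin(values))]

rng = np.random.default_rng(0)
# Heavy-tailed scenarios: Student's t with 3 degrees of freedom (finite
# variance but far from sub-Gaussian), shifted so the true minimizer is 2.0.
sample = 2.0 + rng.standard_t(df=3, size=5000)
grid = np.linspace(-5.0, 5.0, 1001)
x_hat = saa_minimize(sample, grid)
# x_hat concentrates near the true minimizer 2.0 as the sample grows.
```

Here the SAA solution is the grid point closest to the empirical mean; the paper's results concern how fast such empirical minimizers concentrate when the data are only heavy-tailed.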

    The HOM problem is EXPTIME-complete

    We define a new class of tree automata with constraints and prove decidability of the emptiness problem for this class in exponential time. As a consequence, we obtain several EXPTIME-completeness results for problems on images of regular tree languages under tree homomorphisms, like set inclusion, regularity (HOM problem), and finiteness of set difference. Our result also has implications in term rewriting, since the set of reducible terms of a term rewrite system can be described as the image of a tree homomorphism. In particular, we prove that inclusion of sets of normal forms of term rewrite systems can be decided in exponential time. Analogous consequences arise in the context of XML typechecking, since types are defined by tree automata and some type transformations are homomorphic. Peer reviewed. Postprint (published version).

    Binding Energy and the Fundamental Plane of Globular Clusters

    A physical description of the fundamental plane of Galactic globular clusters is developed which explains all empirical trends and correlations in a large number of cluster observables and provides a small but complete set of truly independent constraints on theories of cluster formation and evolution in the Milky Way. Within the theoretical framework of single-mass, isotropic King models, it is shown that (1) 39 regular (non-core-collapsed) globulars with measured core velocity dispersions share a common V-band mass-to-light ratio of 1.45 +/- 0.10, and (2) a complete sample of 109 regular globulars reveals a very strong correlation between cluster binding energy and total luminosity, regulated by Galactocentric position: E_b ∝ L^{2.05} r_gc^{-0.4}. The observational scatter about either of these two constraints can be attributed fully to random measurement errors, making them the defining equations of a fundamental plane for globular clusters. A third, weaker correlation, between total luminosity and the King-model concentration parameter, c, is then related to the (non-random) distribution of globulars on the plane. The equations of the fundamental plane are used to derive expressions for any cluster observable in terms of only L, r_gc, and c. Results are obtained for generic King models and applied specifically to the globular cluster system of the Milky Way. Comment: 60 pages with 19 figures, submitted to Ap
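The binding-energy correlation above can be turned into a quick scaling check (a minimal sketch; the function name is ours, and only the quoted exponents are from the abstract):

```python
# Scaling implied by the fundamental-plane correlation E_b ∝ L^2.05 r_gc^-0.4.
def binding_energy_ratio(l_ratio, r_gc_ratio):
    """Ratio E_b1/E_b2 for two clusters, given their luminosity ratio and
    their Galactocentric-distance ratio, per the correlation above."""
    return l_ratio ** 2.05 * r_gc_ratio ** (-0.4)

# Doubling L at fixed r_gc multiplies the binding energy by 2^2.05 (~4.14).
factor = binding_energy_ratio(2.0, 1.0)
```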

    Parallel Load Balancing on Constrained Client-Server Topologies

    We study parallel Load Balancing protocols for a client-server distributed model defined as follows. There is a set C of n clients and a set S of n servers, where each client has (at most) a constant number d ≥ 1 of requests that must be assigned to some server. The client set and the server set are connected to each other via a fixed bipartite graph: the requests of client v can only be sent to the servers in its neighborhood N(v). The goal is to assign every client request so as to minimize the maximum load of the servers. In this setting, efficient parallel protocols are available only for dense topologies. In particular, a simple symmetric, non-adaptive protocol achieving constant maximum load has recently been introduced by Becchetti et al. [BCNPT18] for regular dense bipartite graphs. Its parallel completion time is O(log n) and its overall work is O(n), w.h.p. Motivated by proximity constraints arising in some client-server systems, we devise a simple variant of Becchetti et al.'s protocol and analyse it over almost-regular bipartite graphs where nodes may have neighborhoods of small size. In detail, we prove that, w.h.p., this new version has a cost equivalent to that of Becchetti et al.'s protocol (in terms of maximum load, completion time, and work complexity, respectively) on every almost-regular bipartite graph with degree Ω(log² n). Our analysis significantly departs from that in [BCNPT18] for the original protocol and requires coping with non-trivial stochastic-dependence issues in the random choices of the algorithmic process, which are due to the worst-case, sparse topology of the underlying graph.
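The model above can be sketched with a toy simulation (an illustrative uniform-random assignment, not the protocol analysed in the paper; the topology and parameters are invented for the example):

```python
import random
from collections import Counter

def random_assignment(neighbors, d=1, seed=0):
    """Send each of a client's d requests to a server chosen uniformly at
    random from its neighborhood; return the maximum resulting server load."""
    rng = random.Random(seed)
    load = Counter()
    for client, servers in neighbors.items():
        for _ in range(d):
            load[rng.choice(servers)] += 1
    return max(load.values())

# A small almost-regular bipartite topology: 6 clients and 6 servers,
# with client c connected to servers c, c+1, c+2 (mod 6).
n = 6
neighbors = {c: [(c + k) % n for k in range(3)] for c in range(n)}
max_load = random_assignment(neighbors, d=1)
```

The point of protocols like the one in the paper is precisely that a naive one-shot random choice can overload servers when neighborhoods are small, which is why their analysis targets almost-regular graphs of degree Ω(log² n).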

    Lessons From the Debt-Deflation Theory of Sudden Stops

    This paper reports results for a class of dynamic, stochastic general equilibrium models with credit constraints that can account for some of the empirical regularities of the Sudden Stop phenomenon of recent emerging markets crises. In these models, credit constraints set in motion Irving Fisher's debt-deflation mechanism and they bind as an endogenous equilibrium outcome when agents are highly indebted. The quantitative predictions of these models yield three key lessons: (1) Sudden Stops can occur as an endogenous response to typical realizations of adverse shocks to fundamentals, in environments in which agents plan their actions taking credit constraints and expectations of Sudden Stops into account. (2) Credit constraints cause output declines during Sudden Stops when collateral constraints limit debt to a fraction of the market value of capital, when there are limits on access to working capital, or when debt-deflation lowers the value of the marginal product of factors of production. (3) The debt-deflation mechanism has significant quantitative effects in terms of the amplification, asymmetry and persistence of the responses of macroeconomic aggregates to standard shocks, and in the occurrence of Sudden Stops as infrequent events nested within regular business cycles. Precautionary saving rules out the largest Sudden Stops from the stochastic stationary state, but Sudden Stops remain a positive-probability event in the long run.

    CLaRK System - an XML-based system for Corpora Development

    The CLaRK System incorporates several technologies: XML technology; Unicode; cascaded regular grammars; constraints over XML documents. On the basis of these technologies the following tools are implemented: XML Editor, Unicode Tokeniser, Sorting tool, Removing and Extracting tool, Concordancer, XSLT tool, Cascaded Regular Grammar tool, etc.

    1 Unicode tokenisation. In order to make it possible to impose constraints over the textual nodes and to segment them in a meaningful way, the CLaRK System supports a user-defined hierarchy of tokenisers. At the most basic level the user can define a tokeniser in terms of a set of token types. In this basic tokeniser each token type is defined by a set of Unicode symbols. On top of these basic tokenisers, the user can define other tokenisers, for which the token types are defined as regular expressions over the tokens of some other tokeniser, the so-called parent tokeniser.

    2 Regular grammars. The regular grammars are the basic mechanism for linguistic processing of the content of an XML document within the system. The regular grammar processor applies a set of rules over the content of some elements in the document and incorporates the categories of the rules back into the document as XML mark-up. Before the grammar rules are applied, the content is processed in the following way: textual nodes are tokenised with respect to some appropriate tokeniser, and the element nodes are textualised on the basis of XPath expressions that determine the important information about the element. The recognised word is replaced by new XML mark-up, which may or may not contain the word.

    3 Constraints. The constraints implemented in the CLaRK System are generally based on the XPath language. We use XPath expressions to determine some data within one or several XML documents and then evaluate predicates over the data. There are two modes of using a constraint. In the first mode the constraint is used for a validity check, similar to the validity check based on a DTD or XML schema. In the second mode, the constraint is used to support changing the document in order for it to satisfy the constraint. Three types of constraints are implemented in the system: regular expression constraints, number restriction constraints, and value restriction constraints.

    4 Macro language. In the CLaRK System the tools support a mechanism for describing their settings. On the basis of these descriptions (called queries), a tool can be applied simply by pointing to a certain description record. Each query contains the states of all settings and options of the corresponding tool. Once such queries exist, a special tool can combine and apply them in groups (macros). During application the queries are executed successively, and the result of one application is the input for the next one. For better control over the process of applying several queries in one, we introduce several conditional operators. These operators can determine the next query to apply depending on certain conditions. When a condition for such an operator is satisfied, the execution continues from a location defined in the operator; the mechanism for addressing queries is based on user-defined labels. When a condition is not satisfied, the operator is ignored and the process continues from the position following the operator. In this way constructions like IF-THEN-ELSE and WHILE-DO can easily be expressed. The system supports five types of control operators. IF (XPath): the condition is an XPath expression which is evaluated on the current working document; if the result is a non-empty node-set, a non-empty string, a positive number or a true boolean value, the condition is satisfied. IF NOT (XPath): the same kind of condition as the previous one, but the approving result is negated. IF CHANGED: the condition is satisfied if the preceding operation has changed the current working document or has produced a non-empty result document (depending on the operation). IF NOT CHANGED: the condition is satisfied if the previous operation either did not change the working document or did not produce a non-empty result. GOTO: unconditionally changes the execution position. Each macro defined in the system can have its own query and can be incorporated in another macro; in this way a limited form of subroutine can be implemented. The new version of CLaRK will support server applications and calls to/from external programs.
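The tokeniser hierarchy described above can be illustrated with a minimal sketch (a hypothetical re-implementation in Python, not CLaRK's own code; the token types and the derived "number" tokeniser are invented for the example):

```python
import re

# Basic tokeniser: token types defined over sets of Unicode characters
# (expressed here as regular expressions, an illustrative choice).
BASIC_TOKEN_TYPES = [
    ("LETTERS", r"[^\W\d_]+"),   # runs of letters
    ("DIGITS", r"\d+"),          # runs of digits
    ("SPACE", r"\s+"),           # whitespace
    ("OTHER", r"."),             # any remaining symbol
]

def basic_tokenise(text):
    """Segment text into (token_type, token) pairs using the basic types."""
    pattern = "|".join(f"(?P<{name}>{rx})" for name, rx in BASIC_TOKEN_TYPES)
    return [(m.lastgroup, m.group()) for m in re.finditer(pattern, text)]

def number_tokenise(text):
    """A tokeniser defined on top of the basic (parent) tokeniser: merge
    DIGITS '.' DIGITS sequences (e.g. '3.14') into a single NUMBER token."""
    tokens = basic_tokenise(text)
    out, i = [], 0
    while i < len(tokens):
        if (i + 2 < len(tokens) and tokens[i][0] == "DIGITS"
                and tokens[i + 1] == ("OTHER", ".")
                and tokens[i + 2][0] == "DIGITS"):
            out.append(("NUMBER", tokens[i][1] + "." + tokens[i + 2][1]))
            i += 3
        else:
            out.append(tokens[i])
            i += 1
    return out
```

The second tokeniser only sees tokens of its parent, mirroring the parent-tokeniser idea: higher levels recombine lower-level tokens rather than raw characters.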

    Parallel Load Balancing on constrained client-server topologies

    We study parallel Load Balancing protocols for the client-server distributed model defined as follows. There is a set of n clients and a set of n servers where each client has (at most) a constant number of requests that must be assigned to some server. The client set and the server set are connected to each other via a fixed bipartite graph: the requests of client v can only be sent to the servers in its neighborhood. The goal is to assign every client request so as to minimize the maximum load of the servers. In this setting, efficient parallel protocols are available only for dense topologies. In particular, a simple protocol, named raes, has recently been introduced by Becchetti et al. [1] for regular dense bipartite graphs. They show that this symmetric, non-adaptive protocol achieves constant maximum load with O(log n) parallel completion time and O(n) overall work, w.h.p. Motivated by proximity constraints arising in some client-server systems, we analyze raes over almost-regular bipartite graphs where nodes may have neighborhoods of small size. In detail, we prove that, w.h.p., the raes protocol maintains the same performance as above (in terms of maximum load, completion time, and work complexity, respectively) on any almost-regular bipartite graph with degree Ω(log² n). Our analysis significantly departs from that in [1] since it requires coping with non-trivial stochastic-dependence issues in the random choices of the algorithmic process, which are due to the worst-case, sparse topology of the underlying graph.

    Outdoor learning spaces: the case of forest school

    © 2017 The Author. Area published by John Wiley & Sons Ltd on behalf of Royal Geographical Society (with the Institute of British Geographers). This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited. This paper contributes to the growing body of research concerning the use of outdoor spaces by educators, and the increased use of informal and outdoor learning spaces when teaching primary school children. The research takes the example of forest school, a form of regular and repeated outdoor learning increasingly common in primary schools. This research focuses on how the learning space at forest school shapes the experience of children and forest school leaders as they engage in learning outside the classroom. The learning space is considered as a physical space, and also in a more metaphorical way as a space where different behaviours are permitted, and a space set apart from the national curriculum. Through semi-structured interviews with members of the community of practice of forest school leaders, the paper seeks to determine the significance of being outdoors for the forest school experience. How does this learning space differ from the classroom environment? What aspects of the forest school learning space support pupils' experiences? How does the outdoor learning space affect teaching, and the dynamics of learning while at forest school? The research shows that the outdoor space provides new opportunities for children and teachers to interact and learn, and reveals how forest school leaders and children co-create a learning environment in which the boundaries between classroom and outdoor learning, teacher and pupil, are renegotiated to stimulate teaching and learning.
Forest school practitioners see forest school as a separate learning space that is removed from the physical constraints of the classroom and the pedagogical constraints of the national curriculum, providing a more flexible and responsive learning environment. Peer reviewed.