
    Enhanced modeling features within TREETOPS

    The original motivation for TREETOPS was to build a generic multi-body simulation and remove from engineers the burden of writing multi-body equations. The motivation for the enhancement was twofold: (1) to extend the menu of built-in features (sensors, actuators, constraints, etc.) that do not require user code; and (2) to extend the control-system design capabilities by linking with other government-funded software (NASTRAN and MATLAB). These enhancements also serve to bridge the gap between structures and controls groups. It is common on large space programs for the structures group to build high-fidelity models of the structure using NASTRAN and for the controls group to build lower-order models because they lack the tools to incorporate the former into their analysis. Now the controls engineers can accept the high-fidelity NASTRAN models into TREETOPS, add sensors and actuators, perform model reduction, and couple the result directly into MATLAB to perform their design. The controller can then be imported directly into TREETOPS for non-linear, time-history simulation.

    Programs as Data Structures in λSF-Calculus

    © 2016 The Author(s). Lambda-SF-calculus can represent programs as closed normal forms. In turn, all closed normal forms are data structures, in the sense that their internal structure is accessible through queries defined in the calculus, even to the point of constructing the Gödel number of a program. Thus, program analysis and optimisation can be performed entirely within the calculus, without requiring any meta-level process of quotation to produce a data structure. Lambda-SF-calculus is a confluent, applicative rewriting system derived from lambda-calculus and the combinatory SF-calculus. Its superior expressive power relative to lambda-calculus is demonstrated by the ability to decide whether two programs are syntactically equal, or whether a program uses its input. Indeed, there is no homomorphism of applicative rewriting systems from lambda-SF-calculus to lambda-calculus. Program analysis and optimisation can be illustrated by considering the conversion of a program to combinators. Traditionally, a program p is interpreted using fixpoint constructions that do not have normal forms, but combinatory techniques can be used to block reduction until the program arguments are given. That is, p is interpreted by a closed normal form M. Then factorisation (by F) adapts the traditional account of lambda-abstraction in combinatory logic to convert M to a combinator N that is equivalent to M in the following two senses. First, N is extensionally equivalent to M, where extensional equivalence is defined in terms of eta-reduction. Second, the conversion is an intensional equivalence in that it does not lose any information, and so can be reversed by another definable conversion. Further, the standard optimisations of the conversion process are all definable within lambda-SF-calculus, even those involving free-variable analysis. Proofs of all theorems in the paper have been verified using the Coq theorem prover.
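
    The factorisation behaviour of F that the abstract relies on can be sketched concretely. Below is a minimal, hypothetical interpreter for the underlying SF-calculus in Python (function names, term encoding, and the fuel limit are illustrative choices, not from the paper). It shows the two faces of F: detecting an atom (F O x y → x) and decomposing a compound into its parts (F (P Q) x y → y P Q), which is what makes the internal structure of a normal form queryable inside the calculus.

```python
# Minimal SF-calculus reducer (illustrative sketch, not the paper's code).
# Terms: the atoms 'S' and 'F', or 2-tuples (f, a) for the application f a.
# Rules: S x y z    -> x z (y z)
#        F O x y    -> x       when O is an atom (S or F)
#        F (P Q) x y -> y P Q  when P Q is a factorable form,
#                              i.e. a partial application of S or F.

def spine(t):
    """Decompose t into its head atom and the list of its arguments."""
    args = []
    while isinstance(t, tuple):
        t, a = t
        args.append(a)
    return t, args[::-1]

def unspine(head, args):
    """Rebuild an application spine: head a1 a2 ... an."""
    for a in args:
        head = (head, a)
    return head

def factorable(t):
    """Factorable forms: S, F, S M, S M N, F M, F M N."""
    head, args = spine(t)
    return head in ('S', 'F') and len(args) <= 2

def contract(t):
    """Contract t if t itself is a redex, else return None."""
    head, args = spine(t)
    if head == 'S' and len(args) >= 3:
        x, y, z = args[:3]
        return unspine(((x, z), (y, z)), args[3:])
    if head == 'F' and len(args) >= 3:
        a, x, y = args[:3]
        if a in ('S', 'F'):          # first argument is an atom
            return unspine(x, args[3:])
        if factorable(a):            # compound P Q: hand its parts to y
            p, q = a
            return unspine(((y, p), q), args[3:])
    return None

def normalize(t, fuel=10_000):
    """Leftmost-outermost reduction to normal form (fuel guards divergence)."""
    while fuel:
        fuel -= 1
        r = contract(t)
        if r is not None:
            t = r
            continue
        if isinstance(t, tuple):
            nf, na = normalize(t[0], fuel), normalize(t[1], fuel)
            if (nf, na) == t:
                return t
            t = (nf, na)
        else:
            return t
    raise RuntimeError("out of fuel")

K = ('F', 'F')        # F F x y -> x, so F F behaves as the K combinator
I = (('S', K), K)     # the usual identity I = S K K

print(normalize((I, 'S')))                              # -> 'S'
print(normalize(unspine('F', ['S', 'S', 'F'])))         # atom query: -> 'S'
# Factorising the compound S S exposes its parts: F (S S) S F -> F S S.
print(normalize(unspine('F', [('S', 'S'), 'S', 'F'])))  # -> (('F', 'S'), 'S')
```

    The last query is the kernel of "programs as data structures": F takes a closed normal form apart without any quotation step, so analyses like the syntactic-equality test mentioned above can be written as ordinary SF terms.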

    Distributional outcomes of a decentralized welfare program

    It is common for central governments to delegate authority over the targeting of welfare programs to local community organizations, which may be better informed about who is poor, though possibly less accountable for getting the money to the local poor, while the center retains control over how much goes to each local region. The authors outline a theoretical model of the interconnected behavior of the various actors in such a setting. The model's information structure provides scope for econometric identification. Applying data from a specific program in Bangladesh, they find that overall targeting was mildly pro-poor, mostly because of successful targeting within villages. But this varied across villages. Although some village characteristics promoted better targeting, these were generally not the same characteristics that attracted resources from the center. The authors observe that the center's desire for broad geographic coverage appears to have severely constrained the scope for pro-poor village targeting. However, poor villages tended not to be better at reaching their poor. The authors find some evidence that local institutions matter. The presence of cooperatives for farmers and the landless appears to be associated with more pro-poor program targeting; the presence of recreational clubs has the opposite effect. Sometimes the benefits of decentralized social programs are captured by local elites, depending on the type of spending being decentralized. When public spending is on a private (excludable) good, and there is no self-targeting mechanism to ensure that only the poor participate, there is ample scope for local mis-targeting.

    Linear dimensionality reduction: Survey, insights, and generalizations

    Linear dimensionality reduction methods are a cornerstone of analyzing high dimensional data, due to their simple geometric interpretations and typically attractive computational properties. These methods capture many data features of interest, such as covariance, dynamical structure, correlation between data sets, input-output relationships, and margin between data classes. Methods have been developed with a variety of names and motivations in many fields, and perhaps as a result the connections between all these methods have not been highlighted. Here we survey methods from this disparate literature as optimization programs over matrix manifolds. We discuss principal component analysis, factor analysis, linear multidimensional scaling, Fisher's linear discriminant analysis, canonical correlations analysis, maximum autocorrelation factors, slow feature analysis, sufficient dimensionality reduction, undercomplete independent component analysis, linear regression, distance metric learning, and more. This optimization framework gives insight into some rarely discussed shortcomings of well-known methods, such as the suboptimality of certain eigenvector solutions. Modern techniques for optimization over matrix manifolds enable a generic linear dimensionality reduction solver, which accepts as input data and an objective to be optimized, and returns, as output, an optimal low-dimensional projection of the data. This simple optimization framework further allows straightforward generalizations and novel variants of classical methods, which we demonstrate here by creating an orthogonal-projection canonical correlations analysis. More broadly, this survey and generic solver suggest that linear dimensionality reduction can move toward becoming a blackbox, objective-agnostic numerical technology.
    JPC and ZG received funding from the UK Engineering and Physical Sciences Research Council (EPSRC EP/H019472/1). JPC received funding from a Sloan Research Fellowship, the Simons Foundation (SCGB#325171 and SCGB#325233), the Grossman Center at Columbia University, and the Gatsby Charitable Trust. This is the author accepted manuscript; the final version is available from MIT Press via http://jmlr.org/papers/v16/cunningham15a.htm
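
    As a concrete instance of the survey's framing, PCA can be written as the matrix-manifold program max over orthonormal M of trace(M^T Sigma M), whose optimum is spanned by the top eigenvectors of the sample covariance Sigma. A minimal NumPy sketch (the function name and interface are illustrative, not from the paper; the closed-form eigendecomposition stands in for a generic manifold solver):

```python
import numpy as np

def pca_projection(X, k):
    """Solve max_{M : M^T M = I} trace(M^T Sigma M) for the sample
    covariance Sigma of X -- i.e. ordinary PCA -- by eigendecomposition.
    Returns the d x k orthonormal matrix M and the projected data X M."""
    Xc = X - X.mean(axis=0)             # center the data
    sigma = (Xc.T @ Xc) / len(Xc)       # sample covariance, d x d
    vals, vecs = np.linalg.eigh(sigma)  # eigenvalues in ascending order
    M = vecs[:, ::-1][:, :k]            # top-k eigenvectors as columns
    return M, Xc @ M

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5)) @ rng.normal(size=(5, 5))  # correlated data
M, Y = pca_projection(X, 2)
print(M.shape, Y.shape)                  # (5, 2) (500, 2)
print(np.allclose(M.T @ M, np.eye(2)))  # M has orthonormal columns: True
```

    The orthonormality constraint M^T M = I is exactly membership in the Stiefel manifold; the survey's point is that a generic solver can optimize any such objective directly over that manifold, with the eigenvector solution above recovered as the special case where a closed form exists.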

    The Development of KKNI-Based Curriculum at the Arabic Language Education Programs in Indonesian Higher Education

    This study aimed to analyze the pattern of Indonesian National Qualifications Framework (KKNI)-based curriculum development carried out by the five best A-accredited undergraduate Arabic Language Education (PBA) programs in Indonesia. The method used was qualitative research covering document analysis, interviews, and observations, followed by data reduction, data display, and conclusion drawing. The results showed that, in general, the PBA departments had developed their curricula by referring to the steps and rules in the KKNI standard. However, the determination of the courses was not carried out in depth by developing learning outcomes and lesson materials independently, but only by referring to the graduate learning outcomes (CPL) from the National Standards for Higher Education (SNPT) in the Minister of Education and Culture (MOEC) Regulation No. 49 of 2014, as well as by adjusting the CPL and the lesson materials to existing courses. This was because the PBA study programs had difficulties in developing CPL independently. The curriculum structure developed by PBA followed a serial-model curriculum structure in which the courses were arranged from the easiest, in the first semester, to the most difficult, in the last semester.

    Different Transcriptional Control of Metabolism and Extracellular Matrix in Visceral and Subcutaneous Fat of Obese and Rimonabant Treated Mice

    BACKGROUND: The visceral (VAT) and subcutaneous (SCAT) adipose tissues play different roles in physiology and obesity. The molecular mechanisms underlying their expansion in obesity and following body weight reduction are poorly defined. METHODOLOGY: C57Bl/6 mice fed a high fat diet (HFD) for 6 months developed low, medium, or high body weight as compared to normal chow fed mice. Mice from each group were then treated with the cannabinoid receptor 1 antagonist rimonabant or vehicle for 24 days to normalize their body weight. Transcriptomic data for visceral and subcutaneous adipose tissues from each group of mice were obtained and analyzed to identify: i) genes regulated by HFD irrespective of body weight, ii) genes whose expression correlated with body weight, iii) the biological processes activated in each tissue, using gene set enrichment analysis (GSEA), and iv) the transcriptional programs affected by rimonabant. PRINCIPAL FINDINGS: In VAT, "metabolic" genes encoding enzymes for lipid and steroid biosynthesis and glucose catabolism were down-regulated irrespective of body weight, whereas "structure" genes controlling cell architecture and tissue remodeling had expression levels correlated with body weight. In SCAT, the identified "metabolic" and "structure" genes were mostly different from those identified in VAT and were regulated irrespective of body weight. GSEA indicated active adipogenesis in both tissues but a more prominent involvement of tissue stroma in VAT than in SCAT. Rimonabant treatment normalized most gene expression but further reduced oxidative phosphorylation gene expression in SCAT, though not in VAT. CONCLUSION: VAT and SCAT show strikingly different gene expression programs in response to high fat diet and rimonabant treatment. Our results may lead to the identification of therapeutic targets acting on specific fat depots to control obesity.

    IST Austria Thesis

    This dissertation focuses on algorithmic aspects of program verification, and presents modeling and complexity advances on several problems related to the static analysis of programs, the stateless model checking of concurrent programs, and the competitive analysis of real-time scheduling algorithms. Our contributions can be broadly grouped into five categories. Our first contribution is a set of new algorithms and data structures for the quantitative and data-flow analysis of programs, based on the graph-theoretic notion of treewidth. It has been observed that the control-flow graphs of typical programs have special structure, and are characterized as graphs of small treewidth. We utilize this structural property to provide faster algorithms for the quantitative and data-flow analysis of recursive and concurrent programs. In most cases we give an algebraic treatment of the considered problem, from which several interesting analyses, such as reachability, shortest path, and certain kinds of data-flow analysis problems, follow as special cases. We exploit the constant-treewidth property to obtain algorithmic improvements for on-demand versions of the problems, and provide data structures with various tradeoffs between the resources spent in the preprocessing and querying phases. We also improve on the algorithmic complexity of quantitative problems outside the algebraic path framework, namely of the minimum mean-payoff, minimum ratio, and minimum initial credit for energy problems. Our second contribution is a set of algorithms for Dyck reachability with applications to data-dependence analysis and alias analysis. In particular, we develop an optimal algorithm for Dyck reachability on bidirected graphs, which are ubiquitous in context-insensitive, field-sensitive points-to analysis.
    Additionally, we develop an efficient algorithm for context-sensitive data-dependence analysis via Dyck reachability, where the task is to obtain analysis summaries of library code in the presence of callbacks. Our algorithm preprocesses libraries in almost linear time, after which the contribution of the library to the complexity of the client analysis is (i) linear in the number of call sites and (ii) only logarithmic in the size of the whole library, as opposed to linear in the size of the whole library. Finally, we prove that Dyck reachability is Boolean Matrix Multiplication-hard in general, and the hardness also holds for graphs of constant treewidth. This hardness result strongly indicates that there exist no combinatorial algorithms for Dyck reachability with truly subcubic complexity. Our third contribution is the formalization and algorithmic treatment of the Quantitative Interprocedural Analysis framework. In this framework, the transitions of a recursive program are annotated as good, bad or neutral, and receive a weight which measures the magnitude of their respective effect. The Quantitative Interprocedural Analysis problem asks to determine whether there exists an infinite run of the program where the long-run ratio of the bad weights over the good weights is above a given threshold. We illustrate how several quantitative problems related to static analysis of recursive programs can be instantiated in this framework, and present some case studies in this direction. Our fourth contribution is a new dynamic partial-order reduction for the stateless model checking of concurrent programs. Traditional approaches rely on the standard Mazurkiewicz equivalence between traces, by means of partitioning the trace space into equivalence classes, and attempting to explore a few representatives from each class. We present a new dynamic partial-order reduction method called the Data-centric Partial Order Reduction (DC-DPOR).
    Our algorithm is based on a new equivalence between traces, called the observation equivalence. DC-DPOR explores a coarser partitioning of the trace space than any exploration method based on the standard Mazurkiewicz equivalence. Depending on the program, the new partitioning can be even exponentially coarser. Additionally, DC-DPOR spends only polynomial time in each explored class. Our fifth contribution is the use of automata and game-theoretic verification techniques in the competitive analysis and synthesis of real-time scheduling algorithms for firm-deadline tasks. On the analysis side, we leverage automata on infinite words to compute the competitive ratio of real-time schedulers subject to various environmental constraints. On the synthesis side, we introduce a new instance of two-player mean-payoff partial-information games, and show how the synthesis of an optimal real-time scheduler can be reduced to computing winning strategies in this new type of game.
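
    One of the quantitative objectives mentioned above, the minimum mean-payoff value of a weighted graph (the minimum mean weight over all directed cycles), has a classical O(nm) baseline in Karp's dynamic program; the thesis improves on such baselines, but the baseline itself is easy to sketch. The following is Karp's textbook algorithm, not the thesis' algorithm, and the interface is an illustrative choice:

```python
import math

def min_mean_cycle(n, edges):
    """Karp's algorithm: minimum mean weight over all directed cycles.
    edges is a list of (u, v, w); assumes every cycle is reachable from node 0.
    d[k][v] = minimum weight of a walk from node 0 to v using exactly k edges."""
    INF = math.inf
    d = [[INF] * n for _ in range(n + 1)]
    d[0][0] = 0.0
    for k in range(1, n + 1):
        for u, v, w in edges:
            if d[k - 1][u] + w < d[k][v]:
                d[k][v] = d[k - 1][u] + w
    best = INF
    for v in range(n):
        if d[n][v] == INF:
            continue
        # Karp's formula: mu* = min_v max_k (d_n(v) - d_k(v)) / (n - k)
        worst = max((d[n][v] - d[k][v]) / (n - k)
                    for k in range(n) if d[k][v] < INF)
        best = min(best, worst)
    return best

# Two cycles: 0 -> 1 -> 0 with mean 1, and 0 -> 1 -> 2 -> 0 with mean 2.
print(min_mean_cycle(3, [(0, 1, 1), (1, 0, 1), (1, 2, 3), (2, 0, 2)]))  # 1.0
```

    The treewidth-based results in the thesis target exactly this kind of quantitative query, exploiting the small treewidth of control-flow graphs to beat the generic dense-graph bound.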

    Control Flow Analysis for SF Combinator Calculus

    Programs that transform other programs often require access to the internal structure of the program to be transformed. This is at odds with the usual extensional view of functional programming, as embodied by the lambda-calculus and SK combinator calculus. The recently developed SF combinator calculus offers an alternative, intensional model of computation that may serve as a foundation for developing principled languages in which to express intensional computation, including program transformation. Until now there have been no static analyses for reasoning about or verifying programs written in SF-calculus. We take the first step towards remedying this by developing a formulation of the popular control flow analysis 0CFA for SK-calculus and extending it to support SF-calculus. We prove its correctness and demonstrate that the analysis is invariant under the usual translation from SK-calculus into SF-calculus.
    Comment: In Proceedings VPT 2015, arXiv:1512.0221
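
    The "usual translation" from SK-calculus into SF-calculus keeps S and encodes K as F F: since F O x y → x whenever O is an atom, F F behaves exactly like K. A small sketch checking this on an example follows; both reducers are illustrative simplifications with hypothetical helper names, not the analysis from the paper.

```python
# SK side: atoms 'S', 'K'.  SF side: atoms 'S', 'F'.  Apps are 2-tuples (f, a).

def spine(t):
    """Decompose t into its head atom and argument list."""
    args = []
    while isinstance(t, tuple):
        t, a = t
        args.append(a)
    return t, args[::-1]

def unspine(head, args):
    for a in args:
        head = (head, a)
    return head

def translate(t):
    """The usual SK -> SF translation: S stays S, K becomes F F."""
    if t == 'S':
        return 'S'
    if t == 'K':
        return ('F', 'F')
    return (translate(t[0]), translate(t[1]))

def step_sk(head, args):
    if head == 'S' and len(args) >= 3:               # S x y z -> x z (y z)
        x, y, z = args[:3]
        return unspine(((x, z), (y, z)), args[3:])
    if head == 'K' and len(args) >= 2:               # K x y -> x
        return unspine(args[0], args[2:])
    return None

def step_sf(head, args):
    if head == 'S' and len(args) >= 3:               # S x y z -> x z (y z)
        x, y, z = args[:3]
        return unspine(((x, z), (y, z)), args[3:])
    if head == 'F' and len(args) >= 3:
        a, x, y = args[:3]
        if a in ('S', 'F'):                          # F O x y -> x
            return unspine(x, args[3:])
        ah, aargs = spine(a)
        if ah in ('S', 'F') and len(aargs) <= 2:     # F (P Q) x y -> y P Q
            p, q = a
            return unspine(((y, p), q), args[3:])
    return None

def normalize(t, step, fuel=10_000):
    """Leftmost-outermost reduction to normal form under the given rules."""
    while fuel:
        fuel -= 1
        r = step(*spine(t))
        if r is not None:
            t = r
            continue
        if isinstance(t, tuple):
            nf, na = normalize(t[0], step, fuel), normalize(t[1], step, fuel)
            if (nf, na) == t:
                return t
            t = (nf, na)
        else:
            return t
    raise RuntimeError("out of fuel")

# I = S K K is the identity; evaluation commutes with the translation.
prog = (((('S', 'K'), 'K')), 'S')                 # S K K S
print(normalize(prog, step_sk))                   # -> 'S'
print(normalize(translate(prog), step_sf))        # -> 'S'
```

    The paper's invariance result is about the analysis, not just evaluation: the 0CFA flow information computed for an SK term agrees with that computed for its SF translation, which the evaluation-level agreement above merely illustrates.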

    Analysis of Present Conditions and Prospects of the Labour Market Development in Russia

    Target programs aimed at strengthening the labour-market position of employees with a high educational level are extremely necessary under Russian conditions. These programs should strengthen the position of the qualified workforce on the labour market. To decide on specific problems and to implement the programs, special departments should be created in the Ministry of Work and Social Development of the Russian Federation.

    INSTITUTIONAL ANALYSIS ON POVERTY REDUCTION PROGRAM IN THE SOCIETY: A CASE STUDY OF NATIONAL PROGRAM FOR COMMUNITY EMPOWERMENT OF INDEPENDENT URBAN (PNPM-MP) IN SEMARANG, INDONESIA

    The PNPM-MP institutions in Semarang put poverty alleviation as the main priority of empowerment-based development policy. The strategy developed is to synergize government agencies with the community institutions built by the PNPM-MP program at the village and grassroots level, namely the Community Institutional Agency (BKM) and the Community Self-Reliance Group (KSM). The problem studied in this research is: how are the PNPM-MP institutions in society involved in poverty reduction? The aim is to describe and analyze the institutional programs in the community. The research used a qualitative phenomenological approach, conducting interviews, observation, and focus group discussions to obtain data from informants (BKM/KSM). Informants comprised two BKM and ten KSM, purposively (deliberately) selected from two villages in two districts. The analysis was performed interactively, that is, with analysis techniques forming an integrated cycle of data collection, data reduction, data display, and conclusion drawing. The research concludes that the PNPM-MP institutions at the village and grassroots level (BKM/KSM) have not been able to act as a driving force in poverty reduction and are still seen by the community as a program requirement, not institutionalized at either the horizontal or the vertical level. The research recommends raising awareness that poverty reduction requires synergy between government agencies and community agencies, embodied in "one village, one planning" development planning.