
    Postsecular instruments of acculturation: Czesław Miłosz's works from the second American stay

    The article raises the question of the ways in which religious tradition can become an ally in the process of acculturation while serving the modern subject both as a springboard for innovative, creative work and as a tool of self-improvement. Czesław Miłosz's selected works from his second stay in the United States (1961-1980) are analysed from the postsecular perspective, which recognises religion as a full-fledged actor in the process of modern transformations, one that may broaden the field of artistic choice but remains vulnerable to artistic resemantizations or even profanations (Agamben). Such an analysis allows us to interpret the poem From the Rising of the Sun as a form of reconciliation of Miłosz's American and Lithuanian experience (as well as of maturity and childhood, centre and periphery, modern and pre-modern cultural formation) through textual practices inspired by his private Liturgy of the Hours. In this light, the translations of the Books of the Bible on which Miłosz worked, his novel The Mountains of Parnassus, as well as his essays from Visions from San Francisco Bay emerge as instruments for shaping communal identity with the use of pre-existing rituals, which are, nonetheless, also negotiated in the act of writing.

    Bidirectional Model Transformations in QVT: Semantic Issues and Open Questions

    We consider the OMG's Query/View/Transformation (QVT) standard as applied to the specification of bidirectional transformations between models. We discuss what is meant by bidirectional transformations, and the model-driven development scenarios in which they are needed. We analyse the fundamental requirements on tools which support such transformations, and discuss some semantic issues which arise. We argue that a considerable amount of basic research is needed before suitable tools will be fully realisable, and suggest directions for this future research.
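    As a rough intuition for what "bidirectional" means here, the following sketch shows a minimal lens-style pair of transformations in Python, together with the two round-trip laws such tools are usually expected to respect. The Person/name example and the law names (GetPut, PutGet) are borrowed from the lens literature for illustration; they are not taken from the QVT standard itself.

        # A minimal lens-style bidirectional transformation: a forward
        # "get" projecting a view from a source model, and a backward
        # "put" merging an updated view back into the source.
        from dataclasses import dataclass, replace

        @dataclass(frozen=True)
        class Person:  # hypothetical source model
            name: str
            age: int

        def get(source: Person) -> str:
            """Forward direction: extract the view (here, just the name)."""
            return source.name

        def put(source: Person, view: str) -> Person:
            """Backward direction: propagate an edited view to the source."""
            return replace(source, name=view)

        p = Person("Ada", 36)
        assert get(put(p, "Grace")) == "Grace"  # PutGet: put then get returns the view
        assert put(p, get(p)) == p              # GetPut: an unchanged view changes nothing

    A tool supporting bidirectional transformations must, at a minimum, keep such round trips consistent; a well-known source of semantic difficulty is that the view generally does not determine the source uniquely.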

    Process, System, Causality, and Quantum Mechanics: A Psychoanalysis of Animal Faith

    We shall argue in this paper that a central piece of modern physics does not really belong to physics at all but to elementary probability theory. Given a joint probability distribution J on a set of random variables containing x and y, define a link between x and y to be the condition x=y on J. Define the state D of a link x=y as the joint probability distribution matrix on x and y without the link. The two core laws of quantum mechanics are the Born probability rule and the unitary dynamical law, whose best known form is the Schrödinger equation. Von Neumann formulated these two laws in the language of Hilbert space as prob(P) = trace(PD) and D'T = TD respectively, where P is a projection, D and D' are (von Neumann) density matrices, and T is a unitary transformation. We'll see that if we regard link states as density matrices, the algebraic forms of these two core laws occur as completely general theorems about links. When we extend probability theory by allowing cases to count negatively, we find that the Hilbert space framework of quantum mechanics proper emerges from the assumption that all D's are symmetrical in rows and columns. On the other hand, Markovian systems emerge when we assume that one of every linked variable pair has a uniform probability distribution. By representing quantum and Markovian structure in this way, we see clearly both how they differ and how they can coexist in natural harmony with each other, as they must in quantum measurement, which we'll examine in some detail. Looking beyond quantum mechanics, we see how both structures have their special places in a much larger continuum of formal systems that we have yet to look for in nature.
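    The two algebraic forms quoted above are easy to check numerically. This small Python sketch (using numpy) verifies the Born rule prob(P) = trace(PD) and the unitary law D'T = TD for a hand-picked 2x2 density matrix; the specific matrices are illustrative assumptions, not taken from the paper.

        import numpy as np

        # A 2x2 density matrix D: Hermitian, positive semidefinite, trace 1.
        D = np.array([[0.75, 0.25],
                      [0.25, 0.25]])

        # A projection P onto the first basis vector.
        P = np.array([[1.0, 0.0],
                      [0.0, 0.0]])

        # Born rule: the probability assigned to P is trace(PD).
        prob = np.trace(P @ D)
        print(f"prob(P) = {prob:.2f}")  # 0.75

        # Unitary dynamics: D' = T D T^dagger, equivalently D'T = TD.
        theta = np.pi / 5
        T = np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])
        D_new = T @ D @ T.conj().T
        assert np.allclose(D_new @ T, T @ D)     # the form quoted in the abstract
        assert np.isclose(np.trace(D_new), 1.0)  # evolution preserves the trace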

    A heuristic-based approach to code-smell detection

    Encapsulation and data hiding are central tenets of the object-oriented paradigm. Deciding what data and behaviour to form into a class, and where to draw the line between its public and private details, can make the difference between a class that is an understandable, flexible and reusable abstraction and one that is not. This decision is difficult and may easily result in poor encapsulation, which can have serious implications for a number of system qualities. Such encapsulation problems are often hard to identify within large software systems until they cause a maintenance problem (which is usually too late), and performing such analysis manually is tedious and error-prone. Two common encapsulation problems that arise from this decomposition process are data classes and god classes. Typically, the two occur together: data classes lack functionality that has been absorbed into an over-complicated and domineering god class. This paper describes the architecture of a tool, developed as a plug-in for the Eclipse IDE, which automatically detects data classes and god classes. The technique has been evaluated in a controlled study on two large open-source systems, comparing the tool's results with similar work by Marinescu, who employs a metrics-based approach to detecting such features. The study provides some valuable insights into the strengths and weaknesses of the two approaches.
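    For intuition, here is a minimal Python sketch of a metrics-based detection heuristic in the spirit of the problem described above. The metric names and thresholds are invented for illustration; they are not the rules used by the paper's tool or by Marinescu's detection strategies.

        from dataclasses import dataclass

        @dataclass
        class ClassMetrics:
            name: str
            num_methods: int      # methods implementing real behaviour
            num_accessors: int    # trivial getters/setters
            num_fields: int
            coupled_classes: int  # distinct classes this class depends on

        def is_god_class(m: ClassMetrics) -> bool:
            # Behaviour-heavy and highly coupled: the class "does too much".
            return m.num_methods > 30 and m.coupled_classes > 10

        def is_data_class(m: ClassMetrics) -> bool:
            # Mostly fields plus trivial accessors, little behaviour of its own.
            return m.num_fields > 0 and m.num_accessors >= m.num_methods

        order_manager = ClassMetrics("OrderManager", num_methods=42,
                                     num_accessors=3, num_fields=15,
                                     coupled_classes=14)
        print(is_god_class(order_manager))   # True
        print(is_data_class(order_manager))  # False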

    The Need for a Different Approach to Financial Reporting and Standard-setting

    International Financial Reporting Standards are questioned. Possibly, there is a need for a different kind of standards and a different procedure for developing them. No doubt, there is a need for a more profound theoretical approach to these issues. Theory-building in accounting should include approaches whereby problem descriptions have a broad coverage and cross the borders of traditional specialisations. In this paper, a theoretical approach is outlined. According to this approach, insights into control problems for any organisation or system can be gained by analysing relationships between global value chains and a hierarchy of one or several organisations. Time is crucial. Instrumentality is regarded as an inevitable and necessary guideline for any control system that relates resources to functions and visions; it concerns the effects of tools on certain functions. In the paper, financial reporting and standard-setting are placed in a wide context in which longitudinal relationships are essential for individuals, organisations and control systems. Basic financial accounting concepts and their relationships with business events are discussed. The importance of uncertainty for financial reporting is emphasized, as is the fact that control from top levels is exercised at a distance. A tendency to instrumentalism is also recognized: measures and procedures, for example standard-setting procedures, tend to become important in themselves, irrespective of their ultimate economic functions in a wider perspective. The analysis in the paper is one application of a general approach to financial control for all types of organisations, an approach based on a number of previous research-oriented books published over several decades and the author's own experiences from internal and external processes with organisations in focus. The consistency and integrative power of the ideas have been tested in relation to certain books in various fields outside the core of the subject: applied systems theory, theatre, sociology, economic history, institutional theory and economics.
    Keywords: financial reporting; International Financial Reporting Standards; standard-setting; accounting standard-setting bodies; supervisory boards; corporate governance; transparency; market value accounting; mark-to-market; fair values; historical values; accounting theory.

    Computing overlappings by unification in the deterministic lambda calculus LR with letrec, case, constructors, seq and variable chains

    Correctness of program transformations in extended lambda calculi with a contextual semantics is usually based on reasoning about the operational semantics, which is a rewrite semantics. A successful approach to proving correctness is the combination of a context lemma with the computation of overlaps between program transformations and the reduction rules. The method is similar to the computation of critical pairs for the completion of term rewriting systems. We describe an effective unification algorithm to determine all overlaps of transformations with reduction rules for the lambda calculus LR, which comprises recursive let-expressions, constructor applications, case expressions and a seq construct for strict evaluation. The unification algorithm employs many-sorted terms, the equational theory of left-commutativity modeling multi-sets, context variables of different kinds, and a mechanism for compactly representing binding chains in recursive let-expressions. As a result, the algorithm computes a finite set of overlappings for the reduction rules of the calculus LR that serve as a starting point for automating the analysis of program transformations.
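    For readers unfamiliar with unification, the Python sketch below implements plain first-order syntactic unification, the textbook core on which overlap computation builds. It deliberately omits everything that makes the paper's algorithm non-trivial: many-sorted terms, the left-commutativity theory for multi-sets, context variables, and binding chains.

        # Terms: a variable is a string starting with an uppercase letter;
        # a compound term is a tuple (functor, arg1, ..., argN).

        def is_var(t):
            return isinstance(t, str) and t[:1].isupper()

        def walk(t, subst):
            # Follow variable bindings to the representative term.
            while is_var(t) and t in subst:
                t = subst[t]
            return t

        def occurs(var, term, subst):
            # Occurs check: refuse cyclic bindings like X = f(X).
            term = walk(term, subst)
            if term == var:
                return True
            if isinstance(term, tuple):
                return any(occurs(var, a, subst) for a in term[1:])
            return False

        def unify(t1, t2, subst=None):
            # Return a most general unifier as a dict, or None on failure.
            subst = {} if subst is None else subst
            t1, t2 = walk(t1, subst), walk(t2, subst)
            if t1 == t2:
                return subst
            if is_var(t1):
                return None if occurs(t1, t2, subst) else {**subst, t1: t2}
            if is_var(t2):
                return None if occurs(t2, t1, subst) else {**subst, t2: t1}
            if (isinstance(t1, tuple) and isinstance(t2, tuple)
                    and t1[0] == t2[0] and len(t1) == len(t2)):
                for a, b in zip(t1[1:], t2[1:]):
                    subst = unify(a, b, subst)
                    if subst is None:
                        return None
                return subst
            return None  # clash: different functors or arities

        # Overlap-style query: unify a transformation's left-hand-side
        # pattern with a reduction rule's redex pattern (both hypothetical).
        print(unify(("app", "X", "Y"), ("app", ("lam", "z", "Z"), "Y")))
        # {'X': ('lam', 'z', 'Z')}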