
    Unified constitutive material models for nonlinear finite-element structural analysis

    Unified constitutive material models were developed for structural analyses of aircraft gas turbine engine components, with particular application to isotropic materials used for high-pressure-stage turbine blades and vanes. Forms or combinations of models independently proposed by Bodner and Walker were considered. These theories combine time-dependent and time-independent aspects of inelasticity into a continuous spectrum of behavior, in sharp contrast to previous classical approaches that partition inelastic strain into uncoupled plastic and creep components. Predicted stress-strain responses from these models were evaluated against monotonic and cyclic test results for uniaxial specimens of two cast nickel-base alloys, B1900+Hf and René 80. Previously obtained tension-torsion test results for Hastelloy X alloy were used to evaluate multiaxial stress-strain cycle predictions. The unified models, as well as appropriate algorithms for integrating the constitutive equations, were implemented in finite-element computer codes.
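
    The contrast the abstract draws between the classical plastic-plus-creep split and a unified formulation can be made concrete with a one-dimensional sketch. The Python fragment below integrates a generic power-law overstress flow rule with a single evolving drag stress; it only illustrates the unified idea, it is not the Bodner or Walker model evaluated in the report, and every symbol and value in it (E, Z0, n, A, h, the loading programme) is a placeholder.

```python
# Minimal sketch of a "unified" viscoplastic update in one dimension.
# Illustrative only: a generic power-law overstress flow rule with a single
# evolving drag stress Z, not the specific Bodner or Walker forms evaluated
# in the report. All parameter values are placeholders.
import numpy as np

E  = 150e3   # Young's modulus [MPa]
Z0 = 800.0   # initial drag stress [MPa]
n  = 5.0     # rate-sensitivity exponent
A  = 1e-6    # reference inelastic strain rate [1/s]
h  = 2e3     # hardening modulus for the drag stress [MPa]

def simulate(total_strain_rate=1e-4, t_end=200.0, steps=20000):
    dt = t_end / steps
    eps_in, Z, sigma = 0.0, Z0, 0.0
    history = []
    for i in range(steps):
        eps_total = total_strain_rate * (i + 1) * dt
        # One flow law covers both "plastic" and "creep" response:
        eps_in_rate = A * np.sign(sigma) * (abs(sigma) / Z) ** n
        eps_in += eps_in_rate * dt
        Z += h * abs(eps_in_rate) * dt      # isotropic hardening of the drag stress
        sigma = E * (eps_total - eps_in)    # Hooke's law on the elastic part
        history.append((eps_total, sigma))
    return history

curve = simulate()  # stress-strain response at one fixed strain rate
```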

    The properties of attractors of canalyzing random Boolean networks

    We study critical random Boolean networks with two inputs per node that contain only canalyzing functions. We present a phenomenological theory that explains how a frozen core of nodes that remain fixed on all attractors arises. This theory leads to an intuitive understanding of the system's dynamics, as it demonstrates the analogy between standard random Boolean networks and networks with canalyzing functions only. It correctly reproduces the scaling of the number of nonfrozen nodes with system size. We then numerically investigate attractor lengths and numbers, and explain the findings in terms of the properties of relevant components. In particular, we show that canalyzing networks can contain very long attractors, although they occur less often than in standard networks.
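
    To make the objects of study concrete, the sketch below builds a toy random Boolean network with two inputs per node, draws a canalyzing update function for each node (one input value forces the output), and runs it to an attractor from a single random initial state. It is a minimal illustration only, not the authors' construction or analysis, and the network size and random choices are arbitrary.

```python
# Illustrative sketch: a random Boolean network with two inputs per node and
# only canalyzing update functions, iterated synchronously until it revisits
# a state. Toy demonstration only; not the authors' analysis.
import random

N = 50
inputs = [(random.randrange(N), random.randrange(N)) for _ in range(N)]

def random_canalyzing_rule():
    # A 2-input function is canalyzing if one value of one input forces the output.
    canalyzing_input = random.randrange(2)      # which input canalyzes
    canalyzing_value = random.randrange(2)      # the forcing value
    forced_output    = random.randrange(2)      # output when forced
    other_outputs    = [random.randrange(2), random.randrange(2)]
    def rule(x, y):
        args = (x, y)
        if args[canalyzing_input] == canalyzing_value:
            return forced_output
        return other_outputs[args[1 - canalyzing_input]]
    return rule

rules = [random_canalyzing_rule() for _ in range(N)]

def step(state):
    return tuple(rules[i](state[a], state[b]) for i, (a, b) in enumerate(inputs))

state = tuple(random.randrange(2) for _ in range(N))
seen = {}
while state not in seen:          # iterate until a previously visited state recurs
    seen[state] = len(seen)
    state = step(state)
attractor_length = len(seen) - seen[state]
print("attractor length:", attractor_length)
```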

    Theory of transient spectroscopy of multiple quantum well structures

    A theory of the transient spectroscopy of quantum well (QW) structures under a large applied bias is presented. An analytical model of the initial part of the transient current is proposed. The time constant of the transient current depends not only on the emission rate from the QWs, as is usually assumed, but also on the subsequent carrier transport across the QWs. Numerical simulation was used to confirm the validity of the proposed model and to study the transient current on a larger time scale. It is shown that the transient current is influenced by the nonuniform distribution of the electric field and related effects, which results in a step-like behavior of the current. A procedure for extracting the QW emission time from transient spectroscopy experiments is suggested.
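
    As a rough illustration of why the observed decay can be slower than the bare emission rate would suggest, the toy model below lets carriers emitted from each well be recaptured by downstream wells while drifting to the contact. It is a deliberately simplified rate sketch, not the analytical model or numerical simulation of the paper, and all rates and probabilities are placeholders.

```python
# Toy rate-equation sketch of a multiple-quantum-well transient: carriers are
# emitted from each well at rate e_emit and may be recaptured by downstream
# wells while drifting to the contact. Purely illustrative; not the paper's model.
import numpy as np

n_wells   = 10
e_emit    = 1.0e6      # emission rate per carrier [1/s]
p_capture = 0.3        # probability of recapture per downstream well
dt        = 1.0e-8     # time step [s]
steps     = 3000

n = np.ones(n_wells)   # initial (normalized) sheet densities
current = []
for _ in range(steps):
    emitted = e_emit * n * dt              # carriers leaving each well this step
    n -= emitted
    collected = 0.0
    for i in range(n_wells):
        flux = emitted[i]
        for j in range(i + 1, n_wells):    # drift across downstream wells
            captured = p_capture * flux
            n[j] += captured
            flux -= captured
        collected += flux                  # what reaches the collector
    current.append(collected / dt)

# The decay of `current` is slower than exp(-e_emit * t) because of recapture.
```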

    Post-modernism's use and abuse of Nietzsche

    I focus on Nietzsche's architectural metaphor of self-construction in arguing for the claim that postmodern readings of Nietzsche misunderstand his various attacks on dogmatic philosophy as paving the way for acceptance of a self characterized by fundamental disunity. Nietzsche's attack on essentialist dogmatic metaphysics is a call to engage in purposive self-creation under a unifying will, a will that possesses the strength to reinterpret history as a pathway to "the problem that we are". Nietzsche agrees with the postmodernists that unity is not a pre-given; however, he would disavow their rejection of unity as a goal. Where the postmodernists celebrate "the death of the subject", Nietzsche rejects this valorization of disunity as a form of nihilism and prescribes the creation of a genuine unified subjectivity to those few capable of such a goal. Postmodernists are nearer Nietzsche's idea of the Last Man than his idea of the Overman.

    Algebraic reduction of the Ising model

    We consider the Ising model on a cylindrical lattice of L columns, with fixed-spin boundary conditions on the top and bottom rows. The spontaneous magnetization can be written in terms of partition functions on this lattice. We show how we can use the Clifford algebra of Kaufman to write these partition functions in terms of L by L determinants, and then further reduce them to m by m determinants, where m is approximately L/2. In this form the results can be compared with those of the Ising case of the superintegrable chiral Potts model. They point to a way of calculating the spontaneous magnetization of that more general model algebraically.
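
    For readers who want to see such cylindrical partition functions written down explicitly, the brute-force transfer-matrix evaluation below computes Z for a small Ising cylinder with both boundary rows fixed to +1. It is only a check-sized illustration of the quantities discussed; it is not the Clifford-algebra determinant reduction developed in the paper, and the lattice size and coupling are arbitrary.

```python
# Brute-force transfer-matrix evaluation of the Ising partition function on a
# small cylinder (periodic around L columns, open along M rows), with the top
# and bottom rows fixed to +1. Illustration only; not the paper's reduction.
import itertools
import numpy as np

L, M, K = 4, 5, 0.4                       # columns, rows, coupling J/(k_B T)

rows = list(itertools.product([-1, 1], repeat=L))

def row_energy(r):                        # bonds within one row (periodic ring)
    return sum(r[i] * r[(i + 1) % L] for i in range(L))

def inter_row(r, s):                      # bonds between two adjacent rows
    return sum(r[i] * s[i] for i in range(L))

# Transfer matrix: Boltzmann weight of appending row s after row r.
T = np.array([[np.exp(K * (inter_row(r, s) + row_energy(s))) for s in rows]
              for r in rows])

up = rows.index(tuple([1] * L))           # index of the all-up boundary row
Z = np.exp(K * row_energy(rows[up])) * np.linalg.matrix_power(T, M - 1)[up, up]
print("Z with both boundary rows fixed to +1:", Z)
```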

    A scalable parallel finite element framework for growing geometries. Application to metal additive manufacturing

    This work introduces an innovative parallel, fully-distributed finite element framework for growing geometries and its application to metal additive manufacturing. It is well known that virtual part design and qualification in additive manufacturing require highly accurate multiscale and multiphysics analyses. Only high performance computing tools are able to handle such complexity in time frames compatible with time-to-market. However, efficiency without loss of accuracy has rarely held centre stage in the numerical community. Here, in contrast, the framework is designed to adequately exploit the resources of high-end distributed-memory machines. It is grounded on three building blocks: (1) hierarchical adaptive mesh refinement with octree-based meshes; (2) a parallel strategy to model the growth of the geometry; and (3) state-of-the-art parallel iterative linear solvers. Computational experiments consider the heat transfer analysis at the part scale of the printing process by powder-bed technologies. After verification against a 3D benchmark, a strong-scaling analysis assesses performance and identifies the major sources of parallel overhead. A third numerical example examines the efficiency and robustness of (2) in a curved 3D shape. Unprecedented parallelism and scalability were achieved in this work. Hence, this framework contributes to taking on higher complexity and/or accuracy, not only in part-scale simulations of metal or polymer additive manufacturing, but also in welding, sedimentation, atherosclerosis, or any other physical problem where the physical domain of interest grows in time.
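
    The activation idea behind building block (2) can be pictured with a toy serial example: cells of a 1D bar are switched on layer by layer at deposition temperature, and a heat-conduction update is applied only to the currently active region. The sketch below is not the parallel finite element strategy of the paper (it uses explicit finite differences and runs serially), and every parameter in it is a placeholder.

```python
# Minimal serial sketch of a "growing geometry" thermal simulation: a 1D bar is
# built up cell by cell and an explicit heat-conduction update is applied only
# to the active cells. Toy-scale illustration of the activation idea only.
import numpy as np

n_cells   = 100
alpha     = 1.0e-5     # thermal diffusivity [m^2/s]
dx        = 1.0e-3     # cell size [m]
dt        = 0.4 * dx**2 / alpha          # stable explicit time step
T_deposit = 1500.0     # temperature of freshly deposited material [C]
T_base    = 25.0       # base-plate temperature [C]
steps_per_layer = 50

T = np.full(n_cells, T_base)
active = 1                               # number of cells that exist so far
for layer in range(1, n_cells):
    active += 1
    T[active - 1] = T_deposit            # newly activated cell starts hot
    for _ in range(steps_per_layer):
        Ta = T[:active].copy()
        # explicit finite-difference diffusion on the active region only
        Ta[1:-1] += alpha * dt / dx**2 * (Ta[2:] - 2 * Ta[1:-1] + Ta[:-2])
        Ta[0] = T_base                   # clamped base plate
        T[:active] = Ta

print("final temperature profile (first 5 cells):", T[:5])
```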

    Measuring Global Similarity between Texts

    We propose a new similarity measure between texts which, contrary to current state-of-the-art approaches, takes a global view of the texts to be compared. We have implemented a tool to compute our textual distance and conducted experiments on several corpora of texts. The experiments show that our methods can reliably identify different global types of texts.
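
    The abstract does not spell out the measure itself, so as a familiar point of reference only, the snippet below computes the normalized compression distance, a classic example of a similarity measure that treats each text as a whole rather than comparing local features. It is not the measure proposed in the paper.

```python
# Normalized compression distance (NCD): a well-known "global" text similarity
# measure, shown here only as a point of reference, not the paper's method.
import zlib

def ncd(x: str, y: str) -> float:
    cx = len(zlib.compress(x.encode("utf-8")))
    cy = len(zlib.compress(y.encode("utf-8")))
    cxy = len(zlib.compress((x + y).encode("utf-8")))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Similar texts compress well together, so their NCD is smaller.
print(ncd("the cat sat on the mat " * 50, "a cat sat on a mat " * 50))
print(ncd("the cat sat on the mat " * 50, "stock prices fell sharply " * 50))
```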

    Data mining: a tool for detecting cyclical disturbances in supply networks.

    Disturbances in supply chains may be either exogenous or endogenous. The ability to automatically detect, diagnose, and distinguish between the causes of disturbances is of prime importance to decision makers in order to avoid uncertainty. The spectral principal component analysis (SPCA) technique has been utilized to distinguish between real and rogue disturbances in a steel supply network. The data set used was collected from four different business units in the network and consists of 43 variables, each described by 72 data points. The present paper utilizes the same data set to test an alternative approach to SPCA for detecting the disturbances. The new approach employs statistical data pre-processing, clustering, and classification learning techniques to analyse the supply network data. In particular, the incremental k-means clustering and RULES-6 classification rule-learning algorithms, developed by the present authors' team, have been applied to identify important patterns in the data set. Results show that the proposed approach can automatically detect and characterize network-wide cyclical disturbances and generate hypotheses about their root causes.
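
    As background for the clustering step mentioned above, the sketch below shows a generic sequential (incremental) k-means update, in which each arriving observation moves its nearest centroid by a running-mean step. It is a textbook-style illustration only, not the authors' incremental k-means or the RULES-6 rule learner, and the demo array merely mimics the stated shape of the data set (43 variables, 72 points each).

```python
# Generic sketch of sequential (incremental) k-means: each observation updates
# the nearest centroid as it arrives, instead of re-clustering the whole data
# set. Illustration only; not the authors' implementation.
import numpy as np

def incremental_kmeans(data, k, rng=np.random.default_rng(0)):
    centroids = data[rng.choice(len(data), size=k, replace=False)].astype(float)
    counts = np.zeros(k)
    for x in data:
        j = int(np.argmin(np.linalg.norm(centroids - x, axis=1)))  # nearest centroid
        counts[j] += 1
        centroids[j] += (x - centroids[j]) / counts[j]             # running-mean update
    return centroids

# e.g. 43 variables described by 72 data points each, as in the data set above
demo = np.random.default_rng(1).normal(size=(43, 72))
print(incremental_kmeans(demo, k=3).shape)
```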

    Impact of Unexpected Events, Shocking News and Rumours on Foreign Exchange Market Dynamics

    We analyze the dynamical response of the world's financial community to various types of unexpected events, including the 9/11 terrorist attacks as they unfolded on a minute-by-minute basis. We find that there are various 'species' of news, characterized by how quickly the news gets absorbed, how much meaning and importance the community assigns to it, and what subsequent actions are then taken. For example, the response to the unfolding events of 9/11 shows a gradual collective understanding of what was happening, rather than an immediate realization. For news items which are not simple economic statements, and hence whose implications are not immediately obvious, we uncover periods of collective discovery during which collective opinions seem to oscillate in a remarkably synchronized way. In the case of a rumour, our findings also provide a concrete example of contagion in interconnected communities. Practical applications of this work include the possibility of producing selective newsfeeds for specific communities, based on their likely impact.

    Random-cluster multi-histogram sampling for the q-state Potts model

    Using the random-cluster representation of the q-state Potts models we consider the pooling of data from cluster-update Monte Carlo simulations for different thermal couplings K and numbers of states per spin q. Proper combination of histograms allows for the evaluation of thermal averages in a broad range of K and q values, including non-integer values of q. Due to restrictions in the sampling process, proper normalization of the combined histogram data is non-trivial. We discuss the different possibilities and analyze their respective ranges of applicability.
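
    For context, the reweighting described here rests on the standard Fortuin-Kasteleyn random-cluster form of the Potts partition function, in which the coupling K and the number of states q enter only through simple factors (the notation below is the common textbook one and may differ from the paper's):

```latex
% Fortuin-Kasteleyn random-cluster representation of the q-state Potts model
% (standard textbook form; the paper's notation may differ).
Z(q, K) \;=\; \sum_{A \subseteq E} v^{|A|}\, q^{c(A)}, \qquad v = e^{K} - 1,
```

    where E is the set of lattice bonds, |A| the number of occupied bonds in configuration A, and c(A) the number of clusters. Because each configuration is weighted only by v^{|A|} q^{c(A)}, a joint histogram of |A| and c(A) recorded at one simulation point can, in principle, be reweighted to other couplings K and to non-integer q, which is the kind of pooling the abstract refers to; the normalization subtleties it mentions are treated in the paper itself.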