
    Lattice $\phi^4$ theory of finite-size effects above the upper critical dimension

    We present a perturbative calculation of finite-size effects near $T_c$ of the $\phi^4$ lattice model in a $d$-dimensional cubic geometry of size $L$ with periodic boundary conditions for $d > 4$. The structural differences between the $\phi^4$ lattice theory and the $\phi^4$ field theory found previously in the spherical limit are shown to exist also for a finite number of components of the order parameter. The two-variable finite-size scaling functions of the field theory are nonuniversal, whereas those of the lattice theory are independent of the nonuniversal model parameters. One-loop results for finite-size scaling functions are derived. Their structure disagrees with the single-variable scaling form of the lowest-mode approximation for any finite $\xi/L$, where $\xi$ is the bulk correlation length. At $T_c$, the large-$L$ behavior becomes lowest-mode like for the lattice model but not for the field-theoretic model. Characteristic temperatures close to $T_c$ of the lattice model, such as the temperature $T_{max}(L)$ of the maximum of the susceptibility $\chi$, are found to scale asymptotically as $T_c - T_{max}(L) \sim L^{-d/2}$, in agreement with previous Monte Carlo (MC) data for the five-dimensional Ising model. We also predict $\chi_{max} \sim L^{d/2}$ asymptotically. On a quantitative level, the asymptotic amplitudes of this large-$L$ behavior close to $T_c$ have not been observed in previous MC simulations at $d = 5$ because of nonnegligible finite-size terms $\sim L^{(4-d)/2}$ caused by the inhomogeneous modes. These terms identify the possible origin of a significant discrepancy between the lowest-mode approximation and previous MC data. MC data of larger systems would be desirable for testing the magnitude of the $L^{(4-d)/2}$ and $L^{4-d}$ terms predicted by our theory. Comment: Accepted in Int. J. Mod. Phys.
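    The slow decay of these corrections at $d = 5$ is easy to make concrete. Below is a minimal numerical sketch in which the amplitudes are unit placeholders, not values from the paper; only the powers of $L$ come from the abstract.

```python
# Relative size of the finite-size corrections to chi_max ~ L^{d/2}
# at d = 5. Amplitudes a, b, c are arbitrary placeholders, NOT values
# taken from the paper; only the exponents come from the abstract.
d = 5

def chi_max(L, a=1.0, b=1.0, c=1.0):
    """Leading term a*L^(d/2) dressed with the predicted corrections."""
    return a * L ** (d / 2) * (1 + b * L ** ((4 - d) / 2) + c * L ** (4 - d))

for L in (4, 8, 16, 32, 64):
    rel = chi_max(L) / L ** (d / 2) - 1  # combined correction terms
    print(f"L = {L:3d}: relative correction = {rel:.3f}")
```

    With unit amplitudes, the combined correction is still of order 10% even at $L = 64$, which illustrates why larger systems would be needed to see the asymptotic amplitudes.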

    History and Implications of the Missouri Test-Oath Case

    Cummings v. Missouri (1867) is often overlooked in modern legal history, and very little scholarly literature exists chronicling the case’s implications for contemporary constitutional jurisprudence. When awareness does exist, there is a tendency to classify Cummings as simply a Civil War-era religious liberty case, a mischaracterization that reflects a fundamental misunderstanding of the ruling’s background and modern relevance. In reality, born of post-war paranoia over loyalty and past Confederate allegiances, the Cummings case is most notable as landmark judicial precedent defining the U.S. Constitution’s proscriptions of bills of attainder and ex post facto laws, and it possesses very little significance today for religious liberty jurisprudence. Beginning with an analysis of the historical and political circumstances of the period, this article seeks to reframe the scholarly conversation surrounding Cummings to reflect the true place of importance it holds in the anthology of American legal history.

    Finding Forensic Evidence in the Operating System's Graphical User Interface

    A branch of cyber security known as memory forensics focuses on extracting meaningful evidence from system memory. This analysis is often referred to as volatile memory analysis and is generally performed on memory captures acquired from target systems. A memory capture contains the complete state of a system under investigation, including the contents of currently running as well as previously executed applications. Analysis of this data can reveal a significant amount of activity that occurred on a system since the last reboot. This research targets the Windows operating system. In particular, the graphical user interface components that include the taskbar, start menu, and notification system are examined for possible forensic artifacts. The techniques presented in this research are valuable to a forensic investigator trying to find evidence. They are also useful for penetration testers trying to determine whether a tool has left any evidence behind for investigators to find. The research described in this thesis led to the development of a scanning technique that served as the basis for a Volatility plugin that automates finding GUI-related artifacts. To support this research, a lab consisting of three virtual machines (VMs) was created using VMware: two Windows 10 virtual machines for generating artifacts and one Linux machine for scanning the Windows machines. These machines were briefly connected to a live router for gathering network information. This thesis explores the strengths and limitations of the scanning technique discovered during the research. Lastly, future applications of this research are covered.
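    As an illustration of the general idea (not the thesis's actual Volatility plugin), the sketch below scans a raw memory capture for UTF-16LE strings, the encoding Windows GUI components use for window titles, taskbar entries, and notification text; the image path and keywords are hypothetical.

```python
# Standalone sketch: scan a raw memory capture for UTF-16LE strings.
# NOT the thesis's Volatility plugin; IMAGE and KEYWORDS are hypothetical.
import re

IMAGE = "win10_capture.raw"       # hypothetical capture file
KEYWORDS = ["Notepad", "Start"]   # hypothetical terms of interest

# A printable ASCII byte followed by a NUL byte, five or more times:
# the typical byte pattern of UTF-16LE text in a little-endian dump.
UTF16_RUN = re.compile(rb"(?:[\x20-\x7e]\x00){5,}")

with open(IMAGE, "rb") as f:
    data = f.read()

for m in UTF16_RUN.finditer(data):
    text = m.group().decode("utf-16-le", errors="ignore")
    if any(k in text for k in KEYWORDS):
        print(f"offset 0x{m.start():08x}: {text}")
```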

    Establishment of effective metamodels for seakeeping performance in multidisciplinary ship design optimization

    Ship design is a complex multidisciplinary optimization process to determine configuration variables that satisfy a set of mission requirements. Unfortunately, high-fidelity commercial tools for ship performance estimation, such as Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA), are computationally expensive and time-consuming to execute, which limits the ship designer’s ability to explore a larger range of optimization solutions. In this paper, Latin Hypercube Design was used to select the sample data covering the design space. The percentage of downtime, a comprehensive seakeeping evaluation index, was used to evaluate seakeeping performance for both short-term and long-term wave distributions in the Multidisciplinary Design Optimization (MDO) process. The five ship motions considered for seakeeping performance were roll, pitch, yaw, sway, and heave. In particular, a new effective approximation modelling technique, Single-Parameter Lagrangian support vector regression (SPL-SVR), was investigated to construct ship seakeeping metamodels to facilitate the application of MDO. Considering the effects of two ship speeds, the established metamodels of ship seakeeping performance for the short-term percentage of downtime are satisfactory for seakeeping predictions during the conceptual design stage; thus, the new approximation algorithm provides an optimal and cost-effective solution for constructing the metamodels used in the MDO process.
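    A minimal sketch of the metamodel workflow described above follows; scikit-learn's standard epsilon-SVR stands in for the paper's SPL-SVR, and the "seakeeping response" is a synthetic placeholder rather than a CFD or seakeeping code.

```python
# Sketch: Latin Hypercube sampling of a design space, then an SVR
# surrogate queried cheaply inside an MDO loop. Standard epsilon-SVR
# stands in for the paper's SPL-SVR; the response function is synthetic.
import numpy as np
from scipy.stats import qmc
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# 1. Latin Hypercube sample of a 3-variable design space
#    (illustrative bounds, e.g. length/beam ratio, draft, speed).
sampler = qmc.LatinHypercube(d=3, seed=0)
X = qmc.scale(sampler.random(n=60), [5.0, 2.0, 10.0], [9.0, 4.0, 20.0])

# 2. "Expensive" response: placeholder for a seakeeping simulation
#    returning, say, percentage of downtime.
y = 10 + 2 * X[:, 0] - 1.5 * X[:, 1] + 0.3 * X[:, 2] ** 1.2
y += rng.normal(scale=0.2, size=len(y))

# 3. Fit the surrogate and evaluate a candidate design in milliseconds.
model = SVR(kernel="rbf", C=100.0, epsilon=0.1).fit(X, y)
print("predicted downtime %:", model.predict([[7.0, 3.0, 15.0]]))
```

    The point of the surrogate is that step 3 replaces a simulation run costing hours with a prediction costing microseconds, which is what makes exploring a large design space in MDO feasible.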

    ArraySearch: A Web-Based Genomic Search Engine

    Recent advances in microarray technologies have resulted in a flood of genomics data. This large body of accumulated data could be used as a knowledge base to help researchers interpret new experimental data. ArraySearch finds statistical correlations between newly observed gene expression profiles and the huge source of well-characterized expression signatures deposited in the public domain. A search query of a list of genes will return experiments on which the genes are significantly up- or downregulated collectively. Searches can also be conducted using gene expression signatures from new experiments. This resource will empower biological researchers with a statistical method to explore expression data from their own research by comparing it with expression signatures from a large public archive.
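    The kind of query ArraySearch answers can be sketched as follows; the abstract does not specify the statistic used, so a generic rank-sum test and a synthetic archive stand in here.

```python
# Sketch: rank archived experiments by whether a query gene list is
# collectively up- or downregulated. The rank-sum test is a generic
# stand-in (ArraySearch's actual statistic is unspecified); the archive
# of expression profiles below is synthetic.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
genes = [f"g{i}" for i in range(500)]
archive = {f"exp{j}": dict(zip(genes, rng.normal(size=500)))
           for j in range(20)}
archive["exp0"].update({g: 2.0 for g in genes[:10]})  # plant a hit

query = genes[:10]  # hypothetical query gene list

hits = []
for name, profile in archive.items():
    in_set = [profile[g] for g in query]
    rest = [v for g, v in profile.items() if g not in query]
    # Two-sided test: are the query genes' values shifted as a group?
    stat, p = mannwhitneyu(in_set, rest, alternative="two-sided")
    hits.append((p, name))

for p, name in sorted(hits)[:3]:   # most significant experiments first
    print(f"{name}: p = {p:.2e}")
```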