    The Information Catastrophe

    Currently we produce 10^21 digital bits of information annually on Earth. Assuming a 20 percent annual growth rate, we estimate that 350 years from now the number of bits produced will exceed the number of all atoms on Earth, or 10^50. After 250 years, the power required to sustain this digital production will exceed 18.5 TW, the total planetary power consumption today, and 500 years from now the digital content will account for more than half of the Earth's mass, according to the mass-energy-information equivalence principle. Besides the existing global challenges such as climate, environment, population, food, health, energy and security, our estimates here point to another singularity event for our planet, called the Information Catastrophe.
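    The compound-growth arithmetic behind these figures can be checked with a short script. The sketch below is not the paper's model: it assumes production starts at 10^21 bits per year, grows at 20 percent annually, takes 10^50 as the atom count, and assigns each bit a mass m_bit = k_B T ln2 / c^2 at T = 300 K; the starting values and constants are illustrative assumptions.

        import math

        BITS_PER_YEAR = 1e21      # current annual digital bit production (from the abstract)
        GROWTH = 1.20             # 20 percent annual growth (from the abstract)
        ATOMS_ON_EARTH = 1e50     # order-of-magnitude atom count (from the abstract)

        # Years until the annual bit production alone exceeds the number of atoms on Earth.
        years_bits = math.log(ATOMS_ON_EARTH / BITS_PER_YEAR) / math.log(GROWTH)
        print(f"annual bits exceed the atom count after ~{years_bits:.0f} years")  # ~366 with these inputs

        # Years until the cumulative digital content outweighs half of the Earth,
        # using the assumed bit mass m_bit = k_B * T * ln(2) / c**2 at T = 300 K.
        k_B, T, c = 1.380649e-23, 300.0, 2.998e8
        m_bit = k_B * T * math.log(2) / c**2      # ~3.2e-38 kg per bit (assumption)
        half_earth_mass = 0.5 * 5.97e24           # kg

        years, cumulative, annual = 0, 0.0, BITS_PER_YEAR
        while cumulative * m_bit < half_earth_mass:
            cumulative += annual
            annual *= GROWTH
            years += 1
        print(f"digital content passes half of Earth's mass after ~{years} years")  # ~509 with these inputs

    These rough numbers land in the same ballpark as the 350-year and 500-year figures quoted above; the exact values depend on the assumed starting production rate and constants.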

    The generalized Lindemann melting coefficient

    Lindemann developed the melting temperature theory, known as the Lindemann criterion, over 100 years ago. Its main assumption is that melting occurs when the root-mean-square vibration amplitude of ions and atoms in crystals exceeds a critical fraction, h, of the inter-atomic spacing. The Lindemann coefficient h is not fixed by the theory, and scientific papers report different h values for different elements. Here we present previously unobserved data trends suggesting that the Lindemann coefficient could be linked to the groups of the periodic table, taking an exact value for each element belonging to a given periodic group. We report 12 distinctive Lindemann coefficient values corresponding to 12 groups of the periodic table containing solid elements with identifiable melting temperatures. Using these values, recalculation of the melting temperatures gives a good match to the experimental values for 39 elements, corresponding to 12 out of 15 periodic groups. This observation opens up the possibility of further refining the Lindemann melting criterion by stimulating analytical studies of the Lindemann coefficient.
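    For context, a common textbook form of the Lindemann relation, obtained from the high-temperature Debye expression for the mean-square displacement, is T_m = h^2 a^2 M k_B Θ_D^2 / (9 ħ^2), where h is the Lindemann coefficient (not Planck's constant), a the inter-atomic spacing, M the atomic mass and Θ_D the Debye temperature. The sketch below evaluates this form for aluminium with the commonly quoted h ≈ 0.1 and standard reference values; it only illustrates the criterion and does not use the group-specific coefficients reported in the paper.

        import math

        k_B = 1.380649e-23        # Boltzmann constant, J/K
        hbar = 1.054572e-34       # reduced Planck constant, J*s
        amu = 1.660539e-27        # atomic mass unit, kg

        def lindemann_melting_temperature(h, a, mass_amu, theta_D):
            """Textbook Lindemann estimate T_m = h^2 a^2 M k_B Theta_D^2 / (9 hbar^2)."""
            M = mass_amu * amu
            return (h**2 * a**2 * M * k_B * theta_D**2) / (9.0 * hbar**2)

        # Illustrative check for aluminium (reference values, not the paper's data):
        # h ~ 0.1, nearest-neighbour distance a = 2.86 Angstrom, Theta_D = 428 K.
        T_m = lindemann_melting_temperature(h=0.10, a=2.86e-10, mass_amu=26.98, theta_D=428.0)
        print(f"estimated T_m ~ {T_m:.0f} K (experimental melting point of Al: 933 K)")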

    Four-State Anti-Ferroelectric Random Access Memory

    Ferroelectric random access memory (FRAM) is a two-state non-volatile memory in which information is digitally encoded using switchable remanent polarization states within a ferroelectric thin-film capacitor. Here, we propose a novel non-volatile memory based on anti-ferroelectric polycrystalline ceramics, termed anti-FRAM (AFRAM). The AFRAM memory cell architecture is similar to that of FRAM, but its operation protocol is different. Our initial experimental demonstration of the memory effect in an anti-ferroelectric ceramic shows, remarkably, that the AFRAM technology encodes data in both ferroelectric sublattices of the anti-ferroelectric medium. This results in a four-state non-volatile memory capable of storing two digital bits simultaneously, unlike the FRAM technology, which has two memory states and can store only one digital bit per cell.
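    As a way to picture the four-state encoding, the toy model below treats a cell as two independently switchable sublattice polarizations, so a single cell distinguishes four states and stores two bits. The class, the write/read methods and the sign convention are illustrative assumptions, not the device physics or addressing scheme of the paper.

        from dataclasses import dataclass

        @dataclass
        class AFRAMCellToy:
            """Toy four-state cell: one polarization sign per ferroelectric sublattice."""
            sublattice_a: int = +1    # remanent polarization sign of sublattice A
            sublattice_b: int = +1    # remanent polarization sign of sublattice B

            def write(self, bits):
                """Store two bits by setting the polarization sign of each sublattice."""
                self.sublattice_a = +1 if bits[0] else -1
                self.sublattice_b = +1 if bits[1] else -1

            def read(self):
                """Recover the two stored bits from the two sublattice signs."""
                return (1 if self.sublattice_a > 0 else 0,
                        1 if self.sublattice_b > 0 else 0)

        cell = AFRAMCellToy()
        cell.write((1, 0))
        assert cell.read() == (1, 0)   # four distinguishable states, two bits per cell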

    Solid-State Heating Using the Multicaloric Effect in Multiferroics

    The multicaloric effect is defined as the adiabatic, reversible temperature change induced in multiferroic materials by the application of an external electric or magnetic field, and it was first theoretically proposed in 2012. Multicaloric effects in multiferroics, as well as similar caloric effects in single ferroics, such as the magnetocaloric, elastocaloric, barocaloric, and electrocaloric effects, have been the focus of much research due to their potential commercialization in solid-state refrigeration. In this short communication, we examine the thermodynamics of the multicaloric effect for solid-state heating applications. A possible thermodynamic multicaloric heating cycle is proposed and then implemented to estimate the solid-state heating effect for a known electrocaloric system. This work offers a path to applying caloric and multicaloric effects in efficient heating systems, and we give a theoretical estimate of the upper limit of the temperature change achievable in a multicaloric cooling or heating effect.
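    One standard way to estimate such an adiabatic temperature change, for the electrocaloric contribution alone, is the indirect relation ΔT ≈ -(T / (ρ c_p)) · (∂P/∂T)_E · ΔE for a roughly constant pyroelectric coefficient. The sketch below evaluates it with illustrative material parameters; the density, specific heat, pyroelectric coefficient and field range are assumptions, and this is not the multicaloric heating cycle proposed in the paper.

        # Indirect electrocaloric estimate of an adiabatic temperature change.
        T = 300.0        # operating temperature, K (assumed)
        rho = 7.5e3      # density, kg/m^3 (typical ferroelectric ceramic, assumed)
        c_p = 330.0      # specific heat capacity, J/(kg*K) (assumed)
        dP_dT = -4e-4    # pyroelectric coefficient, C/(m^2*K) (assumed constant)
        dE = 50e6        # applied electric field change, V/m (assumed)

        delta_T = -(T / (rho * c_p)) * dP_dT * dE
        print(f"estimated adiabatic temperature change: {delta_T:.1f} K")   # ~2.4 K with these inputs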

    Second law of information dynamics

    One of the most powerful laws in physics is the second law of thermodynamics, which states that the entropy of any system remains constant or increases over time. In fact, the second law is applicable to the evolution of the entire universe, and Clausius stated, “The entropy of the universe tends to a maximum.” Here, we examine the time evolution of information systems, defined as physical systems containing information states within Shannon’s information theory framework. Our observations allow the introduction of the second law of information dynamics (infodynamics). Using two different information systems, digital data storage and a biological RNA genome, we demonstrate that the second law of infodynamics requires the information entropy to remain constant or to decrease over time. This is exactly the opposite of the evolution of the physical entropy, as dictated by the second law of thermodynamics. The surprising result obtained here has massive implications for future developments in genomic research, evolutionary biology, computing, big data, physics, and cosmology.
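    The information entropy tracked here is the Shannon entropy of the symbol statistics, H = -Σ p_i log2 p_i bits per symbol. The minimal sketch below computes it for two short placeholder sequences standing in for an earlier and a later snapshot of an information system; the sequences are made up for illustration, and the constant-or-decreasing trend is the paper's empirical claim, not something this snippet demonstrates.

        from collections import Counter
        from math import log2

        def shannon_entropy(sequence: str) -> float:
            """Shannon information entropy, in bits per symbol, of a discrete sequence."""
            counts = Counter(sequence)
            n = len(sequence)
            return -sum((c / n) * log2(c / n) for c in counts.values())

        # Placeholder sequences standing in for two snapshots of an information system.
        earlier = "AUGGCUAUCGGAUCCGAUAGCU"
        later   = "AUGGCUAUCGGAUCCGAUAGCC"
        print(f"H(earlier) = {shannon_entropy(earlier):.3f} bits/symbol")
        print(f"H(later)   = {shannon_entropy(later):.3f} bits/symbol")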
