
    Essays on Regional Power System Investment: Value of Planning Model Enhancements, Transmission Generation Storage Co-optimization, and Border Carbon Adjustment

    This thesis is composed of three essays on power system planning models, which identify what transmission, generation, storage, and demand-management assets would be beneficial to invest in (or retire) over a multidecadal time horizon for large geographic regions. In the first essay, I propose a framework to systematically evaluate the economic benefits of enhancements to planning models, facilitating meaningful comparisons among model enhancements. I test the framework in a transmission expansion planning (TEP) context for the western U.S. and compare four enhancements: (1) consideration of multiple long-run policy, economy, and technology scenarios, (2) refined representations of short-run operational variability due to demand and variable energy resources, (3) refined power flow modeling, and (4) inclusion of generation unit commitment costs and constraints. Results show that the consideration of long-run uncertainties provides the most benefits, while benefits from the other three enhancements are relatively small. The interaction between storage and transmission can be both complementary and substitutive. In the second essay, to quantify the benefits of considering this interaction in TEP, I enhance the TEP model with storage expansion capability and test it in a planning context for the western U.S. Results show that the benefits of anticipating storage expansion in TEP increase when the assumed cost of building storage decreases but are sensitive to assumed carbon prices. Compared to the total value that storage can bring to the power system, the value of anticipating storage expansion in TEP can be significant, showing a strong impact of TEP decisions on the profitability of storage investors. In the third essay, I use the TEP model to test the effectiveness of different border carbon adjustment policies in the western U.S. power system, in which California unilaterally regulates carbon emissions. The results show that charging electricity imports based on the facility-specific emission rate of the import contract can lead to substantial emissions leakage and even increases in total system emissions. Meanwhile, assuming the same emission rate across all electricity imports can partially mitigate leakage and result in small system-wide emissions reductions. Finally, basing the import emission rate on the marginal emission rate external to the carbon pricing regime can encourage a system-wide emission reduction, achieving the best economic efficiency.
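The transmission-storage co-optimization described in the second essay can be illustrated with a deliberately tiny linear program. The sketch below is a hypothetical two-bus, two-period model, not the thesis's TEP model: all costs, demands, and the round-trip efficiency are invented for illustration, and the planner jointly chooses line capacity and storage capacity.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: [L, S, gA1, gA2, gB1, gB2, ch, dis]
#   L   : transmission capacity (MW)     S      : storage capacity (MW)
#   gA1, gA2: cheap remote generation per period, limited by the line
#   gB1, gB2: expensive local generation
#   ch, dis : storage charge (off-peak) and discharge (peak)
cost = [50, 30, 20, 20, 100, 100, 0, 0]   # amortized investment + fuel costs

# Power balance at the load bus (off-peak demand 80 MW, peak demand 120 MW)
A_eq = [[0, 0, 1, 0, 1, 0, -1, 0],        # off-peak: gA1 + gB1 - ch  = 80
        [0, 0, 0, 1, 0, 1, 0, 1]]         # peak:     gA2 + gB2 + dis = 120
b_eq = [80, 120]

A_ub = [[-1, 0, 1, 0, 0, 0, 0, 0],        # gA1 <= L (line limit, off-peak)
        [-1, 0, 0, 1, 0, 0, 0, 0],        # gA2 <= L (line limit, peak)
        [0, -1, 0, 0, 0, 0, 1, 0],        # ch  <= S
        [0, -1, 0, 0, 0, 0, 0, 1],        # dis <= S
        [0, 0, 0, 0, 0, 0, -0.9, 1]]      # dis <= 0.9 * ch (round-trip loss)
b_ub = [0, 0, 0, 0, 0]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, method="highs")
line_cap, storage_cap = res.x[0], res.x[1]
```

With these made-up numbers the optimum builds both assets: a smaller line plus storage that shifts off-peak imports into the peak, which is cheaper than the line-only plan. That is exactly the complementary/substitutive interaction the essay quantifies at realistic scale.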

    Attention-free Spikformer: Mixing Spike Sequences with Simple Linear Transforms

    By integrating the self-attention capability and the biological properties of Spiking Neural Networks (SNNs), Spikformer applies the flourishing Transformer architecture to SNN design. It introduces a Spiking Self-Attention (SSA) module to mix sparse visual features using spike-form Query, Key, and Value, achieving state-of-the-art (SOTA) performance on numerous datasets compared to previous SNN-like frameworks. In this paper, we demonstrate that the Spikformer architecture can be accelerated by replacing the SSA with an unparameterized Linear Transform (LT) such as the Fourier and Wavelet transforms. These transforms are used to mix spike sequences, reducing the quadratic time complexity to log-linear time complexity. They alternate between the frequency and time domains to extract sparse visual features, showcasing strong performance and efficiency. We conduct extensive experiments on image classification using both neuromorphic and static datasets. The results indicate that, compared to the SOTA Spikformer with SSA, Spikformer with LT achieves higher Top-1 accuracy on neuromorphic datasets (i.e., CIFAR10-DVS and DVS128 Gesture) and comparable Top-1 accuracy on static datasets (i.e., CIFAR-10 and CIFAR-100). Furthermore, Spikformer with LT achieves approximately 29-51% improvement in training speed and 61-70% improvement in inference speed, and reduces memory usage by 4-26% because it requires no learnable parameters. Comment: Under Review
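The parameter-free mixing idea can be sketched in a few lines. Below is a hypothetical FNet-style Fourier mixer applied to a binary spike tensor; the shapes and the 0.5 threshold are invented for illustration, and the actual Spikformer-with-LT architecture interleaves such transforms with spiking neuron layers.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical spike tensor: (batch, timesteps, channels), values in {0, 1}
spikes = (rng.random((2, 16, 32)) < 0.2).astype(np.float32)

def fourier_mix(x):
    # Unparameterized token mixing: FFT over the feature axis, then over
    # the sequence axis, keeping the real part. Cost is O(N log N) per
    # axis instead of the O(N^2) of self-attention, with no learnable
    # parameters to store or train.
    return np.fft.fft(np.fft.fft(x, axis=-1), axis=-2).real

mixed = fourier_mix(spikes)
# A downstream spiking neuron layer would then re-binarize the features:
out_spikes = (mixed > 0.5).astype(np.float32)
```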

    Ferromagnetic, structurally disordered ZnO implanted with Co ions

    We present superparamagnetic clusters of structurally highly disordered Co-Zn-O created by high-fluence Co ion implantation into ZnO (0001) single crystals at low temperatures. This secondary phase cannot be detected by common x-ray diffraction but is observed by high-resolution transmission electron microscopy. In contrast to many other secondary phases in a ZnO matrix, it induces a low-field anomalous Hall effect and is thus a candidate for magneto-electronics applications. Comment: 5 pages, 3 figures

    Recent Advances and New Frontiers in Spiking Neural Networks

    In recent years, spiking neural networks (SNNs) have received extensive attention in brain-inspired intelligence due to their rich spatio-temporal dynamics, various encoding methods, and event-driven characteristics that naturally fit neuromorphic hardware. With the development of SNNs, brain-inspired intelligence, an emerging research field inspired by brain science achievements and aimed at artificial general intelligence, is attracting growing attention. This paper reviews recent advances and discusses new frontiers in SNNs across five major research topics: essential elements (i.e., spiking neuron models, encoding methods, and topology structures), neuromorphic datasets, optimization algorithms, software frameworks, and hardware frameworks. We hope our survey can help researchers understand SNNs better and inspire new work to advance this field. Comment: Accepted at IJCAI202
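As a concrete example of the first topic, the leaky integrate-and-fire (LIF) neuron, one of the most common spiking neuron models, can be simulated with a simple Euler discretization. The time constant, threshold, and input drive below are arbitrary illustrative values.

```python
import numpy as np

def lif_simulate(input_current, tau=10.0, v_th=1.0, v_reset=0.0, dt=1.0):
    """Leaky integrate-and-fire neuron: the membrane potential v leaks
    toward rest, integrates the input current, and emits a spike (then
    resets) whenever it crosses the threshold v_th."""
    v = 0.0
    spikes = []
    for I in input_current:
        v = v + dt / tau * (-v + I)   # Euler step of dv/dt = (-v + I) / tau
        if v >= v_th:
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return np.array(spikes)

# Constant suprathreshold drive produces a regular spike train
spike_train = lif_simulate(np.full(100, 1.5))
```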

    Tuning Synaptic Connections instead of Weights by Genetic Algorithm in Spiking Policy Network

    Learning from interaction is the primary way biological agents come to know about their environment and themselves. Modern deep reinforcement learning (DRL) explores a computational approach to learning from interaction and has made significant progress in solving various tasks. However, the powerful DRL is still far from biological agents in energy efficiency. Although the underlying mechanisms are not fully understood, we believe that the integration of spiking communication between neurons and biologically plausible synaptic plasticity plays a prominent role. Following this biological intuition, we optimize a spiking policy network (SPN) with a genetic algorithm as an energy-efficient alternative to DRL. Our SPN mimics the sensorimotor neuron pathway of insects and communicates through event-based spikes. Inspired by biological findings that the brain forms memories by growing new synaptic connections and rewiring these connections based on new experiences, we tune the synaptic connections instead of the weights in the SPN to solve given tasks. Experimental results on several robotic control tasks show that our method can achieve the performance level of mainstream DRL methods while exhibiting significantly higher energy efficiency.
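The connection-tuning idea can be illustrated with a toy sketch: weights stay fixed at random values and a genetic algorithm evolves only a binary mask over the synapses. Everything below (the two-layer rate-based network standing in for the SPN, the synthetic two-action task, and the GA settings) is invented for illustration and is not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random weights; evolution searches only over which synapses exist,
# mimicking structural plasticity rather than weight learning.
W1 = rng.normal(size=(4, 8))   # input -> hidden
W2 = rng.normal(size=(8, 2))   # hidden -> output (2 actions)

# Synthetic task: action 0 iff the observation components sum to > 0
obs_batch = rng.normal(size=(256, 4))
labels = (obs_batch.sum(axis=1) > 0).astype(int)

def act(obs, genome):
    # A genome is a flat binary mask over all 4*8 + 8*2 = 48 synapses
    m1, m2 = genome[:32].reshape(4, 8), genome[32:].reshape(8, 2)
    h = np.maximum(obs @ (W1 * m1), 0.0)   # rate-based stand-in for spikes
    return (h @ (W2 * m2)).argmax(axis=1)

def fitness(genome):
    return (act(obs_batch, genome) == labels).mean()

# Simple elitist GA: keep the best masks, refill by bit-flip mutation
pop = (rng.random((32, 48)) < 0.5).astype(float)
for gen in range(30):
    scores = np.array([fitness(g) for g in pop])
    elite = pop[np.argsort(scores)[-8:]]           # keep the best 8
    children = elite[rng.integers(0, 8, size=24)].copy()
    flips = rng.random(children.shape) < 0.05      # mutate connections
    children[flips] = 1.0 - children[flips]
    pop = np.vstack([elite, children])

best = max(pop, key=fitness)
```

The design point mirrors the abstract: the search space is the connectivity pattern (48 bits here), so "learning" rewires the network instead of adjusting continuous weights.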