Multiplexed model predictive control of interconnected systems
A Multiplexed Model Predictive Control (MMPC) scheme with a Quadratic Dissipativity Constraint (QDC) for interconnected systems is presented in this paper. A centralized MMPC is designed for the global system, in which the controls of the subsystems are updated sequentially to reduce computation time. In MMPC, the optimization requires the global state vector of the interconnected system. In this approach, the QDC is converted into an enforced stability constraint for the MMPC as an alternative to the terminal constraint and terminal cost. Nominal recursive feasibility for the global system and iterative feasibility for the local subsystems are obtained via set operations on the invariant sets. The admissible sets for the control inputs are derived and employed in the QDC-based stability constraint. The set operations are sped up by several orders of magnitude thanks to the multiplexed inputs in MMPC. Numerical simulations of Automatic Generation Control (AGC) in power systems with tie-lines demonstrate the theoretical development.

The authors acknowledge the support of the Singapore National Research Foundation (NRF) under its Campus for Research Excellence And Technological Enterprise (CREATE) programme and the Cambridge Centre for Advanced Research in Energy Efficiency in Singapore (Cambridge CARES), C4T project.

This is the author accepted manuscript. The final version is available from IEEE via http://dx.doi.org/10.1109/CDC.2015.740256
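The multiplexed update schedule described above can be sketched in a few lines. This is a hedged toy illustration, not the paper's algorithm: `solve_local` is a hypothetical stand-in for each subsystem's local optimization (which in the paper would be a QP subject to the QDC-based stability constraint), and a simple round-robin schedule is assumed.

```python
import numpy as np

# Sketch of multiplexed input updates: at sampling instant k only subsystem
# k mod M re-solves its (here trivialized) optimization, while the remaining
# inputs are held constant -- this is what shrinks the per-step computation.
def multiplexed_step(k, inputs, solve_local):
    i = k % len(inputs)                 # subsystem scheduled at instant k
    inputs = inputs.copy()
    inputs[i] = solve_local(i, inputs)  # stand-in for the local constrained QP
    return inputs

# Toy "local solver" (hypothetical): nudge subsystem i's input toward a setpoint.
def solve_local(i, inputs, setpoint=1.0, gain=0.5):
    return inputs[i] + gain * (setpoint - inputs[i])

u = np.zeros(3)        # three interconnected subsystems
for k in range(12):    # each subsystem gets updated every third instant
    u = multiplexed_step(k, u, solve_local)
```

After 12 instants each of the three inputs has been re-optimized four times; the point is that only one reduced problem is solved per instant rather than one global problem over all inputs.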
Principles of Neuromorphic Photonics
In an age overrun with information, the ability to process reams of data has
become crucial. The demand for data will continue to grow as smart gadgets
multiply and become increasingly integrated into our daily lives.
Next-generation industries in artificial intelligence services and
high-performance computing are so far supported by microelectronic platforms.
These data-intensive enterprises rely on continual improvements in hardware.
Their prospects are running up against a stark reality: conventional
one-size-fits-all solutions offered by digital electronics can no longer
satisfy this need, as Moore's law (exponential hardware scaling),
interconnection density, and the von Neumann architecture reach their limits.
With its superior speed and reconfigurability, analog photonics can provide
some relief to these problems; however, complex applications of analog
photonics have remained largely unexplored due to the absence of a robust
photonic integration industry. Recently, the landscape for
commercially-manufacturable photonic chips has been changing rapidly and now
promises to achieve economies of scale previously enjoyed solely by
microelectronics.
The scientific community has set out to build bridges between the domains of
photonic device physics and neural networks, giving rise to the field of
\emph{neuromorphic photonics}. This article reviews the recent progress in
integrated neuromorphic photonics. We provide an overview of neuromorphic
computing, discuss the associated technology (microelectronic and photonic)
platforms and compare their metric performance. We discuss photonic neural
network approaches and challenges for integrated neuromorphic photonic
processors while providing an in-depth description of photonic neurons and a
candidate interconnection architecture. We conclude with a future outlook of
neuro-inspired photonic processing.

Comment: 28 pages, 19 figures
Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts
The findings of a preliminary investigation by Southwest Research Institute (SwRI) into simulation host computer concepts are presented. It is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. The focus of the investigation is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe-based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time-critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.
Quantitative multiplexing with nano-self-assemblies in SERS.
Multiplexed or simultaneous detection of multiple analytes is a valuable tool in many analytical applications. However, complications caused by the presence of interfering compounds in a sample form a major drawback in existing molecular sensor technologies, particularly in multi-analyte systems. Although separating analytes through extraction or chromatography can partially address the problem of interferents, there remains a need for direct observational tools capable of multiplexing that can be applied in situ. Surface-enhanced Raman spectroscopy (SERS) is an optical molecular fingerprinting technique that has the ability to resolve analytes from within mixtures. SERS has attracted much attention for its potential in multiplexed sensing, but its quantitative abilities have been limited. Here, we report a facile supramolecular SERS-based method for quantitative multiplex analysis of small organic molecules in aqueous environments such as human urine.

The authors thank Ms. Anna Andreou for the 1H-NMR measurements and acknowledge funding from the Walters-Kundert Trust, EPSRC (EP/K028510/1, EP/G060649/1, EP/H007024/1, ERC LINASS 320503), an ERC starting investigator grant (ASPiRe 240629), and the EU CUBiHOLE grant. S.K. thanks the Krebs Memorial Scholarship (The Biochemical Society) and the Cambridge Commonwealth Trust for funding.

This is the final version of the article. It first appeared from NPG via http://dx.doi.org/10.1038/srep0678
Psychopower and Ordinary Madness: Reticulated Dividuals in Cognitive Capitalism
Despite the seemingly neutral vantage of using nature for widely-distributed computational purposes, neither post-biological nor post-humanist teleology simply concludes with the real "end of nature" as entailed in the loss of the specific ontological status embedded in the identifier "natural." As evinced by the ecological crises of the Anthropocene—of which the 2019 Brazil Amazon rainforest fires are only the most recent—our epoch has transfixed the "natural order" and imposed entropic artificial integration, producing living species that become "anoetic," made to serve as automated exosomatic residues, or digital flecks. I further develop Gilles Deleuze's description of control societies to upturn Foucauldian biopower, replacing its spatio-temporal bounds with the exographic excesses of psychopower; culling and further detailing Bernard Stiegler's framework of transindividuation and hyper-control, I examine how becoming-subject is predictively facilitated within cognitive capitalism and what Alexander Galloway terms "deep digitality." Despite the loss of material vestiges qua virtualization—which I seek to trace in a historical review of industrialization to postindustrialization—the drive-based and reticulated "internet of things" facilitates a closed loop from the brain to the outside environment, such that the aperture of thought is mediated and compressed.
The human brain, understood through its material constitution, is susceptible to total datafication's laminated process of "becoming-mnemotechnical," and, as neuroplasticity is now a valid description for deep learning and neural nets, we are privy to the rebirth of the once-discounted metaphor of the "cybernetic brain." Probing algorithmic governmentality while posing noetic dreaming as both technical and pharmacological, I seek to analyze how spirit is blithely confounded with machine-thinking's gelatinous cognition, as prosthetic organ-adaptation becomes probabilistically molded, networked, and agentially inflected (rather than simply externalized).
Multiomics modeling of the immunome, transcriptome, microbiome, proteome and metabolome adaptations during human pregnancy.
Motivation: Multiple biological clocks govern a healthy pregnancy. These biological mechanisms produce immunologic, metabolomic, proteomic, genomic and microbiomic adaptations during the course of pregnancy. Modeling the chronology of these adaptations during full-term pregnancy provides the frameworks for future studies examining deviations implicated in pregnancy-related pathologies including preterm birth and preeclampsia.

Results: We performed a multiomics analysis of 51 samples from 17 pregnant women, delivering at term. The datasets included measurements from the immunome, transcriptome, microbiome, proteome and metabolome of samples obtained simultaneously from the same patients. Multivariate predictive modeling using the Elastic Net (EN) algorithm was used to measure the ability of each dataset to predict gestational age. Using stacked generalization, these datasets were combined into a single model. This model not only significantly increased predictive power by combining all datasets, but also revealed novel interactions between different biological modalities. Future work includes expansion of the cohort to preterm-enriched populations and in vivo analysis of immune-modulating interventions based on the mechanisms identified.

Availability and implementation: Datasets and scripts for reproduction of results are available through: https://nalab.stanford.edu/multiomics-pregnancy/.

Supplementary information: Supplementary data are available at Bioinformatics online.
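The stacking scheme described above can be sketched with synthetic data. This is a hedged, numpy-only illustration, not the study's pipeline: the two "omics" blocks and their shapes are invented, and ordinary least squares stands in for the paper's Elastic Net base models. The key idea shown is the one the abstract names: fit a base model per dataset, generate out-of-fold predictions of gestational age, then train a level-1 (meta) model on those stacked predictions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data (hypothetical shapes): n samples with a
# "gestational age" target and two noisy omics-like feature blocks.
n = 60
y = rng.uniform(10, 40, n)                        # gestational age in weeks
blocks = [y[:, None] + rng.normal(0, 3, (n, 5)),  # e.g. a "proteome" block
          y[:, None] + rng.normal(0, 5, (n, 4))]  # e.g. a "metabolome" block

def fit_predict_oof(X, y, folds=5):
    """Out-of-fold predictions from a least-squares base model
    (a stand-in for the paper's Elastic Net)."""
    pred = np.empty_like(y)
    idx = np.arange(len(y))
    for f in range(folds):
        test = idx[f::folds]
        train = np.setdiff1d(idx, test)
        Xt = np.c_[np.ones(len(train)), X[train]]   # add intercept column
        w, *_ = np.linalg.lstsq(Xt, y[train], rcond=None)
        pred[test] = np.c_[np.ones(len(test)), X[test]] @ w
    return pred

# Level-1 features: each dataset's out-of-fold prediction of gestational age.
Z = np.column_stack([fit_predict_oof(X, y) for X in blocks])

# Meta-model (stacked generalization): combine the per-block predictions.
Zt = np.c_[np.ones(n), Z]
w, *_ = np.linalg.lstsq(Zt, y, rcond=None)
stacked = Zt @ w
```

Using out-of-fold rather than in-sample base predictions is what keeps the meta-model from simply rewarding whichever base model overfits most.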