Frequency and damping evolution during experimental seismic response of civil engineering structures
This paper analyses the results of seismic tests on several reinforced-concrete shear walls and a four-storey frame. Each specimen was subjected to a horizontal accelerogram of successively increasing amplitude, applied with the pseudodynamic method. A time-domain identification method reveals the evolution of the eigenfrequencies and damping ratios during the earthquakes. The method is formulated as a spatial model in which the stiffness and damping matrices are identified directly from the experimental displacements, velocities, and restoring forces. The identified matrices are then combined with the theoretical mass matrix to obtain the eigenfrequencies, damping ratios, and mode shapes. These parameters are highly relevant to the design of this type of structure.
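The identification step described in the abstract (restoring forces regressed on measured velocities and displacements) can be sketched as a least-squares fit; the matrices and signals below are synthetic placeholders, not the paper's measurements:

```python
import numpy as np

# Minimal sketch, assuming the spatial model r(t) = C v(t) + K d(t):
# identify damping (C) and stiffness (K) matrices by least squares from
# measured velocities V, displacements D, and restoring forces R
# (one column per time sample). All data here are synthetic.
rng = np.random.default_rng(0)
n_dof, n_samples = 2, 500
C_true = np.array([[0.4, -0.1], [-0.1, 0.3]])
K_true = np.array([[50.0, -20.0], [-20.0, 40.0]])

V = rng.standard_normal((n_dof, n_samples))
D = rng.standard_normal((n_dof, n_samples))
R = C_true @ V + K_true @ D                # noiseless synthetic forces

# Stack [V; D] and solve R = [C K] @ [V; D] in the least-squares sense.
X = np.vstack([V, D])                      # shape (2*n_dof, n_samples)
CK, *_ = np.linalg.lstsq(X.T, R.T, rcond=None)
C_id, K_id = CK.T[:, :n_dof], CK.T[:, n_dof:]

# With the theoretical mass matrix M, eigenfrequencies follow from the
# generalized eigenproblem K x = w^2 M x (M = I here for illustration).
M = np.eye(n_dof)
w2 = np.linalg.eigvals(np.linalg.solve(M, K_id))
freqs_hz = np.sqrt(np.sort(w2.real)) / (2 * np.pi)
```

With noiseless data the fit recovers C and K exactly; with real test data the residual of the regression indicates how well the linear spatial model explains the measured response.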
Earthquake Protection of Existing Structures with Limited Seismic Joint: Base Isolation with Supplemental Damping versus Rotational Inertia
Existing civil engineering structures of strategic importance, such as hospitals, fire stations, and power plants, often do not comply with the seismic standards in force today, as they were designed and built to past structural guidelines. On the other hand, because of their special role, the structural integrity of such buildings is vital during and after earthquakes, which places demands on strategies for their seismic protection. In this regard, seismic base isolation has been widely employed; however, the limited seismic joint between adjacent buildings may hamper this application because of the large displacements concentrated at the isolation floor. In this paper, we compare two possible remedies: the first provides supplemental damping in a conventional base isolation system; the second combines base isolation with supplemental rotational inertia. For the second strategy, a mechanical device called an inerter is arranged in series with spring and dashpot elements to form the so-called tuned-mass-damper-inerter (TMDI), directly connected to the isolation floor. Several advantages of the second system over the first are outlined, especially with regard to limiting floor accelerations and interstory drifts, which may be an issue for nonstructural elements and equipment, in addition to disturbing occupants. Once the optimal design of the TMDI is established, possible implementation of this system in existing structures is discussed.
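For readers unfamiliar with the inerter, its defining property (a force proportional to the relative acceleration across its two terminals, scaled by an "inertance" b in kg) can be sketched as follows; the values are illustrative, not taken from the paper:

```python
# Minimal sketch of an ideal inerter, the device the TMDI is built
# around. Unlike a mass, it reacts to *relative* acceleration between
# its terminals, so it adds apparent mass without gravitational weight.
def inerter_force(b, accel_1, accel_2):
    """Force (N) developed by an ideal inerter of inertance b (kg)."""
    return b * (accel_1 - accel_2)

# A 300 kg inertance seeing a 0.5 m/s^2 relative acceleration reacts
# like 300 kg of real mass would under the same acceleration:
f = inerter_force(300.0, 1.5, 1.0)
```

This is why a TMDI can emulate a very heavy tuned mass damper with a physically small device, which underlies the displacement and acceleration benefits discussed in the abstract.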
Keeping Authorities "Honest or Bust" with Decentralized Witness Cosigning
The secret keys of critical network authorities - such as time, name,
certificate, and software update services - represent high-value targets for
hackers, criminals, and spy agencies wishing to use these keys secretly to
compromise other hosts. To protect authorities and their clients proactively
from undetected exploits and misuse, we introduce CoSi, a scalable witness
cosigning protocol ensuring that every authoritative statement is validated and
publicly logged by a diverse group of witnesses before any client will accept
it. A statement S collectively signed by W witnesses assures clients that S has
been seen, and not immediately found erroneous, by those W observers. Even if S
is compromised in a fashion not readily detectable by the witnesses, CoSi still
guarantees S's exposure to public scrutiny, forcing secrecy-minded attackers to
risk that the compromise will soon be detected by one of the W witnesses.
Because clients can verify collective signatures efficiently without
communication, CoSi protects clients' privacy, and offers the first
transparency mechanism effective against persistent man-in-the-middle attackers
who control a victim's Internet access, the authority's secret key, and several
witnesses' secret keys. CoSi builds on existing cryptographic multisignature
methods, scaling them to support thousands of witnesses via signature
aggregation over efficient communication trees. A working prototype
demonstrates CoSi in the context of timestamping and logging authorities,
enabling groups of over 8,000 distributed witnesses to cosign authoritative
statements in under two seconds. (20 pages, 7 figures.)
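The aggregation idea CoSi builds on can be sketched with a toy Schnorr-style multisignature: witnesses' commitments and responses combine into one signature a client verifies with a single check. The tiny group parameters, five flat (tree-less) witnesses, and absence of rogue-key protection below are illustrative assumptions, not the paper's protocol:

```python
import hashlib
import random

# Toy Schnorr group: g = 4 has prime order q = 11 modulo p = 23.
# Real deployments use elliptic-curve groups; this is only a sketch.
p, q, g = 23, 11, 4

def challenge(R, X, m):
    data = f"{R}|{X}|{m}".encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

random.seed(1)
message = "authoritative statement S"
secrets = [random.randrange(1, q) for _ in range(5)]   # witness keys
pubs = [pow(g, x, p) for x in secrets]

# Commit phase: each witness announces g^r for a fresh nonce r.
nonces = [random.randrange(1, q) for _ in secrets]
commits = [pow(g, r, p) for r in nonces]

# Aggregation (in CoSi this happens up a spanning tree of witnesses).
R = 1
for Ri in commits:
    R = (R * Ri) % p
X = 1
for Xi in pubs:
    X = (X * Xi) % p

# One challenge, one aggregated response for all witnesses.
c = challenge(R, X, message)
s = sum(r + c * x for r, x in zip(nonces, secrets)) % q

# A client verifies the single collective signature (c, s) offline,
# with no communication: g^s must equal R * X^c.
valid = pow(g, s, p) == (R * pow(X, c, p)) % p
```

The key property is that the verification cost is independent of the number of witnesses, which is what lets clients check thousands of cosigners efficiently.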
Enabling Smart Retrofitting and Performance Anomaly Detection for a Sensorized Vessel: A Maritime Industry Experience
The integration of sensorized vessels, enabling real-time data collection and
machine-learning-driven data analysis, marks a pivotal advancement in the
maritime industry. This transformative technology can not only enhance safety,
efficiency, and sustainability but also usher in a new era of cost-effective
and smart maritime transportation in our increasingly interconnected world.
This study presents a deep learning-driven anomaly detection system augmented
with interpretable machine learning models for identifying performance
anomalies in an industrial sensorized vessel called TUCANA. We leverage a
human-in-the-loop unsupervised process that uses standard and
Long Short-Term Memory (LSTM) autoencoders augmented with interpretable
surrogate models, i.e., random forest and decision tree, to add transparency
and interpretability to the results provided by the deep learning models. The
interpretable models also enable automated rule generation for translating the
inference into human-readable rules. The process also includes projecting the
results using t-distributed stochastic neighbor embedding (t-SNE), which aids
understanding of the structure and relationships within the data and the
assessment of the identified anomalies. We
empirically evaluate the system on real data acquired from the vessel TUCANA,
achieving over 80% precision and 90% recall with the LSTM model used in the
process. The interpretable models also provide logical
rules aligned with expert thinking, and the t-SNE-based projection enhances
interpretability. Our system demonstrates that the proposed approach can be
used effectively in real-world scenarios, offering transparency and precision
in performance anomaly detection.
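A minimal sketch of the reconstruction-error thresholding that autoencoder-based detectors of this kind typically rely on; the synthetic error values below stand in for the outputs of the paper's standard and LSTM autoencoders:

```python
import numpy as np

# Minimal sketch, assuming an autoencoder has already produced
# per-sample reconstruction errors: samples whose error exceeds a high
# percentile of the errors seen on normal data are flagged as
# anomalies. All error values here are synthetic.
rng = np.random.default_rng(42)
normal_errors = rng.exponential(scale=0.1, size=1000)  # "normal" data
threshold = np.percentile(normal_errors, 99)           # cutoff

new_errors = np.array([0.05, 0.08, 5.0, 0.02])         # new samples
is_anomaly = new_errors > threshold
```

The threshold percentile is the main lever trading precision against recall; the human-in-the-loop step in the paper's process is one way to tune it.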
Encouraging and Ensuring Successful Technology Transition in Civil Aviation
Technology transitions are essential to transforming air traffic management to meet future capacity needs. Encouraging equipage adoption is one crucial aspect of such transitions. We propose an approach for developing strategies to persuade aviation stakeholders to transition to new technologies. Our approach uses the distribution of cost, benefit, and value across stakeholders and over time to determine which strategies are most appropriate for persuading aircraft operators to adopt new equipage. Equipage that shows an overall positive value can nevertheless fail to provide value to individual stakeholders. Such imbalances in value distribution between stakeholders or over time may lead to stakeholder intransigence and can stymie efforts to transform air traffic management systems. Leverage strategies that correct these imbalances and accelerate the realization of value for all stakeholders can enhance cooperation and increase the likelihood of a successful transition to the new technology. We demonstrate the application of the approach using the case of automatic dependent surveillance-broadcast (ADS-B). The approach is also applicable to a wide range of industries beyond aviation, such as the energy sector and telecommunications.
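The value-distribution analysis described in the abstract can be sketched as a per-stakeholder net-present-value calculation; the stakeholders, cash flows, and discount rate below are invented for illustration, not taken from the ADS-B case:

```python
# Minimal sketch: discounting each stakeholder's yearly cash flows
# separately shows how an equipage with positive overall value can
# still leave one stakeholder worse off. All figures are invented.
def npv(cash_flows, rate=0.07):
    """Net present value of yearly cash flows starting at year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

flows = {
    "aircraft operator": [-500, 40, 60, 80, 90],       # pays up front
    "air navigation service": [-100, 120, 150, 160, 170],
}
per_stakeholder = {name: npv(cf) for name, cf in flows.items()}
total = sum(per_stakeholder.values())
```

Here the total is positive while the operator's own NPV is negative, which is exactly the imbalance a leverage strategy (e.g. shifting cost or accelerating benefits to the operator) would aim to correct.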
Vintage-Differentiated Environmental Regulation
Vintage-differentiated regulation (VDR) is a common feature of many environmental and other regulatory policies in the United States. Under VDR, standards for regulated units are fixed in terms of the units’ respective dates of entry, or “vintage,” with later entrants facing more stringent regulation. In the most common application, often referred to as “grandfathering,” units produced prior to a specific date are exempted from new regulation or face less stringent requirements. The vintage-differentiated approach has long appealed to many participants in the policy community, for reasons associated with efficiency, equity, and simple politics. First, it is frequently more cost-effective, in the short term, to introduce new pollution-abatement technologies at the time that new plants are constructed than to retrofit older facilities with such technologies. Second, it seems fairer to avoid changing the rules of the game in mid-stream, and hence to apply new standards only to new plants. Third, political pressures tend to favor easily identified existing facilities over undefined potential facilities. On the other hand, VDRs can be expected, on the basis of standard investment theory, to retard turnover in the capital stock (of durable plants and equipment), and thereby to reduce the cost-effectiveness of regulation in the long term, compared with equivalent undifferentiated regulations. A further irony is that, when this slower turnover results in delayed adoption of new, cleaner technology, VDR can result in higher levels of pollutant emissions than would occur in the absence of regulation. In this Article, I survey previous applications, synthesize current thinking regarding VDRs in the environmental realm, and develop lessons for public policy and for future research. In Part 2, I describe the ubiquitous nature of VDRs in U.S. regulatory policy, and examine the reasons why VDRs are so common.
In Part 3, I establish a theoretical framework for analysis of the cost-effectiveness of alternative types of environmental policy instruments to provide a context for the analysis of VDRs. In Part 4, I focus on the effects of VDRs, and describe a general theory of the impacts of these instruments in terms of their effects on technology adoption, capital turnover, pollution abatement costs, and environmental performance. In Parts 5 and 6, I examine empirical analyses of the impacts of VDRs in two significant sectors: Part 5 focuses on the effects of VDRs in the U.S. auto industry, and Part 6 on the effects of new source review, which is a form of VDR, in power generation and other sectors. In Part 7, I examine implications for policy and research, and recommend avenues for improvements in both.
The energy center initiative at Politecnico di Torino: practical experiences on energy efficiency measures in the Municipality of Torino
Urban districts should evolve towards a more sustainable infrastructure and greener energy carriers. The utmost challenge is the smart integration and control, within the existing infrastructure, of new information and energy technologies (such as sensors, appliances, electric and thermal power and storage devices) that can provide multiple services based on multiple actors and multiple, interchangeable energy carriers. In recent years, the Municipality of Torino has served as an experimental scenario in which practical experiences in the following areas have taken place through a number of projects: (1) energy efficiency in buildings; (2) smart energy grid management and smart metering; (3) biowaste-to-energy, i.e., mixed urban/industrial waste management with enhanced energy recovery from biogas. This work provides an overview and update on the most interesting smart energy management initiatives in the urban context of Torino, with an analysis and quantification of the advantages gained in terms of energy and environmental efficiency.
Stacco: Differentially Analyzing Side-Channel Traces for Detecting SSL/TLS Vulnerabilities in Secure Enclaves
Intel Software Guard Extensions (SGX) offers software applications enclaves to
protect their confidentiality and integrity from malicious operating systems.
The SSL/TLS protocol, which is the de facto standard for protecting
transport-layer network communications, has been broadly deployed for a secure
communication channel. However, in this paper, we show that the marriage
between SGX and SSL may not be smooth sailing.
Particularly, we consider a category of side-channel attacks against SSL/TLS
implementations in secure enclaves, which we call the control-flow inference
attacks. In these attacks, the malicious operating system kernel may perform a
powerful man-in-the-kernel attack to collect execution traces of the enclave
programs at page, cacheline, or branch level, while positioning itself in the
middle of the two communicating parties. At the center of our work is a
differential analysis framework, dubbed Stacco, to dynamically analyze the
SSL/TLS implementations and detect vulnerabilities that can be exploited as
decryption oracles. Surprisingly, we found exploitable vulnerabilities in the
latest versions of all the SSL/TLS libraries we have examined.
To validate the detected vulnerabilities, we developed a man-in-the-kernel
adversary to demonstrate Bleichenbacher attacks against the latest OpenSSL
library running in the SGX enclave (with the help of Graphene) and completely
broke the PreMasterSecret encrypted by a 4096-bit RSA public key with only
57286 queries. We also conducted CBC padding oracle attacks against the latest
GnuTLS running in Graphene-SGX and an open-source SGX-implementation of mbedTLS
(i.e., mbedTLS-SGX) that runs directly inside the enclave, and showed that it
only needs 48388 and 25717 queries, respectively, to break one block of AES
ciphertext. Empirical evaluation suggests these man-in-the-kernel attacks can
be completed within 1 or 2 hours. (CCS '17, October 30 to November 3, 2017, Dallas, TX, USA.)
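The query pattern behind the CBC padding-oracle attacks mentioned above can be sketched as follows. A toy XOR "cipher" stands in for AES so the oracle logic stays readable; the attack structure is real, but the code is illustrative and is not the paper's tooling:

```python
import os

# Minimal sketch of one step of a CBC padding-oracle attack: recover
# the last plaintext byte of a block using only a yes/no padding check.
BLOCK = 16
KEY = os.urandom(BLOCK)

def enc_block(b):
    # Toy "block cipher": XOR with a fixed key (stand-in for AES).
    return bytes(x ^ k for x, k in zip(b, KEY))

dec_block = enc_block  # XOR is its own inverse

def padding_oracle(iv, ct):
    """Return True iff the CBC-decrypted block has valid PKCS#7 padding."""
    pt = bytes(d ^ i for d, i in zip(dec_block(ct), iv))
    n = pt[-1]
    return 1 <= n <= BLOCK and pt.endswith(bytes([n]) * n)

# One CBC-encrypted block with PKCS#7 padding.
secret = b"top secret!"                      # 11 bytes -> 5 bytes of 0x05
padded = secret + bytes([BLOCK - len(secret)]) * (BLOCK - len(secret))
iv = os.urandom(BLOCK)
ct = enc_block(bytes(p ^ i for p, i in zip(padded, iv)))

# Recover the last plaintext byte in at most 256 oracle queries. The
# 0xAA tweak on the next-to-last IV byte rules out accidental longer
# valid paddings, so a "valid" answer means the forged padding is 0x01.
recovered = None
for guess in range(256):
    forged = iv[:-2] + bytes([iv[-2] ^ 0xAA, iv[-1] ^ guess ^ 0x01])
    if padding_oracle(forged, ct):
        recovered = guess                    # last plaintext byte
        break
```

A real attack peels earlier bytes the same way, a few hundred queries per byte, which is consistent with the tens of thousands of queries per ciphertext reported in the abstract.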