Weak Secrecy in the Multi-Way Untrusted Relay Channel with Compute-and-Forward
We investigate the problem of secure communications in a Gaussian multi-way
relay channel applying the compute-and-forward scheme using nested lattice
codes. All nodes employ half-duplex operation and can exchange confidential
messages only via an untrusted relay. The relay is assumed to be honest but
curious, i.e., an eavesdropper that conforms to the system rules and applies
the intended relaying scheme. We start with the general case of the
single-input multiple-output (SIMO) L-user multi-way relay channel and provide
an achievable secrecy rate region under a weak secrecy criterion. We show that
the securely achievable sum rate is equivalent to the difference between the
computation rate and the multiple access channel (MAC) capacity. Particularly,
we show that all nodes must encode their messages such that the common
computation rate tuple falls outside the MAC capacity region of the relay. We
provide results for the single-input single-output (SISO) and the
multiple-input single-output (MISO) L-user multi-way relay channel as well as
the two-way relay channel. We discuss these results and show the dependency
between channel realization and achievable secrecy rate. We further compare our
result to available results in the literature for different schemes and show
that the proposed scheme operates close to the compute-and-forward rate without
secrecy.
Comment: submitted to JSAC Special Issue on Fundamental Approaches to Network
Coding in Wireless Communication Systems
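Schematically, the sum-rate statement above can be written as follows (an illustrative form with assumed notation, where $R_{\mathrm{comp}}$ denotes the common computation rate and $C_{\mathrm{MAC}}$ the relay's multiple access channel capacity; the exact expressions are in the paper):

```latex
R_{s,\mathrm{sum}} \;=\; \bigl[\, R_{\mathrm{comp}} - C_{\mathrm{MAC}} \,\bigr]^{+},
\qquad [x]^{+} := \max(x,\, 0)
```

The condition that this difference be positive corresponds to the requirement stated above: the common computation rate tuple must fall outside the MAC capacity region of the relay.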
3D-BEVIS: Bird's-Eye-View Instance Segmentation
Recent deep learning models achieve impressive results on 3D scene analysis
tasks by operating directly on unstructured point clouds. Considerable
progress has been made in object classification and semantic segmentation;
instance segmentation, however, remains less explored. In this work, we present
3D-BEVIS, a deep learning framework for 3D semantic instance segmentation on
point clouds. Following the idea of previous proposal-free instance
segmentation approaches, our model learns a feature embedding and groups the
obtained feature space into semantic instances. Current point-based methods
scale linearly with the number of points by processing local sub-parts of a
scene individually. However, to perform instance segmentation by clustering,
globally consistent features are required. Therefore, we propose to combine
local point geometry with global context information from an intermediate
bird's-eye view representation.
Comment: camera-ready version for GCPR '19
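The proposal-free grouping step described above can be illustrated as follows. This is a generic sketch, not the paper's pipeline: the embeddings here are random stand-in data, and mean-shift is one plausible choice of clustering; in 3D-BEVIS the embedding would come from a network combining local point geometry with the bird's-eye-view context.

```python
# Sketch: grouping a learned per-point feature embedding into instances.
import numpy as np
from sklearn.cluster import MeanShift

rng = np.random.default_rng(0)
# Fake embeddings for 3 instances: 50 points each, well separated in
# feature space (a trained embedding is supposed to look like this).
centers = np.array([[0.0, 0.0], [5.0, 5.0], [-5.0, 5.0]])
embeddings = np.vstack([c + 0.1 * rng.standard_normal((50, 2)) for c in centers])

# Points whose embeddings fall into the same mode form one semantic instance.
labels = MeanShift(bandwidth=1.0).fit_predict(embeddings)
print(len(set(labels)))  # number of recovered instances
```

The key point motivating the bird's-eye-view fusion is visible here: clustering only works if points of the same instance land near each other in feature space globally, across the whole scene, not just within local sub-parts.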
INVESTIGATION OF STRATEGIES TO PROMOTE EFFECTIVE TEACHER PROFESSIONAL DEVELOPMENT EXPERIENCES IN EARTH SCIENCE
This dissertation serves as a call to geoscientists to share responsibility with K-12 educators for increasing Earth science literacy. When partnerships are created among K-12 educators and geoscientists, the synergy created can promote Earth science literacy in students, teachers, and the broader community. The research described here resulted in development of tools that can support effective professional development for teachers. One tool is used during the planning stages to structure a professional development program, another set of tools supports measurement of the effectiveness of a development program, and the third tool supports sustainability of professional development programs. The Michigan Teacher Excellence Program (MiTEP), a Math/Science Partnership project funded by the National Science Foundation, served as the test bed for developing and testing these tools.
The first tool, the planning tool, is the Earth Science Literacy Principles (ESLP). The ESLP served as a planning tool for the two-week summer field courses that were part of the MiTEP program. The ESLP, published in 2009, clearly describe what an Earth science literate person should know. The ESLP consist of nine big ideas and their supporting fundamental concepts. Using the ESLP to plan a professional development program helped both instructors and teacher-participants focus on important concepts throughout the professional development activity.
The measurement tools were developed to measure change in teachers’ Earth science content-area knowledge and perceptions related to teaching and learning that result from participating in a professional development program. The first measurement tool, the Earth System Concept Inventory (ESCI), directly measures content-area knowledge through a series of multiple-choice questions that are aligned with the content of the professional development experience. The second measurement tool, an exit survey, collects qualitative data from teachers regarding their impressions of the professional development program. Both the ESCI and the exit survey were tested for validity and reliability.
Lesson study is discussed here as a strategy for sustaining professional development in a school or a district after the end of a professional development activity. Lesson study, as described here, was offered as a formal course. Teachers engaged in lesson study worked collaboratively to design and test lessons that improve the teachers’ classroom practices. Data regarding the impact of the lesson study activity were acquired through surveys, written documents, and group interviews. The data are interpreted to indicate that the lesson study process improved teacher quality and classroom practices. In the case described here, the lesson study process was adopted by the teachers’ district and currently serves as part of the district’s work in Professional Learning Communities, resulting in ongoing professional development throughout the district.
Online Fault Classification in HPC Systems through Machine Learning
As High-Performance Computing (HPC) systems strive towards the exascale goal,
studies suggest that they will experience excessive failure rates. For this
reason, detecting and classifying faults in HPC systems as they occur and
initiating corrective actions before they can transform into failures will be
essential for continued operation. In this paper, we propose a fault
classification method for HPC systems based on machine learning that has been
designed specifically to operate with live streamed data. We cast the problem
and its solution within realistic operating constraints of online use. Our
results show that almost perfect classification accuracy can be reached for
different fault types with low computational overhead and minimal delay. We
have based our study on a local dataset, which we make publicly available,
acquired by injecting faults into an in-house experimental HPC system.
Comment: Accepted for publication at the Euro-Par 2019 conference
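The general approach described above, supervised classification of fault types from windows of streamed system metrics, can be sketched as follows. Everything here is a synthetic stand-in: the metric, the feature set, and the "CPU-hog" fault label are illustrative assumptions, not the paper's dataset or feature definitions.

```python
# Sketch: supervised fault classification on windows of streamed metrics.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

def window_features(trace):
    """Summarize one metric window (e.g. 60 CPU-load samples) into features."""
    return [trace.mean(), trace.std(), trace.max()]

# Synthetic training windows: label 0 = healthy, 1 = injected CPU-hog fault.
healthy = [window_features(rng.normal(0.3, 0.05, 60)) for _ in range(200)]
faulty = [window_features(rng.normal(0.9, 0.05, 60)) for _ in range(200)]
X = np.array(healthy + faulty)
y = np.array([0] * 200 + [1] * 200)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# "Online" use: each new window is featurized and classified as it arrives,
# so corrective action can be taken before the fault becomes a failure.
new_window = rng.normal(0.9, 0.05, 60)
pred = clf.predict([window_features(new_window)])[0]
```

The online constraint mentioned in the abstract shows up in the last step: features must be computable incrementally per window, with low overhead and minimal delay, rather than over the full trace in hindsight.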
Charge and energy dependence of the residence time of cosmic ray nuclei below 15 GeV/nucleon
The relative abundance of nuclear species measured in cosmic rays at Earth has often been interpreted with the simple leaky box model. For this model to be self-consistent, an essential requirement is that the escape length not depend on the nuclear species. The discrepancy between escape length values derived from iron secondaries and from the B/C ratio was identified by Garcia-Munoz and his co-workers using a large amount of experimental data. Ormes and Protheroe found a similar trend in the HEAO data, although they questioned its significance given the uncertainties. They also showed that the change in the B/C ratio values implies a decrease of the residence time of cosmic rays at low energies, in conflict with the diffusive-convective picture. These conclusions crucially depend on the partial cross section values and their uncertainties. Recently new accurate cross sections of key importance for propagation calculations have been measured. Their statistical uncertainties are often better than 4% and their values significantly different from those previously accepted. Here, these new cross sections are used to compare the observed B/C+O and (Sc to Cr)/Fe ratios to those predicted with the simple leaky box model.
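For reference, the simple leaky box model invoked here balances, at equilibrium, sources and losses for each nuclear species $i$ (a standard schematic form; the notation is generic rather than taken from the text):

```latex
\frac{N_i}{\lambda_{\mathrm{esc}}} + \frac{N_i}{\lambda_i}
  \;=\; Q_i \;+\; \sum_{j>i} \frac{N_j}{\lambda_{j\to i}},
\qquad \lambda_{j\to i} = \frac{\bar{m}}{\sigma_{j\to i}}
```

Here $N_i$ is the equilibrium abundance, $\lambda_{\mathrm{esc}}$ the escape length, $\lambda_i$ the total interaction length, $Q_i$ the source term, $\sigma_{j\to i}$ the partial fragmentation cross sections, and $\bar{m}$ the mean interstellar target mass. The fragmentation term is why secondary-to-primary ratios such as B/C constrain $\lambda_{\mathrm{esc}}$ directly, and why the partial cross sections dominate the uncertainty budget.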
Source spectral index of heavy cosmic ray nuclei
From the energy spectra of the heavy nuclei observed by the French-Danish experiment on HEAO-3, the source spectra of the mostly primary nuclei (C, O, Ne, Mg, Si, Ca and Fe) in the framework of an energy dependent leaky box model (Engelmann, et al., 1985) were derived. The energy dependence of the escape length was derived from the observed B/C and sub-iron/iron ratios and the presently available cross sections for C and Fe on H nuclei (Koch-Miramond, et al., 1983). A good fit to the source energy spectra of all these nuclei was obtained by a power law in momentum with an exponent gamma = -2.4 ± 0.05 for the energy range 1 to 25 GeV/n (Engelmann, et al., 1985). Comparison with data obtained at higher energy suggested a progressive flattening of these spectra. More accurate spectral indices are sought by using better values of the escape length based on the latest cross section measurements (Webber 1984, Soutoul, et al., this conference). The aim is also to extend the analysis to lower energies down to 0.4 GeV/n (kinetic energy observed near Earth), using data obtained by other groups. The only nuclei for which a good database exists over a broad range of energies are O and Fe, so the present study is restricted to these two elements.
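The fitted source spectrum quoted above is a power law in momentum (reading the stated exponent as a central value with its uncertainty):

```latex
Q(p) \;\propto\; p^{\gamma},
\qquad \gamma = -2.4 \pm 0.05
\quad (1\text{--}25~\mathrm{GeV}/n)
```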
Possible "dawn" and "dusk" roles of light pulses shifting the phase of a circadian rhythm
A new automatic photoelectric method used in recording the eclosion rate of flies is described. The phase responses of the circadian rhythm of eclosion in Drosophila pseudoobscura to light pulses, of 1000 lx intensity and durations varying between 30 min and 12 h, were studied. The rhythm responds selectively either to the "on" or to the "off" transition of light pulses offered during the subjective night. The light pulses shift phase with the "off" transition during the first half of the night (dusk effect) and shift phase with the "on" transition during the second half of the night (dawn effect). The present findings are briefly discussed in the context of the work of other authors in this field
Shrink or Substitute: Handling Process Failures in HPC Systems using In-situ Recovery
Efficient utilization of today's high-performance computing (HPC) systems
with complex hardware and software components requires that the HPC
applications are designed to tolerate process failures at runtime. With low
mean time to failure (MTTF) of current and future HPC systems, long running
simulations on these systems require capabilities for gracefully handling
process failures by the applications themselves. In this paper, we explore the
use of fault tolerance extensions to Message Passing Interface (MPI) called
user-level failure mitigation (ULFM) for handling process failures without the
need to discard the progress made by the application. We explore two
alternative recovery strategies, which use ULFM along with application-driven
in-memory checkpointing. In the first case, the application is recovered with
only the surviving processes, and in the second case, spares are used to
replace the failed processes, such that the original configuration of the
application is restored. Our experimental results demonstrate that graceful
degradation is a viable alternative for recovery in environments where spares
may not be available.
Comment: 26th Euromicro International Conference on Parallel, Distributed and
Network-Based Processing (PDP 2018)
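The two recovery strategies compared above can be sketched as a toy simulation. This is purely conceptual: real recovery uses ULFM's MPI extensions (e.g. MPIX_Comm_shrink) and the application's own in-memory checkpoints, whereas the function below only models the bookkeeping of who survives and what state they restart from.

```python
# Toy simulation of the two recovery strategies: "substitute" (spares restore
# the original process count) vs "shrink" (continue with survivors only).

def recover(ranks, failed, spares, checkpoint):
    """Return the post-recovery rank list and each rank's restored state."""
    survivors = [r for r in ranks if r not in failed]
    if len(spares) >= len(failed):            # substitute: keep original size
        new_ranks = survivors + spares[:len(failed)]
    else:                                     # shrink: degrade gracefully
        new_ranks = survivors
    # Every rank, old or spare, restarts from the in-memory checkpoint,
    # so the progress made before the failure is not discarded.
    return new_ranks, {r: checkpoint for r in new_ranks}

ranks = [0, 1, 2, 3]
new_ranks, state = recover(ranks, failed={2}, spares=[4, 5], checkpoint="it=100")
```

Here rank 2 fails and spare rank 4 takes its place, restoring the original four-process configuration; with no spares available, the same call would fall through to the shrink branch, matching the paper's observation that graceful degradation remains viable when spares are absent.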
Continuous monitoring of the boundary-layer top with lidar
Continuous lidar observations of the top height of the boundary layer (BL top) have been performed at Leipzig (51.3° N, 12.4° E), Germany, since August 2005. The results of measurements taken with a compact, automated Raman lidar over a one-year period (February 2006 to January 2007) are presented. Four different methods for the determination of the BL top are discussed. The most promising technique, the wavelet covariance algorithm, is improved by implementing some modifications so that an automated, robust retrieval of BL depths from lidar data is possible. Three case studies of simultaneous observations with the Raman lidar, a vertical-wind Doppler lidar, and accompanying radiosonde profiling of temperature and humidity are discussed to demonstrate the potential and the limits of the four lidar techniques under different aerosol and meteorological conditions. The lidar-derived BL top heights are compared with respective values derived from predictions of the regional weather forecast model COSMO of the German Meteorological Service. The comparison shows a general underestimation of the BL top by about 20% by the model. The statistical analysis of the one-year data set reveals that the seasonal mean of the daytime maximum BL top is 1400 m in spring, 1800 m in summer, 1200 m in autumn, and 800 m in winter at the continental, central European site. BL top typically increases by 100–300 m per hour in the morning of convective days
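The basic idea behind the wavelet covariance technique named above can be sketched on a synthetic profile: aerosol backscatter is high in the boundary layer and drops sharply above it, so covariance with a Haar step function peaks at the BL top. The profile, dilation value, and grid below are illustrative assumptions; the paper's refinements to the algorithm are not reproduced here.

```python
# Sketch: BL top as the maximum of a Haar wavelet covariance transform.
import numpy as np

def haar_wct(profile, z, b, a):
    """Covariance of the profile with a Haar step of dilation a, centered at b."""
    h = np.where((z >= b - a / 2) & (z <= b), 1.0,
                 np.where((z > b) & (z <= b + a / 2), -1.0, 0.0))
    dz = z[1] - z[0]
    return np.sum(profile * h) * dz / a

z = np.linspace(0.0, 3000.0, 301)             # height grid [m]
profile = np.where(z < 1500.0, 1.0, 0.2)      # backscatter step at the BL top
profile += 0.01 * np.random.default_rng(2).standard_normal(z.size)

a = 300.0                                     # dilation [m], tunable
candidates = z[(z > a) & (z < z.max() - a)]
bl_top = max(candidates, key=lambda b: haar_wct(profile, z, b, a))
```

The automation challenge addressed in the paper lies in making this robust on real data, where elevated aerosol layers and clouds produce competing local maxima of the transform.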