159 research outputs found
A Case Study on Investigation of Component Age Dependent Reliability Models - EC JRC Network on Use of Probabilistic Safety Assessments (PSA) for Evaluation of Aging Effects to the Safety of Energy Facilities - Task 4
The report presents the results of a case study on "Investigation of component age dependent reliability models" carried out by INPE and JRC IE within the framework of the EC JRC Ageing PSA Network Task 4 activities. Several Generalized Linear Model formulations were proposed and investigated for continuous and discrete data. The Fisher chi-square minimization approach was applied for goodness-of-fit testing and parameter estimation. Finally, an uncertainty analysis was performed for the estimated parameters and model extrapolations. The results were analyzed and compared with other approaches. JRC.F.4 - Nuclear design safety
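For illustration only, one way to fit an age-dependent reliability model of the kind discussed above is a Poisson-type log-linear GLM fitted by chi-square minimization; the data, the model form, and all names in the sketch below are assumptions made for the example, not values or choices from the report.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data: component age (years) and observed failure counts per
# observation interval. These numbers are made up for the sketch.
ages = np.array([1, 3, 5, 8, 12, 16, 20, 25], dtype=float)
failures = np.array([2, 3, 3, 5, 6, 9, 11, 15], dtype=float)

def expected_failures(params, age):
    """Log-linear (Poisson GLM-style) failure intensity: mu = exp(b0 + b1*age)."""
    b0, b1 = params
    return np.exp(b0 + b1 * age)

def chi2(params):
    """Pearson-type chi-square statistic minimized to fit the model."""
    mu = expected_failures(params, ages)
    return np.sum((failures - mu) ** 2 / mu)

fit = minimize(chi2, x0=[0.0, 0.05], method="Nelder-Mead")
b0, b1 = fit.x
print(f"fitted intercept={b0:.3f}, age trend={b1:.3f}, min chi2={fit.fun:.2f}")

# Extrapolation to a higher age; parameter uncertainty would come from the
# curvature of the chi-square surface around the minimum.
print("expected failures at age 30:", expected_failures(fit.x, 30.0))
```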
ErbB3 signaling prevents cardiac fibrosis after isoproterenol-induced myocardial injury
We generate a new tamoxifen-inducible mouse model of EC-specific ErbB3 overexpression to investigate the role of ErbB3 in myocardial ischemia and heart failure. In the present study, we compared cardiac function and fibrosis development in EC-specific ErbB3-overexpressing mice versus controls in the isoproterenol (ISO)-induced model of cardiac injury, which culminates in the development of cardiac fibrosis.
Weighted Maximum Independent Set of Geometric Objects in Turnstile Streams
We study the Maximum Independent Set problem for geometric objects given in the data stream model. A set of geometric objects is said to be independent if the objects are pairwise disjoint. We consider geometric objects in one and two dimensions, i.e., intervals and disks. Let $\alpha$ be the cardinality of the largest independent set. Our goal is to estimate $\alpha$ in a small amount of space, given that the input is received as a one-pass stream. We also consider a generalization of this problem by assigning weights to the objects and estimating $\alpha_w$, the largest total weight of an independent set. We initiate the study of this problem in the turnstile streaming model (insertions and deletions) and provide the first algorithms for estimating $\alpha$ and $\alpha_w$.
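For intuition about the quantity being estimated (and not as part of the paper's streaming algorithms), the independence number $\alpha$ of a set of intervals can be computed exactly offline by the classic greedy scan over right endpoints; the function and sample data below are purely illustrative.

```python
from typing import List, Tuple

def max_independent_intervals(intervals: List[Tuple[float, float]]) -> int:
    """Exact size of a maximum independent (pairwise-disjoint) set of intervals,
    computed offline by the classic greedy scan over right endpoints. The paper's
    contribution is estimating this quantity in one pass over a turnstile stream
    using small space; this baseline only illustrates what is being estimated."""
    count = 0
    last_end = float("-inf")
    for left, right in sorted(intervals, key=lambda iv: iv[1]):
        if left > last_end:        # disjoint from the last chosen (closed) interval
            count += 1
            last_end = right
    return count

print(max_independent_intervals([(0, 2), (1, 3), (2.5, 4), (5, 6)]))  # -> 3
```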
For unit-length intervals, we obtain a constant-factor approximation to $\alpha$ and $\alpha_w$ using polylogarithmic space. We also show a matching lower bound. Combined with the approximation for insertion-only streams by Cabello and Perez-Lantero [CP15], our result implies a separation between the insertion-only and turnstile models. For unit-radius disks, we obtain a constant-factor approximation to $\alpha$ and $\alpha_w$ in polylogarithmic space, where the approximation factor is closely related to the hexagonal circle packing constant.
We also provide algorithms for estimating $\alpha$ for arbitrary-length intervals under a bounded intersection assumption, and we study the parameterized space complexity of estimating $\alpha$ and $\alpha_w$, where the parameter is the ratio of the maximum to the minimum interval length.
Comment: The lower bound for arbitrary-length intervals in the previous version contains a bug; we are updating the submission to reflect this.
Processing algorithm for weekly records of the Roztochia landscape-geophysical station thermograph М-16АН as a backup source of air temperature data
Formulation of the problem. While processing the database of the Roztochia landscape-geophysical station (RLGS), located in the village of Bryukhovychi, Lviv, a gap in the air temperature data for 1990–1991 was found. The task of the research was to find sources of air temperature data at RLGS that would allow us to fill the gaps for the night hours, when observers did not take measurements.
Problems of further research. In contrast to the method adopted in Ukraine for processing weekly thermograph tapes, this study proposes correcting the air temperature values during processing by compensating for accelerated or slowed rotation of the weekly thermograph drum. It is also suggested to use only those dry-bulb measurements carried out on days with cloudy or rainy weather.
The purpose. The main goal was to develop an algorithm for processing weekly thermograph tapes when dry-thermometer measurements by an observer are partially absent, in order to fill the gaps in night-time air temperature data.
Research methods. Air temperature values at the observation times were read from the thermograph tapes and entered into a spreadsheet. These values were compared (by difference estimation) with the corresponding ones recorded in the KM-1 books. Two new corrections were then required. The first correction is made along the ordinate axis, adjusting the temperature recorded by the thermograph toward the temperature measured by the dry thermometer. The second correction is made along the abscissa axis, compensating for slowed or accelerated rotation of the drum.
Presentation of the main research material. A brief description of the proposed algorithm for thermograph tape processing is as follows. In a spreadsheet, separate columns record the temperature values at the observation times: (a) from the dry thermometer and (b) from the thermograph at the points corresponding to the observation times. Subtracting column (b) from column (a), we identify the dry-bulb values suitable for calculating corrections; differences that are too large, which occur during rapid temperature changes, are rejected. Next, we look for points on the thermograph tape that serve as time benchmarks (the starting and ending points of the temperature curve and the places where the observer drew vertical lines). These temperature values form column (c); it additionally includes the temperatures read from the tape for rainy and/or overcast days, at points whose position is corrected for time. The difference between column (a) and column (c) gives the temperature correction for several observation times on each weekly strip taken separately. The last step is linear interpolation of the temperature corrections between neighbouring observation times.
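For illustration, a minimal numerical sketch of the correction steps is given below, assuming the tape curve and the dry-bulb observations have already been digitized; the array names, the synthetic readings, and the rejection threshold are assumptions of the sketch, not values from the study.

```python
import numpy as np

# Illustrative, synthetic inputs (hours from the start of the weekly strip);
# in the real workflow these come from the digitized tape and the KM-1 books.
tape_hours = np.arange(0.0, 168.0, 1.0)                       # hourly thermograph readings
tape_temp = 10.0 + 5.0 * np.sin(2 * np.pi * tape_hours / 24)  # made-up tape curve

obs_hours = np.array([6.0, 30.0, 54.0, 102.0, 150.0])      # dry-bulb observation times
dry_bulb = np.array([10.8, 9.5, 11.2, 10.1, 9.9])          # column (a): dry thermometer
tape_at_obs = np.interp(obs_hours, tape_hours, tape_temp)  # column (b): tape at those times

# Corrections at the observation times; reject differences that are too large,
# since they indicate rapid temperature changes (threshold is illustrative).
diff = dry_bulb - tape_at_obs
keep = np.abs(diff) < 2.0
corr_hours, corr = obs_hours[keep], diff[keep]

# Ordinate correction: linear interpolation of the corrections between
# neighbouring observation times, applied to every tape reading. The abscissa
# (time) correction would first rescale tape_hours between benchmark points.
correction_curve = np.interp(tape_hours, corr_hours, corr)
corrected_temp = tape_temp + correction_curve
```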
Practical value. The proposed algorithm may help eliminate gaps in temperature data at other observation points, where the thermograph served as a backup device for recording air temperature.
Research results. Measurements taken in rainy and overcast weather, when air temperature changes slowly, are best suited for calculating thermograph corrections. It is also necessary to identify benchmark points for time fixation, to which the moments of putting the tape on and removing it from the drum should be added.
Quantum-Inspired Algorithms from Randomized Numerical Linear Algebra
We create classical (non-quantum) dynamic data structures supporting queries
for recommender systems and least-squares regression that are comparable to
their quantum analogues. De-quantizing such algorithms has received a flurry of
attention in recent years; we obtain sharper bounds for these problems. More
significantly, we achieve these improvements by arguing that the previous
quantum-inspired algorithms for these problems are doing leverage or
ridge-leverage score sampling in disguise; these are powerful and standard
techniques in randomized numerical linear algebra. With this recognition, we
are able to employ the large body of work in numerical linear algebra to obtain
algorithms for these problems that are simpler or faster (or both) than
existing approaches.
Comment: Adding new numerical experiments.
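For reference, plain row leverage-score sampling for least-squares regression, the standard randomized numerical linear algebra primitive referred to above, can be sketched as follows; the dense SVD, the sample size, and the synthetic data are illustrative assumptions and not the paper's dynamic data structures.

```python
import numpy as np

def leverage_scores(A: np.ndarray) -> np.ndarray:
    """Row leverage scores of A: squared row norms of an orthonormal basis for
    A's column space (via a thin SVD; fine for a small dense example)."""
    U, _, _ = np.linalg.svd(A, full_matrices=False)
    return np.sum(U ** 2, axis=1)

def sample_rows(A: np.ndarray, b: np.ndarray, k: int, rng=np.random.default_rng(0)):
    """Sample k rows of (A, b) with probability proportional to leverage scores,
    rescaled so the sketched least-squares objective is unbiased."""
    p = leverage_scores(A)
    p = p / p.sum()
    idx = rng.choice(A.shape[0], size=k, replace=True, p=p)
    scale = 1.0 / np.sqrt(k * p[idx])        # standard importance-sampling rescaling
    return A[idx] * scale[:, None], b[idx] * scale

# Solve a sketched regression as a cheap stand-in for the full problem.
rng = np.random.default_rng(1)
A = rng.normal(size=(2000, 10))
b = A @ rng.normal(size=10) + 0.01 * rng.normal(size=2000)
SA, Sb = sample_rows(A, b, k=200)
x_sketch, *_ = np.linalg.lstsq(SA, Sb, rcond=None)
```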
ARDA: Automatic Relational Data Augmentation for Machine Learning
Automatic machine learning (AML) is a family of techniques to automate the
process of training predictive models, aiming to both improve performance and
make machine learning more accessible. While many recent works have focused on
aspects of the machine learning pipeline like model selection, hyperparameter
tuning, and feature selection, relatively few works have focused on automatic
data augmentation. Automatic data augmentation involves finding new features
relevant to the user's predictive task with minimal "human-in-the-loop"
involvement.
We present ARDA, an end-to-end system that takes as input a dataset and a data repository, and outputs an augmented dataset such that training a predictive model on this augmented dataset results in improved performance. Our
system has two distinct components: (1) a framework to search and join data
with the input data, based on various attributes of the input, and (2) an
efficient feature selection algorithm that prunes out noisy or irrelevant
features from the resulting join. We perform an extensive empirical evaluation
of different system components and benchmark our feature selection algorithm on
real-world datasets.
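To make the feature-pruning idea concrete, the sketch below shows one common way to drop noisy or irrelevant features after an automatic join: compare each candidate feature's importance against injected random probe columns. The estimator, the probe count, and the threshold rule are assumptions of the sketch, not ARDA's exact algorithm.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def prune_noisy_features(X: np.ndarray, y: np.ndarray, n_probes: int = 5,
                         seed: int = 0) -> np.ndarray:
    """Return indices of features whose importance exceeds that of the best
    injected random probe column. Illustrative pruning step only."""
    rng = np.random.default_rng(seed)
    probes = rng.normal(size=(X.shape[0], n_probes))     # pure-noise columns
    model = RandomForestRegressor(n_estimators=200, random_state=seed)
    model.fit(np.hstack([X, probes]), y)
    importances = model.feature_importances_
    threshold = importances[X.shape[1]:].max()           # best noise feature
    return np.flatnonzero(importances[:X.shape[1]] > threshold)

# Usage: after joining candidate tables onto the input dataset, keep only the
# surviving columns before training the final predictive model.
```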
Management of Energy Enterprises: Energy-efficiency Approach in Solar Collectors Industry: The Case of Russia
In the last few years, the development and management of some types of renewable energy industries in Russia has been proceeding at a rapid pace, mainly owing to government support programs. At the same time, many renewable energy technologies designed for use at enterprises, primarily in the residential and commercial sectors, have not yet been widely disseminated. In particular, this applies to solar collectors. In this article, we study the factors preventing the wider distribution of solar collectors in the residential sector from the viewpoint of energy-efficiency theory, with a focus on informational barriers. The obtained results allow us to draw several practical conclusions about directions for improving the regional energy-efficiency programs currently being implemented in most Russian regions.
Keywords: solar thermal energy, solar collectors, barriers of energy efficiency, survey
JEL Classifications: O33, Q42, Q47, Q4
