132 research outputs found
External grind-hardening forces modelling and experimentation
The grind-hardening process utilizes the heat generated in the grinding zone for the surface heat treatment of the workpiece. The workpiece surface is heated above the austenitizing temperature by using large depths of cut and low workpiece feed speeds. However, such parameter combinations result in high process forces, which inhibit the broad application of grind hardening on smaller grinding machines. In the present paper, the modelling and prediction of the process forces as a function of the process parameters are presented. The theoretical predictions show good agreement with the experimental results. The results of the study can be used to predict the grind-hardening process forces and thus to optimize the process parameters so that the process can be applied on grinding machines of any size.
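The abstract does not give the force model itself; as a hedged illustration only, empirical grinding-force models are often power laws in the process parameters. The functional form and every coefficient below are hypothetical placeholders, not the paper's fitted values.

```python
# Hedged sketch: a generic power-law grinding-force model,
# F_t = C * a_e**alpha * v_w**beta / v_s**gamma,
# with depth of cut a_e [mm], workpiece feed v_w [mm/s], wheel speed v_s [m/s].
# All coefficients are illustrative placeholders, not values from the paper.

def tangential_force(a_e, v_w, v_s, C=50.0, alpha=0.9, beta=0.6, gamma=0.5):
    """Tangential grinding force [N] as a function of the process parameters."""
    return C * a_e**alpha * v_w**beta / v_s**gamma

# Grind hardening uses large depths of cut, which is exactly why the
# process forces grow and smaller machines struggle:
f_shallow = tangential_force(a_e=0.1, v_w=5.0, v_s=30.0)
f_deep = tangential_force(a_e=1.0, v_w=5.0, v_s=30.0)
print(f_shallow < f_deep)
```

Such a model, once calibrated against measured forces, lets one check whether a given parameter combination stays within a machine's force capacity.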
Greek meat supply response and price volatility in a rational expectations framework: A multivariate GARCH approach
This paper examines supply response models in a rational expectations framework for each of the four major Greek meat markets: beef, broiler, lamb and pork. A multivariate GARCH model with Cholesky decomposition is used to incorporate price volatility into the rational expectations supply response model for each meat category; as a result, the conditional covariance matrix remains positive definite without imposing any restrictions on the parameters. The empirical results confirm the existence of rational behaviour by meat producers in the four examined markets and indicate that price volatility is a major risk factor in Greek meat production, while feed prices and veterinary medicine prices are both important cost factors. Furthermore, the last Common Agricultural Policy reform is found to have a negative impact on beef and lamb production in Greece.
Keywords: meat supply, price volatility, rational expectations, MGARCH, Agricultural and Food Policy
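The key algebraic point behind the Cholesky parameterisation can be shown in a few lines (a sketch of the linear algebra, not the paper's estimation code): if the conditional covariance matrix is built as H = L·Lᵀ with a lower-triangular L whose diagonal is positive, then H is positive definite for any unrestricted off-diagonal parameters.

```python
# Sketch: parameterise a covariance matrix through its Cholesky factor,
# H = L L^T. With positive diagonal entries of L, H is guaranteed
# positive definite regardless of the off-diagonal parameter values.

def chol_to_cov(L):
    """Return H = L L^T for a square matrix L given as a list of rows."""
    n = len(L)
    return [[sum(L[i][k] * L[j][k] for k in range(n)) for j in range(n)]
            for i in range(n)]

def leading_minors_positive(H):
    """Sylvester's criterion for a 3x3 symmetric matrix: H is positive
    definite iff all leading principal minors are positive."""
    m1 = H[0][0]
    m2 = H[0][0] * H[1][1] - H[0][1] * H[1][0]
    m3 = (H[0][0] * (H[1][1] * H[2][2] - H[1][2] * H[2][1])
          - H[0][1] * (H[1][0] * H[2][2] - H[1][2] * H[2][0])
          + H[0][2] * (H[1][0] * H[2][1] - H[1][1] * H[2][0]))
    return m1 > 0 and m2 > 0 and m3 > 0

# Arbitrary (even negative) off-diagonal parameters still yield a valid
# covariance matrix, which is the point of the Cholesky approach:
L = [[1.2, 0.0, 0.0],
     [-0.7, 0.9, 0.0],
     [0.3, -1.1, 0.5]]
H = chol_to_cov(L)
print(leading_minors_positive(H))
```

This is why no parameter restrictions are needed during estimation: the optimiser can move freely over the entries of L.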
Splitting groups with cubic Cayley graphs of connectivity two
A group G splits over a subgroup C if G is either a free product with amalgamation A *_C B or an HNN-extension A *_C. We invoke Bass-Serre theory and classify all infinite groups G which admit cubic Cayley graphs of connectivity two in terms of splittings of G over a subgroup.
Testable Learning with Distribution Shift
We revisit the fundamental problem of learning with distribution shift, in
which a learner is given labeled samples from training distribution D,
unlabeled samples from test distribution D', and is asked to output a
classifier with low test error. The standard approach in this setting is to
bound the loss of a classifier in terms of some notion of distance between D
and D'. These distances, however, seem difficult to compute and do not lead
to efficient algorithms.
We depart from this paradigm and define a new model called testable learning
with distribution shift, where we can obtain provably efficient algorithms for
certifying the performance of a classifier on a test distribution. In this
model, a learner outputs a classifier with low test error whenever samples from
D and D' pass an associated test; moreover, the test must accept if the
marginal of D equals the marginal of D'. We give several positive results
for learning well-studied concept classes such as halfspaces, intersections of
halfspaces, and decision trees when the marginal of D is Gaussian or uniform
on {±1}^d. Prior to our work, no efficient algorithms for these basic
cases were known without strong assumptions on D'.
For halfspaces in the realizable case (where there exists a halfspace
consistent with both D and D'), we combine a moment-matching approach with
ideas from active learning to simulate an efficient oracle for estimating
disagreement regions. To extend to the non-realizable setting, we apply recent
work from testable (agnostic) learning. More generally, we prove that any
function class with low-degree sandwiching polynomial approximators can
be learned in our model. We apply constructions from the pseudorandomness
literature to obtain the required approximators.
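As a hedged illustration of the moment-matching idea (a toy sketch, not the paper's algorithm or guarantees): a tester can accept the test samples only when their low-degree empirical moments are close to those of the target marginal, here the standard Gaussian in one dimension. The moment degree and tolerance below are arbitrary choices for this sketch.

```python
import random

# Toy moment-matching tester for a one-dimensional marginal: accept the
# test samples only if their first two empirical moments match those of
# the standard Gaussian (mean 0, second moment 1) up to a tolerance.
# Degree and tolerance are illustrative choices, not the paper's.

def moments_match_gaussian(samples, tol=0.1):
    n = len(samples)
    m1 = sum(samples) / n                  # empirical first moment
    m2 = sum(x * x for x in samples) / n   # empirical second moment
    return abs(m1) <= tol and abs(m2 - 1.0) <= tol

rng = random.Random(0)
gaussian = [rng.gauss(0.0, 1.0) for _ in range(20000)]
shifted = [x + 1.0 for x in gaussian]   # a test marginal that has drifted

print(moments_match_gaussian(gaussian))  # matching marginal passes
print(moments_match_gaussian(shifted))   # shifted marginal is rejected
```

The soundness direction in the actual model is the hard part: one must show that passing such a test certifies low test error, which is where the sandwiching-polynomial machinery enters.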
Reduction of Vascular Inflammation, LDL-C, or Both for the Protection from Cardiovascular Events?
Background:
Low-density lipoprotein cholesterol (LDL-C) and low-grade arterial inflammation are key pathogenic factors for atherosclerosis and its manifestation, cardiovascular disease (CVD).
Objective:
In this narrative review we assessed whether decreasing LDL-C levels, inflammation, or both is more effective in reducing CVD events.
Results:
In the Scandinavian Simvastatin Survival Study (4S), all statin trials of the 1990s, and the Further Cardiovascular Outcomes Research with PCSK9 Inhibition in Subjects with Elevated Risk (FOURIER) trial, the benefit came from the LDL-C reduction. In the GREek Atorvastatin and Coronary heart disease Evaluation (GREACE), the Treating to New Targets (TNT), and the Justification for the Use of Statins in Prevention: an Intervention Trial Evaluating Rosuvastatin (JUPITER) trials, both mechanisms in combination produced significant benefits. In the Atorvastatin for Reduction of MYocardial Damage during Angioplasty (ARMYDA) trials and the Canakinumab Antiinflammatory Thrombosis Outcome Study (CANTOS), which used a human antibody targeting IL-1β with no lipid-lowering effect, the reduction in arterial inflammation played the only beneficial role, because there was no change in lipid levels.
Conclusion:
Both LDL-C and inflammation reduction are beneficial for the reduction of CVD risk. However, canakinumab is a very expensive drug that induced only a 15% reduction in CVD events, drastically reducing the possibility of its use in clinical practice. Moreover, canakinumab is associated with increased infections, some fatal. A potent statin with anti-inflammatory effects is probably the best choice for the majority of those needing hypolipidaemic drug therapy.
Sustainability assessment for manufacturing operations
Sustainability is becoming increasingly important as a decision attribute in the manufacturing environment. However, quantitative metrics for all aspects of the triple bottom line are difficult to assess. In the present paper, sustainability metrics are considered in tandem with traditional manufacturing metrics such as time, flexibility, and quality, and a novel framework is presented that integrates information and requirements from Computer-Aided Technologies (CAx) systems. A tool is outlined for considering a number of key performance indicators related to the triple bottom line when deciding on the most appropriate process route. The implemented system allows the assessment of alternative process plans considering market demands and available resources.
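As a hedged sketch of the kind of multi-criteria assessment described (the KPI names, weights, and scores below are illustrative placeholders, not the paper's actual framework or data), alternative process plans can be ranked by a weighted sum over normalised KPIs spanning traditional metrics and the triple bottom line.

```python
# Illustrative KPI-based comparison of alternative process plans.
# Weights and scores are hypothetical; in practice they would come from
# CAx systems, market demands, and available-resource data.

KPI_WEIGHTS = {            # weights sum to 1; higher = more important
    "time": 0.25,
    "quality": 0.25,
    "flexibility": 0.10,
    "energy_use": 0.20,     # environmental pillar
    "labour_safety": 0.20,  # social pillar
}

def plan_score(kpis):
    """Weighted sum of normalised KPI scores, each in [0, 1]."""
    return sum(KPI_WEIGHTS[name] * value for name, value in kpis.items())

plans = {
    "plan_A": {"time": 0.9, "quality": 0.7, "flexibility": 0.5,
               "energy_use": 0.4, "labour_safety": 0.8},
    "plan_B": {"time": 0.6, "quality": 0.8, "flexibility": 0.7,
               "energy_use": 0.9, "labour_safety": 0.7},
}

best = max(plans, key=lambda name: plan_score(plans[name]))
print(best, round(plan_score(plans[best]), 3))
```

A weighted sum is only one possible aggregation; the point is that sustainability KPIs enter the same decision step as time, quality, and flexibility rather than being assessed separately.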