
    External grind-hardening forces modelling and experimentation

    The grind-hardening process utilizes the heat generated in the grinding zone for the surface heat treatment of the workpiece. The workpiece surface is heated above the austenitizing temperature by using large depths of cut and low workpiece feed speeds. However, such parameter combinations result in high process forces, which inhibit the broad application of grind hardening on smaller grinding machines. In the present paper, the modelling and prediction of the process forces as a function of the process parameters are presented. The theoretical predictions show good agreement with experimental results. The results of the study can be used to predict the grind-hardening process forces and thus to optimize the process parameters so that the process can be applied on grinding machines of any size.
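    The force–parameter relationship described above can be sketched with the classical specific-energy model of grinding (an assumption made for illustration only; the paper derives its own force model, and all parameter values below are hypothetical):

```python
# Illustrative only: a classical specific-energy estimate of the tangential
# grinding force, NOT the force model derived in the paper. All parameter
# values are hypothetical.

def tangential_force(u, a_e, b, v_w, v_s):
    """Estimate the tangential grinding force F_t = u * Q_w / v_s.

    u   : specific grinding energy [J/mm^3]
    a_e : depth of cut [mm]
    b   : grinding width [mm]
    v_w : workpiece feed speed [mm/s]
    v_s : wheel peripheral speed [m/s]
    """
    q_w = a_e * b * v_w   # material removal rate [mm^3/s]
    power = u * q_w       # grinding power [W] = J/mm^3 * mm^3/s
    return power / v_s    # tangential force [N]

# Grind hardening: large depth of cut, low feed speed.
f_deep_slow = tangential_force(u=40.0, a_e=0.5, b=10.0, v_w=2.0, v_s=30.0)
# Conventional grinding: small depth of cut, faster feed.
f_shallow_fast = tangential_force(u=40.0, a_e=0.02, b=10.0, v_w=20.0, v_s=30.0)
print(round(f_deep_slow, 2), round(f_shallow_fast, 2))
```

    Because the depth of cut in grind hardening is typically increased far more than the feed speed is decreased, the net removal rate — and hence the force — rises, which is the practical obstacle the abstract mentions for smaller machines.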

    Greek meat supply response and price volatility in a rational expectations framework: A multivariate GARCH approach

    This paper examines supply response models in a rational expectations framework for each of the four major Greek meat markets, i.e. beef, broiler, lamb and pork. A multivariate GARCH model with Cholesky decomposition is used to incorporate price volatility into the rational expectations supply response model for each meat category; as a result, the conditional covariance matrix remains positive definite without imposing any restrictions on the parameters. The empirical results confirm the existence of rational behaviour by meat producers in the four examined markets and indicate that price volatility is a major risk factor in Greek meat production, while feed prices and veterinary medicine prices are both important cost factors. Furthermore, the last Common Agricultural Policy reform is found to have a negative impact on beef and lamb production in Greece.
    Keywords: meat supply, price volatility, rational expectations, MGARCH, Agricultural and Food Policy.
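    The positive-definiteness claim can be illustrated in a toy two-variable setting: if the Cholesky factor's entries are modelled directly (with positive diagonal entries), any unconstrained real parameters yield a valid covariance matrix. This is a minimal sketch of the parameterisation idea, not the paper's estimated MGARCH model:

```python
# Illustrative only: why a Cholesky parameterisation keeps a conditional
# covariance matrix positive definite without parameter restrictions.
# Toy 2-variable example, not the paper's estimated model.
import numpy as np

rng = np.random.default_rng(0)
theta = rng.normal(size=3)  # unconstrained real parameters

# Lower-triangular Cholesky factor; diagonal kept positive via exp().
L = np.array([[np.exp(theta[0]), 0.0],
              [theta[1], np.exp(theta[2])]])
H = L @ L.T  # conditional covariance H_t = L L' by construction

# H is symmetric positive definite for ANY real theta:
print(np.linalg.eigvalsh(H).min() > 0)
```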

    Splitting groups with cubic Cayley graphs of connectivity two

    A group G splits over a subgroup C if G is either a free product with amalgamation A *_C B or an HNN-extension G = A *_C (t). We invoke Bass-Serre theory and classify all infinite groups which admit cubic Cayley graphs of connectivity two in terms of splittings over a subgroup.

    Testable Learning with Distribution Shift

    We revisit the fundamental problem of learning with distribution shift, in which a learner is given labeled samples from a training distribution D, unlabeled samples from a test distribution D', and is asked to output a classifier with low test error. The standard approach in this setting is to bound the loss of a classifier in terms of some notion of distance between D and D'. These distances, however, seem difficult to compute and do not lead to efficient algorithms. We depart from this paradigm and define a new model called testable learning with distribution shift, where we can obtain provably efficient algorithms for certifying the performance of a classifier on a test distribution. In this model, a learner outputs a classifier with low test error whenever samples from D and D' pass an associated test; moreover, the test must accept if the marginal of D equals the marginal of D'. We give several positive results for learning well-studied concept classes such as halfspaces, intersections of halfspaces, and decision trees when the marginal of D is Gaussian or uniform on {±1}^d. Prior to our work, no efficient algorithms for these basic cases were known without strong assumptions on D'. For halfspaces in the realizable case (where there exists a halfspace consistent with both D and D'), we combine a moment-matching approach with ideas from active learning to simulate an efficient oracle for estimating disagreement regions. To extend to the non-realizable setting, we apply recent work from testable (agnostic) learning. More generally, we prove that any function class with low-degree L_2-sandwiching polynomial approximators can be learned in our model. We apply constructions from the pseudorandomness literature to obtain the required approximators.
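    The testable-learning paradigm can be caricatured with a naive moment-matching acceptance test: accept only when the empirical low-order moments of D and D' agree. This toy captures only the flavour of the model — the paper's tests come with provable guarantees that this sketch does not have:

```python
# Illustrative only: a naive moment-matching "test" between training and
# test samples, in the spirit of (but much weaker than) the paper's
# testable-learning framework.
import numpy as np

def moments_match(train, test, tol=0.1):
    """Accept iff empirical means and covariances are entrywise close."""
    mean_gap = np.abs(train.mean(axis=0) - test.mean(axis=0)).max()
    cov_gap = np.abs(np.cov(train.T) - np.cov(test.T)).max()
    return bool(mean_gap <= tol and cov_gap <= tol)

rng = np.random.default_rng(1)
D_train = rng.standard_normal((20000, 2))        # training marginal
D_same = rng.standard_normal((20000, 2))         # equal marginal -> accept
D_shift = rng.standard_normal((20000, 2)) + 1.0  # shifted marginal -> reject
print(moments_match(D_train, D_same), moments_match(D_train, D_shift))
```

    The key contract from the abstract is visible even here: when the two marginals are equal, the test accepts (with high probability), and a classifier is only certified when the samples pass.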

    Reduction of Vascular Inflammation, LDL-C, or Both for the Protection from Cardiovascular Events?

    Background: Low density lipoprotein cholesterol (LDL-C) and low grade arterial inflammation are key pathogenic factors for atherosclerosis and its manifestation, cardiovascular disease (CVD). Objective: In this narrative review we assessed whether decreasing LDL-C levels, inflammation, or both is more effective in reducing CVD events. Results: In the Scandinavian Simvastatin Survival Study (4S), all the statin trials of the 1990s, and the Further Cardiovascular Outcomes Research with PCSK9 Inhibition in Subjects with Elevated Risk (FOURIER) trial, the benefit came from the LDL-C reduction. In the GREek Atorvastatin and Coronary heart disease Evaluation (GREACE), the Treating to New Targets (TNT), and the Justification for the Use of Statins in Prevention: an Intervention Trial Evaluating Rosuvastatin (JUPITER) trials, both mechanisms in combination produced significant benefits. In the Atorvastatin for Reduction of MYocardial Damage during Angioplasty (ARMYDA) trials and the Canakinumab Anti-inflammatory Thrombosis Outcome Study (CANTOS), which used a human antibody targeting IL-1β with no lipid lowering effect, the reduction in arterial inflammation played the only beneficial role because there was no change in lipid levels. Conclusion: Both LDL-C and inflammation reduction are beneficial to the reduction of CVD risk. However, canakinumab is a very expensive drug that only induced a 15% reduction in CVD events, thus drastically reducing the possibility for it to be used in clinical practice. Besides, canakinumab is associated with increased infections, some fatal. A potent statin with anti-inflammatory effects is probably the best choice for the majority of those needing hypolipidaemic drug therapy.

    Sustainability assessment for manufacturing operations

    Sustainability is becoming increasingly important as a decision attribute in the manufacturing environment. However, quantitative metrics for all aspects of the triple bottom line are difficult to assess. In the present paper, sustainability metrics are considered in tandem with traditional manufacturing metrics such as time, flexibility, and quality, and a novel framework is presented that integrates information and requirements from Computer-Aided Technologies (CAx) systems. A novel tool is outlined for considering a number of key performance indicators related to the triple bottom line when deciding on the most appropriate process route. The implemented system allows the assessment of alternative process plans considering market demands and available resources.
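    The kind of KPI-based comparison of process routes described above can be sketched as a weighted score over alternative plans. KPI names, weights and scores below are invented for the sketch; the paper's framework derives such inputs from CAx data and market requirements:

```python
# Illustrative only: a hypothetical weighted-KPI ranking of alternative
# process plans. All names, weights and scores are invented; the paper's
# tool derives them from CAx systems and market demands.

WEIGHTS = {"time": 0.25, "flexibility": 0.15, "quality": 0.30,
           "environmental_impact": 0.30}  # triple-bottom-line KPI included

# Normalised scores in [0, 1], higher is better (hypothetical values).
PLANS = {
    "plan_A": {"time": 0.9, "flexibility": 0.6, "quality": 0.8,
               "environmental_impact": 0.4},
    "plan_B": {"time": 0.7, "flexibility": 0.8, "quality": 0.7,
               "environmental_impact": 0.8},
}

def score(plan):
    """Weighted sum of KPI scores for one process plan."""
    return sum(WEIGHTS[k] * v for k, v in plan.items())

best = max(PLANS, key=lambda name: score(PLANS[name]))
print(best, round(score(PLANS[best]), 3))  # plan_B 0.745
```

    Raising the weight of the environmental KPI flips which plan wins, which is precisely the trade-off between sustainability and traditional manufacturing metrics the framework is meant to expose.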