
PORTO Publications Open Repository TOrino
    77,769 research outputs found

    Prediction of fretting wear in spline couplings

    No full text
    The original contribution of this work is the modeling of fretting wear in aero-engine spline couplings, which are widely used in the aero industry to transfer power and torque and whose safe operation is critical to flight safety. A spline coupling consists of two components, a hub and a shaft. Because these components are lightweight, a perfect alignment is usually difficult to realize. To allow for misalignment, their teeth are designed with a crowned shape. Crowning tolerates a degree of misalignment without the stress concentration that is otherwise inevitable when misalignment is introduced with straight teeth. However, crowning introduces another problem, fretting wear and fretting fatigue, owing to the kinematic constraints imposed by the misalignment. The focus of this work is the development of mathematical models for the prediction of fretting wear, not fretting fatigue. The spline couplings under consideration are industrial scale and made of nitrogen-hardened 42CrMo4 steel. The aero industry requires a reliable method to model and predict fretting wear in order to optimize the design of spline couplings and reduce maintenance costs. Wear tests on crowned spline couplings have been conducted on a dedicated test bench and analyzed. Empirical, artificial-neural-network-based and analytical models have been developed to analyse, predict and formulate fretting wear in spline couplings. The empirical and artificial-neural-network-based models are specific to the given spline couplings and tribological conditions, whereas the analytical model has been found to be quite general. An incremental fretting wear formulation, both in terms of wear volume and wear depth, has been realized. Some novel findings regarding the effect of roughness parameters, in conjunction with applied torque and misalignment angle, on fretting wear are also reported. The evolution of wear depth is observed to accelerate with increased applied torque or misalignment angle, and in most cases the changes in roughness parameters also grow with torque and misalignment angle. Preliminary tests on the effect of frequency on fretting wear have also been conducted.
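    The abstract does not report the form of the analytical model; purely as an illustration, a minimal sketch of an Archard-type incremental wear-depth update (a common starting point for fretting wear models, assumed here rather than taken from the thesis, with placeholder values for the wear coefficient, contact pressure and slip) could look like this:

```python
# Minimal sketch of an Archard-type incremental wear-depth update. This is an
# assumption for illustration only, not the analytical model of the thesis;
# k, p and delta below are placeholder values.

def wear_depth_increment(k, p, delta):
    """Incremental wear depth dh = k * p * ds.

    k     : local wear coefficient [mm^2/N] (assumed)
    p     : local contact pressure [N/mm^2] (assumed)
    delta : relative slip accumulated in one fretting cycle [mm] (assumed)
    """
    return k * p * delta

def accumulate_wear(k, pressures, slips):
    """Sum the wear-depth increments over a sequence of fretting cycles."""
    return sum(wear_depth_increment(k, p, d) for p, d in zip(pressures, slips))

# Example: 100,000 identical cycles at 80 N/mm^2 and 5 um slip per cycle
n_cycles = 100_000
h = accumulate_wear(k=5e-8, pressures=[80.0] * n_cycles, slips=[0.005] * n_cycles)
print(f"accumulated wear depth ~ {h * 1e3:.1f} um")
```

    In a model of this kind, a higher applied torque or misalignment angle would enter through larger contact pressures and slip amplitudes, which is consistent with the accelerated wear-depth evolution reported in the abstract.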

    Nanoparticle-assisted ultrasound: a special focus on sonodynamic therapy against cancer

    No full text
    At present, ultrasound radiation is broadly employed in medicine for both diagnostic and therapeutic purposes, at various frequencies and intensities. In this review article, we focus on therapeutically-active nanoparticles (NPs) when stimulated by ultrasound. We first introduce the different ultrasound-based therapies, with special attention to the techniques used in the oncological field, and then summarize the different NPs used, ranging from soft materials, like liposomes or micro/nano-bubbles, to metal and metal oxide NPs. We then focus on sonodynamic therapy and on the possible working mechanisms, still under debate, of NP-assisted sonodynamic treatments. We support the idea that various, complex and synergistic physico-chemical processes take place during acoustic cavitation and NP activation. Different mechanisms are therefore responsible for the final cancer cell death, and they strongly depend not only on the type and structure of the NPs or nanocarriers, but also on the way they interact with the ultrasonic pressure waves. We conclude with a brief overview of the clinical applications of the various ultrasound therapies and the related use of NP-assisted ultrasound in clinics, showing that this very innovative and promising approach is, however, still in its infancy in clinical cancer treatment.

    Finite element models with node-dependent kinematics for the analysis of composite beam structures

    No full text
    This paper presents refined one-dimensional models with node-dependent kinematics. The three-dimensional displacement field is discretized over two domains, namely the cross-section domain and the axis domain. The mechanical behavior of the beam is first captured by the cross-section functions and then interpolated by the nodal shape functions of the beam element. This feature makes it possible to adopt different types of cross-section functions at each element node, obtaining node-dependent kinematic finite element models. Such models can integrate Taylor-based and Lagrange-type nodal kinematics at the element level, bridging a less refined model to a more refined one without special coupling methods. The FE governing equations of the node-dependent models are derived by applying the Carrera Unified Formulation. Numerical cases on metallic and composite beam-like structures are studied to demonstrate the effectiveness of node-dependent models in bridging a locally refined model to a global model when local effects must be accounted for.
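    In the usual Carrera Unified Formulation notation, and assuming the common convention in which x and z span the cross-section while y runs along the beam axis, the node-dependent expansion described above can be sketched as:

```latex
% Node-dependent kinematic displacement field (repeated indices are summed).
% F_tau^i : cross-section expansion functions attached to node i (Taylor- or
%           Lagrange-type, so the expansion may change from node to node)
% N_i     : 1D shape functions of the beam element along the axis
% u_{tau i}: generalized nodal unknowns
\mathbf{u}(x,y,z) \,=\, F_{\tau}^{\,i}(x,z)\, N_i(y)\, \mathbf{u}_{\tau i},
\qquad \tau = 1,\dots,M_i, \quad i = 1,\dots,N_{\mathrm{n}}
```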

    Development of manganese oxide films for the electro-oxidation of phenol at high temperature and pressure

    No full text
    This thesis is divided into three main parts. In the first part, the concept of a MJ-PEM reactor is introduced, and its design and the related calculations are explained. A MJ-PEM reactor is the result of coupling a Multi-Junction Solar Cell (MJSC) with a Polymer Electrolyte Membrane (PEM) electrolyzer able to work at high temperatures and pressures (up to 150°C and 30 bar). Two scenarios for the application of this system were investigated: in the first, the anodic chamber is used for the oxidation of recalcitrant organics contained in wastewater, while the cathodic compartment is used for the evolution of H2, for storage or direct use on site; in the second, the H2 produced at the cathode is sent to an anaerobic digestion process to boost the biomethanation step, whereas the O2 evolved at the anode is exploited for digestate stabilization and disinfection. Both scenarios proved to be feasible and effective, thanks to a high degree of integration between the stoichiometric and thermal requirements of the different systems, which allows waste or wastewater treatment on one side and hydrogen or natural gas production on the other. The second part of this work concerns the synthesis and characterization of electrodes based on manganese oxides for the electro-oxidation of recalcitrant organics. Phenol was chosen as the target molecule because of its high refractoriness and stability and its wide presence in industrial plants. Manganese oxides are extensively used in electrochemistry and were chosen because of their low cost, high abundance and low toxicity. Different types of manganese oxides (MnOx) were synthesized by electrodeposition on two substrates, namely metallic titanium and titania nanotubes (TiO2-NTs). X-Ray Diffraction (XRD) and X-Ray Photoelectron Spectroscopy (XPS) were used to analyze the oxidation states of manganese, whereas Field Emission Scanning Electron Microscopy (FESEM) was employed to investigate the morphology of the samples and the penetration of the manganese oxides inside the NTs. The electrochemical properties of the electrodes were investigated by cyclic voltammetry (CV) and linear sweep voltammetry (LSV), showing that both calcination and electrodeposition over TiO2-NTs gave more stable electrodes exhibiting a marked increase in current density. The activity of the proposed nanostructured samples towards phenol degradation was then investigated. Tetravalent manganese (α-MnO2) proved to be the most active phase, with a phenol conversion of 42.7%. Trivalent manganese (α-Mn2O3), instead, showed the highest stability, with an average working potential of 2.9 V vs. RHE, and the highest tendency towards the oxygen evolution reaction, reaching 0.4 mA/cm2 at 2.5 V vs. RHE. The TiO2-NTs interlayer contributed in all cases to a decrease of about 1-1.5 V in the final potential reached after the reaction time, thanks to the improved contact with the catalyst film and the prevention of passivation of the titanium substrate. In the third part, the best-performing electrodes among those synthesized in the second part were selected. They were tested in a High Temperature, High Pressure (HTHP) reactor, designed at Politecnico di Torino for kinetic studies on the electro-degradation of refractory organics in wastewaters under Catalytic Wet Air Oxidation (CWAO) conditions, i.e. 150°C and 30 bar.
The most stable (α-Mn2O3) and the most active (α-MnO2) manganese oxides were compared, at both ambient and CWAO operating conditions, with some of the most effective electrodes used in this field: Sb-doped SnO2 and RuO2. Results showed that the activity of the manganese oxides, especially α-Mn2O3, more than triples at 150°C and 30 bar, reaching phenol oxidation values close to those of Sb-SnO2 and RuO2. This phenomenon can be attributed to the higher tendency of manganese in its Mn3+ form to oxidize water to O2, a reaction that is wasted at ambient conditions but better exploited at high temperature (faster kinetics, lower overpotentials) and high pressure (improved O2 solubility).
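    As a small illustration of the quantities quoted above, the sketch below computes the fractional phenol conversion and an apparent rate constant under an assumed pseudo-first-order kinetic model; the kinetic treatment actually used in the thesis is not given in the abstract, and the 3 h run time is purely illustrative.

```python
import math

# Illustrative helpers for the figures quoted in the abstract. The
# pseudo-first-order model and the 3 h run time are assumptions made for
# this example, not results taken from the thesis.

def conversion(c0, c_final):
    """Fractional phenol conversion X = (C0 - C) / C0."""
    return (c0 - c_final) / c0

def pseudo_first_order_k(c0, c_final, t_hours):
    """Apparent rate constant from ln(C0 / C) = k_app * t."""
    return math.log(c0 / c_final) / t_hours

# Example: a run ending at the 42.7% conversion reported for alpha-MnO2
c0 = 1.0                     # normalized initial phenol concentration
c_end = c0 * (1 - 0.427)
print(f"X = {conversion(c0, c_end):.1%}")
print(f"k_app = {pseudo_first_order_k(c0, c_end, t_hours=3.0):.3f} 1/h")
```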

    Analysis, characterization and classification of Internet traffic

    No full text
    The Internet is a global interconnection of networks and nowadays one of the most important telecommunication technologies. Born as a U.S. military project, it has evolved into a worldwide communication system used by people every day. This success is based on its "freedom", since no single organization or administrative entity governs or maintains it. This freedom also explains the huge heterogeneity of the Internet services available today, ranging from working activities (e.g., VoIP, e-mail) to entertainment (e.g., video games, streaming, peer-to-peer) and commerce (e.g., Amazon, eBay), just to name a few. The Internet is a fertile and constantly evolving system. Every year new services and software platforms are launched, affecting not only the users' activities (e.g., social networks) but also the internal architecture of the networks (e.g., Content Delivery Networks vs. peer-to-peer) and the devices used to access the services (e.g., PCs vs. smartphones and Internet tablets). The richness of the Internet scenario comes at the cost of its internal complexity. Eric Schmidt, the CEO of Google, said: "The Internet is the first thing that humanity has built that humanity doesn't understand, the largest experiment in anarchy that we have ever had." (http://www.brainyquote.com/quotes/authors/e/eric_schmidt.html). At its origins, the Internet was designed to operate on a few standardized services. No one could have i) foreseen the success of this medium and ii) designed the network to cope with the plethora of today's services. If on the one hand this diversity provides the Internet with a certain level of resiliency and has driven innovation, on the other hand understanding its internal mechanisms is a daunting task, made worse by the fast and constant deployment of new services and applications. However, behind what could seem a chaotic scenario, the Internet is composed of well-defined markets in which big players participate with precise interests:
    - Users, representing the majority of the people who access the network. They are interested in Quality of Experience (QoE), i.e., good performance when accessing the network, avoiding for example long delays related to the initial buffering when streaming a video. They are also interested in Network Neutrality, preserving their freedom to use the Internet independently of which service they are accessing;
    - Internet Service Providers (ISPs), organizations which provide Internet access to customers. They are interested in increasing revenues through i) network engineering, to optimize the offered services, and ii) the study of users' activity, to find new billing policies;
    - Content providers, organizations which sell a specific Internet service, e.g., video streaming or file hosting. Like ISPs, they are interested in finding new ways to generate revenue. At the same time, they also have to cope with illegal activities such as content piracy, a common flaw since the early days of peer-to-peer systems;
    - Government regulation agencies, organizations which regulate some aspects of Internet activity. For example, they study Service Level Agreements (SLAs) between users and ISPs, comparing the quality of the Internet access offered to users with the specifications written in the signed contract.
Other activities, such as security, are important for more than one player: consider for example malware and Denial of Service (DoS) attacks, which can violate users' privacy, damage the network and break the law. Overall, then, there are several motivations for studying the Internet. Since the early days, the scientific community has made giant steps toward understanding it. Broadly speaking, two requirements have to be satisfied. First of all, we need tools and methodologies to inspect and characterize the traffic at different granularities, i.e., per packet, per flow, per port, per user, etc. In particular, traffic classification is one of the most important activities performed by network operators. It allows operators to identify which application has generated a given communication and to study not only the whole network traffic aggregate but also how different applications contribute to the total traffic. Building on these tools and methodologies, we can further drill into users and network characterization. For example, by monitoring the traffic over long periods we can study the applications' popularity trends and identify the rise of new technologies. We can perform anomaly detection, i.e., study unexpected network conditions that might be related either to security issues or to malfunctioning hardware. We can optimize routing policies, study inter-ISP traffic, investigate the energy consumption of network elements or work on caching schemes for social-network content, just to name a few of the many research studies recently conducted in the literature. In this thesis, we present our contributions to the study of the Internet, discussing the tools and methodologies developed to characterize network traffic. The thesis is divided into two parts. The first part focuses on traffic classification methodologies, starting from the problem definition and the solutions available in the literature, and then on KISS, a novel traffic classification technique we propose based on Stochastic Packet Inspection (SPI). We describe the framework used by the classifier, validate it for UDP and TCP traffic, compare KISS with other state-of-the-art traffic classifiers, and extend the KISS framework with clustering techniques. Overall, KISS reaches a level of accuracy in traffic classification that is comparable to, or even better than, that of other traffic classifiers, and it has a flexible structure able to identify a rich set of applications with limited resource requirements. In the second part of the thesis we study YouTube, the famous video streaming system. Leveraging Tstat, a passive traffic analyzer, we developed a methodology to identify YouTube video downloads and we conduct an in-depth analysis of many aspects of YouTube.
We start by presenting an overview of the system and its components, showing the internal mechanisms adopted; we then review the methodologies available in the literature to study YouTube and present our own methodology, based on monitoring real users' activities across different locations, access technologies and devices. The results of our analysis are grouped into four areas of interest: video content properties, internal load balancing and caching policies, users' habits and behaviours, and download performance. Results show that YouTube is a complex system where several components interact, with precise policies used to control the communications. Despite its great success, the system is far from perfect and there is room for further optimization. For example, mobile devices suffer more impairments during the download than PCs. Users stick to the default video resolution and are not interested in changing the quality during playback; instead, abrupt aborts of the download are common. This behaviour is particularly critical because, coupled with the aggressive buffering policies used to ensure continuity of playback, it leads to wasting a non-negligible amount of traffic, i.e., users download a portion of the video that is never played.
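    As a rough illustration of the Stochastic Packet Inspection idea behind KISS, the sketch below builds a chi-square-like signature from the leading 4-bit groups of the payloads observed in a window of packets of the same flow. The window size, the number of groups and the downstream classifier that consumes the signature are simplifications assumed here, not the parameters of the actual system.

```python
import os
from collections import Counter

# Rough sketch of a chi-square-like payload signature in the spirit of
# Stochastic Packet Inspection: for each leading 4-bit group of the payload,
# measure how far its value distribution (over a window of packets of the
# same flow) is from uniform. Randomized/encrypted fields score low, fixed
# protocol fields score high. Window size, number of groups and the
# classifier fed with the signature are illustrative choices.

N_GROUPS = 24   # leading 4-bit groups inspected (assumed)
WINDOW = 80     # packets of the same flow per signature (assumed)

def nibble(payload, g):
    """Value of the g-th 4-bit group of the payload."""
    byte = payload[g // 2]
    return (byte >> 4) & 0xF if g % 2 == 0 else byte & 0xF

def signature(packets):
    """One chi-square-like score per 4-bit group, over a window of packets."""
    scores = []
    for g in range(N_GROUPS):
        values = [nibble(p, g) for p in packets if len(p) > g // 2]
        if not values:
            scores.append(0.0)
            continue
        counts = Counter(values)
        expected = len(values) / 16.0
        chi2 = sum((counts.get(v, 0) - expected) ** 2 / expected
                   for v in range(16))
        scores.append(chi2)
    return scores

# Example: a flow with a constant 2-byte header followed by random bytes
flow = [b"\x17\x03" + os.urandom(30) for _ in range(WINDOW)]
sig = signature(flow)
print(sig[:4], "...")   # constant header groups score high, random ones stay low
```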

    Design and implementation of a belief-propagation scheduler for multicast traffic in input-queued switches

    No full text
    Scheduling multicast traffic in input-queued switches to maximize throughput requires solving a hard combinatorial optimization problem in a very short time. This calls for algorithms that are simple to implement and efficient in terms of performance. We propose a new scheduling algorithm, based on message passing and inspired by the belief propagation paradigm, meant to approximate the provably-optimal scheduling policy for multicast traffic. We design and implement both a software and a hardware version of the algorithm, the latter running on a NetFPGA, and compare the performance and power consumption of the two versions when integrated in a software router. Our main findings are that the algorithm outperforms other centralized greedy scheduling policies, achieving a better tradeoff between complexity and performance, and that it is amenable to practical high-performance implementations.
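    The abstract does not detail the messages exchanged; purely as a generic illustration of the belief-propagation paradigm it refers to, the sketch below runs max-product message passing for a plain maximum-weight matching between inputs and outputs. This is a unicast-style simplification: the multicast scheduler proposed in the paper uses a different factor graph and message set.

```python
import numpy as np

# Generic max-product message passing for maximum-weight bipartite matching.
# This only illustrates the message-passing paradigm; it is NOT the multicast
# scheduler of the paper, whose factor graph and messages differ.

def _max_excluding_col(m):
    """For each (i, j): max of row i of m, excluding column j."""
    top = np.sort(m, axis=1)
    best, second = top[:, -1:], top[:, -2:-1]
    return np.where(m == best, second, best)

def bp_matching(w, iters=100):
    """w[i, j]: weight (e.g. a queue length) of serving input i at output j.
    Returns, for each input, the output it believes it should be matched to.
    Converges to the max-weight matching when that matching is unique."""
    n = w.shape[0]
    m_io = np.zeros((n, n))   # messages input i -> output j, stored at [i, j]
    m_oi = np.zeros((n, n))   # messages output j -> input i, stored at [i, j]
    for _ in range(iters):
        new_io = w - _max_excluding_col(m_oi)        # inputs speak
        new_oi = w - _max_excluding_col(m_io.T).T    # outputs answer
        m_io, m_oi = new_io, new_oi
    return np.argmax(m_oi, axis=1)

# Example: 4x4 switch with random weights (floats, so the optimum is unique)
rng = np.random.default_rng(0)
w = rng.random((4, 4))
print("matching (output chosen by each input):", bp_matching(w))
```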

    Visibility graph analysis of wall turbulence time-series

    No full text

    7,061 full texts and 77,774 metadata records. Updated in the last 30 days.
    PORTO Publications Open Repository TOrino is based in Italy.