
    Correlation of transonic-cone preston-tube data and skin friction

    Preston-tube measurements obtained on the Arnold Engineering Development Center (AEDC) Transition Cone have been correlated with theoretical skin-friction coefficients in transitional and turbulent flow, for both the NASA Ames 11-Ft Transonic Wind Tunnel (11 TWT) and flight tests. The resulting semi-empirical correlations of Preston-tube data have been used to derive a calibration procedure for the flow quality of the 11 TWT. Applying this procedure to the corrected laminar data, an effective freestream unit Reynolds number is defined by requiring that the average Preston-tube pressure in the tunnel match that in flight. The study finds that the operating Reynolds number lies below the effective value required for a match in the laminar Preston-tube data, and that the distribution of this effective Reynolds number with Mach number correlates well with the freestream noise level in this tunnel. Analyses of transitional and turbulent data, however, did not yield effective Reynolds numbers that can be correlated with background noise, because the vorticity fluctuations present in transitional and turbulent boundary layers dominate the Preston-tube pressure fluctuations and therefore mask the tunnel-noise effects. To calibrate the effects of noise on transonic wind-tunnel tests, then, only laminar data should be used, preferably at flow conditions similar to those in flight tests. To calibrate the effects of transonic wind-tunnel noise on drag measurements, however, the Preston-tube data must be supplemented with direct measurements of skin friction.
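    The matching step described above — choosing the effective unit Reynolds number so that the predicted tunnel Preston pressure equals the flight-measured average — can be sketched as a simple root find. The calibration function below is a hypothetical monotone power law standing in for the paper's semi-empirical correlation; all numbers are illustrative.

```python
def preston_pressure(re_unit):
    """Hypothetical tunnel calibration: mean laminar Preston-tube
    pressure as a monotone function of unit Reynolds number."""
    return 0.012 * re_unit ** 0.5

def effective_reynolds(p_flight, lo=1e6, hi=3e7, tol=1e-3):
    """Bisect for the unit Reynolds number whose predicted tunnel
    pressure matches the flight-measured average Preston pressure."""
    while hi - lo > tol * lo:
        mid = 0.5 * (lo + hi)
        if preston_pressure(mid) < p_flight:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# If flight data imply the pressure seen at 1.2x the operating Reynolds
# number, the matched effective value exceeds the operating value --
# the situation the abstract reports for laminar data.
re_operating = 9.84e6
re_eff = effective_reynolds(preston_pressure(1.2 * re_operating))
print(re_eff > re_operating)  # True
```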

    Rhythmic inhibition allows neural networks to search for maximally consistent states

    Gamma-band rhythmic inhibition is a ubiquitous phenomenon in neural circuits, yet its computational role remains elusive. We show that a model of Gamma-band rhythmic inhibition allows networks of coupled cortical circuit motifs to search for network configurations that best reconcile external inputs with an internal consistency model encoded in the network connectivity, and that Hebbian plasticity allows the networks to learn this consistency model by example. The search dynamics driven by rhythmic inhibition enable the described networks to solve difficult constraint-satisfaction problems without making assumptions about the form of stochastic fluctuations in the network; we show that these dynamics are well approximated by a stochastic sampling process. We use the described networks to reproduce perceptual multi-stability phenomena, with switching times that are a good match to experimental data, and show that they provide a general neural framework which can be used to model other 'perceptual inference' phenomena.
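    A toy sketch of the search idea, under simplifying assumptions: a Hopfield-style network whose Hebbian weights encode the "consistency model" (here, two stored patterns), driven by a periodic inhibition pulse that perturbs the state so the network keeps hopping between maximally consistent configurations rather than freezing in the first attractor it finds. This is illustrative only, not the paper's cortical-circuit model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 32
patterns = rng.choice([-1, 1], size=(2, n))
W = patterns.T @ patterns / n            # Hebbian "consistency" weights
np.fill_diagonal(W, 0)

def relax(s):
    """Asynchronous updates until a fixed point (a locally most
    consistent state) is reached."""
    changed = True
    while changed:
        changed = False
        for i in rng.permutation(n):
            new = 1 if W[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
    return s

s = rng.choice([-1, 1], size=n)
visited = set()
for cycle in range(20):
    s[rng.random(n) < 0.3] *= -1         # rhythmic inhibition pulse
    s = relax(s)                         # settle within the gamma cycle
    visited.add(tuple(s))                # record the attractor reached

# Every visited state is a fixed point of the consistency dynamics; the
# pulses determine how the network samples among them.
print(len(visited) >= 1)  # True
```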

    Correlation of Preston-tube data with laminar skin friction (Log No. J12984)

    Preston-tube data within laminar boundary layers, obtained on a sharp ten-degree cone in the NASA Ames eleven-foot transonic wind tunnel, are correlated with the corresponding values of theoretical skin friction. Data were obtained over a Mach number range of 0.30 to 0.95 and at unit Reynolds numbers of 9.84, 13.1, and 16.4 million per meter. The rms scatter of the skin-friction coefficient about the correlation is of the order of one percent, which is comparable to the reported accuracy of calibrations of Preston tubes in incompressible pipe flows. In contrast to previous works on Preston-tube/skin-friction correlations, which are based on the physical height of the probe's face, this satisfactory correlation for compressible boundary-layer flows is achieved by accounting for the effects of a variable "effective" height of the probe. The coefficients that appear in the correlation depend on the particular tunnel environment; the general procedure can be used to define correlations for other wind tunnels.
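    The kind of fit described above can be illustrated by regressing skin-friction coefficient on Preston-tube pressure in log-log form and reporting the rms scatter about the correlation. The synthetic data and the plain power-law form are assumptions for illustration; the paper's actual correlation additionally involves the variable "effective" probe height.

```python
import numpy as np

rng = np.random.default_rng(2)
p_preston = rng.uniform(1.0, 5.0, 40)                # normalized Preston pressures
cf_true = 0.004 * p_preston ** 0.85                  # hypothetical true relation
cf_meas = cf_true * (1 + 0.01 * rng.standard_normal(40))  # ~1% measurement scatter

# least-squares power-law fit: log cf = log a + b * log p
b, log_a = np.polyfit(np.log(p_preston), np.log(cf_meas), 1)
cf_fit = np.exp(log_a) * p_preston ** b
rms_scatter = np.sqrt(np.mean((cf_meas / cf_fit - 1.0) ** 2))
print(f"{100 * rms_scatter:.2f}% rms scatter")       # of the order of one percent
```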

    Green Synthesis of Graphite Oxide as Metal free Catalyst for Petrochemicals Production

    Graphite oxide was synthesized by the modified Hummers method. The degree of oxidation of the graphite was systematically controlled via the oxidation time, the KMnO4:graphite weight ratio, and the addition of phosphoric acid to the oxidation medium. The physicochemical properties of the synthesized graphite oxide were investigated using different techniques: XRD, FTIR, zeta potential, and TEM. It was found that the structure of the expanded graphite can be easily and remarkably disordered by oxidation. Three phases of interlayer distance were identified, at 3.4, 4, and 6 Å, corresponding to the pristine graphite, an intermediate phase, and the fully expanded graphite oxide, respectively; FTIR confirmed that these phases correspond to epoxide, carboxyl, and hydroxyl compositions, respectively. The catalytic activity of the prepared graphite oxide samples was tested for petrochemical production from the ethanol conversion reaction at reaction temperatures of 100–250 °C and resulting pressures of 40–92 atm. The converted products were mainly composed of acetone, ethylene, acetaldehyde, diethyl ether, and heavy hydrocarbons (>C). Acetone was found to be the main product at all reaction temperatures, with selectivity of 53–94%. XRD and TEM analyses of the prepared samples confirmed the transformation of the prepared graphite oxide into a graphene-like material at a reaction temperature of 250 °C.

    Deferring the learning for better generalization in radial basis neural networks

    Proceeding of: International Conference on Artificial Neural Networks — ICANN 2001, Vienna, Austria, August 21–25, 2001.
    The level of generalization of neural networks is heavily dependent on the quality of the training data; that is, some of the training patterns can be redundant or irrelevant. It has been shown that with careful dynamic selection of training patterns, better generalization performance may be obtained. Nevertheless, generalization is usually carried out independently of the novel patterns to be approximated. In this paper, we present a learning method that automatically selects the training patterns most appropriate to the new sample to be predicted. The proposed method has been applied to Radial Basis Neural Networks, whose generalization capability is usually very poor. The learning strategy slows down the response of the network in the generalization phase; however, this does not introduce a significant limitation in the application of the method because of the fast training of Radial Basis Neural Networks.
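    The deferred-learning idea above can be sketched as a lazy local RBF fit: wait for the query point, select only the training patterns nearest to it, and train a small RBF model on those. The nearest-neighbor selection rule, kernel width, and k below are illustrative choices, not the paper's exact method.

```python
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian RBF design matrix between rows of X and the centers."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

def lazy_rbf_predict(x_query, X_train, y_train, k=10, width=0.1):
    # defer learning: pick the k training patterns nearest the query
    idx = np.argsort(((X_train - x_query) ** 2).sum(-1))[:k]
    Xs, ys = X_train[idx], y_train[idx]
    Phi = rbf_design(Xs, Xs, width)      # selected patterns double as centers
    w = np.linalg.lstsq(Phi, ys, rcond=None)[0]
    return rbf_design(x_query[None, :], Xs, width) @ w

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0])
pred = lazy_rbf_predict(np.array([0.5]), X, y)
print(pred[0])  # close to sin(1.5)
```

    The trade-off the abstract notes is visible here: nothing is precomputed, so each prediction pays for neighbor selection and a small linear solve, but the per-query model is tiny and fast to fit.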

    INTEGRATING INFORMATION AND COMMUNICATION TECHNOLOGY AS A SOLUTION TO SUSTAINABLE ROAD TRANSPORTATION IN SOUTH AFRICA

    Conference Proceedings.
    Initiatives have long been taken to attain sustainable road transportation systems across the world, including in South Africa. Despite these various initiatives, sustainable road transportation in South African cities remains a challenge. This study therefore examined, through a qualitative approach, how sustainable road transportation can be achieved in South African cities. It is found that strengthening the public transportation system and effectively integrating Information and Communication Technology (ICT) into socio-economic activities, and travel needs in particular, would contribute significantly to sustainable road transportation. ICT integration and its effective use will reduce the need for travel, reduce traffic volume, and enable appropriate route planning, which will consequently reduce traffic congestion, traffic collisions, travel distance, and travel time. It will also limit environmental pollution caused by carbon emissions from vehicles, thus contributing to sustainable road transportation.

    Barriers identified that limited participation of Central University of Technology Academic Staff in National Research Foundation funding programmes

    Published Article.
    These results suggest that academics who have participated in NRF funding programmes manage their time better.
    • Results suggest a link between the non-participation rate in NRF funding, academic responsibilities, and a lack of time-management skills.
    • Since dissonance is typically resolved by changing attitude, it is recommended that time-management training be provided and that platforms be created where staff can express free choice.

    LUNG CANCER DETECTION IN LOW-RESOLUTION IMAGES

    One of the most important prognostic factors for all lung cancer patients is the accurate detection of metastases. Pathologists, who examine the body and its tissues, face a tedious, manual task under the existing clinical method, and these aspects have inspired recent analysis. Deep Learning (DL) algorithms have been used to identify lung cancer, and the resulting cutting-edge systems beat pathologists in terms of cancer identification and localization inside pathology images. These systems, though, are not clinically feasible because they need a massive amount of time or computing capability to process high-resolution images. Image-processing techniques are primarily employed for lung cancer prediction and for early identification and therapy to avoid lung cancer. This research aimed to assess lung cancer diagnosis by employing DL algorithms and low-resolution images. The goal was to see whether Machine Learning (ML) models could be created that generate higher-confidence conclusions while consuming a fraction of the resources, by comparing low- and high-resolution images. A DL pipeline has been built that compresses high-resolution images to a size small enough to be fed into a CNN (Convolutional Neural Network) for binary classification, i.e., cancer or normal. Numerous enhancements were made to increase overall performance, including augmentation of the training data and tissue detection. Ultimately, however, the created low-resolution models are practically incapable of handling extremely low-resolution inputs (299 × 299 to 2048 × 2048 pixels). Given the lack of classification ability, the substantial reduction in the models' prediction times is only a marginal benefit. This is a disheartening but predicted finding, due to an obvious drawback of the methodology: very low resolutions, which amount to zooming far out on a slide, preserve only information about macro-cellular structures, which is usually insufficient to diagnose cancer by itself.
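    Two preprocessing steps from the pipeline described above can be sketched directly: compressing a high-resolution slide tile to a small CNN input by block averaging, and a crude tissue-detection mask that discards near-background patches before classification. The thresholds, sizes, and synthetic "slide" are illustrative assumptions; the CNN classifier itself is omitted.

```python
import numpy as np

def downsample(img, factor):
    """Average-pool a 2-D grayscale image by an integer factor."""
    h, w = img.shape
    h, w = h - h % factor, w - w % factor
    blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

def contains_tissue(img, background=0.9, min_fraction=0.05):
    """Heuristic tissue detection: enough pixels darker than background."""
    return (img < background).mean() >= min_fraction

rng = np.random.default_rng(0)
tile = np.ones((2048, 2048))                                  # bright background
tile[400:1200, 600:1400] = rng.uniform(0.2, 0.6, (800, 800))  # darker "tissue"

small = downsample(tile, 2048 // 256)        # 2048x2048 -> 256x256 CNN input
print(small.shape, contains_tissue(small))   # (256, 256) True
```

    The finding in the abstract corresponds to the information lost in `downsample`: each output pixel averages an 8 × 8 block, so sub-cellular detail is gone before the classifier ever sees the image.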