
    Real-time Defogging of Single Image of IoTs-based Surveillance Video Based on MAP

    Due to the atmospheric scattering phenomenon in foggy weather, current surveillance-video defogging methods cannot estimate the fog density of the image. This paper proposes a real-time defogging algorithm for single images of IoT surveillance video based on maximum a posteriori (MAP) estimation. Given a single-image sequence, the posterior probability of the high-resolution image is maximized, improving the MAP-based super-resolution image reconstruction. Fuzzy classification is introduced to calculate the atmospheric light intensity, and the defogged single image of the IoT surveillance video is obtained through the atmospheric dissipation function. After defogging, the improved algorithm achieves the highest signal-to-noise ratio, with a maximum value of 40.99 dB. The average defogging time for the 7 experimental surveillance video images is only 2.22 s, giving good real-time performance. It can be concluded that the proposed algorithm has excellent defogging performance and strong applicability.
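    The recovery step that such defogging methods share is inversion of the standard atmospheric scattering model, I(x) = J(x)·t(x) + A·(1 − t(x)). The sketch below illustrates only that generic step, not the paper's MAP estimator or fuzzy-classification routine; the transmission map, atmospheric light vector, and lower bound t_min are assumed inputs.

```python
import numpy as np

def recover_scene_radiance(hazy, transmission, atmospheric_light, t_min=0.1):
    """Invert I = J*t + A*(1 - t) to estimate the fog-free image J.

    hazy: HxWx3 float array in [0, 1] (the foggy surveillance frame)
    transmission: HxW estimate of t(x); assumed given here (the paper derives
        it via an atmospheric dissipation function)
    atmospheric_light: length-3 vector A (the paper estimates it with
        fuzzy classification)
    t_min: lower bound on t(x) so dense-fog pixels are not over-amplified
    """
    t = np.clip(transmission, t_min, 1.0)[..., None]   # HxWx1 for broadcasting
    recovered = (hazy - atmospheric_light) / t + atmospheric_light
    return np.clip(recovered, 0.0, 1.0)

# Illustrative call with synthetic inputs (hypothetical values):
# frame = np.random.rand(480, 640, 3)
# t_map = np.full((480, 640), 0.6)
# A = np.array([0.9, 0.9, 0.92])
# clear = recover_scene_radiance(frame, t_map, A)
```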

    Active Building Facades to Mitigate Urban Street Pollution

    This multidisciplinary research draws on current thinking in planning, engineering, science, and architecture, and proposes an interdisciplinary solution for addressing urban air pollution related to increasing urbanization. The premise that buildings are interconnected with urban infrastructure, with buildings serving as a resource and not just as a load, and the use of an active building facade to remediate environmental air pollutants beyond the building's perimeter, represent a fundamental paradigm shift in the nature of buildings in the urban environment. Form Based Codes (FBCs) are urban design guidelines which also provide requirements for street dimensions between building facades and height limitations of buildings based upon the number of stories. If these FBCs do not control for height-to-width ratios, they can result in a morphology called an urban street canyon. The vertical dimension of a street canyon corresponds to the height of a building (H), which is typically regulated by the number of stories (floors). The horizontal dimension of a street canyon, the width of the street (W) and associated frontage, corresponds to the right of way (ROW), which is the space between building lot lines. The most important geometric detail of a street canyon is the ratio of canyon height (H) to canyon width (W), H:W, defined as the aspect ratio; when the aspect ratio is >= 1:1, air pollution can accumulate at street level (a minimal illustrative check is sketched below, after this abstract). The problem becomes one where FBCs set urban design guidelines for streets, ostensibly for walkability, but unintentionally create street canyons that accumulate unhealthy air pollutants in the very locations where people are being encouraged to walk. Within the envelope of an urban building, air quality is addressed almost entirely as an internal requirement. Building ventilation systems rely on internal air quality monitoring and are designed to optimize energy efficiency for the building and its occupants. No studies suggest that the building HVAC system should be used to ameliorate air pollution found outside the building, except for use within the building perimeter. This research investigated the capacity of a double-skin facade (DSF), an active facade system typically used only for building HVAC, to evacuate air at the street level within the frontage zone of influence, as well as whether the DSF could actually remove criteria pollutants from the streetscape where human interaction is being promoted. Aside from matters of cost, DSFs have had little impact in the United States because they do not effectively filter air pollutants, which is especially troubling if they are to be used for fresh air intake. Plant integration into a DSF has been proposed for thermal mitigation; however, the suggestion that the plants could also serve as a functional component to filter the air has not been explored. The NEDLAW vegetated biofilter reduces concentrations of toluene, ethylbenzene, and o-xylene as well as other VOCs and particulate matter. A DSF-integrated vegetated biofilter offers numerous benefits for streetscapes and opportunities for expanded use of an energy-efficient system that serves not only the building occupants but the urban environment. This research developed and evaluated an active DSF building system for the evacuation and amelioration of street-level air pollutants.
Several modeling methods were employed, including computational fluid dynamics (CFD) simulation and experimental validation in a boundary layer wind tunnel. The results based upon CFD modeling showed definitive removal of street-level air pollution through mixing with upper-boundary air. The numerical modeling process identified gaps in the CFD analyses, particularly with regard to multi-scalar meshing of the DSF within the street canyon. Experimental verification and validation of the active DSF using an urban boundary layer wind tunnel also showed definitive ventilation of street-level air pollution and mixing with upper-boundary air. Furthermore, the data showed that a vegetated biofilter would be able to operate within the operational parameters of the DSF. This research identified a means to extend building systems to function as urban infrastructure for purposes of air pollution removal. The development of a method whereby investment in a building system is an investment in the city's infrastructure is a paradigm shift that has led to the identification of multiple avenues of future interdisciplinary research as well as informing future urban design guidelines.
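    A minimal sketch of the street-canyon aspect-ratio check described in the abstract, H:W >= 1, follows. The >= 1 threshold comes from the text; the 3.5 m per-story height and the example dimensions are illustrative assumptions, not values from this research.

```python
# Street-canyon aspect-ratio check; story height of 3.5 m is an assumed
# illustrative value, not a figure from this research.
def canyon_aspect_ratio(stories, row_width_m, story_height_m=3.5):
    """Return H/W for buildings of `stories` floors across a right-of-way
    (ROW) of `row_width_m` metres."""
    return (stories * story_height_m) / row_width_m

def is_street_canyon(stories, row_width_m, threshold=1.0):
    """True when the permitted geometry reaches the H:W >= 1 regime where
    pollutants can accumulate at street level."""
    return canyon_aspect_ratio(stories, row_width_m) >= threshold

# Example: five-story frontages on an 18 m ROW give H/W of about 0.97 (below
# the threshold); a sixth story pushes the ratio to about 1.17.
# print(is_street_canyon(5, 18.0), is_street_canyon(6, 18.0))  # False True
```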

    A study of the reduction of simulation modeling development time

    This paper presents new ideas for simulation model development derived from the principles of lean manufacturing, for example, the concept of treating the time spent in production activities as lead-time and the feasibility of reducing that lead-time through different mechanisms, which are presented in a framework [44]. One of the main obstacles in developing simulation models is time. Simulation lead-time is composed of nine steps of simulation model development: problem definition, establishing boundaries, establishing variables, data collection, model development, verification & validation, documentation, experimentation, and implementation. This paper presents the idea of treating the time spent in simulation model development as a lead-time, and it presents a new framework to reduce that lead-time, which has never been addressed before. The framework, modeled on the Toyota production framework for reducing production lead-time, was developed as a result of an actual simulation case study that took place at a local company and that took a very long lead-time. The framework is composed of different steps, techniques, and mechanisms that should reduce simulation model development lead-time every time a simulation project is conducted. One of the goals of this framework is to reduce one of the main obstacles of simulation modeling, the long lead-time. One of the new mechanisms presented in this framework is a geographically distributed communication tool called NetMeeting. This tool is an application of the concept of distributed and Web-based simulation.
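    A minimal sketch of treating simulation development time as a lead-time follows, using the nine steps listed in the abstract; the per-step durations in the usage comment are hypothetical placeholders, not figures from the case study.

```python
# The nine development steps come from the paper; the durations used in the
# usage comment below are hypothetical placeholders.
STEPS = [
    "problem definition", "establishing boundaries", "establishing variables",
    "data collection", "model development", "verification & validation",
    "documentation", "experimentation", "implementation",
]

def total_lead_time(step_durations_days):
    """Sum per-step durations (days) into a single development lead-time."""
    missing = [step for step in STEPS if step not in step_durations_days]
    if missing:
        raise ValueError(f"no duration recorded for: {missing}")
    return sum(step_durations_days[step] for step in STEPS)

# Hypothetical project log, days per step:
# durations = dict(zip(STEPS, [2, 1, 1, 10, 15, 5, 3, 4, 2]))
# print(total_lead_time(durations))  # -> 43
```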

    Unmanned Aerial Systems Traffic Management (UTM): Safely Enabling UAS Operations in Low-Altitude Airspace

    Currently, there is no established infrastructure to enable and safely manage the widespread use of low-altitude airspace and UAS flight operations. Given this, and understanding that the FAA faces a mandate to modernize the present air traffic management system through computer automation and to significantly reduce the number of air traffic controllers by FY 2020, the FAA maintains that a comprehensive yet fully automated UAS traffic management (UTM) system for low-altitude airspace is needed. The UTM concept begins by leveraging concepts from the system of roads, lanes, stop signs, rules, and lights that govern vehicles on the ground today. Building on its legacy of work in air traffic management (ATM), NASA is working with industry to develop prototype technologies for a UTM system that would evolve airspace integration procedures to enable safe, efficient low-altitude flight operations and autonomously manage UAS operating in an approved low-altitude airspace environment. UTM is a cloud-based system that will autonomously manage all traffic at low altitudes, including UAS operated beyond visual line of sight of an operator. UTM would thus enable safe and efficient flight operations by providing fully integrated traffic management services such as airspace design, corridors, dynamic geofencing, severe weather and wind avoidance, congestion management, terrain avoidance, route planning and re-routing, separation management, sequencing and spacing, and contingency management. UTM removes the need for human operators to continuously monitor aircraft operating in approved areas. NASA envisions concepts for two types of UTM systems. The first would be a small portable system, which could be moved between geographical areas in support of operations such as precision agriculture and public safety. The second would be a persistent system, which would support low-altitude operations in an approved area by providing continuous automated coverage. Both would require persistent communication, navigation, and surveillance (CNS) coverage to track operations and to ensure and monitor conformance. UTM is creating an airspace management tool that allows the ATM system to accommodate the number of UAS that will operate in low-altitude airspace. The analogy is that having a car, whether autonomous or driven by someone, does not diminish the need for a road, road signs, or rules of the road.
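    One of the UTM services named above, dynamic geofencing, reduces at its simplest to a containment test: is a reported UAS position inside an approved low-altitude volume? The sketch below shows that basic check under an assumed polygon-plus-ceiling representation; it is not NASA's UTM data model or interface.

```python
# Basic geofence containment check (illustrative representation only).
def point_in_polygon(lon, lat, polygon):
    """Ray-casting test; polygon is a list of (lon, lat) vertices."""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        if ((yi > lat) != (yj > lat)) and \
           (lon < (xj - xi) * (lat - yi) / (yj - yi) + xi):
            inside = not inside
        j = i
    return inside

def within_geofence(lon, lat, alt_m, volume):
    """volume: {'boundary': [(lon, lat), ...], 'ceiling_m': float}."""
    return alt_m <= volume["ceiling_m"] and point_in_polygon(lon, lat, volume["boundary"])

# Illustrative approved volume (hypothetical coordinates, 120 m ceiling):
# fence = {"boundary": [(-122.06, 37.41), (-122.05, 37.41),
#                       (-122.05, 37.42), (-122.06, 37.42)], "ceiling_m": 120.0}
# print(within_geofence(-122.055, 37.415, 90.0, fence))   # True
# print(within_geofence(-122.055, 37.415, 150.0, fence))  # False (above ceiling)
```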

    Small engine emissions testing laboratory development and emissions sampling system verification

    With the recent scrutiny of engine emissions and a focus towards higher fuel efficiencies, there has been an increase in demand for small, efficient engines and for small engine emissions testing. Small engines have proven to provide high-efficiency performance for systems including refrigeration units, generators, compressors, and numerous other off-road applications. In the past, the existing emissions testing facilities at West Virginia University's (WVU) Center for Alternative Fuels, Engines and Emissions (CAFEE) had been focused on the testing of heavy-duty diesel engines.
In order to expand the emissions testing capabilities at CAFEE, a new small engine emissions testing laboratory was needed. Over a two-year period a new small engine emissions laboratory (SEEL) was designed and built at CAFEE's Westover facility. The new SEEL used a 40 hp alternating current (AC) dynamometer with an in-line slip ring torque sensor. It included full dynamometer and engine cooling capabilities. Custom-built software provided the control algorithms to allow for engine mapping, steady state, and transient emissions tests. Safety systems, including shaft guards and an automatic kill switch, provided a safe working environment and would isolate damage in case of a mechanical failure.
The SEEL was designed to be used with existing raw and dilute emissions sampling systems. The raw emissions sampling system was recently developed at WVU and needed to be verified against a trusted dilute emissions sampling system in order to prepare it for testing with the SEEL. A set of tests was performed in which one engine was sampled simultaneously by both sampling systems. The results from these tests showed that the raw sampling system's CO, CO2, and NOx measurements passed their verification criteria of 2%, 2%, and 5% difference, respectively. The HC measurement systems did not pass the 10% verification criterion. The verification of HC was a complex issue that was beyond the scope of this study.
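    The verification described above amounts to a percent-difference check of the raw system against the dilute reference for each species. A minimal sketch follows; the acceptance limits are the 2%, 2%, 5%, and 10% criteria quoted in the abstract, while the measurement values in the usage comment are hypothetical.

```python
# Acceptance limits quoted in the abstract (percent difference vs. dilute system).
VERIFICATION_LIMITS_PCT = {"CO": 2.0, "CO2": 2.0, "NOx": 5.0, "HC": 10.0}

def percent_difference(raw_value, dilute_value):
    """Relative difference of the raw system against the dilute reference."""
    return abs(raw_value - dilute_value) / dilute_value * 100.0

def verify(raw, dilute):
    """Return {species: (difference_pct, passed)} for each regulated species."""
    results = {}
    for species, limit in VERIFICATION_LIMITS_PCT.items():
        diff = percent_difference(raw[species], dilute[species])
        results[species] = (round(diff, 2), diff <= limit)
    return results

# Hypothetical results from one simultaneous test (arbitrary units):
# raw    = {"CO": 4.05, "CO2": 1015.0, "NOx": 6.30, "HC": 1.45}
# dilute = {"CO": 4.00, "CO2": 1000.0, "NOx": 6.10, "HC": 1.20}
# print(verify(raw, dilute))  # HC exceeds its 10% limit, as in the study
```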

    Source-oriented model for air pollutant effects on visibility

    A source-oriented model for air pollutant effects on visibility has been developed that can compute light scattering, light extinction, and estimated visual range directly from data on gas-phase and primary particle-phase air pollutant emissions from sources. The importance of such a model is that it can be used to compute the effect of emission control proposals on visibility-related parameters in advance of the adoption of such control programs. The model has been assembled by embedding several aerosol process modules within the photochemical trajectory model previously developed for aerosol nitrate concentration predictions by Russell et al. [1983] and Russell and Cass [1986]. These modules describe the size distribution and chemical composition of primary particle emissions, the speciation of organic vapor emissions, atmospheric chemical reactions, transport of condensible material between the gas and particle phases, fog chemistry, dry deposition, and atmospheric light scattering and light absorption. Model predictions have been compared to observed values using 48-hour trajectories arriving at Claremont, California, at each hour of August 28, 1987, during the Southern California Air Quality Study. The predicted fine particle concentration averages 62 μg m^(−3) compared to an observed value of 61 μg m^(−3), while predicted PM_(10) concentrations average 102 μg m^(−3) compared to an observed average of 97 μg m^(−3). The size distribution and chemical composition predictions for elemental carbon, sulfate, and sodium ion agree with observations to within plus or minus a few micrograms per cubic meter, while ammonium and nitrate concentrations are underpredicted by the base case model by 3 to 7 μg m^(−3) on average. Light-scattering coefficient values are calculated from the predicted aerosol size distribution and refractive index, and the model predictions agree with measured values on average to within 19%. The advantages and limitations of the modeling procedure are discussed.
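    The final step such a model performs, estimating visual range from the computed extinction, is commonly done with the Koschmieder relation V = 3.912 / b_ext. The sketch below shows only that textbook conversion, not the paper's aerosol optics calculations; the coefficient values in the example are hypothetical.

```python
# Convert predicted optical coefficients into an estimated visual range using
# the standard Koschmieder relation (2% contrast threshold); this is the
# textbook formula, not the paper's full model.
def visual_range_km(b_scat_per_m, b_abs_per_m):
    """Estimate visual range (km) from light-scattering and light-absorption
    coefficients given in inverse metres."""
    b_ext = b_scat_per_m + b_abs_per_m          # total light extinction
    return 3.912 / b_ext / 1000.0               # Koschmieder, converted to km

# Example with hypothetical coefficients (roughly a hazy urban day):
# b_scat = 4.0e-4 m^-1, b_abs = 0.5e-4 m^-1 -> about 8.7 km visual range.
# print(visual_range_km(4.0e-4, 0.5e-4))
```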