Efficiency of the foreign exchange markets in South Asian Countries
This paper examines the weak-form efficiency of the foreign exchange markets in seven SAARC countries using monthly return series for each of these markets over a period of 21 years (1985-2005). We applied a battery of unit root tests and variance ratio tests (individual and multiple) to test whether the return series (and also the raw data) follow a random walk process. Our results suggest that the increments of the return series are not serially correlated. We therefore conclude that the foreign exchange markets in SAARC countries are weak-form efficient.
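The variance ratio methodology mentioned above can be illustrated with a short sketch. Under a random walk, the variance of q-period returns should be q times the variance of one-period returns, so VR(q) should be close to 1. The sketch below assumes a simple Lo-MacKinlay-style homoskedastic statistic and a synthetic return series; the paper's exact test settings are not given in the abstract.

```python
# Minimal sketch of a Lo-MacKinlay-style variance ratio test on a monthly
# return series. The synthetic data and the homoskedastic standard error are
# illustrative assumptions, not the paper's configuration.
import numpy as np

def variance_ratio(returns: np.ndarray, q: int) -> tuple[float, float]:
    """Return VR(q) and its homoskedastic z-statistic."""
    r = np.asarray(returns, dtype=float)
    n = len(r)
    mu = r.mean()
    var_1 = np.sum((r - mu) ** 2) / n                 # variance of 1-period increments
    r_q = np.convolve(r, np.ones(q), mode="valid")    # rolling q-period sums
    var_q = np.sum((r_q - q * mu) ** 2) / (n * q)     # variance of q-period increments
    vr = var_q / var_1
    se = np.sqrt(2 * (2 * q - 1) * (q - 1) / (3 * q * n))   # asymptotic SE under i.i.d. increments
    return vr, (vr - 1) / se

rng = np.random.default_rng(0)
returns = rng.normal(0, 0.02, size=252)               # placeholder return series
for q in (2, 4, 8, 16):
    vr, z = variance_ratio(returns, q)
    print(f"q={q:2d}  VR={vr:.3f}  z={z:+.2f}")       # |z| < 1.96 -> cannot reject random walk
```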
A study of sound attenuation using sonic energy focussed at an artificial soil fracture
Several laws have been passed over the past two decades to control contamination and to remediate existing contaminated sites. There are a large number of such sites, and several ex-situ and in-situ techniques have been developed to decontaminate them. These techniques can be quite expensive, especially ex-situ clean-up operations, and there has always been a need for cheap, quick remediation methods.
This study investigates the attenuation of sound used in the in-situ remediation technique coupling Sonic Energy with Pneumatic Fracturing and Vapor Extraction. Preliminary attenuation studies were performed in the laboratory with a microphone made at Lucent Technologies. The laboratory facilities at Lucent Technologies and at the Otto York Center at New Jersey Institute of Technology were used to measure the attenuation of sound through air with five whistles. The best whistle gave a sound intensity at the source of 150 - 160 dB, and this whistle was used in the field study. The field study was performed at a site in Hillsborough Township, New Jersey, contaminated with trichloroethylene and dichloroethylene, where Kaleem (1999) had observed a considerable increase in the removal rate of the contaminants using sound energy, thus lowering the effective remediation time of the site. In this study, experimental runs were performed in which sonic energy of known intensity was applied at the inlet of artificial soil fractures present at the site, and its intensity was measured at the outlet of the fractures, thereby giving the attenuation of the sound in the soil/rock.
The results indicate that the sonic energy is absorbed very quickly in the ground, so the sound attenuates rapidly at this site. This rapid attenuation is probably due to the increased attenuation that takes place in the rock/soil at this site, depending on the nature of the fractures. A probable explanation for the increased removal rate of the contaminants, even though the sonic energy is absorbed rapidly, is that most of the sonic energy is absorbed in a local region near the source, lowering the concentration of the contaminants in that region. This concentration would be lower than if only air, without sound, were used for vapor extraction. For a constant air flowrate, the rapid depletion of contaminants by sonic energy results in a higher contaminant concentration in the effluent stream. This depletion also sets up a greater concentration gradient between the remediated region and the contaminated region, and hence greater mass diffusion between the two regions. Thus, the overall concentration of contaminants in the field is lowered and the remediation time of the site is decreased.
It is recommended that a laboratory model of the fracture and its environment be simulated and attenuation studies be performed to examine the factors that affect the propagation of sound. It is furthermore recommended that controlled attenuation studies be made in a bed of soil, with the microphone placed in boreholes at closer distances to the sound source; the larger fixed borehole distances at the Hillsborough site did not allow a quantitative indication of how rapidly the sound intensity decreased. A whistle or siren with a higher intensity and a higher frequency should be designed and used to examine the attenuation that takes place at this site with larger borehole diameters.
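As a back-of-the-envelope illustration of the attenuation measurement described above, the sketch below converts inlet and outlet sound levels into an attenuation rate in dB per metre. Only the 150-160 dB source level comes from the abstract; the outlet level and borehole spacing are hypothetical placeholders.

```python
# Minimal sketch of estimating sound attenuation along a fracture from sound
# pressure levels in dB. The outlet reading and borehole spacing below are
# hypothetical, not field data from the Hillsborough study.
def attenuation_per_metre(level_in_db: float, level_out_db: float, distance_m: float) -> float:
    """Attenuation rate in dB per metre along the fracture path."""
    return (level_in_db - level_out_db) / distance_m

source_level = 155.0   # dB, within the 150-160 dB range reported for the best whistle
outlet_level = 80.0    # dB, hypothetical reading at the outlet borehole
spacing = 10.0         # m, hypothetical borehole spacing
print(f"attenuation = {attenuation_per_metre(source_level, outlet_level, spacing):.1f} dB/m")
```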
CSCI 6990
There are a number of different aspects to software engineering and software development. Software engineering is not only about source code: like any industrial product, a software product is developed through a disciplined process and engineering techniques. This course emphasizes the software process, in particular the agile development process, which has drawn tremendous interest and adoption in industry over the last decade.
Numerical and experimental studies on coastal marsh erosion under hurricane induced wave and current
Considering the past history and future risks of hurricanes in the USA, well-understood storm protection plans are needed to shelter important population and economic areas, especially within southeastern Louisiana. It is widely assumed that marshes offer protection from hurricanes, although the degree of this protection is not well measured or understood due to the complex physics involved in the overall system. Moreover, marshes experience significant erosion while serving as a barrier for important areas. Consequently, a method to quantify the effects on marshes during a coastal hurricane is necessary to mitigate major marsh loss.
A study comprising experimental work and numerical simulation was undertaken to evaluate the effect of marsh vegetation on resisting hurricane-induced erosion and the erosion of the marsh itself. The local vegetation Spartina alterniflora was selected as the principal marsh vegetation for this study. The contribution of Spartina alterniflora was analyzed from two directions: the contribution of the roots and the contribution of the shoots.
The overall research was divided into three phases. The first phase consisted of laboratory experiments on soil samples, with and without Spartina roots, collected from the study area (Cycle-1 of the CS-28 project). Direct shear tests were performed on the samples to study the effect of roots on soil shear strength, and the tensile strength of the roots was also studied. In the second phase, a Delft3D coupled wave-flow model was applied to the Louisiana coastal marsh near Calcasieu Lake to assess the contribution of marsh vegetation in reducing hurricane-induced wave and current action. The objective of this phase was to develop an integrated wind, current and wave modeling system for the Louisiana coast under hurricane conditions. Hurricane Ike in 2008 was chosen as an example to study the marsh's contribution during a hurricane. The coupled wave-flow model covered a significant part of Calcasieu Lake, the surrounding marshes and a part of the Gulf of Mexico, and it was calibrated and validated against observed data gathered from NOAA and CPRA observation stations. After validation, the Hurricane Ike forcing condition was applied to the coupled model. Moreover, to represent an extreme scenario, the hurricane was introduced while excluding the precipitation and flooding effects of a previous hurricane, Gustav, which made landfall 13 days before Hurricane Ike. The Delft3D vegetation model was also analyzed to investigate the effect of a hurricane on a vegetated mud bed. In the third phase, based on the experimental results from the tensile and direct shear tests and the hurricane stress results from the Delft3D analysis, slope stability analyses were performed for 16 different scenarios using Slope/W to predict erosion of the vegetated and non-vegetated mud surface during different phases of a hurricane.
The experimental results suggested that marshes do have the potential to enhance soil shear strength. The additional cohesion developed from plant roots played a vital role in enhancing the shear strength of marsh soil, especially near the surface, and a correlation between Spartina alterniflora root tensile strength and root cohesion was proposed for dredged soil. Validation of the coupled wave-flow model showed that the water levels computed by Delft3D agree fairly well with the measured data. Results from the Delft3D vegetation model study indicated a major reduction in current velocity in the presence of the Spartina alterniflora shoot system. Results from the hurricane-induced wave-flow model showed that wave-induced bed shear stress can reach up to 90 Pa when the hurricane is at its peak.
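To illustrate how root reinforcement can enter a shear strength estimate of the kind used in the slope stability analyses, the sketch below adds a root cohesion term to a Mohr-Coulomb strength. The abstract proposes its own correlation between root tensile strength and root cohesion but does not state it, so the widely used Wu/Waldron-type relation and all parameter values here are assumptions for illustration only.

```python
# Illustrative sketch of folding root reinforcement into a Mohr-Coulomb shear
# strength estimate. The Wu/Waldron-type model and every numeric value below
# are assumptions, not the correlation proposed in the thesis.
import math

def root_cohesion_wu(tensile_strength_kpa: float, root_area_ratio: float) -> float:
    """Added cohesion c_r ~ 1.2 * T_r * (A_r / A) (Wu/Waldron-type model)."""
    return 1.2 * tensile_strength_kpa * root_area_ratio

def shear_strength(c_soil_kpa: float, c_root_kpa: float,
                   normal_stress_kpa: float, friction_angle_deg: float) -> float:
    """Mohr-Coulomb strength with an additive root cohesion term."""
    return c_soil_kpa + c_root_kpa + normal_stress_kpa * math.tan(math.radians(friction_angle_deg))

c_r = root_cohesion_wu(tensile_strength_kpa=15_000.0, root_area_ratio=0.001)  # hypothetical values
tau = shear_strength(c_soil_kpa=3.0, c_root_kpa=c_r, normal_stress_kpa=10.0, friction_angle_deg=25.0)
print(f"root cohesion = {c_r:.1f} kPa, shear strength = {tau:.1f} kPa")
```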
It was found that the edge and the flat soil mass of the marsh reacted differently under hurricane-induced wave and current action, especially when a time-dependent analysis was considered. It was also observed that the presence of a shoot system around a weak spot reduces bed shear stress significantly, especially while the marsh bed is submerged or under a low wave-energy field. However, vegetation that is completely exposed during the peak of a hurricane was found to be most vulnerable and is expected to experience severe mass erosion/marsh shear.
It was also noticed from the erosion prediction analysis that the hurricane damage could have been more severe had there been no hurricane prior to Hurricane Ike. The summary of the erosion prediction analysis showed that uprooting or mass erosion occurred in only two of the sixteen scenarios. Near the marsh edge, mass erosion occurred during hurricane landfall when the marsh edge was above water prior to the hurricane's impact. On the marsh flat, mass erosion occurred during the peak of the hurricane when analyzed with a drought condition prior to the hurricane.
The combined experimental and numerical analysis of Louisiana coastal marsh under hurricane-induced waves and currents provided useful insights into actual scenarios and probable cases. The findings could be used effectively in the design and construction of future marsh creation projects in Louisiana.
Management Aspects of Software Clone Detection and Analysis
Copying a code fragment and reusing it by pasting, with or without minor modifications, is a common practice in software development for improved productivity. As a result, software systems often contain similar segments of code, called software clones or code clones. For many reasons, unintentional clones may also appear in the source code without the developer's awareness. Studies report that significant fractions (5% to 50%) of the code in typical software systems are cloned. Although code cloning may increase initial productivity, it may cause fault propagation, inflate the code base and increase maintenance overhead. Thus, it is believed that code clones should be identified and carefully managed. This Ph.D. thesis contributes to clone management with techniques realized into tools and with large-scale, in-depth analyses of clones that inform the design of effective clone management techniques and strategies.
To support proactive clone management, we have developed a clone detector as a plug-in to the Eclipse IDE. For clone detection, we used a hybrid approach that combines the strengths of both parser-based and text-based techniques. To capture clones that are similar but not exact duplicates, we adopted a novel approach that applies a suffix-tree-based k-difference hybrid algorithm borrowed from computational biology. Instead of targeting all clones in the entire code base, our tool aids clone-aware development by allowing a focused search for clones of any code fragment of the developer's interest.
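As a rough illustration of what detecting "similar but not exact duplicate" (near-miss) clones means, the sketch below flags two fragments as clones when their normalized token streams differ by at most k edits. This simple edit-distance stand-in is not the suffix-tree-based k-difference algorithm used in the thesis, and the code fragments are hypothetical.

```python
# Illustrative sketch of near-miss clone detection: a candidate fragment is
# reported as a clone of a target fragment if their normalized token streams
# differ by at most k edits. Not the thesis's algorithm.
import re
from difflib import SequenceMatcher

def normalize(code: str) -> list[str]:
    """Tokenize and replace identifiers and numeric literals with placeholders."""
    tokens = re.findall(r"[A-Za-z_]\w*|\d+|\S", code)
    keywords = {"for", "if", "return", "while", "int", "float"}
    return ["ID" if t[0].isalpha() and t not in keywords
            else "NUM" if t.isdigit() else t
            for t in tokens]

def is_near_miss_clone(frag_a: str, frag_b: str, k: int = 3) -> bool:
    """True if the fragments differ by at most k token-level edits."""
    a, b = normalize(frag_a), normalize(frag_b)
    ops = SequenceMatcher(None, a, b).get_opcodes()
    edits = sum(max(i2 - i1, j2 - j1) for tag, i1, i2, j1, j2 in ops if tag != "equal")
    return edits <= k

target = "for (int i = 0; i < n; i++) total += price[i];"
candidate = "for (int j = 0; j < count; j++) sum += cost[j];"
print(is_near_miss_clone(target, candidate, k=3))  # True: only identifiers differ
```

Normalizing identifiers and literals first is what lets renamed variables still match, which is the essence of detecting near-miss rather than exact clones.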
A good understanding of the code cloning phenomenon is a prerequisite to devising efficient clone management strategies. The second phase of the thesis includes large-scale empirical studies on the characteristics (e.g., proportion, types of similarity, change patterns) of code clones in evolving software systems. Applying statistical techniques, we also made fairly accurate forecasts of the proportion of code clones in future versions of software projects. The outcomes of these studies expose useful insights into the characteristics of evolving clones and their management implications.
Upon identification of code clones, their management often necessitates careful refactoring, which is dealt with in the third phase of the thesis. Given a large number of clones, it is difficult to decide optimally what to refactor and what not to, especially when there are dependencies among clones and the objective is to minimize refactoring effort and risk while maximizing benefit. In this regard, we developed a novel clone refactoring scheduler that applies a constraint programming approach. We also introduced a novel effort model for estimating the effort needed to refactor clones in source code.
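The scheduling problem can be pictured with a toy formulation: choose which clone groups to refactor so that estimated benefit is maximized, total effort stays within a budget, and dependency constraints are respected. The brute-force sketch below only illustrates the shape of the problem; the thesis applies a constraint programming approach, and the effort and benefit figures are hypothetical.

```python
# Toy sketch of clone refactoring scheduling as constrained selection:
# maximize benefit subject to an effort budget and "refactor B before A"
# dependencies. Brute force over subsets for illustration only.
from itertools import combinations

# group id: (effort, benefit, set of prerequisite group ids) -- hypothetical numbers
groups = {
    "G1": (4, 10, set()),
    "G2": (3, 6, {"G1"}),
    "G3": (5, 9, set()),
    "G4": (2, 4, {"G3"}),
}
EFFORT_BUDGET = 10

def feasible(selection: set[str]) -> bool:
    within_budget = sum(groups[g][0] for g in selection) <= EFFORT_BUDGET
    deps_met = all(groups[g][2] <= selection for g in selection)
    return within_budget and deps_met

best = max(
    (set(sel) for r in range(len(groups) + 1) for sel in combinations(groups, r) if feasible(set(sel))),
    key=lambda sel: sum(groups[g][1] for g in sel),
)
print(sorted(best), "benefit =", sum(groups[g][1] for g in best))
```

A real scheduler would hand the same constraints to a constraint or integer programming solver rather than enumerating subsets, since the number of clone groups in a large system makes brute force infeasible.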
We evaluated our clone detector, scheduler and effort model through comparative empirical studies and user studies. Finally, based on our experience and an in-depth analysis of the present state of the art, we expose avenues for further research and development towards the versatile clone management system that we envision.
Forecasting the Long Term Economics Status of Bangladesh Using Machine Learning Approaches from 2016-2036
It is welcome news that Bangladesh has now graduated to developing-country status, as recognized by the United Nations and the World Bank, on the condition that this economic progress continues until 2024 before the recognition becomes permanent. The economic condition depends on many factors such as Gross Domestic Product (GDP), personal saving, private sector investment, Gross National Income (GNI) per capita, the Human Assets Index (HAI) and the Economic Vulnerability Index (EVI). This paper forecasts the long-term economic condition of Bangladesh, with the year as the independent variable and GDP, private sector investment and personal saving as the dependent variables. The living conditions of a country depend on GDP, and personal saving and private sector investment are also important parts of a country's economy. If these attributes are forecast properly, the economic condition of Bangladesh and the living status of its people can be determined more accurately, and we can assess whether Bangladesh can fulfil the condition for permanent recognition as a developing country. For forecasting these attributes, we propose a model that combines Karl Pearson's correlation coefficient with a modified linear regression technique; to improve performance, we modify the linear regression using gradient boosting. Our experiments show that this model gives more accurate forecasts of GDP, private sector investment and personal saving.
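A minimal sketch of the forecasting pipeline described above, assuming recent SciPy and scikit-learn, might look as follows: compute the Pearson correlation between year and an indicator, fit both a plain linear regression and a gradient-boosted regressor, and extrapolate over the 2016-2036 horizon. The GDP series is synthetic, and the abstract does not specify how the linear regression is "modified by gradient boosting", so the comparison below is only illustrative.

```python
# Illustrative sketch: Pearson correlation check, then linear and
# gradient-boosted fits extrapolated to 2036. Synthetic data; not the
# paper's exact model.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import GradientBoostingRegressor

years = np.arange(2000, 2016).reshape(-1, 1)
gdp = 50 + 6.5 * (years.ravel() - 2000) + np.random.default_rng(1).normal(0, 3, len(years))

r, p = pearsonr(years.ravel(), gdp)
print(f"Pearson r = {r:.3f} (p = {p:.3g})")        # strength of the year-GDP relationship

linear = LinearRegression().fit(years, gdp)
boosted = GradientBoostingRegressor(n_estimators=200, max_depth=2).fit(years, gdp)

future = np.arange(2016, 2037).reshape(-1, 1)       # forecast horizon 2016-2036
print("linear 2036 forecast:", round(float(linear.predict(future)[-1]), 1))
print("boosted 2036 forecast:", round(float(boosted.predict(future)[-1]), 1))
```

One practical caveat: tree-based gradient boosting plateaus outside the range of training years, which is one reason a linear component matters for long-horizon extrapolation.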
Forecasting of Breast Cancer and Diabetes Using Ensemble Learning
Machine learning algorithms, a subset of artificial intelligence, play an important role in our lives, and AI is increasingly used to make everyday tasks easier. In the medical field, machine learning is used for the recognition and classification of diseases, and it can classify cancer, diabetes and other diseases from datasets with good accuracy. We propose a model that combines a support vector machine (SVM) with AdaBoost; such a combined method is known as an ensemble learner. In this paper we predict diabetes and breast cancer, using the SVM for classification and AdaBoost for boosting. The number of diabetes patients is increasing rapidly, and diabetes causes many other conditions such as kidney failure and eye disorders; no medicine has yet been invented to prevent it fully. Breast cancer is also increasing rapidly among women, and the cost of its treatment is very high. Much research is ongoing on diabetes and breast cancer, and we propose our model to predict these diseases more accurately than previous models.
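A minimal sketch of the SVM-plus-AdaBoost ensemble, assuming scikit-learn 1.2 or later (where the base learner is passed via the estimator parameter), is shown below on scikit-learn's built-in breast cancer dataset. The hyperparameters and split are illustrative choices rather than the paper's settings, and the paper's diabetes dataset is not identified in the abstract.

```python
# Illustrative SVM + AdaBoost ensemble on scikit-learn's breast cancer data.
# Hyperparameters are placeholders, not the paper's configuration.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# A linear SVM as the weak learner; AdaBoost reweights misclassified samples
# each round. SAMME works with classifiers that lack predict_proba.
base_svm = SVC(kernel="linear", C=0.5)
model = AdaBoostClassifier(estimator=base_svm, n_estimators=20, algorithm="SAMME")
model.fit(X_train, y_train)

print("test accuracy:", round(accuracy_score(y_test, model.predict(X_test)), 3))
```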
A unifying co-operative web caching architecture
Network caching of objects has become a standard way of reducing network traffic and latency on the web. However, individual web caches exhibit poor performance, with a hit rate of about 30%. One solution to improve this hit rate is to have a group of proxies form a co-operative in which objects can be cached for later retrieval. A co-operative cache system includes protocols for hierarchical and transversal caching. The drawback of such a system lies in the resulting network load, due to the number of messages that need to be exchanged to locate an object. This paper proposes a new co-operative web caching architecture that unifies previous methods of web caching. Performance results show that the architecture achieves a co-operative hit rate of up to 70% and accesses a cached object in at most two hops. Moreover, the architecture is scalable, with low traffic and database overhead.
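The abstract does not describe the unifying protocol itself, but the general idea of a co-operative lookup bounded to a couple of hops can be sketched as follows: serve from the local cache if possible, otherwise from a sibling proxy found through a shared directory, and only then from the origin server. The class and directory structure below are hypothetical, not the paper's architecture.

```python
# Toy sketch of co-operative proxy caching with a shared location directory.
# Illustrates the idea of a bounded-hop lookup; not the paper's protocol.
class Proxy:
    def __init__(self, name: str, directory: dict[str, "Proxy"]):
        self.name = name
        self.cache: dict[str, str] = {}
        self.directory = directory                     # shared url -> caching proxy map

    def fetch(self, url: str) -> tuple[str, str]:
        if url in self.cache:                          # hop 0: local hit
            return self.cache[url], f"local hit at {self.name}"
        peer = self.directory.get(url)
        if peer is not None and url in peer.cache:     # hop 1: co-operative hit
            self.cache[url] = peer.cache[url]
            return peer.cache[url], f"co-operative hit via {peer.name}"
        body = f"<content of {url}>"                   # hop 2: origin server
        self.cache[url] = body
        self.directory[url] = self
        return body, "miss, fetched from origin"

directory: dict[str, "Proxy"] = {}
p1, p2 = Proxy("proxy-1", directory), Proxy("proxy-2", directory)
print(p1.fetch("http://example.org/a")[1])   # miss, fetched from origin
print(p2.fetch("http://example.org/a")[1])   # co-operative hit via proxy-1
```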