Weather station and electronic device controller: with Texas Instruments SimpleLink CC3200 Wi-Fi SensorTag and Raspberry Pi
In this thesis, the author presents a demonstration of a web-based weather station and electronic controller application whose main components are a Raspberry Pi 3 Model B and a Texas Instruments SimpleLink CC3200 Wi-Fi SensorTag. The application wirelessly collects sensor data and displays them in real-time graphical reports. The sensor data are also used to make existing consumer electronics respond to changes in the surrounding environment, and the application allows the user to switch connected electronic devices on and off manually.
The developer uses the Node-RED editor, Node.js modules, and the JavaScript programming language to build the application. In this project, the SensorTag kit sends all collected data to the IBM Watson IoT Platform; a Node-RED flow hosted on the Raspberry Pi receives and processes the data to produce real-time graphical reports. The data are also used to control a four-channel relay module connected to the Raspberry Pi GPIO pins, which opens and closes a 220 V AC power circuit.
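As an illustration of the relay-switching idea only, and not the author's Node-RED flow, a minimal Python sketch using the RPi.GPIO library might look like the following; the pin number, active-low wiring, and temperature threshold are hypothetical.

```python
# Minimal sketch: switch one relay channel based on a sensor reading.
# Assumptions: RPi.GPIO is available on the Raspberry Pi, the relay
# channel is wired to BCM pin 17, and the relay module is active-low
# (common for 4-relay boards). Pin number and threshold are hypothetical.
import RPi.GPIO as GPIO

RELAY_PIN = 17           # BCM numbering, hypothetical wiring
TEMP_THRESHOLD_C = 30.0  # switch the connected device on above this reading

GPIO.setmode(GPIO.BCM)
GPIO.setup(RELAY_PIN, GPIO.OUT, initial=GPIO.HIGH)  # HIGH = relay off (active-low)

def update_relay(temperature_c: float) -> None:
    """Close the relay (power the device) when the reading exceeds the threshold."""
    if temperature_c > TEMP_THRESHOLD_C:
        GPIO.output(RELAY_PIN, GPIO.LOW)   # energise relay, close the 220 V circuit
    else:
        GPIO.output(RELAY_PIN, GPIO.HIGH)  # release relay, open the circuit
```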
The outcome of the thesis is a web-based application with all of the expected functions. After the testing phases, obstacles, limitations, and future improvements are pointed out. Although the SensorTag is still under development and offers limited functions, the demonstration serves as a concept for further IoT development. The author also gained a substantial amount of knowledge about technology, programming, and electronics while developing this project.
INTERACTIONS OF DECISION MECHANISMS IN AN ENTREPRENEURIAL FIRM’S JOURNEY TO BECOME A MAJOR PLAYER IN ITS INDUSTRY
Improving software quality with programming patterns
Software systems and services are increasingly important, involving and improving the work and lives of billions of people. However, software development is still human-intensive and error-prone. Established studies report that software failures cost the global economy $312 billion annually and that software vendors often spend 50-75% of the total development cost on finding and fixing bugs, i.e. subtle programming errors that cause software failures.
People rarely develop software from scratch, but frequently reuse existing software artifacts. In this dissertation, we focus on programming patterns, i.e. frequently occurring code resulting from reuse, and explore their potential for improving software quality. Specifically, we develop techniques for recovering programming patterns and using them to find, fix, and prevent bugs more effectively.
This dissertation has two main contributions. One is the Graph-based Object Usage Model (GROUM), a graph-based representation of source code. A GROUM abstracts a fragment of code as a graph representing its object usages. In a GROUM, nodes correspond to function calls and control structures, while edges capture the control and data relationships between them. Based on GROUM, we developed a graph mining technique that can recover programming patterns of API usage and use them to detect bugs. GROUM is also used to find similar bugs and recommend similar bug fixes.
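To make the structure concrete, a GROUM-like graph can be sketched as a small directed graph whose nodes are method calls or control structures and whose edges record control order or data dependence. The sketch below is a rough illustration under those assumptions, not the authors' implementation; the class and call names are hypothetical.

```python
# Minimal sketch of a GROUM-like graph: nodes are object-usage actions
# (method calls, control structures); edges carry a label distinguishing
# control order from data dependence. All names here are hypothetical.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Node:
    kind: str    # "call" or "control"
    label: str   # e.g. "FileReader.read" or "IF"

@dataclass
class Groum:
    nodes: list[Node] = field(default_factory=list)
    edges: list[tuple[Node, Node, str]] = field(default_factory=list)  # (src, dst, "control"|"data")

    def add_edge(self, src: Node, dst: Node, dep: str) -> None:
        for n in (src, dst):
            if n not in self.nodes:
                self.nodes.append(n)
        self.edges.append((src, dst, dep))

# Usage: FileReader.read happens before FileReader.close, and the same
# reader object flows between them (a data dependence).
g = Groum()
read = Node("call", "FileReader.read")
close = Node("call", "FileReader.close")
g.add_edge(read, close, "control")
g.add_edge(read, close, "data")
```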
The other main contribution of this dissertation is SLAMC, a Statistical Semantic LAnguage Model for Source Code. SLAMC represents code as sequences of code elements of different roles, e.g. data types, variables, or functions, and annotates those elements with sememes, a text-based annotation of their semantic information. SLAMC models the regularities over the sememe sequences using code-based factors such as local code context, global concerns, and pairwise associations, and thus implicitly captures programming idioms and patterns as sequences with high probabilities. Based on SLAMC, we developed a technique for recommending the most likely next code sequences, which could improve programming productivity and might reduce the odds of programming errors.
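As a simplified stand-in for the idea (not SLAMC itself), the sketch below scores candidate next tokens with a bigram count model over sememe-like tokens; the training sequences and token names are hypothetical.

```python
# Simplified stand-in for a statistical code language model: a bigram
# count model over sememe-like tokens that suggests the most likely
# next token. The training sequences below are hypothetical.
from collections import Counter, defaultdict

class BigramCodeModel:
    def __init__(self) -> None:
        self.counts: dict[str, Counter] = defaultdict(Counter)

    def train(self, sequences: list[list[str]]) -> None:
        for seq in sequences:
            for prev, nxt in zip(seq, seq[1:]):
                self.counts[prev][nxt] += 1

    def suggest(self, prev: str, k: int = 3) -> list[str]:
        """Return the k most frequently observed next tokens after `prev`."""
        return [tok for tok, _ in self.counts[prev].most_common(k)]

model = BigramCodeModel()
model.train([
    ["VAR:reader", "CALL:FileReader.read", "CALL:FileReader.close"],
    ["VAR:reader", "CALL:FileReader.read", "CALL:Buffer.append"],
])
print(model.suggest("CALL:FileReader.read"))  # e.g. ['CALL:FileReader.close', 'CALL:Buffer.append']
```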
Empirical evaluation shows that our approaches can detect meaningful programming patterns as well as anomalies that might cause bugs or maintenance issues, and thus could improve software quality. In addition, our models have been successfully applied to several other problems, including library adaptation, code migration, and bug-fix generation. They also have several other potential applications, which we will explore in future work.
Policy research - implications of liberalisation of fish trade for developing countries. A case study of Vietnam
Since the end of the 1980s, when Viet Nam launched the doi moi policy of renovation, economic and social conditions have improved remarkably for the majority of the population. Poverty levels have been halved over the past ten years, social services have improved, and the economy in general continues to grow at a high rate. Vietnam has made significant strides in poverty reduction: using the international poverty line, poverty incidence fell from 37.4% in 1998 to 28.9% in 2002, a decline of roughly two percentage points annually. Indicators such as access to basic services like electricity, clean water, health care and education show substantial improvement, especially in rural and remote mountainous areas. However, poverty remains high and the disparity among regions and ethnic groups is increasing. Poverty is still mainly concentrated in rural areas, where 90% of poor people live.
In Vietnam the fisheries sector, especially coastal and inland aquaculture, is a prioritized sector for development. It is important not only for national income through exports but also as a subsistence activity for poverty reduction among the large rural population. Several million people depend on aquatic resources, directly or indirectly, for their livelihoods in inland and coastal areas. According to official employment statistics, one in every twenty-five persons in Vietnam is engaged in a fisheries activity, which implies a considerable labour force of around 3 million people directly employed in the country's fisheries sector.
Farming of catfish, reared in floating cages and ponds, is an important freshwater aquaculture activity in Vietnam. “Tra” (Pangasius hypophthalmus) and “basa” (Pangasius bocourti) farming is a traditional occupation and a means of livelihood for farmers in the Mekong Delta in the south of Vietnam (Trong et al., 2002). Thanks to the Government's trade liberalization reforms, catfish production has increased substantially in recent years, catering to increased international demand and market opportunities. Volumes of catfish fillet exported by Vietnamese export companies increased from 5,000 tonnes in 1996 to 10,000 tonnes in 2001 (90% of which was the tra species), with half exported to the United States.
Internet use and agricultural productivity in rural Vietnam
The use of the internet is growing rapidly and has become an engine for economic development. However, few studies have examined the impact of internet use on agricultural production, and the results are not yet conclusive. Employing a dataset of more than 2,000 observations in rural Vietnam, our study analyses the impact of internet use on agricultural productivity using the heteroscedasticity-based instrument approach suggested by Lewbel (Journal of Business and Economic Statistics, 2012, 30, 67–80) and examines the heterogeneity and distribution of the impact using quantile regressions. Our results show that internet use has significant and positive effects on agricultural productivity. However, these effects are heterogeneous across population groups. The positive effects of internet use are stronger for households with a lower level of education, with a young and female head, and from ethnic minorities. The benefits are also found to be skewed towards the group of farmers at the bottom of the productivity distribution. We therefore propose facilitating the diffusion of the internet, since it not only boosts agricultural productivity but also reduces productivity inequality. In addition, we recommend promoting rural education, supporting local markets, investing more in irrigation systems, and facilitating farm mechanisation, as these factors are found to contribute to increasing agricultural productivity.
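As a schematic of the quantile-regression step only (the authors' Lewbel instrument construction is omitted and not reproduced here), a minimal sketch with statsmodels might look like the following; the column names and data are hypothetical.

```python
# Minimal sketch: estimate the association between internet use and
# (log) productivity at several quantiles with statsmodels.
# Data and column names are hypothetical; the Lewbel heteroscedasticity-based
# instrument step described in the paper is not implemented here.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "log_productivity": [7.1, 7.4, 6.8, 7.9, 7.2, 8.0, 6.5, 7.6],
    "internet_use":     [0,   1,   0,   1,   0,   1,   0,   1],
    "education_years":  [6,   9,   5,   12,  7,   10,  4,   8],
})

for q in (0.25, 0.5, 0.75):
    fit = smf.quantreg("log_productivity ~ internet_use + education_years", df).fit(q=q)
    print(q, round(fit.params["internet_use"], 3))  # coefficient at each quantile
```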
Comparison of DNA Extraction Methods in the Detection of Genetically Modified Foods Using Polymerase Chain Reaction
Genetically modified organisms (GMOs) can be defined as organisms whose genetic material has been altered in a way that does not occur naturally by mating or natural recombination. According to the Novel Food Regulations (EC/258/97, EC/1139/98, EC/49/2000, EC/50/2000 and EC/1829/2003), foods and food ingredients derived from GMOs are strictly regulated and must be labelled. The polymerase chain reaction (PCR) method is used to detect GM events in foods. The specific objectives of this study are to establish a sensitive, robust and rapid operating method for the detection of GM events using PCR and to conduct a preliminary survey of the distribution of animal feeds and foods derived from GM events in both Malaysia and Vietnam.
The two critical factors taken into account to achieve these objectives are the applicability of different DNA extraction methods to each kind of sample and the PCR amplification conditions.
Five DNA extraction methods (the Wizard method from Switzerland, the modified Wizard method with addition of beta-mercaptoethanol, the combination of the Wizard and CTAB methods, the CTAB method from Germany, and the modified CTAB method with addition of beta-mercaptoethanol) were optimized. The yield and quality of DNA obtained from raw soyabean, raw maize, animal feed, smooth tofu and soya milk samples were examined to determine the optimum method for each kind of sample.
The results of the comparative analysis showed that the CTAB method was the optimal protocol for extracting total DNA from raw soyabean, raw maize and animal feed samples, with yields of 32.7, 28.4 and 33.4 ng DNA/mg sample, respectively. In addition, the DNA quality (A260/A280 ratios ranged from 1.9 to 2.0) was good enough not only for PCR amplification but also for DNA sequencing. However, the Wizard method was the best candidate for DNA isolation from smooth tofu (13.2 ng DNA/mg sample) and soya milk (3.4 ng DNA/mg sample), with relatively high DNA quality (A260/A280 ratios of 1.7 and 1.9, respectively).
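For context on how such figures are typically derived (a generic spectrophotometric calculation, not this thesis's specific protocol), DNA concentration and purity are usually estimated from UV absorbance: an A260 of 1.0 corresponds to roughly 50 ng/µL of double-stranded DNA, and the A260/A280 ratio indicates protein contamination. A minimal sketch with hypothetical readings:

```python
# Generic spectrophotometric estimate of DNA yield and purity.
# Assumptions: double-stranded DNA (A260 of 1.0 ~= 50 ng/uL), 1 cm path
# length; the readings, volumes and sample mass below are hypothetical.
DS_DNA_FACTOR_NG_PER_UL = 50.0

def dna_yield_and_purity(a260: float, a280: float, dilution: float,
                         elution_volume_ul: float, sample_mass_mg: float):
    concentration = a260 * dilution * DS_DNA_FACTOR_NG_PER_UL          # ng/uL
    yield_per_mg = concentration * elution_volume_ul / sample_mass_mg  # ng DNA / mg sample
    purity = a260 / a280                                               # ~1.8-2.0 indicates clean DNA
    return yield_per_mg, purity

# Hypothetical reading for a 100 mg sample eluted in 50 uL, diluted 10x:
print(dna_yield_and_purity(a260=0.65, a280=0.34, dilution=10,
                           elution_volume_ul=50, sample_mass_mg=100))
```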
The results of the survey showed that 20 out of 24 animal feed samples were contaminated with at least one of three introduced genetic elements: the promoter (P35S), the terminator (NOS) and the structural gene (EPSPS). In particular, all 16 animal feed samples from Malaysia and four out of eight animal feed samples from Vietnam were GM-contaminated products. In contrast, none of the soybean samples (12 samples) or maize samples (24 samples) were positive in these assays; they were therefore categorized as non-GM products.
These results revealed that the PCR amplification method provides the key advantages of high sensitivity and robust, rapid operation, provided that the experiments are carefully designed to avoid both false-negative and false-positive results. The five primer pairs chosen in this study (LEC1/LEC2, ZE03/ZE04, P35S 1-5'/P35S 2-3', HA-NOS118-F/HA-NOS118-R and EPSPS 1-5'/EPSPS 3-3') produced amplicons of 164, 277, 101, 118 and 118 base pairs, respectively, which fulfilled the product-size requirement and completed the whole GM-event detection procedure for raw soybean, raw maize and animal feed samples. In addition, the PCR amplification and DNA sequencing protocols presented in this study should provide a very useful tool for routine GM-event detection in foods and feeds with regard to false-negative and false-positive results.
LAPFormer: A Light and Accurate Polyp Segmentation Transformer
Polyp segmentation is still known as a difficult problem due to the large variety of polyp shapes and of scanning and labeling modalities, which prevents deep learning models from generalizing well on unseen data. However, Transformer-based approaches have recently achieved remarkable performance, as they extract global context better than CNN-based architectures and thus lead to better generalization. To leverage this strength of the Transformer, we propose a new model with an encoder-decoder architecture named LAPFormer, which uses a hierarchical Transformer encoder to better extract global features, combined with our novel CNN (Convolutional Neural Network) decoder for capturing the local appearance of polyps. Our proposed decoder contains a progressive feature fusion module designed to fuse features from upper and lower scales and to make multi-scale features more correlated. Besides, we also use a feature refinement module and a feature selection module for feature processing. We test our model on five popular benchmark datasets for polyp segmentation: Kvasir, CVC-ClinicDB, CVC-ColonDB, CVC-T, and ETIS-Larib.
Comment: 7 pages, 7 figures, ACL 2023 under review
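To illustrate what a progressive fusion step between two encoder scales might look like (a generic sketch under common conventions, not the LAPFormer implementation), the block below upsamples the coarser feature map, concatenates it with the finer one, and mixes them with a 1x1 convolution; the channel counts and shapes are hypothetical.

```python
# Generic sketch of progressive feature fusion across encoder scales:
# upsample the coarser (upper-scale) feature map, concatenate it with the
# finer (lower-scale) map, and mix with a 1x1 convolution.
# Channel sizes and tensor shapes are hypothetical, not LAPFormer's.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FusionBlock(nn.Module):
    def __init__(self, coarse_ch: int, fine_ch: int, out_ch: int):
        super().__init__()
        self.mix = nn.Sequential(
            nn.Conv2d(coarse_ch + fine_ch, out_ch, kernel_size=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, coarse: torch.Tensor, fine: torch.Tensor) -> torch.Tensor:
        coarse_up = F.interpolate(coarse, size=fine.shape[2:], mode="bilinear",
                                  align_corners=False)
        return self.mix(torch.cat([coarse_up, fine], dim=1))

# Hypothetical feature maps from two encoder stages of a 352x352 input.
coarse = torch.randn(1, 320, 11, 11)
fine = torch.randn(1, 128, 22, 22)
fused = FusionBlock(320, 128, 128)(coarse, fine)
print(fused.shape)  # torch.Size([1, 128, 22, 22])
```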
THE EFL 8TH GRADERS’ ATTITUDES TOWARDS THE USE OF COMPETENCY-BASED INSTRUCTION IN LISTENING COMPREHENSION AT A SECONDARY SCHOOL IN KIEN GIANG, VIETNAM
Listening is a vital component of language acquisition because it can foster the improvement of other language skills. Therefore, enhancing students' listening constantly receives prominent attention from English teachers at secondary schools. Moreover, applying competency-based instruction to teaching listening is a novel method that helps teachers understand students' attitudes, which affect their listening comprehension. This study therefore aimed to examine students' attitudes toward the use of competency-based instruction in listening for the main idea and for specific information. The study employed a qualitative approach to determine the attitudes of 45 8th graders at a secondary school in Kien Giang province, with the students' diaries as the main data collection instrument. The study's findings showed that the students' views affected their listening skills, and most of them had positive attitudes toward the use of competency-based instruction in listening comprehension.