DeepLab: Semantic Image Segmentation with Deep Convolutional Nets, Atrous Convolution, and Fully Connected CRFs
In this work we address the task of semantic image segmentation with Deep
Learning and make three main contributions that are experimentally shown to
have substantial practical merit. First, we highlight convolution with
upsampled filters, or 'atrous convolution', as a powerful tool in dense
prediction tasks. Atrous convolution allows us to explicitly control the
resolution at which feature responses are computed within Deep Convolutional
Neural Networks. It also allows us to effectively enlarge the field of view of
filters to incorporate larger context without increasing the number of
parameters or the amount of computation. Second, we propose atrous spatial
pyramid pooling (ASPP) to robustly segment objects at multiple scales. ASPP
probes an incoming convolutional feature layer with filters at multiple
sampling rates and effective fields-of-view, thus capturing objects as well as
image context at multiple scales. Third, we improve the localization of object
boundaries by combining methods from DCNNs and probabilistic graphical models.
The commonly deployed combination of max-pooling and downsampling in DCNNs
achieves invariance but takes a toll on localization accuracy. We overcome this
by combining the responses at the final DCNN layer with a fully connected
Conditional Random Field (CRF), which is shown both qualitatively and
quantitatively to improve localization performance. Our proposed "DeepLab"
system sets a new state-of-the-art on the PASCAL VOC-2012 semantic image
segmentation task, reaching 79.7% mIOU on the test set, and advances the
results on three other datasets: PASCAL-Context, PASCAL-Person-Part, and
Cityscapes. All of our code is made publicly available online.
Comment: Accepted by TPAMI
Video Game Development in a Rush: A Survey of the Global Game Jam Participants
Video game development is a complex endeavor, often involving complex
software, large organizations, and aggressive release deadlines. Several
studies have reported that periods of "crunch time" are prevalent in the video
game industry, but there are few studies on the effects of time pressure. We
conducted a survey with participants of the Global Game Jam (GGJ), a 48-hour
hackathon. Based on 198 responses, the results suggest that: (1) iterative
brainstorming is the most popular method for conceptualizing initial
requirements; (2) continuous integration, minimum viable product, scope
management, version control, and stand-up meetings are frequently applied
development practices; (3) regular communication, internal playtesting, and
dynamic and proactive planning are the most common quality assurance
activities; and (4) familiarity with agile development has a weak correlation
with perception of success in GGJ. We conclude that GGJ teams rely on ad hoc
approaches to development and face-to-face communication, and recommend some
complementary practices with limited overhead. Furthermore, as our findings are
similar to recommendations for software startups, we posit that game jams and
the startup scene share contextual similarities. Finally, we discuss the
drawbacks of systemic "crunch time" and argue that game jam organizers are in a
good position to problematize the phenomenon.
Comment: Accepted for publication in IEEE Transactions on Games
Understanding the user - why, what and how?
Explains the need, importance, purposes and scope of user studies; discusses the procedure for conducting sound user studies together with the associated research problems, such as selection of the problem, formulation of the hypothesis, design of the study, sampling strategy, data collection methods, scaling techniques, the pilot study, processing and analysis of data, testing of the hypothesis, interpretation, drawing of inferences, and communication and dissemination of results; and finally concludes by highlighting methodological flaws and gaps in user studies.
Guidelines for physical weed control research: flame weeding, weed harrowing and intra-row cultivation
A prerequisite for good research is the use of appropriate methodology. In order to promote sound research methodology, this paper presents some tentative guidelines for physical weed control research in general, and for flame weeding, weed harrowing and intra-row cultivation in particular. Issues include the adjustment and use of mechanical weeders and other equipment, the recording of impact factors that affect weeding performance, methods to assess effectiveness, the layout of treatment plots, and the conceptual models underlying the experimental designs (e.g. factorial comparison, dose response).
First of all, the research aims need to be clearly defined, an appropriate experimental design produced and statistical methods chosen accordingly. Suggestions on how to do this are given. For assessments, quantitative measures would be ideal, but as they require more resources, visual classification may in some cases be more feasible. The timing of assessment affects the results and their interpretation.
When describing the weeds and crops, one should list the crops and the most abundantly present weed species involved, giving their density and growth stages at the time of treatment. The location of the experimental field, soil type, soil moisture and amount of fertilization should be given, as well as weather conditions at the time of treatment.
The researcher should describe the weed control equipment and its adjustments accurately, preferably according to the prevailing practice within the discipline. Parameters to record include, for example, gas pressure, burner properties, burner cover dimensions and LPG consumption in flame weeding; and speed, tine angle, number of passes and driving direction in weed harrowing.
The authors hope this paper will increase comparability among experiments, help less experienced scientists avoid mistakes and essential omissions, and foster the advance of knowledge on non-chemical weed management.
Capturing in-situ Feelings and Experiences of Public Transit Riders Using Smartphones
High-density urban environments are susceptible to ever-growing traffic congestion, which speaks to the importance of implementing and maintaining effective and sustainable transportation networks. While transit-oriented developments offer the potential to help mitigate congestion, transit networks ought to be safe and reliable for their user communities. As such, it is imperative to capture meaningful data about transit experiences and to deduce how transit networks can be enhanced or modified to maintain positive transit experiences. Historically, it has been difficult to measure how people feel whilst using public transportation without leaning on recall memory, which can be vague and is often less detailed than in-situ observations of the transit-user community. This thesis explores the feasibility of using smartphones to capture meaningful in-situ data that leverage the benefits of the Experience Sampling Method (ESM) while also addressing some of its limitations. Students travelled along Grand River Transit bus routes in Waterloo, Ontario, from Wilfrid Laurier University to Conestoga Mall and back, using alternate routes. The mobile survey captured qualitative and quantitative data from 145 students to explore variations in wellbeing and the extent to which environmental variables can influence transit experiences. There were many findings to consider for future research, especially the overall role anxiety played in transit experiences. In addition, the results indicate that the methodology is appropriate for further research and can be applied to a wide range of research topics. In particular, it is recommended that a similar study be applied to a much larger and more representative sample of the transit-user community.
Future directions are discussed as key considerations for leveraging the benefits of ESM research and the promise it holds for enhancing transit experiences and strengthening the cohesion of transit-user communities.
An intuitive control space for material appearance
Many different techniques for measuring material appearance have been
proposed in the last few years. These have produced large public datasets,
which have been used for accurate, data-driven appearance modeling. However,
although these datasets have allowed us to reach an unprecedented level of
realism in visual appearance, editing the captured data remains a challenge. In
this paper, we present an intuitive control space for predictable editing of
captured BRDF data, which allows for artistic creation of plausible novel
material appearances, bypassing the difficulty of acquiring novel samples. We
first synthesize novel materials, extending the existing MERL dataset up to 400
mathematically valid BRDFs. We then design a large-scale experiment, gathering
56,000 subjective ratings on the high-level perceptual attributes that best
describe our extended dataset of materials. Using these ratings, we build and
train networks of radial basis functions to act as functionals mapping the
perceptual attributes to an underlying PCA-based representation of BRDFs. We
show that our functionals are excellent predictors of the perceived attributes
of appearance. Our control space enables many applications, including intuitive
material editing of a wide range of visual properties, guidance for gamut
mapping, analysis of the correlation between perceptual attributes, or novel
appearance similarity metrics. Moreover, our methodology can be used to derive
functionals applicable to classic analytic BRDF representations. We release our
code and dataset publicly, in order to support and encourage further research
in this direction.
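The mapping the authors describe, from high-level perceptual attribute ratings to an underlying PCA-based BRDF representation via radial basis functions, can be illustrated with a toy Gaussian-RBF interpolator. The data, dimensions and `gamma` below are assumptions for illustration, not values from the paper:

```python
import numpy as np

def fit_rbf(X, Y, gamma=1.0):
    """Fit a Gaussian RBF interpolator mapping attribute ratings X
    (n x d) to target coefficients Y (n x m).

    Returns a functional f such that f(Xq) predicts coefficients
    for query ratings Xq. Toy sketch of the idea only.
    """
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-gamma * d2)               # kernel matrix at the centers
    W = np.linalg.solve(Phi, Y)             # interpolation weights

    def f(Xq):
        d2q = ((Xq[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2q) @ W
    return f

# toy data: a 1-D "glossiness" rating -> 2 hypothetical PCA coefficients
X = np.array([[0.0], [0.5], [1.0]])
Y = np.array([[0.0, 1.0], [0.5, 0.5], [1.0, 0.0]])
f = fit_rbf(X, Y)
print(f(X))  # reproduces the training targets at the training points
```

An exact interpolant like this reproduces the subjective ratings at the measured materials while interpolating smoothly between them, which is the behaviour one would want from a perceptual control space.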