HOG, LBP and SVM based Traffic Density Estimation at Intersection
The increasing volume of vehicular traffic on roads is a significant problem.
Heavy traffic causes congestion, delays, pollution, monetary loss, health
issues, accidents, obstruction of emergency vehicle passage, and traffic
violations, all of which reduce productivity. These issues worsen during peak
hours, and traditional traffic management and control systems fail to address
them. Currently, the traffic lights at intersections are not adaptive and
operate with fixed time delays. An optimized, intelligent control system is
needed to improve the efficiency of traffic flow. Smart traffic systems
estimate traffic density and adjust the traffic lights according to the amount
of traffic. We propose an efficient way to estimate traffic density at an
intersection in real time using image processing and machine learning
techniques. The proposed method captures images of the traffic at a junction
and estimates traffic density from them using a Histogram of Oriented
Gradients (HOG), Local Binary Patterns (LBP) and Support Vector Machine (SVM)
based approach. The method is computationally inexpensive and runs efficiently
on a Raspberry Pi board. Code is released at
https://github.com/DevashishPrasad/Smart-Traffic-Junction.
Comment: paper accepted at IEEE PuneCon 201
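The HOG + LBP + SVM pipeline described in the abstract can be sketched as
follows. This is an illustrative sketch, not the authors' released code: it
assumes scikit-image and scikit-learn, and the "low density" vs "high density"
patches are synthetic stand-ins for real camera frames of a junction.

```python
import numpy as np
from skimage.feature import hog, local_binary_pattern
from sklearn.svm import SVC

def extract_features(img_u8):
    """Concatenate a HOG descriptor with a uniform-LBP histogram."""
    hog_vec = hog(img_u8, orientations=9, pixels_per_cell=(8, 8),
                  cells_per_block=(2, 2))
    lbp = local_binary_pattern(img_u8, P=8, R=1, method="uniform")
    # "uniform" LBP with P=8 yields codes 0..9, hence 10 histogram bins
    lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    return np.concatenate([hog_vec, lbp_hist])

rng = np.random.default_rng(0)

def patch(density):
    """Synthetic 64x64 road patch: flat background plus 'vehicle' speckle."""
    img = rng.normal(60, 5, (64, 64)) + density * 120 * (rng.random((64, 64)) > 0.95)
    return np.clip(img, 0, 255).astype(np.uint8)

# 10 low-density and 10 high-density patches as a toy training set
X = np.array([extract_features(patch(d)) for d in [0] * 10 + [1] * 10])
y = np.array([0] * 10 + [1] * 10)   # 0 = low density, 1 = high density

clf = SVC(kernel="linear").fit(X, y)
print("training accuracy:", clf.score(X, y))
```

In a deployed system the classifier would be trained on labeled frames from
the intersection camera, and the lightweight feature extraction is what makes
the approach feasible on a Raspberry Pi.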
An Overview on Application of Machine Learning Techniques in Optical Networks
Today's telecommunication networks have become sources of enormous amounts of
widely heterogeneous data. This information can be retrieved from network
traffic traces, network alarms, signal quality indicators, users' behavioral
data, etc. Advanced mathematical tools are required to extract meaningful
information from these network-generated data and to make decisions about the
proper functioning of the networks. Among these
mathematical tools, Machine Learning (ML) is regarded as one of the most
promising methodological approaches to perform network-data analysis and enable
automated network self-configuration and fault management. The adoption of ML
techniques in the field of optical communication networks is motivated by the
unprecedented growth of network complexity faced by optical networks in the
last few years. Such complexity increase is due to the introduction of a huge
number of adjustable and interdependent system parameters (e.g., routing
configurations, modulation format, symbol rate, coding schemes, etc.) that are
enabled by the usage of coherent transmission/reception technologies, advanced
digital signal processing and compensation of nonlinear effects in optical
fiber propagation. In this paper we provide an overview of the application of
ML to optical communications and networking. We classify and survey relevant
literature dealing with the topic, and we also provide an introductory tutorial
on ML for researchers and practitioners interested in this field. Although a
good number of research papers have recently appeared, the application of ML to
optical networks is still in its infancy: to stimulate further work in this
area, we conclude the paper by proposing possible new research directions.