3,683 research outputs found

    A Novel Technique to Detect and Track Multiple Objects in Dynamic Video Surveillance Systems

    Video surveillance is one of the key technologies for monitoring different areas of modern society, such as public spaces, city traffic, and forests, and it has become especially relevant in the digital era. The need for video surveillance systems and their video analytics has become inevitable due to rising crime and unethical behavior, making the tracking of individual objects in video an essential capability. As surveillance becomes more widespread, the performance of such systems must also improve to keep pace with ever-increasing crime rates. Many methodologies have been introduced, ranging from single-object detection with one or more cameras to multiple-object detection with one or more cameras; despite this, performance benchmarks and metrics need further improvement. While mechanisms exist for single- or multiple-object detection and prediction on videos or images, none meets the criteria of detecting and tracking multiple objects in both static and dynamic environments. Real-world multiple-object detection and prediction systems are therefore needed that are both accurate and fast and that can be adopted in static and dynamic environments. This paper introduces the Densely Feature selection Convolutional neural Network – Hyper Parameter tuning (DFCNHP), a hybrid protocol with fast prediction time and high accuracy. The proposed system successfully tracked multiple objects from multiple channels and combines dense blocks, feature selection, background subtraction, and Bayesian methods. The experiments demonstrated an accuracy of 98% and a prediction time of 1.11; these results were also compared with existing methods such as Kalman Filtering (KF) and a Deep Neural Network (DNN).
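The abstract names background subtraction as one stage of the DFCNHP pipeline. The paper's actual architecture is not reproduced here, but the idea of that stage can be sketched with minimal frame differencing (the array sizes and threshold value are illustrative, not from the paper):

```python
import numpy as np

def foreground_mask(frame, background, threshold=25):
    """Flag pixels whose absolute difference from the background model
    exceeds a threshold (simple background subtraction)."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold

# Static background with one bright moving "object".
background = np.zeros((4, 4), dtype=np.uint8)
frame = background.copy()
frame[1:3, 1:3] = 200  # object occupies a 2x2 patch

mask = foreground_mask(frame, background)
print(mask.sum())  # 4 foreground pixels
```

Real systems maintain an adaptive background model rather than a fixed one, but the thresholded difference is the core operation.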

    Internationalization of emerging Indian multinationals: linkage, leverage and learning (LLL) perspective

    The multi-polar world in which we now live and work demands re-examination and refinement of the traditional understanding of the internationalization strategies and competitive advantages of multinational firms by incorporating the characteristics of firms from emerging economies. Based on interviews in four Indian multinationals in different industry segments, we present the "voices" of Indian corporate leaders to provide preliminary evidence on the primary motives behind the internationalization process of emerging multinationals from the perspective of linkage, leverage and learning (LLL). We show how the case study organizations have evolved into credible global players by leveraging their learning through targeted acquisitions in developed markets to acquire intangible assets and/or by following global clients in search of new markets and competitive advantages.

    Hypertext transfer protocol performance analysis in traditional and software defined networks during Slowloris attack

    The extensive use of the internet has resulted in novel technologies and protocol improvements. Hypertext transfer protocol/1.1 (HTTP/1.1) is widely adopted on the internet. However, HTTP/2 is more efficient over the transmission control protocol (TCP): it withstands payload overhead better than HTTP/1.1 by multiplexing multiple requests. However, both protocols are highly susceptible to application-level denial of service (DoS) attacks. In this research, a slow-rate DoS attack called Slowloris is detected on Apache2 servers running both versions of HTTP, in traditional networks and in software defined networks (SDN). Server metrics such as server connection time to the webpage, latency in receiving a response from the server, page load time, response-response gap, and inter-packet arrival time at the server are monitored to analyze attack activity. A Monte Carlo simulation is used to estimate threshold values for server connection time and latency for attack detection. This work is implemented in a lab environment using virtual machines, a Ryu controller, a Zodiac FX OpenFlow switch, and Apache2 servers. This study also highlights SDN's security benefits over traditional networks.
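The study estimates detection thresholds for connection time and latency via Monte Carlo simulation. As a sketch of that idea (the actual traffic model and percentile are not given in the abstract; an exponential latency model with a 50 ms mean is assumed here), a threshold can be taken as a high percentile of simulated benign latencies:

```python
import random

def monte_carlo_threshold(n_samples=100_000, percentile=0.99, seed=1):
    """Estimate a detection threshold as a high percentile of latencies
    drawn from an assumed benign-traffic model (exponential here)."""
    rng = random.Random(seed)
    samples = sorted(rng.expovariate(1 / 50.0) for _ in range(n_samples))
    return samples[int(percentile * n_samples)]

threshold = monte_carlo_threshold()
# Latencies above the threshold are treated as possible Slowloris activity,
# since the attack holds connections open far longer than benign clients.
print(threshold > 50.0)  # True: the 99th percentile sits well above the mean
```

Slowloris-held connections then stand out because their observed times exceed a threshold that benign traffic rarely crosses.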

    A study on the hydrogeology and hydrogeochemistry of groundwater from different depths in a coastal aquifer: Annamalai Nagar, Tamilnadu, India

    Chemical characterization of groundwater is essential to establish its nature and utility. Samples of shallow and deep groundwater from the same location were collected and studied for their geochemical characteristics following standard procedures (APHA 1998). Sediment samples from different depths were collected and analysed for minerals using FTIR and SEM. Resistivity logging was carried out in the borewell to understand variations in freshwater potential with depth. The shallow groundwater is dominated by Na-Cl-HCO3-SO4 types and the deeper groundwater by Na-HCO3-SO4-Cl types, and there is significant ionic variation with depth: the ionic strength of the deeper samples is lower than that of the shallower samples. Wide pH variations in the shallow water samples are due to ion exchange processes. A thermodynamic stability plot was used to identify the state of stability; no major difference was inferred between the shallow and the deeper aquifers, as their aquifer matrices are almost similar. Saturation indices of gibbsite, kaolinite, calcite, dolomite, and anhydrite were studied for the shallow and deep aquifers to identify differences in hydrochemical signatures. The Si/Al ratios of the shallow samples are lower than those of the deeper samples. Leaching of secondary salts was the chief mechanism controlling the groundwater chemistry of the region.
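The saturation indices mentioned above follow the standard definition SI = log10(IAP/Ksp), where IAP is the ion activity product. A minimal sketch for calcite, using hypothetical ion activities (the study's measured activities are not given in the abstract):

```python
import math

def saturation_index(iap, ksp):
    """SI = log10(IAP / Ksp): SI > 0 means oversaturated (mineral may
    precipitate), SI < 0 undersaturated (mineral may dissolve)."""
    return math.log10(iap / ksp)

# Hypothetical ion activities for calcite (CaCO3); Ksp ~ 10**-8.48 at 25 C.
a_ca, a_co3 = 10**-3.2, 10**-5.0
si_calcite = saturation_index(a_ca * a_co3, 10**-8.48)
print(round(si_calcite, 2))  # 0.28 -> slightly oversaturated
```

Comparing such indices between shallow and deep samples is what distinguishes the hydrochemical signatures of the two aquifers.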

    IDENTIFYING ENTERPRISE RISK BASED ON BUSINESS CONTEXT WITH THREAT INTELLIGENCE

    Presented herein are techniques that facilitate prioritizing risk mitigation efforts for business-critical services and transactions through the incorporation of a business context into threat intelligence scoring. Under aspects of the presented techniques, traditional threat intelligence tools may be employed to evaluate the risk that is associated with an enterprise asset; the results of such an evaluation may then be augmented with an enterprise-assigned business value for the asset to derive the asset’s business risk; and such a business risk may be leveraged to prioritize risk mitigation efforts, may be combined with other business risks, etc. The above-described process may be referred to herein as Business Risk Management (BRM)
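The disclosure does not specify how the enterprise-assigned business value is combined with the threat score; one simple assumption is a multiplicative weighting, sketched below with hypothetical asset names and scores:

```python
def business_risk(threat_score, business_value, max_score=100):
    """Scale a raw threat-intelligence score by the enterprise-assigned
    business value to produce a capped business-risk score.
    The multiplicative form is an assumption, not from the disclosure."""
    return min(max_score, threat_score * business_value)

# Two assets with the same threat score but different business value.
assets = {"payroll-db": (40, 2.0), "test-vm": (40, 0.5)}
ranked = sorted(assets, key=lambda a: business_risk(*assets[a]), reverse=True)
print(ranked)  # ['payroll-db', 'test-vm']
```

The point of the ranking is that identical threat-intelligence scores no longer imply identical mitigation priority once business context is applied.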

    ASSESSING RISK INTRODUCED THROUGH A CODE CHANGE

    Techniques are presented herein that shift the risk assessment focus during a software development process away from the traditional end-of-process review (when a new feature is delivered, or an application is deployed) to earlier in the process, when developers are actively at work. Such an approach allows a developer to assess the risk that a candidate software change is about to introduce before committing that change, giving the developer time (during the early portion of the process) to revisit the software and eliminate the identified risk. Aspects of the presented techniques leverage elements of a continuous integration (CI) and continuous deployment (CD) facility, the results available from existing unit and end-to-end tests, and the collection and analysis of OpenTelemetry (OTEL)-based metrics, events, logs, and traces (MELT) data to deliver security insights.
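The disclosure gives no concrete scoring formula, but a pre-commit assessment of the kind described could weigh change size against test coverage and incident history. A toy sketch in which every weight and input is illustrative:

```python
def change_risk(lines_changed, coverage, touches_incident_prone_files):
    """Toy pre-commit risk score: larger changes with weaker test
    coverage score higher, doubled when the change touches files with
    a history of incidents. The weighting scheme is illustrative only."""
    base = lines_changed * (1.0 - coverage)
    return base * (2.0 if touches_incident_prone_files else 1.0)

risky = change_risk(200, coverage=0.3, touches_incident_prone_files=True)
safe = change_risk(200, coverage=0.9, touches_incident_prone_files=False)
print(risky > safe)  # True
```

In a real CI/CD hook, coverage would come from the test results and the incident-proneness signal from analysis of MELT data, as the disclosure suggests.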

    Segregationally stabilised plasmids improve production of commodity chemicals in glucose-limited continuous fermentation

    Background: The production of chemicals via bio-based routes is held back by limited easy-to-use stabilisation systems. A wide range of plasmid stabilisation mechanisms can be found in the literature; however, how these mechanisms affect genetic stability, and how host strains still revert to non-productive variants, is poorly understood at the single-cell level. This phenomenon can generate difficulties in production-scale bioreactors, as different populations of productive and non-productive cells can arise. To understand how to prevent non-productive strains from arising, it is vital to understand strain behaviour at a single-cell level. The persistence of genes located on plasmid vectors is dependent on numerous factors but can be broadly separated into structural stability and segregational stability. While structural stability refers to the capability of a cell to resist genetic mutations that bring about a loss of gene function in a production pathway, segregational stability refers to the capability of a cell to correctly distribute plasmids into daughter cells to maintain copy number. A lack of segregational stability can rapidly generate plasmid-free variants during replication, which compromises productivity. Results: Citramalate synthase expression was linked in an operon to the expression of a fluorescent reporter to enable rapid screening of the retention of a model chemical synthesis pathway in a continuous fermentation of E. coli. Cells without additional plasmid stabilisation started to lose productivity immediately after entering the continuous phase. Inclusion of a multimer resolution site, cer, enabled a steady-state production period of 58 h before a drop in productivity was detected. Single-cell fluorescence measurements showed that plasmid-free variants arose rapidly without cer stabilisation and that this was likely due to unequal distribution of plasmid into daughter cells during cell division. The addition of cer increased total chemical yield by more than 50%. Conclusions: This study shows that the potential remains high for plasmids to be used as pathway vectors in industrial bio-based chemicals production, provided they are correctly stabilised. We demonstrate the need for accessible bacterial ‘toolkits’ to enable rapid production of known, stabilised bacterial production strains for continuous fermentation at scale in the chemicals industry.
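The segregational-instability problem that the cer site addresses can be illustrated with a textbook model: under purely random segregation of n plasmid copies at division, the probability of producing a plasmid-free daughter is roughly 2^(1-n) per division. The sketch below assumes no growth-rate difference between plasmid-bearing and plasmid-free cells and ignores the multimer formation that cer actually resolves:

```python
def plasmid_free_fraction(generations, copy_number):
    """Fraction of plasmid-free cells after random segregation of
    `copy_number` plasmid copies, with per-division loss probability
    2**(1 - n); plasmid-free cells never regain the plasmid."""
    p_loss = 2.0 ** (1 - copy_number)
    frac = 0.0
    for _ in range(generations):
        frac = frac + (1 - frac) * p_loss
    return frac

# Low copy number loses the pathway far faster than high copy number.
print(round(plasmid_free_fraction(60, 5), 3))   # most of the culture
print(round(plasmid_free_fraction(60, 20), 6))  # almost none
```

In a real chemostat the effect is worse, because plasmid-free cells usually grow faster and take over; this is why the observed 58 h of stable production with cer is a meaningful result.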

    Oceanographic Structure and Light Levels Drive Patterns of Sound Scattering Layers in a Low-Latitude Oceanic System

    Several factors have been reported to structure the spatial and temporal patterns of sound scattering layers, including temperature, oxygen, salinity, light, and physical oceanographic conditions. In this study, we examined the spatiotemporal variability of acoustically detected sound scattering layers in the northern Gulf of Mexico to investigate the drivers of this variability, including mesoscale oceanographic features [e.g., Loop Current-origin water (LCOW), frontal boundaries, and Gulf Common Water]. Results indicate correlations of the vertical position and acoustic backscatter intensity of sound scattering layers with oceanographic conditions and light intensity. LCOW regions displayed consistently lower acoustic backscatter intensity in the upper 200 m, by factors of two and four relative to frontal boundaries and Gulf Common Water, respectively. Sound scattering layers had greater backscatter intensity at night than during the day (25x for frontal boundaries, 17x for LCOW, and 12x for Gulf Common Water). The importance of biotic (primary productivity) and abiotic (sea surface temperature, salinity) factors varied across oceanographic conditions and depth intervals, suggesting that the patterns in distribution and behavior of mesopelagic assemblages in low-latitude, oligotrophic ecosystems can be highly dynamic.
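The intensity ratios reported above (factors of 2, 4, 25, 17, and 12) are linear ratios; acousticians often express them as differences in volume backscattering strength Sv, which is logarithmic, via 10·log10 of the ratio:

```python
import math

def to_db(factor):
    """Convert a linear backscatter-intensity ratio to a dB difference,
    since volume backscattering strength Sv is reported in dB."""
    return 10 * math.log10(factor)

# Factors from the abstract, expressed as Sv differences (dB).
for factor in (2, 4, 25, 17, 12):
    print(round(to_db(factor), 1))
# 3.0, 6.0, 14.0, 12.3, 10.8
```

This also explains why averaging backscatter must be done on the linear quantities before converting back to dB: averaging dB values directly underestimates the mean.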

    Using an agent-based model to analyze the dynamic communication network of the immune response

    Background: The immune system behaves like a complex, dynamic network with interacting elements including leukocytes, cytokines, and chemokines. While the immune system is broadly distributed, leukocytes must communicate effectively to respond to a pathological challenge. The Basic Immune Simulator 2010 contains agents representing leukocytes and tissue cells, signals representing cytokines, chemokines, and pathogens, and virtual spaces representing organ tissue, lymphoid tissue, and blood. Agents interact dynamically in the compartments in response to infection of the virtual tissue. Agent behavior is imposed by logical rules derived from the scientific literature. The model captured the agent-to-agent contact history, and from this the network topology and the interactions resulting in successful versus failed viral clearance were identified. This model served to integrate existing knowledge and allowed us to examine the immune response from a novel perspective directed at exploiting complex dynamics, ultimately for the design of therapeutic interventions. Results: Analyzing the evolution of agent-agent interactions at incremental time points from identical initial conditions revealed novel features of immune communication associated with successful and failed outcomes. There were fewer contacts between agents for simulations ending in viral elimination (win) versus persistent infection (loss), due to the removal of infected agents. However, early cellular interactions preceded successful clearance of infection. Specifically, more Dendritic Agent interactions with TCell and BCell Agents, and more BCell Agent interactions with TCell Agents, early in the simulation were associated with the immune win outcome. The Dendritic Agents greatly influenced the outcome, confirming them as hub agents of the immune network. In addition, unexpectedly high frequencies of Dendritic Agent self-interactions occurred in the lymphoid compartment late in the loss outcomes. Conclusions: An agent-based model capturing several key aspects of complex system dynamics was used to study the emergent properties of the immune response to viral infection. Specific patterns of interactions between leukocyte agents occurring early in the response significantly improved outcome. More interactions at later stages correlated with persistent inflammation and infection. These simulation experiments highlight the importance of commonly overlooked aspects of the immune response and provide insight into these processes at a resolution level exceeding the capabilities of current laboratory technologies.
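The hub role of the Dendritic Agents can be read directly off a contact history as the agent with the highest degree. A minimal sketch over a hypothetical contact log (the actual BIS 2010 log format is not given in the abstract):

```python
from collections import Counter

def hub_agent(contact_log):
    """Identify the network hub as the agent with the highest degree,
    i.e., the most logged pairwise contacts."""
    degree = Counter()
    for a, b in contact_log:
        degree[a] += 1
        degree[b] += 1
    return degree.most_common(1)[0][0]

# Hypothetical contact history: the dendritic agent touches everyone.
log = [("DC1", "T1"), ("DC1", "T2"), ("DC1", "B1"), ("T1", "B1")]
print(hub_agent(log))  # DC1
```

Richer network metrics (betweenness, clustering) can be computed from the same edge list, but even raw degree is enough to surface hub agents like the Dendritic Agents here.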