40 research outputs found

    A hybrid approach for selecting and optimizing graph traversal strategy for analyzing big code

    Our newfound ability to analyze source code in massive software repositories such as GitHub has led to an uptick in data-driven solutions to software engineering problems. Source code analysis is often realized as traversals over source code artifacts represented as graphs. Since the number of artifacts analyzed is huge, in the millions, the efficiency of the source code analysis technique is very important. The performance of source code analysis techniques heavily depends on the order in which nodes are visited during the traversals: the traversal strategy. For instance, selecting the best traversal strategy and optimizing it for a software engineering task that infers the temporal specification between pairs of API method calls could reduce the running time on a large codebase by 64% to 96%. While there exist several choices of traversal strategy, such as depth-first, post-order, and reverse post-order, there is no technique for choosing the most time-efficient strategy for a traversal. In this paper, we show that a single traversal strategy does not fit all source code analysis scenarios. Somewhat more surprisingly, we demonstrate that, given the source code expressing the analysis task (in a declarative form), one can compute static characteristics of the task which, together with the runtime characteristics of the input, can help predict the most time-efficient traversal strategy for that (analysis task, input) pair. We also demonstrate that these strategies can be realized in a manner that is effective in accelerating ultra-large-scale source code analysis. Our evaluation shows that our technique selected the most time-efficient traversal strategy 99.99%-100% of the time, and that using the selected traversal strategy and optimizing it reduced the running times of a representative collection of source code analyses by 1%-28% (13 to 72 minutes in absolute time) when compared against the best-performing traversal strategy. The case studies show that hybrid traversal reduces running times by 80-175 minutes for two software engineering tasks. The overhead imposed by collecting additional information for our approach is less than 0.2% of the total running time for a large dataset that contains 287K Control Flow Graphs (CFGs) and less than 0.01% for an ultra-large dataset that contains 162M CFGs.
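
    The selection idea described in this abstract can be pictured with a minimal sketch. This is a hypothetical rendering, not the paper's implementation: it assumes the static characteristics of the analysis are its data-flow sensitivity and direction, the runtime characteristic of the input CFG is whether it contains cycles, and all names and decision rules below are invented for illustration.

```python
# Hypothetical sketch: map static characteristics of an analysis (data-flow
# sensitivity, direction) and a runtime characteristic of the input CFG
# (cyclicity) to a traversal strategy. Illustrative only, not the paper's code.
from enum import Enum, auto

class Strategy(Enum):
    ANY_ORDER = auto()           # data-flow-insensitive: visit each node once, in any order
    REVERSE_POST_ORDER = auto()  # forward data-flow analysis over an acyclic CFG
    POST_ORDER = auto()          # backward data-flow analysis over an acyclic CFG
    WORKLIST = auto()            # cyclic CFG: iterate to a fixpoint with a worklist

def select_strategy(flow_sensitive: bool, forward: bool, cfg_has_cycles: bool) -> Strategy:
    """Pick a traversal strategy for one (analysis task, input CFG) pair."""
    if not flow_sensitive:
        return Strategy.ANY_ORDER
    if cfg_has_cycles:
        return Strategy.WORKLIST
    return Strategy.REVERSE_POST_ORDER if forward else Strategy.POST_ORDER

# Example: a forward analysis (e.g. reaching definitions) over an acyclic CFG.
print(select_strategy(flow_sensitive=True, forward=True, cfg_has_cycles=False))
```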

    A sequential study of circulating immune complexes, complement mediated IC solubilisation and immunoglobulins in borderline tuberculoid patients with and without reactions

    Sequential estimates of the levels of circulating immune complexes (CIC), complement catabolic fragment C3d, complement-mediated immune complex solubilization (CMS) and immunoglobulins were made in 24 newly diagnosed patients with borderline tuberculoid leprosy over a 20-month period after initiation of chemotherapy. Fourteen of these patients had not suffered from reversal reactions either at the time of presentation or during the follow-up period. The levels of CIC were elevated in this group from the third to the eleventh month after starting chemotherapy, and immunoglobulin G (IgG) levels were elevated for up to eight months. The concentrations of C3d and immunoglobulins A (IgA) and M (IgM) were normal in these patients. The other ten patients had a reversal reaction at the time of diagnosis, which subsided by the third month after starting treatment. They did not have reversal reactions later. The levels of CIC and IgG were elevated, and those of CMS were depressed, throughout the study period. The serum C3d level was initially elevated but came down to normal by the third month, while IgA and IgM levels were within normal limits. The relevance of these findings to the genesis of the reversal reaction is discussed in this communication.

    Hybrid Traversal: Efficient Source Code Analysis at Scale

    Source code analysis at a large scale is useful for solving many software engineering problems; however, it can be very expensive, which makes its use difficult. This work proposes hybrid traversal, a technique for performing source code analysis over control flow graphs more efficiently. Analysis over a control flow graph requires traversing the graph, and this can be done using several traversal strategies. Our observation is that no single traversal strategy is suitable for different analyses and different graphs. Our key insight is that, using the characteristics of the analysis and the properties of the graph, it is possible to select the most efficient traversal strategy for that (analysis, graph) pair. Our evaluation using a set of analyses with different characteristics and a large dataset of graphs with different properties shows up to a 30% reduction in analysis time. Further, the overhead of our technique for selecting the most efficient traversal strategy is very low: between 0.01% and 0.2%.
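
    As one concrete reading of the "properties of the graph" mentioned above, the sketch below shows how a single runtime property a selection like this could key on, whether a CFG contains a cycle, might be collected with an iterative depth-first search. This is an assumed illustration under an invented adjacency-list representation, not the tool's implementation.

```python
# Assumed sketch: detect whether a CFG, given as {node_id: [successor_ids]},
# contains a cycle, using iterative DFS with three-colour marking.
def has_cycle(cfg):
    """Return True iff the graph has a back edge (i.e. a cycle)."""
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {n: WHITE for n in cfg}
    for root in cfg:
        if colour[root] != WHITE:
            continue
        colour[root] = GREY
        stack = [(root, iter(cfg[root]))]
        while stack:
            node, successors = stack[-1]
            advanced = False
            for succ in successors:
                if colour.get(succ, WHITE) == GREY:   # back edge to a node on the DFS stack
                    return True
                if colour.get(succ, WHITE) == WHITE:
                    colour[succ] = GREY
                    stack.append((succ, iter(cfg.get(succ, []))))
                    advanced = True
                    break
            if not advanced:
                colour[node] = BLACK                  # node fully explored
                stack.pop()
    return False

# Example: node 2 has a back edge to node 1, so this CFG is cyclic.
print(has_cycle({0: [1], 1: [2], 2: [1, 3], 3: []}))  # True
```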

    Histological and immunological correlates of suspected leprosy lesions

    Thirty-two subjects with suspected leprosy lesions were investigated to assess various modalities of sensibility and sweat function, and these were correlated with immunological and histological parameters. It was found that pain and temperature sensation, mediated by small unmyelinated fibres, were impaired in the early lesions. Impairment of sweat function was seen only when one of the modalities of sensibility was also affected. Antibodies specific to a protein (35 kDa) antigen and phenolic glycolipid 1 of Mycobacterium leprae were positive in nine and 12 cases respectively, while 15 of the 31 biopsies revealed the presence of mycobacterial antigens in these lesions. The implications of these findings are discussed.

    Effect of spark plasma sintering and high-pressure torsion on the microstructural and mechanical properties of a Cu–SiC composite

    This investigation examines the problem of homogenization in metal matrix composites (MMCs) and methods of increasing their strength using severe plastic deformation (SPD). In this research, MMCs of pure copper and silicon carbide were synthesized by spark plasma sintering (SPS) and then further processed via high-pressure torsion (HPT). The microstructures of the sintered and deformed materials were investigated using Scanning Electron Microscopy (SEM) and Scanning Transmission Electron Microscopy (STEM). The mechanical properties were evaluated by microhardness and tensile testing. The thermal conductivity of the composites was measured using a laser pulse technique. Microstructural analysis revealed that HPT processing leads to improved densification of the SPS-produced composites, with significant grain refinement in the copper matrix and with fragmentation of the SiC particles and their homogeneous distribution in the copper matrix. The HPT processing of the Cu and Cu-SiC samples enhanced their mechanical properties at the expense of limiting their plasticity. Processing by HPT also had a major influence on the thermal conductivity of the materials. It is demonstrated that the deformed samples exhibit higher thermal conductivity than the initial coarse-grained samples.

    Effect of sitagliptin on cardiovascular outcomes in type 2 diabetes

    BACKGROUND: Data are lacking on the long-term effect on cardiovascular events of adding sitagliptin, a dipeptidyl peptidase 4 inhibitor, to usual care in patients with type 2 diabetes and cardiovascular disease. METHODS: In this randomized, double-blind study, we assigned 14,671 patients to add either sitagliptin or placebo to their existing therapy. Open-label use of antihyperglycemic therapy was encouraged as required, aimed at reaching individually appropriate glycemic targets in all patients. To determine whether sitagliptin was noninferior to placebo, we used a relative risk of 1.3 as the marginal upper boundary. The primary cardiovascular outcome was a composite of cardiovascular death, nonfatal myocardial infarction, nonfatal stroke, or hospitalization for unstable angina. RESULTS: During a median follow-up of 3.0 years, there was a small difference in glycated hemoglobin levels (least-squares mean difference for sitagliptin vs. placebo, -0.29 percentage points; 95% confidence interval [CI], -0.32 to -0.27). Overall, the primary outcome occurred in 839 patients in the sitagliptin group (11.4%; 4.06 per 100 person-years) and 851 patients in the placebo group (11.6%; 4.17 per 100 person-years). Sitagliptin was noninferior to placebo for the primary composite cardiovascular outcome (hazard ratio, 0.98; 95% CI, 0.88 to 1.09; P<0.001). Rates of hospitalization for heart failure did not differ between the two groups (hazard ratio, 1.00; 95% CI, 0.83 to 1.20; P = 0.98). There were no significant between-group differences in rates of acute pancreatitis (P = 0.07) or pancreatic cancer (P = 0.32). CONCLUSIONS: Among patients with type 2 diabetes and established cardiovascular disease, adding sitagliptin to usual care did not appear to increase the risk of major adverse cardiovascular events, hospitalization for heart failure, or other adverse events.
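
    The noninferiority conclusion reported above follows from a simple comparison, sketched below with the figures from the abstract; this is an illustrative check of the decision rule only, not the trial's statistical analysis code.

```python
# Illustrative check of the noninferiority logic described in the abstract.
NONINFERIORITY_MARGIN = 1.3   # prespecified upper boundary on relative risk

hazard_ratio = 0.98           # primary composite cardiovascular outcome
ci_upper_95 = 1.09            # upper bound of the 95% confidence interval

# Sitagliptin is declared noninferior to placebo because the entire 95% CI
# for the hazard ratio lies below the prespecified margin of 1.3.
print("Noninferior:", ci_upper_95 < NONINFERIORITY_MARGIN)   # Noninferior: True
```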

    Introductory Econometrics With Applications

    xvii, 633 pages; 23 cm