25 research outputs found

    Measuring the Programming Complexity of C and C++ using Halstead Metrics

    Get PDF
    Algorithms are the core of computer science and an important prerequisite for computer science professionals. However, their hard and abstract nature makes them difficult to understand. Pedagogical issues in learning algorithms are generally addressed by elaborating the algorithms through their implementation in some programming language. As there are many programming languages, selecting an appropriate language for the effective implementation of algorithms remains a challenging issue. In this article, common data-structure algorithms are measured by analyzing their implementations in C and C++ using Halstead complexity metrics. The statistical tests indicate that, compared to C++, implementing the algorithms in C involves less effort, less time, and fewer bugs, whereas C++ involves less difficulty during implementation. The work presented in this article provides a novel basis for relating and evaluating other programming languages
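
    As a brief illustration of the metrics used above, the sketch below computes the standard Halstead measures (volume, difficulty, effort, estimated time, and delivered bugs) from operator/operand counts; the counts in the example are hypothetical and are not taken from the study.

        import math

        def halstead_metrics(n1, n2, N1, N2):
            """Compute core Halstead measures from distinct/total operator and operand counts."""
            vocabulary = n1 + n2                        # n = n1 + n2
            length = N1 + N2                            # N = N1 + N2
            volume = length * math.log2(vocabulary)     # V = N * log2(n)
            difficulty = (n1 / 2) * (N2 / n2)           # D = (n1/2) * (N2/n2)
            effort = difficulty * volume                # E = D * V
            time_seconds = effort / 18                  # T = E / 18 (Stroud number)
            bugs = volume / 3000                        # B = V / 3000 (one common estimate)
            return {"volume": volume, "difficulty": difficulty,
                    "effort": effort, "time_s": time_seconds, "bugs": bugs}

        # Illustrative counts for a small sorting routine (hypothetical values)
        print(halstead_metrics(n1=14, n2=10, N1=55, N2=40))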

    An Approach to Bridge Inspection Using 3D Laser Scanners and Digital Photographs

    Get PDF
    Bridges are an integral component of infrastructure systems, which play a critical role in the development of the economy, society, and national security. However, bridges have not received adequate care and are deteriorating rapidly. More than 9% of bridges in the United States are structurally deficient and need immediate repairs. A major contributing factor to this deficiency is a lack of adequate and accurate inspection processes. Current methods of bridge inspection and assessment involve a repetitive paper-based process that requires manual data entry and extraction. The inspection team analyzes the critical portions of a bridge, identifies problem severity, documents the damage, and concentrates on the cause of the problem. This paper-based process is complex, time-consuming, and error-prone. To eliminate human errors associated with surveying and the data collection process, practitioners have recently used automated techniques and advanced equipment to inspect bridge conditions. This research introduces a combination of 3D laser scanning and photographic techniques to determine important attributes of bridge inspection. A terrestrial laser scanner is used to collect point cloud data to create a 3D model of the bridge structure. Three-dimensional geometric information of the bridge structure is extracted from the point cloud 3D model with an accuracy level in accordance with National Bridge Inventory (NBI) specifications. The occurrence of cracks in bridge components is a clear sign of potential damage and must be assessed critically. In order to determine the severity of damage, it is important to compute the width of cracks and compare it with the allowable limit specified by the NBI or the state department of transportation (DOT). In addition to extracting geometric surveying data, this research proposes a framework to detect cracks in the bridge structure. The framework is verified and validated using a case project. The results of this study contribute to the construction engineering and management body of knowledge by demonstrating the extraction of geometric data for bridge inspection in accordance with NBI accuracy specifications using a laser scanner. This study also demonstrates an automated technique to assess structural health by detecting cracks in a concrete bridge using digital photographs and computing the width of those cracks
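
    The paper's crack-detection framework is not reproduced here; the snippet below is only a minimal sketch of one common way to estimate crack width from a digital photograph, assuming dark crack pixels on a lighter concrete surface, a roughly vertical crack, and a calibrated millimetre-per-pixel scale. The threshold and scale values are illustrative.

        import numpy as np

        def crack_width_mm(gray_image, mm_per_pixel, threshold=60):
            """Estimate the maximum crack width from a grayscale photo of a concrete surface.

            Assumes dark crack pixels on a lighter background, a crack that crosses each
            image row roughly once, and a calibrated mm-per-pixel scale (e.g. obtained
            from a reference target placed in the photograph).
            """
            crack_mask = gray_image < threshold      # dark pixels -> candidate crack pixels
            widths_px = crack_mask.sum(axis=1)       # crack pixels per image row
            return widths_px.max() * mm_per_pixel    # widest row, converted to millimetres

        # Hypothetical usage: a synthetic image with a 3-pixel-wide vertical crack
        img = np.full((100, 100), 200, dtype=np.uint8)
        img[:, 48:51] = 20
        print(crack_width_mm(img, mm_per_pixel=0.2))   # about 0.6 mm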

    Correlation between Ultrasonographic Grading of Fatty Liver and Lipid Profile

    Get PDF
    Fatty liver disease can easily cause detrimental changes when it slowly progresses toward the aggressive stages of liver fibrosis leading to cirrhosis, and it can also cause collateral damage in the form of cardiovascular and atherosclerotic disease. The main purpose of this study is to analyze the correlation between the ultrasound grading of fatty liver disease and the lipid profile of the affected patient. A comparative analytical study was conducted on 138 patients with fatty liver disease, selected by convenience sampling. The study was conducted at the Radiology Department of General Hospital Lahore and Sheikh Zayed Hospital Rahim Yar Khan from July 2019 to October 2019. Data from all 138 patients were analyzed: 65 (47.1%) were male and 73 (52.9%) were female. The study indicated that 84 patients (60.9%) had grade 1 fatty liver, 52 (37.7%) had grade 2, and 2 (1.4%) had grade 3. On lipid profile testing, 67 of the 138 patients (48.6%) showed abnormal values, while 71 (51.4%) were considered normal. Female patients were found to be more affected by fatty liver than males, and the majority of patients fell into the grade 1 category. Since most patients had grade 1 fatty liver disease and their lipid profile tests also indicated largely normal values, the risk of developing cardiovascular disease among these patients was minimal
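
    As an illustration of the correlation analysis described above, the snippet below computes a Spearman rank correlation between an ordinal fatty-liver grade and a continuous lipid value using scipy; the ten records are hypothetical and are not the study's data.

        from scipy.stats import spearmanr

        # Hypothetical records: ultrasound fatty-liver grade (1-3) and serum triglycerides (mg/dL)
        grades        = [1, 1, 2, 1, 3, 2, 1, 2, 1, 2]
        triglycerides = [130, 145, 180, 120, 240, 200, 150, 175, 140, 190]

        rho, p_value = spearmanr(grades, triglycerides)
        print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")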

    Initial Steps Towards a Clinical FLASH Radiotherapy System: Pediatric Whole Brain Irradiation with 40 MeV Electrons at FLASH Dose Rates

    Get PDF
    In this work, we investigated the delivery of a clinically acceptable pediatric whole brain radiotherapy plan at FLASH dose rates using two lateral opposing 40-MeV electron beams produced by a practically realizable linear accelerator system. The EGSnrc Monte Carlo software modules, BEAMnrc and DOSXYZnrc, were used to generate whole brain radiotherapy plans for a pediatric patient using two lateral opposing 40-MeV electron beams. Electron beam phase space files were simulated using a model of a diverging beam with a diameter of 10 cm at 50 cm SAD (defined at brain midline). The electron beams were collimated using a 10-cm-thick block composed of 5 cm of aluminum oxide and 5 cm of tungsten. For comparison, a 6-MV photon plan was calculated with the Varian AAA algorithm. Electron beam parameters were based on a novel linear accelerator designed for the PHASER system and powered by a commercial 6-MW klystron. Calculations of the linear accelerator's performance indicated an average beam current of at least 6.25 µA, providing a dose rate of 115 Gy/s at isocenter, high enough for cognition-sparing FLASH effects. The electron plan was less homogenous with a homogeneity index of 0.133 compared to the photon plan's index of 0.087. Overall, the dosimetric characteristics of the 40-MeV electron plan were suitable for treatment. In conclusion, Monte Carlo simulations performed in this work indicate that two lateral opposing 40-MeV electron beams can be used for pediatric whole brain irradiation at FLASH dose rates of >115 Gy/s and serve as motivation for a practical clinical FLASH radiotherapy system, which can be implemented in the near future
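
    The abstract reports homogeneity indices of 0.133 (electron plan) and 0.087 (photon plan) but does not state the exact definition used. The sketch below uses the common ICRU-83 form HI = (D2% - D98%) / D50% on an illustrative dose sample, purely to show how such an index is computed; the dose values are synthetic.

        import numpy as np

        def homogeneity_index(target_doses):
            """ICRU-83 style homogeneity index: (D2% - D98%) / D50%.

            `target_doses` is a 1-D array of dose values (Gy) sampled inside the
            planning target volume; a lower HI means a more homogeneous dose.
            """
            d2  = np.percentile(target_doses, 98)   # dose to the hottest 2% of the volume
            d98 = np.percentile(target_doses, 2)    # dose covering 98% of the volume
            d50 = np.percentile(target_doses, 50)   # median dose
            return (d2 - d98) / d50

        # Illustrative dose sample (Gy) for a 10 Gy prescription
        doses = np.random.default_rng(0).normal(loc=10.0, scale=0.35, size=10_000)
        print(f"HI = {homogeneity_index(doses):.3f}")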

    Les effets de législations environnementales sur la chaîne d'approvisionnement (The Effects of Environmental Legislation on the Supply Chain)

    No full text
    Climate change and the global rise in temperature have made environmental legislation a focal point of discussion. This dissertation is devoted to the study of environmental legislation and its effect on supply chain practices. More precisely, our focus is on product-recovery-based legislation along with compliance-based regulations. We explore the reuse potential and the environmental and economic aspects of different product-recovery legislation schemes by modeling a Stackelberg game between a social-welfare-maximizing policy maker and a profit-maximizing monopolistic firm, and find that a combination of existing recovery policies, i.e., a recovery target combined with an incentive structure such as taxation/subsidy, may lead to better outcomes not only from an environmental perspective but also from an economic one. In Chapter 2, we extend the discussion to the comparative performance of the recovery-legislation-based schemes in the presence of innovation and product design issues and show how unintended environmental outcomes may appear if the policy framework is not adequately designed. In Chapter 3, we capture the effect of recovery legislation and compliance-based legislation on product selection when a firm serves a number of markets. We incorporate the effects of uncertainty associated with market demands and recovery cost parameters and present a robust optimization based method for product selection and allocation decisions.

    (French résumé, translated:) This thesis is devoted to the study of environmental legislation and its effects on the supply chain. More precisely, we are interested in legislation based on product recovery as well as on compliance standards (RoHS). We study the reuse potential together with the environmental and economic aspects of different legislation schemes. The solution takes the form of a combination of recovery policies that leads to better results both ecologically and economically. In the second part of the thesis, we study the comparative performance of recovery-legislation-based schemes in the presence of innovation and product design issues; a product reuse policy can worsen environmental outcomes if the regulatory framework is not well defined. In the last part, a study is conducted on product selection in a supply chain under recovery-based and compliance-based legislation. We integrate the effects of uncertainty associated with market demand and recovery cost parameters, and a robust optimization method for product selection and allocation is presented
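
    The thesis's robust optimization model is not spelled out in the abstract; the snippet below is only a toy illustration of the max-min idea behind robust product selection: enumerate candidate product-to-market assignments and keep the one whose worst-case profit over a small set of demand/recovery-cost scenarios is largest. All products, markets, scenarios, and profit figures are hypothetical.

        from itertools import product as cartesian

        # Hypothetical data: profit[p][m][s] = profit of offering product p in market m
        # under uncertainty scenario s (three demand/recovery-cost scenarios)
        profit = {
            "A": {"EU": [12, 9, 7], "US": [10, 11, 6]},
            "B": {"EU": [11, 10, 8], "US": [9, 8, 9]},
        }
        markets = ["EU", "US"]
        scenarios = range(3)

        best_choice, best_worst_case = None, float("-inf")
        # Enumerate which product to assign to each market (tiny instance, brute force)
        for assignment in cartesian(profit.keys(), repeat=len(markets)):
            # Worst-case total profit of this assignment across all scenarios
            worst_case = min(
                sum(profit[p][m][s] for p, m in zip(assignment, markets))
                for s in scenarios
            )
            if worst_case > best_worst_case:
                best_choice, best_worst_case = assignment, worst_case

        print(dict(zip(markets, best_choice)), best_worst_case)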

    Pedagogical Significance of Natural Language Programming in Introductory Programming

    No full text
    Learning programming is hard for novice students. The complicated syntax and semantics of programming languages and the lack of previous knowledge are the contributing factors behind this difficulty. A natural programming language allows one to program in a natural language and thereby eases programming. In this paper, it is ascertained whether a natural programming language is fruitful in learning elementary programming concepts and supportive in preparing students for introductory programming courses. The discussion included in this paper can be used to design supportive programming languages and to formulate effective courses and learning material that improve students' performance in introductory programming environments

    Learners Programming Language: A Helping System for Introductory Programming Courses

    No full text
    Programming is the core of computer science, and because of this importance special care is taken in designing the curriculum of programming courses. Substantial work has been conducted on the definition of programming courses, yet introductory programming courses still face high attrition, low retention, and lack of motivation. This paper introduces a tiny pre-programming language called LPL (Learners Programming Language) as a ZPL (Zeroth Programming Language) to familiarize novice students with elementary concepts of introductory programming before the first imperative programming course. The overall objective and design philosophy of LPL are based on the hypothesis that a soft introduction to a simple, paradigm-specific textual programming language can increase the motivation level of novice students, reduce the inherent complexity and hardness of the first programming course, eventually improve the retention rate, and may be fruitful in reducing the dropout/failure level. LPL also generates equivalent high-level programs from the user's source program, which is very helpful in understanding the syntax of introductory programming languages. To overcome the inherent complexity of the unusual and rigid syntax of introductory programming languages, LPL provides elementary programming concepts in the form of algorithmic and plain natural-language-based computational statements. The initial results obtained after the introduction of LPL are very encouraging in motivating novice students and improving the retention rate
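
    LPL's actual statement forms are not given in the abstract, so the sketch below is only a toy illustration of the general idea it describes: translating plain natural-language computational statements into equivalent statements in a conventional high-level language (Python here). The three statement patterns are invented for illustration and are not the real LPL syntax.

        import re

        # Toy translation rules from plain-language statements to Python
        # (illustrative only; these are not actual LPL statement forms)
        RULES = [
            (re.compile(r"^set (\w+) to (.+)$"), r"\1 = \2"),
            (re.compile(r"^add (.+) to (\w+)$"), r"\2 = \2 + \1"),
            (re.compile(r"^print (\w+)$"),       r"print(\1)"),
        ]

        def translate(lines):
            """Translate a list of plain-language statements into Python source lines."""
            out = []
            for line in lines:
                statement = line.strip().lower()
                for pattern, template in RULES:
                    if pattern.match(statement):
                        out.append(pattern.sub(template, statement))
                        break
                else:
                    raise ValueError(f"unrecognised statement: {line!r}")
            return out

        program = ["set total to 0", "add 5 to total", "print total"]
        print("\n".join(translate(program)))   # total = 0 / total = total + 5 / print(total)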

    The Impact of Language Syntax on the Complexity of Programs: A Case Study of Java and Python

    No full text
    Programming is the cornerstone of computer science, yet it is difficult to learn and practice. The syntax of a programming language is particularly challenging to comprehend, which makes learning arduous and affects a program's testability. There is currently no literature that definitively gives quantitative evidence about the effect of a programming language's syntactic complexity. The main purpose of this article is to examine the effect of programming language syntax on the complexity of source programs. During the study, 298 algorithms were selected and their implementations in Java and Python were analyzed with the cyclomatic complexity metric. The results of the study show that Python's syntax is less complex than Java's, and thus coding in Python is more comprehensible and less difficult than coding in Java. The Mann-Whitney U test performed on the results showed a statistically significant difference between Java and Python, indicating that the syntax of a programming language has a major impact on program complexity. The novelty of this article lies in the formulation of new knowledge and study patterns that can be used primarily to compare and analyze other programming languages
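
    The abstract does not name the exact tool used to measure cyclomatic complexity; the sketch below shows one common approximation for Python source (one plus the number of decision points), using only the standard-library ast module.

        import ast

        def cyclomatic_complexity(source: str) -> int:
            """Approximate McCabe cyclomatic complexity of a Python snippet: 1 + decision points."""
            tree = ast.parse(source)
            complexity = 1
            for node in ast.walk(tree):
                if isinstance(node, (ast.If, ast.For, ast.While, ast.IfExp, ast.ExceptHandler)):
                    complexity += 1
                elif isinstance(node, ast.BoolOp):   # `and` / `or` add one branch per extra operand
                    complexity += len(node.values) - 1
            return complexity

        snippet = """
        def classify(n):
            if n < 0 and n % 2 == 0:
                return "negative even"
            for i in range(n):
                if i % 3 == 0:
                    n += 1
            return n
        """
        print(cyclomatic_complexity(snippet))   # 1 + if + and + for + if = 5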

    Recovery Targets and Taxation/Subsidy Policies to Promote Product Reuse

    No full text
    This paper seeks to identify the optimal policies for promoting product recovery and remanufacturing. Using a stylized equilibrium model, we analyze the problem as a Stackelberg game between a regulator and a monopolistic firm. We compare three types of policies that legislated regulation could effect: (i) A recovery target policy that requires firms to recover no less than a specified fraction of their production for proper disposal or possible remanufacturing; (ii) a taxation policy that both taxes manufacturing and subsidizes remanufacturing; and (iii) a newly introduced mixed approach that incorporates a recovery target as well as taxes and subsidies. We study a firm's behavior under the three policy types, including pricing decisions for new and remanufactured products as well as the strategic decision of whether to create a secondary channel for remanufactured products. We find that legislative intervention makes it more likely that firms will maintain a single-market strategy. We further demonstrate the mixed approach's superiority as measured by a comprehensive set of economic and environmental criteria, and show that this finding is robust under two different objective functions for the policy maker, one that does and one that does not entail a budget neutrality constraint
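
    The paper's equilibrium model is not reproduced here; the following is a toy numerical sketch of the leader-follower structure it describes, with invented linear demand, cost, and damage parameters. The firm best-responds to a mixed policy (a per-unit tax, a per-unit recovery subsidy, and a minimum recovery fraction), and the regulator grid-searches for the policy that maximizes a simple welfare proxy.

        import numpy as np

        # Invented parameters of a toy linear model (not taken from the paper)
        a, b   = 100.0, 1.0    # inverse demand p = a - b*q
        c, k   = 30.0, 12.0    # unit production cost, unit recovery cost
        damage = 25.0          # environmental damage per non-recovered unit

        def firm_best_response(t, s, r_min):
            """Follower: pick recovery rate r and output q maximizing profit under the policy."""
            r = 1.0 if s >= k else r_min                 # recover more only if subsidised enough
            margin = a - c - t + (s - k) * r
            q = max(margin, 0.0) / (2 * b)               # first-order condition of quadratic profit
            profit = (a - b * q - c - t) * q + (s - k) * r * q
            return q, r, profit

        def welfare(t, s, r_min):
            """Leader's objective: consumer surplus + profit + net budget - environmental damage."""
            q, r, profit = firm_best_response(t, s, r_min)
            consumer_surplus = 0.5 * b * q ** 2
            budget = t * q - s * r * q
            return consumer_surplus + profit + budget - damage * (1 - r) * q

        # Leader: grid search over the mixed policy (tax, subsidy, recovery target)
        grid = [(t, s, r_min)
                for t in np.linspace(0, 30, 31)
                for s in np.linspace(0, 20, 21)
                for r_min in np.linspace(0, 1, 11)]
        best = max(grid, key=lambda policy: welfare(*policy))
        print("best (tax, subsidy, target):", best, "welfare:", round(welfare(*best), 2))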

    Analysis of Code Vulnerabilities in Repositories of GitHub and Rosettacode: A Comparative Study

    No full text
    Open-source code hosted online at programming portals is present in 99% of commercial software and is commonly used by developers for rapid prototyping and cost-effective development. However, research reports the presence of vulnerabilities that can result in catastrophic security compromises, with individuals, organizations, and even national secrecy falling victim to this circumstance. One frustrating aspect of vulnerabilities is that they manifest themselves in hidden ways of which software developers are unaware. Detecting vulnerabilities, which jeopardize core security concepts such as integrity, authenticity, and availability, is therefore one of the most critical tasks in ensuring software security. This study aims to explore security-related vulnerabilities in programming languages such as C, C++, and Java hosted at popular code repositories and to present the disparities between them. To attain this purpose, 708 programs were examined against severity-based guidelines. A total of 1371 vulnerable code instances were identified: 327 in C, 51 in C++, and 993 in Java. Statistical analysis also indicated a substantial difference between them, as the Kruskal-Wallis H-test p-value (.000) is below the 0.05 significance level. The Mann-Whitney test mean ranks for GitHub (mean rank = 676.05) and Rosettacode (mean rank = 608.64) also differ. The novelty of this article is to identify security vulnerabilities and grasp the nature and severity of vulnerabilities in popular code repositories. This study ultimately provides a guideline for choosing a secure programming language and a testing approach that targets the vulnerabilities most liable to breach security
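
    As a hedged illustration of the statistical tests named above, the snippet below runs a Kruskal-Wallis H-test across three groups of per-program vulnerability counts and a pairwise Mann-Whitney U test with scipy; all counts are made up and are not the study's data.

        from scipy.stats import kruskal, mannwhitneyu

        # Hypothetical per-program vulnerability counts for each language (illustrative only)
        c_counts    = [3, 1, 4, 2, 5, 0, 3, 2]
        cpp_counts  = [0, 1, 0, 2, 1, 0, 1, 0]
        java_counts = [6, 4, 7, 5, 8, 6, 5, 7]

        h_stat, p_value = kruskal(c_counts, cpp_counts, java_counts)
        print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.4f}")

        # Pairwise follow-up, e.g. C vs Java, with the Mann-Whitney U test
        u_stat, p_pair = mannwhitneyu(c_counts, java_counts, alternative="two-sided")
        print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_pair:.4f}")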