
    Measuring and Quantifying Web Application Design

    The World Wide Web makes it easier for organizations and people to share information across great distances at minimal cost. In addition, governments and businesses are beginning to offer services on the World Wide Web to their respective populations and customers. Unlike traditional desktop-based applications, which are most likely written in object-oriented (OO) languages such as C++ or Java, most web applications are built on more lightweight scripting languages. These languages typically do not support all the OO features that more traditional languages do. In addition, some web applications depend on external web services or applications. These differences make it difficult to use traditional measuring techniques, such as quality measurements for object-oriented design, to quantify the complexity or quality of a web application. This paper proposes a set of measurements derived from traditional object-oriented metrics such as QMOOD and applies them to two content management systems, Drupal and WordPress. These measurements attempt to quantify the size and complexity of the two content management systems and to compare them.
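
    As a rough sketch of the kind of measurement such a study could start from, the snippet below counts class and function declarations across a PHP codebase, loosely in the spirit of a QMOOD-style design-size metric. The regular expressions, the restriction to .php files, and the directory names in the usage comment are illustrative assumptions rather than the paper's actual procedure.

        # Hypothetical sketch: a rough "design size" count for a PHP codebase such as
        # Drupal or WordPress, loosely in the spirit of QMOOD's Design Size in Classes.
        # The regexes are simplifying assumptions: FUNC_RE matches any function
        # declaration, including class methods, and namespaces/traits are ignored.
        import os
        import re

        CLASS_RE = re.compile(r'^\s*(?:abstract\s+|final\s+)?class\s+\w+', re.MULTILINE)
        FUNC_RE = re.compile(r'^\s*function\s+\w+\s*\(', re.MULTILINE)

        def design_size(root):
            classes = functions = 0
            for dirpath, _, filenames in os.walk(root):
                for name in filenames:
                    if not name.endswith('.php'):
                        continue
                    path = os.path.join(dirpath, name)
                    with open(path, encoding='utf-8', errors='ignore') as f:
                        source = f.read()
                    classes += len(CLASS_RE.findall(source))
                    functions += len(FUNC_RE.findall(source))
            return {'classes': classes, 'functions': functions}

        # Example: compare two CMS code bases side by side (paths are placeholders).
        # print(design_size('drupal/'), design_size('wordpress/'))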

    Bandwidth is Political: Reachability in the Public Internet


    The placenames of Midlothian


    The British Air Campaign during the Battle of the Somme, April-November, 1916: A Pyrrhic Victory

    The Battle of the Somme was Britain's first major offensive of the First World War. Just about every facet of the campaign has been analyzed and reexamined. However, one area of the battle that has been little explored is the second battle that took place simultaneously with the one on the ground. This second battle occurred in the skies above the Somme, where for the first time in the history of warfare a deliberate air campaign was planned and executed to support ground operations. The British Royal Flying Corps (RFC) was tasked with achieving air superiority over the Somme sector before the British Fourth Army attacked to start the ground offensive. This study focuses on the Royal Flying Corps, its organization and leaders, as well as the strategy and doctrine it employed in its attempt to regain air superiority from the German Army Air Service (GAAS) in the spring of 1916. Prior to the start of the ground battle, the commander of the RFC, General Hugh Trenchard, directed his squadrons to accomplish six tactical tasks in order for the RFC to achieve aerial superiority over the Somme. These tasks were: 1) aerial reconnaissance, 2) aerial photography, 3) observation and direction of artillery, 4) tactical bombing, 5) 'contact' patrols in support of the infantry, and 6) air combat against the GAAS to enable achievement of the other five tasks. Critical to answering the question of whether the RFC accomplished its assigned tasks, this study also examines the development of air power strategy by the RFC before and during the battle. Five factors are used to frame the analysis: strategy, organization, leadership, and the selection and training of aircrew. Although the RFC suffered high losses because it rigidly adhered to an offensive strategy throughout the air campaign, when the battle ended the RFC still controlled the skies above the Somme. While the ground campaign failed to accomplish most of its stated objectives, historians have argued that the air campaign was a victory for the RFC. This paper contends that, because of the heavy aircrew casualties, it in fact proved to be a Pyrrhic victory. The consequences of maintaining a continuous air offensive over the Somme led to nearly disastrous results for the RFC in its subsequent air campaign over Arras in April 1917.

    Judicialization: The Twilight of Administrative Law

    At its December 1984 Plenary Session, the Administrative Conference of the United States devoted a part of its agenda to an exchange of ideas on the current state of administrative law and the directions in which this field is likely to move, or be pushed, in the foreseeable future. Perhaps because modern administrative agencies are such a curious admixture of the political, bureaucratic, and judicial components of government, the study of administrative law derives particular benefits from analyses and critiques that emphasize social utility as well as legal precedent. In no other area of the law do the current political agenda and social climate affect so directly both the legal process and its end products. The deliberately provocative essay that follows was written especially for this year's Administrative Law Issue by Loren A. Smith, Chairman of the Administrative Conference. Mr. Smith argues that the current level of judicialization and overproceduralization of the administrative process is a symptom of a fundamental dysfunction. He reminds us that formal methodologies cannot by themselves resolve the difficult issues that inevitably arise in the context of those important social programs placed under the auspices of the administrative agencies, and he argues that an infatuation with procedural safeguards, the traditional focus of administrative law studies, is counterproductive insofar as it diverts attention away from critical substantive problems. Hard looks, Mr. Smith concludes, may be a frustrated body politic's way of avoiding hard choices, and he calls for a renewed recognition of the essentially political, and hence fully accountable, nature of the federal administrative process.

    Maritime Jurisdiction and Longshoremen’s Remedies


    Contribution to reliable end-to-end communication over 5G networks using advanced techniques

    5G cellular communication, especially with the huge bandwidth made available by millimeter-wave (mmWave) spectrum, is a promising technology to fulfill the coming demand for very high data rates. These networks can support new use cases such as vehicle-to-vehicle communication and augmented reality thanks to novel features such as network slicing, together with multi-gigabit-per-second mmWave data rates. Nevertheless, 5G cellular networks suffer from some shortcomings, especially at high frequencies, because of the intermittent nature of channels as the frequency rises. Non-line-of-sight conditions are one of the significant issues the new generation encounters. This drawback stems from the intense susceptibility of higher frequencies to blockage caused by obstacles and misalignment. This characteristic can impair the ability of TCP, the widely deployed reliable transport-layer protocol, to attain high throughput and low latency while keeping the network fair. The protocol therefore needs to adjust its congestion window size based on the current state of the network; however, TCP cannot adjust its congestion window efficiently, which degrades its throughput. This thesis presents a comprehensive analysis of reliable end-to-end communications in 5G networks and analyzes TCP's behavior in one of 3GPP's well-known scenarios, the urban deployment. Furthermore, two novel TCP variants based on artificial intelligence are proposed to deal with this issue. The first protocol uses fuzzy logic, a subset of artificial intelligence, and the second is based on deep learning. Extensive simulations showed that the newly proposed protocols attain higher performance than common TCP variants, such as BBR, HighSpeed, Cubic, and NewReno, in terms of throughput, RTT, and sending-rate adjustment in the urban scenario. The new protocols' superiority is achieved by employing smartness in TCP's congestion control mechanism, a powerful enabler for improving TCP's functionality. To sum up, the 5G network is a promising telecommunication infrastructure that will revolutionize various aspects of communication. However, different parts of the Internet, such as its regulations and protocol stack, will face new challenges that need to be solved in order to exploit 5G capacity; without intelligent rules and protocols, the high bandwidth of 5G, especially 5G mmWave, will be wasted. Two novel schemes, one based on fuzzy logic and one based on deep learning, are proposed to enhance the performance of 5G mmWave by improving the functionality of the transport layer. The results obtained indicate that the new schemes improve TCP by giving intelligence to the protocol: as the protocol works more smartly, it can make sound decisions under different network conditions.
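
    To make the idea of a fuzzy-logic congestion controller concrete, the sketch below maps two measurements, RTT inflation and loss rate, through triangular membership functions and a two-rule base into a multiplicative congestion-window adjustment. The membership functions, rules, and gains are illustrative assumptions and do not reproduce the protocols proposed in the thesis.

        # Hypothetical sketch of a fuzzy congestion-window controller, illustrating the
        # general idea of adding intelligence to TCP congestion control. All membership
        # functions, rules, and gains below are illustrative assumptions.

        def tri(x, a, b, c):
            """Triangular membership function on [a, c] peaking at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def fuzzy_cwnd_update(cwnd, rtt, min_rtt, loss_rate):
            # Normalized queueing delay: 0 means the path looks empty, 1 means heavily queued.
            inflation = min(max((rtt - min_rtt) / min_rtt, 0.0), 1.0)

            # Fuzzify the two inputs.
            delay_low, delay_high = tri(inflation, -0.5, 0.0, 0.5), tri(inflation, 0.3, 1.0, 1.7)
            loss_low, loss_high = tri(loss_rate, -0.05, 0.0, 0.05), tri(loss_rate, 0.02, 0.1, 0.2)

            # Rule base: grow aggressively when the path looks idle, back off when congested.
            grow = min(delay_low, loss_low)
            back_off = max(delay_high, loss_high)

            # Defuzzify into a multiplicative factor (weighted average of the two actions).
            factor = (grow * 1.25 + back_off * 0.7) / max(grow + back_off, 1e-9)
            return max(1.0, cwnd * factor)

        # Example tick with made-up measurements:
        # print(fuzzy_cwnd_update(cwnd=64.0, rtt=0.030, min_rtt=0.020, loss_rate=0.01))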

    Accelerability vs. scalability: R&D investment under financial constraints and competition

    I develop a continuous-time model to examine how the interaction between competition and financial constraints affects firms’ research and development (R&D) strategies. The model integrates two key characteristics of R&D investment: accelerability (i.e., higher R&D intensity leads to faster discovery) and scalability (i.e., higher R&D intensity leads to higher project payoff). I find that firms react strategically to their rivals’ financial constraints when making investment decisions in a duopoly R&D race. In particular, firms respond positively to the R&D intensity of an unconstrained rival, while they respond in a hump-shaped fashion to the R&D intensity of a constrained rival. As a result, a constrained firm can pre-empt an unconstrained competitor in market equilibrium. Accelerability is necessary for such pre-emption to occur, and scalability generally reduces its likelihood. Comparison with a monopoly benchmark shows that the economic mechanism differs from over-investment induced by financial constraints alone. The model also generates new implications regarding how project characteristics and cash flow risks impact R&D decisions.
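
    One illustrative way to write down the two properties, under assumed functional forms that are not taken from the paper's model: let the firm choose an R&D intensity u_t at a flow cost c(u_t); discovery arrives at the first jump of a Poisson process whose intensity increases in u_t (accelerability), and the payoff collected at the discovery time \tau also increases in the intensity applied then (scalability):

        \[
          \Pr\{\text{discovery in } [t, t+\mathrm{d}t)\} = \lambda(u_t)\,\mathrm{d}t,
          \qquad \lambda'(u) > 0 \quad \text{(accelerability)}
        \]
        \[
          \text{payoff at } \tau = V(u_\tau),
          \qquad V'(u) > 0 \quad \text{(scalability)}
        \]

    Under this reading, raising intensity buys both an earlier and a larger reward at a higher flow cost, and a financially constrained rival is one whose feasible intensity is capped.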

    Holistic Network Defense: Fusing Host and Network Features for Attack Classification

    This work presents a hybrid network-host monitoring strategy that fuses data from both the network and the host to recognize malware infections, focusing on three categories: Normal, Scanning, and Infected. The network-host sensor fusion is accomplished by extracting 248 features from network traffic using the Fullstats Network Feature generator and, from the host, by text mining, looking at the frequency of the 500 most common strings and analyzing them as word vectors. Detection performance is improved by synergistically fusing network features obtained from IP packet flows with host features obtained from text mining of port, processor, and logon information, among others. In addition, the work compares three different machine learning algorithms and updates the script required to obtain the network features. The hybrid method outperformed host-only classification by 31.7% and network-only classification by 25%. The new approach also reduces the number of alerts while remaining accurate compared with the commercial IDS SNORT. These results mean that even the most typical users could understand the alert classification messages.
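
    A minimal sketch of the fusion step, assuming per-observation feature vectors have already been extracted: concatenate the network-flow features with the host string-frequency features and train a single classifier over the three labels above. The random stand-in data and the choice of a random forest are illustrative assumptions, not necessarily one of the three algorithms compared in the work.

        # Hypothetical sketch of network-host feature fusion followed by classification.
        # Feature extraction (Fullstats flow features, host text mining) is abstracted
        # away and replaced with random stand-in arrays of the stated dimensions.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import classification_report
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 300
        net_features = rng.random((n, 248))    # stand-in for 248 Fullstats flow features
        host_features = rng.random((n, 500))   # stand-in for 500 host string frequencies
        labels = rng.choice(['Normal', 'Scanning', 'Infected'], size=n)

        # Sensor fusion: one observation = network vector concatenated with host vector.
        fused = np.hstack([net_features, host_features])

        X_train, X_test, y_train, y_test = train_test_split(fused, labels, random_state=0)
        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
        print(classification_report(y_test, clf.predict(X_test)))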

    Data Fingerprinting -- Identifying Files and Tables with Hashing Schemes

    Master's thesis in Computer Science. INTRODUCTION: Although hash functions are nothing new, they are not limited to cryptographic purposes. One important field is data fingerprinting. Here, the purpose is to generate a digest that serves as a fingerprint (or a license plate) that uniquely identifies a file. More recently, fuzzy fingerprinting schemes, which scrap the avalanche effect in favour of detecting local changes, have hit the spotlight. The main purpose of this project is to find ways to classify text tables and to discover where potential changes or inconsistencies have happened. METHODS: Large parts of this report can be considered applied discrete mathematics, and finite fields and combinatorics have played an important part. Rabin's fingerprinting scheme was tested extensively and compared against existing cryptographic algorithms, CRC and FNV. Moreover, a self-designed fuzzy hashing algorithm with the preliminary name No-Frills Hash (NFHash) has been created and tested against Nilsimsa and Spamsum. NFHash is based on Mersenne primes and uses a sliding window to create a fuzzy hash. Furthermore, the usefulness of lookup tables (with partial seeds) was also explored. The fuzzy hashing algorithm has also been combined with a k-NN classifier to get an overview of its ability to classify files. In addition to NFHash, Bloom filters combined with Merkle trees have been the most important part of this report. This combination allows a user to see where a change was made, despite the fact that hash functions are one-way. Large parts of this project have dealt with the study of other open-source libraries and applications, such as Cassandra and SSDeep, as well as how bitcoins work. Optimizations have played a crucial role as well; different approaches to a problem might lead to the same solution, but resource consumption can be very different. RESULTS: The results have shown that the Merkle tree-based approach can track changes to a table very quickly and efficiently, as it is conservative with CPU resources. Moreover, the self-designed algorithm NFHash also does well in terms of file classification when coupled with a k-NN classifier. CONCLUSION: Hash functions refer to a very diverse set of algorithms, not just algorithms that serve a limited purpose. Fuzzy fingerprinting schemes can still be considered to be at an infant stage, but a lot has happened in the last ten years. This project has introduced two new ways to create and compare hashes of similar, yet not necessarily identical, files, or to detect whether (and to what extent) a file was changed. Note that the algorithms presented here should be considered prototypes and may still need large-scale testing to sort out potential flaws.
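
    A minimal sketch of the Merkle-tree part of that idea, assuming a table serialized as one string per row: hash each row, build a binary hash tree, and compare two trees from the root down so that only subtrees with differing digests are descended into. The Bloom-filter layer and the NFHash rolling hash are omitted, and the helper names are placeholders rather than the thesis's implementation.

        # Hypothetical sketch: locate changed rows in a table via Merkle trees.
        # Assumes both tables have the same number of rows.
        import hashlib

        def _h(data: bytes) -> bytes:
            return hashlib.sha256(data).digest()

        def build_tree(rows):
            """Return a list of levels, leaves first, each level a list of digests."""
            level = [_h(row.encode()) for row in rows] or [_h(b'')]
            levels = [level]
            while len(level) > 1:
                pairs = [level[i] + (level[i + 1] if i + 1 < len(level) else level[i])
                         for i in range(0, len(level), 2)]
                level = [_h(p) for p in pairs]
                levels.append(level)
            return levels

        def diff(tree_a, tree_b, level=None, index=0):
            """Indices of differing rows, descending only into unequal subtrees."""
            if level is None:
                level = len(tree_a) - 1                  # start at the root level
            if tree_a[level][index] == tree_b[level][index]:
                return []                                # identical subtree: skip it
            if level == 0:
                return [index]                           # a leaf (row) that changed
            changed = []
            for child in (2 * index, 2 * index + 1):
                if child < len(tree_a[level - 1]):
                    changed += diff(tree_a, tree_b, level - 1, child)
            return changed

        old = build_tree(["id,name", "1,alice", "2,bob", "3,carol"])
        new = build_tree(["id,name", "1,alice", "2,bobby", "3,carol"])
        if old[-1][0] != new[-1][0]:                     # root digests differ
            print("changed rows:", diff(old, new))       # -> changed rows: [2]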