3 research outputs found

    Structural profiling of Web sites in the wild

    The paper reports the results of a large-scale survey of 708 websites, measuring various features related to their size and structure: DOM tree size, maximum degree, depth, and diversity of element types and CSS classes, among others. The goal of this research is to serve as a reference point for studies that include an empirical evaluation on samples of web pages.
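The structural metrics named in the abstract can be sketched over a toy DOM-like tree. This is a minimal illustration, assuming a `{ tag, children }` node shape; it is not the paper's actual data model or measurement tooling.

```javascript
// Compute tree size (node count), maximum degree (most children of any
// node), and depth (longest root-to-leaf path) of a DOM-like tree.
function treeMetrics(root) {
  let size = 0, maxDegree = 0, depth = 0;
  const walk = (node, d) => {
    size += 1;
    maxDegree = Math.max(maxDegree, node.children.length);
    depth = Math.max(depth, d);
    node.children.forEach(child => walk(child, d + 1));
  };
  walk(root, 1);
  return { size, maxDegree, depth };
}

// Toy page: <html> with a <head> (one child) and a <body> (three children).
const page = {
  tag: "html",
  children: [
    { tag: "head", children: [{ tag: "title", children: [] }] },
    { tag: "body", children: [
      { tag: "div", children: [] },
      { tag: "div", children: [] },
      { tag: "p", children: [] },
    ]},
  ],
};

console.log(treeMetrics(page)); // { size: 7, maxDegree: 3, depth: 3 }
```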

    Streaming-Based Progressive Enhancement of Websites for Slow and Error-Prone Networks

    This thesis aims to improve the loading times of web pages by streaming their content in a non-render-blocking way. At the beginning of this thesis, a large-scale analysis was performed, spanning all downloadable pages of the top 10,000 web pages according to the Tranco list. This analysis gathered data about the render-blocking properties of web page resources, including HTML, JavaScript, and CSS. It further gathered code-coverage data, giving insight into how much of the render-blocking code is actually used. From this, the structural optimization potential could be determined: less render-blocking code leads to faster loading times, since less data is required to display the page. The analysis showed that significant optimization potential remains. On average, JavaScript and CSS together make up 86.7% of a modern web page, the rest being HTML. Both JavaScript and CSS are mostly loaded in a render-blocking way (91.8% of JavaScript and 89.47% of CSS), yet only 40.8% of JavaScript and 15.9% of CSS is used before render. This shows that, on average, web pages have significant room for improvement. The concept developed from these results loads web pages in a new way by streaming all render-blocking content. The related work showed that several sub-techniques are required first, which were conceptualized next. First, an optimization and splitting tool for CSS is proposed, called Essential. This is followed by an optimization framework concept for JavaScript, consisting of Waiter and AUTRATAC. Lastly, a backward-compatible approach was developed that allows splitting the HTML and streaming all content to a client. The evaluation showed that the streamed web page loads significantly faster when comparing FCP, content "Above-the-Fold," and the total transfer time of all render-blocking resources of the document.
    For example, the case study determined that the streamed page could reduce the time until FCP by 83.3% at 2 Mbps, and the time until the last render-blocking data is transferred by up to 70.4% at 2 Mbps. Furthermore, existing streaming methods were compared, determining that WebSockets meets the requirements for streaming web page content. Lastly, an anonymous online user questionnaire showed that 85% of users preferred this new style of loading pages.
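The render-blocking classification the analysis depends on can be sketched as follows: a classic `<script src>` without `defer`, `async`, or `type="module"` blocks rendering, as does a plain stylesheet `<link>`. The regex-based scan below is a simplified illustration of that rule, not the thesis's actual crawler or tooling.

```javascript
// Return the URLs of render-blocking resources found in a chunk of <head>
// markup. Simplifications: attribute order and quoting are assumed regular,
// and only external scripts and stylesheets are considered.
function findRenderBlocking(headHtml) {
  const blocking = [];

  // Scripts: render-blocking unless marked defer/async or type="module".
  const scriptRe = /<script\b[^>]*src="([^"]+)"[^>]*>/g;
  let m;
  while ((m = scriptRe.exec(headHtml)) !== null) {
    const tag = m[0];
    if (!/\b(defer|async)\b/.test(tag) && !/type="module"/.test(tag)) {
      blocking.push(m[1]);
    }
  }

  // Stylesheets: render-blocking unless scoped to a non-matching media query.
  const linkRe = /<link\b[^>]*>/g;
  while ((m = linkRe.exec(headHtml)) !== null) {
    const tag = m[0];
    const href = /href="([^"]+)"/.exec(tag);
    if (/rel="stylesheet"/.test(tag) && !/media="print"/.test(tag) && href) {
      blocking.push(href[1]);
    }
  }
  return blocking;
}

const head = `
  <script src="app.js"></script>
  <script src="analytics.js" defer></script>
  <link rel="stylesheet" href="main.css">
  <link rel="preload" href="font.woff2" as="font">
`;

console.log(findRenderBlocking(head)); // [ 'app.js', 'main.css' ]
```

Only `app.js` and `main.css` are flagged: the deferred script and the preload hint do not block the first render, which is the distinction the thesis's optimization potential is measured against.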

    Automatic correction of visual errors in web applications

    Researchers have been studying the debugging of web applications for several years now, a task that is very complex for developers due to the sometimes strange entanglement of the multiple languages used in their construction. Recent years have not made debugging any easier, since web pages increasingly contain dynamically generated content. Although several tools in the literature offer strong support for detecting bugs, very few offer help in correcting them. This thesis presents a study of the evolution of the composition of web pages, providing a better understanding of how they are built. It then presents a tool that corrects several visual errors previously detected by the Cornipickle tool, accompanied by real examples of its application.