DOMAIN BASED HSTS (HTTP STRICT TRANSPORT SECURITY) MANAGEMENT
HTTP-based APIs and websites hosting web content want to force HTTPS for security; however, the current mechanism is not absolute in its intentions and requires third-party configuration to enforce the policy. Moreover, the configuration is very coarse-grained and does not meet many use cases where fine-grained control is required, such as applying different policies to sub-domains of a second-level domain. By allowing web domains to host their own configuration, we can give fine-grained control to domain owners and help secure the web against malicious actors.
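The policy the abstract discusses is carried in the Strict-Transport-Security response header (RFC 6797), whose coarse-grained scoping hinges on the single includeSubDomains directive. A minimal sketch of parsing the header's directives:

```python
def parse_hsts(header: str) -> dict:
    """Parse a Strict-Transport-Security header into its directives (RFC 6797)."""
    policy = {"max_age": None, "include_subdomains": False, "preload": False}
    for directive in header.split(";"):
        name, _, value = directive.strip().partition("=")
        name = name.lower()
        if name == "max-age":
            policy["max_age"] = int(value.strip('"'))
        elif name == "includesubdomains":
            policy["include_subdomains"] = True
        elif name == "preload":
            policy["preload"] = True
    return policy

# The all-or-nothing scoping at issue: a host either covers every
# sub-domain (includeSubDomains) or only itself.
print(parse_hsts("max-age=31536000; includeSubDomains"))
```

Directive names are case-insensitive per the RFC, hence the lowercasing before comparison.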
CoStricTor: Collaborative HTTP Strict Transport Security in Tor Browser
HTTP Strict Transport Security (HSTS) is a widely deployed security feature in modern web browsing. It is also, however, a potential vector for user tracking and surveillance. Tor Browser, a web browser primarily concerned with online anonymity, disables HSTS as a result of this tracking potential. We present the CoStricTor protocol, which crowdsources HSTS data among Tor Browser clients. It gives Tor Browser users increased resistance to man-in-the-middle attacks without exposing them to HSTS tracking. Our protocol adapts other privacy-preserving data aggregation algorithms to share data effectively among users with strong local differential privacy guarantees. The CoStricTor protocol resists denial-of-service attacks by design through our innovative use of Bloom filters to represent complementary data. Our simulations show our protocol can model up to 150,000 websites, providing 10,000 HSTS upgrades for users.
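The paper's full construction (complementary Bloom filters aggregated under local differential privacy) is more involved than can be shown here, but the underlying data structure is a standard Bloom filter: a compact bit array supporting set membership with possible false positives and no false negatives. A minimal sketch, not the CoStricTor construction itself:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash probes into an m-bit array.
    False positives are possible; false negatives are not."""
    def __init__(self, m_bits=1024, k_hashes=4):
        self.m, self.k = m_bits, k_hashes
        self.bits = bytearray(m_bits // 8)

    def _probes(self, item: str):
        # Derive k independent probe positions from salted SHA-256 digests.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item: str):
        for p in self._probes(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item: str):
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._probes(item))

bf = BloomFilter()
bf.add("example.org")
print("example.org" in bf)  # True: added hosts are always found
```

The no-false-negatives property is what makes the filter a safe way to publish a set of HSTS hosts: a host that was inserted can never be silently dropped, only spuriously joined by others.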
SECURITY ANALYSIS OF THE DUTA WACANA CHRISTIAN UNIVERSITY NETWORK AGAINST SSL/TLS ATTACKS
The security of data communication over the network has become an obligation that must be considered in a technology ecosystem. Data security spans several layers; one layer that needs protection is the presentation layer, where SSL/TLS operates. If vulnerabilities exist at this layer while sensitive data such as cookies, usernames, and passwords are present, a data leak will have a major impact on all stakeholders relying on SSL/TLS technology. To investigate and improve data security on the Duta Wacana Christian University (DWCU) campus network, the researchers performed SSL/TLS vulnerability testing on the SSAT and E-Class websites using the Qualys SSL Test and the testssl.sh script, checked for Mixed Content with GeekFlare, and checked HSTS Preload status using the HSTS Preload website provided by Google. The researchers also conducted SSL Strip penetration tests at 12 points in DWCU buildings and in Lab D. The Qualys SSL Test showed that the SSAT and E-Class websites already enforce HTTP Strict Transport Security (HSTS) with a max-age of 31536000 seconds (1 year) but have not implemented HSTS Preload; Mixed Content testing with GeekFlare showed that all transactions on SSAT and E-Class already use HTTPS; the testssl.sh script detected several vulnerabilities; and SSL Strip attacks proved possible on the DWCU network under certain conditions.
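The gap the study found (a one-year max-age but no preload) can be illustrated by the header-level checks the hstspreload.org submission form is commonly documented to apply; the thresholds and wording below are assumptions based on those public requirements, not the study's tooling:

```python
ONE_YEAR = 31536000  # seconds; the commonly documented minimum max-age for preloading

def preload_ready(header: str) -> list:
    """Return the header-level reasons (if any) a site would fail
    the hstspreload.org checks; a sketch of the header checks only."""
    directives = [d.strip().lower() for d in header.split(";")]
    problems = []
    max_age = next((int(d.split("=", 1)[1]) for d in directives
                    if d.startswith("max-age")), 0)
    if max_age < ONE_YEAR:
        problems.append("max-age below one year")
    if "includesubdomains" not in directives:
        problems.append("missing includeSubDomains")
    if "preload" not in directives:
        problems.append("missing preload")
    return problems

# The finding above: a one-year max-age alone is not enough to preload.
print(preload_ready("max-age=31536000"))
```

Preloading additionally requires a valid certificate and an HTTP-to-HTTPS redirect, which are server-side properties a header check alone cannot confirm.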
On the Feasibility of Fine-Grained TLS Security Configurations in Web Browsers Based on the Requested Domain Name
Most modern web browsers today sacrifice optimal TLS security for backward compatibility. They apply coarse-grained TLS configurations that support (by default) legacy versions of the protocol with known design weaknesses, and weak ciphersuites that provide fewer security guarantees (e.g. no Forward Secrecy), and silently fall back to them if the server so selects. This introduces various risks, including downgrade attacks such as the POODLE attack [15], which exploits the browser's silent fallback mechanism to downgrade the protocol version in order to exploit the legacy version's flaws. To achieve a better balance between security and backward compatibility, we propose a mechanism for fine-grained TLS configurations in web browsers based on the sensitivity of the domain name in the HTTPS request, using a whitelisting technique. That is, the browser enforces optimal TLS configurations for connections going to sensitive domains while enforcing default configurations for the rest of the connections. We demonstrate the feasibility of our proposal by implementing a proof-of-concept as a Firefox browser extension. We envision this mechanism as a built-in security feature in web browsers, e.g. a button similar to the "Bookmark" button in Firefox, and as a standardised HTTP header, to augment browser security.
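The core idea, strict TLS settings for whitelisted sensitive domains and defaults for everything else, can be sketched with Python's `ssl` module; the whitelist contents here are hypothetical:

```python
import ssl

# Hypothetical whitelist of sensitive domains that must get strict TLS settings.
STRICT_DOMAINS = {"bank.example", "mail.example"}

def context_for(hostname: str) -> ssl.SSLContext:
    """Pick a TLS configuration based on the requested domain name:
    optimal settings for whitelisted (sensitive) hosts, defaults otherwise."""
    ctx = ssl.create_default_context()
    if hostname in STRICT_DOMAINS or any(
            hostname.endswith("." + d) for d in STRICT_DOMAINS):
        ctx.minimum_version = ssl.TLSVersion.TLSv1_3  # no legacy fallback
    return ctx

print(context_for("bank.example").minimum_version == ssl.TLSVersion.TLSv1_3)  # True
```

A connection to a whitelisted host built from this context simply fails rather than silently falling back, which is exactly the trade-off the paper argues is acceptable for sensitive domains.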
Web Tracking: Mechanisms, Implications, and Defenses
This article surveys the existing literature on the methods currently used by web services to track users online, as well as their purposes, implications, and possible user defenses. A significant majority of the reviewed articles and web resources are from the years 2012-2014. Privacy seems to be the Achilles' heel of today's web. Web services make continuous efforts to obtain as much information as they can about the things we search, the sites we visit, the people we contact, and the products we buy. Tracking is usually performed for commercial purposes. We present 5 main groups of methods used for user tracking, based on sessions, client storage, client cache, fingerprinting, or yet other approaches. A special focus is placed on mechanisms that use web caches, operational caches, and fingerprinting, as they usually employ a variety of creative methodologies. We also show how users can be identified on the web and associated with their real names, e-mail addresses, phone numbers, or even street addresses. We show why tracking is used and its possible implications for users (price discrimination, assessing financial credibility, determining insurance coverage, government surveillance, and identity theft). For each of the tracking methods, we present possible defenses. Apart from describing the methods and tools used to keep personal data from being tracked, we also present several tools that were used for research purposes: their main goal is to discover how and by which entity users are being tracked on their desktop computers or smartphones, provide this information to the users, and visualize it in an accessible and easy-to-follow way. Finally, we present currently proposed future approaches to tracking and show that they can pose significant threats to users' privacy.
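Among the client-cache methods the survey covers, ETag-based tracking is easy to sketch: the server mints a unique ETag for each new client, and the browser's conditional revalidation (If-None-Match) replays it on every later visit, re-identifying the client without cookies. A minimal server-side sketch; all names are illustrative:

```python
import secrets

# Server-side state mapping minted ETags to a per-client identifier.
_known = {}

def serve(if_none_match=None):
    """Return (client_id, etag_to_set). A replayed ETag re-identifies the client;
    an unknown (or absent) ETag gets a fresh identifier minted for it."""
    if if_none_match in _known:
        return _known[if_none_match], if_none_match
    etag = secrets.token_hex(8)
    _known[etag] = f"client-{len(_known) + 1}"
    return _known[etag], etag

uid, etag = serve()                   # first visit: new identifier minted
uid2, _ = serve(if_none_match=etag)   # revisit: same identifier comes back
print(uid == uid2)  # True
```

The corresponding defense the survey discusses, clearing the browser cache, simply discards the stored ETag, so the next visit is indistinguishable from a first visit.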
The Security Lottery: Measuring Client-Side Web Security Inconsistencies
To mitigate a myriad of Web attacks, modern browsers support client-side security policies shipped through HTTP response headers. To enforce these defenses, the server needs to communicate them to the client, a seemingly straightforward process. However, users may access the same site in various ways, e.g., using different User-Agents, network access methods, or language settings. All these usage scenarios should enforce the same security policies; otherwise, a security lottery takes place: depending on specific client characteristics, different levels of Web application security are provided to users (inconsistencies). We formalize the security guarantees provided through four popular mechanisms and apply this to measure the prevalence of inconsistencies in the security policies of top sites across different client characteristics. Based on our insights, we investigate the security implications of both deterministic and non-deterministic inconsistencies, and show how even prominent services are affected by them.
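The measurement the abstract describes amounts to fetching the same URL under different client profiles and diffing the security headers each profile receives. A sketch of that comparison step, with stand-in responses instead of real fetches and a hypothetical header list:

```python
# Security headers to compare across client profiles (illustrative subset).
SECURITY_HEADERS = ("strict-transport-security", "content-security-policy",
                    "x-frame-options", "set-cookie")

def find_inconsistencies(responses: dict) -> dict:
    """Map each security header to the values seen per profile, keeping only
    headers with more than one distinct value: those are the 'lottery' cases."""
    lottery = {}
    for header in SECURITY_HEADERS:
        values = {profile: hdrs.get(header) for profile, hdrs in responses.items()}
        if len(set(values.values())) > 1:
            lottery[header] = values
    return lottery

observed = {
    "desktop-chrome": {"strict-transport-security": "max-age=31536000"},
    "mobile-safari": {},  # no HSTS served to this profile
}
print(find_inconsistencies(observed))
```

A `None` entry in the output marks a profile that received no policy at all, the omission case, as opposed to two profiles receiving conflicting policies.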
Reining in the Web's Inconsistencies with Site Policy
Over the years, browsers have adopted an ever-increasing number of client-enforced security policies deployed through HTTP headers. Such mechanisms are fundamental for web application security, and are usually deployed on a per-page basis. This, however, enables inconsistencies, as different pages within the same security boundaries (in the form of origins or sites) can express conflicting security requirements. In this paper, we formalize inconsistencies for cookie security attributes, CSP, and HSTS, and then quantify the magnitude and impact of inconsistencies at scale by crawling 15,000 popular sites. We show that numerous sites endanger their own security by omission or misconfiguration of the aforementioned mechanisms, leading to unnecessary exposure to XSS, cookie theft, and HSTS deactivation. We then use our data to analyse the extent to which the recent Origin Policy proposal can fix the problem of inconsistencies. Unfortunately, we conclude that the current Origin Policy design suffers from major shortcomings which limit its practical applicability to address security inconsistencies while catering to the needs of real-world sites. Based on these insights, we propose Site Policy, designed to overcome Origin Policy's shortcomings and make any insecurity explicit. We make a prototype implementation of Site Policy publicly available, along with a supporting toolchain for initial policy generation, security analysis, and test deployment.
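The HSTS-deactivation case mentioned above is a concrete instance of per-page inconsistency: a single page replying with max-age=0 cancels the policy every other page on the host sets, and a page omitting the header lets the policy lapse. A simplified sketch of that check over a crawl of one site; the paths are hypothetical:

```python
def hsts_conflicts(pages: dict) -> list:
    """pages maps URL path -> HSTS header value (or None). Flag paths that
    weaken or drop a policy that other paths on the same host declare."""
    declared = any(h and "max-age=0" not in h for h in pages.values())
    conflicts = []
    for path, header in pages.items():
        if declared and (header is None or "max-age=0" in header):
            conflicts.append(path)
    return conflicts

site = {
    "/": "max-age=31536000; includeSubDomains",
    "/legacy": "max-age=0",  # deactivates HSTS for the whole host
    "/blog": None,           # omission: no policy refresh on this page
}
print(hsts_conflicts(site))  # ['/legacy', '/blog']
```

This per-host scope is exactly why the paper argues for declaring such policies once per site rather than page by page.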