When you load a script from a third-party CDN, you are trusting that CDN to serve exactly the code you expect. If the CDN is compromised, or changes ownership, malicious code can be injected into every site that loads from it.
This is not theoretical. In 2024, the popular polyfill.io domain was acquired by a new owner and began injecting malware into the scripts it served, affecting over 100,000 websites. The original maintainer had no control over the domain after the sale.
To mitigate this risk, always use Subresource Integrity (SRI) attributes on your script and link tags. SRI ensures the browser will reject any file whose contents don't match the expected hash, even if the CDN has been compromised.
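For example, a script and a stylesheet pinned with SRI might look like the sketch below. The URLs and hashes are placeholders, not real values: generate the hash from the exact file you intend to load, for instance with openssl dgst -sha384 -binary library.min.js | openssl base64 -A.

```html
<!-- Placeholder hashes: compute the real sha384 of the exact file the CDN serves. -->
<script
  src="https://cdn.example.com/library/1.2.3/library.min.js"
  integrity="sha384-REPLACE_WITH_BASE64_HASH_OF_THE_FILE"
  crossorigin="anonymous"></script>

<link
  rel="stylesheet"
  href="https://cdn.example.com/library/1.2.3/library.min.css"
  integrity="sha384-REPLACE_WITH_BASE64_HASH_OF_THE_FILE"
  crossorigin="anonymous">
```

The crossorigin="anonymous" attribute matters: for cross-origin requests, the browser can only verify integrity on responses served with CORS.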
Browsers now partition their HTTP caches by the top-level site, so a resource cached while visiting one site will not be served from cache when requested from a different site. This means that even if a user has previously visited another site that loaded jQuery from a CDN, your site still has to download it fresh, because the cached copy is keyed to that other site. Cross-origin cache sharing was once a theoretical benefit of public CDNs, but modern browsers have largely eliminated it as a privacy protection against cross-site tracking.
Your CDN won't always be reachable for every user, but since you should be able to fall back gracefully without JavaScript anyway, that shouldn't be an issue, right?
CSS loaded from a CDN is harder to recover from, since there's no graceful fallback for missing styles. But if you're loading CSS from a CDN it's probably from jsDelivr or CDNJS, and it's only Google's domains that have been blocked at the country level in the past.
Beyond country-level blocks, corporate firewalls and proxy servers may block or filter CDN domains. Ad-blockers and privacy extensions sometimes catch CDN URLs in their filter lists. And for sites with European users, loading resources from US-hosted CDNs may raise GDPR concerns around cross-border data transfers, since even a script request transmits the user's IP address to the CDN provider.
Always have a fallback strategy for when a CDN is unreachable.
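One common sketch of such a fallback for a script, assuming you also keep a copy of the library on your own origin (the CDN URL, version number, and local path are illustrative):

```html
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.7.1/jquery.min.js"></script>
<script>
  // If the CDN request failed (blocked, offline, unreachable), window.jQuery
  // is undefined; load the copy served from our own origin instead.
  window.jQuery || document.write(
    '<script src="/js/jquery-3.7.1.min.js"><\/script>'
  );
</script>
```

document.write is blunt, but it is synchronous, so the local copy is in place before any later scripts that depend on the library run.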
Even with Google Hosted Libraries, the most popular CDN of them all, serving the latest jQuery, you have less than a 1% chance of hitting a user's cache. It's just not that likely: there are too many versions and too many places the file could have been loaded from. There are nearly 50 versions of jQuery on Google's CDN, fewer than 10% of sites use Google's CDN at all, and fewer than 20% of those load the latest version. On top of that, HTTP and HTTPS are cached separately, caches get cleared over time, and new versions keep being released.
Instead, focus on making the first load as fast as you can, and on hitting a cache you created yourself on repeat visits.
Nor do jsDelivr or CDNJS fare any better on cache hits. Both accept submissions of almost any library (CDNJS is stricter, requiring some level of popularity), so versions and sources are even more fragmented. They do offer one advantage, though: pair that breadth with jsDelivr's file combiner and you get a single file taking care of several requests, at the same total size as if the files had been loaded separately.
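As a rough sketch, a combined request might look like this; the /combine/ endpoint is jsDelivr's combiner syntax, but the package names and versions here are only examples, so pin whatever you actually depend on:

```html
<!-- Two libraries fetched in a single request via jsDelivr's combiner.
     Versions are illustrative; pin the ones you actually use. -->
<script src="https://cdn.jsdelivr.net/combine/npm/jquery@3.7.1/dist/jquery.min.js,npm/js-cookie@3.0.5/dist/js.cookie.min.js"></script>
```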
See How to make fewer requests — though note that HTTP/2 has reduced the benefit of combining.
File combiners like the one offered by jsDelivr were a significant optimisation under HTTP/1.1, where each request required its own connection and browsers limited parallel connections per host.
With HTTP/2 and HTTP/3 (now supported by all modern browsers and CDNs), multiple files are multiplexed over a single connection. The overhead of additional requests is dramatically reduced, making the performance benefit of file combining much smaller than it once was.
Combining can still help when loading many very small files, or when you want a single cache key for a bundle of libraries. But it's no longer the clear win it used to be, and it can hurt caching granularity—changing one library in a combined bundle invalidates the cache for all of them.