I want to discuss a (minor) antipattern that I think is (slightly) harmful.
There are, supposedly, a few advantages to doing things this way.
- Users may already have the JS library in their cache from visiting another site.
- Faster download speeds of large libraries from CDNs.
- Latest software versions automatically when the CDN updates.
I think these advantages are overstated and lead to some significant disadvantages.
Take the caching argument first: how much of an advantage does that really give you?
You probably shouldn’t be using multi-megabyte libraries. Have some respect for your users’ download limits. But if you are truly worried about speed, surely your whole site should be behind a CDN – not just a few JS libraries?
There are some CDNs which let you automatically include the latest version of a library. But then you have to deal with breaking changes arriving with little warning.
So most people pin a specific version of the JS they want. And, of course, if you’re using v1.2 and another site is using v1.2.1, the browser can’t take advantage of caching.
If you serve your JS from the same source as your main site, there is less chance of a user getting a broken experience.
What is your CDN doing with all that data?
What happens if someone hacks your CDN?
You can gain extra security by using Subresource Integrity (SRI). That lets you write code like:
<script src="https://cdn.example.com/js/library-v1.2.3.js" integrity="sha384-oqVuAfXRKap7fdgcCY5uykM6+R9GqQ8K/uxy9rx7HNQlGYl1kPzQho1wx4JwY8wC" crossorigin="anonymous"></script>
If even a single byte of that JS file is changed, the hash won’t match and the browser should refuse to run the code.
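The `integrity` value is just a base64-encoded SHA-384 digest of the file. Assuming you have OpenSSL installed, you can generate one yourself; the filename below is a stand-in for whichever library file you actually serve:

```shell
# For demonstration, create a stand-in for the library file
# (in practice this is the JS file you downloaded from the vendor).
printf 'console.log("hello");\n' > library-v1.2.3.js

# Base64-encode the SHA-384 digest of the file. Prefix the output
# with "sha384-" and paste it into the script tag's integrity attribute.
openssl dgst -sha384 -binary library-v1.2.3.js | openssl base64 -A
```

The same command works with `-sha256` or `-sha512` if you prefer those hash functions; just change the prefix to match.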
Of course, that means that if the CDN’s copy of the file ever changes, you could end up with a broken experience on your site. So just serve the JS from your own site.
This isn’t the biggest issue on the web. And I’m certainly guilty of misusing CDNs like this.
Back when there were only a few CDNs, and their libraries didn’t change rapidly, there was an advantage to using them.