I'm usually the first one to sneer in disbelief at super users who 'go nuclear' and disable the likes of JavaScript and images when browsing the web, but lately I've been experimenting with having custom fonts disabled (that is, fonts I don't have locally), and it feels to me that page loads are much faster, and in some cases more readable (pages feel 'cleaner'?).
The only issue I've come across so far is when a site uses font symbols, but overall, I've been pleasantly surprised.
I live in a rural area with bad internet, and the trend of loading all of these fonts, images, and JavaScript from external servers over HTTPS basically makes the web unusable for us. The overhead of the handshakes required by each HTTPS connection greatly increases page load time, and there is about a 1-in-4 chance of one of them timing out, so the page either doesn't render properly or pauses and stops loading.
I browse with JS disabled by default in uMatrix and temporarily enable it on problematic websites (maybe ten times a day); it reverts back to disabled on the next browser reload. And I have maybe 20 hosts enabled by default which I use often enough.
> If the local() function is provided, specifying a font name to look for on the user's computer, and the user agent finds a match, that local font is used. Otherwise, the font resource specified using the url() function is downloaded and used.
>> Any correctly-coded website will try a local source first, which should help a lot for poor connections.
> Then Google Fonts is not correctly coded, as it'll use the remote font only. And so will the myriad websites reusing Google Fonts snippets. To try the local source first, you have to explicitly ask for it in @font-face/src, which Google Fonts does not do.
> So installing font into your OS won't help you with any website using Google Fonts service.
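For reference, here is roughly what a local-first src looks like (a minimal sketch; the font name and URL are placeholders, not anything Google Fonts actually serves):

```css
/* Try locally installed copies first; download only if none match. */
@font-face {
  font-family: "Example Sans";
  src: local("Example Sans"),
       local("ExampleSans-Regular"),
       url("/fonts/example-sans.woff2") format("woff2");
  font-weight: 400;
  font-style: normal;
}
```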
Wasn’t the point of Google Fonts that your browser would have in its cache fonts from Google so it wouldn’t download multiple copies from all over the place? A CDN basically. I know jQuery and Bootstrap recommend using their CDNs so browsers have the copy in their cache.
Of course, there’s always the tracking ability, but that’s inherent to every CDN.
> Since Chrome v86, released October 2020, cross-site resources like fonts can’t be shared on the same CDN anymore. This is due to the partitioned browser cache
Which is one of the many reasons I've stopped the almost-hourly updates of Chrome. I feel for people that have bad bandwidth or poor wireless connections. One of my favorite rants: if web builders, and more importantly the managers of the teams that build websites, were forced to use a DSL-speed connection daily, a lot of the cruft would go away.
The same argument applies to developers not using state-of-the-art computers packed with RAM. These problems can't be caught in QA: if they show up at all, it's already too late.
Sure, their code will compile faster, but they won't get a feel for how it behaves in the real world. Probably the best setup is fast build servers on the LAN while typing at pedestrian machines.
Whatever excuses we want to make for forcing web devs to use old netbooks from 2004, I'm down with it (everyone knows they think they're cooler than us, we need to knock them down a notch). Usability? Cool yes that sounds very important or whatever.
It's exactly what I've been doing, give or take a few years. I also take the time to boot up my Win95 VM and test in IE3, Netscape in Wine, Lynx, w3m, and many others, with and without JS.
The result is designs I can actually be proud of, not ashamed of.
Of course, no one is paying me for it. And that's OK. I'd rather be doing this for free than getting six figures working on advertising bullshit which will be gone tomorrow.
I must have missed the tweet where web developers were declared cooler than other developers — or was it the other way around? I can’t even fathom who is being impugned here, though clearly somebody is quite upset.
If they didn’t do this you (or someone with similar attitudes) would be complaining that cross site caching allows fingerprinting and violates privacy (which is almost certainly why they did do this).
Yeah, however, they could make these things configurable rather than imposing a policy on every user: different privacy/performance tradeoffs are acceptable in different contexts.
And then the criticism would be “Google enables violating the privacy of rural and poor customers with less access to high speed Internet”. I’m not saying any one of these criticisms is wrong. Just that they’re predictable and there’s basically no move Google could make besides shutting down that would satisfy the complaints (and even then it would be the ultimate “Google kills products” meme).
Entity A does thing that impacts entity B. Entity B makes those effects known to a broader group than entity B. Maybe entity A does thing about it, maybe not.
ok, that makes sense, but I think that describes such a general idea. I am incapable of imagining a single thing everyone on the entire planet is capable of agreeing on; so inherently, any action anyone could possibly ever take would be subject to mutually exclusive opinions.
But this is exactly the nature of the problem: a company that has basically the power of big government dictating what people will do without their previous approval. The very existence of such a company is anti-democratic. Now we are stuck with a single company dominating a public service (internet) that affects the lives of billions of people around the world and we don't have a say on how they manage this network.
No, the nature of the problem is that Google (and big government, in its own ways) is trying to solve hard problems with no good solution. There are clear downsides (described in this thread) to all the possible options, and even the option of letting people choose between bad options doesn't actually make people's lives better at the end of the day.
Keep in mind that Chrome basically pioneered the concept of doing OS-level sandboxing between websites. We can complain all we want about how Google should give people the option to choose whether they want hard site isolation or not, but if it weren't for Google's investment in Chrome, we wouldn't even think of the option. Would it be better to live in a world where hard problems don't get possible solutions at all, where no one feels like they have choices taken away from them because the choices were never given them in the first place?
(I do firmly agree that the very existence of Google is anti-democratic, though... but I get there via an entirely different argument. I'm glad to see more people concluding this, nonetheless.)
The secret of democracy is that, even when there are no good options to choose from, people agree on the solution. Therefore you are not at the whim of a single person or entity, and this legitimates the way you live.
As for technology existing only because of Google or some other large company, that's not the case. Google was more innovative when it was smaller. Netscape created the whole browser industry as a small startup. Just let a lot of small-to-medium-sized companies compete in the market, and split up the ones that grow so big they become a threat to the whole ecosystem.
Yeah, to me the solution is configurability with sensible defaults. It increases the work of the developers, but it has the benefit of reducing the number of decisions the developers have to make on behalf of the user.
There are many problems that cannot be solved by the resources available to a medium sized company. Which is why you need multiple governments to get together to help out something like AirBus or space programs.
Similarly Google at its scale can tackle large problems more efficiently than having 10000 medium sized companies pool their resources together to do the same.
In other words, everything has pros and cons that depend on each particular situation. Large companies aren't "all bad" and small companies aren't "all good"; you just chose to give lower priority to (or ignore) the bad aspects of small companies while showing concern about the bad aspects of large companies.
That's a theory without data to back it up, especially in technology, where big innovation usually comes from small to medium-sized companies. What smaller companies cannot do is coerce the market to extract profits at the level done by Google and their peers. That's why we need anti-trust legislation.
To a first order approximation, nobody actually changes any of these advanced settings. The default is what most people use, and most settings might as well never exist.
Sure, but the people that do change these settings are people too, and they’re often the type of people that will be vocal about your product and recommend it to others.
1) “There is a setting” is a good way to deflect from the more difficult job of making decisions about what the product should do.
2) Every setting comes with an opportunity cost. More than one product has died under a morass of way too many configuration flags, or focused on features to please a minority of users at the cost of the product itself.
3) The group that says they want the setting is always bigger than the group that will actually use the setting.
Google could at least globally cache all the fonts when Chrome is installed; a one time download which is then available for use on all sites. If they licensed other browsers to do the same, the number of fonts available to designers would be large without the performance hit.
Good idea, except that the purpose of Google fonts (from Google's perspective) is to provide another means of tracking what sites people are visiting. Maybe a browser extension can do this though.
If they provide it in Chrome, there’s no reason they couldn’t track that stuff even without Google Fonts being bundled. The browser already knows every site you visit. Knowing you visited example.com that used “Example Sans” doesn’t give anything more than just knowing you visited example.com.
Bundling with browsers outside of Chrome would be a different story, but Google has no say over that anyways.
Google claims to not associate the data with particular users (it's anonymized), but there's nothing to stop it from using the data to know how popular certain sites or pages are, what region(s) visitors are coming from, and what user agents (browsers) are being used. And so on.
Let me ask you a similar question. Is there any evidence whatsoever that Google built this service out of the goodness of their hearts, and have no intent of using it to gain any sort of competitive advantage?
And if you do believe that, is there any evidence that Google will continue to spend millions of dollars supporting a service that gives them zero benefit? What does their track record say about that?
For the record, I'm not anti-Google, nor ultra privacy focused. I just think it's common sense that Google expects some sort of ROI here. To many, using Google fonts is considered a fair exchange (especially given that their data is anonymized) and I think that's a perfectly reasonable position. But make no mistake, there is a clear benefit to Google here.
The whole point of caching in the first place is to make things take less time when possible. That would be a strictly worse solution than self-hosting if performance is your most important outcome.
> Wasn’t the point of Google Fonts that your browser would have in its cache fonts from Google so it wouldn’t download multiple copies from all over the place?
That's true for now (though note that Google Chrome will deploy changes to how third-party content is cached), but webfont caching simply doesn't work that well once you realize that websites that do use webfonts have a tendency to pick fonts that are unlikely to match another website's.
That is not true for now, as the article points out.
Hasn't been true for Safari for quite some time, and Chrome followed suit two months ago. That's not "will deploy," it has already been released in Chrome v86.
It would be awesome if uBlock Origin was able to dynamically determine a size limit based on how long assets take to load. Perhaps this is a task for the browser itself, though. Monitor throughput and latency and adjust page load behavior to limit or delay large elements.
I wonder, though, how the client determines the size? Doesn't that require at least a HEAD request, introducing another round trip?
I'm using uMatrix, and by default all JavaScript is blocked until I explicitly enable it for a site. The downside of this is that lots of websites are broken and payment sites are non-functional, but after a week of building the whitelist, the web is somehow usable again.
The worst is when the site is totally blank without it, then you enable it, and there's no interactivity: just images and a few words of text. It was totally unnecessary after all.
I'm well aware. But that day will be when it stops working, and right now it still works fine.
(Actually, it will be some time after that, since the day it stops working I will spend some time investigating how much work it would be to un-break it myself. The web can pry uMatrix from my cold, dead fingers. And I suspect I am not alone in feeling that way.)
You're not. uMatrix makes the web sane and I'm never letting it go. What I did (on Firefox) is disable uMatrix in Private windows, and I just go there for payment sites. All the rest of the wading through internet crap is done from within the safe cocoon of uMatrix.
The key thing which that comment does not mention is that the interface enables a whitelist approach — block everything (read: as much as you want) by default and selectively allow certain types of content from certain sites.
This is important because of where your effort is spent.
With uBO, sites always work by default, but it requires manual effort (albeit mostly not by regular users) to stay up to date and keep blocking the latest trackers and annoyances. So, the amount of effort to keep uBO functioning is proportional to the rate of change in tracking. uBO is an arms race between advertisers and blocklist maintainers.
With uMatrix, all annoyances are gone by default, but often so is desired functionality. It requires manual effort to make the site work again, but once it works, it will continue to work until the site owner changes which types of resources must be loaded in order to function. So, the amount of effort required to keep uMatrix functioning is proportional to the rate of development, specifically major changes.
So, really, the two approaches take a different bet. uBO bets that trackers and ads will change less frequently than functionality. uMatrix bets the opposite. I know which bet I think is more reasonable. Advertisers have way more incentive to try and circumvent uBO than developers do to regularly break their site's functionality.
This all said, it works very well to use both. Having uBO installed means that when you're un-breaking sites in uMatrix, and you allow something that was actually advertising, uBO will usually catch it for you, so you don't have to see ads. This means you don't have to think quite so hard before allowing something in uMatrix, which makes the overall experience much more pleasant.
Actually, you _can_ configure uBO to block third-party content by default (by turning on advanced user mode); however, it's not as good as uMatrix as there is no split of request type (script versus image, etc.) as well as no UI to manage subdomains. You'll have to do that last part by editing the text configuration manually.
I think that's a feature now, at least in what's shipping in Chrome. You can click "custom" on any website and select the resources that site is allowed to load, by type.
I use both, they work even better together! uMatrix for enabling JavaScript and cookies only on sites I care about, and uBlock Origin for some cosmetic filtering.
Not OP but it may have to do with how granular the control is - that's my reason anyway. uMatrix breaks down different types of elements into a literal matrix/grid by party, site, and element type (i.e. 1st or 3rd party, current site or e.g. js.stripe.com, cookies vs images vs scripts vs frames etc.) so you can, say, only allow cookies from the current site but block them from gstatic.com, block the javascript from analytics.cloudflare.com, and choose to allow embedded frames from hcaptcha.com, all by clicking cells in the grid.
I do the same with NoScript. It doesn't take much: just whitelist the site I'm on if it doesn't work the way I want, and if I'm worried about a payment on some site I can temporarily allow everything on that tab.
That plus adblock makes the web almost... enjoyable.
I'm in the same boat (mobile + desktop), and agree.
One needs to get through the initial setup for the 80% most-used sites. When that is done, you do way less tweaking in uMatrix. It improves speed, responsiveness, privacy, data usage, and battery drain for me on my mobile. For quick one-off article reads I load the page in Firefox Focus.
That’s the feature I really like in Brave, which stores that setting persistently per site in a convenient side panel. For things like unoptimized news sites it’s the difference between 10-30 and <0.5 second load times on the subway.
I've been trying this every few years since browsers started supporting JavaScript and allowing users to disable it (back in the days before extensions and config options), and every time more and more of the web breaks without JavaScript. 100% of the websites I need for work use JavaScript.
Most sites are still usable if you cherry pick a subset of the JS they invoke. Blocked trackers like Google analytics or tag manager rarely break things.
Except for sites that I use (internal websites for several vendors), and that is just to manipulate widgets, editors, and forms, and in one case, operate remote machinery. What methods for cherry picking do you like? I've only done all or nothing.
I use NoScript to whitelist a minimal set of what needs to be there. I won't ever allow analytics or doubleclick scripts. When I pop into a media site with 20+ third party scripts I just say nope and spend my time somewhere else. For basket case sites needed for work or e-commerce that are too fubared to work with script blocking I have a clean Chrome browser. None of those sites get to have tracking cookies from my main browser instance.
I HATED sites that used custom fonts when I lived in the country.
There are only two ways loading such a site goes:
a) you get to stare at a blank page for a couple minutes while the font downloads, if it downloads at all. (this seemed to be google's favorite way to screw over people with bad connections; acknowledgment that it's going to suck and doing it anyway)
b) you get two minutes into reading the page only for everything to suddenly jump somewhere else when it switches to the font. (this seems more like the result of ignorant web designers)
JS sucks but webfonts consistently make the web an awful place.
Re: (b): I know there’s a MathJax clone that works around the constant reformatting by precalculating the sizes of the various blocks and putting that into the HTML. So your browser basically makes a big empty placeholder box that the renderer will render into.
It’d be neat if something like that were available for web fonts. Each paragraph gets its own “block” that the installed font uses (and contains padding), and when the font downloads, it won’t reflow. But I’d assume it’d be way too complicated to deal with (different screen widths is the first to come to mind).
The easiest fix is to pick a native fallback whose metrics closely match your web font. You might even have to use a different font-size and letter-spacing while the web font is loading.
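A minimal sketch of that approach using the newer metric-override descriptors (the font names are placeholders and the values are illustrative; they have to be tuned per font pair):

```css
/* A metric-adjusted local fallback, shown while the web font loads,
   so the eventual swap causes little or no reflow. */
@font-face {
  font-family: "Example Sans Fallback";
  src: local("Arial");
  size-adjust: 97%;
  ascent-override: 105%;
  descent-override: 28%;
}

body {
  font-family: "Example Sans", "Example Sans Fallback", sans-serif;
}
```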
It's also likely to be implemented as a ton of JS to download, which would either block rendering (see option (a) above), or apply after the content loaded (see option (b)).
So, a bunch of work to fix something that isn't broken when you turn off webfonts and respect user settings.
Web fonts are a big one in terms of making various websites feel fast because, when implemented incorrectly, they massively slow down initial text paint.
With that said, they're a fantastic feature but are also way overused. You don't need web fonts unless your website is either heavily type-centric, or has font-related dependencies (such as embedded content of some kind). Icon fonts are also a neat use of web fonts, but they don't make the page "feel slow" when their rendering is slow, because icons loading slower affects our perception of a page's loading time less than its text.
On multiple projects I've used a tool [1] to closely match a built-in font to a web font in order to present content faster and avoid FOUC (Flash of Unstyled Content/Text). In one case, I replaced webfonts entirely with native platform fonts, purely for speed.
With this approach you can be more practical with your goals. If the design cannot be compromised, you can still get to initial paint quickly and let the WOFF files load in as needed.
Edit: If anyone is looking for more information about this, the post [2] from this tool's author from 2016 does a good job of discussing the tradeoffs.
In addition, not all users are going to want to use whatever font you (the web designer) decided on. I have my preferred fonts on my system, and I expect my browser to use them. How arrogant of the web developer to override the user's preference!
I remember when the browsers even allowed the user to configure a background image for content. You could have all your web sites render over a tiled picture of a brick wall if you wanted. Year after year, we give up end-user configurability and simply hand it over to nameless web designers who decide what is good for us.
> How arrogant of the web developer to override the user's preference!
I imagine this is partly tongue-in-cheek, but I think the real difficulty here is that CSS doesn't differentiate between "user-configured sans-serif font" and "the system fallback sans-serif font". As a result, you can't allow users to override your font choices without disregarding custom fonts entirely.
I've sped up sites before by using inline SVG rather than icon fonts - downloading an entire icon font for a few icons is inefficient when the actual SVG needed for those icons is <1 KB and is waaay faster as inline SVG.
Remarkbox does this with the default profile avatar, just-in-time dynamically rendered and inlined SVGs.
I've considered storing each default SVG into the user table and then inlining in the HTML but decided to wait on that optimization since inlining dynamic SVGs itself was such a significant speed boost, especially on threads with hundreds of comments.
Without inlining, each profile image or SVG would be a separate parallel HTTP request, and often at least a couple of requests would be slow.
1. If you use, say, 5 icons and there are, say, 250 icons in the full font, then you would need to download the page 50 times to just equal the bandwidth. That's assuming the cache doesn't expire, or it's not an incognito session, or you're not loading it from a CDN where it won't get cached anyway.
2. If you do find this irresponsible, then by all means create a small SVG file with the 5 icons in it and allow that to be cached, or even 5 small individual SVG files if it comes to that.
In my experience it's not about a website being "heavily web-centric" (which I'm not sure I understand to be honest - surely websites are by definition "web-centric"?) it's more about being able to match brand fonts. If sites didn't do that they'd end up feeling pretty weird because the crossover of installed desktop fonts is pretty low: http://www.ampsoft.net/webdesign-l/WindowsMacFonts.html (There used to be a better site with fonts listed by what percentage of Windows and MacOS systems had them installed, but the last time I saw it was a while ago and I can't seem to find it again.)
Unless your font exactly shares metrics with the ones the user has installed, the layout will shift. That's something you can't know, so there's no way to implement this correctly.
I can't recall the details without looking into it, but there are a lot of gotchas when implementing web fonts. You have to ensure it's using fallbacks, that it can use them before the target fonts have loaded, and that the fallbacks are close to the web font in style; and there's a bunch of CSS tricks you can use so that a repaint doesn't end up reflowing everything.
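One of those tricks, sketched here with a hypothetical "Example Sans": font-display: optional sidesteps the reflow problem by keeping the fallback whenever the web font isn't ready almost immediately.

```css
@font-face {
  font-family: "Example Sans";
  src: url("/fonts/example-sans.woff2") format("woff2");
  /* If the font isn't cached or downloaded within a very short
     block period, the fallback is kept for this page view:
     no late swap, no reflow. */
  font-display: optional;
}
```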
> I've been experimenting with having custom fonts disabled (that is, fonts I don't have locally), and it feels to me that page loads are much faster, and in some cases more readable (pages feel 'cleaner'?).
That might be due to font hinting. The default fonts (on Windows) were beautifully manually hinted to give really crisp results without font smoothing. But manual font hinting is super expensive, so web fonts don't do it. If you disable them, I'm sure the system has to fall back to the (higher-quality) defaults.
Most web fonts are automatically hinted, so they only look acceptable with font smoothing on (at normal font sizes with normal DPIs). If you turn smoothing off, the fonts look super ugly with all kinds of artifacts like weird double-thick lines.
I've tried to disable web fonts because I hate font smoothing (to me it's just blurry, not smooth), but the issue I've run into is the fad of using a custom font for icons, so I end up getting lots of unicode tofu for graphics (e.g. https://twitter.com/FakeUnicode/status/1194628430559469568).
> That might be due to font hinting. The default fonts (on Windows) were beautifully manually hinted to give really crisp results without font smoothing.
I used to work at MS, and a lot of people don't realize exactly how much time, money, and love is put into fonts there.
When my team ordered custom fonts, I personally spent a lot of time going over each character pixel by pixel, in combination with many other characters, to make sure everything was perfect. I know other members of my team did the same. It wasn't some sort of mandate; it was institutional knowledge, or maybe pride, that Microsoft was paying for custom fonts for our project, and we were going to make good and sure that what we delivered to customers was perfect.
I know there is a holy-war between people who like MacOS's font rendering and people who like Microsoft's font rendering, but I hope everyone can respect the amount of pride that both companies put into their work.
Just want to say that I really appreciate you and people like you. I can spend hours looking over and comparing fonts I use because small problems can really bug me, no pun intended.
I wonder how hard it'd be to create an extension that renders all alphanumeric chars in a font of your choosing, and all other characters in the font the webpage wants.
It'd be more overhead of course, and basically the opposite of what this thread is about, but it's something I'd be interested in. I'd love to be able to choose a font without things getting mangled.
> I wonder how hard it'd be to create an extension that renders all alphanumeric chars in a font of your choosing, and all other characters in the font the webpage wants.
I'm not too familiar with the details of the icon fonts, but my bet is that most of those characters are in the private use area.
I guess it boils down to whether an extension can hook into the font-resolution process, to have different resolution policies for different character ranges. I know there are extensions that can block web fonts, but maybe they just prevent the resource from loading at all.
The "cleaner" feel may well be to do with better font rendering for desktop fonts, because you'll be using something like TTF rather than WOFF. Web font files have to trade off file size for quality, which desktop fonts don't have to do.
Well, TIL about it being a container. But do they really contain an identical amount of information for font rendering? I'm no font expert, but I do know that you can have more or less information in a font file that affects how it's rendered.
Yes, it's just an OpenType file. The usual omission is subsetting a bigger font to include only the glyphs you need; usually you don't mutilate a font by removing rendering data from glyphs.
Yes, it's just a container for the exact same data that an OpenType font holds.
On the other hand, it's possible that in the course of deployment, various things may be dropped from the original font resource: not just subsetting (reducing the repertoire of supported characters), but in some cases also OpenType features (e.g. ligatures or contextual forms), and in some cases hinting is stripped.
If that is done, the webfont version may indeed appear differently than the original "desktop" font, depending on local font-rendering settings.
Mea culpa, I've done this once before. I work on a small 4k screen so I couldn't tell the difference, though others pointed it out and I fixed it. I was trying to eke out that last bit of performance.
Most free fonts are rather poor quality. Most font designers seem to publish free web fonts as an advertisement for their services or the print fonts they sell. As a consequence, while the sample sentence might look great, the reading experience is poor. Screen optimisation especially is lacking, which is what distinguishes the usual fonts you get with your OS (like Arial or Roboto) from stuff you would buy for print (like Helvetica). Also, people tend to pick fancier fonts to set themselves apart, which of course decreases readability even with very good fonts: you can simply read fonts faster if you are used to them; uncommon ones slow you down and feel uncomfortable for long texts.
You’re underestimating how much work and skill it takes to make a good font, even for a great designer, so type foundries like to lock good fonts up behind rent-seeking license agreements.
Rent-seeking would be if Google demanded money from a font designer to let their fonts be "compatible" with Chrome. I.e. Pay me or I'll make sure your product doesn't work with my product. App stores is a common example; pay Apple or they'll make sure your software doesn't run on the iPhone.
Some features of the license agreements are, though.
For example, some foundries require you to buy a special, expensive license in order to use their fonts in a PDF. IIRC, Hoefler & Co. and Emigre do this (though it's been years since I've looked, so they may have changed this). Their rationale is that since it's possible to extract a font from a PDF, the license needs to cover any font piracy that results from that.
(never mind that anyone who wants to pirate the font would just go and find a torrent instead of ripping it out of a PDF... and there are torrents of every notable foundry's entire libraries)
You may not like it, neither do I. But what you have there is just rent plain and simple, not rent-seeking.
Rent-seeking is when a third party uses force to claim profit from an agreement they're not actually party to.
For example, you rent a business property. You pay the landlord and agree to their terms. Next week the mob shows up and demands protection money to "keep you from harm", implying that harm will surely come to you unless you pay. That's rent-seeking.
The app store analogy fits rather nicely I think, but it depends on your viewpoint.
> Rent-seeking is when a third party uses force to claim profit from an agreement they're not actually party to.
The agreement is between you and the law in most countries, which says that typefaces are exempt from copyright. The third party are foundries who license fonts to you and charge you more based on how popular your website is.
A typeface is an artistic work and as such is certainly covered by copyright in "most countries". If they were not, then why would license agreements even be a thing?
If you don't wish to agree to the license terms of a particular typeface, you have the option to choose another typeface instead. This includes the myriad free typefaces that exist today.
If you buy something from me or strike a license agreement with me, I'm not a "third party" to that transaction.
The most usual arrangement is that you'll have to pay a certain price up front (or monthly), and some more for each access. The odds are also good that you won't be able to host the font yourself, because, well, the foundry has to count those accesses, doesn't it? And if you want the foundry to respond at all quickly, you'd better pay some extra.
The problem isn't designers, who of course need compensation for their work, but the need to build up an e-commerce stack, per-volume licensing infrastructure, and publicity, leading to "platforms", verticals, concentration, and most commercial fonts being distributed by just two companies (Adobe and Monotype). In other words, middlemen taking all the money away, when easy self-publishing was the entire point of the web.
(Though I've heard Adobe Stock and iStock at least works ok for photographers and graphic artists for now)
It's rent-seeking if, instead of selling you the font once under a mutually agreeable license, they charge you an ongoing fee to "use" the font under conditions that the foundry controls and can change without re-negotiating the contract. This is especially true if the conditions are "you can use it for purposes x, y, and z, but if you want to use it for a, b, or c you have to pay a higher rate".
What do you mean? If you agreed to the contract when you rented access to the font, you're subject to all its conditions. If you don't want to pay the additional rent, your choice is to break the contract and lose access to the font, or pay up.
Maybe it does. It isn't poor quality per se; on first contact those fonts look great. Only prolonged usage will reveal their flaws, and then only if you know what you are looking for. Most buyers won't be so sophisticated, so it can work. Actually, lots and lots of products are sold in such a superficial way.
The cleaner look is because most designers use macOS which renders most fonts in a way that makes them all look good (let’s not get into technical details). On Windows the same fonts might look like shit (for example some parts of a character are thicker than the rest for no apparent reason)
Well, "good" is relative. macOS renders fonts in a weight-and-form-correct way, which however makes them look blurry. Windows renders fonts in a weight-and-form-incorrect way, which however makes them sharper and easier to read on screens, especially at resolutions below 200 dpi.
Now if you use a high-resolution monitor and print-optimized fonts, MacOS will look better. If you use a low-resolution monitor and screen-optimized fonts, Windows will look better. Problem is, most designers come from print and use expensive Mac gear, so they are designing for a minority, their peers, not their users.
Designers primarily designing for other designers is a core problem with the entire field of design and it holds back a lot of society as it seems to really encourage apps having super low information density displays with slow animations.
> This can't still be true. No one comes from print anymore.
It is. The real problem is that a large number of designers like to have a pixel-perfect experience (which in print isn't really an issue, as they know what paper the publication will use).
> I think macOS will always be optimised for the best experience.
It is probably better to say that Apple will design for their intended market. I was using Apple products long before HiDPI was a thing and was never particularly happy with the fuzzy fonts under Mac OS X. While that may be acceptable when designing for print, since accuracy is more important, it is an annoyance in many other circumstances.
It's actually incredible how ugly fonts render on Windows. After using Linux for so long, I thought I was experiencing a bug when seeing fonts on Windows.
We probably get used to whatever font rendering the systems we use the most have.
To me, fonts indeed look blurry on macOS and weird on Windows. Screenshots from Windows with colored pixels particularly don't look good to me. Though after having seen screens of people using Windows 10, it seems they changed whatever I really didn't like about font rendering on Windows.
I still like font rendering on GNU/Linux (and Android) better when it is correctly configured, which it is on widespread distributions with widespread desktop environments (I use a QHD screen). But it could be out of habit. On some configurations, kerning is bad though.
Perhaps the screenshots you're seeing have ClearType (i.e. sub-pixel anti-aliasing) turned on, and when it got to you (as a picture) the subpixels no longer match on your screen? For example, if one of the screens involved was vertically oriented…
Categorically not true in Firefox with default settings, and usually disabled by websites in Chrome by using the CSS property "text-rendering: optimizeLegibility" (unless you are somehow on macOS, in which case the default preference is to optimize for precision system-wide anyway).
Have you tried something like uMatrix? If you use it in the denylisting mode, it doesn't seem to impact much on my web browsing experience. By default it is in allowlist mode, which means it's speedy to open pages, but you run into trouble quite quickly. I really can't stand browsing the web without it.
What is the maintenance status of uMatrix? I haven't checked recently, my most up to date info is that Gorhill left it due to too many bogus issues filed on Github.
You'll find that disabling images over 50KB has a similar effect, as does disabling all third-party JS except for a select few CDNs if you're lazy. In the end it's three parts performance to one part online safety.
Tufte would disagree, and at least some consider his thoughts on design to be very solid.
Would you be amenable to "A well-designed web page does not need to rely on custom fonts, provided their impact on page load is absolutely minimal"?
You can unobtrusively serve up a single font embedded as base64 on a static site - a world of difference from some of these sites that load up 5MB of a UX person's vision off some remote CDN.
I did an experiment a few months ago where I just switched off font overrides and set my default font to Comic Sans, at a pretty large default size (and I tend to zoom in a lot, too).
It actually was quite pleasant and readable, despite the particular font choice.
As a guy who does this blocking pretty ruthlessly (edit: inc. cookies), I'd like to understand why - I don't take offence at 'sneer', ISWYM but I'd like to understand the other side of the fence. TIA
sorry, 'sneer' is probably a bit strong - it's really just a lack of understanding on my part (of your side of the fence).
I personally find browsing the internet without javascript or cookies enabled by default to be incredibly frustrating/inconvenient, but I'm sure it's all very subjective, down to your own personal browsing habits.
Overcoming frustration regarding ads, performance, nag screen, consent screens, poor design, various hijacking, tracking, clickbait and otherwise poor content is very personal in my experience.
Some people even invent new protocols (Gemini comes to mind) to circumvent some of these frustrations.
Thanks, and I understood the word was used for dramatic effect only.
From my POV I too find it frustrating at times, but I'd say much of that comes, quite unnecessarily, not from JS use but from its abuse, e.g. failing to display text unless JS is enabled. HTML is pretty good at that!
There's no point in me boring you with all the plusses of not using JS (just ask), but speed... once a connection is made, this http://antirez.com/latest/0 is pretty much instantaneous.
I found it frustrating and inconvenient at first. Now when I try to use a browser w/ default Javascript settings I find that I'm even more frustrated than when I started using NoScript. Websites left to their own devices to run arbitrary code on my machine are almost universally terrible to use. Local news sites are, by far, the worst. Lots of sites are terrible though.
So true! So many sites use those icons/symbols in the nav etc so their sites end up broken without web fonts. Other than that, it feels so much better and faster blocking web fonts.
I've had this as well for quite some time in uBlock Origin - and it works great. Most websites don't need custom fonts, and uBO remembers the ones where I re-enable them back.
Fonts are large assets (often larger than a full bleed banner image), so they do add to page weight. They also “swap” by default, replacing a local font after loading, which adds to the perceptible load time. There are other techniques that may feel different depending on usage but unfortunately there’s no silver bullet.
My favourites are sites that load the layout, the ads, the analytics, then the fonts and only then the actual text of the page. If the font doesn’t load, the text doesn’t load — and if you have JavaScript turned off just a blank layout with ad-sized blocks waiting for something to happen.
I tend to disable page fonts and use my own selection instead because I prefer to have anti-aliasing off. Fonts that were designed before ClearType look great with AA off, but ClearType fonts usually look awful with AA off because of the lack of hinting.
I have a regular 16:9 monitor, 2048x1152, in which I have Windows font scaling set at 150%.
The reason I prefer AA off (with fonts that were designed for it like Verdana or Bitstream Vera Sans Mono) is because I find it less tiring on the eyes. With AA off, the edges are all sharp and crisp, there's no blurriness. It looks closer to print. When AA is on, they look smeared and don't have clearly defined edges.
What is also a puzzler: as displays become increasingly high resolution, now 4K and heading toward 8K, the need for AA is supposed to disappear. You don't need to fake smoothness at high pixel densities, yet many OSes are actually now making it harder to turn AA off.
It gets dumber: OSes that used RGB subpixel AA (ClearType) are now on tablets and other devices that rotate, so they can't count on the RGB stripe, and they've gone back to greyscale AA.
I have most blocked as well through my ad blockers. The biggest annoyance is when they use fonts for icons instead of images. I’m not sure where that trend comes from, but it’s obnoxious.
> I’m not sure where that trend comes from, but it’s obnoxious.
(1) The options for controlling the style of SVG elements (e.g. on hover) are not great.[0] In contrast, font icons can be styled as easily as any other text (see the sketch after this list).
(2) Font files are usually smaller than their SVG counterparts.
(3) Font rendering is faster than SVG rendering.
(4) IE did not support SVG until version 9, but supported custom fonts since version 5.
(5) Old browsers that did support SVG had serious issues with SVG sprite maps, ranging from creating duplicate rasterisations of the entire SVG file in memory for every use of the same file in a document, to not caching the rasters at all and regenerating them every time they needed to be repainted (e.g. during scrolling). These are fixed now, but inertia is what it is.
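To illustrate point (1), a minimal sketch (the icon font name is a placeholder):

```css
/* A font icon is just text, so it inherits and transitions
   text properties with no extra wiring. */
.icon {
  font-family: "Example Icons";
  color: #666;
  transition: color 0.2s;
}
.button:hover .icon {
  color: #fff;
}
/* An inline SVG only behaves this way if the markup opts in,
   e.g. <path fill="currentColor" ...>, so that color can cascade. */
```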
In Firefox, setting gfx.downloadable_fonts.enabled to false seems to do the trick. You can verify by visiting https://fonts.google.com/, which looks a bit dull after disabling.
In Firefox, disabling "Allow pages to choose their own fonts" does not completely block web fonts (as setting gfx.downloadable_fonts.enabled to false would); what it does is to prioritise the default fonts, and move any page-specified fonts to the end of the fallback list.
This means that for "normal" text, the browser's default fonts will be used, but if there are Private Use Area character codes that are only supported by the site's custom webfont, it'll still get used for those. So icon fonts that are based on PUA code points will still work.
What doesn't work in this scenario is when the icon font doesn't encode its icons in the PUA, but uses normal Unicode characters -- either symbol code points that are supported by the system's default fonts, or as some icon fonts do, regular English words that then use ligature rules to produce the icon glyphs. (So the icon font contains OpenType mappings such as "phone" -> [phone icon].) In this case the content on the page will end up rendered using the default font instead.
[Edited: HN doesn't let me include an actual phone icon there, apparently.]
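For illustration, a PUA-based icon font is typically wired up something like this (a sketch; names are placeholders). The unicode-range is why the fallback-priority setting leaves such fonts working: they can only ever apply to PUA code points.

```css
@font-face {
  font-family: "Example Icons";
  src: url("/fonts/example-icons.woff2") format("woff2");
  /* Restrict to the BMP Private Use Area, where the icons live;
     ordinary text can never be rendered with this font. */
  unicode-range: U+E000-F8FF;
}
```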
For me personally, I've only noticed this problem once, though I can't remember which site it was (must be one I don't visit often); obviously your browsing habits are different to mine.
I don't see any disadvantages to giving it a try yourself for a while tho' - it's just a single checkbox setting.
That'd break sites that use ligature-based icon fonts, with the actual content being an icon identifier like "thumbs-up" and an OpenType ligature rule in the font to render this as an icon glyph.
Could you just get the icon name in that case? For me it would be good-enough, or, in many cases, better than the crammed design-first icons (Slack, I'm pointing at you!).
In theory, it's legible; but often, rendering the icon name (which may be up to a dozen letters or so) will totally break a layout that was designed to accommodate a single glyph.
That's a MAJOR hit to design and branding for a very small hit in page weight. Properly configured webfonts add almost nothing at all. There may be more business value in the 500ms you save vs branding (absolutely possible) but it's a tradeoff.
> and it feels to me that page loads are much faster, and in some cases more readable (pages feel 'cleaner'?)
More readable, and also helps one focus on the content. A lot of web publications with literary pretensions (New Yorker, Quillette, LRB …) use pretty classic-looking externally loaded serif fonts for their articles, which not only slows down reading but can also (IMO) make one less critical of the content than one otherwise would be.
If I were the editors of these magazines, I would be seriously tempted to force the use of Comic Sans for all article content. It may not be good from an advertising point of view, but it signals a kind of intellectual honesty that I suspect is valued more highly by their readership than a "classic look".
Maybe I'm being naïve, but why don't browsers ship with more popular fonts bundled to avoid problems like these? I have no issue with any of my browsers taking up fractionally more space for an extensive cache of the most popular web fonts when the payoff is better performance on so many sites.
Of course, servers should still be capable of serving the fonts that the client requires to render the content correctly, but the browser should be equipped to make that happen as quickly as possible.
I've always thought the same. I see the same web fonts being used time and time again. It just makes sense for browsers to have an intelligent "permanent cache" - not based on your specific history but a common set of frequently sourced fonts, libraries, etc.
This would be especially helpful now that shared caches are going away.
One other option is to actually install the font into your OS. Any correctly-coded website will try a local source first, which should help a lot for poor connections.
> Then Google Fonts is not correctly coded, as it'll use the remote font only.
For a good technical reason: the version installed on the computer may not match the version Google Fonts serves. Fonts notoriously have no semantic versioning (aside from some programmer-oriented fonts); examples include Segoe UI (which changed between Windows 7 and 8), the Liberation fonts (some versions notoriously lack certain glyphs), and, on the Google Fonts platform, the Exo and Exo 2 problem (which was resolved by renaming the second version to Exo 2).
Of course, there are also some benefits to Google (you know what those benefits are).
> Note that when browsers render websites that use the Google Fonts API, they will check if a font is installed locally on your computer, and prefer to use the local version over web fonts.
I've definitely seen use of src:local(...) in the CSS served by Google Fonts in the past, but they may have changed things, or perhaps it depends on details of the request, the detected browser, etc. YMMV.
I just checked on a website I maintain, and it looks like Google Fonts behaviour has changed. It definitely used to prioritise local fonts, and now it doesn't. In a way, I understand, because local fonts aren't always identical to the online ones, leading to weird bugs that might not be noticed at first (especially in languages that don't use Latin characters). Fonts also get updated over time.
When you select fonts on Google Fonts, the website instructs you to link to CSS hosted by Google, which is simple to implement, but not performant. Really, you would want to load that CSS asynchronously, so that the rendering of the whole website doesn't wait for Google Fonts. (Here's how you load CSS asynchronously, by the way: https://stackoverflow.com/q/32759272/247696 ) Or even better, host the fonts yourself.
- 2. Everyone uses a different font (just like everyone uses a slightly different version of jQuery), so the cache hit rate would be pretty small IMO. Do you really want a browser to be gigabytes large? Especially on mobile, it's not viable.
Maybe if browsers standardized on shipping, say, 20 carefully chosen fonts (not necessarily the most popular ones, just a good variety of different types of fonts), the smaller websites would follow and use them; but I think any major brand likes to distinguish themselves and have a custom unique font.
For example, Google Maps by default doesn't load maps of the entire world when you install it. However, it does let you opt in to pre-downloading specific areas that you frequently travel to.
Browsers could do the same by simply adding an opt-in to download / cache common assets like fonts, jquery, etc.
> Maybe if browsers standardized on shipping, say, 20 carefully chosen fonts (not necessarily the most popular ones, just a good variety of different types of fonts), the smaller websites would follow and use them; but I think any major brand likes to distinguish themselves and have a custom unique font.
It may not have been carefully chosen, but didn't Microsoft do this already? (Core fonts for the Web, https://web.archive.org/web/20020124085641/http://www.micros...). The reason they discontinued the programme is that it actually cost them some money (as the fonts are not owned by Microsoft).
Also, how would you cater to non-LGC (Latin, Greek, Cyrillic) users?
You might be right, but the next question is fashion. Will the same fonts still be used in 5 years, or (more likely) will the new hotness have arrived? Should the browsers update the list each year? Then some fonts that were available suddenly won't be? Not so great.
Web standards and browser features are generally built for the long term and backward compatibility. I mean, it's not impossible to find a solution, but it's definitely not "let's download some fonts and bundle with the browser, done" kind of problem.
When you control 70% of the desktop market share, and you are releasing an evergreen browser that updates weekly, if not more frequently, any argument about not being able to stay on top of changes in the environment becomes rather silly.
Copyright and size: fonts are larger than you might think because they need full Unicode support and good fonts have tuned variants for different weights, styles, and sizes, along with alternate symbols and ligatures, etc. which all add up.
Self-hosting is the way to go since it gives you full artistic control and you can use things like the Unicode-range CSS property to tell clients how to combine smaller fonts so you can support many languages without forcing a French browser to download a ton of Chinese glyphs which aren’t used on the page.
Another nice option we have now is using font-display to allow using a similar system font until downloads are completed, which can work really well for slow connections:
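(A minimal sketch of the two techniques together; the font name, file, and ranges are placeholders.)

```css
@font-face {
  font-family: "Example Sans";
  src: url("/fonts/example-sans-latin.woff2") format("woff2");
  /* Render text immediately in a similar system font, then swap. */
  font-display: swap;
  /* Download this file only if the page actually uses these Latin
     code points; other scripts go in separate subset files. */
  unicode-range: U+0000-00FF, U+2000-206F, U+20AC;
}
```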
> fonts are larger than you might think because they need full Unicode support
Correct me if I'm misunderstanding, but don't the vast majority of fonts _not_ include full Unicode support, or even close to it? I know of GNU Unifont, but not many people are using that on the web...
I take your point on selectively downloading the relevant glyphs for the user's locale, but this could be done by the browser too - it is aware of the locale.
Full coverage, yes, that's relatively uncommon, but the most popular fonts will cover a wide variety of languages, and that adds up even if you don't support Chinese but do want coverage across the range of fonts designers want to use.
Because web tech (and Google in particular) is a mess of weird politics.
There are some things that are really well executed, and other things where the ball gets dropped completely.
Chrome will come along and do something like hide parts of the url, or allow scroll-jacking, but then say that fonts should probably be a system level implementation.
It would be trivial for the Chrome team to establish something like a series of web-safe fonts: Roboto, Inter, Merriweather, etc. Stuff that gets used incredibly often. Even things like licensing aren't really an issue when most of these fonts are under an open license, not to mention that Google has the money to license them if it wanted to.
Why can't users install the fonts directly on their OSes instead? If it's on the OS, browsers can pick them up and I can install only the fonts I want.
I seem to remember some consternation not long ago when Decentraleyes reduced the amount of caching they do because it could be used for fingerprinting...? I can't find any reference to this on Google though.
I tried LocalCDN after seeing a recommendation for it here, but despite claiming to cache more libraries, it was very clearly not catching as many requests as Decentraleyes, which I'm back to using now.
Mainly for performance reasons I've recently gone from using hosted/local Google Fonts to just calling the system ones (some variant of `-apple-system,BlinkMacSystemFont,Segoe UI,Roboto,Oxygen,Ubuntu,Cantarell,Open Sans,Helvetica Neue`).
Seems most css frameworks are defaulting to this now. The only problem is Ubuntu is a very ugly font that looks very different from the others.
At my previous job I was one of the only Linux users (using Ubuntu) and some designers were shocked how bad their designs looked on my computer due to the Ubuntu font. We removed Ubuntu from the list and just went with sans serif instead.
I like Ubuntu personally (we used it for the Cyph logo), but not as an alternative to sans serif. I mean it's great for some things like headers, but I wouldn't choose to read a whole document printed in Ubuntu; that sounds terrible.
Something like this is really popular, covers most cases and older systems/browsers and even emojis:
`ui-sans-serif,system-ui,-apple-system,BlinkMacSystemFont,Segoe UI,Roboto,Helvetica Neue,Arial,Noto Sans,sans-serif,Apple Color Emoji,Segoe UI Emoji,Segoe UI Symbol,Noto Color Emoji`
For brand-specific fonts though, I highly recommend stripping any characters/features you don't need from the font files, and then just serving the (now in many cases MUCH smaller) file directly.
Once you’ve cut it down to just the characters you need (say it’s the letters in your logo and slogan), could you inline that within the page itself (and how?) so it doesn't need an extra request?
Hmm, I just did a search for web fonts with data URIs
In this article: https://www.zachleat.com/web/web-font-data-uris/ Mr. Leatherman advises against it (because bulky web fonts block rendering for too long), but he has not taken into account rolling your own custom mini-font with just the glyphs you need.
So I think your idle musing may indeed have merit!
Perhaps those more knowledgeable and preferably with benchmarked evidence could care to comment?
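A sketch of what that could look like (the font name is hypothetical and the base64 payload, which would be the subsetted WOFF2 bytes, is elided):

```css
/* A tiny subsetted font embedded directly in the stylesheet, so the
   logo/slogan glyphs arrive with the page and need no extra request. */
@font-face {
  font-family: "Example Brand";
  src: url("data:font/woff2;base64,d09GMgABAAAA...") format("woff2");
  /* Scope it to the handful of characters actually subsetted. */
  unicode-range: U+0041-005A, U+0061-007A;
}
.logo {
  font-family: "Example Brand", sans-serif;
}
```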
And also, that you were at home (or the office, or wherever the IP address is) at that particular time.
So, also, location and daily habits: where you go, and so on.
(The tracking company probably has you associated with a bunch of IP addresses, and with a bit of fingerprinting they'll know it's you at that IP at that time.)
And the browser just checks to see if it already has the font. If you release a webfont you can ship it with signatures, or just generate the signatures yourself if the author hasn't done it for you.
This way if you're self-hosting Lato and the user already has Lato, they don't need to download it again.
That kind of content addressable naming based on hashes would be awesome for the web in general.
Think of all the big JS frameworks.
If they were all referenced by hash, their size would become pretty much irrelevant, as they would be amortised across potentially thousands of requests.
But the web chose to go with uniform resource LOCATORS for everything, and I don't think it'll be easy to change that mindset.
It's the reason people do crazy things like registering a DOI (a location-based URL, hosted with a single point of trust/failure) to reference git hashes (a content-addressed URN). They don't know better, and they want readable names.
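The closest thing the web has today is Subresource Integrity, which pins the content hash but still fetches by location; browsers only use the hash to verify, not as a cache key. A sketch (URL and hash value are placeholders):

    <script src="https://cdn.example.com/framework.min.js"
            integrity="sha384-...base64-digest-goes-here..."
            crossorigin="anonymous"></script>

In principle a browser could serve any cached resource whose digest matches, regardless of URL, which is exactly the content-addressed model described above, though the cache-probing concerns discussed elsewhere in this thread would apply there too.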
A different issue with this kind of caching, though, is that it is a potential side channel for privacy-sensitive information.
For example, if my script requests the PornHub logo, or something even more specific, like the videos themselves, I can use a timing side channel to figure out whether or not that particular item was in the user's cache, and thus visited previously.
IMHO they applied the wrong solution to that issue: domain-separated caches. You lose the timing channel, but you also lose all the caching advantages.
A much better solution would have been to close the timing side channel itself. If every browser ran TypeScript, this could actually be a monad, where a "low-resolution timer" monad strictly separates timing-sensitive portions from high-resolution timers.
If you had a generic rule like "this hash must be seen on 10 domains before you start using it for lookups" wouldn't it potentially disincentivize most tracking?
It would take a while for the cache to warm up, but with typical usage that would not take very long. Your performance becomes a bit of a lottery for a few users, but on the whole you should see gains.
Shameless plug for two related articles, one about the privacy problems with Google Fonts and the other about self-hosting them (especially if they're variable fonts):
Yes it is...? Google Fonts is the service, not the fonts themselves. Check the "About" and "License" tab of any font on Google Fonts, they're not created by Google.
Google Fonts is also a website which is used by many to search for and compare fonts. The site conveniently packages font files together and provides a "Download" button. I've always thought of this website as "Google Fonts" and have always downloaded and self-hosted fonts.
A question: instead of re-downloading the same resources again and again, could we achieve privacy by adding a delay between requesting a resource and providing it? It could be the same amount of time it took to download it the first time, or some randomized value. It won't speed up pages, but at least it will save bandwidth and help with data caps.
You don't need a timing attack, as you could e.g. put a unique identifier in the resource you're delivering via the CDN. When the browser serves the resource from its cache for another domain, you simply extract the identifier from it again to achieve cross-site tracking.
This is actually valid. Without a third party to ensure the neutrality of the resource, this would be an unregulated cookie.
A better trap would be if the browser downloaded the resource on the second request as well and compared the hashes. If they differ, the resource is considered uncacheable.
I've read your comment 3 times, and I'm still not exactly sure what you are suggesting, or what the purported benefit is? (it's probably my poor interpretation, but I'd appreciate if you wouldn't mind expanding :)
Site A can't see the cached resources of site B, because it would show that the user has visited site B if the resource is provided too fast. However, if the resource is provided with some delay, then site A assumes that site B hasn't been visited.
The current solution is to download the same thing twice. That is wasteful, and I propose downloading it only once; but if site A asks for a resource cached for site B, the resource should not be returned immediately, only after a few ms, so that site A draws the wrong conclusion.
It should work if site A can observe only the result of the request but can't check if a request happened or not.
Ah, right, I understand your point now. When sites determine if something is cached or not, are they doing it purely based on the time taken to load it?
The idea is to prevent profiling based on which fonts or versions of things are already cached: the browser behaves as if they are not cached, while in reality serving the local file with a simulated delay.
This happens somewhat already, in that the timing APIs in JavaScript are deliberately kept low-accuracy. But these attacks have gotten more sophisticated over time, and any random noise can be filtered out.
I thought that kind of timing attack targets the CPU cache, not the web cache. A request might take 100ms, so that much inaccuracy would regress lots of functionality.
My idea is for such delays to be introduced transparently in the caching service of the browser, not the js engine.
I'd also point out that Google CDNs don't work in China, which makes your website broken for 1/5th of the world's population. (Not to mention you're leaking your users' info to Google, which is a dick move.)
They also don't work with, well, Chinese, or most Asian languages for that matter. Also, most of the population in Asia doesn't consume the web in English. So you are definitely not missing 1/5th of the world just because your fonts don't render the same; you are missing them because of no localization.
If it's your knitting blog then maybe it doesn't matter; it really depends on the content. If you're selling software then it matters: they purchase software licenses like everyone else, so you're losing clients. If it's a code-related thing then software developers in China will also look at it, just because there are never enough programming resources for obscure issues. They just might not post issues and "engage".
My perennial annoyance was how StackOverflow was always subtly broken in China because they refuse to self-host some of their JS. On my own website I noticed that the default MathJax CDN was also impossibly slow. Fortunately you can just fork a copy on GitHub and use that, which was much snappier. GitHub-as-CDN.
"They purchase software licenses like everyone else" no they don't, in countries where English is not common they favour software in their language, willing to give up a margin in software quality for it being localized. So they don't purchase "like everyone else".
I am not in China but in Japan, and I'm tired of seeing second-class clones of US companies that strongly win the local market just because they are localized.
Just by translating their websites, Western companies would be able to win a lot of the market. And if they can also localize the product, so much the better. But if they do neither, as long as there is some sort of local alternative, your software is dead in the water.
I don't think China and Japan are alike in that regard. At least that wasn't my experience in China: I personally know of licenses being bought for things like Matlab and TouchDesigner and whatnot. There are no local equivalents, and local brands are not very highly respected. In my very short interning experience in Japan, things were much more insular and people liked their own Japanese brands.
... also they just curate their fonts well and you don't have to wade through piles of shit fonts and "your download will begin in 15 seconds" like other font sites.
Also themes. I've downloaded/purchased about a dozen HTML/CMS/static-site themes for my own projects over the years, and modifying them to get rid of Google Fonts is the very first thing I have to do. With various JS/CSS libraries there's a chance they'll be self-hosted; fonts are always loaded from Google Fonts.
I then have to occasionally re-do that work when there's a significant update to the theme that I want to have. It's really annoying.
When I was converting image-based game banners (they showed how many people were in the server and shit) to a JavaScript solution that used a single JSON for all servers (as we got to 9 servers, loading an image per server at page start and refreshing them every n seconds was getting wasteful), I ran into this issue.
I wanted to keep the look and feel of the PHP/GD monospace font (could never find out which font it used), but none of the fonts included by default that I could find had the same stylizing. Inconsolata on Google Fonts was pretty close, however.
Only issue: there would be this ugliness for half a second while the font loaded; the fallback monospace has bigger letters at the same font size (worse on Firefox), so the banners would often wrap text that was supposed to be contained to one line while this was happening. I couldn't preload the font because of cross-origin restrictions, and doing the thing that is supposed to fix that didn't work in Chrome.
So I just self hosted it, we already have cloudflare in front. Killed most of the delay, the preload tag killed the rest.
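For anyone wanting to do the same, the preload hint looks roughly like this (the path is illustrative); note that the bare crossorigin attribute is required on font preloads, even for same-origin files, because fonts are always fetched in CORS mode:

    <link rel="preload" href="/fonts/inconsolata-regular.woff2"
          as="font" type="font/woff2" crossorigin>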
Highly appreciated hint (original author here). Note that I haven't updated this project in ~4 years and it's still spinning on Heroku's free tier; I haven't spent a penny on API hosting costs.
Some recommendations regarding the CSS may be outdated by current "webfont hosting standards". Let's see what 2021 will bring for this project...
Logged in just to thank you for this excellent resource! We use it for all new websites and are gradually going back and retrofitting older sites using your project.
Wish it was this painless to migrate from other Google offerings!
The convenience is the major factor. When self hosting, you have to manually take into account stuff like font format, fallbacks for older browsers, preloading considerations, hinting or no hinting, variable or not variable, and manage the (often long) CSS yourself, and keep it updated as browser compatibility and technology changes. Google does all this for you with the inclusion of a single line. But yes, the drawbacks are loss of privacy and potential for stalling if the domain serving the CSS (googleapis) is blocked or slow for some reason.
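That single line being something like this (family and weights are just an example):

    <link rel="stylesheet"
          href="https://fonts.googleapis.com/css2?family=Roboto:wght@400;700&display=swap">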
> Why hasn’t w3c taken on some way to avoid the unnecessary complexity?
They did. Your browser will use the fallback when you use the "font-family" property.
But that's just the best the CSS spec can offer. Why?
Because the problem isn't a browser problem. It's a system level problem.
Operating systems only come with a handful of fonts. And those packs of fonts may differ per OS. Moreover, most fonts are protected by intellectual property rights. Helvetica isn't a free-to-use font. Its IP is currently owned by a company called Monotype Imaging. It was originally licensed to Apple, Xerox and Adobe for use. That's why Microsoft came up with Arial. [1]
> It was created to be metrically identical to the popular typeface Helvetica, with all character widths identical, so that a document designed in Helvetica could be displayed and printed correctly without having to pay for a Helvetica license.
If you want to use a font that isn't part of your OS, you basically have two options. Either you download a font published under a copyleft license; or you buy a license if it's a commercial font.
Suppose you want to use a commercial font on your website: simply adding it through @font-face could be a violation of the IP license. You really want to read the fine print of the fonts you buy. Yes, companies and people do find themselves in protracted lawsuits over illegal font use. [2]
Before Google Fonts, adding a font via @font-face was a protracted effort because of all this. You had to mess with @font-face yourself, and finding a nice, free-to-use font across the various font foundries was hard and laborious.
The big boon of Google Fonts is that it provides (a) a set of free-to-use fonts which are (b) quite well designed and, after a decade, widely used everywhere (just like Arial, Google's Roboto is widely popular) [3], and which are (c) extremely convenient to use since it's just a single line of CSS.
However, Google Fonts' dominance on the Web only masks the underlying complexity of font management [4] and the legal issues of using and distributing commercial fonts that aren't designed by Google.
So, the performance and privacy issues related to using Google Fonts are mainly a trade-off people are willing to accept in order not to have to deal with this complexity.
The old alternative is to use "web safe fonts" or "generic font families" which are presumably already available on the vast majority of machines via the operating system. Needless to say, this dramatically reduces the number of fonts designers can choose from. [5]
The other alternative is what the blogpost proposes: Download and host any font you use yourself. Provided that you do your own due diligence regarding rights management.
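In practice that self-hosted route usually boils down to a small block like this (file names and paths are illustrative):

    @font-face {
      font-family: "Roboto";
      font-style: normal;
      font-weight: 400;
      font-display: swap;   /* show fallback text while the file loads */
      src: url("/fonts/roboto-regular.woff2") format("woff2");
    }
    body { font-family: "Roboto", sans-serif; }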
No, it is not. However, the foundations of digital media are very much rooted in analogue media, including print.
That includes typography, whether it's for the web or for movable type. The history of typography is a crucially important part of the Web, with pioneering designers such as Jeffrey Zeldman, Jason Santa Maria, or even Edward Tufte working in fields such as information design and human interaction design.
These days you can just use WOFF2 and be done with it. It won't work for a small minority of older browsers, but I think that's okay for webfonts, as they're entirely optional and your site will work fine without them.
Every once in awhile I load up the Dillo web browser[1], an extremely basic browser with minimal CSS support and no javascript support. I am always floored by how fast some websites are, and saddened as the list of websites I can realistically visit shrinks each time I try it. (Hacker news works great). It is so fast compared to modern browsers that it almost feels like a page is loaded and rendered before I even finish clicking the mouse button.
1. Who relies on a font that may or may not have been downloaded by a visit to some other website as a performance optimization for one's own website?
2. If cache partitioning obviates the performance benefit of CDNs for fonts, doesn't this mean you're better off hosting your own scripts and images for the same reason?
3. If CDNs are past their use-by date, will performance measurement tools finally stop admonishing you to host your assets there?
While I'm still using Google Fonts, I inlined all the @font-face declarations into my own CSS files and was then able to remove the <link> tag (a blocking resource) from the head of the document. Lazy-loading images also helped, as did using WebM for video.
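Concretely, inlining means copying the rules the Google CSS endpoint returns into your own stylesheet; they still point at fonts.gstatic.com, but the extra render-blocking stylesheet request goes away. A sketch (the URL and range are illustrative of what the response looks like):

    /* copied from the fonts.googleapis.com CSS response */
    @font-face {
      font-family: "Roboto";
      font-style: normal;
      font-weight: 400;
      font-display: swap;
      src: url("https://fonts.gstatic.com/s/roboto/v30/....woff2") format("woff2");
      unicode-range: U+0000-00FF;  /* latin subset; the real response lists several ranges */
    }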
Why isn't font caching an exception to site cache partitioning?
Can caching (Google) fonts, in itself, be used as some sort of security exploit?
I mean, sure, a site can time whether or not you've already downloaded a certain font, but that's only useful to determine whether you've visited any of the sites using that font. Which is only exploitable if your sensitive site also uses a custom, nowhere-else-deployed font. That doesn't seem that painful security-wise, compared to, say, telling websites which browser you're running.
Regardless - I support less reliance on custom fonts and a slimmer web overall.
Wondering: let's say I am Cloudflare, Fastly or another popular CDN provider, already acting as a reverse proxy for the host. Would it make sense to rewrite calls to my own CDN, so that requests could be done in parallel from the browser side, and cache all the Google Fonts, JS libraries, etc. myself? It would surely need to be an optional feature that customers could turn on and off. But if we know performance is negatively impacted by third-party loading of popular libraries, would cloning them and rewriting the references make sense?
That's a really sad side effect of cache partitioning.
By the way, does anyone know if browsers are at least merging/deduplicating cached resources locally again (e.g. by storing cached objects by content hash)?
The same could theoretically apply to computationally intensive tasks like WASM pre-compilation (although care should be taken to record the compilation time and delay loading appropriately for cached loads from different domains, or that would be the next cache leak).
Caching was nice, but nicer still is not having to download the fonts, not having to mkdir public/fonts, not having to copy the files into that dir, not having to match the file locations in the CSS file, and getting a CDN included on sites where I didn't employ one.
Besides, one should load one or at most two fonts, not more, and I couldn't care less whether that 150kb is cached or not.
The caching argument is valid, for some browsers, but the "load time" one really isn't. If your HTML doesn't contain link prefetching (which is async) for your Google Fonts, and really for any linked asset that your page needs up front in order to even render "above the fold", that's on you, not on Google Fonts.
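If you do stay on Google's CDN, the commonly recommended mitigation is to warm up both origins early with preconnect hints (these are the two lines Google Fonts itself suggests):

    <link rel="preconnect" href="https://fonts.googleapis.com">
    <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>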
I've always wondered the legality of downloading web fonts from Google Fonts and self-hosting them.
At first glance it probably doesn't sound like a problem, I get it's effectively the same as serving them from Google, but I know fonts can also sometimes have strange licensing requirements… so I always figured it was safer just to avoid it entirely.
Most (all?) fonts on Google Fonts are licensed to allow free usage except selling the fonts themselves. Also, you can just download the font directly by clicking the "Download family" button on the font page.
"The open source fonts in the Google Fonts catalog are published under licenses that allow you to use them on any website, whether it’s commercial or personal."
Many fonts really do look nice. I have shamelessly "stolen" .woff-files wherever I could find them and the license allowed it. I think Google fonts are all free to download, but don't quote me on that.
I am usually inclined to vendor dependencies on the web. Not an ideal solution but I don't like dependencies on CDNs, even if they are indeed very reliable. But apart from making sure the set has symbols for special characters you might need, it is pretty safe to do so in case of fonts and maybe even advantageous.
I've also bought some fonts from other services, as they are quite cheap compared to the huge amount of work that goes into their creation.
I have yet to see anyone independently verify this, but it seems to me that Google Fonts sends different versions of fonts depending on the user agent.
Downloading a font and using it on a different operating system from where the original download comes from seems to make the font look... unusual?
Google acknowledges this themselves, and it's mostly for compatibility reasons: serving TTF to old browsers and WOFF to newer ones, for example. It's easy to verify yourself: request the stylesheet from Internet Explorer and compare it with the response you get from a Chromium-like browser; you'll be served different contents.
I'm wondering about commercial CDNs for font serving, such as Adobe's and Monotype's. Last I checked, these require hosting on their properties, or alternatively pinging their counter so that they can enforce their volume licensing. Only last year there were still type foundries that would license on a trust basis; but these have gone away or been bought by Monotype, such as URW+, who once created free fonts for F/OSS operating systems (in my home town, even). But isn't the practice of pinging or pulling from a third-party domain before presenting a consent dialog a GDPR violation?
Yes, it will be. The main reason, as with all these things, comes down to breaking human trust: tracking companies may send a small picture or a small JS file (that only contains a unique ID) which is randomly generated but continuously cached by the browser, effectively circumventing cookie policy.
Makes sense now that cross-site resources can't be shared (previously, a Google font downloaded into your local cache on another site could be reused, but it seems that isn't possible anymore).
Exactly. I use Google Fonts mainly to conveniently try out different fonts. It used to be even better, when you could select a bunch of fonts, and preview all of them on one single page. It's sad that feature is gone now.
This isn't an argument against using Google Fonts, but it is a wake-up call for developers and CDN vendors. A partitioned browser cache means the whole market for global cross-site CDNs is dead.
Yeah. That's the first thing I noticed. I was curious whether the OP went back to Google Fonts since posting this, or never made the switch in the first place!?
CDNs were the way to serve content fast not too long ago.
Looks like they now only make sense when your users and your web server are on opposite sides of the planet.
If you have a specific brand font, it's not going to be on Google Fonts anyway, so you're gonna self-host. And if you don't have a specific brand font, you should really just use web safe font stacks that leverage local fonts. System fonts look a lot better these days than they used to.
Basically, does anyone who's doing serious work use Google Fonts anyway? I just assumed it was a thing that mostly hobbyists and personal websites used.
I'd say it could be taken one step further, even: a package in, or part of, the OS that simply installs the 100 (or ten, I don't know the usage curves) most common fonts and keeps them updated.
Isn't that what's already happening in every OS for a very long time?
The problem that web fonts are addressing is that you have the same font across all platforms, so if some OS ship with additional fonts we'd be in the same position as before: Not everyone will have this package as part of the OS and you have an inconsistent look again.
It's infuriating to see so much web tech created in good faith during the '90s and '00s being discarded, just because the web was subverted into an app delivery platform (in this case, shared caches were obsoleted by timing attacks), plus the arms race of ever more cruft to counter the negative effects, insufficiently. The web may now be a platform to lock in and enslave users; it just isn't a recommendable place anymore for sharing documents, reaching out, and finding inspiration. We didn't need that; we had plenty of app platforms already.
Edit: you can still serve Google-commissioned fonts directly from your domain of course, and CDNs were always a bit fishy
How has the web been subverted? By whom? And which notable, useful old technologies have been discarded?
The capabilities of the web have only been expanded.
You can still build websites in Django and Rails and myriad other "old-school" technologies. What are you missing? Perl and CGI?
To the point of this article, you can perfectly build a website with self-hosted fonts, or with default system fonts. Is this what "lock-in" and discarding good ole' tech looks to you?
It's tiring to see this bitterly pessimistic, conspiracy-laden rhetoric on HN. With the added bonus that as of late it's being radicalized ("subverted", "enslaved"), as some sibling commenters are pointing out. How ironic that the people accusing everyone else of being "enslaved" are themselves carried away by the trends of the time.
But more importantly, this rhetoric is profoundly demeaning.
To web developers and technologists, because it assumes that they are largely incompetent or malicious, and that the commenter and his clique of retro-futurists know much better and are morally more pure than all tech professionals nowadays.
To regular people at large, because it paints them as drones and slaves to greater powers, with no agency whatsoever, instead of analyzing why they may prefer Facebook over old-school forums, and empathizing with them and their problems.
But above all, it's demeaning to the authors of such comments themselves, because it paints them as dejected, bitter fanatics who have lost sight of the forest for the trees.
The web is still full of opportunity - artistic, technical, commercial and otherwise.
If you don't think so, maybe it's time you embrace a low-tech lifestyle and devote yourself to gardening? Everyone will be better off for it.
This comment touches on some points which don't make sense from an engineering perspective. Different tools for different use cases. You can't throw the baby out with the bathwater and choose SPAs for every single application: they still perform poorly for SEO and for people with disabilities who need a screen reader. It's not black and white (good or bad). Also, using Django and Rails is hardly the same as using Perl and CGI.
What doesn’t make sense from an engineering perspective?
I did not say you should build everything as an SPA. I said you still can build in SSR - not once did I say what you should or shouldn’t do. But I agree - some sites are better off as SSR, static sites or what have you. What in my comment made you think that I think otherwise?
I also do not think that Django or Rails are remotely similar to CGI sites - what makes you remark that they are hardly the same, when my comment doesn’t suggest the opposite?
The DOM changing above or close to the current position does trip up many accessibility technologies. Most SPA frameworks load page components asynchronously, so the page DOM fills in out of order.
This is a problem even without accessibility tech: in many SPAs, the page content bounces all over the place during load by default.
Musk announces the newest brain link, implantation of which is a requirement to use modern phones. User complaints of random, altered perceptions of reality are brushed aside, and mostly accepted.
I flagged your post, so I owe you an explanation: this comment comes across as dismissive ("amirite") and is unlikely to generate good discussion. In fact, it's the exact kind of comment you would make if you wanted to flame-bait.
I'm a Canuck, and frankly most differences between US political parties are barely tangible. For many parts of the world, with real political differences, the US left and right appear completely identical.
Yet sadly, each side seems to see only the differences.
As a fellow Canadian, I think that was more true during the Bush/Gore debates than it has been any time since; becoming less true with time.
While the Democrats are somewhere right of our Liberals, and hover somewhere near our Tories, the Republicans are ... Something else entirely.
We don't have any party nearly as powerful and obstructive. Try to imagine the Bloc holding control of the Senate, and you may come to understand what it's like for the Democrats when they have to deal with the Republicans. The legislation they create and pass just piles up at the Senate door, while the obstructionists use every trick they can muster to undermine their opponents.
My post to which you are replying, directly discusses political differences, stances on specific topics, that sort of thing. Viewed externally, much of the world cannot see a difference here.
Put another way, the differences between US political parties are exceptionally tiny, when measured against the sameness of those parties.
Yet what you're describing instead is how radicalized, and how unable to communicate with each other, a party or those with opposing political viewpoints may be. And I see the same degree of "They're wrong, WRONG!" coming from both main political parties in the US. From people on both sides.
Well, that's the rub: Republicans aren't particularly attached to policy: they have been known to obstruct their own bills in order to undermine the Democrats.
I can't recall when last a Canadian party did such a thing.
Because of that I can't really find much merit in comparing the parties on policy. Public policy doesn't appear to be the primary motivator of the GOP. The Democrats may disagree with the GOP on policy and fight to alter legislation, but the GOP disagrees with allowing the Democrats to shape policy at all.
"Lacking any meaningful choice" on more locked-down platforms like the smartphone OS-es is a better choice of words. But when you get down to it and if you are into a bit more dramatic language, "enslaved" (with proper context attached) isn't that far off, no?
Yes, enslave. It might come sooner than you think, with developers so busy taking their trivial apps and dev workflows into "the cloud", where they soon find they're being monitored and otherwise taken advantage of by the likes of Atlassian and GitHub, at best resulting in their replacement by a much cheaper workforce or by robots. Compare that to traditional F/OSS or commercial software seeking to empower users, using workflows based on open formats and principles of site autonomy. For end users addicted to their smartphones, it's game over already.
How is this going to work, exactly? I take my app to the cloud, Github monitors me, and then ... forces my employer to outsource me to a robot somehow?
The tech is there: I can write static pages like it's the 90s and upload them to Google Cloud with gsutil, which is not much different from FTP:

    <h1>My awesome site!
    <p>Under construction
I've browsed without JS by default for the last ten years; it works fine, but it is niche. Today's web is so much more, though. Just look at YouTube, it's incredible. I recently started playing the ukulele, and there's so much material there.
It's about keeping Google Fonts (not saying goodbye!) but simply self-hosting them instead of using Google's CDN for better performance.
Seems like the author's trying to bait us by making us first assume Google's canceling the product, or that Google's done something scandalous so we should avoid it. :(
Yep, I assumed this was about another Google cancellation. Mods, would it be possible to change the post title to something mentioning font-loading or performance or self hosting? I can't really think of a title myself at the moment.
Even before browsers started implementing separate caches, I really hated the idea of using anything off of a Google CDN. You don't pay them. You're not their customer. There is no SLA.
Just host it yourself, or use a CDN you pay for. Why depend on an external service with no SLA, just for a little speed boost? If they're down, you're down. Break those dependencies on external sites when they're not necessary.
Correct me if I'm wrong, but a CNAME "redirect" on the DNS level does not change the page origin.
`fonts.gstatic.com` will only get the cookies set for `fonts.gstatic.com`, even if it is CNAMEd to a different domain.
Well, you can easily verify this by looking at your browser's request headers.
If the cookie were sent, you'd see it in there on requests to the font CDN subdomain.
(Hint: it's not.)
Wait, are you saying that if foo.example.com and bar.example.com are each CNAME records pointing at example.com (and the server has regular name based vhosts to isolate what is served for each), then browsers use a single cookie jar for foo and bar? So if the admin of the foo site and the admin of the bar site are adversarial, they can steal each other's cookies?
That doesn't seem right at all... I hope I am misunderstanding.
Now, if the admin of foo overrides the cookie scope to be *.example.com instead of the default foo.example.com, then yes it's expected that the admin of bar can steal foo's cookies. This is mentioned in the NextDNS Medium article you linked to, toward the end. This is definitely a "gotcha" situation but at least the default is safe, as far as I can tell.
It's false. CNAMEs are dealt with at the DNS level and are transparent to HTTP. The browser will use the origin the client connects with, which is fonts.gstatic.com.
I maintain a small hosts file project for blocking ads and tracking. I get so many complaints about blocking google fonts that I had to add a section to my README specifically for it [0]. I don't block google fonts, but I do block gstaticadssl.l.google.com and most blockers will block based on CNAME values now. Oh well.
gstatic.com cookies wouldn't be shared with the resolved name, would they? The cookie's site is the requested name, not the resolved CNAME; or am I missing something?