With the release of iOS 9 and its support for content blocking APIs, there has been an explosion of ad blockers that are proving only too popular with users. This has kicked off the long overdue debate about the malpractices of contemporary online advertisers, and the ethics of blocking said advertising. There have been numerous interesting perspectives on this issue, and rather than recounting them here, I urge you to read them for yourself:
- What Happens Next Will Amaze You by Maciej Cegłowski
- Ad blocking by Seth Godin
- The ethics of modern web ad-blocking by Marco Arment
- Ad Blocking and the Future of the Web by Jeffrey Zeldman
Instead, I wish to draw your attention to web fonts. A lot (if not most) of the ad blocker apps also support blocking of web fonts. Some restrict themselves to blocking font hosting services such as Typekit and Google Fonts, while others block all web fonts, self-hosted or otherwise. Designers, as some might have expected, have something to say about that.
The debate over content blockers is an important one for our industry. I just wish web fonts weren't caught up in it.
— Jeffrey Veen (@veen) September 18, 2015
As a designer “Block Webfonts” makes me… happy sad.
— iA Inc. (@iA) September 18, 2015
@iA Webfonts (woff) actually are lean in data: 15–30kb/font. Stupid to block them because that decreases reading experience.
— Linotype (@Linotype_com) September 18, 2015
So designers are not happy. But if you’re a user, chances are, you’re quite relieved (or even ecstatic) at the ability to block web fonts and experience a faster web. And web designers and front-end engineers have no one but themselves to blame for this.
How did we get here?
The web is primarily textual. Typography, thus, becomes an essential component of web design. CSS features such as @font-face allow us web designers to enhance the typographic quality of our designs by giving us the freedom to use any font at our disposal. This is a good thing. No two ways about that (and more of the same, please).
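A minimal example of such a rule (the family name and font file paths below are illustrative, not from any particular site):

```css
/* Declare a web font so it can be used anywhere on the page. */
@font-face {
  font-family: "Example Serif";                        /* illustrative name */
  src: url("/fonts/example-serif.woff2") format("woff2"),
       url("/fonts/example-serif.woff") format("woff");
  font-weight: 400;
  font-style: normal;
}

/* Use it, with system fonts as fallbacks. */
body {
  font-family: "Example Serif", Georgia, serif;
}
```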
But somewhere along the way, we forgot (or chose to ignore) that a user’s experience of the web is made up of a wide range of factors, of which fonts are an important but not all encompassing part. Smooth performance and fast access to content are just as (if not more) important factors.
Far too many websites, far too often, started producing situations like this:
The page has rendered and the content is available, but it cannot be read until the web fonts finish downloading. This behaviour is called Flash of Invisible Text, or FOIT.
FOIT is bad if you’re on a low-speed connection, because the text might be the last thing that becomes accessible on the page. FOIT is worse if you experience bad latency or lose network connection altogether and the fonts fail to download, leaving you with blank spaces where the text might’ve been.
There is an alternative approach: make the text available on first render using fallback fonts from the user’s system, then apply the web fonts after – and if – they finish downloading. This behaviour is called Flash of Unstyled Text, or FOUT.
Default browser handling of @font-face embeds has changed over the last few years, but web designers have more or less had control over choosing the FOIT or the FOUT route ever since web fonts gained popularity¹. Sadly, most websites chose invisible (FOIT) over accessible. They chose to hide the text until their preferred typefaces finished downloading, resulting in longer wait times for the user. They treated web fonts not as an enhancement but as a requirement, ceding the functional high ground for higher quality typography.
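One way to opt into FOUT rather than FOIT is a class-swap pattern: the stylesheet defaults to system fonts, and a small script (such as Bram Stein’s FontFaceObserver) adds a class to the root element only once the web font has actually downloaded. A sketch, with illustrative class and family names:

```css
/* Fallback fonts render immediately: text is readable from first paint. */
body {
  font-family: Georgia, serif;
}

/* A font loading script adds this class to <html> after the web font
   finishes downloading. If the download fails or times out, the page
   simply stays on the fallback fonts. */
.fonts-loaded body {
  font-family: "Example Serif", Georgia, serif;
}
```

The crucial property of this pattern is that the web font is layered on as an enhancement: nothing about reading the page depends on it arriving.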
Dear front-end developers, I’m on a train. I won’t mind a sudden switch of fonts, but I sure mind this endless wait for the fonts to load.
— Souvik Das Gupta (@souvikdg) December 29, 2014
It shouldn’t come as a surprise that people want to be able to read the news report in Georgia or Roboto, or carry on with their shopping in Arial rather than stare at a blank screen for the rest of their train ride. Hence their joy and support for web font blockers.
The sense of dismay amongst designers hints at a deeper negligence. Why are designers disappointed to hear that a certain feature might not be available to a certain subset of the people visiting their site? Is that not true for pretty much everything but the vanilla HTML we write? Or do we only concern ourselves with users on the latest software releases, running on the most powerful hardware, connected via the fastest networks?
With iOS 9 now able to block web fonts, it may be time for my amazing new paradigm-shifting dev methodology I call “Progressive Enhancement”
— Bruce Lawson (@brucel) September 17, 2015
Web designers should know better. Websites should not come with minimum software requirements. Websites should not feel doomed if one, or two (or three, or more) of the enhancements are unavailable in a certain browser. Granted, typography is a vital component of web design given the percentage of text on our pages, but it should not come at the cost of the ability to read. The dependence on web fonts for delivering great reading experiences further highlights our mistreatment of those web users who fall outside our local map.
Users are pushing back against the abusive practices of the online advertising industry. Some might feel that web font blocking is unfortunate collateral damage in this fight, but I disagree. I feel the culprits are the websites that chose FOIT over progressive enhancement. In the process of users reclaiming faster access to content, even FOUT-style web fonts have been blocked. That is the real collateral damage.
Bram Stein has a great slide deck on the current state of web font loading and performance. Bram is also the author of the FontFaceObserver script, which is our weapon of choice to implement a progressive font loading strategy, based on the excellent work done by Filament Group. Recommended reading for web designers and front-end engineers. ↩
We designers often get worked up about the information architecture of our own websites and apps (which I refrain from calling silos, although they often tend towards that) and forget about the information architecture of the web itself. This essay provides an alternative, historical perspective on organising the world’s information.
Very well put together. Worth your time both to better understand the new Apple TV interface as well as brush up on some universal principles like this on Navigation:
People tend to be unaware of an app’s navigation until it doesn’t meet their expectations. Your job is to implement navigation in a way that supports the structure and purpose of your app without calling attention to itself. Navigation should feel natural and familiar, and shouldn’t dominate the user interface or draw focus away from content.
The beauty of the web is in its ubiquity. Its unparalleled reach isn’t a mere coincidence — rather, the result of a 26-year-long journey of consciously embracing the principles of inclusiveness. Minimal hardware and software requirements have enabled most electronic devices today to connect to the web. At the forefront are mobiles, which have surpassed their predecessors, laptops and desktops, quite emphatically.
Today, the user experience on a mobile device affects far more people than on any other device. With several low-cost smartphones on the market, the web has been brought within reach of lower sections of the socio-economic pyramid — many for the very first time. In fact, for a large portion of the population, inexpensive mobiles connected to the internet over flaky mobile data connections are their only window to the web.
Mobiles are a hard problem — in many ways it’s like going back a few years in device power and capabilities. Even though we – web designers and developers – largely acknowledge that mobiles are omnipresent, the user experience challenge these devices pose is often conveniently reduced to an afterthought. As a result, the state of mobile browsing continues to be a mess, with endless examples of essential services like banks assuming that users have the privilege of accessing a desktop or laptop over a fast, reliable connection.
We have ensured that key services are available to you on the mobile website. For other services, please continue to desktop login.
At Meta Refresh 2015, I shared a peek into what constitutes today’s web ecosystem — a check on the real-world impact of poor mobile web experiences, something we perhaps underestimate. It’s a call to the community to own up to the unremarkable state of the mobile web, make the right compromises going forward, and refuse to budge even when that sounds unrealistic and drastic.
Dev Ops: Please bring back the bouffant. We know you can do it. pic.twitter.com/dkhupQDBoh
— Rachel Nabors (@rachelnabors) August 7, 2015
There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we know we don’t know. But there are also unknown unknowns. There are things we don’t know we don’t know.
— Donald Rumsfeld
Earlier this year, in February, I attended Meta Refresh in Bangalore, where I emphasised the importance of progressive enhancement through a workshop and a talk at the conference. It is quite unfortunate that many web designers and developers continue to carry forth the old approach of designing websites for a known system configuration, and only later test their websites on alternate browsers and devices, patching the issues they detect. This practice is known as graceful degradation.
This site is best viewed in IE6, 800x600.
About a decade back, there was very little variation in hardware and software, and one could get away with making assumptions about client systems – though that doesn’t really justify the practice even then. Over the years, the devices (and software) connected to the web have come a long way. New waves have swept across the ecosystem – frequently and unpredictably – bringing newer hardware, device capabilities, screens, browsers and other software. Today it is hard to keep up with the speed at which things are changing. We don’t know what will storm the ecosystem next, but something surely will. Assumptions about a system configuration are, today, far from safe.
At a time when technology is changing faster than we can keep up with, it’s worth noting that the fundamental principles of the web have remained unchanged since its inception about 25 years ago. These principles have stood the test of time and can perhaps be described as strictly liberal. In many ways they have been responsible for letting the web evolve and improve over time without any major impediments. Even today, the very first website on the internet works flawlessly across all standards-compliant web browsers on every device.
Perhaps ironically the more backwards compatible your web site is, the more future friendly it is. #ffly
— Luke Wroblewski (@lukew) December 29, 2011
The onus of embracing these principles lies on us. If we look closely, there are only a handful of knowns in a traditional web communication — the presence of a client and a server that talk over HTTP, and the fact that the client is running a web browser that understands a hypertext document. Everything else is either a known unknown or an unknown unknown.
Progressive enhancement is the fundamental opposite of the widely adopted method of graceful degradation. It was introduced in 2003 by Steve Champeon and Nick Finck in a talk titled Inclusive Web Design for the Future. In short: start with healthy markup, add the styles around it, and finally layer the interactivity on top. How does that help? The outer layers do not interfere with the inner ones, so basic systems can access the content in the markup layer and deliver a base experience to the user, while more powerful systems can take advantage of the outer layers to enhance the experience. No one’s excluded.
Progressive enhancement is more about dealing with technology failing than technology not being supported. And you can quote me on that.
— Andy Hume (@andyhume) June 3, 2013
Last year, all of a sudden, there was a revival of the progressive enhancement debate on the internet. Since then, it’s been regularly discussed at many reputed conferences around the world. Having to defend the values of the web (such as inclusiveness) 25 years after we began this journey of letting go of control and embracing flexibility is a tad disheartening. I wish the community would put this debate to rest and uphold the spirit of the web. This talk is an effort to urge everyone to do just that.
We’re delighted to have two new people working with us over the summer.
The first, Ayush Kumar Tiwari joins us as the web developer intern. Ayush has been dabbling with Python and Django while at college. More importantly, he plays the guitar and has brought it along to the office. Ayush had managed to stay sober at an engineering college for a full two years before finally succumbing to beer during his very first week at Miranj. This has led to some strange and funny side-effects, like showing up at work after guzzling 2 litres of milk in one go. We hope the effects are not permanent.
Our second intern is the soft-spoken but sharp-eyed Kamal Nayan Sharma. He joins us as a web designer. Kamal has been blogging at TechToll for a few years and is also a budding photographer. He possesses a mean (in a good way) sense of humour that will catch you unawares and render you incapable of a response, both because of the fits of laughter you’ll be rolling in and the sheer brilliance of the attack.
It’s been great having new people in the team. More hands on deck, more people to help me pull Souvik’s (heavy) leg. Can’t wait to show you what we’ve been up to.
Everybody knows that most passwords will remain unchanged. Yet our collective response to Heartbleed has been to patch our servers and email users asking them to do something we know most of them won’t do.
Here’s what our response should have been:
ALTER TABLE users DROP COLUMN password;
Justin goes on to suggest one-time authentication codes delivered via email and SMS as the replacement. Regardless of what you think of the suggested solution, if Heartbleed gets us to re-evaluate passwords and adopt a better authentication protocol on the web, it might just end up being a net win for us all.