One of our recent projects, built on Craft CMS, required support for top-level user profile URLs like
instagram.com/_basestation (also commonly known as vanity URLs). While Craft offers a fair amount of flexibility for routing requests, implementing vanity URLs wasn’t particularly straightforward. This post is a run-through of how we worked our way around it.
To begin with, the routing precedence in Craft is as follows:
- All URIs beginning with the `resourceTrigger` config setting are treated as a Resource request
- All URIs beginning with the `actionTrigger` config setting (or POST parameter) are treated as an Action request
- Any direct URI match against Entry or Category (or any other Element) object URIs is treated as an Entry/Category/Element request, wherein the corresponding template is loaded with a pre-populated `entry` (or `category`) variable
- The first successful URI match against user-declared regular expressions. Craft calls these dynamic routes, and they can be declared either via the Control Panel or in `craft/config/routes.php`
- All URIs that match a file in the template folder(s) when interpreted as a template path
Given this precedence, the ideal scenario for supporting a user profile page for each User object was to assign our desired URI to the User object. The routing would’ve automatically been handled at the third level (as an Entry/Category/Element request). However, the URI field appears to be unsupported for User objects and we couldn’t figure out a way to change or override that behaviour.
We were left then with the fourth level (dynamic routes). So we added a rule to match all top-level URIs and direct those requests to the user profile template page.
```php
'(?P<username>[^/]+)$' => 'user/_profile',
```
Since this pattern is quite liberal and will match all top-level URIs, we placed it as the last rule in our `craft/config/routes.php` file. The `craft/templates/user/_profile.html` template looked something like this:
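A minimal sketch of such a template, assuming it looks up the user from the captured `username` route parameter and falls through to a 404 when no user matches (the markup below is illustrative):

```twig
{# craft/templates/user/_profile.html — illustrative sketch #}
{% set user = craft.users.username(username).first() %}

{# No matching user: bail out with a 404 #}
{% if not user %}
    {% exit 404 %}
{% endif %}

<h1>{{ user.fullName }}</h1>
```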
With these two components in place, we started seeing the desired results. Requesting any valid user profile page like `/batman` loaded the `craft/templates/user/_profile.html` template for Bruce Wayne. So far, so good.
The problem we ran into here was that we had broken template-path based routes (the fifth level) for top-level URIs. For instance:
- The template `craft/templates/about/index.html` should’ve been reachable via `/about`
- The template `craft/templates/contact.html` should’ve been accessible via `/contact`
However, both those URI requests resulted in a 404 response due to line 6 of `craft/templates/user/_profile.html` (provided there were no users with `contact` as their username; it’s always a good idea to maintain a username blacklist when dealing with vanity URLs).
Now one option was to simply declare dynamic routes for `'about' => 'about/index'` and `'contact' => 'contact'`, and place them before the user profile routing rule. This would’ve worked if we had only a handful of templates with top-level URIs, but it would not have scaled well for a large number of template-path routes.
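For illustration, the routes file under that approach (targets taken from the templates above) might read:

```php
<?php
// craft/config/routes.php — illustrative sketch of the workaround.
// Every top-level template URI gets its own rule, declared before
// the liberal catch-all; this does not scale well.
return array(
    'about'   => 'about/index',
    'contact' => 'contact',

    // The user profile catch-all must remain the last rule.
    '(?P<username>[^/]+)$' => 'user/_profile',
);
```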
So how did we get around this? By altering the logic to look for a template-path match before looking for a username match. We introduced an intermediate template file, `craft/templates/_vanity_router.html`, to achieve this.
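The router’s logic can be sketched in Twig, relying on the fact that `{% include %}` accepts an array and renders the first template that exists (the candidate paths below are illustrative):

```twig
{# craft/templates/_vanity_router.html — illustrative sketch #}
{# Try the requested URI as a template path first (exact match, then a
   directory index); only fall back to the user profile if neither exists. #}
{% include [username, username ~ '/index', 'user/_profile'] %}
```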
And we modified the dynamic route to hand off control to the intermediate template file:
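A sketch of the modified rule, assuming it differs only in the route’s target template:

```php
// craft/config/routes.php — the catch-all now points at the router template
'(?P<username>[^/]+)$' => '_vanity_router',
```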
Template paths now take higher precedence than vanity URLs, and that is exactly the behaviour we were going for.
Hope you find this useful for adding vanity URLs to your Craft project. If you’ve taken a different approach or have any feedback on this approach, we’d love to know.
Hear, hear! Maciej Cegłowski (of Pinboard fame) on the bloat prevalent in contemporary web design (even for text-based content). Hardware and networks seem to be getting faster and cheaper, while websites appear to be getting slower, heavier, and more expensive to access.
A Craft CMS plugin to route requests to pages with a filtered, pre-loaded list of entries. Developed by us and spawned off during our IndiaBioscience project. The plugin now handles URL routing for the many varied filters on sections such as Jobs, Events, Grants (and more) on the IndiaBioscience website. Also listed on Straight Up Craft.
Our friend and interface designer Abhishek Nandakumar writes about the value of designing with prototypes instead of static mockups. He uses the example of Google Maps to demonstrate how working prototypes can often reveal counter-intuitive improvements in an interface, like increasing the number of clicks:
Most time is spent navigating and scanning through the interface, but if the interface and the path to your eventual goal are both clear, clicks might not matter, because you’ve reduced a majority of the cognitive load associated with scanning.
We’ve felt the same for a long time, and hence prefer designing our sites in the browser as often as we can.
With the release of iOS 9 and its support for content blocking APIs, there has been an explosion of ad blockers that are proving only too popular with users. This has kicked off the long overdue debate about the malpractices of contemporary online advertisers, and the ethics of blocking said advertising. There have been numerous interesting perspectives on this issue, and rather than recounting them here, I urge you to read them for yourself:
- What Happens Next Will Amaze You by Maciej Cegłowski
- Ad blocking by Seth Godin
- The ethics of modern web ad-blocking by Marco Arment
- Ad Blocking and the Future of the Web by Jeffrey Zeldman
Instead, I wish to draw your attention to web fonts. A lot (if not most) of the ad blocker apps also support blocking of web fonts. Some restrict themselves to blocking font hosting services such as Typekit and Google Fonts, while others block all web fonts, self-hosted or otherwise. Designers, as some might have expected, have something to say about that.
The debate over content blockers is an important one for our industry. I just wish web fonts weren’t caught up in it.
— Jeffrey Veen (@veen) September 18, 2015
As a designer “Block Webfonts” makes me… happy sad.
— iA Inc. (@iA) September 18, 2015
@iA Webfonts (woff) actually are lean in data: 15–30kb/font. Stupid to block them because that decreases reading experience.
— Linotype (@Linotype_com) September 18, 2015
So designers are not happy. But if you’re a user, chances are, you’re quite relieved (or even ecstatic) at the ability to block web fonts and experience a faster web. And web designers and front-end engineers have no one but themselves to blame for this.
How did we get here?
The web is primarily textual. Typography, thus, becomes an essential component of web design. CSS features such as `@font-face` allow us web designers to enhance the typographic quality of our designs by giving us the freedom to use any font at our disposal. This is a good thing. No two ways about that (and more of the same, please).
But somewhere along the way, we forgot (or chose to ignore) that a user’s experience of the web is made up of a wide range of factors, of which fonts are an important but not all encompassing part. Smooth performance and fast access to content are just as (if not more) important factors.
Far too many websites, far too often, ended up in situations such as this:
The page has been rendered. The content is available, but it isn’t accessible to be read until the web fonts finish downloading. This behaviour is called Flash of Invisible Text, or FOIT.
FOIT is bad if you’re on a low-speed connection, because the text might be the last thing that becomes accessible on the page. FOIT is worse if you experience bad latency or lose network connection altogether and the fonts fail to download, leaving you with blank spaces where the text might’ve been.
There is an alternate method, which starts off with making the text available on first render using local fallback fonts from the user’s system and applying the web fonts after — and if — they finish downloading. This behaviour is called Flash of Unstyled Text, or FOUT.
Default browser handling of `@font-face` embeds has changed over the last few years, but web designers have more or less had control over going the FOIT or FOUT route ever since web fonts gained popularity¹. Sadly, most websites chose invisible (FOIT) over accessible. They chose to hide the text until their preferred typefaces would finish downloading, resulting in longer wait times for the user. They treated web fonts not as an enhancement but as a requirement, ceding the functional high ground for higher quality typography.
Dear front-end developers, I’m on a train. I won’t mind a sudden switch of fonts, but I sure mind this endless wait for the fonts to load.
— Souvik Das Gupta (@souvikdg) December 29, 2014
It shouldn’t come as a surprise that people want to be able to read the news report in Georgia or Roboto, or carry on with their shopping in Arial rather than stare at a blank screen for the rest of their train ride. Hence their joy and support for web font blockers.
The sense of dismay amongst designers hints at a deeper negligence. Why are designers disappointed on hearing that a certain feature might not be available to a certain subset of the people visiting their site? Is that not true for pretty much everything but the vanilla HTML we write? Or do we only concern ourselves for users with the latest software releases running on the most powerful hardware devices connected via the fastest networks?
With iOS 9 now able to block web fonts, it may be time for my amazing new paradigm-shifting dev methodology I call “Progressive Enhancement”
— Bruce Lawson (@brucel) September 17, 2015
Web designers should know better. Websites should not come with minimum software requirements. Websites should not feel doomed if one, or two (or three, or more) of the enhancements are unavailable in a certain browser. Granted, typography is a vital component of web design given the percentage of text on our pages, but it should not come at the cost of the ability to read. The dependence on web fonts for delivering great reading experiences further highlights our mistreatment of those web users that fall outside our local map.
Users are pushing back against the abusive practices of the online advertising industry. Some might feel that in this fight, web font blocking is unfortunate collateral damage, but I disagree. I feel the culprits are the websites that chose FOIT over progressive enhancement. In the process of users reclaiming faster access to content, even the FOUT-style web fonts have been blocked. That is the real collateral damage.
1. [Bram Stein](https://twitter.com/bram_stein) has a great slide deck on [the current state of web font loading and performance](https://speakerdeck.com/bramstein/web-fonts-performance). Bram is also the author of the [FontFaceObserver script](https://github.com/bramstein/fontfaceobserver), which is our weapon of choice to implement a [progressive font loading strategy](https://gist.github.com/rungta/fa39058f1d15d6d4ea95), based on [the excellent work done by Filament Group](http://www.filamentgroup.com/lab/font-events.html). Recommended reading for web designers and front-end engineers.
We designers often get worked up about the information architecture of our own websites and apps (which I refrain from calling silos, although they often tend towards that) and forget about the information architecture of the web itself. This essay provides an alternate, historical perspective of organising the world’s information.