Typekit – the messiah of web typography?
There is some big news this week for those website owners and designers keen to use custom fonts on their websites.
For the longest time, web designers have been limited to a small subset of fonts known to be present on the majority of users' PCs. CSS font stacks let designers choose less common fonts by specifying a safe alternative if the first choice was unavailable, but this did not guarantee the user would see the design as intended. The only ways of forcing a particular font were Flash (via sIFR) or images. However, both of these approaches created potential accessibility problems.
The irony of this situation is that browsers already provide a way to embed fonts in a webpage using @font-face. The problem is not technological but legal. Font foundries are concerned about losing control over the distribution of their fonts.
We’ve been working with foundries to develop a consistent web-only font linking license. We’ve built a technology platform that lets us host both free and commercial fonts in a way that is incredibly fast, smoothes out differences in how browsers handle type, and offers the level of protection that type designers need without resorting to annoying and ineffective DRM.
Apple vs Microsoft – A website usability case study
The Web Designer Depot is running an interesting post that compares the usability of Microsoft and Apple’s websites. Let’s be honest, pitting Microsoft against Apple is a little bit of a gimmick. It’s actually very hard to compare these two websites. The two companies are aimed at very different markets (as the post itself admits) and operate on very different scales. Apple is much more focused as a business than Microsoft, so the Microsoft site is always going to be more complex.
That said, it is extremely interesting to see the two sites deconstructed from a usability point of view and it does identify a number of common usability issues that we can all learn from.
The article focuses on the following areas…
- Homepage design
I am sure it will come as no surprise that Apple won hands down. However, I think it is interesting to note that the primary reason for Microsoft’s failure was its size. The larger a site is, the harder it is to maintain consistency, ensure quality and handle navigation. There is a lesson here for all owners of large websites – if you want your site to be usable, make it smaller by simplifying. Apple applies the principles of simplicity to everything from their products to their websites and it results in exceptional usability.
Reinvigorating old blog posts
This week I came across possibly the most ridiculously named idea on the whole of the web – “Sneeze Pages”.
Although the name is stupid the idea is actually a good one. According to Sitepoint a Sneeze page is…
a page that propels people in different directions deep within your blog by highlighting a variety of posts that you’ve previously written.
In other words, it is a way of drawing users' attention to older blog content buried deep in your archive, and therefore increasing the ‘stickiness’ of your website.
As the post on Sitepoint points out, the problem with blogs is that new users rarely get past the last dozen or so posts. The wealth of content in older posts is largely invisible. It therefore makes a lot of sense to create the occasional post which draws attention to this older content.
The Sitepoint article suggests four types of “Sneeze Pages”:
- Themed Sneeze Pages—these are posts or pages on your blog or site that revolve around a single theme (e.g. The best of Boagworld usability advice)
- Time-related Sneeze Pages—these pages are based around a defined period of time (e.g. What you might have missed this month)
- Retro Sneeze Pages—another variation of the time-related sneeze page is one that unashamedly shows off a number of posts from a particular point in your blog's history (e.g. The best of 2008)
- Series Sneeze Pages—this is the technique of writing a series of blog posts exploring a topic over a period of time with lots of interlinked posts. (e.g. My 10 harsh truth posts)
Personally this strikes me as great advice and you can expect to see several such posts from me over the coming weeks and months.
Creating WCAG 2.0 accessible forms
I never get emails asking us to cover accessibility in more depth. It’s just not a sexy topic. Designers, developers and website owners know they should care about accessibility and even endeavor to make their sites accessible. However, it doesn’t really excite people.
However, it is an important topic and one I will continue to cover on the show. I would also argue it can be inspiring too. I will never forget the first time I watched Robin Christopherson from AbilityNet use a screen reader. It’s not until you see it in action that you realize the challenges people face.
The same revelation came for me again when reading Accessible Forms using WCAG 2.0. It’s not a light read and takes some getting through. However, it has some great insights into exactly how screen readers deal with forms, and into how easy it is to improve the experience if you know what you are doing.
For example, did you know that screen reader users have to enter a special “form mode” to complete a form? While in this mode, the screen reader will only read form elements. It will ignore any instructional text unless it is wrapped in a label or other form element. This can easily be a real problem.
There is also advice on…
- Colors and fonts
- Mandatory fields
- Grouping form elements
- and much more
Personally, I feel this should be required reading for all designers and developers.
Feature: How site personas can enhance your sites
If your website was a person, what type of person would it be? It is an interesting question. Take a look at your website for a moment. Look at the design, read some of the copy. Can you picture a single person that represents your site? If the answer is no, then you may benefit from creating a site persona.
Listener feedback: Stop hackers hacking your hacky code!
Steve from Aberdeen writes:
You promote the show as being for all those who “design, develop and run websites on a daily basis” but actually don’t cover much for us developers! How about covering some more developer-orientated topics, such as website security?
It’s a fair accusation, Steve, which is why we have Dave on the show this week. He is going to provide a basic introduction to website security.
Security is a complicated monster to tackle, so it helps to think about it in really, really basic terms. Stuff comes in, stuff goes out. We have to assume that anything that comes from the user is dangerous, or tainted, and can’t be trusted in any way whatsoever. We don’t even know for sure that the user is who they claim to be. Trust no-one. We also have to be 100% sure that anything we send back to the user is safe, untainted, and uncompromising. You don’t want to send dodgy scripts to your users, and neither do you want to send back valuable clues to the inner workings of your code. This is not meant to be an all-encompassing guide to preventing attacks, but instead a set of guidelines for writing applications in a secure way.
The first rule is this. Minimise the areas that accept input from the request, and minimise the areas that send a response. Sanitisation and validation should be the first thing you perform on data received and the last thing before you return it. Following a sensible architecture such as the Model View Controller approach separates data received by the Controller from data returned by the View. This will make your life far simpler when you start tackling such issues. This applies to all forms of input (form data, querystrings and cookies) and all forms of output (HTML, redirects, file downloads).
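To make the "validate at the boundary" idea concrete, here is a minimal Python sketch. The names `validate_username` and `handle_request` are hypothetical, standing in for a controller in whatever framework you use; the point is that untrusted request data passes through one validation gate before any other code touches it.

```python
import re

def validate_username(raw):
    """Whitelist validation: accept only a strict, known-good pattern."""
    if not re.fullmatch(r"[A-Za-z0-9_]{3,20}", raw):
        raise ValueError("invalid username")
    return raw

def handle_request(form_data):
    # Controller-style boundary: everything from the request is
    # validated once, here, before any model or view code sees it.
    username = validate_username(form_data.get("username", ""))
    return {"username": username}

print(handle_request({"username": "alice_1"}))
```

Anything that fails the gate (say, `{"username": "<script>"}`) raises immediately, so tainted data never reaches the rest of the application.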
A commonly overlooked validation is confirming the data has been intentionally sent by the user. There’s nothing to stop a third-party website posting to your website, so it doesn’t matter how secure your login form is, the posted data could be coming from any of the dodgy websites your user has open. An easy solution is to use a random key as part of every posted form, unique to the user's session. This way you can easily verify the form has been posted from a tightly controlled page you served to your user. It’s not enough to look at referrer headers, because these are easily faked. ASP.Net web forms, to their credit, do this by default.
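The per-session random key described above can be sketched in a few lines of Python. The `session` dictionary here is a hypothetical stand-in for your real server-side session store:

```python
import hmac
import secrets

# Hypothetical session store; in a real app the token would live in
# the server-side session and be embedded in every form you render.
session = {"csrf_token": secrets.token_hex(16)}

def render_hidden_field():
    # Embed the per-session token as a hidden field in the form.
    return f'<input type="hidden" name="csrf_token" value="{session["csrf_token"]}">'

def verify_post(posted_token):
    # Constant-time comparison avoids leaking the token via timing.
    return hmac.compare_digest(posted_token, session["csrf_token"])

print(verify_post(session["csrf_token"]))  # genuine post from our own form
print(verify_post("forged-token-value"))   # cross-site forgery attempt
```

A forged post from another site cannot know the token, so it fails the check regardless of how convincing its referrer header looks.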
Use white-lists over black-lists. Let’s say, for example, you’re validating a phone number: you don’t specify every non-digit character you want to remove, you strip everything that isn’t 0-9. A little too obvious? The same applies to the classic script tag. If you start trying to remove every form of &lt;script&gt; tag, you’ll end up playing catch-up against tricks using &lt;img&gt;, &lt;body&gt; and clever encoding. If allowing any kind of HTML through is necessary, you’d better be damned sure who submitted it and who is going to be able to see it.
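The phone number example above is a one-liner in Python. Note the whitelist shape of the regex: it names what to keep, not what to remove.

```python
import re

def clean_phone(raw):
    # Whitelist: keep only the digits 0-9, rather than trying to
    # blacklist every unwanted character one by one.
    return re.sub(r"[^0-9]", "", raw)

print(clean_phone("+44 (0)1234 567-890"))  # → 4401234567890
```

However the input is dressed up (spaces, brackets, hyphens, a leading `+`), only digits survive, so nothing unexpected can slip through.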
So you’ve received your data, and it looks pretty good. Whatcha gonna do with it? If it’s heading towards a database, you can’t be too careful. Escape it, or better yet use parameterised queries. If it’s the file system your data is ending up in, put it somewhere sensible. Ideally, this would be somewhere outside of your webroot, or in a protected folder. Whatever happens, you don’t want anything here directly accessible or executable. Just to be sure.
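Here is a small parameterised query sketch using Python's standard sqlite3 module (the table and data are invented for illustration). The `?` placeholder guarantees the input is treated as data, never spliced into the SQL itself:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('alice')")

# Untrusted input carrying a classic injection attempt.
evil = "alice' OR '1'='1"

# The placeholder keeps the whole string as a single value, so the
# injection never becomes part of the query.
rows = conn.execute("SELECT name FROM users WHERE name = ?", (evil,)).fetchall()
print(rows)  # → [] : the injection string matches no user
```

Had the query been built by string concatenation instead, the same input would have returned every row in the table.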
OK, so we’re sending a response. Just because the data we received passed our tried-and-tested validations doesn’t mean it’s safe to send back to the user. We HTML encode everything, unless absolutely necessary. If it’s plain text, that’s fairly straightforward. If you’re putting suspect data into an HTML attribute, it might be an idea to verify the format. If you think you’re outputting an SRC or HREF, check it at least looks like a path. If your response happens to be a redirect, double check nothing funny is going on with the URL. If your response is a (serious) error, make it look friendly, but don’t give away exactly what went wrong. If you want to send the user a file, attaching it and manually setting the MIME type is more controlled than simply pointing them at the file.
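The "encode everything, and check the shape of anything going into an attribute" advice above might look like this in Python. The helper names are hypothetical, and the URL check is deliberately crude, just "does it at least look like an http(s) URL":

```python
import html
from urllib.parse import urlparse

def safe_text(value):
    # Encode by default: '<' can never open a tag once escaped.
    return html.escape(value)

def safe_href(value):
    # Suspect data destined for an href: verify the format before
    # emitting it, rejecting schemes like javascript: outright.
    parsed = urlparse(value)
    if parsed.scheme not in ("http", "https"):
        raise ValueError("refusing non-http URL")
    return html.escape(value)

print(safe_text("<script>alert(1)</script>"))
# → &lt;script&gt;alert(1)&lt;/script&gt;
print(safe_href("https://example.com/page"))
```

A value such as `javascript:alert(1)` fails the scheme check and never reaches the page, while ordinary text is rendered harmless by the encoding.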
This is not intended as a set of golden rules, rather a few key points to help you think about the code you write. Most new forms of injection and hackery are just clever ways of attacking poor code. Writing sensible code will keep you one step ahead of such attacks.