To Hell With Bad Editors

This article was originally posted at evolt.org.

By now everyone has had the opportunity to read the Web Standards Project (WaSP) position on old browsers, and see A List Apart (ALA) implement that campaign’s message with its own rebuild. Overall, a good message, and lots of good points within.

But WaSP has made it too easy. The people who partake in the WaSP and ALA are all too familiar with standards, compliance, and the failure of the browser manufacturers. In fact, none of what happened is really new to anyone who frequents the sites. In essence, WaSP and ALA are preaching to the choir. Not everyone in the choir may agree with the implementation, but they all understand the message.

And they’ve chosen the easiest targets out there — the browser makers. There’s loads of data to demonstrate that the browsers don’t adhere to standards, and that they’ve often ignored them. Microsoft is always a target, so it’s no great leap to attack their implementation of the <marquee> tag. Netscape is now an AOL company, so it’s easy to demonstrate how they took liberties with (then nascent) standards. Never mind that they both helped extend and stress-test the standards until the W3C was ready to take a strong role.

So now they lay blame on the older browsers, and praise the new ones for almost implementing standards that are up to four years old (even though there are more recent standards out there). They want developers to tell users to upgrade their browsers. With a campaign more annoying to users than the "Best viewed with…" buttons of the first six years of the web, WaSP suggests you kick users over to a page telling the users that their browsers are old and crappy. Who cares about the reason they may be using the browser? Who cares that the browser qualifies as good but the user had JavaScript disabled? Who cares that some users don’t care?

In focusing on the browsers, they’ve taken the pressure from where it really belongs — the editors. The browser makers are getting it; they’ve been making the changes (thanks in part to the WaSP and many developers). The developers who frequent ALA and WaSP get it; they’re coding to standards. The users are going to be assaulted with annoying redirects if WaSP has its way, so they may even upgrade in less than the projected 18-month window. But what about the developer who doesn’t partake? How is the campaign benefiting him/her or users of his/her sites?

Ultimately, there are two kinds of editors, people and software. Not all software writes bad code, and not all hand-coders write good code. But just as everyone thinks he or she is a good driver, nobody wants to fess up to the fact that someone is writing abysmal code.

Software

And by software, I primarily mean WYSIWYGs. This also includes those great text editors that offer incorrect HTML syntax guidance. And there are some that are self-described visual editors, or are really page layout applications, or even word processors. But ultimately, if it writes HTML for you, I’m talking about it. I don’t want to name any in particular, however, since I know people can be defensive about the tools they use. Some are bad, and some are good, and some are only as good as the user is bad.

I am, however, going to offer this statement from a company that makes all sorts of web tools. The statement was reported in a few places, including a review of the Web Standards Project Panel posted by Macromedia (don’t worry, there are other sources to verify it):

"The compliancy argument, despite its good intention, does not have any important real-world application or meaning when considering the challenges Web designers face today. Nearly all professionally-created sites created with a plethora of web design visual authoring and coding tools will not pass compliancy tests as presented at http://validator.w3.org/. Failure of this test likewise does not serve any strong indication as to the validity of the Web site design itself in terms of user experience."

This circular argument basically says: nobody’s making sites with valid code, so we’re not going to make a tool that writes valid code. To some degree, all the tool vendors are guilty of promoting this logic. There are tools that happily insert invalid tags and attributes, allow incorrect nesting of elements, and even have incorrect (or misleading) documentation. The resulting code is often bloated, and is generally optimized for the developer’s system.

There are open-source tools out there that could be incorporated into the editors. Off the top of my head, I can think of three that would make any WYSIWYG (or otherwise) editor a much more viable solution for the developer who wants to code to standards:

  1. The W3C HTML/XHTML validator. This will validate the given page against the DTD listed within the page. The source code is distributed under a GPL-compatible license.
  2. The W3C CSS validator. Another tool that could be integrated into an editor.
  3. HTML Tidy. A handy stand-alone utility that searches for, and corrects, tag errors (nesting, unclosed tags, illegal tags, etc.). The source code is there, and they promote integration with other tools.
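As a rough illustration of what integrating a cleanup tool buys, here is the sort of repair HTML Tidy performs (the before/after below is illustrative, not Tidy’s literal output):

```html
<!-- Before: mis-nested and unclosed tags, the kind Tidy flags -->
<p>Read the <b><i>fine</b></i> manual
<p>See also the <a href="faq.html">FAQ</a>

<!-- After: properly nested and closed -->
<p>Read the <b><i>fine</i></b> manual</p>
<p>See also the <a href="faq.html">FAQ</a></p>
```

An editor with this built in could make the correction as the markup is written, instead of leaving the breakage for a browser to guess at.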

Granted, this doesn’t necessarily apply to some tools that only output to HTML as an ancillary function. But if they choose to market this feature and know developers rely on it (like creating entire sites from sliced images), then they should have the responsibility of building the tool to write correct code. Some tools offer the option to customize code by, for instance, letting you quote attributes. This should not be an option; attributes should be quoted. If somebody really wants to write non-compliant code, that person can edit it manually, but the tool should default to correct code at all times, and assume the user utilizes the tool because the user cannot or will not code by hand.
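A hypothetical snippet shows why quoting shouldn’t be optional: an unquoted attribute value ends at the first space, silently mangling the markup.

```html
<!-- Unquoted: the browser reads alt="Company" and treats "Logo" as a stray attribute -->
<img src=logo.gif alt=Company Logo width=100 height=50>

<!-- Quoted: every value survives intact, spaces and all -->
<img src="logo.gif" alt="Company Logo" width="100" height="50">
```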

Wouldn’t it be nice if the editor, or other non-dedicated tool (page layout tool, for instance), could notify the developer when he/she is creating inaccessible code? Wouldn’t it be nice if all those positioned <div>s were re-ordered, with prompting to the user, so that a screen reader could make sense of the content when linearized? Maybe it could coach the user for page titles instead of leaving blank <title>s everywhere. Perhaps it could tell the user that “click here” is an unacceptable string of text to make into a hyperlink. How about warning when a frame has no navigation in it? Image maps without text links? Lack of meta information? And the list goes on.
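A couple of the warnings suggested above, sketched as before-and-after markup (the examples are mine, not taken from any particular tool):

```html
<!-- What the editor should flag: an empty title, and a meaningless link phrase -->
<title></title>
<a href="report.html">click here</a> for the quarterly report

<!-- What it should coach the user toward -->
<title>Quarterly Report - Example Co.</title>
read the <a href="report.html">quarterly report</a>
```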

There are too many people who’ve been pushed into web development as part of their daily job, but have no idea what HTML is. I’ve seen too many human resource staffers expected to maintain the job posting section of a site. Why not provide them with a tool that does it right? They aren’t going to learn HTML, or even know about the WaSP campaign, so let’s target the software manufacturers who are the de facto authors of millions of invalid pages. Let the users create good code, despite themselves.

So I say to the tool developers, use your software to guide the user with correct code, validate all output, and cut out all that evil. For all the tool users, you must understand that the tool limits you. Unless you hand tweak the output (in which case I refer you to the next section of this piece), you can only generate what the tool will let you generate.

People

Standards, and support for them, are a well-understood problem; many people just don’t care. There are developers who want the easiest way out possible, and don’t care about standards in light of everything from the extra work to the nagging client. Just because they eschew WYSIWYGs doesn’t mean they can code their way out of a triply-nested table.

These people can’t or won’t make that change. The worst offenders are those who won’t. After surfing the responses to the ALA article, I saw way too many comments where the person was quite gung-ho about the "to hell with old browsers" message, and used it as justification to abruptly stop coding for older or alternative browsers. Nothing in the world changed just because ALA and the WaSP got some press on this issue. The same people using Navigator 3.04 yesterday are still using it today.

Yet too many people will use it as an excuse to dump all support for those older browsers. Let’s be clear: there’s no reason you can’t build pages that work and look generally good in old browsers while still validating; this very site is an example (in fact, you can read about how we did it). But there’s a certain gee-whiz factor with being on the bleeding edge. So now, instead of trying to get some bizarre DHTML trickery to work properly, a developer feels he/she can say, "It’s the browser’s fault. Tell them to upgrade." Immediately, responsibility has been handed off. And all I wanted to do was buy a scarf. I did turn off JavaScript, though, since it kept crashing my version of IE5, so I guess it’s my fault.

These are the developers who need to learn to code for the user, while still adhering to standards. Too many sites are simply justification for playing with code. On a personal site, that’s great. On an e-commerce site, that’s probably a bit dim. On a community site, that’s just bad web karma.

In many cases, the culprit is that the developer is trying to apply old rules to a new medium. There are many things the web is not. It is not a CD-ROM presentation; users don’t come to your site to learn a new navigation technique. It is not print; you can’t control how text wraps, you can’t control the leading, hell, you can’t even control the typeface. The web is not television; viewing isn’t linear, and bandwidth is always a concern. This isn’t to say we don’t see the web in these media, but we need to code for what the web is: a highly malleable medium where the user has as much control as the developer in how the content is presented. And I’m not the first person to say it — it’s been said on ALA, Jakob Nielsen has said it, and they’re on opposite ends of the developer scale. Somewhere in between are the rest of us. And yet we see developers constantly massaging image-sliced table layouts and DHTML effects designed to wow themselves, their boss, or their clients, but rarely their users.

Hand coders also need good resources for their skills. Many of them turn to books, given the ease with which one can read them versus surfing the W3C site. However, many of these books provide incorrect code samples. I’ve personally returned four HTML books because they had incorrect tags, attributes, or syntax throughout (I’ve seen them include both the <spacer> tag and the <marquee> tag, among other near-Greek tragedies). This isn’t limited to books on HTML, either, but is seen perhaps more readily in books covering server-side programming and scripting. Often the authors are only concerned with getting their script correct, and the HTML is the unfortunate offspring of the wonderful world of servers and scripts. As such, it is the bastard child of the code in the book, lacking in everything from quotes on the attributes to closing tags. After writing to an author of a server-side scripting book about the incorrect HTML and XHTML examples, I received this response:

"You are correct about the sample of code shown, but it was done deliberately. It was meant to show a typical sample of HTML, whether or not that correctly conformed to standards. I agree that, in itself this isn’t really an excuse for writing ‘bad code’, but it wasn’t sloppyness. […] For myself I just hadn’t really been aware of XHTML and it’s importance – a pretty poor excuse I think you’ll agree."

To his credit, the author was aware of the importance of standards by the time I had found this book, and had made significant improvements in later books. But certainly this is indicative of an overall lack of strict standards compliance in the very “text books” so many developers use. And since those developers often don’t know about, or won’t take the time to visit, the W3C site, they are at a significant disadvantage.

So I say to the people who code, learn the standards, code to compliance, and always keep the user in mind, regardless of what unfortunate browser he or she might use.

Inside the evolt.org Rebuild: The HTML and CSS

This article was originally posted at evolt.org.

It’s been three months since evolt.org rolled out its new design. People have been asking what we did, why we did it, and how we did it. I’ll try to address these questions in the context of the HTML, the CSS, and the overall site design. (Nobody seems to care where we did it, but I’ll touch on that, too.)

What we did

In short, we created a site that is HTML 4.01 Transitional compliant, is CSS Level 1 compliant, conforms to Level A of the Web Content Accessibility Guidelines 1.0, and passes accessibility checkpoints as detailed by CAST’s Bobby validator. Any browser and any user can use this site, and with the browser archive at our disposal, we feel pretty confident making that statement.

Why we did it

We wanted to create an example of how it is possible to build attractive, usable, accessible, and compliant sites. We wanted to stop using <font> tags, and all the other deprecated or non-standard horrors to which we’ve grown accustomed over the years. We wanted to have a simple style guide for our users who submit articles. We wanted to make it easier to maintain the site. We no longer wanted to leave users out in the cold, enabling them instead to use the site regardless of their system. And ultimately, we want our users to customize the site to suit their needs. You may ask why we didn’t go straight to XHTML, and that is a good question. The answer was pretty simple for us — not all the browsers out there can handle it. Internet Explorer 5.0 for the Mac will even display it as source code if the mood strikes it.

Who we did it for

Let’s not kid ourselves. Nothing is completely altruistic. We all had our reasons, but luckily they all had the benefit of helping out the users. We, as volunteers who maintain evolt.org, got to flex our coding muscles on a project that we feel has a lot of weight in the community. We got to finally test the theory that a site can be built to be compliant, accessible, and attractive, all without those meddling clients.

Where we did it

Offices, apartments, and houses in Buffalo, Kopavogur, Milwaukee, Portland, Edinburgh, Adelaide, Anchorage, and all the places in between. There is no evolt.org home office; we are distributed as haphazardly as the Internet itself. Whoever said telecommuting doesn’t work?

Yeah, yeah, yeah, enough of that. How’d you do it?

I’m glad you asked.

The design

Original evolt.org site design concept
We were lucky on this one. Isaac had whipped up a design with which we had all fallen in love, even before we had decided to make a standards-compliant site. Luckily, that design lent itself well to being repurposed for where we were heading.

The code

We wanted to make sure that everyone could use the site. As I’ve mentioned, accessibility was one way to achieve that. However, we also wanted to allow those users who don’t have the latest browsers to still use the site. We try to be an inclusive community, so why would we want to exclude anyone at our site? As such, we had to make some decisions about how we would code the layout. We dumped CSS-P pretty early on. Yes, users of older browsers could just see a stack of <div>s in the corner of their browser, but what fun would that be? Using tables for layout seemed to be the most sensible thing to do, while still allowing us to keep the content and style independent. Given that, visit this site on any older browser, and you’ll see the same old black tab in the corner, and the same layout of the content. No, it won’t be in color, but at least you can still see it. I refer you to the screen captures presented below.

Since the style and the content are independent, we found it was extremely easy to create printer-friendly pages. A very simple header, some changes to the CSS units, and flow the content in. All articles are printable, even if not all of them should be printed (I know for a fact most of mine are dry reading).
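One way to wire that up, sketched here with hypothetical filenames, is to serve two style sheets distinguished by CSS media types, with the print sheet swapping pixel units for points:

```html
<!-- Screen gets the full design; print gets simpler, point-sized styles -->
<link rel="stylesheet" type="text/css" media="screen" href="/css/screen.css">
<link rel="stylesheet" type="text/css" media="print" href="/css/print.css">
```

Because the content carries no presentational markup of its own, the same article flows cleanly into either presentation.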

The styles can be changed quickly as well. Just load a new style sheet, and it’s a whole new look. But more on that later. Once we knew that we wanted to allow different styles for different sections or users, we realized we had to create the black tab at the top of the page with aliased edges. The curves may not appear terribly smooth on a white background, but at the very least there aren’t any ugly halos around the curves, either.

Splitting the page into three stacked tables allows the browser to render it progressively, first rendering the tab, then the content cell, then the black footer. This does have some drawbacks, however (see the caveats below).
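Reduced to its skeleton (the comments are placeholders; attribute values are illustrative, not our exact markup), the structure looks something like this. Because each table is complete in itself, the browser can draw it as soon as its closing tag arrives, rather than waiting for the entire page:

```html
<!-- Table 1: the black tab and logo, rendered first -->
<table width="100%" cellspacing="0"><tr><td><!-- tab --></td></tr></table>

<!-- Table 2: article content on the left, site navigation on the right -->
<table width="100%" cellspacing="0">
  <tr>
    <td><!-- content --></td>
    <td><!-- navigation --></td>
  </tr>
</table>

<!-- Table 3: the black footer, rendered last -->
<table width="100%" cellspacing="0"><tr><td><!-- footer --></td></tr></table>
```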

We wanted to limit the number of tags allowed on the site by authors. This way we could do our best to enforce standards in our articles, and ensure the CSS applies to every article equally. The trick was deciding on what to use. For instance, the debate came up about <i> vs. <em>. We chose to go with <em> since it implies context and structure, and not just style. One of the benefits is that a screen reader will know to pronounce that differently. As another example, we chose to use <strong> over <b>, <code> over <tt>, etc.
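The distinction in practice: both lines below render much the same visually, but only the structural version tells a screen reader (or a search engine) anything about meaning.

```html
<!-- Presentational: describes only how the text looks -->
<i>Hamlet</i> is a <b>must-read</b>, as this <tt>sample</tt> shows.

<!-- Structural: describes what the text is, so a screen reader can change inflection -->
<em>Hamlet</em> is a <strong>must-read</strong>, as this <code>sample</code> shows.
```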

You may also wonder why our code blocks are displayed within <textarea> tags. That question is not quite so easy. Since the site is made up of stacked tables, and since the primary site navigation is on the right, any long lines of code would push the navigation off the screen. So, in an effort to make the blocks of code easy to view without compromising the site design, we chose to use the <textarea>. It isn’t the best use, but it is valid HTML, and doesn’t cut out any users. An added bonus: in newer browsers, when the <textarea> receives focus, whether to scroll or to highlight, its entire contents are automagically selected and ready for copying to the user’s clipboard. The advantage for authors is that they need only enter those blocks of code into <pre> tags, and some highly trained gnomes re-code it for them on the server.
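The pattern looks something like this (the attribute values and select-on-focus script are a sketch of how one might do it, not our exact server-generated markup):

```html
<!-- Long code lines scroll inside the box instead of stretching the layout;
     focusing the box selects its contents in browsers that support the event -->
<textarea rows="8" cols="60" readonly onfocus="this.select();">
&lt;table width="100%"&gt;
  &lt;tr&gt;&lt;td&gt;a very long line of sample code that would otherwise push the navigation off the screen&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;
</textarea>
```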

For all instances of form elements on the site (the login forms, the rating radio buttons, etc.), we’ve made every effort to use the accessibility features introduced in HTML 4. This includes the attributes ‘tabindex,’ ‘accesskey,’ and ‘id,’ as well as the <label> tag and even the rare use of <fieldset> and other elements. The advantage of these tags is that form elements can behave more like native controls in the user’s operating system. For example, clicking the text next to a radio button or a checkbox selects it, a key combination with the accesskey of a form element gives that field focus, and tabbing through the form happens in a logical order. Older browsers simply ignore these tags and attributes, without interrupting their performance in any way.
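Put together, those features look something like this (the field names, key choices, and form action are hypothetical, not our production markup):

```html
<form action="/login" method="post">
  <fieldset>
    <legend>Log in</legend>
    <!-- Clicking the label text focuses its field; the accesskey jumps straight to it -->
    <label for="username" accesskey="u">Username:</label>
    <input type="text" id="username" name="username" tabindex="1">
    <label for="password" accesskey="p">Password:</label>
    <input type="password" id="password" name="password" tabindex="2">
    <input type="submit" value="Log in" tabindex="3">
  </fieldset>
</form>
```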

Screen caps from various browsers

evolt.org home page and article page as viewed in Lynx 2.7
The evolt.org home page and an article page as viewed in Lynx 2.7. As you can see, it’s a few screens high, but all the content is accessible, and the table structure is logically ordered to allow for ease of use for Lynx users, and by association, screen-readers and other text browsers.

evolt.org home page as viewed in Lynx for Linux
The evolt.org home page as viewed in Lynx for Linux. Looks just as nice as in the MS Windows version of Lynx.

evolt.org article page as viewed in MacLynx
An article page on evolt.org as viewed in MacLynx. Noticing a trend?

evolt.org home page as viewed in Netscape Navigator 3.04
The evolt.org home page as viewed in Netscape Navigator 3.04. All the page layout matches that seen by the latest browsers. It degrades very well to older browsers while still retaining the site identity. All features of the site are still available, even if the browser doesn’t support the JavaScript in the form fields.

evolt.org article page as viewed in iCab for Mac
An evolt.org article page as viewed in iCab for Mac. iCab is a very nice standards-compliant browser. It cannot handle the CSS that colors the page, fonts, and form elements, but it does display the correct typefaces and sizes.

evolt.org home page as viewed in Mozilla 0.8
The evolt.org home page as viewed in Mozilla 0.8, probably the most standards-compliant browser out there.

evolt.org article page as viewed in Netscape 6.01
An evolt.org article page as viewed in Netscape 6.01, built on the Mozilla engine, with all the bloat we’ve come to love and expect.

evolt.org home page as viewed in Opera 5.0
The evolt.org home page as viewed in Opera 5.0, also considered one of the most standards-compliant browsers available.

evolt.org article page as viewed in Opera 5.0
An evolt.org article page as viewed in Opera 5.0

The caveats

Not everyone who posts an article or a comment can be guaranteed to use good coding practices, despite our best attempts to clean their code. That being said, not all article pages will validate, especially the older articles which we haven’t yet converted.

Some browsers display the black tab with an unfortunate offset; it sits a few pixels too high and off to the side a bit. We did everything we could, really, but we just couldn’t get pixel-perfect positioning in every browser and still be standards-compliant. This is most noticeable in comments where somebody posts a very long URL and it pushes the content table beyond the width of the browser window. It’s just something we’ve chosen to accept.

Down the road

One of the features the site is able to support, but doesn’t just yet (hey, we’re volunteers with day jobs… er…) is user-defined styles. Since the colors and font sizing are independent of the HTML, all we have to do to change the look of the site is link to a different CSS file. This means that if the text is too small, or doesn’t have enough contrast, you’ll be able to adjust it to a comfortable reading level. As a bit of a teaser, the three images below are screen captures of the exact same page, with a pointer to a different style sheet. You’ll mostly notice only color differences. I like the font size.
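The mechanism is just the <link> element. With hypothetical filenames, the persistent sheet plus its alternates might look like this (browsers of the Mozilla lineage already expose alternate sheets through a menu; a server could also rewrite the default href per user):

```html
<!-- Swap one href and the whole site changes its look -->
<link rel="stylesheet" type="text/css" href="/css/default.css" title="default">
<link rel="alternate stylesheet" type="text/css" href="/css/heaven.css" title="heaven">
<link rel="alternate stylesheet" type="text/css" href="/css/hell.css" title="hell">
<link rel="alternate stylesheet" type="text/css" href="/css/classic.css" title="classic">
```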

evolt.org with the 'heaven' style
evolt.org with the 'hell' style
evolt.org with the 'classic' style, for the die-hard fans

Do you think you’re special?

Yes, but that’s because my mom told me that while growing up. I think it was because she dropped me on my head. But you’ve heard that, and seen my incessant stuttering anyway…

But we’re not the only ones who’ve taken this step. Another web community, A List Apart, staged its own Valentine’s Day Massacre of all its non-standard HTML and rolled out a new version of its site. However, they took a slightly different approach.

By joining the Web Standards Project’s browser upgrade initiative, they’ve taken it a step further. They’ve dumped the use of tables for layout in favor of a two-column layout driven by CSS. Why would they do this? Well, I’ll leave it to them to explain; they do a much better job. They even have a great article going into the sordid details of how they trotted those poor old HTML tags out into the back alley to put a bullet in their bracket. Is their approach better? No, just different. In their case, they want to show developers it can be done, and try to promote the adoption of newer browsers. In evolt’s case, we want to show it can be done, and work in all browsers. These different goals make for a good contrast in ways to implement a standards-based site.

Have you seen others coding to standards? I know there are some others out there, but I’d love to hear from you (use the comment form below) if you’ve seen other sites who’ve taken this step.

Whose fault is all this?

There’s a long list of people who had a hand in this, and I’ve listed them below. They all deserve a lot of credit for sticking with this redesign for over six months when they could have been drinking milkshakes and trolling the list.

  • aardvark, the initial coding, validating, and browser testing. You can blame him (me) for much of the HTML and CSS.
  • djc, helping to integrate all this with the ColdFusion templates and data sources.
  • .jeff, for integrating some handy JavaScript and beating the hell out of ColdFusion to get it to work with our design.
  • isaac, for the initial design and helping ensure we stayed true to it.
  • mccreath, for updating the styles, the HTML, and working it into ColdFusion.
  • elfur, also for updating the styles, the HTML, and working it into ColdFusion.
  • martinb, for testing and pushing compliance.
  • marlene, for generally harassing us into doing something.
  • rudy, for making me defend the structural appropriateness of practically every tag used.
  • thesite, evolt’s mailing list dedicated to discussing all the development on the site, past, present, and future. Lots of good people there who threw ideas on the table and helped us test.