Why validation is being smart
There are many opinions on, and ways of dealing with, validation. Some disagree with its strict rules, some dispute its importance, others will go to flamewar unless you yield, some don't care, and a lot don't know what the fuss is all about. I won't take on a flamewar over the subject (although I'm no stranger to the phenomenon), but I do promote and push the use of it, both early and late, in any project I'm involved in.
The history of HTML on the net is a non-linear one: from the first version (HTML) to the second (HTML 2.0), through variations to HTML 3 and 3.2, a jump through the non-official 3.6, the quick W3C HTML 4 (the first real effort by a standards body to consolidate things), its quick follow-up fix HTML 4.01 and its sister XHTML 1.0, and lately the new XHTML 1.1. Sprinkle in a few ISO and proprietary versions, shaken up by a low willingness to follow a standards body, and you get the serious mess we today refer to as the Browser Wars, the HTML Dark Ages, or - more often - Tag Soup. Neutral browser developers and webpage markupists had to fight off, and with, the proprietary hacks of both Microsoft and Netscape, and as with any war, it is far more complicated than it looks at first glance.
So why bother validating against a set HTML standard that few browsers follow? There is so much variation, proprietary fiddling and general uncertainty about the formats that if you validate, you are more likely to have code that most browsers don't fully support than if you make a version of your page that contains bits of them all.
The keyword is of course "cleanup." The HTML mess of the internet is a serious one, and the efforts in standardizing and validating against such a common body is the only thing that makes sense. How else can we get rid of crud if not by promoting its opposite?
In a parliamentary speech about the bilateral economies of third-world countries, "yo", "homie" and "can you dig it?" would be considered crud; "Your honor", "allies" and "In conclusion ..." are the right jargon. On the net, language is power, just as in the real world. We do not want browsers that only know how to "hang out" and "chill"; a computer language that is to be shared across the world by millions of people requires a certain degree of unity, making sure that local and demographic jargon is not used. There is a good reason that in diplomatic circles "yo", "homie" and "can you dig it?" are never used.
There are browsers out there that "dig it" and hang out with their "homies", without considering that doing so limits the net. Sure, a browser can understand both "homie" and "allies", but there is a danger in keeping up with too many aliases: maintenance messes and redundancy. In a computer language such matters are of great importance, as they should be wherever the protocol is expected to be of a certain type.
Now, there are many types on the net, as the history of HTML above shows. But instead of going down the road of adding crud to a language, we must exercise the art of constraining it. Now, that ain't easy. A lot of browsers have got it all wrong. A helluvalot of HTML markupeers have gotten it even wronger. And forgiving browsers ain't making the job of telling such poor markupeers to stop the errors of their ways any easier, because - as they might say - it works, don't it?
Yes, it works - in the same sense that chewing gum works as a means of connecting the battery to your car. It works, sure. Is it smart? Most of the time, no.
A validator simply ensures that we're speaking the same language. It is not - as a lot of people think - merely a syntax-checker. Consider:
<p>
  <div>
    Text
  </div>
</p>
Now, that is perfect syntax. But does it validate? Um, no. You can't have a block-level element (DIV) inside a P element. There are good reasons for this, even if it might complicate things. And this is where many get it wrong: you can still do it. It would still work in most browsers. The browsers don't complain. But it is still wrong.
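One way to express the same structure that does validate - a sketch of one possible fix, not the only one - is to turn the nesting inside out, since a P element may happily sit inside a DIV:

```html
<div>
  <p>Text</p>
</div>
```

Same text, same two elements, but now the container relationship matches what the DTD allows.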
The validator also checks the semantics of your code, and checks them against a DTD. This is good; it means that a document which passes can be guaranteed to parse correctly in browsers and tools that parse against the same DTD. It is an exchange of rules and regulations for code, so that both sides of the world can work on the code at the same level. Good, eh?
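The DTD a document claims to follow is declared right at the top: the validator reads the DOCTYPE line and checks everything below it against that grammar. For XHTML 1.0 Strict, the declaration looks like this:

```html
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
```

No DOCTYPE, and the validator has no contract to hold you to - which is exactly the Tag Soup situation all over again.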
So validating your documents is a smart thing to do. Not only because you can make sure the syntax is correct, the semantics are good and the elements are placed right, but because other people, tools and programs do so too. Don't give in to the sloppiness the browsers tolerate; they are not the reason we've got HTML: we are.
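As a starting point, a minimal document that passes the W3C validator as XHTML 1.0 Strict - assuming nothing about your content beyond a title and one paragraph - can be as small as this:

```html
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en">
  <head>
    <title>A valid page</title>
  </head>
  <body>
    <p>Text</p>
  </body>
</html>
```

Run it through the validator at validator.w3.org, keep it passing as the page grows, and you'll catch the DIV-inside-P class of mistake the moment you make it rather than months later.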
"Sarchasm: The gulf between the author of sarcastic wit and the person who doesn't get it."