MSDN pain

Will someone please tell me when MSDN started to suck? I remember back when I first started with Visual Basic, MSDN was really great. It was a wonderful reference source with lots of good material. The site was relatively quick and easy to use, the documentation was useful, and the examples tended to be at least moderately informative.

What the hell happened? Today I was looking up some information on using the XPathNodeIterator class in the .NET framework and Google directed me to the MSDN page for it. It was horrible!

The first thing I noticed was the truly massive page size. I literally sat there for seven seconds watching Opera's page load progress bar move smoothly from zero to 100%. And that's on the T1 connection at work!

The second problem is the class declaration, which says that it's a public, abstract class that implements the ICloneable and IEnumerable interfaces. There's nothing wrong with including that information per se. I personally don't think that including the code for the declaration is particularly helpful, as they could just as easily say that in pseudo-code or English, but whatever. What I do object to is that they included this declaration in five different programming languages! Why?!?! Of what conceivable value is it to waste half a screen's worth of text to display a freakin' declaration in VB, C#, C++, J#, and JScript? Is the average Windows programmer really so completely clueless that he can't decipher this information without a declaration in his particular language? It's ridiculous!
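
For what it's worth, everything that half-screen of declarations conveys fits on a single line. Going just by the page's own description, the C# version is:

```csharp
// Signature only, as the MSDN page describes it (not a runnable sample):
public abstract class XPathNodeIterator : ICloneable, IEnumerable
```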

The third problem is the code samples. Or should I say "sample." There are three code blocks, each of which has exactly the same code, except translated into different languages - VB, C#, and C++. Again, why? Is this really necessary? And if it is, why do they have to display all three on the same page? Why not break out at least two of the samples into separate pages? It's just a pain to have to sort through lots of irrelevant information.

My last complaint is the content of the example itself. Maybe this is just a product of my not yet being too familiar with .NET or with object-oriented enterprise-level frameworks in general, but the code sample just struck me as kind of bizarre. The goal of the algorithm was to iterate through a set of nodes in an XML file. To do this, they created an XPathDocument object and got an XPathNavigator object from that. Fine. Then they selected a node with the navigator object to get an XPathNodeIterator object. OK, I get that. Then they saved the current node of the iterator, which returns an XPathNavigator. Umm.... And after that, they selected the child nodes from the navigator to get another XPathNodeIterator, which they then used to actually iterate through the child nodes.
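
To make that concrete, here's a rough sketch in C# of the shape I'm describing. This is my reconstruction from memory, not the actual MSDN sample; the file name and XPath expressions are placeholders:

```csharp
using System;
using System.Xml.XPath;

class VerboseSample
{
    static void Main()
    {
        // Load the document and get a navigator from it.
        XPathDocument document = new XPathDocument("books.xml"); // placeholder file
        XPathNavigator navigator = document.CreateNavigator();

        // Select a node set, yielding the first XPathNodeIterator.
        XPathNodeIterator nodes = navigator.Select("/bookstore/book"); // placeholder XPath

        if (nodes.MoveNext())
        {
            // Current hands back *another* XPathNavigator, positioned
            // on the node the iterator is currently sitting on...
            XPathNavigator current = nodes.Current;

            // ...and selecting that node's children yields a *second*
            // XPathNodeIterator, which finally does the real iterating.
            XPathNodeIterator children = current.SelectChildren(XPathNodeType.Element);
            while (children.MoveNext())
            {
                Console.WriteLine(children.Current.Name);
            }
        }
    }
}
```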

Is that normal? Do people actually write code like that? I mean, I can follow what they're doing, but it seems like an awfully circuitous route. Why not just go straight from the initial navigator to the final iterator? You can chain the method calls rather than creating a new variable for each object along the way, so why not do that? I suppose the charitable interpretation is that the example is intentionally verbose and general for instructive purposes. But to me, all those extra object variables are just confusing. They add another, seemingly redundant, level of indirection. Maybe I'm atypical, but the direct approach makes a lot more sense to me.
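
Here's the direct route I have in mind, again as a hedged sketch with the same placeholder names as above; a single XPath expression selects the child nodes in one pass:

```csharp
using System;
using System.Xml.XPath;

class DirectSample
{
    static void Main()
    {
        // One navigator, one iterator: select the children of the
        // first book element directly and loop over them.
        XPathNavigator navigator =
            new XPathDocument("books.xml").CreateNavigator(); // placeholder file

        XPathNodeIterator children = navigator.Select("/bookstore/book[1]/*"); // placeholder XPath
        while (children.MoveNext())
        {
            Console.WriteLine(children.Current.Name);
        }
    }
}
```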
