Sprucing up the blog

I've been trying to spruce up the blog the last few days. I'm trying to make things a little more reader-friendly and possibly increase traffic a little.

First, for the 11 people who actually subscribe to it, I've changed the RSS feed over to FeedBurner. That gets me subscriber statistics, a little extra exposure, and a bunch of other miscellaneous features. It's also completely painless for everybody else, since I just set a mod_rewrite rule to redirect everybody to the new external feed. I even did a little hacking to make the new feed URL integrate nicely with LnBlog.
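
For the curious, the redirect amounts to a couple of mod_rewrite lines. This is a hypothetical sketch, not my actual rule (the feed path and FeedBurner URL are placeholders); the user-agent check keeps FeedBurner's own fetcher from being redirected into a loop:

```apache
RewriteEngine On
# Everyone except FeedBurner's fetcher gets sent to the external feed
RewriteCond %{HTTP_USER_AGENT} !FeedBurner
RewriteRule ^feed\.xml$ http://feeds.feedburner.com/example [R=302,L]
```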

I also set myself up with a Technorati account. This provides another handy tool for gaining wider exposure. It's a nice complement to TrackBack and Pingback in addition to actually being a nice service to use.

I'm also experimenting with my page layout. In particular, I'm reorganizing the sidebar. I don't really know what the optimal layout is, but I do know I wasn't particularly happy with the old one. There's still plenty of stuff left to play with, so I guess I'll just try some different things and see what works.

Tomorrow I'll get into the motivation for the change. Right now, I need to go to bed. I was just barely able to muster the concentration to write this, so I'm certainly not up for a long explanation.

DHCP fails on boot in Feisty

On Friday, I upgraded my laptop to Kubuntu Feisty. For the most part, it was uneventful. My only immediate complaint was that this upgrade, too, took 24 hours to complete. There are just too many places where the upgrade tool prompts for user input, so if you walk away, it ends up taking 5 times as long as it should.

However, later on, I did find one problem: my wireless network wasn't working. It's an integrated Broadcom BCM4318 which uses the Windows drivers through NDISwrapper. NDISwrapper was still working and the wireless extensions seemed to be functioning properly. Moreover, the interface was up. It's just that it never got an address from the DHCP server.

After some experimentation, it turned out that just restarting the network devices with sudo /etc/init.d/networking restart got things working. Same thing if I did an ifdown and ifup, or even manually ran dhclient. Once the system is booted, pretty much any network re-initialization will fix things. It's just on initial boot that I don't get an IP.

KNetworkManager configuration progress
Another thing is that if I use KNetworkManager to reconnect to my wireless access point, I lose my IP again. For some reason, the progress never gets past 28%, the "Activation stage: Configuring device" stage. I suspect there's some relationship, but I really have no idea what it is. I'm not that deep into the bowels of Ubuntu configuration.

The immediate fix was to simply add /etc/init.d/networking restart to my /etc/rc.local. That gets us an IP at login time and keeps Sarah from calling me in because she can't check her e-mail. In the long term, I'd like to figure out what the heck is causing this. There's a bug in Launchpad that looks similar, but I didn't see a fix.
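
In case anyone wants to replicate the workaround, it's just a one-line addition. Here's a sketch, operating on a scratch file rather than the real /etc/rc.local (which needs root to edit):

```shell
# Sketch of the rc.local workaround; on the real system, edit /etc/rc.local
# as root. rc.local should end with "exit 0".
rclocal=$(mktemp)
cat > "$rclocal" <<'EOF'
#!/bin/sh -e
# Work around the missing DHCP lease on boot: re-init networking once
/etc/init.d/networking restart
exit 0
EOF
cat "$rclocal"
```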

Weighing in at 193.6

This week's official weigh-in result is 193.6 pounds. That's down 1.2 pounds from last week and 39.6 total.

Now that the weather is turning nice, the carb counting is really starting to suck. We've got to watch the pasta salad, buns for burgers and hot dogs, and worst of all, no ice cream. It just doesn't seem the same without the traditional summer food. At least we can cheat a couple of days a month without breaking the diet.

Sabotaging productivity

This week, .NET Rocks had a great interview with Jeff Atwood. Jeff is a really insightful guy and listening to him was as much fun as reading his blog. In fact, this interview inspired me to start listening to other episodes of .NET Rocks. Well, that and the fact that Carl Franklin co-hosts Hanselminutes, which I also enjoy.

One of the topics the interview touched on was Jeff's Programmer's Bill of Rights. It enumerates six things a programmer should expect if he is to be productive.

I found this both depressing and comforting. It's depressing because, as Jeff pointed out in the interview, these things are neither unreasonable nor hard to fix. You can basically just throw money at them without putting in any real effort. These conditions shouldn't be widespread enough that anyone needs to bring them up.

As for comfort...well, it's just nice to know you're not alone. I'm currently one of those poor schleps Jeff talked about who's still working on a single 17" CRT monitor and a three-year-old PC, sitting in a cubicle right next to one of the network printers. I'm not even within sight of a window and my cube is literally just barely big enough to fit my desk. I write my code in SharpDevelop because my boss won't spring for a Visual Studio upgrade. Two years ago it was, "Well, we'll wait for the 2005 version to come out instead of buying 2003." This year it was, "We'll wait for the 2007 version to come out instead of buying 2005." And last but not least, despite the fact that we write mostly reporting-heavy information systems, I use the version of Crystal Reports that came bundled with VS.NET because, as crappy as it is, it's the best thing available to me.

I have to agree with Jeff, Richard, and Carl. The message you get from a setup like this is clear: you are not important. We don't value you as a person, we don't value your work, and we're certainly not going to waste money on making your life easier. The net effect is that morale, especially among the more clueful people in my office, is in the gutter. There's misery galore and productivity is next to nothing. But fortunately we work for the government, so nobody notices. And no, that was not a joke.

Sometimes it seems like our environment is tailored specifically to sabotage productivity. It's kind of like the keyboards they put on the laptops that the police use. I'm the hapless IT guy who has to do configuration and maintenance on these laptops, and I can tell you that the only explanation for those keyboards is that I did something really, really terrible in a past life. They're ruggedized keyboards made of semi-hard plastic. The problem is that they're so rugged that it's completely impossible to type on them. You have to use the two-finger method because the keys are too hard to press with your little fingers. Trying to type with any speed at all is completely futile. And yet the cops are somehow expected to type up tickets and accident reports on these things. It's a wonder they even give out tickets anymore. Actually, maybe that was the idea....

I suppose this is what I get for taking an IT job when I really wanted to be in software development. In retrospect, maybe I should have stayed a full-time student that extra semester or two, finished my damned thesis and looked for a job with a real software company. But I thought I needed some experience and this was the best offer I got, so I took it. Unfortunately, I was too inexperienced to know that crappy experience isn't necessarily better than no experience.

Though on the up side, when I took this job is when I moved in with my (now) wife. It also provided the money that paid for that engagement ring. So in some ways this was the right decision. It's just that professional advancement wasn't one of them.

Now I just need to finish my damned Master's thesis and get the hell out of here.

No, bloggers aren't journalists

Last week, Jeff Atwood posted an anecdote demonstrating yet again that bloggers aren't real journalists. I know this meme has been floating around for some years, but I'm still surprised when people bring it up. In fact, I'm still surprised that it ever got any traction at all.

I'm going to let you in on a little "open secret" here: blogging in 2007 is no different than having a Geocities site in 1996. "Blogging" is really just a fancy word for having a news page on your website.

Oh, sure, we have fancy services and self-hosted blog servers (like this one); there's Pingback, TrackBack, and anti-comment spam services; everybody has RSS or Atom feeds, and support for them now built into browsers. But all that is just gravy. All you really need to have a blog is web hosting, an FTP client, and Windows Notepad.

That's the reason why bloggers in general are not, and never will be, journalists. A "blog" is just a website and, by extension, a "blogger" is just some guy with a web site. There's nothing special about it. A blogger doesn't need to study investigative techniques, learn a code of ethics, or practice dispassionate analysis of the facts. He just needs an internet connection.

That's not to say that a blogger can't practice journalism or that a journalist can't blog. Of course they can. It's just that there's no necessary relationship. A blogger might be doing legitimate journalism. But he could just as easily be engaging in speculation or rumor mongering. There's just no way to say which other than on a case-by-case basis.

Like everything else, blogging, social media, and all the other Web 2.0 hype are subject to Sturgeon's law. The more blogs there are out there in total, the more low-quality blogs there are. And the lower the barrier to entry, the lower the average quality. And since blogs have gotten insanely easy to start, it should come as no surprise that every clueless Tom, Dick, and Harry has started one.

I think George Carlin put it best:

Just think of how stupid the average person is. Then realize that half of them are stupider than that!

Any average person can be a blogger. Thus the quality of those blogs will follow a normal distribution. For every Raymond Chen, Jeff Atwood, and Roger Johansson, there are a thousand angst-ridden teenagers sharing bad poetry and talking about not conforming in exactly the same way. They're definitely bloggers, but if we're going to compare them to journalists, then I think society is pretty much done for.

Hiding any file in KDE

I recently started taking advantage of a handy little feature utilized by Kubuntu: the .hidden file. It provides a simple way to hide any file in a directory.

As you may know, the traditional UNIX method for hiding files is to start the name with a period, such as ".vimrc" or ".profile". That works well enough, but there's one obvious problem: you have to rename the file. You can't just go renaming any old file or directory willy-nilly. That's how you break your system.

Hence the .hidden file. You just create a .hidden file in a directory and in it you put the names of any files or directories you want to hide, one per line. That's it. Once you have that, KDE will treat those files/directories as hidden.
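
To make it concrete, here's the whole procedure in a scratch directory (the directory names are just examples; the same two commands work in ~):

```shell
# Create a directory with some subdirectories, then hide two of them.
# One name per line in .hidden; KDE's file views will skip those entries.
dir=$(mktemp -d)
mkdir "$dir/Mail" "$dir/texmf" "$dir/Documents"
printf 'Mail\ntexmf\n' > "$dir/.hidden"
cat "$dir/.hidden"
```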

This is quite useful for my home directory, because I have my home directory set as my desktop. For the most part, I like this setup. However, there are always a few directories that I seldom, if ever, need to see in the file manager, like ~/Mail or ~/texmf. Using a .hidden file conveniently frees up those extra pixels without the hassle of dealing with symlinks or shortcuts and without the need to reconfigure any software.

Why have bug tracking when you can use post-its?

I came across a weird article today. It's entitled Why Bugs should not be Tracked and the basic premise is that it's a bad idea to use bug tracking software.

What struck me as really strange about this article is the final conclusion. After telling us how bug tracking is bad and unnecessary, the author offers an alternative: write your priorities down on a piece of paper. In other words, do the extremely low-tech version of what bug tracking software is for. Say what?

The real problem that this article addresses is that bug tracking systems are easily abused. It's easy to enter bugs and just let them sit there until the end of time. However, if you're using something like a piece of paper or a spreadsheet, you have to dispose of the "stale" issues in one way or another, otherwise the whole thing becomes completely unmanageable.

There's an easy response to that, though: "Don't abuse your bug tracking system." Don't let bugs pile up. Categorize sanely. Do some house-cleaning when the number of outstanding bugs gets too big for comfort. It's not rocket science, just use the tool productively. Agile development is all well and good, but sometimes it seems like they're throwing the baby out with the bath water.

Sybase + ADO.NET = pain

I've been wrestling with Sybase again lately. Both last Thursday and today, I had to write a couple of small programs to update and reformat values in some Sybase tables at work. The programs themselves were very simple and took maybe half an hour to write. However, due to the pain inherent in connecting to Sybase, this quick update turned into a several-hour ordeal.

The database in question is running Sybase SQL Anywhere 7, which is about three major versions out of date.

While I wrote the original client program in VB6 with ADO 2.6, I have now switched to C#. This is partly because VB6 sucks by comparison to C# and partly because I'm trying to make my skillset more marketable. So when I wrote those little update programs, I did it in C# with .NET 2.0. However, that didn't go so well.

My first attempt at connecting to the database, which runs Sybase SQL Anywhere 7, was to use the System.Data.OleDb classes with the same provider that worked under ADO 2.6. But, of course, it didn't work with ADO.NET. I even tried following the Sybase how-to on ADO.NET, but ExecuteDataReader always resulted in a "no such interface supported" COM exception. The only difference was that I was using the ASAProv.70 provider instead of ASAProv.80, which I don't have. Presumably the older provider doesn't do ADO.NET.

The second attempt was to use the .NET 2.0 System.Data.Odbc classes. My reasoning was that, since ODBC is an industry standard supported by virtually everyone, using straight ODBC should work. And it did...sort of. I was able to connect to the database and read records without incident. The problem was with updating using OdbcCommand objects.

The code I used was pretty unremarkable, which is why I was a little surprised when it failed. It looked something like this:

OdbcConnection dbconn = new OdbcConnection("DSN=SomeDSN");
dbconn.Open();
/* Lots of unrelated code.... */
// The '?' placeholders are positional, so parameters bind in the order added.
string sql = "update MyTable set Bar = ?, Baz = ? where ID = ?;";
OdbcCommand cmd = new OdbcCommand(sql, dbconn);
cmd.Parameters.Add("Bar", OdbcType.Char, 15).Value = someVar;
cmd.Parameters.Add("Baz", OdbcType.Char, 15).Value = someOtherVar;
cmd.Parameters.Add("ID", OdbcType.Int).Value = someIDVar;
int result = cmd.ExecuteNonQuery();

There shouldn't really be much to go wrong there. However, executing the command resulted in the following exception:

Exception System.Data.Odbc.OdbcException was thrown in debuggee:
ERROR [HY000] [Sybase][ODBC Driver][Adaptive Server Anywhere]General error: Host variables may not be used within a batch

Googling this message came up with this page from the Sybase website, which helpfully tells me:


You have attempted to execute a batch which contains host variable references. Host variables are not supported within a batch.


That's it. Just a restatement of the error message. I find this to be fairly typical of Sybase documentation.

At this point, I was faced with two questions:
1) What the heck is a "host variable?"
2) Why am I getting a message that they aren't allowed?

The first question was answerable with a little Googling. As explained by this page from the IBM iSeries manual, a host variable is just what the name suggests: an external variable that gets used in an SQL statement. You can use them for input and output parameters when writing more complicated SQL programs.

For the second question, Google failed me, so I can only speculate as to the answer. My guess is that ADO.NET is translating the parameters into host variables rather than inserting the values directly into the SQL. Presumably my version of the ODBC provider doesn't support host variables, hence the error. It seems to make sense, but as I said, that's only a guess.

My final solution? After spending way too much time on fruitless searching and experimentation, I finally decided to just go the quick and dirty route of string concatenation. Instead of parameters, I just used a String.Replace to escape quotes and spliced the new field values directly into the SQL. It's ugly, and I wouldn't want to use it in a "real" program, but you know what? It works.

Feisty upgrade

Kubuntu Feisty is out and I've upgraded my desktop. All in all, it was a fairly painless process.

The upgrade process for Feisty was much improved over earlier versions. The first time I upgraded, from Hoary to Breezy, it didn't go so well. In fact, the first couple of upgrades were fairly risky affairs. The upgrade from Breezy to Dapper was an unmitigated disaster, with the upgrade simply removing some key packages, like all of KDE.

This time, it went pretty smoothly. I did have to start over again, because the upgrade stalled while trying to contact the PLF repositories, which are now offline, but after I removed that entry from my sources.list, things went very smoothly.

The best part was probably that this upgrade was all graphical. You use Adept to install all the available updates, and when it's done, a little wizard pops up telling you that a new version of Kubuntu is available and asking if you'd like to install it. After that, you click next a few times, watch a few progress bars, and click through a few confirm prompts. That's about it. The installer even reboots the system for you at the end.

The actual upgrade took several hours to complete. In my case, it actually ran for around 12 hours, but I'm sure much of that was just sitting there waiting for input. You see, downloading the upgrade packages took a couple of hours, but then it prompted me to install. And then, throughout the installation, there were several prompts for configuring various packages. So, unfortunately, it's not something you can just leave for a few hours and come back to an upgraded system. It needs a little babysitting. I had to come back periodically and do things like accept the default options for some RAID software that I don't even know why it's installed (I don't have a RAID array on my system) or tell it what to do with configuration files that had changed in some trivial way. Thus, since I was busy, I ended up spreading it out over the entire day.

So far, Feisty is nice enough. I haven't used it too much yet, but there doesn't seem to be much worthy of comment so far. It seems like just new versions of everything and a few incremental improvements. Which is as it should be. If I wanted a revolutionary change, I'd buy a copy of Windows Vista. Kubuntu seems to be getting very solid now, which is just how I like things.

Averaging out to 194.8

We'll call this week's weigh-in result 194.8. The scale said 195.8 on Thursday, but 193.8 on Friday, so I'm going to just split the difference this week. That puts me down two more pounds from last week for a grand total of 38.4 pounds to date.

While it's great to be making progress this fast, I'm currently in kind of an uncomfortable spot. See, none of my clothes fit right anymore. I'm basically walking around looking funny all the time. But since I've got another 15 pounds to go, there's no point in buying new clothes, because I'd just have to replace them again in a few months.

Not that I'm complaining. If that's the biggest problem I'm having, then I guess all is going well. Though I do really miss Chinese food....

Amarok upgrade = disappearing podcasts

The other day, I made the mistake of actually paying attention to that "updated packages available" icon in my system tray and actually installed those updates. I really should know better by now. Kubuntu is great, but I swear every time I install updates, something breaks.

This time, something happened to Amarok. I'm not sure what, but the next day, my podcasts were just gone. The previously downloaded files were still there, but the "Podcasts" folder in my playlist panel was empty.

The worst part was that, when I tried to re-add the few podcasts I subscribe to, things didn't work properly. I used the "configure children" option to set the defaults to download under a particular directory and to download when available. However, when I added child podcasts, the settings reverted to the system defaults. And for some reason, it took several tries to get the "download when available" option to stick.

So my question is: what the hell?!? That's just weird and I can't figure out, for the life of me, why that would happen. But on the other hand, Ubuntu Feisty is supposed to be coming out in the next week, so hopefully this won't be an issue for much longer.

No zone => Windows script-foo

It's been a very "IT" kind of afternoon.

I started out the morning well, got some coding done, and was in "the zone" up until about 10:50. Then the fire alarm went off. I don't know why, because there wasn't any major fire. Maybe it was a drill/test, maybe some kid pulled the fire alarm. Who knows.

Anyway, I had to go stand outside in the unseasonably cold weather (it's in the 30°F range this week) for about 10 minutes. That sapped a great deal of my motivation. Then I had to stand in line for another 10 minutes while the guards checked IDs and put people through the metal detector. That pretty much knocked me out of the zone for the rest of the day.

So this afternoon was devoted to Windows XP troubleshooting, because it's relatively brain-dead. I worked with a laptop user to finally track down a long-standing error message that had been showing up in one application. And by "track down" I mean I finally got to actually see the error message and quickly diagnosed it as a file permission problem. I would have done that a long time ago, except that these laptops have no connectivity to our network and are almost never available at a time and location where I can look at them. Plus nobody ever bothered to tell me the exact error message, which makes it a little harder to figure out what's going on. While I try to keep my skills sharp and up-to-date, I have to admit that I'm still woefully lacking in the telepathic debugging department.

Now that I've got that problem tracked down, I'm presented with another. This problem affects 20-odd laptops. How do I fix it on all of them without having to go around and physically touch each one? Remote access is out, so the first obvious solution is to get the users to do it. However, that won't work because I can't give them the admin password. The second obvious solution is to pawn it off on the help desk, but they'd never go for that. After all, we operate on the "whoever touches it first is stuck with it forever" theory of assigning support tasks. Hardly an optimal algorithm, but that's the way it is and nobody with any power is willing to rock the boat.

That leaves me with one option: script that sum bitch! I might not be able to trust the users with complex instructions, but I can certainly trust them to dump some files on a flash drive and double-click an icon.

The only problem is administrator access. The Windows runas command prompts for passwords. I need something that can use a stored password from a batch file, preferably one kept in an encrypted file. Luckily, a little Googling turned up just such a tool: lsrunase. That looks like it should do the trick. Another quick Google for a Windows command reference to find the CACLS command and I should be good to go. All I need to do now is write the batch file and test it out on one of the laptops. But that'll have to wait for tomorrow.

Laptop battery life

I finally got around to looking up a solution to my laptop power problem. I'd actually never really worried about it, since I almost always use the laptop plugged in, but I read a blog post complaining about battery life under Ubuntu, so I figured I'd look into it.

My Dell Inspiron B120 normally got about an hour and ten minutes on battery. I never actually ran Windows on it, so I have no basis for comparison. However, I'd read other comments about people getting two to three times as much battery life on Windows. For instance, my brother's Inspiron 1501 gets about 3 hours.

The good news is that there's "laptop mode." By flipping on the ENABLE_LAPTOP_MODE setting in /etc/default/acpi-support, I gained about 20 or 30 minutes.
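
If you want to try it, it's a one-line change. Here's a sketch operating on a scratch copy of the file; on the real system you'd edit /etc/default/acpi-support as root:

```shell
# Flip ENABLE_LAPTOP_MODE in a scratch copy of the config file.
conf=$(mktemp)
echo 'ENABLE_LAPTOP_MODE=false' > "$conf"
sed -i 's/^ENABLE_LAPTOP_MODE=.*/ENABLE_LAPTOP_MODE=true/' "$conf"
cat "$conf"
```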

The bad news is that there's not an awful lot of other tweaking to be done. There are some other tips on the Ubuntu wiki, but I haven't found anything that seems to make a dramatic difference. At least, not anything I can configure without a truly intimate knowledge of the system.

So basically, it seems like we're kind of out of luck for the time being. But I'm not going to complain too loud. I'm just glad all the hardware in the laptop actually functions.

Down to 196.8

Well, at this week's weigh-in I was down to 196.8. That's down 2.4 pounds from last week and puts me down a grand total of 36.4 pounds.

Mac and UNIX security

You often hear Mac and Linux people going on about how their operating system doesn't suffer from viruses and malware. They claim it's because their OS is inherently more secure. The Windows people then retort that the real reason is that hardly anybody uses MacOS or Linux, so the hackers just don't put much effort into it. A huge flamewar then ensues, in which very few of the participants actually know what they're talking about.

I read an InfoWorld blog on this very topic today. While I found it largely unremarkable, I did take issue with one passage.


The difference isn't market share, it's the foundation of the operating systems. Given that most virus authors and hackers are in it for the ego, don't you think that there would be a huge incentive to be the first one to write a widespread OS X, Linux, or FreeBSD virus?

There are two problems with this passage. First, there's the claim that "most virus authors and hackers are in it for the ego." That may have been true 10 years ago, but not anymore. These days, many hackers and malware writers are in it for the money. Some of them are even in bed with organized crime. It's not about learning how systems work anymore. Now it's big business.

In light of this, it's just absurd to dismiss market share as a non-issue. Just look at the numbers. On desktop PCs, Windows has well over 80% market share - probably more like 90%. So if you're trying to build a big botnet, what are you going to target? Windows is generally less secure by default and has more non-technical users, and if you compromise just 10% of those systems, that's more machines than if you got every Mac out there. With numbers like that, targeting anything other than Windows is just a waste of time.
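
To put rough numbers on that (the ~90% Windows figure is from above; the 5% Mac share is my own ballpark assumption for illustration):

```shell
# Back-of-the-envelope: a hypothetical pool of one million desktop PCs.
total=1000000
windows=$((total * 90 / 100))   # ~90% Windows share
macs=$((total * 5 / 100))       # ~5% Mac share (assumed for illustration)
echo "10% of the Windows boxes: $((windows / 10))"
echo "every single Mac:         $macs"
```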

Of course, the underlying operating system may have something to do with why Mac and Linux users have fewer security worries. However, it's certainly not the only reason. The default configuration of each system is another big reason - the out-of-the-box Windows configuration has historically been wide-open, while MacOS X and Linux are fairly secure by default. But if we're going to be honest, we can't ignore market share. It may or may not be the primary reason, but to claim it's not an issue is just wishful thinking.

Kubuntu wins again

I collected a little more proof that Linux is better than Windows today. Or, at the very least, Kubuntu is better than Windows for the work I do. In particular, installing new software is so much more convenient, despite the crap that the nay-sayers spew.

One of the Senior Programmers (SP for short) came to me with a problem related to viewing multi-page TIFF files. Basically, the SP had written a program to normalize the canceled check scans we get from various banks. Of course, each bank has their own data format and associated viewing application, but nobody wants to have half a dozen programs to do the same thing. Hence this system.

Anyway, one of the banks had changed the format of the data they were sending us, and while each file contained images of several checks, the SP's third-party TIFF tools and libraries could only access the first one. It seems the internal pointers in the TIFF file were either not present or not correct. So we put our heads together and the SP gave me a copy of some sample data to play with.

Now, since I was at work and running Windows XP, I could have scoured the internet for TIFF tools, downloaded a half-dozen MSI or EXE files, and installed them. Instead, I fired up a Kubuntu Edgy virtual machine and started up Adept.

The first thing I installed was the libtiff tools. They had the same limitation as the SP's tools. So I installed KHexEdit to look at the raw TIFF file data. Each of these was selected and installed in a matter of seconds.

Meanwhile, the SP had managed to dig out an old DOS-based hex editor from the bad old days of QBasic. Why? Because that was the only thing lying around. Needless to say, it didn't hold a candle to KHexEdit and was, in fact, totally inadequate.

I, however, was investigating the suspicious "offset" and "length" attributes in the XML file that came with these images. I jumped to the first offset value in KHexEdit and noticed that, like the beginning of the file, it started with a TIFF byte order marker. And so did the next offset. So I set bookmarks at those offsets, copied the binary between them, and pasted it to a new file. KHexEdit made it almost painfully easy. And, as I suspected, the resulting file was, in fact, its own valid TIFF file. The bastard bankers had just concatenated a bunch of TIFF files together!
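
The trick generalizes: once you know each image's offset and length, any byte-copying tool can do the splitting. Here's a toy reconstruction with dd standing in for my KHexEdit copy-and-paste (the data and offsets are invented; real TIFFs obviously contain more than a byte-order marker):

```shell
# Two fake 8-byte "TIFF" segments concatenated into one blob.
blob=$(mktemp); out=$(mktemp)
printf 'II*\0AAAA' >  "$blob"   # segment 1 at offset 0
printf 'II*\0BBBB' >> "$blob"   # segment 2 at offset 8
# Copy 8 bytes starting at offset 8 -- the second embedded "file".
dd if="$blob" of="$out" bs=1 skip=8 count=8 2>/dev/null
head -c 3 "$out"   # the extracted segment starts with its own TIFF marker
```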

And so the problem was solved. And with the help of Kubuntu, I did it before the SP even had time to hunt down and install a proper hex editor.

Top excuses for bad design

Roger Johansson over at 456 Berea Street has posted a really great rant on lame excuses for not being a web professional. It's a great read for any developer (of any type) who takes pride in his work.

My personal favorite excuse is the "HTML-challenged IDEs and frameworks." It always seems odd to me that back-end developers can look down on the front-end web designers as "not real programmers" and yet be utterly incapable of writing anything even close to valid markup. Sometimes they don't even seem to be aware that there are standards for HTML. They have the attitude that "it's just HTML," that it's so simple as to not even be worth worrying about. "Hey, FrontPage will generate it for me, so I'll just concentrate on the important work."

This is fed by the "the real world" and "it gets the job done" excuses. The "target audience" excuse even comes into play a bit. After all, nobody in the real world worries about writing HTML by hand. Especially when FrontPage gets the job done. And since our target audience all uses Internet Explorer anyway, it's all good.

This sort of thinking is especially widespread in the "corporate IT" internal development world. For example, if Roger ever saw the homepage of my employer's "intranet," he would fall over dead, having choked on vomit induced by the sickeningly low quality of the HTML. The index.html page contains an ASP language declaration (but no server-side script, oddly enough - probably a relic of a Visual Studio template) and a client-side VBScript block before the opening HTML element. Several of the "links" on the page are not actually links at all, but TD elements with JavaScript onclick events to open new pages. For that matter, it uses tables despite the fact that it's laid out as a couple of nested lists of links. Needless to say, the markup doesn't even come close to validating against any DOCTYPE. And this was written by a senior programmer who fancies herself the "local expert" on web development.

I guess it's just a problem of mindset. Some people just want to get the project finished and move on. They don't really care about code quality, maintainability, interoperability, or any of those other little things that make for really good software. As long as the customer accepts the final product, everything is fine.

While I don't share that attitude, I can sort of understand it. I'm interested in software development for its own sake. I care about the elegance and purity of my work and enjoy trying new technologies and learning new theories and techniques. But to some people, programming (or web design) is just a job. It's not a hobby or part of their identity, but simply a way to pay the mortgage. For those people, it's probably hard to get excited about semantic HTML or the efficacy of object oriented programming. As long as they get the project done and get paid, that's really all that matters.

Of course, that's just an explanation. It's no excuse for professional incompetence or unwillingness to learn. If you're going to call yourself a professional, I think you have an obligation to at least try to keep up with the current technologies and best practices in your niche. Not everybody has to be an über geek or an expert on the latest trend, but it would be nice if all web developers were at least aware of the basics of web standards, all CRUD application developers knew the basics of relational theory and that XML is more than just angle brackets, and all desktop developers had some basic grasp of object orientation. Yeah, that would be really nice....

Semi-working OpenGL

After playing around with my /etc/X11/xorg.conf a bit, I seem to have OpenGL sort of working on my new monitor. I should probably still replace my aging ATI Xpert 128 card, but for the time being I can at least play ZSNES at a decent resolution.

The main problem was that my card apparently can't do 1440x900 resolution with 24-bit color. In order to get DRI working at anything above 1024x768, I had to drop down to 16-bit color. That does make things just a little less pretty, but I can live with it. However, even after that, the generated monitor setup still didn't work.
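One plausible explanation for the 24-bit limit is simple video memory arithmetic. If you assume the driver wants front, back, and depth buffers for 3D (a rough model, not anything confirmed from the X.org source), then 1440x900 at 24-bit color (stored as 32 bits per pixel) nearly fills a 16MB card, leaving almost nothing for textures:

```python
def framebuffer_mb(width, height, bytes_per_pixel, buffers=3):
    """Rough video memory estimate assuming front + back + depth
    buffers at the same depth. Real drivers also need room for
    textures and alignment, so this is a lower bound."""
    return width * height * bytes_per_pixel * buffers / 2**20

print(framebuffer_mb(1024, 768, 4))  # 9.0 MB: fits in 16MB with room to spare
print(framebuffer_mb(1440, 900, 4))  # about 14.8 MB: almost nothing left for 3D
print(framebuffer_mb(1440, 900, 2))  # about 7.4 MB: 16-bit color halves the cost
```

That would explain why 3D works at 1024x768 with 24-bit color and at 1440x900 with 16-bit color, but not at 1440x900 with 24-bit color.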

After hacking up my xorg.conf file, I seem to have come up with something functional. I've been testing two monitor setups, with corresponding screen setups. The first, named "Generic Monitor," is the monitor section generated by the Kubuntu display configuration tool. It has about 16 modelines and looks oh-so-very important. The second is "Old Monitor," which is my simple, hand-written configuration. It simply has horizontal and vertical refresh rates and two modelines stolen from the other section. Both monitor configurations work for 2D graphics, but I've only been able to get the "Old Monitor" setup to work with DRI. For some reason, when using the "Generic Monitor" setup, the glxinfo command dies with a segmentation fault. Don't ask me why. I was just happy to get both OpenGL and widescreen resolution working at the same time.
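For the curious, a minimal hand-written section like my "Old Monitor" one looks something like this. The sync ranges and modelines below are generic 1440x900/1024x768 values, not necessarily the exact numbers from my file, so check your monitor's specs before copying:

```
Section "Monitor"
    Identifier   "Old Monitor"
    HorizSync    30-83         # use your monitor's actual ranges
    VertRefresh  56-75
    # Modelines copied from the tool-generated section
    Modeline "1440x900" 106.50 1440 1528 1672 1904 900 903 909 934 -hsync +vsync
    Modeline "1024x768"  65.00 1024 1048 1184 1344 768 771 777 806 -hsync -vsync
EndSection

Section "Screen"
    Identifier   "Default Screen"
    Monitor      "Old Monitor"
    DefaultDepth 16            # 16-bit color so DRI works above 1024x768
    SubSection "Display"
        Depth  16
        Modes  "1440x900" "1024x768"
    EndSubSection
EndSection
```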

Now at 199.2

This week I finally got below 200 pounds: 199.2 to be precise. That's the first time I've seen that since my first year of college. And just in time to blow it all on Easter dinner too....

Crystal and XML DataSets

I've had some more Crystal Reports hatred today. I had to revisit an old program to add some reports to it. Due to our crappy development process and the cheapness of my boss, I did it in .NET 2.0, but with the Crystal Reports bundled with .NET 1.0. Basically, I ended up with my main .NET 2.0 program and a separate .NET 1.0 executable to do the reports. I just shell out to the report program and pass it paths to the Crystal Report file and an ADO.NET XML file for the data. It's not pretty, but it could be a lot worse.

Given this setup, I thought adding a couple of simple reports would be a piece of cake. The reports were basically just table listings - putting "select * from foo;" into a user-friendly format. I figured it would take me an hour tops. Little did I know....

My problem was simply that the report didn't work. I had my report file and my ADO.NET dataset, I ran the report program, and the Crystal Report preview control threw up a "query engine error" message. If you're not familiar with Crystal, that pretty much just means that something bad happened when Crystal was trying to populate the report. It could be a problem connecting to the database, it could be a malformed query, or it could be that the alignment of the planets has raised evil spirits that are interfering with your computer. Nobody can say for sure and Crystal doesn't really give you any easy way to figure it out.

After some testing, it turned out that the problem was the xml:space attribute. Apparently the ADO.NET DataSet.WriteXML() method adds xml:space="preserve" to fields that contain only whitespace. However, Crystal Reports for Visual Studio .NET 2002 really doesn't like this and bombs out with a query engine error. Just having whitespace-only fields is no problem; it's only when this attribute is present that Crystal freaks out.
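Since fixing Crystal isn't an option, another possible workaround would be to strip the attribute from the generated XML before handing the file to the report executable. A sketch of the idea, in Python here just for brevity (the real program would do the equivalent in .NET):

```python
import xml.etree.ElementTree as ET

# The fully-qualified name ElementTree uses for the xml:space attribute
XML_SPACE = "{http://www.w3.org/XML/1998/namespace}space"

def strip_xml_space(xml_text):
    """Drop the xml:space="preserve" attributes that
    DataSet.WriteXml() puts on whitespace-only fields."""
    root = ET.fromstring(xml_text)
    for elem in root.iter():
        elem.attrib.pop(XML_SPACE, None)
    return ET.tostring(root, encoding="unicode")

# A whitespace-only field the way WriteXml() emits it
data = '<NewDataSet><row><name xml:space="preserve">  </name></row></NewDataSet>'
print(strip_xml_space(data))
```

The field's whitespace content survives; only the attribute Crystal chokes on is removed.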

So now at least I know what the problem is. The immediate fix is easy - just get rid of the spaces in the database and keep the user from putting in more. It still galls me a little that that's necessary, though. I would have thought Crystal would just ignore unrecognized attributes. But at this point, I shouldn't be surprised by the crappy things I find in Crystal.

Outlines in the GIMP

As you may have noticed, I've taken to posting screenshots a bit more. When doing this, I find it helpful to add highlighting to draw attention to the relevant elements. I usually like to do this by drawing circles around things.

Since I have a really hard time drawing circles with the mouse, I figured out how to do it with selections in the GIMP. I record it here as much for my own benefit as for others, because it seems like I have to rediscover this every time I try to do it.

First, use the selection tool to select the area you want to circle. Then, from the "Dialogs" menu, open the "Paths" dialog. Click the "Selection to path" button (the red dot with black lines above and below) to turn your selection into a path. Then click the "Stroke path" button just to the right of that. You can set the stroke thickness in this dialog and change the color by changing the color used by the pencil tool. Click the "Stroke" button and the outline of your selection will be turned into a line of the given thickness and color.

Small problem with new monitor

All is not sweetness and light in hardware land. The other day, I discovered a problem with my new monitor. Or, to be more precise, it's a problem with using my old video card with my new monitor. It seems I can't get OpenGL support for my old 16MB ATI Xpert 128 card (Rage 128 chipset) to work in widescreen mode.

It's weird. I can get 3D acceleration no problem when running at 1024x768. But when I switch to the widescreen aspect ratio and run at 1440x900, hardware acceleration is just gone. The card is supposed to support resolutions much higher than my monitor can handle, so I'm not sure why this is. I suspect it's the widescreen, as I haven't found anything one way or the other about the card's support for this.

I've looked at the X.org logs, but I don't really know enough about the inner workings of X to make much of them. I suspect it had something to do with the highlighted lines shown below, but I really don't know. I suspect I'll just have to get a new card. It's not like I'm not due for one anyway.
[Screenshot: xlog.png]