I guess if no one else is going to post, I will. My best HD card is a 2600XT, and I did most of my research back when those were new cards. What I can tell you about them is that the lower cards did have reduced image quality, but not that bad. Everything I hear about the newer series of cards, though, is that they are top-rate with HD video (ATI cards anyway).
As far as hardware vs. software decoding, the 2600XT could do as good a job as software (lower cards were more questionable, though they still did a good job). The big issue at the time was that there were still a lot of bugs (issues that would keep you from using hardware acceleration at all). The other problem was that not much playback software supported hardware acceleration, and what did was buggy too. Just because you have a capable card doesn't mean you actually get acceleration: you have to have software that was written to use ATI hardware acceleration, and in many cases you have to enable it yourself (the sketch below illustrates the point).
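To illustrate that last bit, here is a rough sketch (hypothetical Python, not any real player's API) of the decision a player effectively makes; every piece has to line up before the hardware path is used:

def pick_decoder(hw_enabled_in_settings, player_has_hw_path, card_supports_codec):
    """Return the decode path a player would actually take."""
    # A capable card alone is not enough: the player must implement a
    # hardware path for this codec, and the option must be switched on.
    if hw_enabled_in_settings and player_has_hw_path and card_supports_codec:
        return "hardware (UVD/DXVA)"
    # If any piece is missing, you silently fall back to CPU decoding.
    return "software (CPU)"

# Capable card, capable player, but the setting is off -> software anyway.
print(pick_decoder(False, True, True))  # software (CPU)
print(pick_decoder(True, True, True))   # hardware (UVD/DXVA)

The point being: check your player's settings, because acceleration is often off by default.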
Things are much better now (though it really hasn't been that much time). I hear nothing but good about the newer ATI cards (though I gather there is an advantage to going to the 4000 series). More programs are supporting hardware acceleration, and the bugs are getting worked out.
While I cannot say it is fact, I think your card can decode just as well as software (and if it cannot quite match some particular program, it will certainly come very close to the average person's eyes). I would definitely use hardware decoding.
Personally, I am very happy with the HD performance of my 2600XT (a generation older than yours) and have moved on to other issues. For example: when I set the card to 1024x768, the TV still reports a 1080p signal, so is it really getting one? I use the lower resolution because higher resolutions make text and icons too small, even on a 40" Sony.