Today we have an editorial reprinted from Jan 3, 2006 as Steve is on vacation.
I saw a piece on code quality recently that specifically mentioned Java development managers and their work on quality. They seem to strive for it and fall short, but I don’t think the results from .NET development managers would be much different.
The piece noted that most managers don’t measure quality, or if they do, they don’t start until the project is over half complete. That’s interesting because in the jobs where I’ve done software development, we were most concerned about timelines, and code was usually graded in one of three ways: works, doesn’t work, or needs more work. And most things passed through all three of these phases during a development project.
But interestingly enough, I’ve never had quality measured as a DBA. All the T-SQL work either does what it is supposed to do or it doesn’t. And if it doesn’t, we work some more on it 🙂
There’s never been any measure of code quality for me, and I’m not sure how I’d go about doing it. It seems from the article that they looked at bugs reported versus lines of code. I’m not sure that’s the best measurement, since I could write code that works but is very slow to execute, or that has hard-coded information that makes maintenance a nightmare.
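For what it’s worth, the bugs-versus-lines-of-code idea is usually expressed as defect density: reported bugs per thousand lines of code (KLOC). A minimal sketch of the arithmetic, with made-up figures for illustration:

```python
def defects_per_kloc(bug_count, lines_of_code):
    """Defect density: reported bugs per thousand lines of code."""
    if lines_of_code <= 0:
        raise ValueError("lines_of_code must be positive")
    return bug_count / (lines_of_code / 1000)

# Hypothetical example: 12 bugs reported against a 4,800-line module.
density = defects_per_kloc(12, 4800)
print(f"{density:.1f} defects/KLOC")  # 2.5 defects/KLOC
```

As the paragraph above notes, the number says nothing about performance or maintainability, which is exactly the weakness of the metric.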
I’m not a software expert, especially with regard to quality. To me, code either works as I expect it to, and well enough, or it doesn’t. I use that as a thumbnail estimate: a particular item, stored procedure, function, etc. either returns the results it should or it doesn’t, and it either runs in an acceptable time frame or it doesn’t. Comparing a method call that calculates interest on a line item to a stored procedure that produces a sales-by-month result is hard, and I’m not sure I could set up concrete ways to do it.
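That thumbnail test, correct results within an acceptable time frame, can be sketched as a simple pass/fail check. The function names and the sales-by-month stand-in below are hypothetical, purely for illustration:

```python
import time

def passes_thumbnail_check(fn, args, expected, max_seconds):
    """Grade a routine the way described above: it passes only if it
    returns the expected results AND finishes within the time budget."""
    start = time.perf_counter()
    result = fn(*args)
    elapsed = time.perf_counter() - start
    return result == expected and elapsed <= max_seconds

# Hypothetical stand-in for a stored procedure producing sales by month.
def sales_by_month(rows):
    totals = {}
    for month, amount in rows:
        totals[month] = totals.get(month, 0) + amount
    return totals

rows = [("Jan", 100), ("Feb", 50), ("Jan", 25)]
ok = passes_thumbnail_check(sales_by_month, (rows,), {"Jan": 125, "Feb": 50}, 1.0)
print(ok)  # True
```

It’s a binary grade, which is the point: it captures "works or doesn’t," but not the finer-grained quality comparisons the paragraph says are hard to set up.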
But I know people are trying, like the company that sponsored the survey. So I’m wondering: do any of you measure quality? Do you know of a good way to do it?
(published at http://www.sqlservercentral.com/articles/Editorial/72366/)