Saturday, November 15, 2008
My sister-in-law sent me a link this morning to this video about calling tech support. This is hilarious, and also very true!
Tuesday, November 11, 2008
Software Quality and Catalogs
I still find it odd that I write so little about software development in my blog, as it is what I do 60 or more hours per week. Maybe that is the reason: I am so tired of doing it and thinking about it all day that I tend not to write much about it. But anyone who knows me knows how I feel about software quality (or the lack thereof) and how much I truly despise the "get it done yesterday" approach to software management.
When I was at Microsoft, we used to joke that "Quality is Job 1.1". Sure, it was funny, said mostly tongue-in-cheek, and only partly true. But when you look at the vast majority of the software Microsoft ships, most of it is quite good. There is the occasional turd (Windows Vista), but mostly it is of very good quality. Why?
Some think Microsoft hires only the best programmers in the world. I tend to disagree. True, they have many very, very talented programmers, but the vast majority are of the mediocre variety. The seniors lead the projects, make sure the juniors fix their mistakes, and teach them better ways. Oversight, code reviews, and peer reviews make all the difference. You cannot stay mediocre forever.
I don't know what the current ratio at Microsoft is between programmers and testers. It used to be about 1-to-1 (circa 1993): a tester, who was not a programmer (at least not paid to code), was responsible for testing all of the code that a developer wrote. They were in nearly constant communication with each other, and it was the tester's job to find bugs and report them. Sometimes this is easy pickings, when new code is being developed quickly. Other times it becomes increasingly difficult as a project matures.
Microsoft makes a distinction between developers and designers. Every book I have ever seen on development (especially web-based) also makes this clear distinction. They are not the same person. One is artistic in nature (the designer), the other very scientific (the coder). The designer likely has a background in good user interface design, graphics, usability, and accessibility. The programmer likely cannot draw more than a stick figure and has never used a computer without a mouse, monitor, and so on. The designer may have been educated at a well-respected design school, while the coder is largely self-taught and has learned in the school of hard knocks.
So if these two individuals are so different, and are discussed as such at great length in books and educational materials, why are they so often assumed to be the same person? I have NEVER worked for a company, outside of Microsoft, that made this distinction. It's as if a company expects its programmers to create wonderful user experiences and gorgeous graphics, maintain a consistent look and feel, and make it all work. I have no doubt that such people exist, but I doubt just as strongly that most programmers make good designers. I have been coding professionally for over 20 years, and non-professionally for many more, and I cannot design a nice-looking UI to save my ass. I can code it for sure, in about 10 different languages and on 3 different operating systems, but I cannot design it. I do not have an artistic touch.
And why should I? I am not an artist, I am a programmer. Since when did artistic ability become a prerequisite for a developer? If these jobs were intended to be done by one person, why do the development tools so obviously cater to two different people?
As an example, let me look at HTML, the markup language used to make web pages. First there was just HTML, a simple markup language that allowed a designer to make a simple page. By design, it was never intended to describe how a page looked, only its content. The whole idea was that an HTML page could be displayed on any computer-like device, whether a full-blown GUI like Windows or Mac OS, a text-based browser like Lynx on UNIX systems without graphics, a telephone, whatever. It was a replacement for Gopher and America Online's custom page-creation software.
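To make that intent concrete, here is a hypothetical sketch of an early-style page (the content is made up): the markup says what things are, a title, a heading, a paragraph, a list, and nothing about how any of it should look.

    <!-- Structure and content only: no fonts, no colors, no layout.
         A graphical browser, Lynx, or a phone can each render this
         however suits the device. -->
    <html>
      <head>
        <title>Quarterly Report</title>
      </head>
      <body>
        <h1>Quarterly Report</h1>
        <p>Sales grew this quarter, driven by the new product line.</p>
        <ul>
          <li>Revenue is up.</li>
          <li>Costs are flat.</li>
        </ul>
      </body>
    </html>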
And it worked, but only for a while. Designers just could not live without defining, in minute detail, exactly how every page should be laid out. So tables, frames, objects, and the like were added to HTML to do exactly that: they allowed the layout of the page to be defined, completely breaking the original design of HTML. But they still called it HTML.
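For anyone who never saw the table era, the trick looked something like this hypothetical sketch: a table holding no tabular data at all, used purely to carve the page into a sidebar and a content column.

    <!-- Layout abuse: a table used not for data but to force a
         two-column page design, baking presentation into markup
         that was never meant to carry it. -->
    <table width="100%" border="0" cellpadding="10">
      <tr>
        <td width="20%" bgcolor="#cccccc">
          Navigation links here
        </td>
        <td>
          Page content here
        </td>
      </tr>
    </table>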
So everyone started creating really fancy HTML pages, with lots of pretty graphics, and it was all beautiful. But the beauty was only skin deep, because somewhere in the dark confines of an HTML coder's office, someone was breaking a cardinal rule of quality software development: copy and paste is NOT good code reuse. What I mean is that every page in a site, in order to give the site a consistent look and feel, probably shared 90% of its code with every other page. The content was different, but the look and feel was the same. To keep the look and feel consistent, a ton of code was copied and pasted from existing pages to create each new page, and then just the wording and graphics were changed.
Now, an outsider might think "so what?" And you would be right, in the eyes of a designer. A goal was achieved: the site is consistent and looks nice. Through a (good) programmer's eyes, though, you see only slop. All of that code shared between web pages via copy and paste is a real maintenance problem. If you change one page, you have to change them all, opening each file and editing the code by hand, a very tedious, repetitious, and error-prone proposition.
So they came up with a new way, CSS. Now you would not define the look and feel in the HTML document, only the structure. A programmer would define the HTML, a designer would define the CSS, and it would all merge in glorious web page heaven. We had come full circle, back to the original intent of HTML. Now we have a perfect world, right? Don't we wish. CSS is implemented differently by different web browsers: Microsoft Internet Explorer does it one way, and most of the rest (Firefox, Safari, Opera, etc.) do it another. And somehow Internet Explorer is still the most popular browser out there, despite not being standards compliant, despite being the front door used by many viruses and much malware, and despite being inferior to Firefox in almost every way.
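Setting the browser differences aside for a moment, the division of labor CSS promised looks something like this hypothetical sketch (the file name site.css and the intro class are made up). The programmer's HTML carries only structure:

    <html>
      <head>
        <title>Products</title>
        <!-- All look and feel lives in the designer's external file. -->
        <link rel="stylesheet" type="text/css" href="site.css">
      </head>
      <body>
        <h1>Products</h1>
        <p class="intro">Everything we sell, in one place.</p>
      </body>
    </html>

while the designer's site.css carries only the look, so changing it once changes every page that links to it:

    /* Designer's file: presentation only, no content. */
    h1      { font-family: Georgia, serif; color: #334455; }
    p.intro { font-size: 1.2em; font-style: italic; }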
But does that mean HTML/CSS is broken because of Microsoft? Hardly. The specification itself has a lot to do with it. CSS was designed to remove layout and formatting from HTML, so why were coders allowed to put CSS content directly in the HTML file? That is not separating look from content. CSS formatting can be applied in 3 places: an external file referenced from the HTML (the way it should be); a style block defined (normally) at the top of an HTML file, which can complement or override CSS read from a file; or, the evil of all evils, styles added directly to HTML tags, once again completely failing to separate structure from layout.
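Side by side, the three placements look something like this sketch (the colors and rules are made up). The further down you go, the more the look leaks back into the content:

    <html>
      <head>
        <!-- 1. External file: the way it should be. -->
        <link rel="stylesheet" type="text/css" href="site.css">

        <!-- 2. A style block in the page itself: can complement or
                override whatever came in from the external file. -->
        <style type="text/css">
          p { color: #333333; }
        </style>
      </head>
      <body>
        <!-- 3. Inline style on the tag, the evil of all evils: look
                and content welded back together. -->
        <p style="color: red; font-weight: bold;">Buy now!</p>
      </body>
    </html>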
From a designer's standpoint, the web is wonderful. From a programmer's standpoint, it is a chaotic mess. And who is to blame for the mess? The designers? The HTML/CSS spec writers? No. Programmers are responsible for the mess, and I will explain why.
Just because you can shoot yourself in the head with a Glock does not mean it's a good idea or that you should. Likewise, just because you can define your CSS styles in your HTML pages, or worse, in your HTML tags, doesn't mean you should. Like grandma used to say, "It's just not right." Yet that is exactly what I see done everywhere I have ever worked, and on every web project I have worked on that I did not personally control.
Quality. That is a lack of quality, pure and simple. As developers we have the best programming tools ever; they make many of our tasks almost mindless. Look at the number of really bad web programmers out there making really nice-looking web pages. The code is horrible, but the pages look nice. Sometimes that is just fine with companies, but it is not fine with me.
Maybe I am too old school for this stuff. I learned to write code on a computer that had 4KB of memory. That is 4 kilobytes, guys, not megabytes or gigabytes: 4,096 bytes of memory, of which I could use about 3,000 to get all my work done. And I could write some pretty impressive programs in those 3KB when I was 11 years old. I was probably a better programmer then than I am right now.
After all of these years I still enjoy writing software. I do not enjoy creating web pages, or what I call Catalogs, Rouge and Lipstick, or Chrome. "Web Developer" has never been on my resume, and never will be. I can program just about anything I want: operating systems, applications, network protocols, reusable toolkits, etc. But I am not a catalog creator. I don't get my rocks off creating pretty pictures with embedded video. I like writing real code, code that you don't see but that does the work. That is my comfort zone.
And I refuse to write poor-quality code just to get something done quickly. I am a firm believer that it takes less time to do something right the first time than to patch and hack it together quickly at first. I thoroughly test all of my code, every single line, and use automated testing tools to make sure things don't break accidentally when I make changes. I am not saying I never create bugs, we all do, but I am skilled at finding them early, well before they ever see the light of day in production code.
I have always preached that software development is both art and science. But the art in this case does not refer to pretty pictures on a web page. I am referring to creating software as an art form: something readable, understandable, and yes, downright pretty, if only in a programmer's eyes.
I am now pushing 40 real hard, and have begun to question whether I am too old for this shit. A young development manager I know thinks he knows everything. He never used a computer before the web, but thinks he understands quality software. He preaches about it out of one side of his mouth while simultaneously rewarding those who just slap something together quickly for the sake of adding yet another feature. Never mind that they wrote 100% of the code when 90% of it already existed and could have been reused. Never mind that while the user interface looks pretty, the code behind it is horrible. Never mind that they used tools from 10 years ago when newer, better software is available that would have cut the actual time in half and increased code reuse tenfold. It works and it looks pretty. That's what makes a good programmer these days? Not in my book.
It's as if quality no longer matters, or is measured differently from how I measure it. Real programmers are scarce, though web programmers are plentiful. Maybe the best position for me is developing the tools that others use: write the real code and let the kids create pretty pictures with it. But then who will be the real programmers of tomorrow? It's the hardcore guys who make it possible for every other mediocre coder to stay employed. What will happen when the juniors start creating the tools? Will software become even worse than it is today? Is that even possible?
This blog entry is depressing. Maybe I am the old curmudgeon telling the kids to get off my grass, I don't know. I do know there are some great technologies maturing now that are truly incredible. The .NET platform has grown up and revolutionized how code is written and shared. Even the old-school C/C++ programmer in me says, "WOW!" Silverlight is very promising and has the potential to be the death knell of HTML, CSS, and Flash. Apple's market share is increasing (while their quality is, sadly, going down), and Mac OS X and Firefox keep Microsoft honest. I am excited about Vista's replacement and hope to see something solid next year. The Linux guys are creating wonderful tools in both the server and desktop arenas.
So I am not sick of software development, only sick of mediocrity. I am sick of programmers who do not take pride in their work, never read the writings of those with far more experience (Code Complete, anyone?), don't separate UI from domain or business logic, think it's perfectly fine to have SQL sprinkled all through their code, think unit testing is a waste of time, think HTML and CSS are God's gift to web developers, and think comments in code are useless.
I am sick of having no standards, or standards that get broken without repercussion, of having no time for code reviews, of software that is not designed before it is written, of programmers expected to be their own designers and their own QA, of quantity over quality, and of software generally being written without a brain engaged.
Maybe I should drive a truck :-)