You Can Be the Expert…

This song has been stuck in my head for days! It so fits my mood though. “You can be the expert by picking up a book,” it says. And isn’t that the point of the Literature Review for the Dissertation?
We study the thoughts that came before ours, not just for context but, to borrow from Bernard of Chartres, to help us see further. My ideas may turn out to be revolutionary, but without the context of the researchers before me, they may never gain traction. Or worse, like the Professor in Rescue From Gilligan’s Island, I may be reinventing a wheel that already exists.

But the song. The song says there are stories “everywhere we look, if we look in the right way.” I’ll take that to mean that when I’m sitting at my computer cussing the ProQuest database for not finding me a good reference, maybe I just need to step back and think of a new keyword to try.

Hmmm… which reminds me, I haven’t tried …

FOOTNOTES (this time)
Bernard of Chartres actually said, “Like dwarfs standing on the shoulders of giants, we see farther than they.” The line was recorded around 1159, though we often attribute the idea to Isaac Newton.

The only scene I recall from watching “Rescue From Gilligan’s Island” as a youth was the Professor in a lab somewhere after the Castaways returned to civilization. He had “invented” the Frisbee and was depressed to find out someone had already marketed it while he was on the Island. He felt he wasn’t really contributing to society, and the Castaways eventually returned to the island… Interestingly, today is a year and two days since the actor who played the Professor died.

Any Tool Can Be the Right Tool…

This picture makes many people cringe.


In the software world, we repurpose many, many tools from the manufacturing world. These tools, like PDCA, Six Sigma, and Kanban, have long, successful histories in manufacturing. Software people grab them because they work. But do they work as well in software? Sometimes. Sometimes they would work better with a bit of tweaking. Sometimes they are just not the right tool for what we are trying to do.

For example, how many projects count defects per KLOC (thousand lines of code)? This defect-density metric predates the Six Sigma process we now have. It sounds like an interesting metric, but what does it mean? Should I be relieved when I get a score of 3.1, or concerned? If there is only 1 defect in total, I should be happy, right? Maybe. But which is more likely: that testing hasn’t “really started,” that the coding effort is much better this cycle, or that something is so catastrophically wrong with the code that no one can test further?
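The arithmetic behind the metric is trivial, which is part of the problem. Here is a minimal sketch (the defect and line counts are invented for illustration) showing how the same density score can describe very different projects:

```python
def defects_per_kloc(defects: int, lines_of_code: int) -> float:
    """Classic defect-density metric: defects per thousand lines of code."""
    return defects / lines_of_code * 1000

# One defect in a 320-line module and 155 defects in a 50,000-line system
# both score about 3.1 -- the number alone tells you very little.
small_module = defects_per_kloc(defects=1, lines_of_code=320)
large_system = defects_per_kloc(defects=155, lines_of_code=50_000)
print(round(small_module, 1), round(large_system, 1))
```

The score collapses context: it can’t distinguish “testing hasn’t started yet” from “the code is genuinely better,” which is exactly the ambiguity described above.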

As any aficionado of Tim “The Toolman” Taylor knows, “any tool can be the right tool.” GIs are notorious for repurposing tools. So are stoners, the incarcerated, and anyone who needs to do something “now” that they aren’t properly equipped for. Don’t believe me? When I taught at a “drop-out high school,” I was constantly amazed at how quickly random items could be converted into pot pipes, and at how much engineering the students could do when they were interested in the outcome. And we usually squandered those abilities with “busy work.” But this is not a STEM education article. 🙂

Maybe our company has a rule that blocks our folks from doing effective work, one they have to “work around” to get their jobs done. Maybe we don’t have the budget to buy or build the tool we really need. But could we “fab” (fabricate) something that would meet our needs? Will we allow and sponsor our employees to do that?

I once worked for a company that decided we had too large a backlog of Category 1 defects. All other work was to stop until the Cat1’s could be resolved, and a triage process was installed to make sure only “important” work got done. Seemed reasonable. Except that we had hundreds of little Cat3’s on the books, too. These were mostly fixes of under 2 hours each, and by definition they would never be worked on. Worse, they made our defect inventory look huge!

My group developed a tool called “QuickPicks” which inventoried the Cat3’s and cross-referenced them to larger work that was going on. The idea: if you were working on a module to resolve a Cat1, you looked at the QuickPick list to see if any Cat3’s applied to the same module, and you “just fixed them” if the effort to do so was under an (arbitrary) hour of work. By the time we got the Category 1’s under control, we had reduced the Category 3 inventory by about 50%. This was “free work” getting done to keep our business customers happy while resolving larger work.

How much did it cost? A meeting held over several lunches where the team was b– er, “discussing” projects, a web-enabled Excel sheet (basically stored on an existing SharePoint site), and the willingness of management to let us present the tool to the development team and encourage them to use it. All in all, basically a “free” process improvement effort.
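The actual QuickPicks tool was just an Excel sheet, but the core idea is an index of small defects keyed by module. A minimal sketch, with invented defect IDs, module names, and field names:

```python
from collections import defaultdict

# Hypothetical Cat3 inventory -- IDs, modules, and estimates are made up.
cat3_defects = [
    {"id": "D-101", "module": "billing", "est_hours": 0.5},
    {"id": "D-102", "module": "billing", "est_hours": 2.0},
    {"id": "D-103", "module": "reports", "est_hours": 1.0},
]

def build_quickpicks(defects, max_hours=1.0):
    """Index Cat3 defects by module, keeping only quick (sub-hour-ish) fixes."""
    index = defaultdict(list)
    for d in defects:
        if d["est_hours"] <= max_hours:
            index[d["module"]].append(d["id"])
    return index

quickpicks = build_quickpicks(cat3_defects)
# Working a Cat1 in "billing"? Check the list before you close the file:
print(quickpicks.get("billing", []))  # D-102 is excluded: too big to "just fix"
```

The one-hour cutoff mirrors the (arbitrary) threshold from the story: anything over it stays in the normal backlog rather than riding along with a Cat1 fix.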

So what’s the right tool for what you’re doing? There’s a better question: What’s the purpose of the measurement or tool? What are we hoping to learn from it? Can we compare it to something else or take action based on the measure? Do we need to take action or is the information we’re gathering, well, informational? And can we display the information graphically so we can communicate it to the stakeholders who need to have it?

As engineers, we need to make sure we are properly equipped to complete our tasks. As engineers, we need to be involved in the improvement of our workplaces and industry. We need to feel like the metrics we collect benefit us and aren’t just “busy work” which has been “imposed” by management. As engineers, we need to take a moment to figure out what the requirements are, even for our tools, then check them to see if they are meeting those requirements. If they aren’t, we have processes available to us to improve the tools (or the requirements). Let’s use them!

Angels and Devils

I recently happened across this interesting article from Quality Digest magazine. The gist I took away is that we find mentors and anti-mentors wherever we go, and it got me thinking (like you knew it would… and I think I may have to contemplate a better descriptor than ‘anti-mentor’).

Some of the best Quality mentors I’ve had haven’t been in “Quality” per se. Sam is a brash man I hired a few years ago. He had been Special Forces, so I kind of just accepted his ego as part of him. 🙂 He hated stupid people, and often engineers. He would complain about people who couldn’t think their way out of a wet bag or manage to use the drive-thru “correctly.” He once told me about working at the Aberdeen Proving Grounds as a jumper. He was handed a piece of expensive hardware to test by jumping out of a plane with it. He promptly dropped it, which caused all the engineers to panic, asking him, “Do you realize how expensive that is?” He didn’t bat an eye but asked them, “Do you know how much shock this is going to receive when I jump out of an airplane with it?” They backed off and he conducted the tests as asked. Lesson learned: it’s important to ask the end user how the new product will actually be used.
I hired him because the man could build or fab anything, even if he was only given a doodle on a napkin. He’d roll his eyes when he saw me coming, but he and I both knew we wanted something good to come out of our project. And I think that matters. When both parties know the goal is to make a good product, you both give the other a little slack. We may egg each other on, but we know, truly know, that we are all professionals attempting to put Quality into our products. Our processes are designed to help us communicate with each other, and sometimes we get so caught up in the documentation that we forget what we’re trying to communicate.
For example, together we once saved the company we worked for something in the six figures because we were able to “fab” a device to test a small component that was thought to be malfunctioning. Using a video camera that was on hand, we were able to demonstrate the part’s reliability, under a variety of attitudes, to the customer using $40 worth of scavenged parts and a metal desk protractor. The customer was happy that we ruled out (no pun intended) the part as a source of the defect, the CEO was happy for the continuing goodwill from the customer, and the CFO was happy she didn’t need to write a huge check. And the team was happy because we now had a cool story to tell. 🙂
Nowadays, Sam is retired but I still look up to him. Quality is where you put it.

Testing, testing, 1, 2, 3…

For my first story, I’d like to share something that happened to a buddy of mine. He worked the help desk for a computer manufacturer and was quite used to getting “stupid”/”strange” calls from end users.
One particular day, he had a call escalated to him. The customer was complaining that his “screen shrunk.”
Being the professional he is, he started by asking clarifying questions: “Do you mean the icons on the screen?” (No, they’re correct) “Do you mean the monitor is too small?” (No, the monitor’s size is just fine) and so on.
They rebooted the monitor and the computer several times but couldn’t resolve the issue of the “shrinkage.” They eventually decided to send the customer a replacement monitor. When the original monitor was returned to the factory, it was tested and no problem could be found with it. After this had happened several times, they decided it was important to send a tech to the man’s office to see the problem first-hand. The tech found that the customer was correct: periodically the image on the screen “shrunk” amid a flurry of pixelation and static. They were puzzled for a moment.
I’ll tell you in a minute what the problem was, but I’ll give you this hint for now: moving the monitor and the desk it was sitting on to a different location 3 feet away solved the problem.
So what have we learned from this?
1) The customer is the customer. They might not be right but they are experiencing something with our product.
2) Ask questions to understand what the customer is saying. They may not have the vocabulary to explain what they see, or they may misunderstand what they are seeing. That doesn’t mean they aren’t seeing it.
3) Follow up. Ask the customer if the resolution is working. If not, escalate the problem so that the customer knows you aren’t just blowing them off.
4) Test the product, but know you’ll never account for everything that could happen to it. If the user finds a novel use for your product, embrace it or don’t, but know that the customer thought of your product when they needed something done. That should mean something!

OK, the problem was that the monitor and computer were set against the wall of the office, and on the other side of that wall was the elevator shaft, where the huge iron counter-weights of the elevator zoomed past periodically. The moving counter-weights created a stray magnetic field that distorted the monitor’s image every time they flew by. Moving the monitor away from the stray magnetic field fixed the problem.