Wow, what a whirlwind! Functional testing, services testing, cloud testing, SOX compliance, hiring. Processes, a metrics program, contracts.
It's not so much what to do, it's where to start. So I thought I'd start at the beginning.
One of the biggest problems any new QA/QC manager has is the existing staff's preconceptions of "what testing should be" and "how testing should work". They base that on what they've experienced in past companies.
The problem is that "QA" at those former companies was often (ahem) somewhat less than it could have been. Perhaps the testing staff weren't especially valued, and their sole contribution came at the end of the project, where the problems they found inevitably led to delays, or to a product shipping with defects because there was no time left to fix them. Sometimes they didn't test a certain type of application at all. Often they were ignored throughout the entire project cycle, or never got the information or interaction they required to do their jobs well.
The challenge here is to build a testing organization that supports a development process which is itself somewhat in flux. While I'm building the testing organization, development is building out its organization as well. There hasn't been a settling into any one project methodology; the quasi-methodology in place isn't followed across the board, and the IT organization as a whole is still making process decisions. The final result will likely be an agile process with enough structure to support a large global community, executive reporting, and auditing requirements.
This is not a weird situation, just a tough one. It requires complete responsiveness and a willingness to change course in midstream to "whatever works". I'm sure this sounds familiar to many of you.
A good QA department is collaborative. That means it doesn't matter what kind of documentation is produced or what kind of project methodology is chosen; as long as we have access to people who can answer questions, it's all good. But when you're working with people who have never had QA involved early in the process, they may or may not want to work with you. At the same time, there's usually an expectation that QA will produce some kind of deliverable, usually "test cases," and that they'll be produced before the development effort begins.

Well, look at your schedule. Test cases take about an hour apiece to develop, and demand a level of detail that might not be available at the time you're expected to write them. One of the first challenges of a new QA manager is getting the point across that testing staff do not have ESP, especially when every component of a system is new. If you were coding changes to an existing system, it might be possible to write test cases. If your UI design were done, it might be possible to write test cases. But when you're looking at a piece of functionality at the service level, where you'll need a virtualization tool just to mock up basic throughput, no. If you, as a designer, developer, or BA, don't know how the whole piece is going to work, neither will the testing staff. And standard test cases are not necessarily useful for services testing anyway.

So after you look at the schedule, do you have time to write test cases? If not, you need to start that education process right away. Most reasonable people realize you can't spend 10,000 hours writing test cases that are due in 4 months. The problem with staff living in the past is that they often came from shops in "maintenance mode," where writing 4 test cases to cover a few changes in an existing screen wasn't a big deal. What you have to remember when dealing with someone else's testing baggage is that they've never actually done the work themselves.
So a patient explanation of how long those activities take is usually all that is required. Executive management is different; they are inevitably more on board with this kind of issue than individual staff members or PMs from highly structured shops. Executive management is HIGHLY interested in any testing methodology that will allow the work to be done more quickly and cheaply, with the same level of robustness and auditability. And you'll find that with executive management on your side, the rest of the organization will give you the support you need as well.
So how can one deal with individual situations where expectations differ according to an individual's sometimes skewed perception of how testing should operate? All I can offer is my own experience. So my answer is, "With patience, grasshopper."
First, I'm not the least bit worried about testing. I inevitably hire a kick-ass staff that can test anything. That's how I roll. So when I have the opportunity, I politely point out what needs to happen in order to find errors early in the process, and the consequences if we don't collaborate early and often. But what is always in the back of my mind is the knowledge that my points will be graphically illustrated later down the line.

The fact of the matter is that I always want a given project to be successful, and I'll do everything I can to help make that happen. But without being totally obnoxious, which is a very unsuccessful way to operate, I can't "force" people to throw away their blankies and build something brand new. They're still tied to their own comfortable preconceptions of what happened where they "grew up." But maybe, just maybe, someone out there makes a better vegetable soup than Mom. Reality will force the issues for me later on, and while the lessons might be painful, they'll help us all move in the right direction. I could save the organization that painful reflection period, but will I get the level of collaboration I need to do that here at this company? I don't know. Time is not on my side.

The project managers are, however. The development managers "get it" as well. The staff is hugely talented, so overall the chances of developing an organization that really works for the company are higher than they might be elsewhere. Executive management is what I would call "visionary." I like them; visionaries in the IT world are not all that common. If you could see what we're building here, it would make you weep with envy. It's that sophisticated, and the technology is cutting edge. We're breaking new ground, and the opportunity to test it makes me drool openly. So I'm happy to be here, and I know that issues regarding what people expected from their OLD testing organizations will eventually change.
So I do what I can, and focus on my own group's work, knowing that eventually everything will work out. And I think that's the key to success: do what you can early, make your points, and move on. I've seen many a manager become so frustrated, obnoxious, and loud that no one wants to work with them. You CAN'T HAVE EVERYTHING YOU WANT right away. You need to get what you can and allow time and experience to help you move the mountain an inch at a time. Remember that some people's past experience with testing organizations was terrible, and they'll have to see with their own eyes the kind of value you add to the process before they start to trust you and feel comfortable working with you. If someone you need to work with really goes over the line and is impacting your group's progress, do what you can with them, report it unemotionally as an issue, and move on.

And keep your sense of humor! The person in charge of your attitude is you, and you can't accomplish anything with a negative attitude. It will also make you unpleasant to be around. One of my favorite people in my new company is a project manager. She takes every issue that is handed to her and works it positively without slipping into Martyr Mode; she has a great sense of humor, and while she has the kind of knowledgeable cynicism that is healthy and keeps her grounded in reality, she moves forward in a confident and positive way. Every time. I admire it; good project managers are hard to come by, and she is one of the best I've worked with to date. I've had many, many opportunities to move into project management and have turned them down every time; in my opinion, it is the most thankless, difficult job in IT. But the lesson there is obvious: everyone enjoys working with that PM and respects and supports her. If she were flogging everyone and unpleasant to work with, people would ignore her as much as they could. Don't YOU avoid unpleasant people? I do.
Who wants to work with someone that yells at them all the time? So don't become that person.
Thus far, on the testing side, we've made tool selections, we're staffing up, and we're writing test outlines (a hierarchical list of test conditions) as we get information. Think of it as a testing shopping list. Those outlines will keep us SOX compliant, allow us to test, and enable us to gather pass/fail metrics. Actual service tests will be embedded in the tool itself.

By the way, I forgot to mention that part of the executive vision is Three Sigma quality levels. Oh baby. Nothing like high expectations! Hence the need for a metrics program. I know a lot of you spit upon metrics, but large chunks of this baby are being developed by vendor partners, and the company has written quality expectations into their contracts to ensure we don't get sold any snake oil (that in itself is pretty visionary). So it has to be measured. Frankly, I've never worked for any company that hasn't needed some sort of metrics, and I've never understood the pushback from QA/QC professionals on this one. It's like babies saying "I don't wanna!" So what? You need metrics to get the job done at a managerial level. So suck it up and just do it. I've personally seen some great things accomplished through the intelligent use of metrics.
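To make the "testing shopping list" idea concrete, here's a minimal sketch of a hierarchical test outline with a pass/fail rollup. This is purely illustrative; the structure, names, and example conditions are my own invention, not the actual tool or outline format we're using.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TestNode:
    """One condition in the outline; leaf nodes carry a pass/fail result."""
    name: str
    passed: Optional[bool] = None        # None = not yet executed (or a non-leaf)
    children: list["TestNode"] = field(default_factory=list)

    def tally(self) -> tuple[int, int]:
        """Return (passed, total) across all executed leaf conditions."""
        if not self.children:
            if self.passed is None:      # not run yet: excluded from the tally
                return (0, 0)
            return (1 if self.passed else 0, 1)
        p = t = 0
        for child in self.children:
            cp, ct = child.tally()
            p, t = p + cp, t + ct
        return (p, t)

# A hypothetical outline fragment for a service under test
outline = TestNode("Order service", children=[
    TestNode("Submit order", children=[
        TestNode("valid payload accepted", passed=True),
        TestNode("malformed payload rejected", passed=True),
    ]),
    TestNode("Cancel order", children=[
        TestNode("cancel before shipment", passed=False),
        TestNode("cancel after shipment blocked", passed=None),  # not run yet
    ]),
])

passed, total = outline.tally()
print(f"{passed}/{total} executed conditions passing")  # 2/3 executed conditions passing
```

The point of the hierarchy is that the same list drives both day-to-day testing (the leaf conditions) and the management-level metrics rollup, since the tally at any node summarizes everything beneath it.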
But I digress (too long since I've blogged!). Since January, I've added a services/integration testing guru and a test engineering manager (automation) to my posse. I've got two Killer QA leads. So we're small, but mighty. Another 33 people to go...
I'm going to have to post about hiring in the Great Northwest, and you'll probably get a boatload of comments about services testing. Many of you have tested services - heck, I've "been there, done that" myself. But this effort is building EVERYTHING from scratch, including the services layer, and it's a different animal. Fascinating stuff. Kind of like standard services testing on steroids. I'll have to contemplate how to convey information regarding testing processes without giving away specifics; I seem to recall signing quite a few documents regarding confidentiality when I came on board!