This post could also be subtitled "The Grumpy Programmer's Guide to Getting Rejected at Interviews".
Someone tagged me in a tweet...
Book idea for @grmpyprogrammer: an interviewing guide for job seekers wanting to get an idea of how dedicated companies are to testing. Questions to ask, ways to gauge the culture, etc. (Originally posted on Twitter at https://twitter.com/n00bJackleCity/status/1481632465403981824?s=20)
...and it got me thinking about where to start with a request like this one. My personal opinion is that there isn't really a book in here, but it did start me thinking about what sort of questions you should be asking.
Again, keep in mind that all of this is just my opinion. One based on many years of experience, but still an opinion.
Why Does It Matter?
In my experience, companies that make a commitment to doing automated testing also tend to make a commitment towards "quality" in their coding practices and "automation" in their software development tooling. The reason those are in quotes is that they can definitely mean different things depending on the company.
Now, again, in my experience, you are likely to have more success in solving problems and growing your own skills as a developer if you work in an environment where they value those things.
After all, just because we can get paid a lot of money to dig in the pixel mines doesn't mean we should be forced to eat a shit sandwich. We should at least have a choice of the additional toppings.
What Questions Should I Ask?
Like a lot of things related to programming, I find it helpful to start at the end result you want and work backwards to figure out what needs to be done. Therefore I think the first question to ask is:
What things always have to work when you push changes into production and how do you verify that it works as expected?
This question cuts to the heart of the issue: what matters, and how do we make sure it stays that way?
What you are looking for are clear statements about what matters and even clearer statements about how they verify it. Again, not every company has invested the time and money needed for code changes to flow seamlessly from a development environment into production, accompanied by effective automated tests and a clear understanding of the expected outcomes.
If they already have some kind of commitment to testing, follow-up questions like this one are also very informative:
What do you like about your current testing practices and what do you want to change?
Pay as much attention to what they like as what they dislike. That will give you an idea of what challenges lie ahead if you want to be the person making the changes.
Finally, if you want to find out what their commitment to quality actually is, I feel like a great question is:
Tell me about how code gets from the developer's machine into production
Look for things like:
- code reviews
- coding standards
- static code analysis
- continuous integration systems
- separate staging and production environments
- automated deployments
Not all of these things are going to guarantee great results (nothing does, and never believe anyone who says otherwise) but, taken together, they show a commitment to making sure that:
- the intent of code is clear
- others can understand the code
- the code is taking advantage of appropriate language features
- the team uses tooling that integrates with version control to automate error-prone manual checklists
- application / end-to-end testing happens before it reaches production
- repeatable processes ensure consistency
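This is the kind of thing that ends up encoded in tooling rather than in people's heads. As a rough sketch, and nothing more, here is what a gate that replaces an error-prone manual pre-deploy checklist might look like. The script shape, the check names, and the PHP tool commands (`phpcs`, `phpstan`, `phpunit`) are my own illustrative assumptions, not something any particular company actually runs:

```python
"""Sketch of a pre-deploy gate: run each automated check in order,
stopping at the first failure. Swap in whatever commands your
project actually uses -- these are placeholders."""
import subprocess

# Each entry: (human-readable name, command to run).
# The vendor/bin/* commands below are hypothetical examples.
CHECKS = [
    ("coding standards", ["vendor/bin/phpcs"]),
    ("static analysis", ["vendor/bin/phpstan", "analyse"]),
    ("test suite", ["vendor/bin/phpunit"]),
]

def run_checks(checks):
    """Return True only if every check exits successfully."""
    for name, command in checks:
        print(f"running {name}...")
        result = subprocess.run(command)
        if result.returncode != 0:
            print(f"{name} failed; not deploying")
            return False
    return True

# run_checks(CHECKS) would return True only when every check passes,
# which is the signal your deployment step should wait for.
```

The point isn't the specific tools; it's that the checklist lives in version control, runs the same way every time, and nobody has to remember to do it.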
So Now What?
It's hard for me to give more specific advice than "don't be afraid to ask more questions based on the answers you are hearing." If we're being honest, most companies aren't doing everything I listed above. You can always start at the bottom ("we try to manually test all changes") and work as hard as you are allowed to on getting to the point where an automated test suite catches issues before your users do.
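That first automated test can be tiny. Here's a minimal sketch in Python of pinning down one thing that "always has to work" -- the `calculate_total` function and its expected behavior are made up purely for illustration:

```python
# A first automated test: pin down one thing that always has to work.
# calculate_total() is a hypothetical function standing in for your own code.

def calculate_total(prices, tax_rate):
    """Sum a list of prices and apply a flat tax rate."""
    subtotal = sum(prices)
    return round(subtotal * (1 + tax_rate), 2)

def test_calculate_total_applies_tax():
    # If this ever breaks, we find out before a user does.
    assert calculate_total([10.00, 5.00], 0.10) == 16.50

if __name__ == "__main__":
    test_calculate_total_applies_tax()
    print("ok")
```

One test like this, run automatically on every push, is the bottom rung of the ladder. Everything else in this post is about climbing from there.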