Managing a Production release every 2 weeks
A production release? Every 2 weeks?
I am sure the title of the post would have compelled you to think twice, but that's what we have been doing here at ThoughtWorks on one of our projects. And I am happy to share that I am not the only one enjoying this 2-week release cycle; there are a bunch of other projects that have the same delivery cadence.
Since this is repeatable and has been achieved by other teams as well, we must share some common processes and techniques that help us achieve it.
What are those processes? How big is the QA team? Do you have a separate release management team? How much automation coverage do you have? Do you automate all the regression tests? I am sure these questions are on your mind as you go through the post. Let me explain what empowers us to make a production release every 2 weeks.
Here at ThoughtWorks we follow the agile methodology. If you are reading this post you have most likely already read about what agile is, but in case you are new to software development or come from a different field, going through this link might help you.
On our project we follow some simple but very effective practices. I don't want to run through all of them, but there are definitely some that need a mention here. One of them is "dev box testing". What it is: once a story is developed by a developer pair, a BA and a QA pair up to test that story on the developer's machine. They briefly go through the acceptance criteria of the story and, if needed, do some light regression testing to see that nothing is broken because of it. What advantage we get: the first working version of the story comes out without any major bug. It works as per the acceptance criteria and is showcase-able to the client for feedback. The next big advantage is that it saves the time spent deploying a build to QA, logging a bug, and reproducing it for the developers to fix. In short, it helps us find defects early and fix them early.
Automation forms the backbone of a frequent release plan. The acceptance criteria of a story are authored by a business analyst and reviewed by the client, so the decision about which acceptance criteria to automate stays very close to the critical business functionality. We still review the scenarios to make sure each one really is a critical business scenario worth automating. The automated test suite is hooked into the Continuous Integration engine to run with every developer check-in, so whenever a build comes out of the pipeline (QA stage), it has already passed all the critical business scenarios. This saves a lot of time during QA and lets us focus on specific areas and issues.
We support all the leading browsers: IE8, IE9, Chrome, Firefox and Safari. Even though the automated tests certify that the app works functionally, we still need to run a sanity test on multiple browsers to verify the UI elements and their alignment. You need to keep a close tab on IE, as it often behaves very weirdly :)
Just to publish a snapshot of the automation state the project is in:
| Scenarios | Count |
| --- | --- |
| Total number of scenarios | 206 |
| Number of scenarios that can be automated | 168 |
| Number of scenarios automated | 136 |
| Number of critical scenarios | 114 |
| Number of critical scenarios automated | 107 |
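Assuming the counts above are exact, the coverage ratios they imply can be worked out with a quick Python sketch (the variable names are mine, not from our codebase):

```python
# Snapshot counts from the table above.
total = 206
automatable = 168
automated = 136
critical = 114
critical_automated = 107

# Coverage ratios implied by the snapshot.
automatable_coverage = automated / automatable      # share of automatable scenarios automated
critical_coverage = critical_automated / critical   # share of critical scenarios automated

print(f"Automatable scenarios covered: {automatable_coverage:.0%}")  # 81%
print(f"Critical scenarios covered: {critical_coverage:.0%}")        # 94%
```

So roughly four out of five automatable scenarios, and nearly all critical ones, run on every check-in.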
We use cucumber for test automation. The full regression test suite runs in around 12 minutes. The above stats give a lot of confidence to me as a QA, as well as to the PM on the team.
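To give a flavour, a critical scenario in a cucumber suite might be tagged along these lines (a hypothetical feature; the names and steps are invented for illustration, not taken from our project):

```gherkin
@critical
Feature: Checkout
  Scenario: Customer completes a purchase
    Given a customer with an item in the cart
    When the customer pays with a valid card
    Then an order confirmation is shown
```

A CI stage can then run just the critical suite with `cucumber --tags @critical`, failing the build if any business-critical scenario breaks.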
Coming to deployment, we don't have a separate release management team. We as QAs own the delivery of the product and the job of promoting and making the release. Once we have a stable build, we put it on the Staging environment for our clients to test it out and give us feedback. We have one-click deployment, as our environments are set up on Heroku, which further eases the job.