Lately, a lot of people have been saying that testing lags a bit behind the rest of the IT industry. Is that true? Partly.
Remember how things worked 10-12 years ago: a developer wrote code and sent it to a reviewer. After all the edits and low-level tests, the code went to the admins to be built and deployed. The assembled service was handed to QA, where everything was tested, and if testing came back "green", the product shipped as the gold release. People passed artifacts from hand to hand. But then came the pipeline: Git, Docker, Kubernetes, and other scary stuff. The development team started working right on the assembly line - like a Ford factory in 1913!
Except that many testing teams are still trying to take our Ford off the assembly line, only to put it back on afterwards. We can see where that leads. Is it possible to ensure quality without sacrificing speed? And if not, how much quality can be sacrificed for the sake of efficiency? In his new talk, Artem Eroshenko explains how to find answers to these questions.