Preface to integration tests: the test pyramid
Before we can get into Testcontainers, we need to introduce the “test pyramid”, a concept concerning the separation and number of tests for software, which hasn’t really evolved since its inception. We’ve seen additional layers pop up, e.g. contract-driven testing, fuzzing and so on. Like any other concept that industry leaders agree upon, antipatterns are being blogged about quite heavily, my personal favorite being the “ice cream cone”. The reality often comes down to the fact that you don’t trust even your most sophisticated tests, and the pyramid gets a big fat block of manual tests on top. “Shift left” describes a scenario where everyone gets involved in the software delivery process as early as possible. Please don’t just brush this blob away by saying that you can hire testers for it. It doesn’t even matter who actually does the testing: it’s still money wasted on manual labor when integration tests can be automated.

The base of the pyramid, namely unit testing, is a topic that needs no further addressing in my opinion. The methodology and tooling that evolved for testing a single function are so heavily blogged about that we can consider that chapter closed.
End-to-end testing has also had its prime time in the 2010s. Not only did we build a solid understanding of how to deal with the new world of Selenium and automated browsers, but convenience frameworks and tooling also became the trend and slowly gained adoption.
The big fat blob of manual tests on top is a bit problematic, but most likely we’ll never entirely get rid of it.
Integration tests in theory: an example
Let’s talk about the integration testing stage. What an integration test covers can be illustrated with a simple Venn diagram: wherever two or more unit-tested modules intersect with each other, et voilà, that’s where an integration test comes into play. But let’s take a more practical example that everyone can understand, a webshop order (a minimal sketch follows after the sidenote below):
- The Order module asks the User module if the user is active and exists, then
- it asks the Payment module if the user can pay the amount via the selected payment method, then
- it asks the Product module if the products in the cart are available and can be shipped to the specified address, then
- it creates a database entry for the desired shipment for further processing. Or it writes a log entry into a message queue. You get the gist.
Sidenote: most engineers are nitpickers (or rather, have to be in a way): no, this does not describe every webshop, nor your specific one. Again, it’s just an example.
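To make the “intersection” idea a bit more concrete, here is a minimal, purely illustrative sketch of such an order-flow test in JUnit 4. All module names and signatures are made up for this example, and the collaborators are reduced to trivial lambdas; the real value only appears once actual implementations backed by a database or message queue are plugged in, which is exactly where Testcontainers enters below.

import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class OrderFlowIntegrationTest {

    // Hypothetical module interfaces -- in a real codebase these would be the
    // already unit-tested User, Payment and Product modules.
    interface UserModule { boolean isActive(String userId); }
    interface PaymentModule { boolean canPay(String userId, long grossAmountCents); }
    interface ProductModule { boolean canShip(String cartId, String country); }

    // The Order module only accepts an order if all three collaborators agree.
    static class OrderModule {
        private final UserModule users;
        private final PaymentModule payments;
        private final ProductModule products;

        OrderModule(UserModule users, PaymentModule payments, ProductModule products) {
            this.users = users;
            this.payments = payments;
            this.products = products;
        }

        boolean placeOrder(String userId, String cartId, String country, long grossAmountCents) {
            return users.isActive(userId)
                    && payments.canPay(userId, grossAmountCents)
                    && products.canShip(cartId, country);
        }
    }

    @Test
    public void activeUserWithValidPaymentAndShippableCartCanOrder() {
        // The integration test exercises the seams between the modules,
        // not the internals of any single one of them.
        OrderModule orders = new OrderModule(
                id -> true,
                (id, amount) -> true,
                (cart, country) -> true);
        assertTrue(orders.placeOrder("user-42", "cart-1", "DE", 4999L));
    }
}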
Testcontainers – the framework
A lot of error cases come to mind: users can be inactive, can have bad credit that doesn’t allow a gross amount above a certain threshold, the products cannot be shipped to said country, etc. Given that you write your services in a single language, you can even set up a single test suite where all the conditionals, exceptions and edge cases are handled. Yet in the end, no matter how you slice and dice the interactions between your modules into integration test groups, they will always include a third-party component, like a database, a message queue, a cache, etc. And this is where Testcontainers comes into play.
Given you just go ahead and pack your test suite into a container, Testcontainers gives you an abstraction over the Docker API, so you can easily create, read from, update and remove the containers used in your test scenarios in a programmatic fashion. It includes nice features like re-usability of certain containers, which allows for running these often hours-long test suites in a massively parallelized fashion. So let’s look at an example:
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.fail;

import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

import org.junit.ClassRule;
import org.junit.Test;
import org.testcontainers.containers.MySQLContainer;

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

public class MySQLSmokeTest {

    // One MySQL container shared by all tests in this class.
    @ClassRule
    public static MySQLContainer<?> ct = new MySQLContainer<>()
            .withConnectTimeoutSeconds(10)
            .withStartupTimeoutSeconds(15);

    protected ResultSet query(String sql) throws SQLException {
        // Point a Hikari connection pool at the containerized database.
        HikariConfig cfg = new HikariConfig();
        cfg.setDriverClassName(ct.getDriverClassName());
        cfg.setJdbcUrl(ct.getJdbcUrl());
        cfg.setUsername(ct.getUsername());
        cfg.setPassword(ct.getPassword());
        HikariDataSource ds = new HikariDataSource(cfg);

        Statement st = ds.getConnection().createStatement();
        st.execute(sql);
        ResultSet rs = st.getResultSet();
        rs.next();
        return rs;
    }

    @Test
    public void testSmokeContainer() throws SQLException {
        String msg = "What everyone uses to see if MySQL is up + running";
        ResultSet rs = query("SELECT 1");
        int rsInt = rs.getInt(1);
        assertEquals(msg, 1, rsInt);
        try {
            // Poke the container API itself to make sure it responds.
            String[] cmd = ct.getCommandParts();
        } catch (UnsupportedOperationException e) {
            fail("MySQL Testcontainer failed to respond accordingly!");
        }
    }
}
This is pretty straightforward. If you have no experience with Java or MySQL: Hikari is a connection pool library for JDBC, which is the API you use in Java to connect to most databases.
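The re-usability mentioned above deserves a quick illustration. The following is a minimal sketch of the (at the time of writing, experimental) re-use feature; it assumes you have opted in via testcontainers.reuse.enable=true in ~/.testcontainers.properties, and the container is started manually rather than via a JUnit rule, since the rule would stop it at the end of the run:

// A re-usable MySQL container: subsequent test runs with the same
// configuration pick up the already running instance instead of
// starting a fresh one.
static final MySQLContainer<?> SHARED_MYSQL = new MySQLContainer<>("mysql:8.0")
        .withReuse(true);

static {
    // Started manually so the container outlives a single test run.
    SHARED_MYSQL.start();
}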
Even if you do not rely on Docker to orchestrate tests of the communication between your own components, you can still deal with external components in heavily battle-tested ways. Some people use this mechanism to see whether, from an application-code perspective, they’re fit to migrate their apps to newer versions of these external components: does my app, which uses Redis as a caching layer, work well with Redis’ latest release?
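Such a migration check can look roughly like the following sketch. The image tags and the Jedis client are merely examples of my own choosing; the point is that the very same assertions run against the version you have in production and the one you want to migrate to:

import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.testcontainers.containers.GenericContainer;

import redis.clients.jedis.Jedis;

public class RedisMigrationTest {

    private void smokeTestAgainst(String imageTag) {
        try (GenericContainer<?> redis = new GenericContainer<>("redis:" + imageTag)
                .withExposedPorts(6379)) {
            redis.start();
            // Talk to the containerized Redis exactly like the application would.
            try (Jedis client = new Jedis(redis.getHost(), redis.getMappedPort(6379))) {
                client.set("greeting", "hello");
                assertEquals("hello", client.get("greeting"));
            }
        }
    }

    @Test
    public void worksWithCurrentAndNextRedis() {
        // Run the same assertions against the version in production and the
        // release we plan to migrate to.
        smokeTestAgainst("6.2");
        smokeTestAgainst("7.2");
    }
}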
A great upside of the entire story is that most CI systems already support running things containerized. There are no obstacles to running a scenario like the one described above on FOSS tools like Jenkins or paid platforms like Drone (the latter in particular already runs the main entrypoint in a container, so the result is a Docker-in-Docker scenario).
Another great upside is that Testcontainers supports Selenium, and therefore a scriptable browser, controllable via the exact same API. Even though many people would argue that it’s a bad idea to pack every layer of tests into the same codebase, this is now totally possible using this great tool.
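As a rough sketch of how that looks (assuming the Testcontainers selenium module and a Selenium client are on the classpath; the URL and assertion are placeholders):

import static org.junit.Assert.assertFalse;

import org.junit.Rule;
import org.junit.Test;
import org.openqa.selenium.chrome.ChromeOptions;
import org.openqa.selenium.remote.RemoteWebDriver;
import org.testcontainers.containers.BrowserWebDriverContainer;

public class BrowserSmokeTest {

    // A containerized Chrome, managed with the same lifecycle API
    // as the database container from the earlier example.
    @Rule
    public BrowserWebDriverContainer<?> chrome = new BrowserWebDriverContainer<>()
            .withCapabilities(new ChromeOptions());

    @Test
    public void pageHasATitle() {
        RemoteWebDriver driver = chrome.getWebDriver();
        driver.get("https://example.org");
        assertFalse(driver.getTitle().isEmpty());
    }
}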
Yet another hint: if you use a technology which currently does not have a Testcontainers module, you still have the GenericContainer class, which offers higher-level ways of doing basically everything you need. But as it turns out, delivering additional modules isn’t hard at all!
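For instance, a quick sketch of GenericContainer with Memcached, which I’m picking purely because it has no dedicated module; the image tag and port are the usual defaults, not anything Testcontainers-specific:

import org.testcontainers.containers.GenericContainer;
import org.testcontainers.containers.wait.strategy.Wait;

public class MemcachedContainerExample {

    public static void main(String[] args) {
        GenericContainer<?> memcached = new GenericContainer<>("memcached:1.6-alpine")
                .withExposedPorts(11211)
                .waitingFor(Wait.forListeningPort());
        memcached.start();

        // Point the application at the randomly mapped host port.
        System.out.println(memcached.getHost() + ":" + memcached.getMappedPort(11211));

        memcached.stop();
    }
}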
How Instana does it
One team at Instana that heavily utilizes Testcontainers is the team that works on the sensors for third-party software. When you run the Instana agent on your machines, not only do we automatically make sense of how your services interact with each other, but the infrastructure needs to be monitored as well! So we detect that you run Elasticsearch on your machine, attach to that process, and start to grab metrics like the number of queries or the Lucene segments every second, until either Elasticsearch or our Instana agent is stopped.
From a QA perspective this gives you many hard questions to answer:
- Which version range of this software do we support, and for how long?
- Which language platform (in the case above: the JVM) version do we support?
- What cluster software is running underneath the setup?
- And of course the operating system?
All of the above are questions where a tool that you can simply fire all your test cases against is the best possible thing that can happen to you. So we utilize Testcontainers to see where our code might get wrong input or no input, and what needs to be fixed in particular.
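In spirit, such a version matrix can be sketched like this; the version list is just an illustration, and the actual sensor assertions are Instana-internal, so they are only hinted at in a comment:

import org.junit.Test;
import org.testcontainers.elasticsearch.ElasticsearchContainer;

public class ElasticsearchSensorMatrixTest {

    // Illustrative version list -- the real supported range lives elsewhere.
    private static final String[] VERSIONS = {"7.10.2", "7.17.10"};

    @Test
    public void sensorCopesWithSupportedVersions() {
        for (String version : VERSIONS) {
            try (ElasticsearchContainer es = new ElasticsearchContainer(
                    "docker.elastic.co/elasticsearch/elasticsearch:" + version)) {
                es.start();
                String address = es.getHttpHostAddress();
                // Here the real test would attach the sensor to `address` and
                // assert that metrics such as query counts arrive every second.
            }
        }
    }
}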
Get involved
Give it a spin! Here you can find:
- The website, where you can find further examples of how Testcontainers works and which technologies are supported.
- The GitHub org, with all its language repositories and APIs.
- The public Slack, where you can just shoot your question.