Database Testing With TestContainers
Running tests that involve a database can sometimes be a bit hairy. In this post, we take a look at TestContainers, a solution that helps you get around those issues.
If you have ever written code that tests database interactions, such as data access objects, you very possibly have run up against one of the most perennial annoyances in testing: In order to accurately test these interactions, a database is required.
For the sake of this article, let’s consider an application that will be using PostgreSQL as part of its environment because that is what the examples will use. Also, although H2 is mentioned extensively, this is meant in no way to denigrate it – used in the right place, it’s a great tool.
The Problem
Various approaches to solve this problem have been put forward, but there always seems to be some drawback.
One testing approach would be to use an in-memory database such as H2.
Pros:
- The database is local to the virtual machine
- The database lifecycle is managed by the build process
- The initial state is managed by either the build process or the test
Cons:
- You’re not accurately modeling the environment
- Not all features of the production database are supported
- Different datatypes mean different column definitions
- Multiple tests touching the same tables can’t be run in parallel without conflicts
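To make the "not all features are supported" point concrete, here is a hedged example of PostgreSQL-specific SQL (the table and columns are hypothetical) that has no direct H2 equivalent:

```sql
-- PostgreSQL upsert syntax; H2 uses MERGE instead, so this statement
-- would have to be rewritten to run against an in-memory H2 database
INSERT INTO account (id, key) VALUES (1, 'test-account-a')
ON CONFLICT (id) DO UPDATE SET key = EXCLUDED.key;

-- JSONB columns and the @> containment operator are PostgreSQL-only
SELECT * FROM account WHERE preferences @> '{"newsletter": true}';
```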
If you consider these constraints to be unacceptable, you may consider having a well-known instance of the PostgreSQL database running that is set aside for testing.
Pros:
- 100% compatibility with the production database
Cons:
- No guarantee of initial data state
- Multiple tests within the same build that touch the same tables can’t be run in parallel without conflicts
- Concurrent builds can lead to inconsistent results
- Continuous integration builds can be broken by developers running local tests
A further refinement of this approach would be for each developer to have their own instance of the PostgreSQL database.
Pros:
- 100% compatibility with the production database
- Developer builds do not interfere with continuous integration builds
Cons:
- No guarantee of initial data state
- Multiple tests within the same build that touch the same tables can’t be run in parallel without conflicts
- Concurrent builds can lead to inconsistent results
- Developers have to keep their database instance up-to-date (or tooling must be added to manage this)
With each of these approaches, I see the cons as being detrimental enough to partially or completely cancel out the pros.
The Takeaway
Breaking down the three approaches above, we can see the following features are desirable:
- The database should be tied to the test (not the virtual machine)
  - An implication of this is that test parallelization is now possible
- The database lifecycle should be managed by the build
- The database should be identical to that used in production
My New Favorite Solution
Using TestContainers, we can tick off each of these features. Using a JUnit @Rule, TestContainers will start a per-test Docker container that provides a database that lives as long as the test. Because each Docker container is totally isolated, tests can be run in parallel to speed up builds.
This last point is very important because, as noted above, there always seems to be some drawback. In this case, the overhead of starting the Docker container and everything it contains will increase your overall build time. I would (and do) argue that the increased test time is a small price to pay for having all of our desirable features.
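For what it's worth, parallelizing at the JVM level is one way to exploit that isolation. A sketch, assuming a Maven build with the Surefire plugin (the fork count here is an arbitrary assumption):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- run test classes in separate JVMs; each @Rule starts its own container -->
    <forkCount>4</forkCount>
    <reuseForks>false</reuseForks>
  </configuration>
</plugin>
```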
Each database supported out of the box by TestContainers has a specific rule, and this rule can be used to obtain all the details needed to connect to the database.
import org.junit.Before;
import org.junit.Rule;
import org.testcontainers.containers.PostgreSQLContainer;

public class FooDaoTest {

    @Rule
    public PostgreSQLContainer postgres = new PostgreSQLContainer();

    @Before
    public void setUp() {
        // populate the database using the connection details exposed by the rule:
        // postgres.getDriverClassName()
        // postgres.getJdbcUrl()
        // postgres.getUsername()
        // postgres.getPassword()
    }
}
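As a sketch of how those accessors might be used to seed data (the table and seed SQL here are hypothetical, and plain JDBC stands in for whatever data access layer you use):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

class DbSeeder {

    // the SQL used to seed the test table; a pure value, independent of the container
    static String seedSql() {
        return "INSERT INTO account (id, key) VALUES (1, 'test-account-a')";
    }

    // opens a connection using the details exposed by the PostgreSQLContainer rule
    static void seed(String jdbcUrl, String username, String password) throws Exception {
        try (Connection connection = DriverManager.getConnection(jdbcUrl, username, password);
             Statement statement = connection.createStatement()) {
            statement.executeUpdate(seedSql());
        }
    }
}
```

In setUp() you would then call something like DbSeeder.seed(postgres.getJdbcUrl(), postgres.getUsername(), postgres.getPassword()).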
Alternatively…
According to the documentation, it's possible to have a new container start up by altering the JDBC URL to contain tc:, for example jdbc:tc:postgresql://hostname/databasename. However, this failed in my application due to this line in the driver:
if (!url.startsWith("jdbc:postgresql:")) {
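That guard can be demonstrated in isolation: the tc: URL simply doesn't start with the prefix the PostgreSQL driver checks for, so the driver rejects it before TestContainers can intercept the connection. A minimal reproduction of the quoted check (not the driver's full logic):

```java
class JdbcPrefixCheck {

    // mirrors the guard quoted above from the PostgreSQL driver
    static boolean accepts(String url) {
        return url.startsWith("jdbc:postgresql:");
    }
}
```

Here accepts("jdbc:postgresql://hostname/databasename") returns true, while accepts("jdbc:tc:postgresql://hostname/databasename") returns false.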
An Anecdote
To throw an anecdote in here, I switched an application from using H2 to using Dockerized PostgreSQL in 10 minutes, and it made my life way simpler. We're using jOOQ for our database interactions, and we had found ourselves faced with removing some very nice jOOQ features because H2 didn't support them.
Let me repeat that. We were faced with changing production code due to limitations in the test environment.
That is not and never will be an acceptable situation, so the discovery of TestContainers was both fortuitous and time-saving. Fortuitous because it gave us exactly what we need, but time-saving? How can I say that when I just said it increases test time? Simple: I don't need to spend time checking if there is an H2 compatibility mode that supports the feature I'm using; I don't find myself writing code that must later be removed because H2 won't allow it; I can write my tests and DB-related code, and I'm done.
Wow, an Entire Blog Post Where You Don’t Mention Play?
Nope. Here’s an easy way to use it with Play, based on the application I just mentioned.
To start, create a mixin that combines the TestContainer with Play’s database support.
package be.objectify.tcexample.db;
import com.google.common.collect.ImmutableMap;
import org.testcontainers.containers.PostgreSQLContainer;
import play.db.Database;
import play.db.Databases;
import play.db.evolutions.Evolutions;
public interface DbTestSupport {

    default Database create(final PostgreSQLContainer postgres) throws Exception {
        final Database database = Databases.createFrom("default",
                                                       postgres.getDriverClassName(),
                                                       postgres.getJdbcUrl(),
                                                       ImmutableMap.of("username", postgres.getUsername(),
                                                                       "password", postgres.getPassword()));
        Evolutions.applyEvolutions(database);
        return database;
    }

    default void destroy(final Database database) {
        Evolutions.cleanupEvolutions(database);
        database.shutdown();
    }
}
The reason I use a mixin here is that people tend to define DAO tests alongside the interfaces – see my previous post on this approach. It would be nicer if the tests themselves could be defined as mixins, because the common DB setup code could then be placed into a common class that is extended to implement the test mixins, but JUnit doesn't recognize tests defined in this way.
So, the abstract test class has no knowledge that its implementations require a database – it purely tests the contract of the interface.
package be.objectify.tcexample;
import org.junit.Test;
import static org.assertj.core.api.Assertions.assertThat;
public abstract class AbstractUserDaoTest {

    @Test
    public void testFoo() {
        assertThat(dao().something()).isEqualTo(whatever);
    }

    // many, many more tests

    public abstract UserDao dao();
}
Back over by our database-specific implementation, we can now make sure that our implementation behaves in the way the contract requires.
package be.objectify.tcexample.db;
import be.objectify.tcexample.AbstractUserDaoTest;
import be.objectify.tcexample.UserDao;
import org.junit.After;
import org.junit.Before;
import org.junit.Rule;
import org.testcontainers.containers.PostgreSQLContainer;
import play.db.Database;
public class JooqUserDaoTest extends AbstractUserDaoTest implements DbTestSupport,
                                                                    TestData {

    @Rule
    public PostgreSQLContainer postgres = new PostgreSQLContainer();

    private Database database;

    @Before
    public void setup() throws Exception {
        // the database has all evolutions applied
        database = create(postgres);
        // load some test data
        loadTestData(database);
    }

    @After
    public void tearDown() {
        destroy(database);
    }

    @Override
    public UserDao dao() {
        return new JooqUserDao(database);
    }
}
Our JooqUserDao implementation will now run against a real instance of the database type used in production.
The TestData interface used in JooqUserDaoTest is just another mixin that loads some data into the database. The implementation isn't particularly important because it very much depends on your own requirements, but it may look something like this.
package be.objectify.tcexample.db;
import org.jooq.impl.DSL;
import play.db.Database;
import java.sql.Connection;
import java.sql.Timestamp;
import java.time.Instant;
import static be.objectify.tcexample.db.jooq.generated.Tables.ACCOUNT;
public interface TestData {
default void loadTestData(Database database) {
database.withConnection((Connection conn) -> {
DSL.using(conn)
.insertInto(ACCOUNT,
ACCOUNT.ID,
ACCOUNT.KEY,
ACCOUNT.CREATED_ON)
.values(1,
"test-account-a",
Timestamp.from(Instant.now()))
.execute();
DSL.using(conn)
.insertInto(ACCOUNT,
ACCOUNT.ID,
ACCOUNT.KEY,
ACCOUNT.CREATED_ON)
.values(2,
"test-account-b",
Timestamp.from(Instant.now()))
.execute();
});
}
}
Published at DZone with permission of Steve Chaloner, DZone MVB. See the original article here.