PostgreSQL to SQLite: The Journey
This article will be useful if you want to support both PostgreSQL and SQLite using JDBC. It will be especially useful if you:
- Are already accessing values from your (PostgreSQL) database using the regular JDBC ResultSet interface, like:
  Date d = rs.getDate("date_field");
  BigDecimal bd = rs.getBigDecimal("bigdecimal_field");

and it is causing trouble when you do the same for SQLite, but you don't want to change that code.
- Are already retrieving autogenerated keys in PostgreSQL with a RETURNING clause, which won't work in SQLite. You want a unified solution that works for both databases.
- Thought foreign keys were enforced in SQLite by default (like in PostgreSQL) and hit a wall: SQLite lets you delete entries from your tables even when they are referenced from another table and you have explicitly told SQLite about it with a REFERENCES table_name(field_name) clause.
- Are having trouble with the differences between PostgreSQL and SQLite dialects (mostly concerning data types), for example, when making query filters with boolean values.
- Had your own way to manage exceptions for PostgreSQL, and it is not working for SQLite (obviously). You want SQLite to fit into the model you already have.
- Other stuff might appear if you keep reading...
A few months ago I wanted to migrate an app to use SQLite as a data backend. In fact, I wanted it to work with both PostgreSQL and SQLite interchangeably (though not at the same time). I wanted to switch between these two databases easily without changing any code. I did it, but along the way I had to solve some problems that might be interesting to many other people.
The solutions I found were spread across the web, but there was no single place that explained how to completely achieve what I wanted. So the aim of this post is to condense my learning into one article that may serve others as a (semi) complete guide. It might be useful not only to those creating their own frameworks, but to anyone who doesn't use one and is willing to try some quirks and tricks to make their app work.
THE BEGINNING
There are many cross-database incompatibilities between PostgreSQL and SQLite, most notably around data types. If you want the same code to work for both databases, you'd better use a framework that manages this for you. But here's the thing: the framework I use is one I created myself, and it didn't (completely) take these differences into account, since I mainly use PostgreSQL as my database; that's how and why my problems arose.
My framework covers many things, but I focus here on the data access part. It uses a JDBC driver to connect to the databases, but provides more abstract ways to work with them; that's essentially the data access part of the framework.
A basic DAO class for my framework would look like this:
public class MyDAO extends BaseDAO {

    public MyDAO() {
        super("context_alias", new DefaultDataMappingStrategy() {
            @Override
            public Object createResultObject(ResultSet rs) throws SQLException {
                MyModel model = (MyModel) ObjectsFactory.getObject("my_model_alias");
                model.setStringField(rs.getString("string_field"));
                model.setIntegerField(rs.getInt("integer_field"));
                model.setBigDecimalField(rs.getBigDecimal("bigdecimal_field"));
                model.setDateField(rs.getDate("date_field"));
                model.setBooleanField(rs.getBoolean("boolean_field"));
                return model;
            }
        });
    }

    @Override
    public String getTableName() { return "table_name"; }

    @Override
    public String getKeyFields() { return "string_field|integer_field"; }

    @Override
    protected Map getInsertionMap(Object obj) {
        Map map = new HashMap();
        MyModel model = (MyModel) obj;
        map.put("string_field", model.getStringField());
        map.put("integer_field", model.getIntegerField());
        map.put("bigdecimal_field", model.getBigDecimalField());
        map.put("date_field", model.getDateField());
        map.put("boolean_field", model.getBooleanField());
        return map;
    }

    @Override
    protected Map getUpdateMap(Object obj) {
        Map map = new HashMap();
        MyModel model = (MyModel) obj;
        map.put("bigdecimal_field", model.getBigDecimalField());
        map.put("date_field", model.getDateField());
        map.put("boolean_field", model.getBooleanField());
        return map;
    }

    @Override
    public String getFindAllStatement() { return "SELECT * FROM :@ "; }
}
So, when I say I wanted to switch between databases without changing code, I mean I wanted to switch without changing my DAO classes.
For SQLite, I used the Xerial sqlite-jdbc driver. I mention drivers because some things can be driver-specific when solving these problems; so when I say 'SQLite does it this way', I generally mean 'the Xerial sqlite-jdbc driver does it this way'.
Now, let's start.
WARNING: Some of the solutions I give here fit into my framework, but might not directly fit into your code. It's up to you to imagine how to adapt what I provide here.
DATA TYPES
Since there are some differences between PostgreSQL and SQLite regarding data types, and I wanted to continue to access database values through the regular ResultSet interface, I had to have some mechanism to intercept the call to, for instance, resultset.getDate("date_field"). So I created a ResultSetWrapper class that would redefine the methods I was interested in, like this:
public class ResultSetWrapper implements ResultSet {

    // The wrapped ResultSet
    ResultSet wrapped;

    /* I will use this DateFormat to parse dates stored as TEXT.
       Note the pattern is "yyyy-MM-dd" (MM: months, not minutes);
       it really should be configurable. */
    SimpleDateFormat df = new SimpleDateFormat("yyyy-MM-dd");

    public ResultSetWrapper(ResultSet wrapped) {
        this.wrapped = wrapped;
    }

    /* Lots of ResultSet method implementations go here, but this is an
       example of redefining a method whose behavior I wanted to change: */
    public Date getDate(String columnLabel) throws SQLException {
        Object value = this.wrapped.getObject(columnLabel);
        return (Date) TypesInferrer.inferDate(value);
    }
}
The getDate() method in ResultSetWrapper relies on TypesInferrer to convert the retrieved value to a Date.
All data type conversions are encapsulated inside TypesInferrer, which has methods to convert between data types as needed. For instance, it has a method like this one:
public static Object inferDate(Object value) {
    java.util.Date date = null;
    // Do conversions here (convert value and assign to date)
    return date;
}
which tries to convert any value to a Date (I'll show the actual implementation later).
Now, instead of using the original ResultSet returned by preparedStatement.executeQuery(), you use new ResultSetWrapper(preparedStatement.executeQuery()). That's what my framework does: it passes this wrapped ResultSet to the DAO objects.
Now let's see some type conversions.
Mixing PostgreSQL Date and SQLite Long/String
You could store Date values as text in an SQLite database (e.g. '2013-10-09'); you can do this manually when creating the database, but when SQLite stores a Date object, by default it converts it to a Long value. There is no problem when saving the value to the SQLite database, but if you try to retrieve it using resultset.getDate("date_field"), things get messy; it simply won't work (you get a ClassCastException).
How do you access Date values, then? You create this method in TypesInferrer, which covers both the String and Long variations:
public static Object inferDate(Object value) {
    java.util.Date date = null;
    if (value == null) return null;
    if (value instanceof String) {
        try {
            date = df.parse((String) value);  // df is a 'yyyy-MM-dd' SimpleDateFormat
        } catch (ParseException ex) {
            // Deal with ex
        }
    } else if (value instanceof Long) {
        date = new java.util.Date((Long) value);
    } else {
        date = (Date) value;
    }
    return new Date(date.getTime());
}
And as you saw, the getDate() method in ResultSetWrapper is redefined like this:

@Override
public Date getDate(String columnLabel) throws SQLException {
    Object value = this.wrapped.getObject(columnLabel);
    return (Date) TypesInferrer.inferDate(value);
}
Now all DAOs can retrieve Date values from both databases interchangeably, using resultset.getDate("date_field").
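As a sanity check, here is a self-contained sketch of the inference above (the class name and the exception handling are illustrative): a date stored as TEXT and the equivalent epoch-millis Long should come out as the same java.sql.Date.

```java
import java.sql.Date;
import java.text.ParseException;
import java.text.SimpleDateFormat;

public class TypesInferrerSketch {

    public static Date inferDate(Object value) {
        if (value == null) return null;
        java.util.Date date;
        if (value instanceof String) {
            try {
                // TEXT column: parse with the expected pattern
                date = new SimpleDateFormat("yyyy-MM-dd").parse((String) value);
            } catch (ParseException ex) {
                throw new IllegalArgumentException("Unparseable date: " + value, ex);
            }
        } else if (value instanceof Long) {
            // INTEGER column: treat as epoch millis
            date = new java.util.Date((Long) value);
        } else {
            // Anything else: assume the driver already gave us a Date
            date = (java.util.Date) value;
        }
        return new Date(date.getTime());
    }

    public static void main(String[] args) throws ParseException {
        long millis = new SimpleDateFormat("yyyy-MM-dd").parse("2013-10-09").getTime();
        // Both representations yield the same java.sql.Date
        System.out.println(inferDate("2013-10-09").equals(inferDate(millis))); // true
    }
}
```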
Mixing PostgreSQL Numeric and SQLite Integer/Double/...
My SQLite driver didn't implement the getBigDecimal() function. It complained like this when I called it: java.sql.SQLException: not implemented by SQLite JDBC driver.
So I had to come up with a solution valid for both PostgreSQL and SQLite. This is what I did in ResultSetWrapper:
@Override
public BigDecimal getBigDecimal(String columnLabel) throws SQLException {
    Object value = this.wrapped.getObject(columnLabel);
    return (BigDecimal) TypesInferrer.inferBigDecimal(value);
}
But value could come back as different types depending on the actual value stored in the database; it could be an Integer, or a Double, or perhaps something else. I covered all the cases by doing this in TypesInferrer:
public static Object inferBigDecimal(Object value) {
    if (value == null) return null;
    if (!(value instanceof BigDecimal)) {
        return new BigDecimal(String.valueOf(value));
    }
    return value;
}
Anyway, the String constructor is the recommended way to build a BigDecimal, so everything's fine with this. Now you can retrieve BigDecimal values using resultset.getBigDecimal("bigdecimal_field") from both databases.
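As an aside, a quick demonstration of why the String constructor is the recommended one: the double constructor captures the binary approximation of the literal, while the String constructor captures exactly what was written.

```java
import java.math.BigDecimal;

public class BigDecimalConstructors {
    public static void main(String[] args) {
        // The double constructor stores the binary approximation of 0.1
        BigDecimal fromDouble = new BigDecimal(0.1);
        // The String constructor stores exactly 0.1
        BigDecimal fromString = new BigDecimal("0.1");
        System.out.println(fromString);                    // 0.1
        System.out.println(fromDouble.equals(fromString)); // false
    }
}
```

This is why inferBigDecimal goes through String.valueOf(value) rather than handing a double straight to BigDecimal.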
Mixing PostgreSQL Boolean and SQLite Integer
SQLite doesn't have a boolean type. Instead, it interprets other values as booleans by following some rules. When SQLite saves a Boolean value to the database, it stores it as 0 or 1 for false or true, respectively. And because drivers can interpret any value as a boolean, you can use resultset.getBoolean("boolean_field") and it will work as expected under those rules.
But the problem I faced was when creating filters. If a value for true is stored as 1 in the SQLite database, you can't expect the clause WHERE boolean_field = true to work. You will never find a match. Instead, you should have said WHERE boolean_field = 1.
In my app, I created filters like this:
dao.addFilter(new FilterSimple("boolean_field", true));
Now I needed FilterSimple to infer that, for SQLite, I meant 1 instead of true. So I created what I called a DataSourceVariation: objects that are specific to each type of database and are used across all data access, by DAOs, Filters, and other objects. These objects take care of managing all my cross-database incompatibilities, including:
- The way to reference a database object: in PostgreSQL you must prepend the schema name to every database object you refer to in your queries; in SQLite you don't.
- The way to manage exceptions: explained later in this post.
- The way to back up and restore data: explained later in this post.
- Expressing BETWEEN clauses: explained later in this post.
- And also, inferring boolean values.
For VariationSQLite, I did this:
@Override
public Object getReplaceValue(Object value) {
    if (value instanceof Boolean) {
        if ((Boolean) value) return Integer.valueOf(1);
        else return Integer.valueOf(0);
    }
    return value;
}
Now we can say dao.addFilter(new FilterSimple("boolean_field", true)) for both databases, assuming that FilterSimple uses the variation to adapt the value before constructing the clause.
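To make that concrete, here is a minimal sketch of how a filter can delegate value rewriting to the variation before building the clause. The names are hypothetical; my real FilterSimple and DataSourceVariation carry more responsibilities than this.

```java
public class FilterSketch {

    // Stand-in for my DataSourceVariation interface (hypothetical signature)
    public interface DataSourceVariation {
        Object getReplaceValue(Object value);
    }

    // SQLite variation: booleans become 0/1, everything else passes through
    public static class VariationSQLite implements DataSourceVariation {
        @Override
        public Object getReplaceValue(Object value) {
            if (value instanceof Boolean) {
                return ((Boolean) value) ? Integer.valueOf(1) : Integer.valueOf(0);
            }
            return value;
        }
    }

    // What a FilterSimple might do: adapt the value, then build the clause
    public static String simpleClause(String field, Object value, DataSourceVariation v) {
        return field + " = " + v.getReplaceValue(value);
    }

    public static void main(String[] args) {
        System.out.println(simpleClause("boolean_field", true, new VariationSQLite()));
        // boolean_field = 1
    }
}
```

A VariationPostgresql would simply return the value unchanged, so the same filter code works for both databases.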
RETRIEVING AUTOGENERATED KEYS
When you have auto-increment fields (e.g. serial), in PostgreSQL you can add a RETURNING clause at the end of an INSERT statement to automatically retrieve the values of autogenerated fields:
PreparedStatement pstm = conn.prepareStatement(queryWithReturningClause);
// e.g. INSERT INTO table_x(...) VALUES (...) RETURNING field_x
ResultSet rs = pstm.executeQuery();
if (rs.next()) {
    // Get autogenerated fields from rs
}
But that won't work with SQLite. In SQLite, retrieving autogenerated fields involves a process that goes from creating the statement, through executing the query, to explicitly asking for the generated values. Like this:
PreparedStatement pstm = conn.prepareStatement(queryWITHOUTreturningClause,
        Statement.RETURN_GENERATED_KEYS);
pstm.executeUpdate();
ResultSet rs = pstm.getGeneratedKeys();
if (rs != null && rs.next()) {
    // Get autogenerated fields from rs
}
The good news is that this code works for both PostgreSQL and SQLite, so I replaced my previous code with this, and didn't have to make any distinction between databases.
ENFORCING FOREIGN KEYS
You'd think that using a REFERENCES table_name(field_name) clause when creating an SQLite table causes foreign keys to be checked on deletes, updates, etc. You'd be wrong!
Foreign keys are not enforced in SQLite by default. You have to explicitly say it, and it's done when creating the connection (WARNING: This is very driver-specific):
SQLiteConfig config = new SQLiteConfig();
config.enforceForeignKeys(true);
Connection conn = DriverManager.getConnection("jdbc:sqlite:" + dataSourcePath,
        config.toProperties());
For PostgreSQL it's different, so you'd better have a connection pool for each type of database, and decide which one to use at runtime. My framework does exactly that.
NOTE: If you can obtain the connection depending on the database type, then you can enforce foreign keys transparently for both databases (for PostgreSQL it happens naturally, with no extra code). For instance, you could have an abstract getConnection() method, and each database's connection pool would return the connection in its own way.
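A minimal sketch of that idea follows. All names here are illustrative, and the real SQLite provider would also pass the SQLiteConfig properties shown above when opening the connection.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public abstract class ConnectionProviderSketch {

    // Each database type knows how to build its own JDBC URL...
    public abstract String jdbcUrl();

    // ...and callers only ever see this entry point.
    public Connection getConnection() throws SQLException {
        return DriverManager.getConnection(jdbcUrl());
    }

    public static ConnectionProviderSketch forSQLite(final String path) {
        return new ConnectionProviderSketch() {
            @Override
            public String jdbcUrl() {
                // The real version would also pass config.toProperties()
                // with enforceForeignKeys(true), as shown above.
                return "jdbc:sqlite:" + path;
            }
        };
    }

    public static ConnectionProviderSketch forPostgreSQL(final String host, final String db) {
        return new ConnectionProviderSketch() {
            @Override
            public String jdbcUrl() {
                return "jdbc:postgresql://" + host + "/" + db;
            }
        };
    }
}
```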
MANAGING EXCEPTIONS
I had defined some different types of database exceptions in my framework: ExceptionDBDuplicateEntry, ExceptionDBEntryReferencedElsewhere, etc., which would be thrown and propagated to upper layers of my architecture. For PostgreSQL, these exceptions mapped directly to some constant codes (which are normally vendor/driver specific): UNIQUE_VIOLATION = "23505", FOREIGN_KEY_VIOLATION = "23503", etc. So, for PostgreSQL, I managed database exceptions something like this:
@Override
public void manageException(SQLException ex)
        throws ExceptionDBDuplicateEntry, ExceptionDBEntryReferencedElsewhere {
    if (ex.getSQLState() == null) {
        ex = (SQLException) ex.getCause();
    }
    if (ex.getSQLState().equals(UNIQUE_VIOLATION)) {
        throw new ExceptionDBDuplicateEntry();
    } else if (ex.getSQLState().equals(FOREIGN_KEY_VIOLATION)) {
        throw new ExceptionDBEntryReferencedElsewhere();
    } else {
        DAOPackage.log(ex);
        throw new ExceptionDBUnknownError(ex);
    }
}
That won't work for SQLite, obviously! So what I did was move database exception management into the DataSourceVariation. The VariationPostgresql class has a method similar to the one above. For VariationSQLite, I did sort of a hack, but it's something that has worked so far (maybe until I change drivers).
@Override
public void manageException(SQLException ex)
        throws ExceptionDBDuplicateEntry, ExceptionDBEntryReferencedElsewhere {
    // This is a hack (is it???)
    String message = ex.getMessage().toLowerCase();
    if (message.contains("sqlite_constraint")) {
        if (message.contains("is not unique")) {
            throw new ExceptionDBDuplicateEntry();
        } else if (message.contains("foreign key constraint failed")) {
            throw new ExceptionDBEntryReferencedElsewhere();
        } else {
            DAOPackage.log(ex);
            throw new ExceptionDBUnknownError(ex);
        }
    } else {
        DAOPackage.log(ex);
        throw new ExceptionDBUnknownError(ex);
    }
}
Update: This technique might have some flaws. But hey, can you find a better approach right away?
FIXING BETWEEN CLAUSE
The problem with the BETWEEN clause appeared while using a filter like this:
dao.addFilter(new FilterBetween("date_field", date1, date2)); // date1 and date2 are java.util.Date objects
FilterBetween would create a BETWEEN clause by formatting Dates as Strings, normally with the pattern 'yyyy-MM-dd' (although this should be configurable). Since dates in SQLite are Long values, we can't create a clause like date_field BETWEEN '2013-01-01' AND '2013-02-01'. It had to be something like date_field >= 1357016400000 AND date_field <= 1359694800000.
So, I moved the creation of BETWEEN clauses to.... that's right, to DataSourceVariation. VariationSQLite does it like this:
@Override
public String getBetweenExpression(String fieldName, Object d1, Object d2) {
    String filter = "";
    try {
        Date dd1 = null;
        Date dd2 = null;
        // Remember, this pattern should be configurable (MM: months, not minutes)
        SimpleDateFormat df = new SimpleDateFormat("yyyy-MM-dd");
        if (d1 instanceof String) dd1 = df.parse((String) d1);
        else dd1 = (Date) d1;
        if (d2 instanceof String) dd2 = df.parse((String) d2);
        else dd2 = (Date) d2;
        filter = fieldName + " >= " + dd1.getTime()
               + " AND " + fieldName + " <= " + dd2.getTime();
    } catch (ParseException ex) {
        DAOPackage.log(ex);
        throw new ExceptionDBUnknownError(ex);
    }
    return filter;
}
CONCLUSIONS
As you can see, there are many intricacies in making an app support multiple database types. All I did here was support PostgreSQL and SQLite; who knows what's needed to support other databases as well. You can't expect JDBC alone to do all the work, so be prepared to solve one problem after another (and another, and another, ...) when making a database migration. And please, share your journey.