Incrementally Read/Stream a CSV File in Java
I’ve been doing some work that involves reading in CSV files, for which I’ve been using OpenCSV, and my initial approach was to read through the file line by line, parse the contents, and save them into a list of maps.
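As a rough sketch, that eager approach looks something like the following, where every row is parsed up front and held on the heap (the class name and file path are illustrative, not taken from the post):

import au.com.bytecode.opencsv.CSVReader; // OpenCSV 2.x; newer versions use com.opencsv.CSVReader

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class LoadWholeCSVFile {
    public static void main(String[] args) throws IOException {
        CSVReader csvReader = new CSVReader(
                new BufferedReader(new FileReader("/path/to/file.csv")), '\t');

        String[] fields = csvReader.readNext(); // header row, used as map keys

        // Every row is read eagerly and kept in memory as a map
        List<Map<String, Object>> rows = new ArrayList<Map<String, Object>>();
        String[] data;
        while ((data = csvReader.readNext()) != null) {
            Map<String, Object> row = new HashMap<String, Object>();
            for (int i = 0; i < data.length; i++) {
                row.put(fields[i], data[i]);
            }
            rows.add(row); // the whole file ends up on the heap
        }
        csvReader.close();
    }
}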
This works when the contents of the file fit into memory, but it’s problematic for larger files, where I needed to stream the file and process each line individually rather than process them all after the file had been loaded.
I initially wrote a variation on totallylazy’s Strings#lines to do this, and while I was able to stream the file, I made a mistake somewhere which meant the number of maps on the heap was always increasing.
After spending a few hours trying to fix this, Michael suggested that it’d be easier to use an iterator instead, and I ended up with the following code:
import au.com.bytecode.opencsv.CSVReader; // OpenCSV 2.x; newer versions use com.opencsv.CSVReader

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

public class ParseCSVFile {
    public static void main(String[] args) throws IOException {
        final CSVReader csvReader = new CSVReader(
                new BufferedReader(new FileReader("/path/to/file.csv")), '\t');

        // The first row holds the column names, which become the keys of each map
        final String[] fields = csvReader.readNext();

        // Wrap the reader in an iterator so that rows are only read as they are consumed
        final Iterator<Map<String, Object>> lazilyLoadedFile = new Iterator<Map<String, Object>>() {
            String[] data = csvReader.readNext();

            @Override
            public boolean hasNext() {
                return data != null;
            }

            @Override
            public Map<String, Object> next() {
                final Map<String, Object> properties = new HashMap<String, Object>();
                for (int i = 0; i < data.length; i++) {
                    properties.put(fields[i], data[i]);
                }
                try {
                    data = csvReader.readNext(); // read the next row ahead of the next call
                } catch (IOException e) {
                    data = null; // treat a read failure as the end of the file
                }
                return properties;
            }

            @Override
            public void remove() {
                throw new UnsupportedOperationException();
            }
        };
    }
}
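For completeness, here’s a rough sketch of how that iterator might then be consumed inside main (this calling code isn’t from the original post): because rows are only read inside next(), only one map needs to be live on the heap at a time.

        // Hypothetical calling code: process one row at a time instead of
        // materialising the whole file in memory.
        while (lazilyLoadedFile.hasNext()) {
            Map<String, Object> row = lazilyLoadedFile.next();
            System.out.println(row); // or whatever per-row processing is needed
        }
        csvReader.close(); // close the underlying reader once we're done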
Although this code works, it’s not the most readable function I’ve ever written, so any suggestions on how to do this in a cleaner way are welcome.
Published at DZone with permission of Mark Needham, DZone MVB.