Reading large files in chunks with proper encapsulation

added by JefClaes
3/26/2013 1:13:05 PM


I've been doing some work lately that involves sequentially reading large files (upwards of 2 to 5 GB). At that size, reading the whole structure into memory is not an option; it's more reliable to process the file in chunks. I occasionally come across legacy code that solves exactly this problem, but in a procedural way, resulting in tangled spaghetti. To be honest, the first piece of software I ever wrote in a professional setting also went about it the wrong way. There is no reason to let it come to this, though; you can use the often overlooked yield return keyword to improve encapsulation.

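The article's own code isn't reproduced in this excerpt, but a minimal sketch of the technique might look like the following. The class name, method name and chunk size are illustrative assumptions, not the author's; the point is that yield return hides the stream and buffer handling behind a plain IEnumerable.

using System;
using System.Collections.Generic;
using System.IO;

public static class ChunkedReader
{
    // Lazily yields the file in fixed-size character chunks, so a caller can
    // foreach over the result without the whole file ever being in memory.
    public static IEnumerable<string> ReadChunks(string path, int chunkSize = 4096)
    {
        using (var reader = new StreamReader(path))
        {
            var buffer = new char[chunkSize];
            int read;
            while ((read = reader.Read(buffer, 0, buffer.Length)) > 0)
            {
                yield return new string(buffer, 0, read);
            }
        }
    }
}

A caller then only sees a sequence of chunks and never touches the stream directly:

foreach (var chunk in ChunkedReader.ReadChunks(@"C:\data\huge.log"))
{
    // Do whatever per-chunk processing is needed here; only one
    // chunk is held in memory at any given time.
    Console.WriteLine(chunk.Length);
}
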

Gary Woodfine
3/28/2013 9:44:55 PM
Great article! Very interesting! I like the style, and it's very easy to understand and read.