Bit of context first.
So I have data held at three levels (Scheme Level, Policy Level, Increment Level).
I've got three collections, one to represent each level. Each collection holds a number of class objects created to represent the data I need at that level.
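For illustration, they're just module-level Collection objects, one per level (the class and variable names here are simplified stand-ins for what I actually use):

    ' One collection per data level, holding instances of the matching class
    Private colSchemes As Collection      ' holds clsScheme objects
    Private colPolicies As Collection     ' holds clsPolicy objects
    Private colIncrements As Collection   ' holds clsIncrement objects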
The first stage of the process collects the data from a number of binary workbooks: it opens a sheet, extracts each row into a class object, adds that object to the collection and moves on to the next row. When a sheet is finished it closes the workbook, opens the next binary workbook and repeats until all the workbooks have been fully extracted into the collections.
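A simplified sketch of that extraction stage, with file handling and error checks stripped out (the paths are made up and LoadFromRow is just a stand-in for the class's row-reading routine):

    Dim wbPath As Variant, wb As Workbook, ws As Worksheet
    Dim rec As clsPolicy
    Dim r As Long, lastRow As Long

    Set colPolicies = New Collection
    For Each wbPath In Array("C:\Data\Book1.xlsb", "C:\Data\Book2.xlsb")   ' real list is much longer
        Set wb = Workbooks.Open(wbPath)
        Set ws = wb.Worksheets(1)
        lastRow = ws.Cells(ws.Rows.Count, 1).End(xlUp).Row
        For r = 2 To lastRow
            Set rec = New clsPolicy
            rec.LoadFromRow ws.Rows(r)        ' populate the class object from the row
            colPolicies.Add rec
        Next r
        wb.Close SaveChanges:=False
    Next wbPath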
This part of the process appears to work exactly as intended with no discernible issues.
Some functional steps will follow to calculate various outputs and pass these to other collections, so everything can be written out to the sheet in one go. That section hasn't been coded yet; I was more interested in handling the data correctly first.
Finally, once all the functional processing has completed, the routine simply ends.
When I first ran it against a single external sheet everything was fine. However, when I ramped the volumes up, Excel started hanging.
I did some debugging and found the hang was occurring after the final End Sub statement (strange, I thought).
Had a thought that it may have something to do with the size of the collections.
I tried setting the collections to Nothing before the End Sub, but it appears to hang there too. I assume Set collection = Nothing triggers much the same teardown that happens at the End Sub line, so it's no great surprise that the same thing seems to happen. Unfortunately there's no way to track the progress of clearing a collection this way, so I couldn't be sure that's what was happening.
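For clarity, this is all I added before the End Sub (it hangs here just the same):

    ' Attempted teardown before End Sub
    Set colSchemes = Nothing
    Set colPolicies = Nothing
    Set colIncrements = Nothing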
Just to prove it to myself I created a For loop to move through the collection and remove the items one at a time, updating a status bar message to track the progress. This seems to validate my thinking: once it had cleared everything out, the process ended with no issues.
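The item-by-item clear-out looks roughly like this (simplified; the status bar update is what let me watch it progress):

    Dim i As Long
    For i = colPolicies.Count To 1 Step -1
        colPolicies.Remove 1                  ' always remove the first item
        If i Mod 10000 = 0 Then Application.StatusBar = "Clearing: " & i & " items left"
    Next i
    Application.StatusBar = False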
However, we're in the region of 7 million items in a collection, so it takes a good bit of time to finish when a single item is removed at a time.
Just wondered if anyone has a way I hadn't thought of to quickly clear out the collections, or a more efficient way to handle the data than a Collection?
NB: I appreciate that using Access to hold the raw data would probably be better all round given the amount of data, but that option is out due to company constraints.