I've found that passing a recordset from one object to another, rather than having each function/sub within an object run a new query and create its own recordset, really improves speed when the database connection is over a network. The problem is there are a lot of different recordsets I would need to pass between objects, because of how many object types are contained within others, and so on, e.g.
Object1 --> Object2 --> Object4
        --> Object3 --> Object5 --> Object6
                    --> Object7
So I'm thinking I could just declare a public recordset that holds all the data from a table, and have each function/sub that needs that table work against the shared recordset. Something like this (just spitballing here):
Public rsTableB As ADODB.Recordset ' in a standard module

Public Function getObjectTypeB(ByRef objectB As clsObjectB) As Boolean
    Dim rsBookmark As Variant   ' Bookmark is a Variant, not a Long
    Dim rsFilterStr As Variant  ' Filter can be a String or a FilterGroupEnum
    Dim tableBinUse As Boolean

    If rsTableB Is Nothing Then
        loadRecordSet "TableB", rsTableB ' separate routine loads all records from table "TableB"
    Else
        tableBinUse = True
        rsBookmark = rsTableB.Bookmark
        rsFilterStr = rsTableB.Filter
    End If

    Set objectB = New clsObjectB
    ' code to handle loading data from the table into the object here

    If tableBinUse Then
        rsTableB.Filter = rsFilterStr
        rsTableB.Bookmark = rsBookmark  ' restore the caller's position
    Else
        rsTableB.Filter = adFilterNone
        rsTableB.MoveFirst
    End If

    getObjectTypeB = True
End Function
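If I go with the shared public recordset, one way I could resync it with the database is ADO's Requery method, which re-runs the original query. Note that existing bookmarks become invalid after a Requery, so the save/restore trick above wouldn't survive a refresh. A rough sketch (refreshTableB is just a hypothetical name):

```
Public Sub refreshTableB()
    If Not rsTableB Is Nothing Then
        rsTableB.Filter = adFilterNone ' clear any filter before refreshing
        rsTableB.Requery               ' re-executes the original query;
                                       ' previously saved bookmarks are no
                                       ' longer valid afterward
    End If
End Sub
```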
At some point these recordsets will need to be resynced/updated with the database, so I would need to figure out how and when to do that, but I admit I'm having trouble thinking of a good way to implement it. Alternatively, each function could accept its recordset as an optional parameter and load the recordset itself when nothing is passed. This latter strategy seems better, but with the sheer number of tables (around 30) it makes writing the function parameters a bit confusing and tedious. Hoping to get feedback on whether this is just a terrible idea altogether or whether there's a good way to go about this.
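For reference, the optional-parameter variant I have in mind would look something like this (clsObjectC and "TableC" are placeholder names, and loadRecordSet is the same helper as above):

```
Public Function getObjectTypeC(ByRef objectC As clsObjectC, _
        Optional ByRef rsTableC As ADODB.Recordset = Nothing) As Boolean
    Dim loadedHere As Boolean

    ' If the caller didn't pass a recordset, load one just for this call
    If rsTableC Is Nothing Then
        loadRecordSet "TableC", rsTableC
        loadedHere = True
    End If

    Set objectC = New clsObjectC
    ' ... populate objectC from rsTableC here ...

    ' Only clean up a recordset this function created;
    ' a caller-supplied recordset is left untouched
    If loadedHere Then
        rsTableC.Close
        Set rsTableC = Nothing
    End If

    getObjectTypeC = True
End Function
```

The inner functions (Object4, Object5, etc.) would forward the recordset they received, so only the top-level caller decides whether to share one.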