I have many class objects that represent types of data/tables in a database, with some serving as containers for others, and I'm wondering if using public recordsets to streamline this is a bad idea

  • I've found that, rather than having each function/sub within an object run a new query and create its own recordset, passing the recordset from one object to another really improves speed when the database connection is over a network. But there are a lot of different recordsets I would need to pass between objects, because of how many different object types are contained within others, and so on, e.g.


    Object1 --> Object2 --> Object4
            --> Object3 --> Object5 --> Object6
                                    --> Object7


    So I'm thinking I could just make a public recordset that reflects all the data from a table, and have each function/sub work against that recordset, something like this (just spitballing here):
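
    (The table and field names below are just placeholders, and I'm assuming DAO in Access; the same idea would work with ADO.)

        ' In a standard module:
        Public rsCustomers As DAO.Recordset

        Public Sub LoadCustomers()
            ' Open (or reopen) the shared recordset once, instead of every
            ' object running its own query over the network. Calling this
            ' again would also be the crude way to resync it later.
            If Not rsCustomers Is Nothing Then rsCustomers.Close
            Set rsCustomers = CurrentDb.OpenRecordset( _
                "SELECT * FROM tblCustomers", dbOpenSnapshot)
        End Sub

        ' Then a class module's functions work against the shared copy:
        Public Function CustomerName(ByVal CustomerID As Long) As String
            If rsCustomers Is Nothing Then LoadCustomers
            rsCustomers.FindFirst "CustomerID = " & CustomerID
            If Not rsCustomers.NoMatch Then CustomerName = rsCustomers!CustomerName
        End Function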
    At some point these recordsets will need to be resynced/updated, so I would need to figure out how and when to do that, but I admit I'm having trouble thinking of a good way to implement it. Alternatively, I could have the functions pass these recordsets to one another as optional parameters; if a recordset isn't passed, the function loads it itself (sketched below). This latter strategy seems better, but with the sheer number of tables (around 30), writing the function parameters gets confusing and tedious. Hoping to get feedback on whether this is a terrible idea altogether or if there's a good way to go about it.
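
    To be concrete, the optional-parameter version I'm picturing is something like this (again just a sketch with placeholder names, assuming DAO):

        ' If a shared recordset is passed in, use it; otherwise load one here.
        Public Function GetOrderCount(ByVal CustomerID As Long, _
                Optional rsOrders As DAO.Recordset) As Long
            Dim rs As DAO.Recordset
            Dim ownsRecordset As Boolean

            If rsOrders Is Nothing Then
                ' Nothing was passed in, so this call pays for its own query.
                Set rs = CurrentDb.OpenRecordset( _
                    "SELECT * FROM tblOrders", dbOpenSnapshot)
                ownsRecordset = True
            Else
                Set rs = rsOrders
            End If

            ' Count the rows matching this customer.
            rs.FindFirst "CustomerID = " & CustomerID
            Do While Not rs.NoMatch
                GetOrderCount = GetOrderCount + 1
                rs.FindNext "CustomerID = " & CustomerID
            Loop

            ' Only clean up a recordset this function created itself.
            If ownsRecordset Then rs.Close
        End Function

    Multiply that optional parameter by ~30 tables and the signatures get messy fast, which is the part I'm stuck on.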
