Posts by S Chris

    Hello. I believe the reason it was running so slowly was that I used Selection instead of a Word.Range. When I changed to the latter it was much quicker.

    So for reference I changed the slow subroutine above to:

    Public Sub Write_To_File(sText As String, Optional sStyle As String = "No Spacing")
        With oDoc.Paragraphs.Last.Range
            .InsertAfter sText   ' append the text to the last paragraph
            .Style = sStyle      ' apply the requested style
            ' Note: Word's paragraph mark is vbCr (Chr(13)); a bare Chr(10)
            ' line feed may be why the style assignment is sometimes ignored
            .InsertAfter Chr(10)
        End With
    End Sub

    After which it ran much quicker. I still have the odd issue where the style assignment is ignored, but at least I have solved the speed problem!

    Thanks Roy for your time. Much appreciated.

    Hello. I am transforming Excel data into a Word file from the Excel file's VBA.

    I have included Word references and can produce the file but it runs very slowly! I wonder if anyone has tips on the fastest way to write to Word from Excel VBA?

    I have attached my file with the logic pared down to test and understand - the file and template Word doc need to be in the same folder.

    Speed Understand.xlsm Template.doc

    It writes the same data to the Word document 25 times in a loop to get an idea of the time taken. The 25 writes take about 10 seconds on my laptop, so you can imagine how slowly it runs on full reports!

    Any hints would be greatly appreciated. Thanks in advance.

    My pseudo code for the actual write to Word:

    • add the text at the current selection
    • select the text just added
    • set the style to that desired for the text
    • move the selection to the end of the document for next time
    • add a line break for formatting

    My code:
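    The attachment contains the full routine; as a hypothetical reconstruction from the pseudo code above (oWord and oDoc are assumed names for the Word Application and Document objects), the slow Selection-based version would have looked something like:

    Public Sub Write_To_File(sText As String, Optional sStyle As String = "No Spacing")
        Dim lStart As Long
        lStart = oWord.Selection.Start
        oWord.Selection.TypeText sText                   ' add the text at the current selection
        oDoc.Range(lStart, oWord.Selection.End).Select   ' select the text just added
        oWord.Selection.Style = sStyle                   ' set the desired style
        oWord.Selection.EndKey Unit:=wdStory             ' move to the end of the document for next time
        oWord.Selection.TypeText Chr(10)                 ' add a line break for formatting
    End Sub

    Every Selection operation forces Word to track (and potentially redraw) the visible selection, which is why swapping to Range operations speeds things up so much.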

    Hello. I have CSV files from a coin counter:

    • The counter continually appends to the one CSV file for the day.
    • Each time coins are counted for one person a row is appended to the CSV data.
    • The counter does not add any unique key data to the CSV - just the totals for each type of coin.
    • I want to read the data into MS Access in parallel so the counters can confirm and see any mistakes.
    • I want to use DoCmd.TransferText to import the CSV data into a table and preserve the row order as appears in the CSV file.
      • Note: the rows are not labelled in the CSV - they are just separated by return characters
    • If I have the row order I can then identify the rows (counts) that I have already processed and process the new ones.
      • E.g. import the CSV file now and see rows 1, 2 & 3 - I know I have already processed 1 & 2, so I will process 3.
    • I want to key on row order and not the totals, as it is possible, for example, for the same person to collect the same amount each day.
    • I can import the CSV file fine - my question is how do I set up a table field that matches row number on import?
      • I cannot use AutoNumber as I want row 1 in the CSV file to always be row 1 and on a reimport AutoNumber always increments
    • To be clear my pseudo code is:
      • I have TCounterCSV where I load in the counter CSV file.
      • I have TCounterCSVRowsProcessed where I store the rows processed in the current day's file.
      • Both tables start empty
      • I import the CSV data into TCounterCSV and (with your help) record in one field/column somehow the row number as it appears in the CSV
      • I deal with each row as I have processed none at the moment
      • I then store the row number of each row processed in TCounterCSVRowsProcessed
      • After 5 seconds, I then clear TCounterCSV and reload in the counter CSV file.
      • I can then compare row numbers in the two tables to find the rows I have not processed.
      • Repeat
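    As far as I know, DoCmd.TransferText has no option to record the source row order, so one alternative (a sketch only - TCounterCSV is from the post above, but the RowNum and RawLine field names are assumptions) is to read the CSV line by line in VBA and write the row number yourself:

    ' Sketch: load the day's CSV into TCounterCSV with an explicit RowNum
    ' so row 1 in the file is always RowNum 1, even on a reimport.
    ' Assumes TCounterCSV has a Long field RowNum and a Text field RawLine;
    ' splitting RawLine into per-coin totals is left out for brevity.
    Public Sub LoadCounterCSV(sPath As String)
        Dim db As DAO.Database, rs As DAO.Recordset
        Dim iFile As Integer, sLine As String, lRow As Long

        Set db = CurrentDb
        db.Execute "DELETE FROM TCounterCSV", dbFailOnError   ' clear before each reload
        Set rs = db.OpenRecordset("TCounterCSV", dbOpenDynaset)

        iFile = FreeFile
        Open sPath For Input As #iFile
        Do While Not EOF(iFile)
            Line Input #iFile, sLine
            lRow = lRow + 1
            rs.AddNew
            rs!RowNum = lRow        ' position of this count in the CSV
            rs!RawLine = sLine      ' the raw coin totals for this row
            rs.Update
        Loop
        Close #iFile
        rs.Close
    End Sub

    The unprocessed rows are then simply those whose RowNum does not yet appear in TCounterCSVRowsProcessed.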

    Any help would be greatly appreciated. Many thanks.

    Thanks to you both for your posts. However, as described, I now have a 7MB workbook containing nothing but VBA code and a single brand-new, blank worksheet. To be clear, this worksheet was only just added, so it does not suffer from the unused/blank-cells problem.

    I guess my question is this: does 7MB sound right for just the VBA code?

    I have many classes and modules so I expect it to have some size.


    I have a VBA Excel system that is 55MB in size. There are some worksheets with a lot of data which make up most of the size.

    I wrote a utility to delete each worksheet in turn, save the file, and record the new file size after each delete. It runs all the way down to just a "dummy" blank worksheet.

    I created a copy of my system and then ran my utility on this copy. At the end I had a file with just the dummy worksheet and all the VBA code.

    The copy after all worksheet deletion is 7MB in size. Is this size therefore all attributable to the VBA code?
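    For reference, a sketch of that kind of delete-and-measure utility might look like this (the "Dummy" sheet name is an assumption):

    ' Sketch: delete each worksheet in turn, saving and logging the file
    ' size after each delete, until only a blank "Dummy" sheet remains.
    Public Sub ProfileWorkbookSize()
        Dim i As Long, sName As String
        Application.DisplayAlerts = False   ' suppress the delete confirmation prompts
        For i = ThisWorkbook.Worksheets.Count To 1 Step -1
            sName = ThisWorkbook.Worksheets(i).Name
            If sName <> "Dummy" Then
                ThisWorkbook.Worksheets(i).Delete
                ThisWorkbook.Save
                Debug.Print sName & " deleted, size now " & _
                    FileLen(ThisWorkbook.FullName) & " bytes"
            End If
        Next i
        Application.DisplayAlerts = True
    End Sub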

    Thanks in advance.

    Thanks Roy. Your help has been invaluable. I will start a new thread on my file size question. Thank you for this great resource.

    Many thanks Roy. I have downloaded and run the Ribbon's cleaner.

    Can I infer that this is now the most well-regarded code cleaner, as Rob Bovey's once was?

    Can you recommend any utilities that help me find file size bloat?

    Hi there. I see that Code Cleaner does not work on 64 bit Excel.

    I have used Code Cleaner in the past and loved it. It seems I need it now on a large file that I have been developing on for over a year but I have 64 bit Excel.

    What can I do in place of Code Cleaner if I have 64-bit Excel?
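    Code Cleaner's core trick, as I understand it, is exporting every module to text and reimporting it, which discards stale compiled code. A manual sketch of the same idea (requires a reference to Microsoft Visual Basic for Applications Extensibility 5.3 and "Trust access to the VBA project object model" enabled; the scratch folder is an assumption, and it should be run from a different workbook than the one being cleaned):

    ' Sketch: export, remove and reimport each standard and class module so
    ' the saved file keeps only fresh source, discarding stale compiled code.
    Public Sub CleanProject(wb As Workbook)
        Dim vbc As VBIDE.VBComponent, v As Variant
        Dim comps As New Collection, sFile As String
        Const sTmp As String = "C:\Temp\"   ' assumed scratch folder

        ' Snapshot first: removing components while iterating the live collection is unsafe
        For Each vbc In wb.VBProject.VBComponents
            If vbc.Type = vbext_ct_StdModule Or vbc.Type = vbext_ct_ClassModule Then
                comps.Add vbc
            End If
        Next vbc

        For Each v In comps
            Set vbc = v
            sFile = sTmp & vbc.Name & IIf(vbc.Type = vbext_ct_StdModule, ".bas", ".cls")
            vbc.Export sFile
            wb.VBProject.VBComponents.Remove vbc
            wb.VBProject.VBComponents.Import sFile
        Next v
        wb.Save
    End Sub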