Two recommendations. The first becomes obvious if you think of the
PDF as a "view" rather than a document/report. Since you're dealing
with a view, consider paging through the data rather than reading all
1K+ records at once. This might be "chattier" than you'd like, but
reducing the memory footprint will help the other users of your
system and greatly reduce your chances of losing the MySQL
connection. In a similar situation, I set a query size of 100
records, then calculated the number of "pages" that I needed to read
from the DB and iterated through the records until I was done.
Suddenly all the memory errors went away.
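Here's a minimal sketch of that paging loop, assuming a Rails-era
ActiveRecord model backed by MySQL (the Order model name and the
columns are placeholders, not from the original setup):

  PAGE_SIZE = 100

  total_records = Order.count
  pages = (total_records.to_f / PAGE_SIZE).ceil

  pages.times do |page|
    # fetch one "page" of rows at a time instead of the whole table
    rows = Order.find(:all, :limit => PAGE_SIZE,
                            :offset => page * PAGE_SIZE)
    rows.each do |row|
      # render this batch into the PDF "view" here
    end
  end

Each query stays small, so memory use stays flat no matter how many
records the report covers.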
The second recommendation has to do with PDF::Writer itself. As far
as I have been able to determine, the greatest part of the slowdown
in PDF::Writer is table generation. PDF::Writer uses some very
time-consuming calculations to decide whether a set of related rows
should be "split" across pages or kept together. I was able to
greatly increase the rendering speed by passing :split_rows=>true to
the table generator, which turns off that expensive calculation.
I think this change cut the rendering time of my table (~1400 rows)
by roughly two orders of magnitude. In our specific situation we
were creating one table for every record, so we determined the
number of records we could render on each page and handled the
paging ourselves. It was a little extra code, but the speedup was
well worth it.
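For reference, here's a rough sketch of how that option might be
set, assuming PDF::SimpleTable (PDF::Writer's table generator);
depending on your version you may set split_rows as an attribute in
the table block rather than pass it as a hash option. The data below
is just a stand-in for one page's worth of records:

  require 'pdf/writer'
  require 'pdf/simpletable'

  pdf = PDF::Writer.new

  PDF::SimpleTable.new do |tab|
    tab.column_order = %w(id name)
    # stand-in data; in practice, one page's worth of records
    tab.data = [
      { "id" => 1, "name" => "first" },
      { "id" => 2, "name" => "second" }
    ]
    tab.split_rows = true  # skip the "keep rows together" check
    tab.render_on(pdf)
  end

  pdf.save_as("report.pdf")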