Database Update Outside of Rails

Hi! Hopefully my subject line is not too obscure. :) I am looking to collect log data into a database for server analysis. I know for a fact that the data will be used by my non-technical peers, so Rails is perfect for building a fairly simple web interface to generate reports based on that data.

That said, my impression is that you have to work closely within the Rails framework for database updates. I could probably obtain the data and insert it via REST (which I would love to get into), but the problem is that I have a substantial amount of data - so much that I think REST may add unnecessary overhead as I parse log data and push it to my collector.

Hence, the approach I am thinking of trying is to run a Perl script on the collector to capture the log data and then update the table directly. Would that be safe to do?

If not, should I use ActiveRecord within a TCPServer Ruby script (not preferable IMO, for performance reasons), and how would I integrate that into Rails?

Nothing wrong with that, as long as you keep in mind that any Rails magic you've set up in the models (validations, callbacks, and so on) won't run when you update the table directly. But if you keep that in mind, there's nothing wrong with what you're thinking of.

But if you're already in Rails, I'd suggest pure Ruby instead of mixing in Perl...
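To make the pure-Ruby route concrete, here is a minimal sketch of the parsing step, standard library only, so it runs the same on 1.8. The Apache-style common log format and the field names are assumptions on my part - adapt the regexp to whatever your actual logs look like:

```ruby
# Sketch: turn one access-log line into a row hash ready for insertion.
# The Apache common-log layout below is an assumed format.
LINE_RE = /\A(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\d+|-)/

# Returns a hash of fields, or nil if the line doesn't match.
def parse_log_line(line)
  m = LINE_RE.match(line)
  return nil unless m
  {
    :ip     => m[1],
    :time   => m[2],
    :method => m[3],
    :path   => m[4],
    :status => m[5].to_i,
    :bytes  => m[6] == '-' ? 0 : m[6].to_i
  }
end
```

From there, each hash can be handed to whatever does the insert - an ActiveRecord model, or a plain SQL statement if you stay outside Rails.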

I would prefer pure Ruby myself, except what I have available is 1.8, which is not impressive performance-wise in comparison to Perl. Of course, there is 1.9, but I don't know how much compatibility there is between 1.9 and Rails - and I have yet to get through Rails beyond building sample apps utilizing model relationships (heck, I haven't done authentication yet). The last thing I need is to be debugging 1.9 and Rails issues all at the same time.

Of course, I could be wrong regarding 1.8. If anybody else has had a different experience using Ruby 1.8 for TCP services, please let me know.
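For reference, a TCP collector in 1.8-era Ruby needs nothing beyond the standard 'socket' library. The sketch below batches parsed rows in memory; the pipe-delimited line format, the LogEntry model, and the flush step are all hypothetical - with ActiveRecord loaded (via your Rails models and establish_connection), the insert would go where the comment indicates:

```ruby
require 'socket'

# Sketch of a collector that accepts log lines over TCP and batches
# them for insertion. The "host|severity|message" field layout is an
# assumption -- adapt parse to your real log format.
class LogCollector
  def initialize(port)
    @server = TCPServer.new('127.0.0.1', port)
    @rows   = []  # batched rows awaiting insertion
  end

  # Accept one client, read its lines, and queue a row per line.
  def handle_one_client
    client = @server.accept
    while line = client.gets
      @rows << parse(line.chomp)
    end
    client.close
  end

  # Split "host|severity|message" into a row hash.
  def parse(line)
    host, severity, message = line.split('|', 3)
    { :host => host, :severity => severity, :message => message }
  end

  # Hand back the batch and clear it. With ActiveRecord loaded, this
  # is where a hypothetical model insert would go, e.g.:
  #   @rows.each { |r| LogEntry.create(r) }
  def flush
    pending, @rows = @rows, []
    pending
  end
end
```

Whether 1.8 keeps up with Perl here depends far more on the parsing and insert strategy (batching, transactions) than on the socket handling itself.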