Saving to the DB uses a lot of RAM?

Hi guys,

I must be missing something obvious...

array_of_stuff = [ contains a lot of fairly small AR objects ]

# RAM at 30MB

array_of_stuff.each do |foo|
  foo.save
end
array_of_stuff = nil

# RAM at 60+MB

The iteration itself takes 30+MB of RAM (I measure with "memory_usage from #{pcaller} at l.#{pline}: "+`ps -o rss= -p #{$$}`.to_i.to_s).
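For reference, a standalone helper along the lines of that one-liner (the `pcaller`/`pline` variables come from surrounding caller-inspection code, omitted here) might look like this; Unix-only, since it shells out to `ps`:

```ruby
# Resident set size of the current process in KB, as reported by ps.
# $$ is the current process id.
def memory_usage
  `ps -o rss= -p #{$$}`.to_i
end

puts "memory_usage: #{memory_usage} KB"
```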

I don't get it. Why is this using so much memory?

Thanks a lot, Pierre

PierreW wrote:

Hi guys,

I must be missing something obvious...

array_of_stuff = [ contains a lot of fairly small AR objects ]

# RAM at 30MB

array_of_stuff.each do |foo|
  foo.save
end
array_of_stuff = nil

This is terrible! You should never have a DB query inside a loop. Instead, generate one query to insert all the records. The ar-extensions plugin can help with that.
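For what it's worth, a sketch of that approach, assuming a hypothetical `Widget` model with `name` and `price` columns (the `import` call is ar-extensions' bulk insert, left commented out since it needs the gem and a DB connection; OpenStructs stand in for the AR objects here):

```ruby
require 'ostruct'

columns = [:name, :price]

# Stand-ins for the AR objects in array_of_stuff.
array_of_stuff = [OpenStruct.new(:name => "a", :price => 1),
                  OpenStruct.new(:name => "b", :price => 2)]

# Build one row of column values per record...
values = array_of_stuff.map { |rec| columns.map { |col| rec.send(col) } }

# ...then insert them all with a single multi-row INSERT:
# Widget.import columns, values
```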

# RAM at 60+MB

The iteration itself takes 30+MB of RAM (I measure with "memory_usage from #{pcaller} at l.#{pline}: "+`ps -o rss= -p #{$$}`.to_i.to_s).

I don't get it. Why is this using so much memory?

Thanks a lot, Pierre

Best,

I am sorry, but I don't have an answer to the RAM question. However, I would like to answer Marnen's comment. While I agree that letting the DB do the work for mass record processing should be the best and most efficient way to go, from reading the OP one cannot assume that is the way things are in this case. The array used by the OP could very well contain tons of different types of objects, used for very different purposes and not necessarily related to each other. Pierre never gave any indication it was one way or the other.

Hi guys,

I must be missing something obvious…

array_of_stuff = [ contains a lot of fairly small AR objects ]

RAM at 30MB

array_of_stuff.each do |foo|

foo.save

end

array_of_stuff = nil

RAM at 60+MB

The iteration itself takes 30+MB of RAM (I measure with "memory_usage from #{pcaller} at l.#{pline}: "+`ps -o rss= -p #{$$}`.to_i.to_s).

I don’t get it. Why is this using so much memory?

Thanks a lot,

Pierre

Pierre, you might be interested in the following thread within this mailing list because it covered a similar topic:

“activerecord 2.3.5’s create & find slower than activerecord 2.1.2”

Good luck,

-Conrad

pepe wrote:

I am sorry but I don't have an answer about the RAM question. However I would like to answer Marnen's comment. While I agree that letting the DB do the work for mass record processing should be the best and most efficient way to go, by reading the OP one cannot assume that is the way things are in this case.

No, but it's likely.

The array used by the OP could very well contain tons of different types of objects, used for very different purposes and not necessarily related to each other.

Then for the purpose of saving, they should be separated out by type. Queries don't go in loops. Period.
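A sketch of that separation, using Structs as stand-ins for the AR models (the per-class `import` call, in the ar-extensions style, is left as a comment since it needs the gem and a DB connection):

```ruby
Book   = Struct.new(:title)
Author = Struct.new(:name)

mixed = [Book.new("AR in Action"), Author.new("pepe"), Book.new("Ruby")]

# Partition the heterogeneous array by class...
groups = mixed.group_by(&:class)

# ...so each class can be bulk-inserted with one query:
# groups.each { |klass, records| klass.import records }
```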

Pierre never gave any indication it was one way or the other.

True. So why "correct" me with an unlikely exception to the general principle?

Best,

pepe wrote:

> I am sorry but I don't have an answer about the RAM question. However
> I would like to answer Marnen's comment. While I agree that letting
> the DB do the work for mass record processing should be the best and
> most efficient way to go by reading the OP one cannot assume that is
> the way things are in this case.

No, but it's likely.

Anything is likely.

> The array used by the OP could very
> well contain tons of different types of objects, used for very
> different purposes and not necessarily related to each other.

Then for the purpose of saving, they should be separated out by type. Queries don't go in loops. Period.

Says who? The point I was making is that it would depend on the situation and the solution the OP is trying to give to his particular problem.

> Pierre
> never gave any indication it was one way or the other.

True. So why "correct" me with an unlikely exception to the general principle?

I wasn't trying to "correct" you. I was trying to offer a different point of view and have an open mind.

Cheers.

pepe wrote:

> I am sorry but I don't have an answer about the RAM question. However
> I would like to answer Marnen's comment. While I agree that letting
> the DB do the work for mass record processing should be the best and
> most efficient way to go by reading the OP one cannot assume that is
> the way things are in this case.

No, but it's likely.

Anything is likely.

Anything is *possible*...but it's...er...unusual to be saving an array of unrelated objects.

> The array used by the OP could very
> well contain tons of different types of objects, used for very
> different purposes and not necessarily related to each other.

Then for the purpose of saving, they should be separated out by type. Queries don't go in loops. Period.

Says who?

Anyone who understands how to use databases efficiently and effectively.

The point I was making is that it would depend on the situation and the solution the OP is trying to give to his particular problem.

> Pierre
> never gave any indication it was one way or the other.

True. So why "correct" me with an unlikely exception to the general principle?

I wasn't trying to "correct" you.

Yes, I realized that after I posted.

I was trying to offer a different point of view and have an open mind.

OK, but in this case I don't think it was relevant.

Cheers.

Best,