Some thoughts for speedups

Here are a few thoughts I had a year or so back for some potential
low-hanging fruit for speeding Rails up [if anybody ever wants to
implement them :slight_smile:]

Forgive the naivety of these, they're mostly just suggestions from
someone not all that familiar with the core code.

1) wrap tests in transactions. Since Rails now has nested
transactions, it may be possible to speed up unit tests via something
like http://ericholscher.com/blog/2009/jan/15/django-now-has-fast-tests/
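The idea can be sketched without Rails at all: run each test inside a transaction and roll it back afterwards, so fixture data only has to be loaded once. This is a minimal illustration with a made-up `FakeDB` class standing in for a real driver's BEGIN/ROLLBACK, not the actual Rails mechanism:

```ruby
# Sketch of transactional tests: each test's writes happen inside a
# transaction that is always rolled back, leaving the fixtures untouched.
# FakeDB is a hypothetical stand-in for a real database driver.
class FakeDB
  attr_reader :rows

  def initialize
    @rows = []
  end

  def transaction
    snapshot = @rows.dup     # a real driver would issue BEGIN here
    yield
  ensure
    @rows = snapshot         # ...and ROLLBACK here
  end

  def insert(row)
    @rows << row
  end
end

db = FakeDB.new
db.insert("fixture user")    # fixtures loaded once, before the suite

db.transaction { db.insert("row from test 1") }
db.transaction { db.insert("row from test 2") }

p db.rows                    # only the fixture row survives
```

The win comes from skipping repeated fixture teardown/reload between tests; the rollback is cheap compared to re-inserting rows.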

2) don't write the session out if it hasn't changed at all during a
request.
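One way to implement this is to fingerprint the session at load time and compare at save time. A rough sketch, with invented class and method names (this is not the actual session-store API):

```ruby
require "digest"

# Sketch: record a fingerprint of the session when it is loaded, and skip
# the write entirely if the fingerprint is unchanged at the end of the
# request. SessionStore and its methods are hypothetical.
class SessionStore
  def initialize(backend)
    @backend = backend   # e.g. a memcache or db wrapper; a plain Hash here
  end

  def load(sid)
    data = @backend.fetch(sid, {})
    [data, Digest::SHA1.hexdigest(Marshal.dump(data))]
  end

  def save(sid, data, fingerprint_at_load)
    return :skipped if Digest::SHA1.hexdigest(Marshal.dump(data)) == fingerprint_at_load
    @backend[sid] = data
    :written
  end
end

store = SessionStore.new({})
session, fp = store.load("abc123")
puts store.save("abc123", session, fp)   # request never touched the session
session["user_id"] = 42
puts store.save("abc123", session, fp)   # session changed, must write
```

For read-heavy apps most requests never mutate the session, so this saves a round-trip to the session backend per request.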

3) don't regenerate query plans every time [cache them if possible].
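A simple form of this is a statement cache keyed by SQL text, so the expensive planning step runs once per distinct query. The sketch below fakes the planning step with a Proc; a real implementation would call the driver's prepare():

```ruby
# Sketch of a per-connection statement cache: the "plan" for each distinct
# SQL string is built once and reused for later executions with different
# bind values. The planning step here is a stand-in, not a real db call.
class StatementCache
  attr_reader :plans_built

  def initialize
    @prepared = {}
    @plans_built = 0
  end

  def execute(sql, binds)
    stmt = @prepared[sql] ||= begin
      @plans_built += 1                      # expensive PREPARE happens once
      ->(b) { "rows for: #{sql} binds=#{b.inspect}" }
    end
    stmt.call(binds)
  end
end

cache = StatementCache.new
3.times { |i| cache.execute("SELECT * FROM users WHERE id = ?", [i]) }
puts cache.plans_built   # 1 plan built, 3 executions
```

Note this only pays off if the generated SQL uses bind placeholders; interpolating values into the SQL string defeats the cache, since every query text is unique.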

4) avoid n^2 filter searches as you handle a request [I believe that's
what it does currently]. At least with the before and after filters,
you could calculate them all once, at the beginning of the request.
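The "calculate once" idea amounts to memoizing the resolved filter chain per action. A sketch with invented filter declarations (not the real callback API):

```ruby
# Sketch: resolve which before-filters apply to an action once, then serve
# the memoized chain on every later request, instead of re-scanning all
# declared filters each time. The FILTERS data is made up for illustration.
class FilterChain
  FILTERS = [
    { method: :log_request,  only: nil },              # nil = all actions
    { method: :authenticate, only: [:edit, :update] },
    { method: :load_record,  only: [:show, :edit] },
  ].freeze

  @cache = {}

  def self.for_action(action)
    @cache[action] ||= FILTERS
      .select { |f| f[:only].nil? || f[:only].include?(action) }
      .map { |f| f[:method] }
  end
end

p FilterChain.for_action(:edit)   # computed on the first request...
p FilterChain.for_action(:edit)   # ...served from the cache afterwards
```

Since filter declarations are fixed at class-load time in the common case, the per-action chain is a natural memoization target.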

5) Avoid re-generating sql escaped copies of each column name for
tables. Cache the sql escaped column names.
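Since a table's column names never change within a process, the quoted forms can be built once and reused. A minimal sketch (the `Table` class and quoting style are illustrative, not ActiveRecord's internals):

```ruby
# Sketch: quote each table's column names once, memoize the array, and
# reuse it for every generated statement instead of re-escaping each time.
class Table
  def initialize(name, columns)
    @name = name
    @columns = columns
  end

  def quoted_columns
    @quoted_columns ||= @columns.map { |c| %("#{c}") }   # memoized
  end

  def select_sql
    %(SELECT #{quoted_columns.join(', ')} FROM "#{@name}")
  end
end

users = Table.new("users", %w[id name email])
puts users.select_sql
puts users.quoted_columns.equal?(users.quoted_columns)   # same cached array
```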

6) LRU cache all sql-escaped data [?]
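For completeness, a bounded LRU is easy to sketch in plain Ruby, since hashes preserve insertion order; here it wraps a toy escaping routine (the escape rule is illustrative only):

```ruby
# Minimal LRU cache sketch: bounded size, least-recently-used eviction.
# Ruby hashes keep insertion order, so "most recent" is just "last key"
# and Hash#shift removes the oldest entry.
class LRU
  def initialize(max)
    @max = max
    @store = {}
  end

  def fetch(key)
    if @store.key?(key)
      @store[key] = @store.delete(key)       # move to most-recent position
    else
      @store.shift if @store.size >= @max    # evict least recently used
      @store[key] = yield(key)
    end
    @store[key]
  end
end

escapes = 0
cache = LRU.new(2)
esc = ->(s) { escapes += 1; s.gsub("'", "''") }

cache.fetch("o'brien", &esc)
cache.fetch("o'brien", &esc)   # served from cache, no second escape
cache.fetch("d'arcy",  &esc)
puts escapes                   # escaped twice, not three times
```

Whether this helps in practice depends on how repetitive the escaped values are; the discussion below suggests measured gains were hard to find.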

7) use array based queries instead of hash based. Oh wait, you already
knew that one.

Anyway thanks for reading.
Have a good one.
-=r

> Here are a few thoughts I had a year or so back for some potential
> low-hanging fruit for speeding Rails up [if anybody ever wants to
> implement them :slight_smile:]
>
> Forgive the naivety of these, they're mostly just suggestions from
> someone not all that familiar with the core code.

> 1) wrap tests in transactions. Since Rails now has nested
> transactions, it may be possible to speed up unit tests via something
> like http://ericholscher.com/blog/2009/jan/15/django-now-has-fast-tests/

Doesn't Rails already do this?

> 2) don't write the session out if it hasn't changed at all during a
> request.

If I remember correctly this was implemented some time after switching
to Rack.

> 3) don't regenerate query plans every time [cache them if possible].
>
> 5) Avoid re-generating sql escaped copies of each column name for
> tables. Cache the sql escaped column names.
>
> 6) LRU cache all sql-escaped data [?]

I attempted this 2-3 years ago with my work on prepared statements for
ActiveRecord. I did not measure any performance gains.

> 4) avoid n^2 filter searches as you handle a request [I believe that's
> what it does currently]. At least with the before and after filters,
> you could calculate them all once, at the beginning of the request.

Hm, I always thought the application of filters is O(n) and not
O(n^2). :slight_smile: Why would it be O(n^2)?

> 3) don't regenerate query plans every time [cache them if possible].

> 5) Avoid re-generating sql escaped copies of each column name for
> tables. Cache the sql escaped column names.

> 6) LRU cache all sql-escaped data [?]

> I attempted this 2-3 years ago with my work on prepared statements for
> ActiveRecord. I did not measure any performance gains.

I wonder if tying it in with "real" prepared statements would help.
Did they?

> 4) avoid n^2 filter searches as you handle a request [I believe that's
> what it does currently]. At least with the before and after filters,
> you could calculate them all once, at the beginning of the request.

> Hm, I always thought the application of filters is O(n) and not
> O(n^2). :slight_smile: Why would it be O(n^2)?

Hmm... perhaps what I was remembering was that it was recalculating the
filters per action once per request [?] Sorry, it's been a while :slight_smile:

-=r

> 3) don't regenerate query plans every time [cache them if possible].

> 5) Avoid re-generating sql escaped copies of each column name for
> tables. Cache the sql escaped column names.

> 6) LRU cache all sql-escaped data [?]

> I attempted this 2-3 years ago with my work on prepared statements for
> ActiveRecord. I did not measure any performance gains.
>
> I wonder if tying it in with "real" prepared statements would help.
> Did they?

The difference here depends on the database and how its optimiser
works. I believe the last time we looked at it there was no
difference on MySQL, a slightly negative impact on Postgres (the
optimiser didn't know the types of the variables so couldn't use
indexes), and *major* improvements on Oracle. So it's not a straight
up-and-down win; however, if you wanted to spend some time tidying up
the query generation to make this possible, that'd be neat.

> Hmm... perhaps what I was remembering was that it was recalculating the
> filters per action once per request [?] Sorry, it's been a while :slight_smile:

Filters are constant time as far as I know too. I'd be surprised if
something's slipped in to change that as people would probably have
noticed :slight_smile:

Yes I was using real prepared statements. It didn't boost performance.
I concluded that real prepared statements are only of limited use and
cannot be applied generally, but rather have to be used explicitly.

Interesting. At the time I only tested with MySQL and PostgreSQL.

Any published data about performance improvements in Oracle?

Only in the form of constant complaints from Oracle users :). But from
memory Oracle does a lot of work when it creates a prepared statement,
including running the optimiser and getting a query plan for
execution. I believe this is quite expensive, so doing it once makes
things fast. Perhaps an Oracle user can chime in here?