Hi~
Hi Ezra/Kate - thanks for the advice. I have been wondering which
way to go here. My thinking was as follows:
Assumptions
* Inactive sessions are cleaned regularly (via some approach)
Option 1 - Add specific UserId column to sessions table
* CON - Need to perform a database update to put the userId in
place on each request (unless there's a way in one's before_filter to
tell that it's the request that created the session in the first
place, I guess)
* CON - Some minimal extra complexity in the code (i.e. what has to
be done for each request)
* ADV - Perhaps faster when you want to pull back a list of
active users
Right but keep in mind that the session gets read from and then
written to the db on every request anyways. So the session is already
going to be read from the db at the beginning of each request. If you
set the user_id column of the session object in a before filter it is
no extra db overhead because the session was already read from the
db. Then at the end of the request, the session has to be written
back out to the db anyways so setting the user_id field when this
happens does not create extra db queries if done right.
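Ezra's point can be sketched in plain Ruby (no Rails here; FakeSessionStore and its query counter are made-up stand-ins for the ActiveRecord session store): the row is read once at the start of the request and written once at the end regardless, so stamping user_id in between adds zero queries.

```ruby
class FakeSessionStore
  attr_reader :query_count

  def initialize
    @rows = {}          # session_id => { user_id:, data: }
    @query_count = 0
  end

  def read(sid)
    @query_count += 1   # the SELECT Rails issues at the start of every request
    @rows[sid] ||= { user_id: nil, data: {} }
  end

  def write(sid, row)
    @query_count += 1   # the UPDATE Rails issues at the end of every request
    @rows[sid] = row
  end
end

store = FakeSessionStore.new

# One request: read the session, run the "before filter", save it back.
row = store.read("abc123")
row[:user_id] ||= 42    # the before_filter stamp -- costs no extra query
store.write("abc123", row)

store.query_count       # 2 with or without the stamp: one read, one write
```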
Option 2 - Use marshalled session data (stored in the data column
in session table)
* ADV - Can easily add userID to the session when performing normal
authentication/authorisation checks, so no additional overhead
( e.g. find session via AR then update)
* CON - Perhaps slower to iterate through the list, BUT the only
time one would need to do this would be when someone requests a list
of active, logged-on users.
The real problem with this way of doing things is memory and cpu
time. Let's say you keep your sessions cleaned out pretty regularly,
so assume 200 sessions lying around at any one time (Rails creates a
lot of session entries, so this number is conservative). Each of
these 200 sessions now needs to be loaded into memory and unmarshalled
to see the user_id inside the session. That means 200 times however
much data is in each of those sessions. This will end up making your
app leak memory, or at least take a lot more than it needs. Plus it
will burn your cpu while it unmarshals and Base64-decodes all
that session data.
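The cost Ezra describes is easy to see in plain Ruby, since the ActiveRecord session store keeps the data column as Base64-encoded marshalled Ruby (the 200-session count and the fake payload below are illustrative, not measured):

```ruby
require 'base64'

# Build 200 fake session rows the way Rails stores them: a Ruby hash,
# Marshal.dump'd, then Base64-encoded into the `data` column.
rows = (1..200).map do |i|
  Base64.encode64(Marshal.dump({ 'user_id' => i, 'cart' => Array.new(50, 'x') }))
end

# Answering "who is logged in?" forces a full decode of every blob.
user_ids = rows.map { |blob| Marshal.load(Base64.decode64(blob))['user_id'] }

user_ids.size  # 200 Base64 decodes and 200 unmarshals for one question
```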
I kinda thought Option 2 may be better as it avoids an extra
database hit per request, in return for a slower "get me active
sessions" call, which is needed only occasionally, when a user
requests it.
I think that you can add a user_id column to the sessions model and
not incur any more database queries than the session is already doing
without it. It's just a matter of writing and reading the user_id from
the Session model instead of as a hash key in its data part.
Cheers-
-Ezra
Comments?
[SNIP]
> You are much better off altering the actual sessions table and
> adding a user_id column that can get set in an after or before filter
> somewhere. Seriously, you will save yourself a lot of hassle if you
> avoid the loop over all sessions and unmarshal approach Greg.
>
While I'm sure Ezra's right, it does bring up a couple of other issues.
1) "In even a small rails app that hasn't cleaned out the session
recently"
That should NEVER be the case. Yes, it is by default, but you should be
taking steps to avoid it. Yes, you can have a cron job run, but
personally I hate the idea of having such an important part of my
system be dependent upon a completely separate mechanism. It makes it
harder to port between OSes, some ISPs don't give you access to your
crontab, etc. My solution is to just hook the login process: every
time someone logs in, have it finish off with a call to the db to
delete old sessions.
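The "hook the login" idea might look something like this in plain Ruby (a stand-in for DELETE FROM sessions WHERE updated_at < ?; the one-hour TTL and the hash shape are assumptions, not anything from the thread):

```ruby
SESSION_TTL = 60 * 60  # keep sessions for one hour of inactivity

# Run this right after a successful login: drop every session row
# whose last touch is older than the TTL.
def sweep_stale(sessions, now)
  sessions.delete_if { |s| now - s[:updated_at] > SESSION_TTL }
end

now = Time.now
sessions = [
  { id: 1, updated_at: now - 7200 },  # stale: two hours old
  { id: 2, updated_at: now - 60   },  # fresh: one minute old
]
sweep_stale(sessions, now)

sessions.map { |s| s[:id] }  # => [2] -- only the fresh session survives
```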
2) The assumption that because there's a record in the session table
the person is "logged in". You've GOT to be doing some checking to
see WHEN that session record was created or last touched. The problem
is that you can't wipe out sessions too early or you'll annoy your
users, so you have to leave them there for a while. Even if you keep
them for only half an hour (not recommended), you'll still be listing
a bunch of users as "logged in" who haven't touched your system for
nearly 30 minutes.
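Point 2 as a sketch: a session row existing is not the same as "logged in", so the query has to filter on how recently the row was touched. A plain-Ruby stand-in (the 15-minute window and the hash shape are made up for illustration):

```ruby
ACTIVE_WINDOW = 15 * 60  # treat 15 minutes of inactivity as "gone"

# Equivalent of: SELECT user_id FROM sessions
#                WHERE user_id IS NOT NULL AND updated_at > ?
def active_user_ids(sessions, now)
  sessions.select { |s| s[:user_id] && now - s[:updated_at] < ACTIVE_WINDOW }
          .map { |s| s[:user_id] }
end

now = Time.now
sessions = [
  { user_id: 1,   updated_at: now - 60   },  # active a minute ago
  { user_id: 2,   updated_at: now - 1800 },  # idle for 30 minutes
  { user_id: nil, updated_at: now - 10   },  # anonymous visitor
]

active_user_ids(sessions, now)  # => [1] -- mere existence isn't enough
```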
3) For most sites this isn't an issue but.... Should you be so
fortunate as to write the next Basecamp, you probably don't want to be
doing a database write for every single page load. Reads are quick and
easy on the db; writes are slower and can require it to reindex things.
So my previous note about checking when a session was last updated can
become problematic, because it requires constantly writing to that
table. Then again, maybe this last one is just me being overly cautious.
- kate = masukomi
-- Ezra Zygmuntowicz
-- Lead Rails Evangelist
-- ez@engineyard.com
-- Engine Yard, Serious Rails Hosting
-- (866) 518-YARD (9273)