Hi Rails folks,
this topic seems to come up from time to time.
I tried Middleman with some good initial success,
but the deeper I dig, it's just not Rails.
I know that Rails is about dynamic websites,
but sometimes I need to spit out some statically generated pages.
Why should I have to change my well-known tools?
So I'm asking: how about a static site generator built on Rails?
What I've learned so far:
you need a list of all the pages to generate.
It would be easy enough to build that list from the routes,
but we need to generate each and every page for all of the resources.
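That per-resource expansion could be sketched like this. The pattern and records below are made up for illustration; in a real app the patterns would come out of the Rails routing table:

```ruby
# Expand a parameterized route pattern into one concrete path per record.
# Pattern and records are hypothetical; in Rails they would come from
# the routing table and the models.
def expand(pattern, records)
  records.map do |rec|
    pattern.gsub(/:(\w+)/) { rec.fetch(Regexp.last_match(1)) }
  end
end

expand("/posts/:slug", [{ "slug" => "hello" }, { "slug" => "world" }])
# => ["/posts/hello", "/posts/world"]
```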
So what a generator needs to do
is trickle down through all the routes,
grab the data for every resource,
and render the template with it.
I know this can get out of hand quite quickly,
but let's assume a directed graph of resources;
that keeps the number of pages to generate finite.
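A rough sketch of that trickle-down loop, using plain ERB instead of the real Rails view stack and made-up data, just to show the shape:

```ruby
require "erb"
require "fileutils"

# Made-up resource data, as it might come out of the backing store.
POSTS = [
  { "slug" => "hello", "title" => "Hello" },
  { "slug" => "world", "title" => "World" },
].freeze

PAGE = ERB.new(<<~HTML)
  <h1><%= post["title"] %></h1>
HTML

# For every resource: grab its data, render the template, write the file.
def generate(posts, out_dir: "build")
  posts.map do |post|
    path = File.join(out_dir, "posts", "#{post["slug"]}.html")
    FileUtils.mkdir_p(File.dirname(path))
    File.write(path, PAGE.result_with_hash(post: post))
    path
  end
end
```

In a real Rails app the rendering line could instead go through `ApplicationController.renderer`, so the normal layouts and helpers keep working.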
In my current case, I back the data with a YAML file.
I use anchors and aliases,
so think of it like a little object database,
but I only need it to go one level deep.
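For the curious, anchors and aliases (plus merge keys) already give you that one-level-deep object store with just the stdlib; the fields below are invented for the example:

```ruby
require "yaml"

doc = <<~YAML
  defaults: &defaults
    layout: page
    author: eike
  pages:
    - <<: *defaults
      title: Home
    - <<: *defaults
      title: About
      layout: article
YAML

# Psych needs aliases explicitly enabled since Ruby 3.1.
data = YAML.safe_load(doc, aliases: true)
data["pages"][0]["author"]  # => "eike"   (pulled in via the alias)
data["pages"][1]["layout"]  # => "article" (explicit key wins over the merge)
```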
So far I've used Middleman with good success
(but it got harder the deeper I dug into that old code).
I'm looking for someone to join me on this project.
~eike
I would build the dynamic site in Rails, then have a crawler/scraper run separately (as a GitHub Action?) and serve the scraped pages (Nokogiri docs?) statically. Perhaps you could provide a sitemap for the scraper to follow.
I've done this before and written about it here. IIRC, it's about as barebones as you can get: just Rails and a scraper. It's reliably deployed my site for a good couple of years now.
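For what it's worth, the sitemap-driven scrape can even be sketched without Nokogiri, using stdlib REXML, with the HTTP fetch injected so it runs offline; every name here is made up and sitemap namespace handling is glossed over:

```ruby
require "rexml/document"
require "fileutils"
require "uri"

# Walk a sitemap, fetch every listed URL, and write it out as a static file.
# `fetch` is injected (a Net::HTTP call in real use) so the sketch stays
# testable without a network. Sitemap XML namespaces are elided here.
def scrape(sitemap_xml, out_dir:, fetch:)
  doc = REXML::Document.new(sitemap_xml)
  doc.get_elements("//loc").map do |loc|
    uri = URI(loc.text.strip)
    rel = uri.path == "/" ? "index" : uri.path
    path = File.join(out_dir, "#{rel}.html")
    FileUtils.mkdir_p(File.dirname(path))
    File.write(path, fetch.call(uri))
    path
  end
end
```

In a CI job, `fetch` would be a real HTTP GET against the running Rails app, and `out_dir` the directory your static host serves.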