I want to experiment with importing i2's wiki into MediaWiki this weekend. Anyone know who I could talk to to get a database dump?
Also, what would be a good scripting language to munge the data?
[Ha! I crack me up.]
Jay Levitt wrote:
I want to experiment with importing i2's wiki into MediaWiki this weekend. Anyone know who I could talk to to get a database dump?
Also, what would be a good scripting language to munge the data? [Ha! I crack me up.]
Sorry for the slow response. I don't actively watch this list, but just happened to take a look today.
It's a few months old, but I've put a dump here:
http://www.jufo.org/dump_2007_09_08.sql.gz
Be warned that it is huge: there has never been any policy for managing page history, and many pages accumulated hundreds or thousands of spam revisions.
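[Editor's note: since the dump is a large gzipped SQL file, a first pass over it can stream the compressed file line by line rather than decompressing it fully. A minimal Python sketch; the file name and table name below are stand-ins for illustration, not the dump's actual schema:]

```python
import gzip

# Create a tiny stand-in dump so the sketch is self-contained;
# in practice you would point this at dump_2007_09_08.sql.gz.
with gzip.open("dump_sample.sql.gz", "wt") as f:
    f.write("INSERT INTO pages VALUES (1, 'Home');\n")
    f.write("-- a comment line\n")
    f.write("INSERT INTO pages VALUES (2, 'About');\n")

# Stream the compressed dump, counting INSERT statements without
# ever holding the whole decompressed file in memory.
inserts = 0
with gzip.open("dump_sample.sql.gz", "rt") as f:
    for line in f:
        if line.startswith("INSERT INTO"):
            inserts += 1

print(inserts)  # 2
```

The same streaming loop could filter out spammed revisions before feeding the remainder into a MediaWiki import.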
I have some notes about the structure here:
http://www.jufo.org/i2_to_Ruse.pdf
regards
Justin