Pure text captcha with Ajax

Hi --

I'm wondering whether one could achieve some degree of protection from automated form submissions with pure-text captchas using Ajax. The basic idea is: the captcha would be created by an Ajax call, and would therefore not be present in the HTML source of the document. So, in theory (pending someone shooting it down, which wouldn't surprise me at all), it would not be available to casual snooping.
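The idea above could be sketched roughly like this (the `/captcha` endpoint and its JSON shape are hypothetical — any server-side handler returning a question and an id would do). The question text never appears in the static HTML source; it only enters the DOM after the Ajax call completes:

```javascript
// Sketch of an Ajax-loaded text captcha. Assumes a hypothetical /captcha
// endpoint returning JSON such as:
//   {"id": "abc123", "question": "What colour is grass?"}

// Pure helper: build the markup for the question plus an answer field.
function renderCaptcha(captcha) {
  return '<label for="captcha-answer">' + captcha.question + '</label>' +
         '<input type="text" id="captcha-answer" name="captcha_answer">' +
         '<input type="hidden" name="captcha_id" value="' + captcha.id + '">';
}

// Browser glue: fetch the captcha after page load and inject it into the form.
function loadCaptcha(containerId) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/captcha', true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById(containerId).innerHTML =
        renderCaptcha(JSON.parse(xhr.responseText));
    }
  };
  xhr.send(null);
}
```

The server would look up `captcha_id` on submission and compare the posted answer. As noted below, anything that inspects the live DOM rather than the raw source will still see the question.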

I know that the DOM can be examined, and I don't think this is airtight anyway... but I'm curious what people think of it as at least a quick-and-dirty way to protect a site, perhaps one that isn't going to be up forever. The advantage, as you will have surmised, is that it could be done without recourse to any graphics generation and so on.

David

That sounds like it would work for most of the bots I've seen. A few use IE hooks to do their dirty work precisely because of things like this: I've seen a site that did roughly the same thing, but with an image (they cycled through about 5-10 different images), and it still ended up getting hit by spam, probably from one of these IE-driving bots.

Other than that small percentage of the bot population, I would think this would work famously, both with regard to ease of use for the developer and accessibility for the user.

--Jeremy

I don't quite follow you, but if you mean using DHTML to generate the DOM dynamically so that others can't see the source, then I would say you probably can't do that, because today we have tools for poking around in the source (even dynamically generated source), like Instant Source for IE or Firebug for Firefox. Regards. femto

One idea I had was using JavaScript to capture the x and y positions of the mouse. If the user is actually moving the mouse over the browser window, it could be assumed that a real person is there. Package that info securely, e.g. with an open timestamp plus a copy of the timestamp signed with a secret key. Then check that the page was fetched within the last 10 minutes before the form was posted, which would defeat any sort of cached logic. Bots would have to keep re-fetching the page before each submission, or drive a real page as a hook before they post through.
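The mouse-tracking half of that idea might be sketched like this (all names are illustrative, not a real API — and the `typeof document` guard is just so the pure check can run outside a browser):

```javascript
// Sketch of the mouse-tracking idea: record a few (x, y) samples as the
// mouse moves over the page, then decide at submit time whether the trail
// looks like a human was present.

var mouseTrail = [];

if (typeof document !== 'undefined') {
  document.onmousemove = function (e) {
    // Cap the buffer so a long visit doesn't grow it without bound.
    if (mouseTrail.length < 100) {
      mouseTrail.push({x: e.clientX, y: e.clientY});
    }
  };
}

// Pure check: did we see at least `min` distinct positions?
function looksHuman(trail, min) {
  var distinct = {};
  for (var i = 0; i < trail.length; i++) {
    distinct[trail[i].x + ',' + trail[i].y] = true;
  }
  var count = 0;
  for (var k in distinct) {
    if (distinct.hasOwnProperty(k)) count++;
  }
  return count >= min;
}
```

The trail (or just a boolean derived from it) would then be bundled into the signed value described above before the form posts.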

All the above could still be cracked, but you would have something that requires a significant amount of energy to defeat, and unless you have the holy grail of a contact form (such as a Drupal module) that reaches the masses, I doubt they would spend the time.

I've been trying to flesh out this idea as well: some sort of seamless captcha that doesn't require any explicit user questioning, but still assumes human life on the other end.

I'd be interested to hear if you have any other ideas.

Nathaniel.

> One idea I had was using something like a javascript method to capture
> the x and y positions of the mouse. If the user is actually using the
> mouse over the browser window, it could be assumed that a real person
> is there.

What about screen readers? People that browse using a keyboard instead of a mouse? It's almost never a good idea to assume the presence of a mouse in any type of software, let alone on the Web.

> Package that info with something secure like a public/private key such
> as with an open timestamp as well as another one encrypted with a
> secret key. Then check to ensure that they are grabbing the page within
> the last 10 minutes before posting, which would remove any sort of
> cached logic.

I often open multiple tabs (e.g. from an RSS reader) and browse through them at my leisure. Something like that would have to be awfully seamless so as not to annoy the user.

> > One idea I had was using something like a javascript method to capture
> > the x and y positions of the mouse. If the user is actually using the
> > mouse over the browser window, it could be assumed that a real person
> > is there.

> What about screen readers? People that browse using a keyboard instead
> of a mouse? It's almost never a good idea to assume the presence of a
> mouse in any type of software, let alone on the Web.

Great point. Using event.keyCode in addition to the mouse x and y values would work well in this case: as soon as the user starts entering or changing data in one of the form fields, it could be caught. Checking both would be added insurance, while having one or the other would be fine as well.
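Combining the two signals might look something like this (illustrative names; the guard just lets the pure check run outside a browser). Treating either keyboard or mouse activity as sufficient is what keeps keyboard-only and screen reader users from being locked out:

```javascript
// Sketch: accept either keyboard or mouse activity as evidence of a human.

var sawMouse = false;
var sawKeys = false;

if (typeof document !== 'undefined') {
  document.onmousemove = function () { sawMouse = true; };
  document.onkeydown = function () { sawKeys = true; };
}

// Pure check usable at submit time: one signal is enough.
function interactionSeen(mouse, keys) {
  return mouse || keys;
}
```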

> > Package that info with something secure like a public/private key such
> > as with an open timestamp as well as another one encrypted with a
> > secret key. Then check to ensure that they are grabbing the page within
> > the last 10 minutes before posting, which would remove any sort of
> > cached logic.

> I often open multiple tabs (e.g. from an RSS reader) and browse through
> them at my leisure. Something like that would have to be awfully
> seamless so as not to annoy the user.

Similar to the suggestion above. If someone enters something in the form (remember, that's the whole point of this), then using AJAX the secure value could be regenerated, adding an additional amount of time before they hit submit. Actually, with that metric, the window could be much shorter, since you would only be measuring the time between the last change to a form value and the submit.

And also, try suggesting ideas too. I appreciate a challenge, but let's try to make this a group effort to find a solution, shall we?

[...]

> The advantage, as you will have surmised, is that it could be done
> without recourse to any graphics generation and so on.

Go back to basics. What is the goal? To reduce or eliminate spam by increasing the cost of producing it. What is the solution? Present an obstacle that is easy for a human to handle and hard for a computer/script to handle. What is the obstacle? For an image captcha, it's image recognition. (General image recognition by computers at the level of human capabilities remains an unsolved problem.) So what other obstacles could you use? Look around for unsolved problems in computer science that humans are innately good at solving.

I'm just leading up to this punchline: Don Blaheta : index.html

This friend of mine, Don Blaheta, came up with a simple, entirely text-based captcha system more than two years ago. I'm not sure why it hasn't seen more popularity. It's dead simple to implement (he has a Movable Type implementation, but anyone could implement it in any context), computationally efficient, doesn't require JavaScript, much less AJAX, and has been very effective on his own blog. The obstacle presented to the user is natural language understanding (which is his field of expertise), which everyone reading a web page is necessarily good at (even those who can't see!) but computers are rather bad at.
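I don't know the details of Don's actual implementation, but a toy challenge in the same text-only spirit might look like this (sentences and helper names are made up for illustration). It's trivial for any reader, including one using a screen reader, and annoying for a generic form-filling bot — though, as with everything in this thread, a bot targeting this specific scheme could solve it; the point is raising the cost:

```javascript
// Toy text-only question captcha: ask for the n-th word of a sentence.

var SENTENCES = [
  'the quick brown fox jumps over the lazy dog',
  'a rolling stone gathers no moss'
];

var ORDINALS = ['first', 'second', 'third', 'fourth', 'fifth'];

// Build a challenge from chosen indices, so the server can reproduce
// the expected answer when the form comes back.
function makeChallenge(sentenceIndex, wordIndex) {
  var words = SENTENCES[sentenceIndex].split(' ');
  return {
    question: 'What is the ' + ORDINALS[wordIndex] + ' word of "' +
              SENTENCES[sentenceIndex] + '"?',
    answer: words[wordIndex]
  };
}

// Forgiving comparison: ignore case and surrounding whitespace.
function checkAnswer(challenge, reply) {
  return reply.replace(/^\s+|\s+$/g, '').toLowerCase() === challenge.answer;
}
```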

David

--Greg

Hi --

I just recently came across something similar on the Rails wiki, which appears when you modify a page: a simple logic question, very close to what your friend did.

As for my goal for the best captcha system: no direct user input specifically for a "captcha", but still detecting whether the visitor is a computer or a human by the way they interact with the page.

Nathaniel.

See:

- rob