The algorithm, based on deep learning methods, is the most effective solver of captcha security and authentication systems to date and could spell the end for one of the most widely used website security systems.
Captcha systems rely on people finding it easier than machines to decipher the distorted characters they display.
The tool, developed by researchers at Lancaster University in the UK and at Northwest University and Peking University in China, delivers significantly higher accuracy than previous captcha attack systems.
The solver is also highly efficient: it can solve a captcha in 0.05 seconds on a desktop PC, the researchers said.
The method involves teaching a captcha generator program to produce large numbers of training captchas that are indistinguishable from genuine ones.
These are then used to rapidly train a solver, which is refined and tested against real captchas.
By using a machine-learned automatic captcha generator, the researchers, or would-be attackers, can significantly reduce the effort and time needed to find and manually tag captchas for training their software.
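In outline, the attack trains the same recognition network twice: first on plentiful synthetic captchas produced by the learned generator, then briefly on a small, manually labelled set of real captchas. Below is a minimal sketch of that two-phase training in PyTorch; the Solver architecture, the fixed image size, and the synthetic_loader and real_loader data loaders are illustrative assumptions rather than the paper's actual components.

    import torch
    import torch.nn as nn

    # Hypothetical CNN mapping a captcha image to per-character logits.
    class Solver(nn.Module):
        def __init__(self, n_chars=4, n_classes=36):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            # Assumes 1x64x128 input images, so features are 64x16x32 here.
            self.head = nn.Linear(64 * 16 * 32, n_chars * n_classes)
            self.n_chars, self.n_classes = n_chars, n_classes

        def forward(self, x):
            h = self.features(x).flatten(1)
            return self.head(h).view(-1, self.n_chars, self.n_classes)

    def train(solver, loader, epochs, lr=1e-3):
        opt = torch.optim.Adam(solver.parameters(), lr=lr)
        loss_fn = nn.CrossEntropyLoss()
        for _ in range(epochs):
            for images, labels in loader:   # labels: (batch, n_chars) int64
                logits = solver(images)     # (batch, n_chars, n_classes)
                loss = loss_fn(logits.flatten(0, 1), labels.flatten())
                opt.zero_grad()
                loss.backward()
                opt.step()

    solver = Solver()
    # Phase 1: pre-train on abundant synthetic captchas from the generator.
    # Phase 2: fine-tune on a small, manually labelled set of real captchas.
    # Both loaders are hypothetical DataLoaders of (image, label) pairs:
    # train(solver, synthetic_loader, epochs=10)
    # train(solver, real_loader, epochs=3, lr=1e-4)

The point of the two phases is that nearly all of the labelled data comes for free from the generator; only the small fine-tuning set has to be tagged by hand.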
Previous captcha solvers are specific to one particular captcha variation, and prior machine-learning attack systems are labour-intensive to build, requiring extensive manual tagging of captchas to train them.
They are also easily rendered obsolete by small changes in the security features used within captchas.
Since the new solver requires little human involvement, it can easily be rebuilt to target new or modified captcha schemes.
"We show for the first time that an adversary can quickly launch an attack on a new text-based captcha scheme with very low effort," said Zheng Wang, Senior Lecturer at Lancaster University.
"It allows an adversary to launch an attack on services, such as Denial of Service attacks or spending spam or fishing messages, to steal personal data or even forge user identities," said Guixin Ye, the lead student author of the study.