Google is effectively a repository of Internet data, as it indexes a tremendous amount of content. It also uses a prediction service to autocomplete the rest of a search query.
With such a huge repository in place, can we take advantage of it to determine password strength?
Can we use a password as a search query to determine its popularity, and assign the password a score based on the number of hits it returns? Is this model feasible?
My thinking is that popular passwords occur more frequently on web pages or as search queries. This doesn't mean that rare but short words should be given a strong score, for the obvious reason that short passwords can be brute-forced or exhaustively queried in a reasonable amount of time. A rough sketch of the idea is below.
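For concreteness, here is a minimal sketch of the scoring idea, assuming a hypothetical `hit_count` function (backed by a self-hosted index, per the note below) and arbitrary length and popularity thresholds I chose for illustration:

```python
# Minimal sketch of hit-count-based password scoring.
# `hit_count` is a hypothetical callable (e.g. backed by a self-hosted
# search index); MIN_LENGTH and the hit thresholds are arbitrary choices.

from typing import Callable

MIN_LENGTH = 10  # short passwords are brute-forceable regardless of rarity


def score_password(password: str, hit_count: Callable[[str], int]) -> str:
    """Return a coarse strength label based on corpus popularity."""
    if len(password) < MIN_LENGTH:
        return "weak"           # length alone disqualifies it
    hits = hit_count(password)  # occurrences of the exact string in the corpus
    if hits == 0:
        return "strong"         # never seen in the indexed data
    if hits < 100:
        return "moderate"       # rare, but not unseen
    return "weak"               # popular strings are easy dictionary targets
```

The thresholds are placeholders; whether any choice of cutoffs avoids the false positives mentioned in the edit below is part of what I'm asking.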
I think this model would work well for estimating the strength of longer passwords that follow natural language, such as English phrases.
Is this approach to measuring password strength trivially broken? In other words, is there an easy attack if such a model is used to measure password strength?
Note: If there is a trust issue with sending passwords to Google, assume that we can build our own service.
Edit: Research (https://madiba.encs.concordia.ca/~x_decarn/papers/password-meters-ndss2014.pdf) has shown that most strength meters deployed today mislead users when they create passwords. Using the repository of a search engine like Google, can we measure password strength more accurately? There will be false positives, as many have pointed out, but wouldn't it still be better than existing strength meters?