Develop Google scraper in Python with Scraper API

Adnan Siddiqi
2 min read · Nov 15, 2021


This is another post in ScrapeTheFamous, a series in which I parse some famous websites and discuss my development process. The posts use Scraper API for parsing, which frees me from all worries about blocking and rendering dynamic sites, since Scraper API takes care of everything.

So this post is about scraping Google search results. The script will accept a keyword and return results across multiple pages. The data will be stored in a text file in JSON format.

The code that parses the results is pretty straightforward and is given below:
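A minimal sketch of the scraper follows, assuming BeautifulSoup for HTML parsing and the standard api.scraperapi.com request endpoint; API_KEY, scrape_google, and results.txt are illustrative placeholders rather than the exact names from my script:

import json
import requests
from urllib.parse import quote_plus
from bs4 import BeautifulSoup

API_KEY = 'YOUR_SCRAPER_API_KEY'  # placeholder: your own Scraper API key

def scrape_google(keyword, pages=3):
    results = []
    for page in range(pages):
        # Google paginates organic results with the `start` parameter, 10 per page
        google_url = f'https://www.google.com/search?q={quote_plus(keyword)}&start={page * 10}'
        payload = {'api_key': API_KEY, 'url': google_url}
        r = requests.get('http://api.scraperapi.com', params=payload, timeout=60)
        if r.status_code != 200:
            continue
        soup = BeautifulSoup(r.text, 'html.parser')
        # Each organic result title is an <h3> wrapped in an <a> linking to the page
        for h3 in soup.select('a h3'):
            link = h3.find_parent('a')
            results.append({
                'title': h3.get_text(strip=True),
                'url': link.get('href') if link else None,
            })
    return results

if __name__ == '__main__':
    data = scrape_google('web scraping')
    # Store the scraped results in a text file in JSON format
    with open('results.txt', 'w') as f:
        json.dump(data, f, indent=2)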

It goes from one page to the next, keeping the search keyword intact. It then parses the h3 and a tags and stores the results in a JSON structure.

Conclusion

In this post, you learned how to create a Google Search parser in Python using Scraper API. You do not have to worry about proxy IPs, nor do you have to pay hundreds of dollars, especially when you are an individual or working at a startup. A company I worked with spends hundreds of dollars a month just on proxy IPs.

Oh, and if you sign up here with my referral link or enter promo code adnan10, you will get a 10% discount. If you do not get the discount, just let me know via email on my site and I will be glad to help you out.

The code is available on GitHub.

Originally published at http://blog.adnansiddiqi.me on November 15, 2021.


Written by Adnan Siddiqi

Pakistani | Husband | Father | Software Consultant | Developer | blogger. I occasionally try to make stuff with code. https://adnansiddiqi.me