I found this:
I love Google Scholar as my go-to place to search for papers. Features like forward citations and its decent autogenerated BibTeX entries are life savers.
However, sometimes I (and others) wish we could write scripts to help us, for example:
- querying Google Scholar from MATLAB;
- automatically building a database of forward and backward citations.
However, Google Scholar does not provide an API, its robots.txt disallows scrapers on most pages of interest (for instance, the cited-by results are not supposed to be accessed by bots), and if you make many requests in quick succession (as a bot would) you will get a CAPTCHA.
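Before scripting anything, it is worth checking robots.txt programmatically. Here is a minimal sketch using Python's standard-library `urllib.robotparser`; the rules below are a hypothetical excerpt for illustration, so fetch the real https://scholar.google.com/robots.txt to see what actually applies:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical excerpt of robots.txt rules (not the live file);
# check https://scholar.google.com/robots.txt for the real thing.
ROBOTS_TXT = """\
User-agent: *
Disallow: /scholar
Allow: /citations?user=
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Cited-by queries live under /scholar, so a well-behaved bot must skip them.
print(rp.can_fetch("*", "https://scholar.google.com/scholar?cites=12345"))
# An explicitly allowed path is fine for a compliant crawler.
print(rp.can_fetch("*", "https://scholar.google.com/citations?user=abc"))
```

Note that robots.txt compliance is only the polite-crawler baseline; it does not override the EULA restrictions quoted below.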
Last year they had a EULA that said:
You shall not, and shall not allow any third party to: ...
(i) directly or indirectly generate queries, or impressions of or clicks on Results, through any automated, deceptive, fraudulent or other invalid means (including, but not limited to, click spam, robots, macro programs, and Internet agents);
(l) "crawl", "spider", index or in any non-transitory manner store or cache information obtained from the Service (including, but not limited to, Results, or any part, copy or derivative thereof);
Some Google services, like Custom Search (for which I could find a EULA), still state this in section 1.4, but the link in the SO answer is now dead and I have not been able to find a current EULA for Scholar. From anecdotal evidence, I know that you can get into a decent amount of trouble if you try to circumvent Google's efforts to prevent scraping of Scholar.
I believe the proper response here is: "Don't do this. Call Google and ask them (politely) to either give you written permission or point you to the approved API." I suspect this is not what you wanted to hear, but...
I Go Back to Sleep, Now.