I have a Python script that scrapes a particular site daily for data points matching the day's date stamp. Right now the script pulls every data point and filters by date afterwards, so it takes a very long time: it loads 12 * 770 = 9,240 pages at roughly 400 KB each, about 3,609 MB in total.
Looking for a way to speed this up drastically.
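For illustration, here is a minimal sketch of the kind of change that usually gives the biggest win: request only the pages for today's date (assuming the site accepts a date parameter in the URL; the `example.com` pattern below is hypothetical) and fetch them concurrently instead of downloading all 9,240 pages serially.

```python
# Sketch only: the URL pattern is an assumption, not the real site's API.
from concurrent.futures import ThreadPoolExecutor
from datetime import date
from urllib.request import urlopen


def daily_urls(day: date, pages: int = 12) -> list[str]:
    """Build URLs for just the given day's pages (hypothetical pattern),
    instead of enumerating all 12 * 770 pages and filtering afterwards."""
    stamp = day.isoformat()
    return [
        f"https://example.com/data?page={n}&date={stamp}"
        for n in range(1, pages + 1)
    ]


def fetch(url: str) -> bytes:
    """Download one page; a timeout keeps a slow server from stalling the run."""
    with urlopen(url, timeout=30) as resp:
        return resp.read()


def scrape_today(workers: int = 8) -> list[bytes]:
    """Fetch today's pages in parallel with a small thread pool."""
    urls = daily_urls(date.today())
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fetch, urls))


if __name__ == "__main__":
    pages = scrape_today()
    print(f"fetched {len(pages)} pages")
```

If the site has no date filter in its URLs, the concurrency part alone still helps; the right approach depends on how the site exposes its data.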
Skills: python, web scraping