Last Updated on 2021-09-05 by Clay
Today, while using the well-known Python package requests
for crawling, I got the following error message:
requests.exceptions.TooManyRedirects: Exceeded 30 redirects.
But the same code still ran fine the day before.
This happens because the requested website intended to redirect us to another URL, and the requests
package followed the redirects automatically, which resulted in too many redirects and the error above.
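The behavior is easy to reproduce locally: by default, requests follows redirects up to a limit of 30 and then raises TooManyRedirects. The sketch below (a small stand-in for the misbehaving site, not the original one) starts a local server whose only answer is a redirect back to itself, so the chain can never end.

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

import requests


# A minimal local server that answers every GET with a redirect back to
# itself, so following redirects can never terminate.
class LoopHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(302)
        self.send_header("Location", "/")  # redirect to the same path
        self.end_headers()

    def log_message(self, *args):  # silence request logging
        pass


server = HTTPServer(("127.0.0.1", 0), LoopHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

# requests follows redirects by default and gives up after 30 hops.
try:
    requests.get(url)
except requests.exceptions.TooManyRedirects as exc:
    err = exc

print(err)  # Exceeded 30 redirects.

server.shutdown()
```

The limit itself lives on the session (`requests.Session().max_redirects`, 30 by default), so raising it would only delay the error here; the redirect loop is the real problem.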
Solution
Regarding redirection, you can find clues in the returned status code. Add the allow_redirects=False
parameter to the request so that requests stops following redirects; otherwise the program keeps raising the error above.
r = requests.get(url, cookies=cookies, allow_redirects=False)
print(r.status_code)
Output:
303
The following status codes are often seen with redirection:
- 301: Moved Permanently
- 302: Found (originally named Moved Temporarily)
- 303: See Other (I’m not sure how to describe it; anyway, every time I encounter 303, the cookies have expired.)
So, the solution is: check the URL, check the cookies.
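That check can be sketched as follows. The handler below is a hypothetical stand-in for a site whose session has expired: it answers every request with 303 See Other pointing at a login page. With redirects disabled, we can read the status code and the Location header instead of blindly following the chain.

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

import requests


# Hypothetical stand-in for a site with an expired session: it answers
# every GET with 303 See Other pointing at a login page.
class ExpiredSessionHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(303)
        self.send_header("Location", "/login")
        self.end_headers()

    def log_message(self, *args):  # silence request logging
        pass


server = HTTPServer(("127.0.0.1", 0), ExpiredSessionHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/data"

# With redirects disabled we see where the server wanted to send us.
r = requests.get(url, allow_redirects=False)
print(r.status_code)          # 303
print(r.headers["Location"])  # /login -> time to refresh the cookies

server.shutdown()
```

If the Location header points at a login page, as here, that is a strong hint the cookies are stale and need to be fetched again.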
References
- https://stackoverflow.com/questions/23651947/python-requests-requests-exceptions-toomanyredirects-exceeded-30-redirects