I am writing code that should follow all the links of the Wikipedia category Category:Animals_alphabetically ( https://ru.wikipedia.org/w/index.php?title=Category:Animals_alphabetically&from=А ), but on this site the links do not carry a page number (.../page=1...); instead they use pagefrom=Азовская+пуголовка (the link depends on the first animal shown on the page). So I have to take the harder route.
The algorithm is as follows:
- get the link from the button  # the site has a "Next page" button
- follow that link
- repeat
- stop at the last link (I found it manually)
import requests
from bs4 import BeautifulSoup

def parser(url):
    html = requests.get(url).text
    soup = BeautifulSoup(html, 'html.parser')
    return soup

def parser_next_link():
    soup = parser(url)
    result_a = soup.find_all('a', {'title': 'Категория:Животные по алфавиту'})
    for i in result_a:
        if i.get_text() == 'Следующая страница':
            link = 'ru.wikipedia.org/' + i.get('href')
            # We only need the first link; the second one duplicates it
            break
    return link

wiki_link = ['https://ru.wikipedia.org/w/index.php?title=Категория:Животные_по_алфавиту&subcatfrom=0&filefrom=0&pageuntil=Азовская+пуголовка#mw-pages']
for i in wiki_link:
    url = i
    wiki_link.append(parser_next_link())
    # Last page
    if url == 'https://ru.wikipedia.org/w/index.php?title=Категория:Животные_по_алфавиту&pagefrom=Zabrus&subcatfrom=0&filefrom=0#mw-pages':
        break
Output:
Traceback (most recent call last):
  File "c:\Project_Py\test_2.py", line 26, in <module>
    wiki_link.append(parser_next_link())
  File "c:\Project_Py\test_2.py", line 12, in parser_next_link
    soup = parser(url)
  File "c:\Project_Py\test_2.py", line 6, in parser
    html = requests.get(url).text
  File "C:\Users\Владелец\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\requests\api.py", line 76, in get
    return request('get', url, params=params, **kwargs)
  File "C:\Users\Владелец\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\requests\api.py", line 61, in request
    return session.request(method=method, url=url, **kwargs)
  File "C:\Users\Владелец\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\requests\sessions.py", line 542, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Users\Владелец\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\requests\sessions.py", line 649, in send
    adapter = self.get_adapter(url=request.url)
  File "C:\Users\Владелец\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\requests\sessions.py", line 742, in get_adapter
    raise InvalidSchema("No connection adapters were found for {!r}".format(url))
requests.exceptions.InvalidSchema: No connection adapters were found for 'ru.wikipedia.org//w/index.php?title=%D0%9A%D0%B0%D1%82%D0%B5%D0%B3%D0%BE%D1%80%D0%B8%D1%8F:%D0%96%D0%B8%D0%B2%D0%BE%D1%82%D0%BD%D1%8B%D0%B5_%D0%BF%D0%BE_%D0%B0%D0%BB%D1%84%D0%B0%D0%B2%D0%B8%D1%82%D1%83&pagefrom=%D0%90%D0%B7%D0%BE%D0%B2%D1%81%D0%BA%D0%B0%D1%8F+%D0%BF%D1%83%D0%B3%D0%BE%D0%BB%D0%BE%D0%B2%D0%BA%D0%B0&subcatfrom=%D0%90&filefrom=%D0%90#mw-pages'
I realize the error repeats on every iteration, but I do have a stop condition.
The requests.exceptions.InvalidSchema error that requests raises here occurs because the URL has no scheme (e.g. http or https). Just add the protocol to the beginning of the address:
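For example, a minimal sketch of a corrected parser_next_link. Two small changes beyond your original: urljoin (from the standard library) is used to prepend the scheme — it also removes the stray double slash that 'ru.wikipedia.org/' + href produced — and the function takes url as a parameter and returns None when no "next page" link exists, instead of raising UnboundLocalError:

```python
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def parser(url):
    html = requests.get(url).text
    return BeautifulSoup(html, 'html.parser')

def parser_next_link(url):
    soup = parser(url)
    for a in soup.find_all('a', {'title': 'Категория:Животные по алфавиту'}):
        if a.get_text() == 'Следующая страница':
            # urljoin keeps the scheme and host of the base URL, so
            # requests receives a full 'https://...' address
            return urljoin('https://ru.wikipedia.org/', a.get('href'))
    return None  # the last page has no "next page" link
```

Returning None on the last page also gives you a natural stop condition, so you would no longer need to hard-code the final URL in the loop.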