
Commit

requests has already been in requirements.txt for some time
augusto-herrmann committed Feb 20, 2021
1 parent 3f91d63 commit b4da052
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion README.en.md
@@ -114,7 +114,7 @@ being said, scrapers will help *a lot* in this process. However, when
creating a scraper it is important that you follow a few rules:

- It's **required** that you create it using `scrapy`;
- - **Do Not** use `pandas`, `BeautifulSoup`, `requests` or other
+ - **Do Not** use `pandas`, `BeautifulSoup` or other
unnecessary libraries (the standard Python lib already has lots of
useful libs, `scrapy` with XPath is already capable of handling most
of the scraping and `rows` is already a dependency of this
2 changes: 1 addition & 1 deletion README.md
@@ -111,7 +111,7 @@ way, scrapers will help *a lot* in the process. However, when creating a
scraper it is important that you follow a few rules:

- It is **required** to build the scraper using `scrapy`;
- - **Do not use** `pandas`, `BeautifulSoup`, `requests` or other unnecessary
+ - **Do not use** `pandas`, `BeautifulSoup` or other unnecessary
  libraries (the Python std lib already has lots of useful libraries, `scrapy`
  with XPath already handles most of the scraping, and `rows` is already a
  dependency of this repository);

