From dd6750cdc54441186dba89742307918873757931 Mon Sep 17 00:00:00 2001
From: Augusto Batista
Date: Mon, 8 Jun 2020 15:46:26 -0300
Subject: [PATCH] =?UTF-8?q?adiciona=20instru=C3=A7=C3=B5es=20para=20subir?=
 =?UTF-8?q?=20o=20servi=C3=A7o=20de=20scraping?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 README.en.md | 4 ++++
 README.md    | 5 ++++-
 2 files changed, 8 insertions(+), 1 deletion(-)

diff --git a/README.en.md b/README.en.md
index 4e98671..e94cd72 100644
--- a/README.en.md
+++ b/README.en.md
@@ -165,6 +165,10 @@ Requires Python 3 (tested in 3.8.2). To set up your environment:
 2. Create a virtualenv (you can use [venv](https://docs.python.org/pt-br/3/library/venv.html) for this).
 3. Install the dependencies: `pip install -r requirements-development.txt`
 
+4. Run the collect script: `./run-spiders.sh`
+5. Run the consolidation script: `./run.sh`
+6. Run the script that starts the scraping service: `./web.sh`
+   - The scrapers will be available through a web interface at the URL http://localhost:5000
 
 ### Docker setup
 
diff --git a/README.md b/README.md
index 5bcf590..cffbcff 100644
--- a/README.md
+++ b/README.md
@@ -177,7 +177,10 @@ Você pode montar seu ambiente de desenvolvimento utilizando o
 2. Crie um virtualenv (você pode usar [venv](https://docs.python.org/pt-br/3/library/venv.html) para isso).
 3. Instale as dependências: `pip install -r requirements-development.txt`
 
-
+4. Rode o script de coleta: `./run-spiders.sh`
+5. Rode o script de consolidação: `./run.sh`
+6. Rode o script que sobe o serviço de scraping: `./web.sh`
+   - Os scrapers estarão disponíveis por uma interface web a partir do endereço http://localhost:5000
 
 ### Setup com Docker
 
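For reference, a minimal shell session following the steps this patch adds to both READMEs might look like the sketch below. It is only an illustration of the documented workflow, assuming the three scripts exist at the repository root as the hunks indicate, that a POSIX shell is used, and that port 5000 is free; the virtualenv path and the final curl check are not part of the patch.

    # set up the environment (steps 1-3 already documented in the READMEs)
    python3 -m venv .venv
    source .venv/bin/activate
    pip install -r requirements-development.txt

    # step 4: run the collect script (the spiders)
    ./run-spiders.sh

    # step 5: run the consolidation script
    ./run.sh

    # step 6: start the scraping service, then check the web interface
    ./web.sh
    curl http://localhost:5000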