A simple scraper created to practice async programming.
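
For context, the core async pattern such a scraper practices is fetching several pages concurrently rather than one after another. The sketch below is illustrative only and is not code from this repository; the URLs and function names are placeholders:

```python
# Illustrative sketch only: concurrent page fetching with asyncio + aiohttp.
# Not this repository's code; URLS and fetch() are placeholder names.
import asyncio

import aiohttp

URLS = [
    "https://example.com/page/1",
    "https://example.com/page/2",
]


async def fetch(session: aiohttp.ClientSession, url: str) -> str:
    # Awaiting the request yields control to the event loop instead of blocking.
    async with session.get(url) as response:
        return await response.text()


async def main() -> None:
    async with aiohttp.ClientSession() as session:
        # Schedule every request at once and wait for all results.
        pages = await asyncio.gather(*(fetch(session, url) for url in URLS))
        for url, html in zip(URLS, pages):
            print(url, len(html))


if __name__ == "__main__":
    asyncio.run(main())
```
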
- Copy the .env.example file as .env:
  cp .env.example .env
- Run docker-compose:
  docker-compose up
- Copy the .env.example file as .env:
  cp .env.example .env
- Create and activate a virtual environment:
  python3 -m venv venv
  . venv/bin/activate
- Install the required dependencies:
  pip install -r requirements.txt
- Copy the .env.example file as .env:
  cp .env.example .env
- Start the server:
  python server.py
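
Purely for illustration, a minimal async server exposing the two endpoints used below could be wired up with aiohttp.web roughly as follows. This is not the repository's actual server.py; the real framework, handlers, and storage may differ:

```python
# Illustrative sketch of an async HTTP server with /scrap and /cars endpoints.
# Not this repository's server.py; handlers here only echo their input.
from aiohttp import web


async def scrap(request: web.Request) -> web.Response:
    payload = await request.json()
    # A real handler would start the scraping coroutines here.
    return web.json_response({"scheduled": payload})


async def cars(request: web.Request) -> web.Response:
    # A real handler would filter stored results by the query parameters.
    return web.json_response({"query": dict(request.query)})


def main() -> None:
    app = web.Application()
    app.add_routes([web.post("/scrap", scrap), web.get("/cars", cars)])
    web.run_app(app, port=8080)


if __name__ == "__main__":
    main()
```
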

To populate the database:
curl -X POST localhost:8080/scrap -d '{"brand":"bmw", "model":"x3"}'
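
The same request can also be issued from Python. A minimal sketch using aiohttp as the client (any HTTP client works equally well), assuming the server from this README is running on localhost:8080; it sends the same brand/model JSON body as the curl call above:

```python
# Sketch of calling the /scrap endpoint from Python with aiohttp.
# Assumes the server is running locally on port 8080 as described above.
import asyncio

import aiohttp


async def trigger_scrape(brand: str, model: str) -> str:
    async with aiohttp.ClientSession() as session:
        # Same JSON payload as the curl example above.
        async with session.post(
            "http://localhost:8080/scrap",
            json={"brand": brand, "model": model},
        ) as response:
            return await response.text()


if __name__ == "__main__":
    print(asyncio.run(trigger_scrape("bmw", "x3")))
```
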

To list scraped data:
curl -X GET -g 'localhost:8080/cars?brand=bmw&model=x3'

Additional query parameters: price[lte] and price[gte] (upper and lower price bounds).
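
The listing query can likewise be made from Python. A sketch under the same assumptions, with the price[lte] filter passed as an ordinary query parameter (the 50000 value is just an arbitrary example):

```python
# Sketch of querying the /cars endpoint from Python with aiohttp.
# The price[lte]/price[gte] keys are passed as plain query parameters.
import asyncio

import aiohttp


async def list_cars(brand: str, model: str, max_price=None) -> str:
    params = {"brand": brand, "model": model}
    if max_price is not None:
        # Upper price bound, mirroring the price[lte] parameter above.
        params["price[lte]"] = str(max_price)
    async with aiohttp.ClientSession() as session:
        async with session.get(
            "http://localhost:8080/cars", params=params
        ) as response:
            return await response.text()


if __name__ == "__main__":
    print(asyncio.run(list_cars("bmw", "x3", max_price=50000)))
```
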