Commit 0773855e authored by Yoann Schneider, committed by Mélodie Boillet

Rename Arkindex URL and remove some obsolete Arkindex related requirements in unit tests

parent d38bab3e
1 merge request: !228 Rename Arkindex URL and remove some obsolete Arkindex related requirements in unit tests
@@ -34,14 +34,10 @@ test:
   variables:
     PIP_CACHE_DIR: "$CI_PROJECT_DIR/.cache/pip"
-    ARKINDEX_API_SCHEMA_URL: schema.yml
   before_script:
     - pip install tox
-    # Download OpenAPI schema from last backend build
-    - curl https://assets.teklia.com/arkindex/openapi.yml > schema.yml
     # Add system deps for opencv
     - apt-get update -q
     - apt-get install -q -y libgl1
@@ -6,7 +6,7 @@ There are a several steps to follow when training a DAN model.
 The data must be extracted and formatted for training. To extract the data, DAN uses an Arkindex export database in SQLite format. You will need to:
-1. Structure the data into folders (`train` / `val` / `test`) in [Arkindex](https://arkindex.teklia.com/).
+1. Structure the data into folders (`train` / `val` / `test`) in [Arkindex](https://demo.arkindex.org/).
 1. [Export the project](https://doc.arkindex.org/howto/export/) in SQLite format.
 1. Extract the data with the [extract command](../usage/datasets/extract.md).
 1. Format the data with the [format command](../usage/datasets/format.md).
@@ -3,7 +3,7 @@
 Two operations are available through subcommands:

 `teklia-dan dataset extract`
-: To extract a dataset from Arkindex using its [Python API](https://arkindex.teklia.com/api-docs/). More details in [the dedicated section](./extract.md).
+: To extract a dataset from Arkindex using its [Python API](https://demo.arkindex.org/api-docs/). More details in [the dedicated section](./extract.md).

 `teklia-dan dataset format`
 : To format datasets for training. More details in [the dedicated section](./format.md).
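For quick reference, a minimal sketch of how these two subcommands can be invoked from a shell once `teklia-dan` is installed; the `--help` flag is an assumption based on a standard command-line interface, and the real arguments are described in the linked extract and format sections:

# Illustrative only: list the available options of each subcommand.
teklia-dan dataset extract --help
teklia-dan dataset format --help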
 # -*- coding: utf-8 -*-
-import os
 from pathlib import Path

 import pytest
@@ -14,22 +13,6 @@ from dan.transforms import Preprocessing
 FIXTURES = Path(__file__).resolve().parent / "data"

-@pytest.fixture(autouse=True)
-def setup_environment(responses):
-    """Setup needed environment variables"""
-    # Allow accessing remote API schemas
-    # defaulting to the prod environment
-    schema_url = os.environ.get(
-        "ARKINDEX_API_SCHEMA_URL",
-        "https://arkindex.teklia.com/api/v1/openapi/?format=openapi-json",
-    )
-    responses.add_passthru(schema_url)
-
-    # Set schema url in environment
-    os.environ["ARKINDEX_API_SCHEMA_URL"] = schema_url
-
 @pytest.fixture
 def database_path():
     return FIXTURES / "export.sqlite"
@@ -2,11 +2,9 @@
 envlist = teklia-dan

 [testenv]
-passenv = ARKINDEX_API_SCHEMA_URL
 commands =
     pytest {posargs}
 deps =
     pytest
-    pytest-responses
     -rrequirements.txt
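After this change, the test environment no longer needs the `ARKINDEX_API_SCHEMA_URL` variable, the `pytest-responses` passthrough, or a downloaded OpenAPI schema. A minimal sketch of the assumed local workflow, mirroring the CI `before_script`:

# Assumed local run (not part of this commit): install tox and execute the
# unit tests without any Arkindex schema setup.
pip install tox
tox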