The server used to run the Keras models submitted by the AI Challenge participant teams.
This uses recent versions of:
- tensorflow
- aiohttp with Python 3
- Make a copy of `example.conf` and enter team names and models like in the example
- Run `conda env create -f environment.yml` to download the requirements
- Run `server.py`
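The actual format is defined by `example.conf` in the repository; purely as an illustration, a team-to-model mapping might look something like this (the section name, team names, and model paths below are hypothetical, not taken from the real file):

```ini
# Hypothetical sketch only -- copy example.conf and follow its real keys
[teams]
TeamAlpha = models/team_alpha.h5
TeamBeta  = models/team_beta.h5
```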
If you want to serve the stream frontend as well, you will need to clone using `git clone --recurse-submodules https://github.com/SAC-PhoeniX/AI-Challenge-22-Server.git`. Before running `server.py`, do

```shell
$ cd AI-Challenge-22-Server/stream
$ yarn        # or: npm i .
$ yarn build  # or: npm run build
```

to build the frontend files. The files will be served if they are found.
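Since the server is built on aiohttp, the "served if found" behavior can be sketched roughly as below. The `stream/dist` build-output path and the `make_app` helper are assumptions for illustration, not the repository's actual code:

```python
from pathlib import Path

from aiohttp import web


def make_app(dist_dir: str = "stream/dist") -> web.Application:
    """Hypothetical sketch: register the frontend only when a build exists."""
    app = web.Application()
    dist = Path(dist_dir)
    if dist.is_dir():
        # Serve the yarn/npm build output as static files
        app.router.add_static("/", dist)
    return app


app = make_app()
print("frontend found:", len(list(app.router.routes())) > 0)
```

If the directory is missing, no static route is registered and the server simply runs without the frontend.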
If you need to run the server to test the endpoints, you can set the `MODELS` environment variable to `NO_TF` before running `server.py`. To do so, use these commands:

On Linux/OSX:

```shell
MODELS=NO_TF python server.py
```

On Windows in PowerShell:

```powershell
$env:MODELS='NO_TF'
python server.py
# do after testing:
$env:MODELS="TF"
```
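Inside `server.py`, this flag presumably gates the TensorFlow model loading; a minimal stdlib sketch of such a check (the helper name is an assumption, not the repository's actual code) might be:

```python
import os


def tensorflow_enabled() -> bool:
    # Hypothetical helper mirroring how server.py might honor MODELS=NO_TF;
    # anything other than NO_TF (including an unset variable) enables models.
    return os.environ.get("MODELS", "TF") != "NO_TF"


os.environ["MODELS"] = "NO_TF"
print(tensorflow_enabled())  # prints False: model loading would be skipped
```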