aaredb/backend/Dockerfile
GotthardG 1da5634013 Add job processing system with streaming endpoint
Introduced a `processing` router to handle job streaming using server-sent events. Added `Jobs` and `JobStatus` models for managing job-related data, along with database creation logic. Updated the `sample` router to create new job entries during experiment creation.
2025-04-10 22:22:51 +02:00
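For context, the commit above introduces a `processing` router that streams job updates over server-sent events, backed by `Jobs` and `JobStatus` models plus table-creation logic, with the `sample` router creating job entries during experiment creation. The sketch below illustrates what such an endpoint could look like; it assumes a FastAPI/SQLAlchemy backend, and everything except the `Jobs` and `JobStatus` names (the SQLite URL, column names, the `get_db` helper, the polling interval, the route path) is illustrative rather than taken from this repository.

import asyncio
import enum
import json

from fastapi import APIRouter, Depends
from fastapi.responses import StreamingResponse
from sqlalchemy import Column, Enum, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base, sessionmaker

Base = declarative_base()
engine = create_engine("sqlite:///./jobs.db")            # placeholder database URL
SessionLocal = sessionmaker(bind=engine)

class JobStatus(str, enum.Enum):
    TODO = "todo"
    IN_PROGRESS = "in_progress"
    DONE = "done"

class Jobs(Base):
    __tablename__ = "jobs"
    id = Column(Integer, primary_key=True, index=True)
    status = Column(Enum(JobStatus), default=JobStatus.TODO, nullable=False)
    parameters = Column(String)                           # illustrative payload column

Base.metadata.create_all(bind=engine)                     # stand-in for the database creation logic

def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

router = APIRouter(prefix="/processing", tags=["processing"])

async def job_event_stream(db: Session):
    # Poll the jobs table and emit each snapshot as one server-sent event.
    while True:
        jobs = db.query(Jobs).filter(Jobs.status == JobStatus.TODO).all()
        payload = json.dumps([{"id": j.id, "status": j.status.value} for j in jobs])
        yield f"data: {payload}\n\n"
        await asyncio.sleep(5)

@router.get("/jobs/stream")
async def stream_jobs(db: Session = Depends(get_db)):
    return StreamingResponse(job_event_stream(db), media_type="text/event-stream")

Under this sketch, the `sample` router change mentioned in the commit would only need to insert a `Jobs` row with status TODO when an experiment is created; the stream picks it up on the next poll.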

# Use the Debian 11 (bullseye) slim base image with Python 3.12
FROM python:3.12-slim-bullseye
# Set the working directory in the container
WORKDIR /app
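# Create the directory that holds the backend's SSL certificates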
RUN mkdir -p /app/backend/ssl
# Install system dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
    wget \
    default-jre \
    build-essential \
    unixodbc-dev \
    libmariadb-dev \
    libssl-dev \
    gcc \
    curl \
    gpg && \
    rm -rf /var/lib/apt/lists/*
# PostgreSQL may also need to be installed and run inside this container in the future
# Download and install the msodbcsql18 driver for the arm64-compatible Debian 11 base image
# (a pyodbc usage sketch follows this Dockerfile)
RUN apt-get update && apt-get install -y --no-install-recommends unixodbc-dev curl apt-transport-https gnupg && \
    curl -sSL https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > /etc/apt/trusted.gpg.d/microsoft.asc.gpg && \
    curl -sSL https://packages.microsoft.com/config/debian/11/prod.list > /etc/apt/sources.list.d/mssql-release.list && \
    apt-get update && ACCEPT_EULA=Y apt-get install -y msodbcsql18 unixodbc && \
    rm -rf /var/lib/apt/lists/*
# Install pyodbc
RUN pip install pyodbc
# Copy the requirements file
COPY requirements.txt /app/requirements.txt
# Install Python dependencies
RUN pip install --no-cache-dir -r requirements.txt
# Copy the rest of the application code
COPY . /app
# Expose the application port
EXPOSE 8000
# Command to run the application
CMD ["python", "main.py"]
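The msodbcsql18 package installed above registers the "ODBC Driver 18 for SQL Server" driver that pyodbc loads at runtime. A minimal connection sketch follows; the host, database name, and credentials are placeholders, not values from this project.

import pyodbc

# "ODBC Driver 18 for SQL Server" is the driver name registered by the msodbcsql18 package.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=db.example.com,1433;"         # placeholder host and port
    "DATABASE=aaredb;"                    # placeholder database name
    "UID=app_user;PWD=change-me;"         # placeholder credentials
    "Encrypt=yes;TrustServerCertificate=yes;"
)
cursor = conn.cursor()
cursor.execute("SELECT @@VERSION")        # simple round-trip to verify the connection
print(cursor.fetchone()[0])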