hello-genai

A simple chatbot web application built in Go, Python, Node.js, and Rust that connects to a local LLM service (llama.cpp) to provide AI-powered responses.

Environment Variables

The application uses the following environment variables defined in the .env file:

  • LLM_BASE_URL: The base URL of the LLM API
  • LLM_MODEL_NAME: The model name to use

To change these settings, simply edit the .env file in the root directory of the project.
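For example, a minimal .env file might look like the following. Both values are illustrative placeholders, not values shipped with the repository; point them at your own LLM server and model:

```
# Illustrative placeholders -- replace with your LLM server's base URL and model name
LLM_BASE_URL=http://localhost:12434/engines/v1
LLM_MODEL_NAME=ai/smollm2
```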

Quick Start

  1. Clone the repository:

    git clone https://github.com/docker/hello-genai
    cd hello-genai
    
  2. Start the application using Docker Compose:

    docker compose up
    
  3. Open your browser and visit the following links:

    http://localhost:8080 for the GenAI Application in Go

    http://localhost:8081 for the GenAI Application in Python

    http://localhost:8082 for the GenAI Application in Node

    http://localhost:8083 for the GenAI Application in Rust
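Each of the apps above talks to the configured LLM over HTTP using the two environment variables. As a rough sketch of that connection, here is a minimal standalone client, assuming the server exposes an OpenAI-compatible /chat/completions route (as llama.cpp's server does); the helper name `chatEndpoint` is illustrative and not part of the repository:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
	"strings"
)

// chatEndpoint joins the configured base URL with the OpenAI-style
// chat-completions path exposed by llama.cpp-compatible servers.
func chatEndpoint(base string) string {
	return strings.TrimRight(base, "/") + "/chat/completions"
}

func main() {
	base := os.Getenv("LLM_BASE_URL")
	model := os.Getenv("LLM_MODEL_NAME")
	if base == "" {
		fmt.Println("LLM_BASE_URL not set; skipping request")
		return
	}

	// Build a minimal chat-completions request body.
	body, err := json.Marshal(map[string]any{
		"model": model,
		"messages": []map[string]string{
			{"role": "user", "content": "Say hello!"},
		},
	})
	if err != nil {
		fmt.Fprintln(os.Stderr, "encode request:", err)
		os.Exit(1)
	}

	resp, err := http.Post(chatEndpoint(base), "application/json", bytes.NewReader(body))
	if err != nil {
		fmt.Fprintln(os.Stderr, "request failed:", err)
		os.Exit(1)
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```

Run it with LLM_BASE_URL and LLM_MODEL_NAME exported, e.g. with the same values as your .env file.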

Requirements

  • macOS (recent version)
  • Either:
    • Docker and Docker Compose (preferred)
    • Go 1.21 or later (to run the Go app directly)
  • Local LLM server

If you're using a different LLM server configuration, you may need to modify the .env file.