push main

This commit is contained in:
greg 2023-09-29 08:52:00 +02:00
commit 766c0d1ac2
15 changed files with 640 additions and 0 deletions

8
Dockerfile Normal file

@@ -0,0 +1,8 @@
FROM python:3.10-slim-bullseye
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
WORKDIR /
COPY . .
RUN pip3 install -r requirements.txt
WORKDIR /app
CMD ["gunicorn", "-w", "4", "wsgi:app", "--bind", "0.0.0.0:8000"]

99
README.md Normal file

@@ -0,0 +1,99 @@
# GARAGE GPT :alien:
![CHATGPT](https://media2.giphy.com/media/qAtZM2gvjWhPjmclZE/giphy.gif?cid=ecf05e47pji6o2vjk5sa2thp1f8yqjtywlt7vm9m45nqykzx&ep=v1_gifs_search&rid=giphy.gif&ct=g)
A project to host a local Chat-GPT, built on the [LocalAI](https://localai.io/) project
and using a simple Flask app as the frontend.
:warning: only the ggml-gpt4all-j model (gpt-3.5-turbo) is supported for now
## PREREQUISITES :paperclip:
- Docker :whale:
## MODELS :moyai:
- compatible: [LocalAI](https://github.com/go-skynet/model-gallery)
- all: [HuggingFace](https://huggingface.co/models?search=ggml)
## CONFIGURATION :wrench:
Configuration happens in .env.local:
- THREADS -> number of CPU cores to use (stay within the number of physical cores)
<img src="https://indipest.files.wordpress.com/2021/03/bw6d5zz.gif" width="200px">
- DEFAULT_MODEL -> the model loaded by default (into RAM)
- PRELOAD_MODELS -> list the addresses of the models to download, following the pattern https://github.com/go-skynet/model-gallery/model.yaml
:warning: the LocalAI image weighs a bit over 12 GB, and 7B or 13B models average 4 to 6 GB each
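Since PRELOAD_MODELS holds raw JSON inside an env file, a malformed entry only surfaces when LocalAI starts. A quick sanity check can catch it earlier — a sketch in Python, where the sample value mirrors the one shipped in app/.env of this commit:

```python
import json

# Sample PRELOAD_MODELS value, same shape as in app/.env of this commit.
PRELOAD_MODELS = (
    '[{"url":"github:go-skynet/model-gallery/gpt4all-j.yaml","name":"gpt-3.5-turbo"},'
    '{"url":"github:go-skynet/model-gallery/stablediffusion.yaml","name":"stablediffusion"}]'
)

def parse_preload_models(raw: str) -> list[dict]:
    """Parse and sanity-check the PRELOAD_MODELS JSON before handing it to LocalAI."""
    models = json.loads(raw)
    for entry in models:
        if "url" not in entry or "name" not in entry:
            raise ValueError(f"each entry needs 'url' and 'name': {entry!r}")
    return models

models = parse_preload_models(PRELOAD_MODELS)
print([m["name"] for m in models])  # ['gpt-3.5-turbo', 'stablediffusion']
```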
## USAGE :checkered_flag:
- First launch:
```bash
docker compose up -d
```
:hourglass: wait while the stack builds :coffee:
- The interface is then reachable at:
> http://localhost:8000
- After the first launch:
```bash
nano .env
REBUILD=false
```
> whenever PRELOAD_MODELS changes, set REBUILD=true
#### view the logs
```bash
docker compose logs -f
```
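Independently of the Flask UI, the LocalAI API can be smoke-tested directly. The sketch below mirrors the request body that app/app.py sends, and assumes LocalAI is reachable on localhost:8080 as mapped in docker-compose.yaml:

```python
import requests

LOCALAI_URL = "http://localhost:8080/v1/chat/completions"  # port mapped in docker-compose.yaml

def build_chat_payload(question: str, temperature: float = 0.5) -> dict:
    # Same request body that app/app.py posts to LocalAI.
    return {
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": question}],
        "temperature": temperature,
    }

def ask(question: str, temperature: float = 0.5) -> str:
    """Send one question to LocalAI and return the assistant's answer."""
    response = requests.post(
        LOCALAI_URL, json=build_chat_payload(question, temperature), timeout=300
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]
```

For example, `print(ask("Write a JS function"))` once the stack reports healthy.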
## OIDC :key:
> see the [README](https://git.legaragenumerique.fr/GARAGENUM/flask-keycloak/README.md) in the flask-keycloak project
## WORKING MODELS
:white_check_mark: ggml-gpt4all-j.bin (= gpt-3.5-turbo)
:white_check_mark: stablediffusion (image generator)
- [ ] wizardlm-13b-v1.1-superhot-8k.ggmlv3.q4_0.bin
- [ ] open-llama-7b-q4_0.bin -> currently broken
- [ ] whisper (audio to text)
- [ ] bloomz (translation)
- [ ] wizardcode (code) -> model URL is down
## TEST HARDWARE :computer:
| MODEL | PROMPT | i5-8350U 16G RAM | RYZEN 7 5800X 32G RAM | TEMP |
| :--------------- |:-----------------:| ---------------------:|:----------------:|:------:|
| GPT-3.5-TURBO | WRITE JS FUNCTION | 41s | 16s | 0.5 |
| STABLEDIFFUSION | BLUE FLOWER | 90s | 20s | X |
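The timings above come from wrapping each request in `time.perf_counter()`, as app/app.py does. The same measurement can be factored into a small helper — a sketch, with a stand-in workload in place of a real API call:

```python
import time

def timed(fn, *args, **kwargs):
    """Run fn and return (result, elapsed_seconds), measured with time.perf_counter()."""
    tic = time.perf_counter()
    result = fn(*args, **kwargs)
    toc = time.perf_counter()
    return result, toc - tic

# Stand-in workload instead of a real LocalAI request:
value, seconds = timed(sum, range(1_000_000))
print(f"temps de réponse: {seconds:0.4f} seconds")
```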
## TO DO :bookmark_tabs:
:white_check_mark: a gpt page / a stablediffusion page with a navbar in base.html
:white_check_mark: temperature control
:white_check_mark: code formatting (```js ```)
:white_check_mark: Flask app frontend
:white_check_mark: Keycloak authentication -> https://git.legaragenumerique.fr/GARAGENUM/flask-keycloak
:white_check_mark: wsgi.py for prod + DNS
- [ ] add the Nginx config (ai.domain.tld + image.domain.tld)
- [ ] stop-generating button?
- [ ] home button
- [ ] keep conversation context (sqlite / json / session?)
- [ ] translation via [LibreTranslate](https://github.com/LibreTranslate/LibreTranslate) :gb: -> :fr:
- [ ] restart the container on timeout
- [ ] use the GPU
- [ ] train with a big GPU
### bugs :ghost:
:white_check_mark: models folder permissions (root != user)
:white_check_mark: image URL in prod

19
app/.env Normal file

@@ -0,0 +1,19 @@
#########################LOCALAI#########################
# local-ai is the hostname when the Flask app runs in Docker
LOCALAI_HOST=local-ai
MODELS_PATH=/models
DEBUG=true
REBUILD=false
#THREADS=4
DEFAULT_MODEL=gpt-3.5-turbo
PRELOAD_MODELS=[{"url":"github:go-skynet/model-gallery/gpt4all-j.yaml","name":"gpt-3.5-turbo"},{"url":"github:go-skynet/model-gallery/stablediffusion.yaml","name":"stablediffusion"}]
#DEFAULT_MODEL=wizard-lm
#PRELOAD_MODELS=[{"url":"github:go-skynet/model-gallery/openllama_7b.yaml","name":"open_llama"}]
#GALLERIES=[{"name":"model-gallery","url":"github:go-skynet/model-gallery/index.yaml"}]

118
app/app.py Normal file

@@ -0,0 +1,118 @@
import os, re
import requests, time
from dotenv import load_dotenv
from flask_oidc import OpenIDConnect
from flask import Flask, redirect, render_template, request, url_for
app = Flask(__name__)
model = "ggml-gpt4all-j.bin"
load_dotenv()
host = os.getenv("LOCALAI_HOST")
############################### KEYCLOAK ###############################
# app.config.update({
# # PROD ONLY
# 'SECRET_KEY': 'créer-un-secret-ici',
# 'OIDC_CLIENT_SECRETS': 'client_secrets_prod.json',
# 'OIDC_ID_TOKEN_COOKIE_SECURE': False,
# 'OIDC_REQUIRE_VERIFIED_EMAIL': False,
# 'OIDC_USER_INFO_ENABLED': True,
# 'OIDC_OPENID_REALM': 'gregan',
# 'OIDC_SCOPES': ['openid', 'email', 'profile'],
# 'OIDC_INTROSPECTION_AUTH_METHOD': 'client_secret_post'
# })
# app.config['OVERWRITE_REDIRECT_URI'] = 'https://chat-gpt.domain.tld/oidc_callback'
# oidc = OpenIDConnect(app)
# @app.context_processor
# def inject_oidc_user():
# if oidc.user_loggedin:
# return dict(oidc_user=oidc.user_getfield('email'))
# return dict(oidc_user=None)
# CHAT BOT: GPT-3.5-TURBO
@app.route("/", methods=("GET", "POST"))
# @oidc.require_login
def index():
    result = ''
    temps = ''
    if request.method == "POST":
        question = request.form["question"]
        temp = request.form["temperature"]
        url = "http://" + host + ":8080/v1/chat/completions"
        payload = {
            # optional system prompt, e.g.:
            # "messages": [{"role": "system", "content": "You are Yoda, the Star Wars character, and you answer as he would."}, ...],
            "model": 'gpt-3.5-turbo',
            "messages": [{"role": "user", "content": question}],
            "temperature": float(temp)
        }
        tic = time.perf_counter()
        response = requests.post(url, json=payload)
        if response.status_code == 200:
            result = '<md>' + response.json()['choices'][0]['message']['content'] + '</md>'
        else:
            result = "Erreur de connexion avec l'API"
        toc = time.perf_counter()
        temps = f"temps de réponse: {toc - tic:0.4f} seconds"
    return render_template("index.html", result=result, time=temps)
# IMAGE GENERATOR: STABLEDIFFUSION
@app.route("/image", methods=("GET", "POST"))
# @oidc.require_login
def image():
    result = ''
    temps = ''
    if request.method == "POST":
        question = request.form["image"]
        url = "http://" + host + ":8080/v1/images/generations"
        headers = {
            "Content-Type": "application/json"
        }
        data = {
            "prompt": question,
            "size": "256x256",
            "directory": "/tmp"
        }
        tic = time.perf_counter()
        response = requests.post(url, headers=headers, json=data)
        if response.status_code == 200:
            ########## PROD ONLY ##########
            # image_url = response.json()['data'][0]['url'].replace("local-ai", "image.domain.tld").replace("http", "https").replace(":8080", "")
            image_url = response.json()['data'][0]['url'].replace("local-ai", "localhost")
            # build the <img> tag only on success, otherwise image_url is unbound
            result = '<img src="' + image_url + '">'
        else:
            result = "Erreur de connexion avec l'API"
        toc = time.perf_counter()
        temps = f"temps de réponse: {toc - tic:0.4f} seconds"
    return render_template("image.html", result=result, time=temps)

BIN
app/static/favicon.ico Normal file

Binary file not shown.

Width:  |  Height:  |  Size: 254 KiB

BIN
app/static/loader.gif Normal file

Binary file not shown.

Width:  |  Height:  |  Size: 370 KiB

BIN
app/static/logo-4.png Normal file

Binary file not shown.

Width:  |  Height:  |  Size: 6.1 KiB

213
app/static/main.css Normal file

@@ -0,0 +1,213 @@
@font-face {
font-family: "ColfaxAI";
src: url(https://cdn.openai.com/API/fonts/ColfaxAIRegular.woff2)
format("woff2"),
url(https://cdn.openai.com/API/fonts/ColfaxAIRegular.woff) format("woff");
font-weight: normal;
font-style: normal;
}
@font-face {
font-family: "ColfaxAI";
src: url(https://cdn.openai.com/API/fonts/ColfaxAIBold.woff2) format("woff2"),
url(https://cdn.openai.com/API/fonts/ColfaxAIBold.woff) format("woff");
font-weight: bold;
font-style: normal;
}
body,
input {
font-size: 16px;
line-height: 24px;
color: #353740;
font-family: "ColfaxAI", Helvetica, sans-serif;
}
body {
background-color: #24262B;
/* background: url("matrix.gif"); */
display: flex;
flex-direction: column;
align-items: center;
/* padding-top: 60px; */
min-height: 100%;
text-align: center;
}
.icon {
width: 34px;
}
h3 {
font-size: 32px;
line-height: 40px;
font-weight: bold;
color: #fff;
margin: 16px 0 40px;
}
form {
margin-top: 10px;
display: flex;
flex-direction: column;
width: 400px;
text-align: center;
}
input[type="text"] {
padding: 12px 16px;
border: 1px solid #10a37f;
border-radius: 4px;
margin-top: 5px;
margin-bottom: 24px;
}
::placeholder {
color: #8e8ea0;
opacity: 1;
}
input[type="submit"] {
padding: 12px 0;
color: #fff;
background-color: #10a37f;
border: none;
border-radius: 4px;
text-align: center;
cursor: pointer;
}
.result {
color: #10a37f;
max-width: 80%;
max-height: 80%;
border: 1px solid #10a37f;
border-radius: 4px;
cursor: pointer;
font-weight: bold;
margin-top: 40px;
margin-left: 50px;
margin-right: 50px;
margin-bottom: 30px;
}
label {
color: black;
}
.spinner {
padding-top: 250px;
width: 30%;
}
.footer {
padding: 10px 0;
/* position: fixed; */
color: #fff;
bottom: 0;
left: 0;
width: 100%;
text-align: center;
}
select, input {
margin-bottom: 20px;
text-align: center;
}
option {
text-align: center;
}
.settings {
border: 1px solid #10a37f;
border-radius: 4px;
text-align: center;
cursor: pointer;
margin-bottom: 10px;
}
.temps {
color: #fff;
text-align: center;
font-size: 10px;
}
div {
margin-left: 10px;
margin-top: 10px;
align-items: center;
}
.jumbotron {
height: auto;
padding-top: 0.3em;
padding-bottom: 0.3em;
margin: 20px;
background-color: #E9ECEF;
}
.container {
display: flex;
flex-direction: column;
justify-content: center; /* Centre verticalement */
align-items: center; /* Centre horizontalement */
}
a.nav-link {
color: cornflowerblue;
font-size: 30px;
}
pre {
background-color: #f4f4f4;
margin-left: 5px;
margin-right: 5px;
margin-top: 5px;
border: 1px solid #000;
padding: 10px;
white-space: pre;
}
button {
padding: 12px 10px;
color: #fff;
background-color: #10a37f;
border: none;
border-radius: 4px;
text-align: center;
cursor: pointer;
}
p {
margin-left: 5px;
margin-right: 5px;
margin-top: 5px;
}
#loading-spinner {
display: flex;
align-items: center;
justify-content: center;
height: 100vh; /* Hauteur égale à la hauteur de la fenêtre */
position: fixed;
width: 100%;
background-color: rgba(255, 255, 255, 0.8); /* Fond semi-transparent */
z-index: 9999; /* S'assurer qu'il apparaît au-dessus du contenu */
}
/* #loading-spinner::after {
content: '';
display: inline-block;
width: 50px;
height: 50px;
border: 5px solid #3498db;
border-radius: 50%;
border-top: 5px solid #f3f3f3;
} */
/* @keyframes spin {
0% { transform: rotate(0deg); }
100% { transform: rotate(360deg); }
} */

BIN
app/static/matrix.gif Normal file

Binary file not shown.

Width:  |  Height:  |  Size: 611 KiB

93
app/templates/base.html Normal file

@@ -0,0 +1,93 @@
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<!-- Bootstrap CSS -->
<link rel="stylesheet" href="https://stackpath.bootstrapcdn.com/bootstrap/4.3.1/css/bootstrap.min.css" integrity="sha384-ggOyR0iXCbMQv3Xipma34MD+dH/1fQ784/j6cY/iJTQUOhcWr7x9JvoRxT2MZw1T" crossorigin="anonymous">
<link rel="stylesheet" href="{{ url_for('static', filename='main.css') }}" />
<link rel="shortcut icon" href="{{ url_for('static', filename='favicon.ico') }}">
<title>Gregan AI</title>
</head>
<body>
<div id="loading-spinner">
<img src="{{ url_for('static', filename='loader.gif') }}" class="spinner">
<p>Loading ...</p>
</div>
<h3>Gregan AI</h3>
<div class="container">
<nav class="navbar navbar-expand-lg navbar-light bg-light jumbotron">
<!-- <a class="navbar-brand" href="#">Navbar</a> -->
<button class="navbar-toggler" type="button" data-toggle="collapse" data-target="#navbarSupportedContent" aria-controls="navbarSupportedContent" aria-expanded="false" aria-label="Toggle navigation">
<span class="navbar-toggler-icon"></span>
</button>
<div class="collapse navbar-collapse" id="navbarSupportedContent">
<ul class="navbar-nav mr-auto">
{% if request.path != '/' %}
<li class="nav-item active">
<a class="nav-link" href="{{ url_for('index') }}">CHAT <span class="sr-only">(current)</span></a>
</li>
{% endif %}
{% if request.path != '/image' %}
<li class="nav-item active">
<a class="nav-link" href="{{ url_for('image') }}">IMAGE <span class="sr-only">(current)</span></a>
</li>
{% endif %}
</ul>
</div>
</nav>
</div>
<div class="container jumbotron">
{% block content %}
{% endblock %}
<button id="copyButton" style="display: none;" onclick="copyCodeToClipboard()">Copier le code</button>
</div>
<div class="footer">
<div class="temps">{{ time }}</div>
<div>
<img src="{{ url_for('static', filename='logo-4.png') }}" style="max-width: 50%;">
</div>
</div>
<!-- jQuery first, then Popper.js, then Bootstrap JS -->
<script src="https://code.jquery.com/jquery-3.3.1.slim.min.js" integrity="sha384-q8i/X+965DzO0rT7abK41JStQIAqVgRVzpbzo5smXKp4YfRvH+8abtTE1Pi6jizo" crossorigin="anonymous"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/popper.js/1.14.7/umd/popper.min.js" integrity="sha384-UO2eT0CpHqdSJQ6hJty5KVphtPhzWj9WO1clHTMGa3JDZwrnQq4sF86dIHNDz0W1" crossorigin="anonymous"></script>
<script src="https://stackpath.bootstrapcdn.com/bootstrap/4.3.1/js/bootstrap.min.js" integrity="sha384-JjSmVgyd0p3pXB1rRibZUAYoIIy6OrQ6VrjIEaFf/nJGzIxFDsf4x0xIM+B07jRM" crossorigin="anonymous"></script>
<script src="https://cdn.jsdelivr.net/gh/MarketingPipeline/Markdown-Tag/markdown-tag-GitHub.js"></script>
<script>
const myForm = document.getElementById("form");
const loadingSpinner = document.getElementById("loading-spinner");
myForm.addEventListener("submit", function (event) {
loadingSpinner.style.display = "block";
});
function onPageLoaded() {
loadingSpinner.style.display = "none";
if (document.getElementsByTagName("code").length > 0)
{
document.getElementById("copyButton").style.display = "block"; // show the copy button
}
}
function copyCodeToClipboard() {
var codeToCopy = document.querySelector("code");
var textArea = document.createElement("textarea");
textArea.value = codeToCopy.textContent;
document.body.appendChild(textArea);
textArea.select();
document.execCommand("copy");
document.body.removeChild(textArea);
}
window.addEventListener("load", onPageLoaded);
</script>
</body>
</html>

14
app/templates/image.html Normal file

@@ -0,0 +1,14 @@
{% extends "base.html" %}
{% block content %}
<form action="/image" method="post" id="form">
<input type="text" name="image" placeholder="Description de l'image ici!" required />
<input type="submit" value="Go!" />
</form>
{% if result %}
<div class="result">{{ result|safe }}</div>
{% endif %}
{% endblock %}

29
app/templates/index.html Normal file

@@ -0,0 +1,29 @@
{% extends "base.html" %}
{% block content %}
<form action="/" method="post" id="form">
<div class="settings">
<!-- <label for="model">Sélectionnez un model :</label><br>
<select id="model" name="model">
{% for e in menu %}
<option value={{ e }}>{{ e }}</option>
{% endfor %}
</select> -->
<br>
<label for="valeur">Sélectionnez la température :</label>
<input type="range" id="valeur" name="temperature" min="0" max="1" step=".1">
<br>
<input type="text" name="question" placeholder="Ecrire la question ici!" required />
<br>
</div>
<br>
<input type="submit" value="Go!" />
<br>
</form>
{% if result %}
<div class="result">{{ result|safe }}</div>
{% endif %}
{% endblock %}

3
app/wsgi.py Normal file

@@ -0,0 +1,3 @@
from app import app  # imports the Flask instance defined in app/app.py
if __name__ == "__main__":
app.run()

33
docker-compose.yaml Normal file

@@ -0,0 +1,33 @@
version: '3.6'
services:
  local-ai:
    image: quay.io/go-skynet/local-ai:latest
    container_name: local-ai
    restart: always
    ports:
      - 8080:8080
    env_file:
      - .env
    volumes:
      - ./models:/models:cached
      - ./images:/tmp/generated/images
    # exec form cannot chain commands, so run both through a shell
    command: ["/bin/sh", "-c", "chmod -R 777 /tmp && /usr/bin/local-ai"]
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8080/readyz"]
      interval: 1m
      timeout: 20m
      retries: 20
  flask-ui:
    build:
      context: .
    image: local/flask-ui:1.0
    container_name: flask-ui
    restart: always
    ports:
      - 8000:8000
    depends_on:
      local-ai:
        condition: service_healthy
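flask-ui only starts once the local-ai healthcheck passes. The same /readyz probe can be scripted when driving the stack from outside compose — a sketch, assuming the port mapping above:

```python
import time
import urllib.error
import urllib.request

READY_URL = "http://localhost:8080/readyz"  # endpoint used by the compose healthcheck

def is_ready(status: int) -> bool:
    # curl -f (used in the healthcheck) succeeds on 2xx responses
    return 200 <= status < 300

def wait_until_ready(url: str = READY_URL, retries: int = 20, delay: float = 60.0) -> bool:
    """Poll /readyz roughly the way the compose healthcheck does (20 retries, 1m apart)."""
    for _ in range(retries):
        try:
            with urllib.request.urlopen(url, timeout=20) as resp:
                if is_ready(resp.status):
                    return True
        except (urllib.error.URLError, OSError):
            pass  # not up yet, retry after the delay
        time.sleep(delay)
    return False
```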

11
requirements.txt Normal file

@@ -0,0 +1,11 @@
Flask==2.0.2
Jinja2==3.0.2
MarkupSafe==2.0.1
openpyxl==3.0.9
requests==2.26.0
urllib3==1.26.7
Werkzeug==2.0.2
gunicorn==21.2.0
itsdangerous==2.0.1
flask_oidc==1.4.0
python-dotenv