
feat(api): add the various endpoints #54

Open · wants to merge 14 commits into main
Conversation

@maxgfr (Member) commented Dec 24, 2024

fix #44

@RealVidy (Contributor) left a comment

It would be worth adding the commands for running the web server and the tests to the README.

@RealVidy (Contributor) left a comment

We could perhaps have broken this PR into 3 or 4 smaller ones to make it easier to review.

For example, by putting llm_processor, llm_runner, and main + launcher in separate PRs.

srdt_analysis/api/main.py Outdated Show resolved Hide resolved
srdt_analysis/api/main.py Outdated Show resolved Hide resolved
srdt_analysis/api/main.py Outdated Show resolved Hide resolved
srdt_analysis/api/main.py Outdated Show resolved Hide resolved
srdt_analysis/api/main.py Outdated Show resolved Hide resolved
srdt_analysis/llm_runner.py Outdated Show resolved Hide resolved
srdt_analysis/llm_processor.py Outdated Show resolved Hide resolved
queries_splitting_prompt, rephrased_question
)

query_list = [q.strip() for q in queries.split("\n") if q.strip()]
Contributor commented:

This works for now, but in my opinion we will need to rework it, because splitting on "\n" won't necessarily work every time.
Perhaps by using Pydantic to force the LLM's response into the format we want and automatically extract the queries.
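A minimal sketch of that suggestion, assuming Pydantic v2 and an LLM instructed to answer in JSON; the `SplitQueries` model, its `queries` field, and the fallback behaviour are all hypothetical, not part of this PR:

```python
from pydantic import BaseModel, ValidationError


class SplitQueries(BaseModel):
    """Hypothetical schema for the LLM's query-splitting answer."""

    queries: list[str]


def parse_queries(raw_llm_output: str) -> list[str]:
    """Validate the LLM's JSON answer instead of blindly splitting on newlines."""
    try:
        parsed = SplitQueries.model_validate_json(raw_llm_output)
    except ValidationError:
        # Fallback for non-JSON answers: the current newline split.
        return [q.strip() for q in raw_llm_output.split("\n") if q.strip()]
    return [q.strip() for q in parsed.queries if q.strip()]
```

With this shape, a malformed or multi-line answer degrades to the current behaviour instead of silently producing bad queries.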

@maxgfr (Member, Author) replied:

Yes, absolutely! I did it this way because it was the simplest. I saw that the ChatGPT API lets you pass functions that take care of formatting the output: https://platform.openai.com/docs/api-reference/chat/create#chat-create-tools

It would be really worthwhile to set that up.
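For reference, the `tools` parameter linked above is a JSON-Schema function description. A sketch of what such a payload could look like for query splitting; the function name `return_queries` and its schema are illustrative assumptions, not taken from this PR:

```python
# Hypothetical tool definition for the Chat Completions `tools` parameter.
# The function name and schema are illustrative, not from this PR.
def build_query_splitting_tool() -> dict:
    return {
        "type": "function",
        "function": {
            "name": "return_queries",
            "description": "Return the user question split into search queries.",
            "parameters": {
                "type": "object",
                "properties": {
                    "queries": {
                        "type": "array",
                        "items": {"type": "string"},
                    }
                },
                "required": ["queries"],
            },
        },
    }
```

The payload would then be passed as `tools=[build_query_splitting_tool()]` to the chat completion call, and the structured arguments read back from the model's tool call instead of parsing free text.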

@maxgfr (Member, Author) replied:

TODO for the future

srdt_analysis/models.py Show resolved Hide resolved
srdt_analysis/tokenizer.py Outdated Show resolved Hide resolved
socket-security bot commented Jan 9, 2025

Successfully merging this pull request may close these issues.

Trigger the RAG pipeline via API (and 4 endpoints)