- RAG Specialist fully implemented in the new style

- Selection Specialist - VA version - fully implemented
- Correction of TRAICIE_ROLE_DEFINITION_SPECIALIST - adapted to the new style
- Removal of 'debug' statements
Author: Josako
Date: 2025-07-10 10:39:42 +02:00
parent 509ee95d81
commit 51fd16bcc6
40 changed files with 110 additions and 298 deletions

View File

@@ -4,7 +4,8 @@ role: >
 {tenant_name} Spokesperson. {custom_role}
 goal: >
 You get questions from a human correspondent, and give answers based on a given context, taking into account the history
-of the current conversation. {custom_goal}
+of the current conversation.
+{custom_goal}
 backstory: >
 You are the primary contact for {tenant_name}. You are known by {name}, and can be addressed by this name, or you. You are
 a very good communicator, and adapt to the style used by the human asking for information (e.g. formal or informal).
@@ -13,7 +14,7 @@ backstory: >
 language the context provided to you is in. You are participating in a conversation, not writing e.g. an email. Do not
 include a salutation or closing greeting in your answer.
 {custom_backstory}
-full_model_name: "mistral.mistral-small-latest"
+full_model_name: "mistral.mistral-medium-latest"
 temperature: 0.3
 metadata:
 author: "Josako"
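
The agent definition above is essentially a set of format-string templates ({tenant_name}, {custom_role}, {custom_goal}, ...) plus model settings. As a rough illustration only, a minimal Python sketch of how such a file might be loaded and rendered, assuming PyYAML is available and that full_model_name follows a `<provider>.<model>` convention; the function and sample values are hypothetical, not the project's code.

```python
# Hypothetical sketch: loading and rendering an agent definition like the one
# in the diff above. Field names mirror the diff; the loader and the
# provider/model split are assumptions, not the project's actual code.
import yaml  # PyYAML

AGENT_YAML = """
role: >
  {tenant_name} Spokesperson. {custom_role}
goal: >
  You get questions from a human correspondent, and give answers based on a given context,
  taking into account the history of the current conversation.
  {custom_goal}
full_model_name: "mistral.mistral-medium-latest"
temperature: 0.3
"""

def render_agent(custom_role: str, custom_goal: str, tenant_name: str) -> dict:
    cfg = yaml.safe_load(AGENT_YAML)
    # Fill the {placeholders}; keeping {custom_goal} on its own line (the change
    # in this commit) avoids gluing it to the previous sentence when it is empty.
    rendered = {
        "role": cfg["role"].format(tenant_name=tenant_name, custom_role=custom_role),
        "goal": cfg["goal"].format(custom_goal=custom_goal),
        "temperature": cfg["temperature"],
    }
    # Assumption: "mistral.mistral-medium-latest" encodes "<provider>.<model>".
    provider, model = cfg["full_model_name"].split(".", 1)
    rendered["provider"], rendered["model"] = provider, model
    return rendered

if __name__ == "__main__":
    print(render_agent("Answer HR questions.", "Stay concise.", "Acme"))
```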

View File

@@ -1,13 +1,17 @@
 version: "1.0.0"
 content: >
-Check if additional information or questions are available in the following answer (answer in between triple
-backquotes):
+Check if there are elements in the provided text (in between triple $) other than answers to the
+following question (in between triple €):
-```{answer}```
+€€€
+{question}
+€€€
-in addition to answers to the following question (in between triple backquotes):
-```{question}```
+Provided text:
+$$$
+{answer}
+$$$
 Answer with True or False, without additional information.
 llm_model: "mistral.mistral-medium-latest"
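
The rewritten check prompt swaps the backquote fences for distinct € and $ fences, presumably so an answer that itself contains backticks cannot break the prompt. A minimal sketch of how this template might be rendered and its True/False verdict parsed, assuming a generic call_llm callable rather than the project's actual Mistral client:

```python
# Hypothetical sketch of rendering the reworked check prompt and parsing its
# reply. The delimiters mirror the diff; the prompt builder, the call_llm()
# stand-in and the strict True/False parsing are assumptions.
CHECK_TEMPLATE = (
    "Check if there are elements in the provided text (in between triple $) "
    "other than answers to the following question (in between triple €):\n"
    "€€€\n{question}\n€€€\n"
    "Provided text:\n$$$\n{answer}\n$$$\n"
    "Answer with True or False, without additional information."
)

def has_extra_elements(question: str, answer: str, call_llm) -> bool:
    """call_llm is any callable str -> str; it stands in for the
    'mistral.mistral-medium-latest' call configured in the file above."""
    prompt = CHECK_TEMPLATE.format(question=question, answer=answer)
    reply = call_llm(prompt).strip().lower()
    # The prompt demands a bare True/False; be strict and fail loudly otherwise.
    if reply not in ("true", "false"):
        raise ValueError(f"Unexpected verdict: {reply!r}")
    return reply == "true"

if __name__ == "__main__":
    fake_llm = lambda prompt: "False"   # stand-in for the real model call
    print(has_extra_elements("What are the opening hours?", "We open at 9.", fake_llm))
```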

View File

@@ -4,7 +4,7 @@ content: |
 question is understandable without that history. The conversation is a sequence of questions and context provided
 by the HUMAN, and the AI (you) answering back, in chronological order. The most recent (i.e. last) elements are the
 most important when detailing the question.
-You answer by stating the detailed question in {language}.
+You return only the detailed question in {language}, without any additional information.
 History:
 ```{history}```
 Question to be detailed:
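
The changed line tightens the instruction so the model returns only the detailed question. A small illustrative sketch with an abridged template, an assumed (speaker, text) history layout, an assumed trailing question fence, and a defensive cleanup step that is not part of the commit:

```python
# Hypothetical sketch of rendering the question-detailing prompt and enforcing
# the "return only the detailed question" rule on the reply. The template is
# abridged from the file above; history formatting and cleanup are assumptions.
DETAIL_TEMPLATE = (
    "You return only the detailed question in {language}, without any additional information.\n"
    "History:\n```{history}```\n"
    "Question to be detailed:\n```{question}```"
)

def detail_question(question: str, history: list[tuple[str, str]],
                    language: str, call_llm) -> str:
    # Assumed history layout: (speaker, text) pairs, oldest first.
    history_text = "\n".join(f"{who}: {text}" for who, text in history)
    prompt = DETAIL_TEMPLATE.format(language=language, history=history_text,
                                    question=question)
    reply = call_llm(prompt).strip()
    # Defensive cleanup in case the model still adds commentary:
    # keep only the first non-empty line.
    return next(line for line in reply.splitlines() if line.strip())

if __name__ == "__main__":
    fake_llm = lambda p: "What are the opening hours of the Ghent office?"
    hist = [("HUMAN", "Tell me about the Ghent office."), ("AI", "It is our main office.")]
    print(detail_question("And the opening hours?", hist, "English", fake_llm))
```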

View File

@@ -93,7 +93,7 @@ arguments:
 name: "Interaction Mode"
 type: "enum"
 description: "The interaction mode the specialist will start working in."
-allowed_values: ["orientation", "seduction"]
+allowed_values: ["orientation", "selection"]
 default: "orientation"
 required: true
 results:
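
The only change here renames the allowed value "seduction" to "selection", in line with the new Selection Specialist. For illustration, a hypothetical validator (not the project's argument handling) showing how such an enum declaration with allowed_values and a default could be enforced:

```python
# Hypothetical sketch of validating an enum argument like "Interaction Mode"
# against its allowed_values. The field values mirror the diff; the validator
# itself is an assumption.
from dataclasses import dataclass

@dataclass
class EnumArgument:
    name: str
    allowed_values: tuple[str, ...]
    default: str
    required: bool = True

    def resolve(self, value: str | None) -> str:
        if value is None:
            if self.required and self.default is None:
                raise ValueError(f"{self.name} is required")
            return self.default
        if value not in self.allowed_values:
            raise ValueError(
                f"{self.name} must be one of {self.allowed_values}, got {value!r}")
        return value

interaction_mode = EnumArgument(
    name="Interaction Mode",
    allowed_values=("orientation", "selection"),   # "seduction" was renamed in this commit
    default="orientation",
)

if __name__ == "__main__":
    print(interaction_mode.resolve(None))          # -> "orientation" (default)
    print(interaction_mode.resolve("selection"))   # -> "selection"
```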

View File

@@ -8,14 +8,14 @@ task_description: >
 Use the following {language} in your communication, and cite the sources used at the end of the full conversation.
 If the question cannot be answered using the given context, answer "I have insufficient information to answer this
 question."
-Context (in between triple backquotes):
-```{context}```
-History (in between triple backquotes):
-```{history}```
-Question (in between triple backquotes):
-```{question}```
+Context (in between triple $):
+$$${context}$$$
+History (in between triple €):
+€€€{history}€€€
+Question (in between triple £):
+£££{question}£££
 expected_output: >
 Your answer.
 metadata:
 author: "Josako"
 date_added: "2025-01-08"
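
As in the check prompt above, the RAG task now wraps context, history and question in three distinct fences ($, €, £) instead of backquotes. A minimal rendering sketch under that assumption; build_rag_prompt and the sample inputs are illustrative only, not the project's code:

```python
# Hypothetical sketch of rendering the RAG task description with the three new
# delimiter styles from the diff above.
RAG_TASK_TEMPLATE = (
    "Use the following {language} in your communication, and cite the sources used "
    "at the end of the full conversation.\n"
    'If the question cannot be answered using the given context, answer '
    '"I have insufficient information to answer this question."\n'
    "Context (in between triple $):\n$$${context}$$$\n"
    "History (in between triple €):\n€€€{history}€€€\n"
    "Question (in between triple £):\n£££{question}£££"
)

def build_rag_prompt(context: str, history: str, question: str, language: str = "English") -> str:
    # Each block gets its own fence ($, €, £) so the model can tell context,
    # history and question apart even when one of them contains backticks.
    return RAG_TASK_TEMPLATE.format(
        language=language, context=context, history=history, question=question)

if __name__ == "__main__":
    print(build_rag_prompt(
        context="Office hours: 9-17, Monday to Friday.",
        history="HUMAN: Where is the office?\nAI: In Ghent.",
        question="When is it open?",
    ))
```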