48 Commits

Author SHA1 Message Date
Josako
43ee9139d6 Changelog for version 2.3.3-alfa 2025-06-07 11:18:05 +02:00
Josako
8f45005713 - Bug fixes:
- Catalog Name Unique Constraint
  - Selection constraint to view processed document
  - remove tab from tenant overview
2025-06-07 11:14:23 +02:00
Josako
bc1626c4ff - Initialisation of the EveAI Chat Client.
- Introduction of Tenant Makes
2025-06-06 16:42:24 +02:00
Josako
57c0e7a1ba Update changelog 2025-06-04 13:35:27 +02:00
Josako
0d05499d2b - Add Specialist Magic Links
- correction of some bugs:
  - dynamic fields for adding documents / urls to dossier catalog
  - tabs in latest bootstrap version no longer functional
  - partner association of license tier not working when no partner selected
  - data-type dynamic field needs conversion to isoformat
  - Add public tables to env.py of tenant schema
2025-06-04 11:53:35 +02:00
Josako
b4e58659a8 - Allow and improve viewing of different content types. First type implemented: changelog 2025-06-03 09:48:50 +02:00
Josako
67078ce925 - Add view to show release notes
- Update release notes for 2.3.1-alfa
2025-06-01 22:03:32 +02:00
Josako
ebdb836448 - Add view to show release notes
- Update release notes for 2.3.1-alfa
2025-06-01 22:03:15 +02:00
Josako
81e754317a - smaller changes to eveai.css to ensure the background of selected buttons does not turn all white, and that the background of fields in editable cells does not turn white in a tabulator.
- The Role Definition Specialist now creates a new selection specialist upon completion
2025-06-01 10:09:34 +02:00
Josako
578981c745 Present the Specialist Editor in 1 screen 2025-05-30 12:51:34 +02:00
Josako
8fb2ad43c5 Moved styling elements in eveai_ordered_list_editor.html to eveai.css for consistency 2025-05-30 10:04:39 +02:00
Josako
49f9077a7b Improvement of the color scheme of the table editor. 2025-05-30 09:48:42 +02:00
Josako
d290b46a0c Improvement of the color scheme of the table editor. 2025-05-30 05:20:25 +02:00
Josako
73647e4795 We have a reasonable layout for the table-editor in the specialist. To be further refined. 2025-05-30 05:05:13 +02:00
Josako
25e169dbea - Replace old implementation of PROCESSOR_TYPES and CATALOG_TYPES with the new cached approach
- Add an ordered_list dynamic field type (to be refined)
- Add tabulator javascript library to project
2025-05-29 16:00:25 +02:00
Josako
8a29eb0d8f - Updated changelog for 2.3.0-alfa 2025-05-28 16:29:51 +02:00
Josako
0a5f0986e6 - correct bug when adding a document (due to newly introduced hard usage limits)
- correct comment bug in scripts.html
- start of selection_specialist
2025-05-28 15:44:53 +02:00
Josako
4d79c4fd5a Add developer documentation for javascript library management 2025-05-27 17:46:51 +02:00
Josako
5123de55cc - Change TRAICIE_VACANCY_DEFINTION_SPECIALIST to TRAICIE_ROLE_DEFINITION_SPECIALIST
- Introduce new vanilla-jsoneditor instead of the older jsoneditor (for viewing, among others, ChatSessions)
- Introduce use of npm to install required javascript libraries
- update Material-kit-pro
- Introduce new top bar to show session defaults, remove old navbar buttons
- Correct Task & Tools editor
2025-05-27 17:37:32 +02:00
Josako
1fdbd2ff45 - Move global config files to the globals folder instead of the global folder, as the name 'global' conflicts with the Python language
- Creation of Traicie Vacancy Definition specialist
- Allow non-interactive specialists to be invoked from within Evie's mgmt interface (eveai_app)
- Improvements to crewai specialized classes
- Introduction to json editor for showing specialists arguments and results in a better way
- Introduction of more complex pagination (adding extra arguments) by adding a global 'get_pagination_html'
- Allow follow-up of ChatSession / Specialist execution
- Improvement in logging of Specialists (but needs to be finished)
2025-05-26 11:26:03 +02:00
Josako
d789e431ca Remove ModelVariables (model_utils) from application & optimize Tenant 2025-05-20 10:17:08 +02:00
Josako
70de4c0328 Improve HTML Processing + Introduction of Processed File viewer 2025-05-19 17:18:16 +02:00
Josako
d2bb51a4a8 Extra commit for files in 'common'
- Add functionality to add a default dictionary for configuration fields
- Correct entitlement processing
- Remove get_template functionality from ModelVariables, define it directly with LLM model definition in configuration file.
2025-05-19 14:12:38 +02:00
Josako
28aea85b10 - Add functionality to add a default dictionary for configuration fields
- Correct entitlement processing
- Remove get_template functionality from ModelVariables, define it directly with LLM model definition in configuration file.
2025-05-19 14:10:09 +02:00
Josako
d2a9092f46 Cleanup .pyc and .DS_Store, add new modules, remove legacy services 2025-05-17 18:46:17 +02:00
Josako
5c982fcc2c - Added EveAI Client to project
- Improvements to EntitlementsDomain & Services
- Prechecks in Document domain
- Add audit information to LicenseUsage
2025-05-17 15:56:14 +02:00
Josako
b4f7b210e0 - Improvement of Entitlements Domain
- Introduction of LicensePeriod
  - Introduction of Payments
  - Introduction of Invoices
- Services definitions for Entitlements Domain
2025-05-16 09:06:13 +02:00
Josako
1b1eef0d2e - Improve Repopack 2025-05-16 09:04:22 +02:00
Josako
17d32cd039 - Improve layout of emails sent
- Enable Scaleway TEM for the test environment.
2025-05-12 06:57:32 +02:00
Josako
12a53ebc1c - Convert mail messaging from SMTP to Scaleway TEM mails 2025-05-10 10:49:15 +02:00
Josako
a421977918 - Common library was removed one way or another
- Processor Catalog now required
2025-05-08 15:47:39 +02:00
Josako
4c480c9baa - Corrections for setting up the test environment
- Correction of some bugs discovered
2025-05-08 14:15:06 +02:00
Josako
9ea04572c8 Configuration of a compose file for the test environment 2025-05-05 12:28:05 +02:00
Josako
6ef025363d - Partner model additions
- menu changes to allow for partners
- partner views and forms now in partner_forms.py and partner_views.py
- Introduction of services layer
- Allow all configuration handling to support partner configurations, and adapt caching accordingly
2025-05-02 13:10:59 +02:00
Josako
9652d0bff9 - RAG & SPIN Specialist improvements 2025-04-22 13:49:38 +02:00
Josako
4bf12db142 - Significantly changed the PDF Processor to use Mistral's OCR model
- ensure very long chunks get split into smaller chunks
- ensure TrackedMistralAIEmbedding is batched if needed to ensure correct execution
- upgraded some of the packages to a higher version
2025-04-16 15:39:16 +02:00
Josako
5f58417d24 - Add 'Partner Admin' role to actual functionality in eveai_app 2025-04-15 17:12:46 +02:00
Josako
3eed546879 - Added permissions to the partner service configuration
- Corrected a nasty bug where dynamic boolean fields were not returned correctly
2025-04-11 21:47:41 +02:00
Josako
35f0adef1b Merge remote-tracking branch 'origin/main' 2025-04-09 09:42:12 +02:00
Josako
f43e79376c - Introduction of Partner Admin role in combination with 'Management Partner' type. 2025-04-09 09:40:59 +02:00
jlaroy1
be76dd5240 Remove .idea folder 2025-04-04 13:04:38 +02:00
Josako
c2c3b01b28 - Tenant code visible on tenant overview window 2025-04-03 14:28:11 +02:00
Josako
8daa52d1e9 - Small corrections to the role definitions (Tenant Tester and Tenant Financial Roles are no longer required) 2025-04-03 14:23:22 +02:00
Josako
9ad7c1aee9 Introduction of Partner Model, adding code to Tenant model 2025-04-03 14:13:56 +02:00
Josako
1762b930bc - Correct asynchronous behavior in the EveAICrewAI classes. 2025-03-31 10:26:23 +02:00
Josako
d57bc5cf03 - Enable additional environments in Docker 2025-03-28 09:48:41 +01:00
Josako
6c8c33d296 - Added 'Register ...' functionality to overviews. ==> No more separate menu items... 2025-03-27 09:13:37 +01:00
Josako
4ea16521e2 - Prometheus metrics go via pushgateway, as different worker processes might have different registries that are not picked up by Prometheus 2025-03-25 15:48:00 +01:00
1555 changed files with 62288 additions and 289284 deletions

3
.gitignore vendored

@@ -14,7 +14,6 @@ __pycache__
**/__pycache__
/.idea
*.pyc
*.pyc
common/.DS_Store
common/__pycache__/__init__.cpython-312.pyc
common/__pycache__/extensions.cpython-312.pyc
@@ -52,3 +51,5 @@ scripts/__pycache__/run_eveai_app.cpython-312.pyc
/patched_packages/crewai/
/docker/prometheus/data/
/docker/grafana/data/
/temp_requirements/
/nginx/node_modules/

8
.idea/.gitignore generated vendored

@@ -1,8 +0,0 @@
# Default ignored files
/shelf/
/workspace.xml
# Editor-based HTTP Client requests
/httpRequests/
# Datasource local storage ignored files
/dataSources/
/dataSources.local.xml

22
.idea/eveAI.iml generated

@@ -1,22 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<module type="PYTHON_MODULE" version="4">
<component name="Flask">
<option name="enabled" value="true" />
</component>
<component name="NewModuleRootManager">
<content url="file://$MODULE_DIR$">
<excludeFolder url="file://$MODULE_DIR$/.venv" />
<excludeFolder url="file://$MODULE_DIR$/.venv2" />
</content>
<orderEntry type="jdk" jdkName="Python 3.12 (eveai_dev)" jdkType="Python SDK" />
<orderEntry type="sourceFolder" forTests="false" />
</component>
<component name="TemplatesService">
<option name="TEMPLATE_CONFIGURATION" value="Jinja2" />
<option name="TEMPLATE_FOLDERS">
<list>
<option value="$MODULE_DIR$/templates" />
</list>
</option>
</component>
</module>


@@ -1,6 +0,0 @@
<component name="InspectionProjectProfileManager">
<settings>
<option name="USE_PROJECT_PROFILE" value="false" />
<version value="1.0" />
</settings>
</component>

7
.idea/misc.xml generated

@@ -1,7 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="Black">
<option name="sdkName" value="Python 3.12 (eveai_tbd)" />
</component>
<component name="ProjectRootManager" version="2" project-jdk-name="Python 3.12 (TBD)" project-jdk-type="Python SDK" />
</project>

8
.idea/modules.xml generated

@@ -1,8 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="ProjectModuleManager">
<modules>
<module fileurl="file://$PROJECT_DIR$/.idea/TBD.iml" filepath="$PROJECT_DIR$/.idea/TBD.iml" />
</modules>
</component>
</project>

6
.idea/vcs.xml generated

@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="VcsDirectoryMappings">
<mapping directory="$PROJECT_DIR$" vcs="Git" />
</component>
</project>


@@ -5,19 +5,79 @@ All notable changes to EveAI will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [Unreleased]
## [2.3.0-alfa]
### Added
- For new features.
- Introduction of Push Gateway for Prometheus
- Introduction of Partner Models
- Introduction of Tenant and Partner codes for more security
- Introduction of 'Management Partner' type and additional 'Partner Admin'-role
- Introduction of a technical services layer
- Introduction of partner-specific configurations
- Introduction of additional test environment
- Introduction of strict no-overage usage
- Introduction of LicensePeriod, Payments & Invoices
- Introduction of Processed File Viewer
- Introduction of Traicie Role Definition Specialist
- Allow invocation of non-interactive specialists in administrative interface (eveai_app)
- Introduction of advanced JSON editor
- Introduction of ChatSession (Specialist Execution) follow-up in administrative interface
- Introduce npm for JavaScript library usage and optimisations
- Introduction of new top bar in administrative interface to show session defaults (removing old navbar buttons)
### Changed
- For changes in existing functionality.
- Add 'Register'-button to list views, replacing register menu-items
- Add additional environment capabilities in docker
- PDF Processor now uses Mistral OCR
- Allow additional chunking mechanisms for very long chunks (in case of very large documents)
- Batch TrackedMistralAIEmbedding requests to allow processing of long documents
- RAG & SPIN Specialist improvements
- Move mail messaging from standard SMTP to Scaleway TEM mails
- Improve mail layouts
- Add functionality to add a default dictionary for dynamic forms
- AI model choices defined by Ask Eve AI instead of the Tenant (replaces ModelVariables completely)
- Improve HTML Processing
- Pagination improvements
- Update Material Kit Pro to latest version
### Removed
- Repopack implementation ==> Using PyCharm's new AI capabilities instead
### Fixed
- Synchronous vs Asynchronous behaviour in crewAI type specialists
- Nasty dynamic boolean fields bug corrected
- Several smaller bugfixes
- Tasks & Tools editors finished
### Security
- In case of vulnerabilities.
## [2.2.0-alfa]
### Added
- Mistral AI as main provider for embeddings, chains and specialists
- Usage measuring for specialists
- RAG from chain to specialist technology
- Dossier catalog management possibilities added to eveai_app
- Asset definition (Paused - other priorities)
- Prometheus and Grafana
- Add prometheus monitoring to business events
- Asynchronous execution of specialists
### Changed
- Moved choice for AI providers / models to specialists and prompts
- Improve RAG to not repeat historic answers
- Fixed embedding model, no more choices allowed
- Clean URLs of tracking parameters before adding them to a catalog
### Deprecated
- For soon-to-be removed features.
### Removed
- For now removed features.
- Add Multiple URLs removed from menu
- Old Specialist items removed from interaction menu
### Fixed
- Set default language when registering Documents or URLs.

85
Evie Overview.md Normal file

@@ -0,0 +1,85 @@
# Evie Overview
Owner: pieter Laroy
# Introduction
The Evie project (developed by AskEveAI) is a SaaS product that enables SMEs to easily introduce AI optimisations for both internal and external use. There are two main concepts:
- Catalogs: these allow tenants to store information about their organisations or enterprises
- Specialists: these allow tenants to embed logic in their processes, communications, …
As such, we could say we have an advanced RAG system that tenants can use to optimise their operations.
## Multi-tenant
The application has a multi-tenant setup built in. This is reflected in:
- The Database:
- We have 1 public schema, in which general information is defined such as tenants, their users, domains, licenses, …
- We have a schema (named 1, 2, …) for each of the tenants defined in the system, containing all information on the tenant's catalogs & documents, specialists & interactions, …
- File Storage
- We use S3-compatible storage
- A bucket is defined for each tenant, storing their specific documents, assets, …
That way, general information required for the operation of Evie is stored in the public schema, and specific and potentially sensitive information is nicely stored behind a Chinese wall for each of the tenants.
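
To make the schema-per-tenant setup concrete, the sketch below shows one way a query could be routed to a tenant's schema with SQLAlchemy on PostgreSQL. This is only a minimal illustration under the naming convention described above; the actual `Database` helper used elsewhere in the code base may work differently, and the connection string and table name are illustrative.

```python
from sqlalchemy import create_engine, text

# Hypothetical connection; credentials are placeholders.
engine = create_engine("postgresql+psycopg2://eveai:secret@localhost/eveai")

def run_in_tenant_schema(tenant_id: int, sql: str):
    """Run a query with the tenant's schema first on the search_path."""
    with engine.connect() as conn:
        # Keep "public" on the path so shared tables stay reachable.
        conn.execute(text(f'SET search_path TO "{tenant_id}", public'))
        return conn.execute(text(sql)).fetchall()

# Example: count document versions for tenant 1
rows = run_in_tenant_schema(1, "SELECT COUNT(*) FROM document_version")
```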
## Partners
We started to define the concept of a partner. This allows us to have partners that introduce tenants to Evie, or offer them additional functionality (specialists) or knowledge (catalogs). This concept is in an early stage at this point.
## Domains
In order to ensure a structured approach, we have defined several domains in the project:
- **User**: the user domain is used to store all data on partners, tenants, and actual users.
- **Document**: the document domain is used to store all information on catalogs, documents, how to process documents, …
- **Interaction**: This domain allows us to define specialists, agents, … and to interact with the specialists and agents.
- **Entitlements**: This domain defines all license information, usage, …
# Project Structure
## Common
The common folder contains code that is used in different components of the system. It contains the following important pieces:
- **models**: in the models folder you can find the SQLAlchemy models used throughout the application. These models are organised in their relevant domains.
- **eveai_model**: some classes to handle usage, wrappers around standard LLM clients
- **langchain**: similar to eveai_model, but for the langchain library
- **services**: I started to define services for reusable functionality in the system. These are again organised in their respective domains
- **utils**: a whole bunch of utility classes. Some should get converted to service classes in the future
- **utils/cache**: contains code for caching different elements in the application
## config
The config folder contains a fair amount of configuration data (as the name suggests):
- **config.py**: general configuration
- **logging_config.py**: definition of logging files
- **model_config.py**: obsolete
- **type_defs**: contains the lists of definitions for several types used throughout the application. E.g. processor_types, specialist_types, …
- **All other folders**: detailed configuration of all the types defined in type_defs.
## docker
The docker folder contains the configuration and scripts used for configuring, building, and distributing containers, …
## eveai_… folders
These are different components (containerized) of our application:
- **eveai_api**: The API of our application.
- **eveai_app**: The administrative interface of our application.
- **eveai_beat**: a means to install batch processes for our application.
- **eveai_chat**: obsolete at this moment
- **eveai_chat_workers**: Celery-based invocation of our specialists
- **eveai_client**: newly added. A desktop client for invoking specialists.
- **eveai_entitlements**: Celery-based approach to handling business events, measuring and updating usage, …
- **eveai_workers**: Celery-based approach to filling catalogs with documents (embedding)
## Remaining folders
- **integrations**: integrations to e.g. Wordpress and Zapier.
- **migrations**: SQLAlchemy database migration files (for public and tenant schema)
- **nginx**: configuration and static files for nginx
- **scripts**: various scripts used to start up components, to perform database operations, …

BIN
common/.DS_Store vendored

Binary file not shown.


@@ -9,32 +9,133 @@ from mistralai import Mistral
class TrackedMistralAIEmbeddings(EveAIEmbeddings):
def __init__(self, model: str = "mistral_embed"):
def __init__(self, model: str = "mistral_embed", batch_size: int = 10):
"""
Initialize the TrackedMistralAIEmbeddings class.
Args:
model: The embedding model to use
batch_size: Maximum number of texts to send in a single API call
"""
api_key = current_app.config['MISTRAL_API_KEY']
self.client = Mistral(
api_key=api_key
)
self.model = model
self.batch_size = batch_size
super().__init__()
def embed_documents(self, texts: list[str]) -> list[list[float]]:
"""
Embed a list of texts, processing in batches to avoid API limitations.
Args:
texts: A list of texts to embed
Returns:
A list of embeddings, one for each input text
"""
if not texts:
return []
all_embeddings = []
# Process texts in batches
for i in range(0, len(texts), self.batch_size):
batch = texts[i:i + self.batch_size]
batch_num = i // self.batch_size + 1
current_app.logger.debug(f"Processing embedding batch {batch_num}, size: {len(batch)}")
start_time = time.time()
try:
result = self.client.embeddings.create(
model=self.model,
inputs=batch
)
end_time = time.time()
batch_time = end_time - start_time
batch_embeddings = [embedding.embedding for embedding in result.data]
all_embeddings.extend(batch_embeddings)
# Log metrics for this batch
metrics = {
'total_tokens': result.usage.total_tokens,
'prompt_tokens': result.usage.prompt_tokens,
'completion_tokens': result.usage.completion_tokens,
'time_elapsed': batch_time,
'interaction_type': 'Embedding',
'batch': batch_num,
'batch_size': len(batch)
}
current_event.log_llm_metrics(metrics)
current_app.logger.debug(f"Batch {batch_num} processed: {len(batch)} texts, "
f"{result.usage.total_tokens} tokens, {batch_time:.2f}s")
# If processing multiple batches, add a small delay to avoid rate limits
if len(texts) > self.batch_size and i + self.batch_size < len(texts):
time.sleep(0.25) # 250ms pause between batches
except Exception as e:
current_app.logger.error(f"Error in embedding batch {batch_num}: {str(e)}")
# If a batch fails, try to process each text individually
for j, text in enumerate(batch):
try:
current_app.logger.debug(f"Attempting individual embedding for item {i + j}")
single_start_time = time.time()
single_result = self.client.embeddings.create(
model=self.model,
inputs=[text]
)
single_end_time = time.time()
# Add the single embedding
single_embedding = single_result.data[0].embedding
all_embeddings.append(single_embedding)
# Log metrics for this individual embedding
single_metrics = {
'total_tokens': single_result.usage.total_tokens,
'prompt_tokens': single_result.usage.prompt_tokens,
'completion_tokens': single_result.usage.completion_tokens,
'time_elapsed': single_end_time - single_start_time,
'interaction_type': 'Embedding',
'batch': f"{batch_num}-recovery-{j}",
'batch_size': 1
}
current_event.log_llm_metrics(single_metrics)
except Exception as inner_e:
current_app.logger.error(f"Failed to embed individual text at index {i + j}: {str(inner_e)}")
# Add a zero vector as a placeholder for failed embeddings
# Use the correct dimensionality for the model (1024 for mistral_embed)
embedding_dim = 1024
all_embeddings.append([0.0] * embedding_dim)
total_batches = (len(texts) + self.batch_size - 1) // self.batch_size
current_app.logger.info(f"Embedded {len(texts)} texts in {total_batches} batches")
return all_embeddings
# def embed_documents(self, texts: list[str]) -> list[list[float]]:
# start_time = time.time()
# result = self.client.embeddings.create(
# model=self.model,
# inputs=texts
# )
# end_time = time.time()
#
# metrics = {
# 'total_tokens': result.usage.total_tokens,
# 'prompt_tokens': result.usage.prompt_tokens, # For embeddings, all tokens are prompt tokens
# 'completion_tokens': result.usage.completion_tokens,
# 'time_elapsed': end_time - start_time,
# 'interaction_type': 'Embedding',
# }
# current_event.log_llm_metrics(metrics)
#
# embeddings = [embedding.embedding for embedding in result.data]
#
# return embeddings
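# Illustrative usage sketch (not part of the diff above): calling the batched
# embeddings class inside a Flask application context; the chunk texts are hypothetical.
embedder = TrackedMistralAIEmbeddings(batch_size=10)
chunks = ["First chunk of a long document ...", "Second chunk ..."]
vectors = embedder.embed_documents(chunks)
assert len(vectors) == len(chunks)  # one vector per input; failed items fall back to zero vectors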


@@ -0,0 +1,53 @@
import re
import time
from flask import current_app
from mistralai import Mistral
from common.utils.business_event_context import current_event
class TrackedMistralOcrClient:
def __init__(self):
api_key = current_app.config['MISTRAL_API_KEY']
self.client = Mistral(
api_key=api_key,
)
self.model = "mistral-ocr-latest"
def _get_title(self, markdown):
# Look for the first level-1 heading
match = re.search(r'^# (.+)', markdown, re.MULTILINE)
return match.group(1).strip() if match else None
def process_pdf(self, file_name, file_content):
start_time = time.time()
uploaded_pdf = self.client.files.upload(
file={
"file_name": file_name,
"content": file_content
},
purpose="ocr"
)
signed_url = self.client.files.get_signed_url(file_id=uploaded_pdf.id)
ocr_response = self.client.ocr.process(
model=self.model,
document={
"type": "document_url",
"document_url": signed_url.url
},
include_image_base64=False
)
nr_of_pages = len(ocr_response.pages)
all_markdown = " ".join(page.markdown for page in ocr_response.pages)
title = self._get_title(all_markdown)
end_time = time.time()
metrics = {
'nr_of_pages': nr_of_pages,
'time_elapsed': end_time - start_time,
'interaction_type': 'OCR',
}
current_event.log_llm_metrics(metrics)
return all_markdown, title
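# Illustrative usage sketch (not part of the diff above): running a PDF through the
# OCR client inside a Flask application context; the file path is hypothetical, and in
# the application the bytes would normally come from the tenant's S3-compatible bucket.
with open("example.pdf", "rb") as fh:
    ocr = TrackedMistralOcrClient()
    markdown, title = ocr.process_pdf("example.pdf", fh.read())
print(title, len(markdown))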


@@ -2,7 +2,6 @@ from flask_sqlalchemy import SQLAlchemy
from flask_migrate import Migrate
from flask_bootstrap import Bootstrap
from flask_security import Security
from flask_mailman import Mail
from flask_login import LoginManager
from flask_cors import CORS
from flask_jwt_extended import JWTManager
@@ -11,11 +10,10 @@ from flask_wtf import CSRFProtect
from flask_restx import Api
from prometheus_flask_exporter import PrometheusMetrics
from .langchain.templates.template_manager import TemplateManager
from .utils.cache.eveai_cache_manager import EveAICacheManager
from .utils.content_utils import ContentManager
from .utils.simple_encryption import SimpleEncryption
from .utils.minio_utils import MinioClient
from .utils.performance_monitoring import EveAIMetrics
# Create extensions
@@ -24,7 +22,6 @@ migrate = Migrate()
bootstrap = Bootstrap()
csrf = CSRFProtect()
security = Security()
mail = Mail()
login_manager = LoginManager()
cors = CORS()
jwt = JWTManager()
@@ -33,7 +30,6 @@ api_rest = Api()
simple_encryption = SimpleEncryption()
minio_client = MinioClient()
metrics = PrometheusMetrics.for_app_factory()
template_manager = TemplateManager()
cache_manager = EveAICacheManager()
eveai_metrics = EveAIMetrics()
content_manager = ContentManager()


@@ -1,153 +0,0 @@
import os
import yaml
from typing import Dict, Optional, Any
from packaging import version
from dataclasses import dataclass
from flask import current_app, Flask
from common.utils.os_utils import get_project_root
@dataclass
class PromptTemplate:
"""Represents a versioned prompt template"""
content: str
version: str
metadata: Dict[str, Any]
class TemplateManager:
"""Manages versioned prompt templates"""
def __init__(self):
self.templates_dir = None
self._templates = None
self.app = None
def init_app(self, app: Flask) -> None:
# Initialize template manager
base_dir = "/app"
self.templates_dir = os.path.join(base_dir, 'config', 'prompts')
self.app = app
self._templates = self._load_templates()
# Log available templates for each supported model
for llm in app.config['SUPPORTED_LLMS']:
try:
available_templates = self.list_templates(llm)
app.logger.info(f"Loaded templates for {llm}: {available_templates}")
except ValueError:
app.logger.warning(f"No templates found for {llm}")
def _load_templates(self) -> Dict[str, Dict[str, Dict[str, PromptTemplate]]]:
"""
Load all template versions from the templates directory.
Structure: {provider.model -> {template_name -> {version -> template}}}
Directory structure:
prompts/
├── provider/
│ └── model/
│ └── template_name/
│ └── version.yaml
"""
templates = {}
# Iterate through providers (anthropic, openai)
for provider in os.listdir(self.templates_dir):
provider_path = os.path.join(self.templates_dir, provider)
if not os.path.isdir(provider_path):
continue
# Iterate through models (claude-3, gpt-4o)
for model in os.listdir(provider_path):
model_path = os.path.join(provider_path, model)
if not os.path.isdir(model_path):
continue
provider_model = f"{provider}.{model}"
templates[provider_model] = {}
# Iterate through template types (rag, summary, etc.)
for template_name in os.listdir(model_path):
template_path = os.path.join(model_path, template_name)
if not os.path.isdir(template_path):
continue
template_versions = {}
# Load all version files for this template
for version_file in os.listdir(template_path):
if not version_file.endswith('.yaml'):
continue
version_str = version_file[:-5] # Remove .yaml
if not self._is_valid_version(version_str):
current_app.logger.warning(
f"Invalid version format for {template_name}: {version_str}")
continue
try:
with open(os.path.join(template_path, version_file)) as f:
template_data = yaml.safe_load(f)
# Verify required fields
if not template_data.get('content'):
raise ValueError("Template content is required")
template_versions[version_str] = PromptTemplate(
content=template_data['content'],
version=version_str,
metadata=template_data.get('metadata', {})
)
except Exception as e:
current_app.logger.error(
f"Error loading template {template_name} version {version_str}: {e}")
continue
if template_versions:
templates[provider_model][template_name] = template_versions
return templates
def _is_valid_version(self, version_str: str) -> bool:
"""Validate semantic versioning string"""
try:
version.parse(version_str)
return True
except version.InvalidVersion:
return False
def get_template(self,
provider_model: str,
template_name: str,
template_version: Optional[str] = None) -> PromptTemplate:
"""
Get a specific template version. If version not specified,
returns the latest version.
"""
if provider_model not in self._templates:
raise ValueError(f"Unknown provider.model: {provider_model}")
if template_name not in self._templates[provider_model]:
raise ValueError(f"Unknown template: {template_name}")
versions = self._templates[provider_model][template_name]
if template_version:
if template_version not in versions:
raise ValueError(f"Template version {template_version} not found")
return versions[template_version]
# Return latest version
latest = max(versions.keys(), key=version.parse)
return versions[latest]
def list_templates(self, provider_model: str) -> Dict[str, list]:
"""
List all available templates and their versions for a provider.model
Returns: {template_name: [version1, version2, ...]}
"""
if provider_model not in self._templates:
raise ValueError(f"Unknown provider.model: {provider_model}")
return {
template_name: sorted(versions.keys(), key=version.parse)
for template_name, versions in self._templates[provider_model].items()
}

Binary file not shown.


@@ -8,7 +8,7 @@ import sqlalchemy as sa
class Catalog(db.Model):
id = db.Column(db.Integer, primary_key=True)
name = db.Column(db.String(50), nullable=False)
name = db.Column(db.String(50), nullable=False, unique=True)
description = db.Column(db.Text, nullable=True)
type = db.Column(db.String(50), nullable=False, default="STANDARD_CATALOG")
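# Illustrative sketch (not part of the diff above): with the unique constraint on
# Catalog.name, inserting a duplicate name now fails at the database; the exact
# handling shown here is an assumption.
from sqlalchemy.exc import IntegrityError
try:
    db.session.add(Catalog(name="Handbook"))
    db.session.commit()
except IntegrityError:
    db.session.rollback()  # duplicate catalog name rejected by the unique constraint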


@@ -1,4 +1,14 @@
from sqlalchemy.sql.expression import text
from common.extensions import db
from datetime import datetime as dt, timezone as tz
from enum import Enum
from sqlalchemy import event
from sqlalchemy.dialects.postgresql import JSONB
from sqlalchemy.ext.hybrid import hybrid_property
from dateutil.relativedelta import relativedelta
from common.utils.database import Database
class BusinessEventLog(db.Model):
@@ -25,6 +35,7 @@ class BusinessEventLog(db.Model):
llm_metrics_prompt_tokens = db.Column(db.Integer)
llm_metrics_completion_tokens = db.Column(db.Integer)
llm_metrics_total_time = db.Column(db.Float)
llm_metrics_nr_of_pages = db.Column(db.Integer)
llm_metrics_call_count = db.Column(db.Integer)
llm_interaction_type = db.Column(db.String(20))
message = db.Column(db.Text)
@@ -41,6 +52,7 @@ class License(db.Model):
tier_id = db.Column(db.Integer, db.ForeignKey('public.license_tier.id'),nullable=False) # 'small', 'medium', 'custom'
start_date = db.Column(db.Date, nullable=False)
end_date = db.Column(db.Date, nullable=True)
nr_of_periods = db.Column(db.Integer, nullable=False)
currency = db.Column(db.String(20), nullable=False)
yearly_payment = db.Column(db.Boolean, nullable=False, default=False)
basic_fee = db.Column(db.Float, nullable=False)
@@ -55,10 +67,41 @@ class License(db.Model):
additional_interaction_bucket = db.Column(db.Integer, nullable=False)
overage_embedding = db.Column(db.Float, nullable=False, default=0)
overage_interaction = db.Column(db.Float, nullable=False, default=0)
additional_storage_allowed = db.Column(db.Boolean, nullable=False, default=False)
additional_embedding_allowed = db.Column(db.Boolean, nullable=False, default=False)
additional_interaction_allowed = db.Column(db.Boolean, nullable=False, default=False)
# Versioning Information
created_at = db.Column(db.DateTime, nullable=True, server_default=db.func.now())
created_by = db.Column(db.Integer, db.ForeignKey('public.user.id'), nullable=True)
updated_at = db.Column(db.DateTime, nullable=True, server_default=db.func.now(), onupdate=db.func.now())
updated_by = db.Column(db.Integer, db.ForeignKey('public.user.id'), nullable=True)
tenant = db.relationship('Tenant', back_populates='licenses')
license_tier = db.relationship('LicenseTier', back_populates='licenses')
usages = db.relationship('LicenseUsage', order_by='LicenseUsage.period_start_date', back_populates='license')
periods = db.relationship('LicensePeriod', back_populates='license',
order_by='LicensePeriod.period_number',
cascade='all, delete-orphan')
def calculate_end_date(start_date, nr_of_periods):
"""Utility functie om einddatum te berekenen"""
if start_date and nr_of_periods:
return start_date + relativedelta(months=nr_of_periods) - relativedelta(days=1)
return None
# Luister naar start_date wijzigingen
@event.listens_for(License.start_date, 'set')
def set_start_date(target, value, oldvalue, initiator):
"""Bijwerken van end_date wanneer start_date wordt aangepast"""
if value and target.nr_of_periods:
target.end_date = calculate_end_date(value, target.nr_of_periods)
# Luister naar nr_of_periods wijzigingen
@event.listens_for(License.nr_of_periods, 'set')
def set_nr_of_periods(target, value, oldvalue, initiator):
"""Bijwerken van end_date wanneer nr_of_periods wordt aangepast"""
if value and target.start_date:
target.end_date = calculate_end_date(target.start_date, value)
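# Worked example of the end-date arithmetic above (periods are months):
# a license starting 2025-01-01 with nr_of_periods=12 runs until 2025-12-31,
# i.e. start_date + 12 months - 1 day.
from datetime import date
assert calculate_end_date(date(2025, 1, 1), 12) == date(2025, 12, 31)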
class LicenseTier(db.Model):
@@ -87,7 +130,219 @@ class LicenseTier(db.Model):
standard_overage_embedding = db.Column(db.Float, nullable=False, default=0)
standard_overage_interaction = db.Column(db.Float, nullable=False, default=0)
# Versioning Information
created_at = db.Column(db.DateTime, nullable=True, server_default=db.func.now())
created_by = db.Column(db.Integer, db.ForeignKey('public.user.id'), nullable=True)
updated_at = db.Column(db.DateTime, nullable=True, server_default=db.func.now(), onupdate=db.func.now())
updated_by = db.Column(db.Integer, db.ForeignKey('public.user.id'), nullable=True)
licenses = db.relationship('License', back_populates='license_tier')
partner_services = db.relationship('PartnerServiceLicenseTier', back_populates='license_tier')
class PartnerServiceLicenseTier(db.Model):
__bind_key__ = 'public'
__table_args__ = {'schema': 'public'}
partner_service_id = db.Column(db.Integer, db.ForeignKey('public.partner_service.id'), primary_key=True,
nullable=False)
license_tier_id = db.Column(db.Integer, db.ForeignKey('public.license_tier.id'), primary_key=True,
nullable=False)
# Versioning Information
created_at = db.Column(db.DateTime, nullable=True, server_default=db.func.now())
created_by = db.Column(db.Integer, db.ForeignKey('public.user.id'), nullable=True)
updated_at = db.Column(db.DateTime, nullable=True, server_default=db.func.now(), onupdate=db.func.now())
updated_by = db.Column(db.Integer, db.ForeignKey('public.user.id'), nullable=True)
license_tier = db.relationship('LicenseTier', back_populates='partner_services')
partner_service = db.relationship('PartnerService', back_populates='license_tiers')
class PeriodStatus(Enum):
UPCOMING = "UPCOMING" # The period is still in the future
PENDING = "PENDING" # The period is active, but prepaid is not yet received
ACTIVE = "ACTIVE" # The period is active and prepaid has been received
COMPLETED = "COMPLETED" # The period has been completed, but not yet invoiced
INVOICED = "INVOICED" # The period has been completed and invoiced, but overage payment still pending
CLOSED = "CLOSED" # The period has been closed, invoiced and fully paid
class LicensePeriod(db.Model):
__bind_key__ = 'public'
__table_args__ = {'schema': 'public'}
id = db.Column(db.Integer, primary_key=True)
license_id = db.Column(db.Integer, db.ForeignKey('public.license.id'), nullable=False)
tenant_id = db.Column(db.Integer, db.ForeignKey('public.tenant.id'), nullable=False)
# Period identification
period_number = db.Column(db.Integer, nullable=False)
period_start = db.Column(db.Date, nullable=False)
period_end = db.Column(db.Date, nullable=False)
# License configuration snapshot - copied from license when period is created
currency = db.Column(db.String(20), nullable=True)
basic_fee = db.Column(db.Float, nullable=True)
max_storage_mb = db.Column(db.Integer, nullable=True)
additional_storage_price = db.Column(db.Float, nullable=True)
additional_storage_bucket = db.Column(db.Integer, nullable=True)
included_embedding_mb = db.Column(db.Integer, nullable=True)
additional_embedding_price = db.Column(db.Numeric(10, 4), nullable=True)
additional_embedding_bucket = db.Column(db.Integer, nullable=True)
included_interaction_tokens = db.Column(db.Integer, nullable=True)
additional_interaction_token_price = db.Column(db.Numeric(10, 4), nullable=True)
additional_interaction_bucket = db.Column(db.Integer, nullable=True)
# Allowance flags - can be changed from False to True within a period
additional_storage_allowed = db.Column(db.Boolean, nullable=True, default=False)
additional_embedding_allowed = db.Column(db.Boolean, nullable=True, default=False)
additional_interaction_allowed = db.Column(db.Boolean, nullable=True, default=False)
# Status tracking
status = db.Column(db.Enum(PeriodStatus), nullable=False, default=PeriodStatus.UPCOMING)
# State transition timestamps
upcoming_at = db.Column(db.DateTime, nullable=True)
pending_at = db.Column(db.DateTime, nullable=True)
active_at = db.Column(db.DateTime, nullable=True)
completed_at = db.Column(db.DateTime, nullable=True)
invoiced_at = db.Column(db.DateTime, nullable=True)
closed_at = db.Column(db.DateTime, nullable=True)
# Standard audit fields
created_at = db.Column(db.DateTime, server_default=db.func.now())
updated_at = db.Column(db.DateTime, server_default=db.func.now(), onupdate=db.func.now())
created_by = db.Column(db.Integer, db.ForeignKey('public.user.id'))
updated_by = db.Column(db.Integer, db.ForeignKey('public.user.id'))
# Relationships
license = db.relationship('License', back_populates='periods')
license_usage = db.relationship('LicenseUsage',
uselist=False, # This makes it one-to-one
back_populates='license_period',
cascade='all, delete-orphan')
payments = db.relationship('Payment', back_populates='license_period')
invoices = db.relationship('Invoice', back_populates='license_period',
cascade='all, delete-orphan')
def update_allowance(self, allowance_type, allow_value, user_id=None):
"""
Update an allowance flag within a period
Only allows transitioning from False to True
Args:
allowance_type: One of 'storage', 'embedding', or 'interaction'
allow_value: The new value (must be True)
user_id: User ID performing the update
Raises:
ValueError: If trying to change from True to False, or invalid allowance type
"""
field_name = f"additional_{allowance_type}_allowed"
# Verify valid field
if not hasattr(self, field_name):
raise ValueError(f"Invalid allowance type: {allowance_type}")
# Get current value
current_value = getattr(self, field_name)
# Only allow False -> True transition
if current_value is True and allow_value is True:
# Already True, no change needed
return
elif allow_value is False:
raise ValueError(f"Cannot change {field_name} from {current_value} to False")
# Update the field
setattr(self, field_name, True)
self.updated_at = dt.now(tz.utc)
if user_id:
self.updated_by = user_id
@property
def prepaid_invoice(self):
"""Get the prepaid invoice for this period"""
return Invoice.query.filter_by(
license_period_id=self.id,
invoice_type=PaymentType.PREPAID
).first()
@property
def overage_invoice(self):
"""Get the overage invoice for this period"""
return Invoice.query.filter_by(
license_period_id=self.id,
invoice_type=PaymentType.POSTPAID
).first()
@property
def prepaid_payment(self):
"""Get the prepaid payment for this period"""
return Payment.query.filter_by(
license_period_id=self.id,
payment_type=PaymentType.PREPAID
).first()
@property
def overage_payment(self):
"""Get the overage payment for this period"""
return Payment.query.filter_by(
license_period_id=self.id,
payment_type=PaymentType.POSTPAID
).first()
@property
def all_invoices(self):
"""Get all invoices for this period"""
return self.invoices
@property
def all_payments(self):
"""Get all payments for this period"""
return self.payments
def transition_status(self, new_status: PeriodStatus, user_id: int = None):
"""Transition to a new status with proper validation and logging"""
if not self.can_transition_to(new_status):
raise ValueError(f"Invalid status transition from {self.status} to {new_status}")
self.status = new_status
self.updated_at = dt.now(tz.utc)
if user_id:
self.updated_by = user_id
# Set appropriate timestamps
if new_status == PeriodStatus.ACTIVE and not self.active_at:
self.active_at = dt.now(tz.utc)
elif new_status == PeriodStatus.COMPLETED:
self.completed_at = dt.now(tz.utc)
elif new_status == PeriodStatus.INVOICED:
self.invoiced_at = dt.now(tz.utc)
elif new_status == PeriodStatus.CLOSED:
self.closed_at = dt.now(tz.utc)
@property
def is_overdue(self):
"""Check if a prepaid payment is overdue"""
return (self.status == PeriodStatus.PENDING and
self.period_start <= dt.now(tz.utc).date())
def can_transition_to(self, new_status: PeriodStatus) -> bool:
"""Check if a status transition is valid"""
valid_transitions = {
PeriodStatus.UPCOMING: [PeriodStatus.ACTIVE, PeriodStatus.PENDING],
PeriodStatus.PENDING: [PeriodStatus.ACTIVE],
PeriodStatus.ACTIVE: [PeriodStatus.COMPLETED],
PeriodStatus.COMPLETED: [PeriodStatus.INVOICED, PeriodStatus.CLOSED],
PeriodStatus.INVOICED: [PeriodStatus.CLOSED],
PeriodStatus.CLOSED: []
}
return new_status in valid_transitions.get(self.status, [])
def __repr__(self):
return f'<LicensePeriod {self.id}: License {self.license_id}, Period {self.period_number}>'
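# Illustrative sketch (not part of the diff above) of the period lifecycle enforced
# by can_transition_to / transition_status; the period id and user id are hypothetical.
period = LicensePeriod.query.get(42)
if period.can_transition_to(PeriodStatus.ACTIVE):
    period.transition_status(PeriodStatus.ACTIVE, user_id=1)  # e.g. PENDING -> ACTIVE once prepaid is received
period.update_allowance("embedding", True, user_id=1)  # allowance flags only move from False to True
db.session.commit()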
class LicenseUsage(db.Model):
@@ -95,7 +350,6 @@ class LicenseUsage(db.Model):
__table_args__ = {'schema': 'public'}
id = db.Column(db.Integer, primary_key=True)
license_id = db.Column(db.Integer, db.ForeignKey('public.license.id'), nullable=False)
tenant_id = db.Column(db.Integer, db.ForeignKey('public.tenant.id'), nullable=False)
storage_mb_used = db.Column(db.Float, default=0)
embedding_mb_used = db.Column(db.Float, default=0)
@@ -105,9 +359,170 @@ class LicenseUsage(db.Model):
interaction_prompt_tokens_used = db.Column(db.Integer, default=0)
interaction_completion_tokens_used = db.Column(db.Integer, default=0)
interaction_total_tokens_used = db.Column(db.Integer, default=0)
period_start_date = db.Column(db.Date, nullable=False)
period_end_date = db.Column(db.Date, nullable=False)
license_period_id = db.Column(db.Integer, db.ForeignKey('public.license_period.id'), nullable=False)
# Standard audit fields
created_at = db.Column(db.DateTime, server_default=db.func.now())
updated_at = db.Column(db.DateTime, server_default=db.func.now(), onupdate=db.func.now())
created_by = db.Column(db.Integer, db.ForeignKey('public.user.id'))
updated_by = db.Column(db.Integer, db.ForeignKey('public.user.id'))
license_period = db.relationship('LicensePeriod', back_populates='license_usage')
def recalculate_storage(self):
Database(self.tenant_id).switch_schema()
# Perform a SUM operation to get the total file size from document_versions
total_storage = db.session.execute(text(f"""
SELECT SUM(file_size)
FROM document_version
""")).scalar()
self.storage_mb_used = total_storage
class PaymentType(Enum):
PREPAID = "PREPAID"
POSTPAID = "POSTPAID"
class PaymentStatus(Enum):
PENDING = "PENDING"
PAID = "PAID"
FAILED = "FAILED"
CANCELLED = "CANCELLED"
class Payment(db.Model):
__bind_key__ = 'public'
__table_args__ = {'schema': 'public'}
id = db.Column(db.Integer, primary_key=True)
license_period_id = db.Column(db.Integer, db.ForeignKey('public.license_period.id'), nullable=True)
tenant_id = db.Column(db.Integer, db.ForeignKey('public.tenant.id'), nullable=False)
# Payment details
payment_type = db.Column(db.Enum(PaymentType), nullable=False)
amount = db.Column(db.Numeric(10, 2), nullable=False)
currency = db.Column(db.String(3), nullable=False)
description = db.Column(db.Text, nullable=True)
# Status tracking
status = db.Column(db.Enum(PaymentStatus), nullable=False, default=PaymentStatus.PENDING)
# External provider information
external_payment_id = db.Column(db.String(255), nullable=True)
payment_method = db.Column(db.String(50), nullable=True) # credit_card, bank_transfer, etc.
provider_data = db.Column(JSONB, nullable=True) # Provider-specific data
# Payment information
paid_at = db.Column(db.DateTime, nullable=True)
# Standard audit fields
created_at = db.Column(db.DateTime, server_default=db.func.now())
created_by = db.Column(db.Integer, db.ForeignKey('public.user.id'))
updated_at = db.Column(db.DateTime, server_default=db.func.now(), onupdate=db.func.now())
updated_by = db.Column(db.Integer, db.ForeignKey('public.user.id'))
# Relationships
license_period = db.relationship('LicensePeriod', back_populates='payments')
invoice = db.relationship('Invoice', back_populates='payment', uselist=False)
@property
def is_overdue(self):
"""Check if payment is overdue"""
if self.status != PaymentStatus.PENDING:
return False
# For prepaid payments, check if period start has passed
if (self.payment_type == PaymentType.PREPAID and
self.license_period_id):
return self.license_period.period_start <= dt.now(tz.utc).date()
# For postpaid, check against due date (would be on invoice)
return False
def __repr__(self):
return f'<Payment {self.id}: {self.payment_type} {self.amount} {self.currency}>'
class InvoiceStatus(Enum):
DRAFT = "DRAFT"
SENT = "SENT"
PAID = "PAID"
OVERDUE = "OVERDUE"
CANCELLED = "CANCELLED"
class Invoice(db.Model):
__bind_key__ = 'public'
__table_args__ = {'schema': 'public'}
id = db.Column(db.Integer, primary_key=True)
license_period_id = db.Column(db.Integer, db.ForeignKey('public.license_period.id'), nullable=False)
payment_id = db.Column(db.Integer, db.ForeignKey('public.payment.id'), nullable=True)
tenant_id = db.Column(db.Integer, db.ForeignKey('public.tenant.id'), nullable=False)
# Invoice details
invoice_type = db.Column(db.Enum(PaymentType), nullable=False)
invoice_number = db.Column(db.String(50), unique=True, nullable=False)
invoice_date = db.Column(db.Date, nullable=False)
due_date = db.Column(db.Date, nullable=False)
# Financial details
amount = db.Column(db.Numeric(10, 2), nullable=False)
currency = db.Column(db.String(3), nullable=False)
tax_amount = db.Column(db.Numeric(10, 2), default=0)
# Descriptive fields
description = db.Column(db.Text, nullable=True)
status = db.Column(db.Enum(InvoiceStatus), nullable=False, default=InvoiceStatus.DRAFT)
# Timestamps
sent_at = db.Column(db.DateTime, nullable=True)
paid_at = db.Column(db.DateTime, nullable=True)
# Standard audit fields
created_at = db.Column(db.DateTime, server_default=db.func.now())
created_by = db.Column(db.Integer, db.ForeignKey('public.user.id'))
updated_at = db.Column(db.DateTime, server_default=db.func.now(), onupdate=db.func.now())
updated_by = db.Column(db.Integer, db.ForeignKey('public.user.id'))
# Relationships
license_period = db.relationship('LicensePeriod', back_populates='invoices')
payment = db.relationship('Payment', back_populates='invoice')
def __repr__(self):
return f'<Invoice {self.invoice_number}: {self.amount} {self.currency}>'
class LicenseChangeLog(db.Model):
"""
Log of changes to license configurations
Used for auditing and tracking when/why license details changed
"""
__bind_key__ = 'public'
__table_args__ = {'schema': 'public'}
id = db.Column(db.Integer, primary_key=True)
license_id = db.Column(db.Integer, db.ForeignKey('public.license.id'), nullable=False)
changed_at = db.Column(db.DateTime, nullable=False, default=lambda: dt.now(tz.utc))
# What changed
field_name = db.Column(db.String(100), nullable=False)
old_value = db.Column(db.String(255), nullable=True)
new_value = db.Column(db.String(255), nullable=False)
# Why it changed
reason = db.Column(db.Text, nullable=True)
# Standard audit fields
created_by = db.Column(db.Integer, db.ForeignKey('public.user.id'), nullable=True)
# Relationships
license = db.relationship('License', backref=db.backref('change_logs', order_by='LicenseChangeLog.changed_at'))
def __repr__(self):
return f'<LicenseChangeLog: {self.license_id} {self.field_name} {self.old_value} -> {self.new_value}>'
license = db.relationship('License', back_populates='usages')


@@ -8,7 +8,7 @@ from .document import Embedding, Retriever
class ChatSession(db.Model):
id = db.Column(db.Integer, primary_key=True)
user_id = db.Column(db.Integer, db.ForeignKey(User.id), nullable=True)
session_id = db.Column(db.String(36), nullable=True)
session_id = db.Column(db.String(49), nullable=True)
session_start = db.Column(db.DateTime, nullable=False)
session_end = db.Column(db.DateTime, nullable=True)
timezone = db.Column(db.String(30), nullable=True)
@@ -189,6 +189,7 @@ class Interaction(db.Model):
question_at = db.Column(db.DateTime, nullable=False)
detailed_question_at = db.Column(db.DateTime, nullable=True)
answer_at = db.Column(db.DateTime, nullable=True)
processing_error = db.Column(db.String(255), nullable=True)
# Relations
embeddings = db.relationship('InteractionEmbedding', backref='interaction', lazy=True)
@@ -214,3 +215,24 @@ class SpecialistDispatcher(db.Model):
dispatcher_id = db.Column(db.Integer, db.ForeignKey(Dispatcher.id, ondelete='CASCADE'), primary_key=True)
dispatcher = db.relationship("Dispatcher", backref="specialist_dispatchers")
class SpecialistMagicLink(db.Model):
id = db.Column(db.Integer, primary_key=True)
name = db.Column(db.String(50), nullable=False)
description = db.Column(db.Text, nullable=True)
specialist_id = db.Column(db.Integer, db.ForeignKey(Specialist.id, ondelete='CASCADE'), nullable=False)
magic_link_code = db.Column(db.String(55), nullable=False, unique=True)
valid_from = db.Column(db.DateTime, nullable=True)
valid_to = db.Column(db.DateTime, nullable=True)
specialist_args = db.Column(JSONB, nullable=True)
created_at = db.Column(db.DateTime, nullable=False, server_default=db.func.now())
created_by = db.Column(db.Integer, db.ForeignKey(User.id), nullable=True)
updated_at = db.Column(db.DateTime, nullable=False, server_default=db.func.now(), onupdate=db.func.now())
updated_by = db.Column(db.Integer, db.ForeignKey(User.id))
def __repr__(self):
return f"<SpecialistMagicLink {self.specialist_id} {self.magic_link_code}>"


@@ -2,7 +2,7 @@ from datetime import date
from common.extensions import db
from flask_security import UserMixin, RoleMixin
from sqlalchemy.dialects.postgresql import ARRAY
from sqlalchemy.dialects.postgresql import ARRAY, JSONB
import sqlalchemy as sa
from common.models.entitlements import License
@@ -20,20 +20,17 @@ class Tenant(db.Model):
# company Information
id = db.Column(db.Integer, primary_key=True)
code = db.Column(db.String(50), unique=True, nullable=True)
name = db.Column(db.String(80), unique=True, nullable=False)
website = db.Column(db.String(255), nullable=True)
timezone = db.Column(db.String(50), nullable=True, default='UTC')
rag_context = db.Column(db.Text, nullable=True)
type = db.Column(db.String(20), nullable=True, server_default='Active')
# language information
default_language = db.Column(db.String(2), nullable=True)
allowed_languages = db.Column(ARRAY(sa.String(2)), nullable=True)
# LLM specific choices
llm_model = db.Column(db.String(50), nullable=True)
# Entitlements
# Entitlements
currency = db.Column(db.String(20), nullable=True)
storage_dirty = db.Column(db.Boolean, nullable=True, default=False)
@@ -61,11 +58,9 @@ class Tenant(db.Model):
'name': self.name,
'website': self.website,
'timezone': self.timezone,
'rag_context': self.rag_context,
'type': self.type,
'default_language': self.default_language,
'allowed_languages': self.allowed_languages,
'llm_model': self.llm_model,
'currency': self.currency,
}
@@ -99,6 +94,7 @@ class User(db.Model, UserMixin):
# User Information
id = db.Column(db.Integer, primary_key=True)
tenant_id = db.Column(db.Integer, db.ForeignKey('public.tenant.id'), nullable=False)
user_name = db.Column(db.String(80), unique=True, nullable=False)
email = db.Column(db.String(255), unique=True, nullable=False)
password = db.Column(db.String(255), nullable=True)
@@ -120,7 +116,6 @@ class User(db.Model, UserMixin):
# Relations
roles = db.relationship('Role', secondary=RolesUsers.__table__, backref=db.backref('users', lazy='dynamic'))
tenant_id = db.Column(db.Integer, db.ForeignKey('public.tenant.id'), nullable=False)
def __repr__(self):
return '<User %r>' % self.user_name
@@ -143,9 +138,9 @@ class TenantDomain(db.Model):
# Versioning Information
created_at = db.Column(db.DateTime, nullable=False, server_default=db.func.now())
created_by = db.Column(db.Integer, db.ForeignKey(User.id), nullable=False)
created_by = db.Column(db.Integer, db.ForeignKey('public.user.id'), nullable=False)
updated_at = db.Column(db.DateTime, nullable=False, server_default=db.func.now(), onupdate=db.func.now())
updated_by = db.Column(db.Integer, db.ForeignKey(User.id))
updated_by = db.Column(db.Integer, db.ForeignKey('public.user.id'))
def __repr__(self):
return f"<TenantDomain {self.id}: {self.domain}>"
@@ -176,3 +171,133 @@ class TenantProject(db.Model):
def __repr__(self):
return f"<TenantProject {self.id}: {self.name}>"
class TenantMake(db.Model):
__bind_key__ = 'public'
__table_args__ = {'schema': 'public'}
id = db.Column(db.Integer, primary_key=True)
tenant_id = db.Column(db.Integer, db.ForeignKey('public.tenant.id'), nullable=False)
name = db.Column(db.String(50), nullable=False)
description = db.Column(db.Text, nullable=True)
active = db.Column(db.Boolean, nullable=False, default=True)
website = db.Column(db.String(255), nullable=True)
logo_url = db.Column(db.String(255), nullable=True)
# Chat customisation options
chat_customisation_options = db.Column(JSONB, nullable=True)
# Versioning Information
created_at = db.Column(db.DateTime, nullable=False, server_default=db.func.now())
created_by = db.Column(db.Integer, db.ForeignKey('public.user.id'), nullable=True)
updated_at = db.Column(db.DateTime, nullable=False, server_default=db.func.now(), onupdate=db.func.now())
updated_by = db.Column(db.Integer, db.ForeignKey('public.user.id'))
class Partner(db.Model):
__bind_key__ = 'public'
__table_args__ = {'schema': 'public'}
id = db.Column(db.Integer, primary_key=True)
tenant_id = db.Column(db.Integer, db.ForeignKey('public.tenant.id'), nullable=False, unique=True)
code = db.Column(db.String(50), unique=True, nullable=False)
# Basic information
logo_url = db.Column(db.String(255), nullable=True)
active = db.Column(db.Boolean, default=True)
# Versioning Information
created_at = db.Column(db.DateTime, nullable=False, server_default=db.func.now())
created_by = db.Column(db.Integer, db.ForeignKey('public.user.id'), nullable=True)
updated_at = db.Column(db.DateTime, nullable=False, server_default=db.func.now(), onupdate=db.func.now())
updated_by = db.Column(db.Integer, db.ForeignKey('public.user.id'), nullable=True)
# Relationships
services = db.relationship('PartnerService', back_populates='partner')
tenant = db.relationship('Tenant', backref=db.backref('partner', uselist=False))
def to_dict(self):
services_info = []
for service in self.services:
services_info.append({
'id': service.id,
'name': service.name,
'description': service.description,
'type': service.type,
'type_version': service.type_version,
'active': service.active,
'configuration': service.configuration,
'permissions': service.permissions,
})
return {
'id': self.id,
'tenant_id': self.tenant_id,
'code': self.code,
'logo_url': self.logo_url,
'active': self.active,
'name': self.tenant.name,
'services': services_info
}
class PartnerService(db.Model):
__bind_key__ = 'public'
__table_args__ = {'schema': 'public'}
id = db.Column(db.Integer, primary_key=True)
partner_id = db.Column(db.Integer, db.ForeignKey('public.partner.id'), nullable=False)
# Basic info
name = db.Column(db.String(50), nullable=False)
description = db.Column(db.Text, nullable=True)
# Service type with versioning (similar to your specialist/retriever pattern)
type = db.Column(db.String(50), nullable=False) # REFERRAL, KNOWLEDGE, SPECIALIST, IMPLEMENTATION, WHITE_LABEL
type_version = db.Column(db.String(20), nullable=False, default="1.0.0")
# Status
active = db.Column(db.Boolean, default=True)
# Dynamic configuration specific to this service - using JSONB like your other models
configuration = db.Column(db.JSON, nullable=True)
permissions = db.Column(db.JSON, nullable=True)
# For services that need to track shared resources
system_metadata = db.Column(db.JSON, nullable=True)
user_metadata = db.Column(db.JSON, nullable=True)
# Versioning Information
created_at = db.Column(db.DateTime, nullable=False, server_default=db.func.now())
created_by = db.Column(db.Integer, db.ForeignKey('public.user.id'), nullable=True)
updated_at = db.Column(db.DateTime, nullable=False, server_default=db.func.now(), onupdate=db.func.now())
updated_by = db.Column(db.Integer, db.ForeignKey('public.user.id'), nullable=True)
# Relationships
partner = db.relationship('Partner', back_populates='services')
license_tiers = db.relationship('PartnerServiceLicenseTier', back_populates='partner_service')
class PartnerTenant(db.Model):
__bind_key__ = 'public'
__table_args__ = {'schema': 'public'}
partner_service_id = db.Column(db.Integer, db.ForeignKey('public.partner_service.id'), primary_key=True)
tenant_id = db.Column(db.Integer, db.ForeignKey('public.tenant.id'), primary_key=True)
# JSON for flexible configuration specific to this relationship
configuration = db.Column(db.JSON, nullable=True)
# Tracking
created_at = db.Column(db.DateTime, nullable=False, server_default=db.func.now())
created_by = db.Column(db.Integer, db.ForeignKey('public.user.id'), nullable=True)
updated_at = db.Column(db.DateTime, nullable=False, server_default=db.func.now(), onupdate=db.func.now())
updated_by = db.Column(db.Integer, db.ForeignKey('public.user.id'), nullable=True)
class SpecialistMagicLinkTenant(db.Model):
__bind_key__ = 'public'
__table_args__ = {'schema': 'public'}
magic_link_code = db.Column(db.String(55), primary_key=True)
tenant_id = db.Column(db.Integer, db.ForeignKey('public.tenant.id'), nullable=False)
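The Partner, PartnerService and PartnerTenant models above link a partner to the tenants it manages through typed services. A minimal sketch of how they fit together; identifiers, the service type string and the configuration values are invented for illustration and are not part of this changeset:

from common.extensions import db
from common.models.user import Partner, PartnerService, PartnerTenant

# A partner is itself backed by a tenant (tenant_id) and identified by a unique code
partner = Partner(tenant_id=42, code="ACME")
service = PartnerService(
    name="Acme Management",
    type="MANAGEMENT_SERVICE",  # service type as used by the services further below
    type_version="1.0.0",
    permissions={"can_create_tenant": True, "can_assign_license": True},
)
service.partner = partner
db.session.add_all([partner, service])
db.session.commit()

# Link a managed tenant to the partner's management service
db.session.add(PartnerTenant(partner_service_id=service.id, tenant_id=7))
db.session.commit()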

View File

@@ -0,0 +1,9 @@
from common.services.entitlements.license_period_services import LicensePeriodServices
from common.services.entitlements.license_usage_services import LicenseUsageServices
from common.services.entitlements.license_tier_services import LicenseTierServices
__all__ = [
'LicensePeriodServices',
'LicenseUsageServices',
'LicenseTierServices'
]

View File

@@ -0,0 +1,247 @@
from dateutil.relativedelta import relativedelta
from datetime import datetime as dt, timezone as tz, timedelta
from flask import current_app
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy.sql.expression import and_
from common.extensions import db
from common.models.entitlements import LicensePeriod, License, PeriodStatus, LicenseUsage
from common.utils.eveai_exceptions import EveAILicensePeriodsExceeded, EveAIPendingLicensePeriod, EveAINoActiveLicense
from common.utils.model_logging_utils import set_logging_information, update_logging_information
class LicensePeriodServices:
@staticmethod
def find_current_license_period_for_usage(tenant_id: int) -> LicensePeriod:
"""
Find the current license period for a tenant. It makes sure the statuses of the relevant license periods
are updated when required, and that a LicenseUsage object is created if one does not exist yet.
Args:
tenant_id: The ID of the tenant to find the license period for
Returns:
LicensePeriod: The current, usable license period for the tenant
Raises:
EveAIException: and derived classes
"""
try:
current_app.logger.debug(f"Finding current license period for tenant {tenant_id}")
current_date = dt.now(tz.utc).date()
license_period = (db.session.query(LicensePeriod)
.filter_by(tenant_id=tenant_id)
.filter(and_(LicensePeriod.period_start <= current_date,
LicensePeriod.period_end >= current_date))
.first())
current_app.logger.debug(f"End searching for license period for tenant {tenant_id} ")
if not license_period:
current_app.logger.debug(f"No license period found for tenant {tenant_id} on date {current_date}")
license_period = LicensePeriodServices._create_next_license_period_for_usage(tenant_id)
current_app.logger.debug(f"Created license period {license_period.id} for tenant {tenant_id}")
if license_period:
current_app.logger.debug(f"Found license period {license_period.id} for tenant {tenant_id} "
f"with status {license_period.status}")
match license_period.status:
case PeriodStatus.UPCOMING:
current_app.logger.debug(f"In upcoming state")
LicensePeriodServices._complete_last_license_period(tenant_id=tenant_id)
current_app.logger.debug(f"Completed last license period for tenant {tenant_id}")
LicensePeriodServices._activate_license_period(license_period=license_period)
current_app.logger.debug(f"Activated license period {license_period.id} for tenant {tenant_id}")
if not license_period.license_usage:
new_license_usage = LicenseUsage(
tenant_id=tenant_id,
)
new_license_usage.license_period = license_period
try:
db.session.add(new_license_usage)
db.session.commit()
except SQLAlchemyError as e:
db.session.rollback()
current_app.logger.error(
f"Error creating new license usage for license period "
f"{license_period.id}: {str(e)}")
raise e
if license_period.status == PeriodStatus.ACTIVE:
return license_period
else:
# Status is PENDING, so no prepaid payment has been received and there is no license period we can use.
# We allow a grace period (ENTITLEMENTS_MAX_PENDING_DAYS, default 5 days) before raising an exception.
current_date = dt.now(tz.utc).date()
delta = abs(current_date - license_period.period_start)
if delta > timedelta(days=current_app.config.get('ENTITLEMENTS_MAX_PENDING_DAYS', 5)):
raise EveAIPendingLicensePeriod()
case PeriodStatus.ACTIVE:
return license_period
case PeriodStatus.PENDING:
return license_period
else:
raise EveAILicensePeriodsExceeded(license_id=None)
except SQLAlchemyError as e:
db.session.rollback()
current_app.logger.error(f"Error finding current license period for tenant {tenant_id}: {str(e)}")
raise e
except Exception as e:
raise e
@staticmethod
def _create_next_license_period_for_usage(tenant_id) -> LicensePeriod:
"""
Create a new period for this license using the current license configuration
Args:
tenant_id: The ID of the tenant to create the period for
Returns:
LicensePeriod: The newly created license period
"""
current_date = dt.now(tz.utc).date()
# Look up the active license for this tenant on the current date
the_license = (db.session.query(License)
.filter_by(tenant_id=tenant_id)
.filter(License.start_date <= current_date)
.filter(License.end_date >= current_date)
.first())
if not the_license:
current_app.logger.error(f"No active license found for tenant {tenant_id} on date {current_date}")
raise EveAINoActiveLicense(tenant_id=tenant_id)
else:
current_app.logger.debug(f"Found active license {the_license.id} for tenant {tenant_id} "
f"on date {current_date}")
next_period_number = 1
if the_license.periods:
# If there are existing periods, get the next sequential number
next_period_number = max(p.period_number for p in the_license.periods) + 1
current_app.logger.debug(f"Next period number for tenant {tenant_id} is {next_period_number}")
if next_period_number > the_license.nr_of_periods:
raise EveAILicensePeriodsExceeded(license_id=the_license.id)
new_license_period = LicensePeriod(
license_id=the_license.id,
tenant_id=tenant_id,
period_number=next_period_number,
period_start=the_license.start_date + relativedelta(months=next_period_number - 1),
# Each period covers one month; it ends one day before the next period starts
period_end=the_license.start_date + relativedelta(months=next_period_number, days=-1),
status=PeriodStatus.UPCOMING,
upcoming_at=dt.now(tz.utc),
)
set_logging_information(new_license_period, dt.now(tz.utc))
try:
current_app.logger.debug(f"Creating next license period for tenant {tenant_id} ")
db.session.add(new_license_period)
db.session.commit()
current_app.logger.info(f"Created next license period for tenant {tenant_id} "
f"with id {new_license_period.id}")
return new_license_period
except SQLAlchemyError as e:
db.session.rollback()
current_app.logger.error(f"Error creating next license period for tenant {tenant_id}: {str(e)}")
raise e
@staticmethod
def _activate_license_period(license_period_id: int = None, license_period: LicensePeriod = None) -> LicensePeriod:
"""
Activate a license period
Args:
license_period_id: The ID of the license period to activate (optional if license_period is provided)
license_period: The LicensePeriod object to activate (optional if license_period_id is provided)
Returns:
LicensePeriod: The activated license period object
Raises:
ValueError: If neither license_period_id nor license_period is provided
"""
current_app.logger.debug(f"Activating license period")
if license_period is None and license_period_id is None:
raise ValueError("Either license_period_id or license_period must be provided")
# Get a license period object if only ID was provided
if license_period is None:
current_app.logger.debug(f"Getting license period {license_period_id} to activate")
license_period = LicensePeriod.query.get_or_404(license_period_id)
if license_period.pending_at is None:
license_period.pending_at = dt.now(tz.utc)
license_period.status = PeriodStatus.PENDING
if license_period.prepaid_payment:
# There is a payment received for the given period
license_period.active_at = dt.now(tz.utc)
license_period.status = PeriodStatus.ACTIVE
# Copy snapshot fields from the license to the period
the_license = License.query.get_or_404(license_period.license_id)
license_period.currency = the_license.currency
license_period.basic_fee = the_license.basic_fee
license_period.max_storage_mb = the_license.max_storage_mb
license_period.additional_storage_price = the_license.additional_storage_price
license_period.additional_storage_bucket = the_license.additional_storage_bucket
license_period.included_embedding_mb = the_license.included_embedding_mb
license_period.additional_embedding_price = the_license.additional_embedding_price
license_period.additional_embedding_bucket = the_license.additional_embedding_bucket
license_period.included_interaction_tokens = the_license.included_interaction_tokens
license_period.additional_interaction_token_price = the_license.additional_interaction_token_price
license_period.additional_interaction_bucket = the_license.additional_interaction_bucket
license_period.additional_storage_allowed = the_license.additional_storage_allowed
license_period.additional_embedding_allowed = the_license.additional_embedding_allowed
license_period.additional_interaction_allowed = the_license.additional_interaction_allowed
update_logging_information(license_period, dt.now(tz.utc))
if not license_period.license_usage:
license_period.license_usage = LicenseUsage(
tenant_id=license_period.tenant_id,
license_period_id=license_period.id,
)
license_period.license_usage.recalculate_storage()
try:
db.session.add(license_period)
db.session.add(license_period.license_usage)
db.session.commit()
except SQLAlchemyError as e:
db.session.rollback()
current_app.logger.error(f"Error activating license period {license_period_id}: {str(e)}")
raise e
return license_period
@staticmethod
def _complete_last_license_period(tenant_id) -> None:
"""
Complete the active or pending license period for a tenant. This is done by setting the status to COMPLETED.
Args:
tenant_id: The ID of the tenant
"""
# Find the license period for this tenant with status ACTIVE or PENDING
active_period = (
db.session.query(LicensePeriod)
.filter_by(tenant_id=tenant_id)
.filter(LicensePeriod.status.in_([PeriodStatus.ACTIVE, PeriodStatus.PENDING]))
.first()
)
# If no active period was found, there is nothing to do
if not active_period:
return
# Mark the found period as COMPLETED
active_period.status = PeriodStatus.COMPLETED
active_period.completed_at = dt.now(tz.utc)
update_logging_information(active_period, dt.now(tz.utc))
try:
db.session.add(active_period)
db.session.commit()
except SQLAlchemyError as e:
db.session.rollback()
current_app.logger.error(f"Error completing period {active_period.id} for {tenant_id}: {str(e)}")
raise e
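A hedged sketch of how a caller might use find_current_license_period_for_usage and handle the entitlement exceptions raised above; the helper function and the returned fields are illustrative, not part of this change:

from flask import current_app
from common.services.entitlements import LicensePeriodServices
from common.utils.eveai_exceptions import (
    EveAINoActiveLicense, EveAIPendingLicensePeriod, EveAILicensePeriodsExceeded,
)

def current_usage_snapshot(tenant_id: int) -> dict:
    """Illustrative helper: return a small summary of the usable license period, or {}."""
    try:
        period = LicensePeriodServices.find_current_license_period_for_usage(tenant_id)
        return {
            'period_id': period.id,
            'status': str(period.status),
            'storage_mb_used': period.license_usage.storage_mb_used,
        }
    except (EveAINoActiveLicense, EveAIPendingLicensePeriod, EveAILicensePeriodsExceeded) as e:
        current_app.logger.warning(f"No usable license period for tenant {tenant_id}: {e}")
        return {}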

View File

@@ -0,0 +1,67 @@
from flask import session, flash, current_app
from datetime import datetime as dt, timezone as tz
from sqlalchemy.exc import SQLAlchemyError
from common.extensions import db
from common.models.entitlements import PartnerServiceLicenseTier
from common.models.user import Partner
from common.utils.eveai_exceptions import EveAINoManagementPartnerService, EveAINoSessionPartner
from common.utils.model_logging_utils import set_logging_information
class LicenseTierServices:
@staticmethod
def associate_license_tier_with_partner(license_tier_id):
"""Associate a license tier with a partner"""
try:
partner_id = session['partner']['id']
# Get partner service (MANAGEMENT_SERVICE type)
partner = Partner.query.get(partner_id)
if not partner:
raise EveAINoSessionPartner()
# Find a management service for this partner
management_service = next((service for service in session['partner']['services']
if service.get('type') == 'MANAGEMENT_SERVICE'), None)
if not management_service:
flash("Cannot associate license tier with partner. No management service defined for partner", "danger")
current_app.logger.error(f"No Management Service defined for partner {partner_id}"
f"trying to associate license tier {license_tier_id}.")
raise EveAINoManagementPartnerService()
# Check if the association already exists
existing_association = PartnerServiceLicenseTier.query.filter_by(
partner_service_id=management_service['id'],
license_tier_id=license_tier_id
).first()
if existing_association:
# Association already exists, nothing to do
flash("License tier was already associated with partner", "info")
current_app.logger.info(f"Association between partner service {management_service['id']} and "
f"license tier {license_tier_id} already exists.")
return
# Create the association
association = PartnerServiceLicenseTier(
partner_service_id=management_service['id'],
license_tier_id=license_tier_id
)
set_logging_information(association, dt.now(tz.utc))
db.session.add(association)
db.session.commit()
flash("Successfully associated license tier to partner", "success")
current_app.logger.info(f"Successfully associated license tier {license_tier_id} with "
f"partner service {management_service['id']}")
return True
except SQLAlchemyError as e:
db.session.rollback()
flash("Failed to associated license tier with partner service due to an internal error. "
"Please contact the System Administrator", "danger")
current_app.logger.error(f"Error associating license tier {license_tier_id} with partner: {str(e)}")
raise e
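Note that associate_license_tier_with_partner reads the partner from the Flask session rather than from the database. A sketch of the session shape it assumes, with field names taken from the lookups above and values invented:

# Shape of session['partner'] assumed by LicenseTierServices (example values only)
session['partner'] = {
    'id': 3,
    'tenant_id': 42,
    'services': [
        {
            'id': 11,
            'type': 'MANAGEMENT_SERVICE',
            'permissions': {'can_create_tenant': True, 'can_assign_license': True},
        },
    ],
}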

View File

@@ -0,0 +1,143 @@
from dateutil.relativedelta import relativedelta
from flask import session, current_app, flash
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy.sql.expression import text
from common.extensions import db, cache_manager
from common.models.entitlements import PartnerServiceLicenseTier, License, LicenseUsage, LicensePeriod, PeriodStatus
from common.models.user import Partner, PartnerTenant
from common.services.entitlements import LicensePeriodServices
from common.utils.database import Database
from common.utils.eveai_exceptions import EveAINoManagementPartnerService, EveAINoActiveLicense, \
EveAIStorageQuotaExceeded, EveAIEmbeddingQuotaExceeded, EveAIInteractionQuotaExceeded, EveAILicensePeriodsExceeded, \
EveAIException
from common.utils.model_logging_utils import set_logging_information, update_logging_information
from datetime import datetime as dt, timezone as tz
from common.utils.security_utils import current_user_has_role
class LicenseUsageServices:
@staticmethod
def check_storage_and_embedding_quota(tenant_id: int, file_size_mb: float) -> None:
"""
Check if a tenant can add a new document without exceeding storage and embedding quotas
Args:
tenant_id: ID of the tenant
file_size_mb: Size of the file in MB
Raises:
EveAIStorageQuotaExceeded: If storage quota would be exceeded
EveAIEmbeddingQuotaExceeded: If embedding quota would be exceeded
EveAINoActiveLicense: If no active license is found
EveAIException: For other errors
"""
# Get active license period
license_period = LicensePeriodServices.find_current_license_period_for_usage(tenant_id)
# Early return if both overruns are allowed - no need to check usage at all
if license_period.additional_storage_allowed and license_period.additional_embedding_allowed:
return
# Check storage quota only if overruns are not allowed
if not license_period.additional_storage_allowed:
LicenseUsageServices._validate_storage_quota(license_period, file_size_mb)
# Check embedding quota only if overruns are not allowed
if not license_period.additional_embedding_allowed:
LicenseUsageServices._validate_embedding_quota(license_period, file_size_mb)
@staticmethod
def check_embedding_quota(tenant_id: int, file_size_mb: float) -> None:
"""
Check if a tenant can re-embed a document without exceeding embedding quota
Args:
tenant_id: ID of the tenant
file_size_mb: Size of the file in MB
Raises:
EveAIEmbeddingQuotaExceeded: If embedding quota would be exceeded
EveAINoActiveLicense: If no active license is found
EveAIException: For other errors
"""
# Get active license period
license_period = LicensePeriodServices.find_current_license_period_for_usage(tenant_id)
# Early return if embedding overruns are allowed - no need to check usage at all
if license_period.additional_embedding_allowed:
return
# Check embedding quota
LicenseUsageServices._validate_embedding_quota(license_period, file_size_mb)
@staticmethod
def check_interaction_quota(tenant_id: int) -> None:
"""
Check if a tenant can execute a specialist without exceeding the interaction quota. As it is impossible to
estimate the number of interaction tokens up front, we only check whether the quota is already exceeded,
so a limited overrun is possible.
Args:
tenant_id: ID of the tenant
Raises:
EveAIInteractionQuotaExceeded: If interaction quota would be exceeded
EveAINoActiveLicense: If no active license is found
EveAIException: For other errors
"""
# Get active license period
license_period = LicensePeriodServices.find_current_license_period_for_usage(tenant_id)
# Early return if interaction overruns are allowed - no need to check usage at all
if license_period.additional_interaction_allowed:
return
# Check interaction quota (the validator converts used tokens to millions of tokens)
LicenseUsageServices._validate_interaction_quota(license_period)
@staticmethod
def _validate_storage_quota(license_period: LicensePeriod, additional_mb: float) -> None:
"""Check storage quota and raise exception if exceeded"""
current_storage = license_period.license_usage.storage_mb_used or 0
projected_storage = current_storage + additional_mb
max_storage = license_period.max_storage_mb
# Hard limit check (we only get here if overruns are NOT allowed)
if projected_storage > max_storage:
raise EveAIStorageQuotaExceeded(
current_usage=current_storage,
limit=max_storage,
additional=additional_mb
)
@staticmethod
def _validate_embedding_quota(license_period: LicensePeriod, additional_mb: float) -> None:
"""Check embedding quota and raise exception if exceeded"""
current_embedding = license_period.license_usage.embedding_mb_used or 0
projected_embedding = current_embedding + additional_mb
max_embedding = license_period.included_embedding_mb
# Hard limit check (we only get here if overruns are NOT allowed)
if projected_embedding > max_embedding:
raise EveAIEmbeddingQuotaExceeded(
current_usage=current_embedding,
limit=max_embedding,
additional=additional_mb
)
@staticmethod
def _validate_interaction_quota(license_period) -> None:
"""Check interaction quota and raise exception if exceeded (tokens in millions). We might have an overrun!"""
current_tokens = (license_period.license_usage.interaction_total_tokens_used or 0) / 1_000_000
max_tokens = license_period.included_interaction_tokens
# Hard limit check (we only get here if overruns are NOT allowed)
if current_tokens > max_tokens:
raise EveAIInteractionQuotaExceeded(
current_usage=current_tokens,
limit=max_tokens
)
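A hedged example of guarding a document upload with the quota checks above; the upload helper itself is hypothetical:

from common.services.entitlements import LicenseUsageServices
from common.utils.eveai_exceptions import EveAIStorageQuotaExceeded, EveAIEmbeddingQuotaExceeded

def guarded_upload(tenant_id: int, file_bytes: bytes) -> bool:
    """Illustrative guard: block the upload when it would exceed storage or embedding quota."""
    file_size_mb = len(file_bytes) / (1024 * 1024)
    try:
        LicenseUsageServices.check_storage_and_embedding_quota(tenant_id, file_size_mb)
    except (EveAIStorageQuotaExceeded, EveAIEmbeddingQuotaExceeded):
        return False  # caller decides how to report the refusal to the user
    # proceed with the actual upload, e.g. store_document(tenant_id, file_bytes)  (hypothetical helper)
    return True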

View File

@@ -0,0 +1,222 @@
import uuid
from datetime import datetime as dt, timezone as tz
from typing import Dict, Any, Tuple, Optional
from flask import current_app
from sqlalchemy.exc import SQLAlchemyError
from common.extensions import db, cache_manager
from common.models.interaction import (
Specialist, EveAIAgent, EveAITask, EveAITool
)
from common.utils.celery_utils import current_celery
from common.utils.model_logging_utils import set_logging_information, update_logging_information
class SpecialistServices:
@staticmethod
def start_session() -> str:
return f"CHAT_SESSION_{uuid.uuid4()}"
@staticmethod
def execute_specialist(tenant_id, specialist_id, specialist_arguments, session_id, user_timezone) -> Dict[str, Any]:
task = current_celery.send_task(
'execute_specialist',
args=[tenant_id,
specialist_id,
specialist_arguments,
session_id,
user_timezone,
],
queue='llm_interactions'
)
return {
'task_id': task.id,
'status': 'queued',
}
@staticmethod
def initialize_specialist(specialist_id: int, specialist_type: str, specialist_version: str):
"""
Initialize an agentic specialist by creating all its components based on configuration.
Args:
specialist_id: ID of the specialist to initialize
specialist_type: Type of the specialist
specialist_version: Version of the specialist type to use
Raises:
ValueError: If specialist not found or invalid configuration
SQLAlchemyError: If database operations fail
"""
config = cache_manager.specialists_config_cache.get_config(specialist_type, specialist_version)
if not config:
raise ValueError(f"No configuration found for {specialist_type} version {specialist_version}")
if config['framework'] == 'langchain':
pass # Langchain does not require additional items to be initialized. All configuration is in the specialist.
specialist = Specialist.query.get(specialist_id)
if not specialist:
raise ValueError(f"Specialist with ID {specialist_id} not found")
if config['framework'] == 'crewai':
SpecialistServices.initialize_crewai_specialist(specialist, config)
@staticmethod
def initialize_crewai_specialist(specialist: Specialist, config: Dict[str, Any]):
timestamp = dt.now(tz=tz.utc)
try:
# Initialize agents
if 'agents' in config:
for agent_config in config['agents']:
SpecialistServices._create_agent(
specialist_id=specialist.id,
agent_type=agent_config['type'],
agent_version=agent_config['version'],
name=agent_config.get('name'),
description=agent_config.get('description'),
timestamp=timestamp
)
# Initialize tasks
if 'tasks' in config:
for task_config in config['tasks']:
SpecialistServices._create_task(
specialist_id=specialist.id,
task_type=task_config['type'],
task_version=task_config['version'],
name=task_config.get('name'),
description=task_config.get('description'),
timestamp=timestamp
)
# Initialize tools
if 'tools' in config:
for tool_config in config['tools']:
SpecialistServices._create_tool(
specialist_id=specialist.id,
tool_type=tool_config['type'],
tool_version=tool_config['version'],
name=tool_config.get('name'),
description=tool_config.get('description'),
timestamp=timestamp
)
db.session.commit()
current_app.logger.info(f"Successfully initialized crewai specialist {specialist.id}")
except SQLAlchemyError as e:
db.session.rollback()
current_app.logger.error(f"Database error initializing crewai specialist {specialist.id}: {str(e)}")
raise
except Exception as e:
db.session.rollback()
current_app.logger.error(f"Error initializing crewai specialist {specialist.id}: {str(e)}")
raise
@staticmethod
def _create_agent(
specialist_id: int,
agent_type: str,
agent_version: str,
name: Optional[str] = None,
description: Optional[str] = None,
timestamp: Optional[dt] = None
) -> EveAIAgent:
"""Create an agent with the given configuration."""
if timestamp is None:
timestamp = dt.now(tz=tz.utc)
# Get agent configuration from cache
agent_config = cache_manager.agents_config_cache.get_config(agent_type, agent_version)
agent = EveAIAgent(
specialist_id=specialist_id,
name=name or agent_config.get('name', agent_type),
description=description or agent_config.get('metadata', {}).get('description', ''),
type=agent_type,
type_version=agent_version,
role=None,
goal=None,
backstory=None,
tuning=False,
configuration=None,
arguments=None
)
set_logging_information(agent, timestamp)
db.session.add(agent)
current_app.logger.info(f"Created agent {agent.id} of type {agent_type}")
return agent
@staticmethod
def _create_task(
specialist_id: int,
task_type: str,
task_version: str,
name: Optional[str] = None,
description: Optional[str] = None,
timestamp: Optional[dt] = None
) -> EveAITask:
"""Create a task with the given configuration."""
if timestamp is None:
timestamp = dt.now(tz=tz.utc)
# Get task configuration from cache
task_config = cache_manager.tasks_config_cache.get_config(task_type, task_version)
task = EveAITask(
specialist_id=specialist_id,
name=name or task_config.get('name', task_type),
description=description or task_config.get('metadata', {}).get('description', ''),
type=task_type,
type_version=task_version,
task_description=None,
expected_output=None,
tuning=False,
configuration=None,
arguments=None,
context=None,
asynchronous=False,
)
set_logging_information(task, timestamp)
db.session.add(task)
current_app.logger.info(f"Created task {task.id} of type {task_type}")
return task
@staticmethod
def _create_tool(
specialist_id: int,
tool_type: str,
tool_version: str,
name: Optional[str] = None,
description: Optional[str] = None,
timestamp: Optional[dt] = None
) -> EveAITool:
"""Create a tool with the given configuration."""
if timestamp is None:
timestamp = dt.now(tz=tz.utc)
# Get tool configuration from cache
tool_config = cache_manager.tools_config_cache.get_config(tool_type, tool_version)
tool = EveAITool(
specialist_id=specialist_id,
name=name or tool_config.get('name', tool_type),
description=description or tool_config.get('metadata', {}).get('description', ''),
type=tool_type,
type_version=tool_version,
tuning=False,
configuration=None,
arguments=None,
)
set_logging_information(tool, timestamp)
db.session.add(tool)
current_app.logger.info(f"Created tool {tool.id} of type {tool_type}")
return tool
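A rough end-to-end sketch of the flow this service supports: create a Specialist record, initialize its crewai components from the cached YAML configuration, and queue an execution. The Specialist constructor arguments, the import path of SpecialistServices, the specialist type name and the arguments dict are assumptions made for the example:

from common.extensions import db
from common.models.interaction import Specialist
from common.services.interaction.specialist_services import SpecialistServices  # import path assumed

# Hypothetical specialist type; constructor arguments are assumptions
specialist = Specialist(tenant_id=42, type='EXAMPLE_SPECIALIST', type_version='1.0.0')
db.session.add(specialist)
db.session.commit()

# Create the agents/tasks/tools listed in the specialist's YAML configuration
SpecialistServices.initialize_specialist(specialist.id, specialist.type, specialist.type_version)

# Queue an execution on the llm_interactions Celery queue
session_id = SpecialistServices.start_session()
result = SpecialistServices.execute_specialist(
    tenant_id=42,
    specialist_id=specialist.id,
    specialist_arguments={'input': 'example question'},
    session_id=session_id,
    user_timezone='Europe/Brussels',
)
# result -> {'task_id': '<celery task id>', 'status': 'queued'}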

View File

@@ -0,0 +1,5 @@
from common.services.user.user_services import UserServices
from common.services.user.partner_services import PartnerServices
from common.services.user.tenant_services import TenantServices
__all__ = ['UserServices', 'PartnerServices', 'TenantServices']

View File

@@ -0,0 +1,47 @@
from typing import List
from flask import session
from sqlalchemy.exc import SQLAlchemyError
from common.models.entitlements import PartnerServiceLicenseTier
from common.utils.eveai_exceptions import EveAINoManagementPartnerService, EveAINoSessionPartner
from common.utils.security_utils import current_user_has_role
class PartnerServices:
@staticmethod
def get_allowed_license_tier_ids() -> List[int]:
"""
Retrieve IDs of all License Tiers associated with the partner's management service
Returns:
List of license tier IDs
Raises:
EveAINoSessionPartner: If no partner is in the session
EveAINoManagementPartnerService: If partner has no management service
"""
partner = session.get("partner", None)
if not partner:
raise EveAINoSessionPartner()
# Find a management service for this partner
management_service = next((service for service in session['partner']['services']
if service.get('type') == 'MANAGEMENT_SERVICE'), None)
if not management_service:
raise EveAINoManagementPartnerService()
management_service_id = management_service['id']
# Query for all license tiers associated with this management service
associations = PartnerServiceLicenseTier.query.filter_by(
partner_service_id=management_service_id
).all()
# Extract the license tier IDs
license_tier_ids = [assoc.license_tier_id for assoc in associations]
return license_tier_ids

View File

@@ -0,0 +1,175 @@
from typing import Dict, List
from flask import session, current_app
from sqlalchemy.exc import SQLAlchemyError
from common.extensions import db, cache_manager
from common.models.user import Partner, PartnerTenant, PartnerService, Tenant
from common.utils.eveai_exceptions import EveAINoManagementPartnerService
from common.utils.model_logging_utils import set_logging_information
from datetime import datetime as dt, timezone as tz
from common.utils.security_utils import current_user_has_role
class TenantServices:
@staticmethod
def associate_tenant_with_partner(tenant_id):
"""Associate a tenant with a partner"""
try:
partner_id = session['partner']['id']
# Get partner service (MANAGEMENT_SERVICE type)
partner = Partner.query.get(partner_id)
if not partner:
return
# Find a management service for this partner
management_service = next((service for service in session['partner']['services']
if service.get('type') == 'MANAGEMENT_SERVICE'), None)
if not management_service:
current_app.logger.error(f"No Management Service defined for partner {partner_id} "
f"while associating tenant {tenant_id} with partner.")
raise EveAINoManagementPartnerService()
# Create the association
tenant_partner = PartnerTenant(
partner_service_id=management_service['id'],
tenant_id=tenant_id,
)
set_logging_information(tenant_partner, dt.now(tz.utc))
db.session.add(tenant_partner)
db.session.commit()
except SQLAlchemyError as e:
db.session.rollback()
current_app.logger.error(f"Error associating tenant {tenant_id} with partner: {str(e)}")
raise e
@staticmethod
def get_available_types_for_tenant(tenant_id: int, config_type: str) -> Dict[str, Dict[str, str]]:
"""
Get available configuration types for a tenant based on partner relationships
Args:
tenant_id: The tenant ID
config_type: The configuration type ('specialists', 'agents', 'tasks', etc.)
Returns:
Dictionary of available types for the tenant
"""
# Get the appropriate cache handler based on config_type
cache_handler = None
if config_type == 'specialists':
cache_handler = cache_manager.specialists_types_cache
elif config_type == 'agents':
cache_handler = cache_manager.agents_types_cache
elif config_type == 'tasks':
cache_handler = cache_manager.tasks_types_cache
elif config_type == 'tools':
cache_handler = cache_manager.tools_types_cache
else:
raise ValueError(f"Unsupported config type: {config_type}")
# Get all types with their metadata (including partner info)
all_types = cache_handler.get_types()
# Filter to include:
# 1. Types with no partner (global)
# 2. Types with partners that have a SPECIALIST_SERVICE relationship with this tenant
available_partners = TenantServices.get_tenant_partner_names(tenant_id)
available_types = {
type_id: info for type_id, info in all_types.items()
if info.get('partner') is None or info.get('partner') in available_partners
}
return available_types
@staticmethod
def get_tenant_partner_names(tenant_id: int) -> List[str]:
"""
Get names of partners that have a SPECIALIST_SERVICE relationship with this tenant
Args:
tenant_id: The tenant ID
Returns:
List of partner names (tenant names)
"""
# Find all PartnerTenant relationships for this tenant
partner_names = []
try:
# Get all partner services of type SPECIALIST_SERVICE
specialist_services = (
PartnerService.query
.filter_by(type='SPECIALIST_SERVICE')
.all()
)
if not specialist_services:
return []
# Find tenant relationships with these services
partner_tenants = (
PartnerTenant.query
.filter_by(tenant_id=tenant_id)
.filter(PartnerTenant.partner_service_id.in_([svc.id for svc in specialist_services]))
.all()
)
# Get the partner names (their tenant names)
for pt in partner_tenants:
partner_service = (
PartnerService.query
.filter_by(id=pt.partner_service_id)
.first()
)
if partner_service:
partner = Partner.query.get(partner_service.partner_id)
if partner:
# Get the tenant associated with this partner
partner_tenant = Tenant.query.get(partner.tenant_id)
if partner_tenant:
partner_names.append(partner_tenant.name)
except SQLAlchemyError as e:
current_app.logger.error(f"Database error retrieving partner names: {str(e)}")
return partner_names
@staticmethod
def can_use_specialist_type(tenant_id: int, specialist_type: str) -> bool:
"""
Check if a tenant can use a specific specialist type
Args:
tenant_id: The tenant ID
specialist_type: The specialist type ID
Returns:
True if the tenant can use the specialist type, False otherwise
"""
# Get the specialist type definition
try:
specialist_types = cache_manager.specialists_types_cache.get_types()
specialist_def = specialist_types.get(specialist_type)
if not specialist_def:
return False
# If it's a global specialist, anyone can use it
if specialist_def.get('partner') is None:
return True
# If it's a partner-specific specialist, check if tenant has access
partner_name = specialist_def.get('partner')
available_partners = TenantServices.get_tenant_partner_names(tenant_id)
return partner_name in available_partners
except Exception as e:
current_app.logger.error(f"Error checking specialist type access: {str(e)}")
return False
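A short, hedged example of how the partner-aware filtering above might be used; the type identifier and the import path are invented:

from common.services.user.tenant_services import TenantServices  # import path assumed

# All specialist types the tenant may use: global types plus types owned by partners
# that have a SPECIALIST_SERVICE relationship with this tenant
available = TenantServices.get_available_types_for_tenant(tenant_id=7, config_type='specialists')

if TenantServices.can_use_specialist_type(7, 'EXAMPLE_PARTNER_SPECIALIST'):
    pass  # tenant 7 is entitled to the partner-specific type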

View File

@@ -0,0 +1,95 @@
from flask import session
from common.models.user import Partner, Role, PartnerTenant
from common.utils.eveai_exceptions import EveAIRoleAssignmentException
from common.utils.security_utils import current_user_has_role
class UserServices:
@staticmethod
def get_assignable_roles():
"""Retrieves roles that can be assigned to a user depending on the current user logged in,
and the active tenant for the session"""
current_tenant_id = session.get('tenant', {}).get('id', None)
effective_role_names = []
if current_tenant_id == 1:
if current_user_has_role("Super User"):
effective_role_names.append("Super User")
elif current_tenant_id:
if current_user_has_role("Tenant Admin"):
effective_role_names.append("Tenant Admin")
if current_user_has_role("Partner Admin") or current_user_has_role("Super User"):
effective_role_names.append("Tenant Admin")
if session.get('partner'):
if session.get('partner').get('tenant_id') == current_tenant_id:
effective_role_names.append("Partner Admin")
effective_role_names = list(set(effective_role_names))
effective_roles = [(role.id, role.name) for role in
Role.query.filter(Role.name.in_(effective_role_names)).all()]
return effective_roles
@staticmethod
def validate_role_assignments(role_ids):
"""Validate a set of role assignments, raising exception for first invalid role"""
assignable_roles = UserServices.get_assignable_roles()
assignable_role_ids = {role[0] for role in assignable_roles}
role_id_set = set(role_ids)
return role_id_set.issubset(assignable_role_ids)
@staticmethod
def can_user_edit_tenant(tenant_id) -> bool:
if current_user_has_role('Super User'):
return True
elif current_user_has_role('Partner Admin'):
partner = session.get('partner', None)
if partner and partner["tenant_id"] == tenant_id:
return True
partner_service = next((service for service in session['partner']['services']
if service.get('type') == 'MANAGEMENT_SERVICE'), None)
if not partner_service:
return False
else:
partner_tenant = PartnerTenant.query.filter(
PartnerTenant.tenant_id == tenant_id,
PartnerTenant.partner_service_id == partner_service['id'],
).first()
if partner_tenant:
return True
else:
return False
else:
return False
@staticmethod
def can_user_create_tenant() -> bool:
if current_user_has_role('Super User'):
return True
elif current_user_has_role('Partner Admin'):
partner_id = session['partner']['id']
partner_service = next((service for service in session['partner']['services']
if service.get('type') == 'MANAGEMENT_SERVICE'), None)
if not partner_service:
return False
else:
partner_permissions = partner_service.get('permissions') or {}
return partner_permissions.get('can_create_tenant', False)
else:
return False
@staticmethod
def can_user_assign_license() -> bool:
if current_user_has_role('Super User'):
return True
elif current_user_has_role('Partner Admin'):
partner_id = session['partner']['id']
partner_service = next((service for service in session['partner']['services']
if service.get('type') == 'MANAGEMENT_SERVICE'), None)
if not partner_service:
return False
else:
partner_permissions = partner_service.get('permissions') or {}
return partner_permissions.get('can_assign_license', False)
else:
return False
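An illustrative guard in a management view built on the checks above; route wiring is omitted and the import path is assumed:

from flask import abort
from common.services.user.user_services import UserServices  # import path assumed

def edit_tenant_view(tenant_id: int):
    """Illustrative view body: refuse access unless the current user may edit this tenant."""
    if not UserServices.can_user_edit_tenant(tenant_id):
        abort(403)
    assignable = UserServices.get_assignable_roles()  # [(role_id, role_name), ...]
    return assignable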

BIN common/utils/.DS_Store (binary file not shown)

View File

@@ -5,7 +5,6 @@ from sqlalchemy.exc import SQLAlchemyError
from common.extensions import cache_manager, minio_client, db
from common.models.interaction import EveAIAsset, EveAIAssetVersion
from common.utils.document_utils import mark_tenant_storage_dirty
from common.utils.model_logging_utils import set_logging_information
@@ -55,7 +54,8 @@ def create_version_for_asset(asset, tenant_id):
def add_asset_version_file(asset_version, field_name, file, tenant_id):
object_name, file_size = minio_client.upload_file(asset_version.bucket_name, asset_version.id, field_name,
file.content_type)
mark_tenant_storage_dirty(tenant_id)
# mark_tenant_storage_dirty(tenant_id)
# TODO - make sure the storage recalculation happens immediately!
return object_name

View File

@@ -6,16 +6,18 @@ from datetime import datetime
from typing import Dict, Any, Optional, List
from datetime import datetime as dt, timezone as tz
import logging
from prometheus_client import Counter, Histogram, Gauge, Summary
from flask import current_app
from prometheus_client import Counter, Histogram, Gauge, Summary, push_to_gateway, REGISTRY
from .business_event_context import BusinessEventContext
from common.models.entitlements import BusinessEventLog
from common.extensions import db
from .celery_utils import current_celery
from common.utils.performance_monitoring import EveAIMetrics
from common.utils.prometheus_utils import sanitize_label
# Standard duration buckets for all histograms
DURATION_BUCKETS = EveAIMetrics.get_standard_buckets()
DURATION_BUCKETS = [0.1, 0.5, 1, 2.5, 5, 10, 15, 30, 60, 120, 240, 360, float('inf')]
# Prometheus metrics for business events
TRACE_COUNTER = Counter(
@@ -104,6 +106,7 @@ class BusinessEvent:
'total_tokens': 0,
'prompt_tokens': 0,
'completion_tokens': 0,
'nr_of_pages': 0,
'total_time': 0,
'call_count': 0,
'interaction_type': None
@@ -112,14 +115,17 @@ class BusinessEvent:
# Prometheus label values must be strings
self.tenant_id_str = str(self.tenant_id)
self.event_type_str = sanitize_label(self.event_type)
self.specialist_id_str = str(self.specialist_id) if self.specialist_id else ""
self.specialist_type_str = str(self.specialist_type) if self.specialist_type else ""
self.specialist_type_version_str = str(self.specialist_type_version) if self.specialist_type_version else ""
self.specialist_type_version_str = sanitize_label(str(self.specialist_type_version)) \
if self.specialist_type_version else ""
self.span_name_str = ""
# Increment concurrent events gauge when initialized
CONCURRENT_TRACES.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
event_type=self.event_type_str,
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
@@ -128,12 +134,14 @@ class BusinessEvent:
# Increment trace counter
TRACE_COUNTER.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
event_type=self.event_type_str,
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
).inc()
self._push_to_gateway()
def update_attribute(self, attribute: str, value: any):
if hasattr(self, attribute):
setattr(self, attribute, value)
@@ -143,78 +151,86 @@ class BusinessEvent:
elif attribute == 'specialist_type':
self.specialist_type_str = str(value) if value else ""
elif attribute == 'specialist_type_version':
self.specialist_type_version_str = str(value) if value else ""
self.specialist_type_version_str = sanitize_label(str(value)) if value else ""
elif attribute == 'tenant_id':
self.tenant_id_str = str(value)
elif attribute == 'event_type':
self.event_type_str = sanitize_label(value)
elif attribute == 'span_name':
self.span_name_str = sanitize_label(value)
else:
raise AttributeError(f"'{self.__class__.__name__}' object has no attribute '{attribute}'")
def update_llm_metrics(self, metrics: dict):
self.llm_metrics['total_tokens'] += metrics['total_tokens']
self.llm_metrics['prompt_tokens'] += metrics['prompt_tokens']
self.llm_metrics['completion_tokens'] += metrics['completion_tokens']
self.llm_metrics['total_time'] += metrics['time_elapsed']
self.llm_metrics['total_tokens'] += metrics.get('total_tokens', 0)
self.llm_metrics['prompt_tokens'] += metrics.get('prompt_tokens', 0)
self.llm_metrics['completion_tokens'] += metrics.get('completion_tokens', 0)
self.llm_metrics['nr_of_pages'] += metrics.get('nr_of_pages', 0)
self.llm_metrics['total_time'] += metrics.get('time_elapsed', 0)
self.llm_metrics['call_count'] += 1
self.llm_metrics['interaction_type'] = metrics['interaction_type']
# Track in Prometheus metrics
interaction_type = metrics['interaction_type']
interaction_type_str = sanitize_label(metrics['interaction_type']) if metrics['interaction_type'] else ""
# Track token usage
LLM_TOKENS_COUNTER.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
interaction_type=interaction_type,
event_type=self.event_type_str,
interaction_type=interaction_type_str,
token_type='total',
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
).inc(metrics['total_tokens'])
).inc(metrics.get('total_tokens', 0))
LLM_TOKENS_COUNTER.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
interaction_type=interaction_type,
event_type=self.event_type_str,
interaction_type=interaction_type_str,
token_type='prompt',
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
).inc(metrics['prompt_tokens'])
).inc(metrics.get('prompt_tokens', 0))
LLM_TOKENS_COUNTER.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
interaction_type=interaction_type,
event_type=self.event_type_str,
interaction_type=interaction_type_str,
token_type='completion',
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
).inc(metrics['completion_tokens'])
).inc(metrics.get('completion_tokens', 0))
# Track duration
LLM_DURATION.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
interaction_type=interaction_type,
event_type=self.event_type_str,
interaction_type=interaction_type_str,
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
).observe(metrics['time_elapsed'])
).observe(metrics.get('time_elapsed', 0))
# Track call count
LLM_CALLS_COUNTER.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
interaction_type=interaction_type,
event_type=self.event_type_str,
interaction_type=interaction_type_str,
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
).inc()
self._push_to_gateway()
def reset_llm_metrics(self):
self.llm_metrics['total_tokens'] = 0
self.llm_metrics['prompt_tokens'] = 0
self.llm_metrics['completion_tokens'] = 0
self.llm_metrics['nr_of_pages'] = 0
self.llm_metrics['total_time'] = 0
self.llm_metrics['call_count'] = 0
self.llm_metrics['interaction_type'] = None
@@ -236,6 +252,7 @@ class BusinessEvent:
# Set the new span info
self.span_id = new_span_id
self.span_name = span_name
self.span_name_str = sanitize_label(span_name) if span_name else ""
self.parent_span_id = parent_span_id
# Track start time for the span
@@ -244,8 +261,8 @@ class BusinessEvent:
# Increment span metrics - using span_name as activity_name for metrics
SPAN_COUNTER.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
activity_name=span_name,
event_type=self.event_type_str,
activity_name=self.span_name_str,
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
@@ -254,13 +271,15 @@ class BusinessEvent:
# Increment concurrent spans gauge
CONCURRENT_SPANS.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
activity_name=span_name,
event_type=self.event_type_str,
activity_name=self.span_name_str,
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
).inc()
self._push_to_gateway()
self.log(f"Start")
try:
@@ -272,8 +291,8 @@ class BusinessEvent:
# Observe span duration
SPAN_DURATION.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
activity_name=span_name,
event_type=self.event_type_str,
activity_name=self.span_name_str,
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
@@ -282,13 +301,15 @@ class BusinessEvent:
# Decrement concurrent spans gauge
CONCURRENT_SPANS.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
activity_name=span_name,
event_type=self.event_type_str,
activity_name=self.span_name_str,
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
).dec()
self._push_to_gateway()
if self.llm_metrics['call_count'] > 0:
self.log_final_metrics()
self.reset_llm_metrics()
@@ -296,10 +317,12 @@ class BusinessEvent:
# Restore the previous span info
if self.spans:
self.span_id, self.span_name, self.parent_span_id = self.spans.pop()
self.span_name_str = sanitize_label(self.span_name) if self.span_name else ""
else:
self.span_id = None
self.span_name = None
self.parent_span_id = None
self.span_name_str = ""
@asynccontextmanager
async def create_span_async(self, span_name: str):
@@ -314,6 +337,7 @@ class BusinessEvent:
# Set the new span info
self.span_id = new_span_id
self.span_name = span_name
self.span_name_str = sanitize_label(span_name) if span_name else ""
self.parent_span_id = parent_span_id
# Track start time for the span
@@ -322,8 +346,8 @@ class BusinessEvent:
# Increment span metrics - using span_name as activity_name for metrics
SPAN_COUNTER.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
activity_name=span_name,
event_type=self.event_type_str,
activity_name=self.span_name_str,
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
@@ -332,13 +356,15 @@ class BusinessEvent:
# Increment concurrent spans gauge
CONCURRENT_SPANS.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
activity_name=span_name,
event_type=self.event_type_str,
activity_name=self.span_name_str,
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
).inc()
self._push_to_gateway()
self.log(f"Start")
try:
@@ -350,8 +376,8 @@ class BusinessEvent:
# Observe span duration
SPAN_DURATION.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
activity_name=span_name,
event_type=self.event_type_str,
activity_name=self.span_name_str,
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
@@ -360,13 +386,15 @@ class BusinessEvent:
# Decrement concurrent spans gauge
CONCURRENT_SPANS.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
activity_name=span_name,
event_type=self.event_type_str,
activity_name=self.span_name_str,
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
).dec()
self._push_to_gateway()
if self.llm_metrics['call_count'] > 0:
self.log_final_metrics()
self.reset_llm_metrics()
@@ -374,10 +402,12 @@ class BusinessEvent:
# Restore the previous span info
if self.spans:
self.span_id, self.span_name, self.parent_span_id = self.spans.pop()
self.span_name_str = sanitize_label(self.span_name) if self.span_name else ""
else:
self.span_id = None
self.span_name = None
self.parent_span_id = None
self.span_name_str = ""
def log(self, message: str, level: str = 'info', extra_fields: Dict[str, Any] = None):
log_data = {
@@ -429,10 +459,11 @@ class BusinessEvent:
'specialist_type': self.specialist_type,
'specialist_type_version': self.specialist_type_version,
'environment': self.environment,
'llm_metrics_total_tokens': metrics['total_tokens'],
'llm_metrics_prompt_tokens': metrics['prompt_tokens'],
'llm_metrics_completion_tokens': metrics['completion_tokens'],
'llm_metrics_total_time': metrics['time_elapsed'],
'llm_metrics_total_tokens': metrics.get('total_tokens', 0),
'llm_metrics_prompt_tokens': metrics.get('prompt_tokens', 0),
'llm_metrics_completion_tokens': metrics.get('completion_tokens', 0),
'llm_metrics_nr_of_pages': metrics.get('nr_of_pages', 0),
'llm_metrics_total_time': metrics.get('time_elapsed', 0),
'llm_interaction_type': metrics['interaction_type'],
'message': message,
}
@@ -460,6 +491,7 @@ class BusinessEvent:
'llm_metrics_total_tokens': self.llm_metrics['total_tokens'],
'llm_metrics_prompt_tokens': self.llm_metrics['prompt_tokens'],
'llm_metrics_completion_tokens': self.llm_metrics['completion_tokens'],
'llm_metrics_nr_of_pages': self.llm_metrics['nr_of_pages'],
'llm_metrics_total_time': self.llm_metrics['total_time'],
'llm_metrics_call_count': self.llm_metrics['call_count'],
'llm_interaction_type': self.llm_metrics['interaction_type'],
@@ -526,6 +558,17 @@ class BusinessEvent:
# Clear the buffer after sending
self._log_buffer = []
def _push_to_gateway(self):
# Push metrics to the gateway
try:
push_to_gateway(
current_app.config['PUSH_GATEWAY_URL'],
job=current_app.config['COMPONENT_NAME'],
registry=REGISTRY
)
except Exception as e:
current_app.logger.error(f"Failed to push metrics to Prometheus Push Gateway: {e}")
def __enter__(self):
self.trace_start_time = time.time()
self.log(f'Starting Trace for {self.event_type}')
@@ -537,7 +580,7 @@ class BusinessEvent:
# Record trace duration
TRACE_DURATION.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
event_type=self.event_type_str,
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
@@ -546,18 +589,22 @@ class BusinessEvent:
# Decrement concurrent traces gauge
CONCURRENT_TRACES.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
event_type=self.event_type_str,
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
).dec()
self._push_to_gateway()
if self.llm_metrics['call_count'] > 0:
self.log_final_metrics()
self.reset_llm_metrics()
self.log(f'Ending Trace for {self.event_type}', extra_fields={'trace_duration': trace_total_time})
self._flush_log_buffer()
return BusinessEventContext(self).__exit__(exc_type, exc_val, exc_tb)
async def __aenter__(self):
@@ -571,7 +618,7 @@ class BusinessEvent:
# Record trace duration
TRACE_DURATION.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
event_type=self.event_type_str,
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
@@ -580,12 +627,14 @@ class BusinessEvent:
# Decrement concurrent traces gauge
CONCURRENT_TRACES.labels(
tenant_id=self.tenant_id_str,
event_type=self.event_type,
event_type=self.event_type_str,
specialist_id=self.specialist_id_str,
specialist_type=self.specialist_type_str,
specialist_type_version=self.specialist_type_version_str
).dec()
self._push_to_gateway()
if self.llm_metrics['call_count'] > 0:
self.log_final_metrics()
self.reset_llm_metrics()
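The diff above adds explicit pushes after every metric update, using prometheus_client's push_to_gateway against the process-wide REGISTRY. A minimal standalone sketch of that pattern; the gateway address and job name are placeholders:

from prometheus_client import CollectorRegistry, Counter, push_to_gateway

registry = CollectorRegistry()
events = Counter('eveai_example_events_total', 'Example business events', registry=registry)
events.inc()

try:
    # Same call as in BusinessEvent._push_to_gateway, but with hard-coded placeholders
    push_to_gateway('localhost:9091', job='eveai_example', registry=registry)
except Exception as exc:
    print(f"Failed to push metrics to the Pushgateway: {exc}")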

View File

@@ -7,7 +7,7 @@ from flask import current_app
from common.utils.cache.base import CacheHandler, CacheKey
from config.type_defs import agent_types, task_types, tool_types, specialist_types, retriever_types, prompt_types, \
catalog_types
catalog_types, partner_service_types, processor_types, customisation_types
def is_major_minor(version: str) -> bool:
@@ -59,16 +59,10 @@ class BaseConfigCacheHandler(CacheHandler[Dict[str, Any]]):
"""Set the version tree cache dependency."""
self.version_tree_cache = cache
def _load_specific_config(self, type_name: str, version_str: str) -> Dict[str, Any]:
def _load_specific_config(self, type_name: str, version_str: str = 'latest') -> Dict[str, Any]:
"""
Load a specific configuration version
Args:
type_name: Type name
version_str: Version string
Returns:
Configuration data
Automatically handles global vs partner-specific configs
"""
version_tree = self.version_tree_cache.get_versions(type_name)
versions = version_tree['versions']
@@ -79,11 +73,16 @@ class BaseConfigCacheHandler(CacheHandler[Dict[str, Any]]):
if version_str not in versions:
raise ValueError(f"Version {version_str} not found for {type_name}")
file_path = versions[version_str]['file_path']
version_info = versions[version_str]
file_path = version_info['file_path']
partner = version_info.get('partner')
try:
with open(file_path) as f:
config = yaml.safe_load(f)
# Add partner information to the config
if partner:
config['partner'] = partner
return config
except Exception as e:
raise ValueError(f"Error loading config from {file_path}: {e}")
@@ -133,20 +132,37 @@ class BaseConfigVersionTreeCacheHandler(CacheHandler[Dict[str, Any]]):
def _load_version_tree(self, type_name: str) -> Dict[str, Any]:
"""
Load version tree for a specific type without loading full configurations
Args:
type_name: Name of configuration type
Returns:
Dict containing available versions and their metadata
Checks both global and partner-specific directories
"""
type_path = Path(self._config_dir) / type_name
if not type_path.exists():
# First check the global path
global_path = Path(self._config_dir) / "globals" / type_name
# If global path doesn't exist, check if the type exists directly in the root
# (for backward compatibility)
if not global_path.exists():
global_path = Path(self._config_dir) / type_name
if not global_path.exists():
# Check if it exists in any partner subdirectories
partner_dirs = [d for d in Path(self._config_dir).iterdir()
if d.is_dir() and d.name != "globals"]
for partner_dir in partner_dirs:
partner_type_path = partner_dir / type_name
if partner_type_path.exists():
# Found in partner directory
return self._load_versions_from_path(partner_type_path)
# If we get here, the type wasn't found anywhere
raise ValueError(f"No configuration found for type {type_name}")
version_files = list(type_path.glob('*.yaml'))
return self._load_versions_from_path(global_path)
def _load_versions_from_path(self, path: Path) -> Dict[str, Any]:
"""Load all versions from a specific path"""
version_files = list(path.glob('*.yaml'))
if not version_files:
raise ValueError(f"No versions found for type {type_name}")
raise ValueError(f"No versions found in {path}")
versions = {}
latest_version = None
@@ -160,9 +176,17 @@ class BaseConfigVersionTreeCacheHandler(CacheHandler[Dict[str, Any]]):
with open(file_path) as f:
yaml_data = yaml.safe_load(f)
metadata = yaml_data.get('metadata', {})
# Add partner information if available
partner = None
if "globals" not in str(file_path):
# Extract partner name from path
# Path format: config_dir/partner_name/type_name/version.yaml
partner = file_path.parent.parent.name
versions[ver] = {
'metadata': metadata,
'file_path': str(file_path)
'file_path': str(file_path),
'partner': partner
}
# Track latest version
@@ -316,7 +340,8 @@ class BaseConfigTypesCacheHandler(CacheHandler[Dict[str, Any]]):
type_definitions = {
type_id: {
'name': info['name'],
'description': info['description']
'description': info['description'],
'partner': info.get('partner') # Include partner info if available
}
for type_id, info in self._types_module.items()
}
@@ -422,7 +447,6 @@ PromptConfigCacheHandler, PromptConfigVersionTreeCacheHandler, PromptConfigTypes
config_type='prompts',
config_dir='config/prompts',
types_module=prompt_types.PROMPT_TYPES
))
CatalogConfigCacheHandler, CatalogConfigVersionTreeCacheHandler, CatalogConfigTypesCacheHandler = (
@@ -430,9 +454,30 @@ CatalogConfigCacheHandler, CatalogConfigVersionTreeCacheHandler, CatalogConfigTy
config_type='catalogs',
config_dir='config/catalogs',
types_module=catalog_types.CATALOG_TYPES
))
ProcessorConfigCacheHandler, ProcessorConfigVersionTreeCacheHandler, ProcessorConfigTypesCacheHandler = (
create_config_cache_handlers(
config_type='processors',
config_dir='config/processors',
types_module=processor_types.PROCESSOR_TYPES
))
PartnerServiceConfigCacheHandler, PartnerServiceConfigVersionTreeCacheHandler, PartnerServiceConfigTypesCacheHandler = (
create_config_cache_handlers(
config_type='partner_services',
config_dir='config/partner_services',
types_module=partner_service_types.PARTNER_SERVICE_TYPES
))
CustomisationConfigCacheHandler, CustomisationConfigVersionTreeCacheHandler, CustomisationConfigTypesCacheHandler = (
create_config_cache_handlers(
config_type='customisations',
config_dir='config/customisations',
types_module=customisation_types.CUSTOMISATION_TYPES
)
)
def register_config_cache_handlers(cache_manager) -> None:
cache_manager.register_handler(AgentConfigCacheHandler, 'eveai_config')
@@ -456,9 +501,18 @@ def register_config_cache_handlers(cache_manager) -> None:
cache_manager.register_handler(CatalogConfigCacheHandler, 'eveai_config')
cache_manager.register_handler(CatalogConfigTypesCacheHandler, 'eveai_config')
cache_manager.register_handler(CatalogConfigVersionTreeCacheHandler, 'eveai_config')
cache_manager.register_handler(ProcessorConfigCacheHandler, 'eveai_config')
cache_manager.register_handler(ProcessorConfigTypesCacheHandler, 'eveai_config')
cache_manager.register_handler(ProcessorConfigVersionTreeCacheHandler, 'eveai_config')
cache_manager.register_handler(AgentConfigCacheHandler, 'eveai_config')
cache_manager.register_handler(AgentConfigTypesCacheHandler, 'eveai_config')
cache_manager.register_handler(AgentConfigVersionTreeCacheHandler, 'eveai_config')
cache_manager.register_handler(PartnerServiceConfigCacheHandler, 'eveai_config')
cache_manager.register_handler(PartnerServiceConfigTypesCacheHandler, 'eveai_config')
cache_manager.register_handler(PartnerServiceConfigVersionTreeCacheHandler, 'eveai_config')
cache_manager.register_handler(CustomisationConfigCacheHandler, 'eveai_config')
cache_manager.register_handler(CustomisationConfigTypesCacheHandler, 'eveai_config')
cache_manager.register_handler(CustomisationConfigVersionTreeCacheHandler, 'eveai_config')
cache_manager.agents_config_cache.set_version_tree_cache(cache_manager.agents_version_tree_cache)
cache_manager.tasks_config_cache.set_version_tree_cache(cache_manager.tasks_version_tree_cache)
@@ -466,3 +520,7 @@ def register_config_cache_handlers(cache_manager) -> None:
cache_manager.specialists_config_cache.set_version_tree_cache(cache_manager.specialists_version_tree_cache)
cache_manager.retrievers_config_cache.set_version_tree_cache(cache_manager.retrievers_version_tree_cache)
cache_manager.prompts_config_cache.set_version_tree_cache(cache_manager.prompts_version_tree_cache)
cache_manager.catalogs_config_cache.set_version_tree_cache(cache_manager.catalogs_version_tree_cache)
cache_manager.processors_config_cache.set_version_tree_cache(cache_manager.processors_version_tree_cache)
cache_manager.partner_services_config_cache.set_version_tree_cache(cache_manager.partner_services_version_tree_cache)
cache_manager.customisations_config_cache.set_version_tree_cache(cache_manager.customisations_version_tree_cache)

102
common/utils/cache/license_cache.py vendored Normal file
View File

@@ -0,0 +1,102 @@
# common/utils/cache/license_cache.py
from typing import Dict, Any, Optional
from datetime import datetime as dt, timezone as tz
from flask import current_app
from sqlalchemy import and_
from sqlalchemy.inspection import inspect
from common.utils.cache.base import CacheHandler
from common.models.entitlements import License
class LicenseCacheHandler(CacheHandler[License]):
"""Handles caching of active licenses for tenants"""
handler_name = 'license_cache'
def __init__(self, region):
super().__init__(region, 'active_license')
self.configure_keys('tenant_id')
def _to_cache_data(self, instance: License) -> Dict[str, Any]:
"""Convert License instance to cache data using SQLAlchemy inspection"""
if not instance:
return {}
# Get all column attributes from the SQLAlchemy model
mapper = inspect(License)
data = {}
for column in mapper.columns:
value = getattr(instance, column.name)
# Handle date serialization
if isinstance(value, dt):
data[column.name] = value.isoformat()
else:
data[column.name] = value
return data
def _from_cache_data(self, data: Dict[str, Any], **kwargs) -> License:
"""Create License instance from cache data using SQLAlchemy inspection"""
if not data:
return None
# Create a new License instance
license = License()
mapper = inspect(License)
# Set all attributes dynamically
for column in mapper.columns:
if column.name in data:
value = data[column.name]
# Handle date deserialization
if column.name.endswith('_date') and value:
if isinstance(value, str):
value = dt.fromisoformat(value).date()
setattr(license, column.name, value)
return license
def _should_cache(self, value: License) -> bool:
"""Validate if the license should be cached"""
return value is not None and value.id is not None
def get_active_license(self, tenant_id: int) -> Optional[License]:
"""
Get the currently active license for a tenant
Args:
tenant_id: ID of the tenant
Returns:
License instance if found, None otherwise
"""
def creator_func(tenant_id: int) -> Optional[License]:
from common.extensions import db
current_date = dt.now(tz=tz.utc).date()
# TODO --> Active License via active Period?
return (db.session.query(License)
.filter_by(tenant_id=tenant_id)
.filter(License.start_date <= current_date)
.order_by(License.start_date.desc()).first())  # Query has no .last(); take the most recent matching license
return self.get(creator_func, tenant_id=tenant_id)
def invalidate_tenant_license(self, tenant_id: int):
"""Invalidate cached license for specific tenant"""
self.invalidate(tenant_id=tenant_id)
def register_license_cache_handlers(cache_manager) -> None:
"""Register license cache handlers with cache manager"""
cache_manager.register_handler(
LicenseCacheHandler,
'eveai_model' # Use existing eveai_model region
)
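An illustrative usage sketch (not part of the diff above); the cache_manager attribute name license_cache is assumed from handler_name, and a Flask app context with the eveai_model region configured is assumed:
from common.extensions import cache_manager

def get_license_end_date(tenant_id: int):
    # First call hits the database via creator_func; later calls are served from the cache.
    active = cache_manager.license_cache.get_active_license(tenant_id)
    return active.end_date if active else None

# After changing a tenant's license, drop the stale cache entry:
# cache_manager.license_cache.invalidate_tenant_license(tenant_id)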

View File

@@ -0,0 +1,42 @@
"""
Utility functions for chat customization.
"""
def get_default_chat_customisation(tenant_customisation=None):
"""
Get chat customization options with default values for missing options.
Args:
tenant_customisation (dict, optional): The tenant's customisation options.
Defaults to None.
Returns:
dict: A dictionary containing all customization options with default values
for any missing options.
"""
# Default customization options
default_customisation = {
'primary_color': '#007bff',
'secondary_color': '#6c757d',
'background_color': '#ffffff',
'text_color': '#212529',
'sidebar_color': '#f8f9fa',
'logo_url': None,
'sidebar_text': None,
'welcome_message': 'Hello! How can I help you today?',
'team_info': []
}
# If no tenant customization is provided, return the defaults
if tenant_customisation is None:
return default_customisation
# Start with the default customization
customisation = default_customisation.copy()
# Update with tenant customization
for key, value in tenant_customisation.items():
if key in customisation:
customisation[key] = value
return customisation
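A short illustrative sketch of the merge behaviour (not part of the diff): tenant overrides replace the defaults, and unknown keys are ignored.
tenant_overrides = {'primary_color': '#123456', 'welcome_message': 'Hi there!', 'unknown_key': 'ignored'}
merged = get_default_chat_customisation(tenant_overrides)
assert merged['primary_color'] == '#123456'    # overridden
assert merged['secondary_color'] == '#6c757d'  # default kept
assert 'unknown_key' not in merged             # only known option keys are copied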

View File

@@ -21,7 +21,7 @@ class TaggingField(BaseModel):
@field_validator('type', mode='before')
@classmethod
def validate_type(cls, v: str) -> str:
valid_types = ['string', 'integer', 'float', 'date', 'enum']
valid_types = ['string', 'integer', 'float', 'date', 'enum', 'color']
if v not in valid_types:
raise ValueError(f'type must be one of {valid_types}')
return v
@@ -243,7 +243,7 @@ class ArgumentDefinition(BaseModel):
@field_validator('type')
@classmethod
def validate_type(cls, v: str) -> str:
valid_types = ['string', 'integer', 'float', 'date', 'enum']
valid_types = ['string', 'integer', 'float', 'date', 'enum', 'color']
if v not in valid_types:
raise ValueError(f'type must be one of {valid_types}')
return v
@@ -256,7 +256,8 @@ class ArgumentDefinition(BaseModel):
'integer': NumericConstraint,
'float': NumericConstraint,
'date': DateConstraint,
'enum': EnumConstraint
'enum': EnumConstraint,
'color': StringConstraint
}
expected_type = expected_constraint_types.get(self.type)

View File

@@ -0,0 +1,215 @@
import os
import re
import logging
from packaging import version
from flask import current_app
logger = logging.getLogger(__name__)
class ContentManager:
def __init__(self, app=None):
self.app = app
if app:
self.init_app(app)
def init_app(self, app):
self.app = app
# Check whether the content directory exists
if not os.path.exists(app.config['CONTENT_DIR']):
logger.warning(f"Content directory not found at: {app.config['CONTENT_DIR']}")
else:
logger.info(f"Content directory configured at: {app.config['CONTENT_DIR']}")
def get_content_path(self, content_type, major_minor=None, patch=None):
"""
Return the full path to a content file
Args:
content_type (str): Content type (e.g. 'changelog', 'terms')
major_minor (str, optional): Major.Minor version (e.g. '1.0')
patch (str, optional): Patch number (e.g. '5')
Returns:
str: Full path to the content directory or file
"""
content_path = os.path.join(self.app.config['CONTENT_DIR'], content_type)
if major_minor:
content_path = os.path.join(content_path, major_minor)
if patch:
content_path = os.path.join(content_path, f"{major_minor}.{patch}.md")
return content_path
def _parse_version(self, filename):
"""Parse een versienummer uit een bestandsnaam"""
match = re.match(r'(\d+\.\d+)\.(\d+)\.md', filename)
if match:
return match.group(1), match.group(2)
return None, None
def get_latest_version(self, content_type, major_minor=None):
"""
Get the latest version of a given content type
Args:
content_type (str): Content type (e.g. 'changelog', 'terms')
major_minor (str, optional): Specific major.minor version; otherwise the highest is used
Returns:
tuple: (major_minor, patch, full_version) or None if not found
"""
try:
# Base path for this content type
content_path = os.path.join(self.app.config['CONTENT_DIR'], content_type)
if not os.path.exists(content_path):
logger.error(f"Content path does not exist: {content_path}")
return None
# If no major_minor was given, find the highest one
if not major_minor:
available_versions = os.listdir(content_path)
if not available_versions:
return None
# Sort by version number (major.minor)
available_versions.sort(key=lambda v: version.parse(v))
major_minor = available_versions[-1]
# Now that we have major_minor, find the highest patch
major_minor_path = os.path.join(content_path, major_minor)
if not os.path.exists(major_minor_path):
logger.error(f"Version path does not exist: {major_minor_path}")
return None
files = os.listdir(major_minor_path)
version_files = []
for file in files:
mm, p = self._parse_version(file)
if mm == major_minor and p:
version_files.append((mm, p, f"{mm}.{p}"))
if not version_files:
return None
# Sort by patch number
version_files.sort(key=lambda v: int(v[1]))
return version_files[-1]
except Exception as e:
logger.error(f"Error finding latest version for {content_type}: {str(e)}")
return None
def read_content(self, content_type, major_minor=None, patch=None):
"""
Read content with version support
If major_minor and patch are not given, the latest version is used.
If only major_minor is given, the latest patch of that version is used.
Args:
content_type (str): Content type (e.g. 'changelog', 'terms')
major_minor (str, optional): Major.Minor version (e.g. '1.0')
patch (str, optional): Patch number (e.g. '5')
Returns:
dict: {
'content': str,
'version': str,
'content_type': str
} or None on error
"""
try:
# If no version was given, find the latest one
if not major_minor:
version_info = self.get_latest_version(content_type)
if not version_info:
logger.error(f"No versions found for {content_type}")
return None
major_minor, patch, full_version = version_info
# If no patch was given, find the latest patch for this major_minor
elif not patch:
version_info = self.get_latest_version(content_type, major_minor)
if not version_info:
logger.error(f"No versions found for {content_type} {major_minor}")
return None
major_minor, patch, full_version = version_info
else:
full_version = f"{major_minor}.{patch}"
# Now that we have major_minor and patch, read the file
file_path = self.get_content_path(content_type, major_minor, patch)
if not os.path.exists(file_path):
logger.error(f"Content file does not exist: {file_path}")
return None
with open(file_path, 'r', encoding='utf-8') as file:
content = file.read()
return {
'content': content,
'version': full_version,
'content_type': content_type
}
except Exception as e:
logger.error(f"Error reading content {content_type} {major_minor}.{patch}: {str(e)}")
return None
def list_content_types(self):
"""Lijst alle beschikbare contenttypes op"""
try:
return [d for d in os.listdir(self.app.config['CONTENT_DIR'])
if os.path.isdir(os.path.join(self.app.config['CONTENT_DIR'], d))]
except Exception as e:
logger.error(f"Error listing content types: {str(e)}")
return []
def list_versions(self, content_type):
"""
List all available versions for a content type
Returns:
list: List of dicts with version information
[{'version': '1.0.0', 'path': '/path/to/file', 'date_modified': datetime}]
"""
versions = []
try:
content_path = os.path.join(self.app.config['CONTENT_DIR'], content_type)
if not os.path.exists(content_path):
return []
for major_minor in os.listdir(content_path):
major_minor_path = os.path.join(content_path, major_minor)
if not os.path.isdir(major_minor_path):
continue
for file in os.listdir(major_minor_path):
mm, p = self._parse_version(file)
if mm and p:
file_path = os.path.join(major_minor_path, file)
mod_time = os.path.getmtime(file_path)
versions.append({
'version': f"{mm}.{p}",
'path': file_path,
'date_modified': mod_time
})
# Sort by version number
versions.sort(key=lambda v: version.parse(v['version']))
return versions
except Exception as e:
logger.error(f"Error listing versions for {content_type}: {str(e)}")
return []
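An illustrative wiring and usage sketch (not part of the diff); the CONTENT_DIR value matches the config setting further below, and the directory layout (e.g. /app/content/changelog/2.3/2.3.3.md) is assumed:
from flask import Flask

app = Flask(__name__)
app.config['CONTENT_DIR'] = '/app/content'
content_manager = ContentManager(app)

# Latest patch of the highest major.minor for the changelog
latest = content_manager.read_content('changelog')
# Highest patch of a pinned major.minor
pinned = content_manager.read_content('changelog', major_minor='2.3')
if latest:
    print(latest['version'], latest['content_type'])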

View File

@@ -16,9 +16,30 @@ from .eveai_exceptions import (EveAIInvalidLanguageException, EveAIDoubleURLExce
EveAIInvalidCatalog, EveAIInvalidDocument, EveAIInvalidDocumentVersion, EveAIException)
from ..models.user import Tenant
from common.utils.model_logging_utils import set_logging_information, update_logging_information
from common.services.entitlements import LicenseUsageServices
MB_CONVERTOR = 1_048_576
def get_file_size(file):
try:
# If file is a bytes object or anything else that supports len()
file_size = len(file)
except TypeError:
# If file is a FileStorage object
current_position = file.tell()
file.seek(0, os.SEEK_END)
file_size = file.tell()
file.seek(current_position)
return file_size
def create_document_stack(api_input, file, filename, extension, tenant_id):
# Precheck if we can add a document to the stack
LicenseUsageServices.check_storage_and_embedding_quota(tenant_id, get_file_size(file)/MB_CONVERTOR)
# Create the Document
catalog_id = int(api_input.get('catalog_id'))
catalog = Catalog.query.get(catalog_id)
@@ -102,8 +123,6 @@ def create_version_for_document(document, tenant_id, url, sub_file_type, langua
set_logging_information(new_doc_vers, dt.now(tz.utc))
mark_tenant_storage_dirty(tenant_id)
return new_doc_vers
@@ -124,7 +143,7 @@ def upload_file_for_version(doc_vers, file, extension, tenant_id):
)
doc_vers.bucket_name = bn
doc_vers.object_name = on
doc_vers.file_size = size / 1048576 # Convert bytes to MB
doc_vers.file_size = size / MB_CONVERTOR # Convert bytes to MB
db.session.commit()
current_app.logger.info(f'Successfully saved document to MinIO for tenant {tenant_id} for '
@@ -274,6 +293,9 @@ def refresh_document_with_info(doc_id, tenant_id, api_input):
if not old_doc_vers.url:
return None, "This document has no URL. Only documents with a URL can be refreshed."
# Precheck if we have enough quota for the new version
LicenseUsageServices.check_storage_and_embedding_quota(tenant_id, old_doc_vers.file_size)
new_doc_vers = create_version_for_document(
doc, tenant_id,
old_doc_vers.url,
@@ -330,6 +352,9 @@ def refresh_document_with_content(doc_id: int, tenant_id: int, file_content: byt
old_doc_vers = DocumentVersion.query.filter_by(doc_id=doc_id).order_by(desc(DocumentVersion.id)).first()
# Precheck if we have enough quota for the new version
LicenseUsageServices.check_storage_and_embedding_quota(tenant_id, get_file_size(file_content) / MB_CONVERTOR)
# Create new version with same file type as original
extension = old_doc_vers.file_type
@@ -377,13 +402,6 @@ def refresh_document(doc_id, tenant_id):
return refresh_document_with_info(doc_id, tenant_id, api_input)
# Function triggered when a document_version is created or updated
def mark_tenant_storage_dirty(tenant_id):
tenant = db.session.query(Tenant).filter_by(id=int(tenant_id)).first()
tenant.storage_dirty = True
db.session.commit()
def cope_with_local_url(url):
parsed_url = urlparse(url)
# Check if this is an internal WordPress URL (TESTING) and rewrite it

View File

@@ -0,0 +1,46 @@
def create_default_config_from_type_config(type_config):
"""
Creates a dictionary with default values based on a type definition configuration.
Args:
type_config (dict): The configuration field of a type definition (e.g. from processor_types).
Returns:
dict: A dictionary with each field name as key and the default value as value.
Only fields that have a default value or are required are included.
Example:
>>> config = PROCESSOR_TYPES["HTML_PROCESSOR"]["configuration"]
>>> create_default_config_from_type_config(config)
{'html_tags': 'p, h1, h2, h3, h4, h5, h6, li, table, thead, tbody, tr, td',
'html_end_tags': 'p, li, table',
'html_excluded_classes': '',
'html_excluded_elements': 'header, footer, nav, script',
'html_included_elements': 'article, main',
'chunking_heading_level': 2}
"""
if not type_config:
return {}
default_config = {}
for field_name, field_def in type_config.items():
# If the field has a default value, add it
if "default" in field_def:
default_config[field_name] = field_def["default"]
# If the field is required but has no default value, add a type-appropriate empty value
elif field_def.get("required", False):
# Choose an appropriate "empty" value based on the type
field_type = field_def.get("type", "string")
if field_type == "string":
default_config[field_name] = ""
elif field_type == "integer":
default_config[field_name] = 0
elif field_type == "boolean":
default_config[field_name] = False
elif field_type == "color":
default_config[field_name] = "#000000"
else:
default_config[field_name] = ""
return default_config

135
common/utils/errors.py Normal file
View File

@@ -0,0 +1,135 @@
import traceback
import jinja2
from flask import render_template, request, jsonify, redirect, current_app, flash
from flask_login import current_user
from common.utils.eveai_exceptions import EveAINoSessionTenant
from common.utils.nginx_utils import prefixed_url_for
def not_found_error(error):
if not current_user.is_authenticated:
return redirect(prefixed_url_for('security.login'))
current_app.logger.error(f"Not Found Error: {error}")
current_app.logger.error(traceback.format_exc())
return render_template('error/404.html'), 404
def internal_server_error(error):
if not current_user.is_authenticated:
return redirect(prefixed_url_for('security.login'))
current_app.logger.error(f"Internal Server Error: {error}")
current_app.logger.error(traceback.format_exc())
return render_template('error/500.html'), 500
def not_authorised_error(error):
if not current_user.is_authenticated:
return redirect(prefixed_url_for('security.login'))
current_app.logger.error(f"Not Authorised Error: {error}")
current_app.logger.error(traceback.format_exc())
return render_template('error/401.html')
def access_forbidden(error):
if not current_user.is_authenticated:
return redirect(prefixed_url_for('security.login'))
current_app.logger.error(f"Access Forbidden: {error}")
current_app.logger.error(traceback.format_exc())
return render_template('error/403.html')
def key_error_handler(error):
# Check if the KeyError is specifically for 'tenant'
if str(error) == "'tenant'":
return redirect(prefixed_url_for('security.login'))
# For other KeyErrors, you might want to log the error and return a generic error page
current_app.logger.error(f"Key Error: {error}")
current_app.logger.error(traceback.format_exc())
return render_template('error/generic.html', error_message="An unexpected error occurred"), 500
def attribute_error_handler(error):
"""Handle AttributeError exceptions.
Specifically catches SQLAlchemy relationship errors when string IDs
are used instead of model instances.
"""
error_msg = str(error)
current_app.logger.error(f"AttributeError: {error_msg}")
current_app.logger.error(traceback.format_exc())
# Handle the SQLAlchemy relationship error specifically
if "'str' object has no attribute '_sa_instance_state'" in error_msg:
flash('Database relationship error. Please check your form inputs and try again.', 'error')
return render_template('error/500.html',
error_type="Relationship Error",
error_details="A string value was provided where a database object was expected."), 500
# Handle other AttributeErrors
flash('An application error occurred. The technical team has been notified.', 'error')
return render_template('error/500.html',
error_type="Attribute Error",
error_details=error_msg), 500
def no_tenant_selected_error(error):
"""Handle errors when no tenant is selected in the current session.
This typically happens when a session expires or becomes invalid after
a long period of inactivity. The user will be redirected to the login page.
"""
current_app.logger.error(f"No Session Tenant Error: {error}")
current_app.logger.error(traceback.format_exc())
flash('Your session expired. You will have to re-enter your credentials', 'warning')
# Perform logout if user is authenticated
if current_user.is_authenticated:
from flask_security.utils import logout_user
logout_user()
# Redirect to login page
return redirect(prefixed_url_for('security.login'))
def general_exception(e):
current_app.logger.error(f"Unhandled Exception: {e}", exc_info=True)
flash('An application error occurred. The technical team has been notified.', 'error')
return render_template('error/500.html',
error_type=type(e).__name__,
error_details=str(e)), 500
def template_not_found_error(error):
"""Handle Jinja2 TemplateNotFound exceptions."""
current_app.logger.error(f'Template not found: {error.name}')
current_app.logger.error(f'Search Paths: {current_app.jinja_loader.list_templates()}')
current_app.logger.error(traceback.format_exc())
return render_template('error/500.html',
error_type="Template Not Found",
error_details=f"Template '{error.name}' could not be found."), 404
def template_syntax_error(error):
"""Handle Jinja2 TemplateSyntaxError exceptions."""
current_app.logger.error(f'Template syntax error: {error.message}')
current_app.logger.error(f'In template {error.filename}, line {error.lineno}')
current_app.logger.error(traceback.format_exc())
return render_template('error/500.html',
error_type="Template Syntax Error",
error_details=f"Error in template '{error.filename}' at line {error.lineno}: {error.message}"), 500
def register_error_handlers(app):
app.register_error_handler(404, not_found_error)
app.register_error_handler(500, internal_server_error)
app.register_error_handler(401, not_authorised_error)
app.register_error_handler(403, not_authorised_error)
app.register_error_handler(EveAINoSessionTenant, no_tenant_selected_error)
app.register_error_handler(KeyError, key_error_handler)
app.register_error_handler(AttributeError, attribute_error_handler)
app.register_error_handler(jinja2.TemplateNotFound, template_not_found_error)
app.register_error_handler(jinja2.TemplateSyntaxError, template_syntax_error)
app.register_error_handler(Exception, general_exception)

View File

@@ -136,3 +136,115 @@ class EveAIInvalidEmbeddingModel(EveAIException):
# Construct the message dynamically
message = f"Tenant with ID '{tenant_id}' has no or an invalid embedding model in Catalog {catalog_id}."
super().__init__(message, status_code, payload)
class EveAIDoublePartner(EveAIException):
"""Raised when there is already a partner defined for a given tenant (while registering a partner)"""
def __init__(self, tenant_id, status_code=400, payload=None):
self.tenant_id = tenant_id
# Construct the message dynamically
message = f"Tenant with ID '{tenant_id}' is already defined as a Partner."
super().__init__(message, status_code, payload)
class EveAIRoleAssignmentException(EveAIException):
"""Exception raised when a role cannot be assigned due to business rules"""
def __init__(self, message, status_code=403, payload=None):
super().__init__(message, status_code, payload)
class EveAINoManagementPartnerService(EveAIException):
"""Exception raised when the operation requires the logged in partner (or selected parter by Super User)
does not have a MANAGEMENT_SERVICE"""
def __init__(self, message="No Management Service defined for partner", status_code=403, payload=None):
super().__init__(message, status_code, payload)
class EveAINoSessionTenant(EveAIException):
"""Exception raised when no session tenant is set"""
def __init__(self, message="No Session Tenant selected. Cannot perform requested action.", status_code=403,
payload=None):
super().__init__(message, status_code, payload)
class EveAINoSessionPartner(EveAIException):
"""Exception raised when no session partner is set"""
def __init__(self, message="No Session Partner selected. Cannot perform requested action.", status_code=403,
payload=None):
super().__init__(message, status_code, payload)
class EveAINoManagementPartnerForTenant(EveAIException):
"""Exception raised when the selected partner is no management partner for tenant"""
def __init__(self, message="No Management Partner for Tenant", status_code=403, payload=None):
super().__init__(message, status_code, payload)
class EveAIQuotaExceeded(EveAIException):
"""Base exception for quota-related errors"""
def __init__(self, message, quota_type, current_usage, limit, additional=0, status_code=400, payload=None):
super().__init__(message, status_code, payload)
self.quota_type = quota_type
self.current_usage = current_usage
self.limit = limit
self.additional = additional
class EveAIStorageQuotaExceeded(EveAIQuotaExceeded):
"""Raised when storage quota is exceeded"""
def __init__(self, current_usage, limit, additional, status_code=400, payload=None):
message = (f"Storage quota exceeded. Current: {current_usage:.1f}MB, "
f"Additional: {additional:.1f}MB, Limit: {limit}MB")
super().__init__(message, "storage", current_usage, limit, additional, status_code, payload)
class EveAIEmbeddingQuotaExceeded(EveAIQuotaExceeded):
"""Raised when embedding quota is exceeded"""
def __init__(self, current_usage, limit, additional, status_code=400, payload=None):
message = (f"Embedding quota exceeded. Current: {current_usage:.1f}MB, "
f"Additional: {additional:.1f}MB, Limit: {limit}MB")
super().__init__(message, "embedding", current_usage, limit, additional, status_code, payload)
class EveAIInteractionQuotaExceeded(EveAIQuotaExceeded):
"""Raised when the interaction token quota is exceeded"""
def __init__(self, current_usage, limit, status_code=400, payload=None):
message = (f"Interaction token quota exceeded. Current: {current_usage:.2f}M tokens, "
f"Limit: {limit:.2f}M tokens")
super().__init__(message, "interaction", current_usage, limit, 0, status_code, payload)
class EveAIQuotaWarning(EveAIException):
"""Warning when approaching quota limits (not blocking)"""
def __init__(self, message, quota_type, usage_percentage, status_code=200, payload=None):
super().__init__(message, status_code, payload)
self.quota_type = quota_type
self.usage_percentage = usage_percentage
class EveAILicensePeriodsExceeded(EveAIException):
"""Raised when no more license periods can be created for a given license"""
def __init__(self, license_id, status_code=400, payload=None):
message = f"No more license periods can be created for license with ID {license_id}. "
super().__init__(message, status_code, payload)
class EveAIPendingLicensePeriod(EveAIException):
"""Raised when a license period is pending"""
def __init__(self, status_code=400, payload=None):
message = f"Basic Fee Payment has not been received yet. Please ensure payment has been made, and please wait for payment to be processed."
super().__init__(message, status_code, payload)

View File

@@ -0,0 +1,11 @@
import json
from wtforms.validators import ValidationError
def validate_json(form, field):
if field.data:
try:
json.loads(field.data)
except json.JSONDecodeError:
raise ValidationError('Invalid JSON format')
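A minimal sketch of how the validator could be attached to a WTForms field (the form and field names are hypothetical):
from flask_wtf import FlaskForm
from wtforms import TextAreaField

class SpecialistArgumentsForm(FlaskForm):
    # validate_json is invoked as validator(form, field) on submit
    arguments = TextAreaField('Arguments (JSON)', validators=[validate_json])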

57
common/utils/log_utils.py Normal file
View File

@@ -0,0 +1,57 @@
import pandas as pd
from sqlalchemy import inspect
from typing import Any, List, Union, Optional
def format_query_results(query_results: Any) -> str:
"""
Format query results as a readable string using pandas
Args:
query_results: SQLAlchemy query, query results, or model instance(s)
Returns:
Formatted string representation of the query results
"""
try:
# If it's a query object, execute it
if hasattr(query_results, 'all'):
results = query_results.all()
elif not isinstance(query_results, list):
results = [query_results]
else:
results = query_results
# Handle different types of results
if results and hasattr(results[0], '__table__'):
# SQLAlchemy ORM objects
data = []
for item in results:
row = {}
for column in inspect(item).mapper.column_attrs:
row[column.key] = getattr(item, column.key)
data.append(row)
df = pd.DataFrame(data)
elif results and isinstance(results[0], tuple):
# Join query results (tuples)
if hasattr(results[0], '_fields'): # Named tuples
df = pd.DataFrame(results)
else:
# Regular tuples - try to get column names from query
if hasattr(query_results, 'statement'):
columns = query_results.statement.columns.keys()
df = pd.DataFrame(results, columns=columns)
else:
df = pd.DataFrame(results)
else:
# Fallback for other types
df = pd.DataFrame(results)
# Format the output with pandas
with pd.option_context('display.max_rows', 20, 'display.max_columns', None,
'display.width', 1000):
formatted_output = f"Query returned {len(df)} results:\n{df}"
return formatted_output
except Exception as e:
return f"Error formatting query results: {str(e)}"

View File

@@ -0,0 +1,46 @@
from scaleway import Client
from scaleway.tem.v1alpha1.api import TemV1Alpha1API
from scaleway.tem.v1alpha1.types import CreateEmailRequestAddress
from html2text import HTML2Text
from flask import current_app
def send_email(to_email, to_name, subject, html):
current_app.logger.debug(f"Sending email to {to_email} with subject {subject}")
access_key = current_app.config['SW_EMAIL_ACCESS_KEY']
secret_key = current_app.config['SW_EMAIL_SECRET_KEY']
default_project_id = current_app.config['SW_PROJECT']
default_region = "fr-par"
current_app.logger.debug(f"Access Key: {access_key}\nSecret Key: {secret_key}\n"
f"Default Project ID: {default_project_id}\nDefault Region: {default_region}")
client = Client(
access_key=access_key,
secret_key=secret_key,
default_project_id=default_project_id,
default_region=default_region
)
current_app.logger.debug(f"Scaleway Client Initialized")
tem = TemV1Alpha1API(client)
current_app.logger.debug(f"Tem Initialized")
from_ = CreateEmailRequestAddress(email=current_app.config['SW_EMAIL_SENDER'],
name=current_app.config['SW_EMAIL_NAME'])
to_ = CreateEmailRequestAddress(email=to_email, name=to_name)
email = tem.create_email(
from_=from_,
to=[to_],
subject=subject,
text=html_to_text(html),
html=html,
project_id=default_project_id,
)
current_app.logger.debug(f"Email sent to {to_email}")
def html_to_text(html_content):
"""Convert HTML to plain text using html2text"""
h = HTML2Text()
h.ignore_images = True
h.ignore_emphasis = False
h.body_width = 0 # No wrapping
return h.handle(html_content)
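A hedged usage sketch (recipient and subject are hypothetical); the SW_* settings added to config.py further below must be present, and the call must run inside an app context:
html_body = "<h1>Welcome</h1><p>Your EveAI account is ready.</p>"
send_email(
    to_email="jane.doe@example.com",
    to_name="Jane Doe",
    subject="Welcome to EveAI",
    html=html_body,
)
# A plain-text alternative is derived automatically via html_to_text().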

View File

@@ -4,10 +4,11 @@ for handling tenant requests
"""
from flask_security import current_user
from flask import session, current_app, redirect
from common.utils.nginx_utils import prefixed_url_for
from flask import session
from .database import Database
from .eveai_exceptions import EveAINoSessionTenant, EveAINoSessionPartner, EveAINoManagementPartnerService, \
EveAINoManagementPartnerForTenant
from common.services.user import UserServices
def mw_before_request():
@@ -17,17 +18,27 @@ def mw_before_request():
"""
if 'tenant' not in session:
current_app.logger.warning('No tenant defined in session')
return redirect(prefixed_url_for('security_bp.login'))
raise EveAINoSessionTenant()
tenant_id = session['tenant']['id']
if not tenant_id:
raise Exception('Cannot switch schema for tenant: no tenant defined in session')
raise EveAINoSessionTenant()
# user = User.query.get(current_user.id)
if current_user.has_role('Super User') or current_user.tenant_id == tenant_id:
Database(tenant_id).switch_schema()
else:
raise Exception(f'Cannot switch schema for tenant {tenant_id}: user {current_user.email} does not have access')
switch_allowed = False
if current_user.has_role('Super User'):
switch_allowed = True
if current_user.has_role('Tenant Admin') and current_user.tenant_id == tenant_id:
switch_allowed = True
if current_user.has_role('Partner Admin'):
if 'partner' not in session:
raise EveAINoSessionPartner()
management_service = next((service for service in session['partner']['services']
if service.get('type') == 'MANAGEMENT_SERVICE'), None)
if not management_service:
raise EveAINoManagementPartnerService()
if not UserServices.can_user_edit_tenant(tenant_id):
raise EveAINoManagementPartnerForTenant()
Database(tenant_id).switch_schema()
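A sketch of how this middleware is typically wired into the app factory (the exact registration point and import path are assumed):
from flask import Flask
# from common.utils.middleware import mw_before_request  # import path assumed

def create_app():
    app = Flask(__name__)
    # ... extensions, blueprints, session setup ...
    # Run the tenant/partner checks and switch the DB schema before every request
    app.before_request(mw_before_request)
    return app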

View File

@@ -10,7 +10,6 @@ def set_logging_information(obj, timestamp):
obj.created_by = user_id
obj.updated_by = user_id
def update_logging_information(obj, timestamp):
obj.updated_at = timestamp

View File

@@ -14,7 +14,7 @@ from common.eveai_model.tracked_mistral_embeddings import TrackedMistralAIEmbedd
from common.langchain.tracked_transcription import TrackedOpenAITranscription
from common.models.user import Tenant
from config.model_config import MODEL_CONFIG
from common.extensions import template_manager
from common.extensions import cache_manager
from common.models.document import EmbeddingMistral
from common.utils.eveai_exceptions import EveAITenantNotFound, EveAIInvalidEmbeddingModel
from crewai import LLM
@@ -135,153 +135,30 @@ def get_crewai_llm(full_model_name='mistral.mistral-large-latest', temperature=0
return llm
class ModelVariables:
"""Manages model-related variables and configurations"""
def __init__(self, tenant_id: int, variables: Dict[str, Any] = None):
"""
Initialize ModelVariables with tenant and optional template manager
Args:
tenant_id: Tenant instance
variables: Optional variables
"""
current_app.logger.info(f'Model variables initialized with tenant {tenant_id} and variables \n{variables}')
self.tenant_id = tenant_id
self._variables = variables if variables is not None else self._initialize_variables()
current_app.logger.info(f'Model _variables initialized to {self._variables}')
self._llm_instances = {}
self.llm_metrics_handler = LLMMetricsHandler()
self._transcription_model = None
def _initialize_variables(self) -> Dict[str, Any]:
"""Initialize the variables dictionary"""
variables = {}
tenant = Tenant.query.get(self.tenant_id)
if not tenant:
raise EveAITenantNotFound(self.tenant_id)
# Set model providers
variables['llm_provider'], variables['llm_model'] = tenant.llm_model.split('.')
variables['llm_full_model'] = tenant.llm_model
# Set model-specific configurations
model_config = MODEL_CONFIG.get(variables['llm_provider'], {}).get(variables['llm_model'], {})
variables.update(model_config)
# Additional configurations
variables['annotation_chunk_length'] = current_app.config['ANNOTATION_TEXT_CHUNK_LENGTH'][tenant.llm_model]
variables['max_compression_duration'] = current_app.config['MAX_COMPRESSION_DURATION']
variables['max_transcription_duration'] = current_app.config['MAX_TRANSCRIPTION_DURATION']
variables['compression_cpu_limit'] = current_app.config['COMPRESSION_CPU_LIMIT']
variables['compression_process_delay'] = current_app.config['COMPRESSION_PROCESS_DELAY']
return variables
@property
def annotation_chunk_length(self):
return self._variables['annotation_chunk_length']
@property
def max_compression_duration(self):
return self._variables['max_compression_duration']
@property
def max_transcription_duration(self):
return self._variables['max_transcription_duration']
@property
def compression_cpu_limit(self):
return self._variables['compression_cpu_limit']
@property
def compression_process_delay(self):
return self._variables['compression_process_delay']
def get_llm(self, temperature: float = 0.3, **kwargs) -> Any:
"""
Get an LLM instance with specific configuration
Args:
temperature: The temperature for the LLM
**kwargs: Additional configuration parameters
Returns:
An instance of the configured LLM
"""
cache_key = f"{temperature}_{hash(frozenset(kwargs.items()))}"
if cache_key not in self._llm_instances:
provider = self._variables['llm_provider']
model = self._variables['llm_model']
if provider == 'openai':
self._llm_instances[cache_key] = ChatOpenAI(
api_key=os.getenv('OPENAI_API_KEY'),
model=model,
temperature=temperature,
callbacks=[self.llm_metrics_handler],
**kwargs
)
elif provider == 'anthropic':
self._llm_instances[cache_key] = ChatAnthropic(
api_key=os.getenv('ANTHROPIC_API_KEY'),
model=current_app.config['ANTHROPIC_LLM_VERSIONS'][model],
temperature=temperature,
callbacks=[self.llm_metrics_handler],
**kwargs
)
else:
raise ValueError(f"Unsupported LLM provider: {provider}")
return self._llm_instances[cache_key]
@property
def transcription_model(self) -> TrackedOpenAITranscription:
"""Get the transcription model instance"""
if self._transcription_model is None:
api_key = os.getenv('OPENAI_API_KEY')
self._transcription_model = TrackedOpenAITranscription(
api_key=api_key,
model='whisper-1'
)
return self._transcription_model
# Remove the old transcription-related methods since they're now handled by TrackedOpenAITranscription
@property
def transcription_client(self):
raise DeprecationWarning("Use transcription_model instead")
def transcribe(self, *args, **kwargs):
raise DeprecationWarning("Use transcription_model.transcribe() instead")
def get_template(self, template_name: str, version: Optional[str] = None) -> str:
"""
Get a template for the tenant's configured LLM
Args:
template_name: Name of the template to retrieve
version: Optional specific version to retrieve
Returns:
The template content
"""
try:
template = template_manager.get_template(
self._variables['llm_full_model'],
template_name,
version
)
return template.content
except Exception as e:
current_app.logger.error(f"Error getting template {template_name}: {str(e)}")
# Fall back to old template loading if template_manager fails
if template_name in self._variables.get('templates', {}):
return self._variables['templates'][template_name]
raise
def process_pdf():
full_model_name = 'mistral-ocr-latest'
# Helper function to get cached model variables
def get_model_variables(tenant_id: int) -> ModelVariables:
return ModelVariables(tenant_id=tenant_id)
def get_template(template_name: str, version: Optional[str] = "1.0", temperature: float = 0.3) -> tuple[
Any, BaseChatModel | None | ChatOpenAI | ChatMistralAI]:
"""
Get a prompt template
"""
prompt = cache_manager.prompts_config_cache.get_config(template_name, version)
if "llm_model" in prompt:
llm = get_embedding_llm(full_model_name=prompt["llm_model"], temperature=temperature)
else:
llm = get_embedding_llm(temperature=temperature)
return prompt["content"], llm
def get_transcription_model(model_name: str = "whisper-1") -> TrackedOpenAITranscription:
"""
Get a transcription model instance
"""
api_key = os.getenv('OPENAI_API_KEY')
return TrackedOpenAITranscription(
api_key=api_key,
model=model_name
)

View File

@@ -1,59 +0,0 @@
import time
import threading
from contextlib import contextmanager
from functools import wraps
from prometheus_client import Counter, Histogram, Summary, start_http_server, Gauge
from flask import current_app, g, request, Flask
class EveAIMetrics:
"""
Central class for Prometheus metrics infrastructure.
This class initializes the Prometheus HTTP server and provides
shared functionality for metrics across components.
Component-specific metrics should be defined in their respective modules.
"""
def __init__(self, app: Flask = None):
self.app = app
self._metrics_server_started = False
if app is not None:
self.init_app(app)
def init_app(self, app: Flask):
"""Initialize metrics with Flask app and start Prometheus server"""
self.app = app
self._start_metrics_server()
def _start_metrics_server(self):
"""Start the Prometheus metrics HTTP server if not already running"""
if not self._metrics_server_started:
try:
metrics_port = self.app.config.get('PROMETHEUS_PORT', 8000)
start_http_server(metrics_port)
self.app.logger.info(f"Prometheus metrics server started on port {metrics_port}")
self._metrics_server_started = True
except Exception as e:
self.app.logger.error(f"Failed to start metrics server: {e}")
@staticmethod
def get_standard_buckets():
"""
Return the standard duration buckets for histogram metrics.
Components should use these for consistency across the system.
"""
return [0.1, 0.5, 1, 2.5, 5, 10, 15, 30, 60, 120, 240, 360, float('inf')]
@staticmethod
def sanitize_label_values(labels_dict):
"""
Convert all label values to strings as required by Prometheus.
Args:
labels_dict: Dictionary of label name to label value
Returns:
Dictionary with all values converted to strings
"""
return {k: str(v) if v is not None else "" for k, v in labels_dict.items()}

View File

@@ -0,0 +1,11 @@
from flask import current_app
from prometheus_client import push_to_gateway
def sanitize_label(value):
"""Convert value to valid Prometheus label by removing/replacing invalid chars"""
if value is None:
return ""
# Replace spaces and special chars with underscores
import re
return re.sub(r'[^a-zA-Z0-9_]', '_', str(value))
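Illustrative behaviour of the label sanitiser:
assert sanitize_label("Evie Chat-Client v2.3") == "Evie_Chat_Client_v2_3"
assert sanitize_label(None) == ""
assert sanitize_label(42) == "42"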

View File

@@ -1,7 +1,7 @@
from flask import session, current_app
from sqlalchemy import and_
from common.models.user import Tenant
from common.models.user import Tenant, Partner
from common.models.entitlements import License
from common.utils.database import Database
from common.utils.eveai_exceptions import EveAITenantNotFound, EveAITenantInvalid, EveAINoActiveLicense
@@ -13,13 +13,19 @@ def set_tenant_session_data(sender, user, **kwargs):
tenant = Tenant.query.filter_by(id=user.tenant_id).first()
session['tenant'] = tenant.to_dict()
session['default_language'] = tenant.default_language
session['default_llm_model'] = tenant.llm_model
partner = Partner.query.filter_by(tenant_id=user.tenant_id).first()
if partner:
session['partner'] = partner.to_dict()
else:
# Remove partner from session if it exists
session.pop('partner', None)
def clear_tenant_session_data(sender, user, **kwargs):
session.pop('tenant', None)
session.pop('default_language', None)
session.pop('default_llm_model', None)
session.pop('partner', None)
def is_valid_tenant(tenant_id):
@@ -33,11 +39,12 @@ def is_valid_tenant(tenant_id):
raise EveAITenantInvalid(tenant_id)
else:
current_date = dt.now(tz=tz.utc).date()
active_license = (License.query.filter_by(tenant_id=tenant_id)
.filter(and_(License.start_date <= current_date,
License.end_date >= current_date))
.one_or_none())
if not active_license:
raise EveAINoActiveLicense(tenant_id)
# TODO -> Replace this check with an Active License Period check!
# active_license = (License.query.filter_by(tenant_id=tenant_id)
# .filter(and_(License.start_date <= current_date,
# License.end_date >= current_date))
# .one_or_none())
# if not active_license:
# raise EveAINoActiveLicense(tenant_id)
return True
return True

View File

@@ -1,9 +1,10 @@
from flask import current_app, render_template
from flask_mailman import EmailMessage
from flask_security import current_user
from itsdangerous import URLSafeTimedSerializer
import socket
from common.models.user import Role
from common.utils.nginx_utils import prefixed_url_for
from common.utils.mail_utils import send_email
def confirm_token(token, expiration=3600):
@@ -16,14 +17,6 @@ def confirm_token(token, expiration=3600):
return email
def send_email(to, subject, template):
msg = EmailMessage(subject=subject,
body=template,
to=[to])
msg.content_subtype = "html"
msg.send()
def generate_reset_token(email):
serializer = URLSafeTimedSerializer(current_app.config['SECRET_KEY'])
return serializer.dumps(email, salt=current_app.config['SECURITY_PASSWORD_SALT'])
@@ -35,9 +28,6 @@ def generate_confirmation_token(email):
def send_confirmation_email(user):
if not test_smtp_connection():
raise Exception("Failed to connect to SMTP server")
token = generate_confirmation_token(user.email)
confirm_url = prefixed_url_for('security_bp.confirm_email', token=token, _external=True)
@@ -45,7 +35,7 @@ def send_confirmation_email(user):
subject = "Please confirm your email"
try:
send_email(user.email, "Confirm your email", html)
send_email(user.email, f"{user.first_name} {user.last_name}", "Confirm your email", html)
current_app.logger.info(f'Confirmation email sent to {user.email}')
except Exception as e:
current_app.logger.error(f'Failed to send confirmation email to {user.email}. Error: {str(e)}')
@@ -60,36 +50,49 @@ def send_reset_email(user):
subject = "Reset Your Password"
try:
send_email(user.email, "Reset Your Password", html)
send_email(user.email, f"{user.first_name} {user.last_name}", subject, html)
current_app.logger.info(f'Reset email sent to {user.email}')
except Exception as e:
current_app.logger.error(f'Failed to send reset email to {user.email}. Error: {str(e)}')
raise
def test_smtp_connection():
try:
current_app.logger.info(f"Attempting to resolve google.com...")
google_ip = socket.gethostbyname('google.com')
current_app.logger.info(f"Successfully resolved google.com to {google_ip}")
except Exception as e:
current_app.logger.error(f"Failed to resolve google.com: {str(e)}")
def get_current_user_roles():
"""Get the roles of the currently authenticated user.
try:
smtp_server = current_app.config['MAIL_SERVER']
current_app.logger.info(f"Attempting to resolve {smtp_server}...")
smtp_ip = socket.gethostbyname(smtp_server)
current_app.logger.info(f"Successfully resolved {smtp_server} to {smtp_ip}")
except Exception as e:
current_app.logger.error(f"Failed to resolve {smtp_server}: {str(e)}")
Returns:
List of Role objects or empty list if no user is authenticated
"""
if current_user.is_authenticated:
return current_user.roles
return []
try:
smtp_server = current_app.config['MAIL_SERVER']
smtp_port = current_app.config['MAIL_PORT']
sock = socket.create_connection((smtp_server, smtp_port), timeout=10)
sock.close()
current_app.logger.info(f"Successfully connected to SMTP server {smtp_server}:{smtp_port}")
return True
except Exception as e:
current_app.logger.error(f"Failed to connect to SMTP server: {str(e)}")
def current_user_has_role(role_name):
"""Check if the current user has the specified role.
Args:
role_name (str): Name of the role to check
Returns:
bool: True if user has the role, False otherwise
"""
if not current_user.is_authenticated:
return False
return any(role.name == role_name for role in current_user.roles)
def current_user_roles():
"""Get the roles of the currently authenticated user.
Returns:
List of Role objects or empty list if no user is authenticated
"""
if current_user.is_authenticated:
return current_user.roles
return []
def all_user_roles():
roles = [(role.id, role.name) for role in Role.query.all()]

View File

@@ -44,4 +44,5 @@ def perform_startup_invalidation(app):
except Exception as e:
app.logger.error(f"Error during startup invalidation: {e}")
# In case of error, we don't want to block the application startup
pass
pass

View File

@@ -1,7 +1,11 @@
# common/utils/filters.py
import pytz
import markdown
from markupsafe import Markup
from datetime import datetime
from common.utils.nginx_utils import prefixed_url_for as puf
from flask import current_app, url_for
def to_local_time(utc_dt, timezone_str):
@@ -29,9 +33,91 @@ def time_difference(start_dt, end_dt):
return "Ongoing"
def status_color(status_name):
"""Return Bootstrap color class for status"""
colors = {
'UPCOMING': 'secondary',
'PENDING': 'warning',
'ACTIVE': 'success',
'COMPLETED': 'info',
'INVOICED': 'primary',
'CLOSED': 'dark'
}
return colors.get(status_name, 'secondary')
def render_markdown(text):
"""
Renders markdown to HTML using Python's markdown library.
Includes common extensions for better rendering.
"""
if not text:
return ""
# Strip the triple backticks and markdown label
text = clean_markdown(text)
# Render the markdown with extensions
return Markup(markdown.markdown(text, extensions=[
'markdown.extensions.fenced_code',
'markdown.extensions.codehilite',
'markdown.extensions.tables',
'markdown.extensions.toc'
]))
def clean_markdown(text):
"""
Strips triple backticks and the markdown label from the text
"""
if not text:
return ""
text = text.strip()
if text.startswith("```markdown"):
text = text[len("```markdown"):].strip()
elif text.startswith("```"):
text = text[3:].strip()
if text.endswith("```"):
text = text[:-3].strip()
return text
def prefixed_url_for(endpoint, **kwargs):
return puf(endpoint, **kwargs)
def get_pagination_html(pagination, endpoint, **kwargs):
"""
Generates HTML for pagination with the ability to include additional parameters
"""
html = ['<nav aria-label="Page navigation"><ul class="pagination justify-content-center">']
for page in pagination.iter_pages():
if page:
is_active = 'active' if page == pagination.page else ''
url = url_for(endpoint, page=page, **kwargs)
current_app.logger.debug(f"URL for page {page}: {url}")
html.append(f'<li class="page-item {is_active}"><a class="page-link" href="{url}">{page}</a></li>')
else:
html.append('<li class="page-item disabled"><span class="page-link">...</span></li>')
html.append('</ul></nav>')
return Markup(''.join(html))
def register_filters(app):
"""
Registers custom filters with the Flask app.
"""
app.jinja_env.filters['to_local_time'] = to_local_time
app.jinja_env.filters['time_difference'] = time_difference
app.jinja_env.filters['status_color'] = status_color
app.jinja_env.filters['prefixed_url_for'] = prefixed_url_for
app.jinja_env.filters['markdown'] = render_markdown
app.jinja_env.filters['clean_markdown'] = clean_markdown
app.jinja_env.globals['prefixed_url_for'] = prefixed_url_for
app.jinja_env.globals['get_pagination_html'] = get_pagination_html
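An illustrative Jinja2 snippet using the registered filters and globals (variable names and the endpoint are hypothetical):
snippet = """
<span class="badge bg-{{ period.status.name | status_color }}">{{ period.status.name }}</span>
<div class="release-notes">{{ release_notes.content | markdown }}</div>
{{ get_pagination_html(pagination, 'tenant_bp.tenants', sort='name') }}
"""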

BIN
config/.DS_Store vendored

Binary file not shown.

View File

@@ -13,7 +13,7 @@ backstory: >
language the context provided to you is in. You are participating in a conversation, not writing e.g. an email. Do not
include a salutation or closing greeting in your answer.
{custom_backstory}
full_model_name: "mistral.mistral-large-latest"
full_model_name: "mistral.mistral-small-latest"
temperature: 0.3
metadata:
author: "Josako"

View File

@@ -0,0 +1,27 @@
version: "1.0.0"
name: "Traicie HR BP "
role: >
You are an HR BP (Human Resources Business Partner)
{custom_role}
goal: >
As an HR Business Partner, your primary goal is to align people strategies with business objectives. You aim to
ensure that the organisation has the right talent, capabilities, and culture in place to drive performance,
manage change effectively, and support sustainable growth. This involves acting as a trusted advisor to leadership
while advocating for employees and fostering a healthy, high-performing workplace.
{custom_goal}
backstory: >
You didn't start your career as a strategist. You began in traditional HR roles, mastering recruitment, employee
relations, and policy implementation. You developed a deeper understanding of how people decisions impact business
outcomes.
Through experience, exposure to leadership, and a strong interest in organisational dynamics, you transitioned into a
role that bridges the gap between HR and the business. You've earned a seat at the table not just by knowing HR
processes, but by understanding the business inside-out, speaking the language of executives, and backing your advice
with data and insight.
{custom_backstory}
full_model_name: "mistral.mistral-medium-latest"
temperature: 0.3
metadata:
author: "Josako"
date_added: "2025-05-21"
description: "HR BP Agent."
changes: "Initial version"

View File

@@ -0,0 +1,25 @@
version: "1.0.0"
name: "Traicie HR BP "
role: >
You are an Expert Recruiter working for {tenant_name}
{custom_role}
goal: >
As an expert recruiter, you identify, attract, and secure top talent by building genuine relationships, deeply
understanding business needs, and ensuring optimal alignment between candidate potential and organizational goals
, while championing diversity, culture fit, and long-term retention.
{custom_goal}
backstory: >
You started your career in a high-pressure agency setting, where you quickly learned the art of fast-paced hiring and
relationship building. Over the years, you moved in-house, partnering closely with business leaders to shape
recruitment strategies that go beyond filling roles—you focus on finding the right people to drive growth and culture.
With a strong grasp of both tech and non-tech profiles, you've adapted to changing trends, from remote work to
AI-driven sourcing. You're more than a recruiter—you're a trusted advisor, a brand ambassador, and a connector of
people and purpose.
{custom_backstory}
full_model_name: "mistral.mistral-medium-latest"
temperature: 0.3
metadata:
author: "Josako"
date_added: "2025-05-21"
description: "HR BP Agent."
changes: "Initial version"

View File

@@ -0,0 +1,19 @@
version: "1.0.0"
name: "Specialist Configuration"
configuration:
specialist_type:
name: "Specialist Type"
type: "str"
description: "The Specialist Type this configuration is made for"
required: True
specialist_version:
name: "Specialist Version"
type: "str"
description: "The Specialist Type version this configuration is made for"
required: True
metadata:
author: "Josako"
date_added: "2025-05-21"
description: "Asset that defines a template in markdown a specialist can process"
changes: "Initial version"

View File

@@ -0,0 +1,21 @@
version: "1.0.0"
name: "Dossier Catalog"
description: "A Catalog with information in Evie's Library in which several Dossiers can be stored"
configuration:
tagging_fields:
name: "Tagging Fields"
type: "tagging_fields"
description: "Define the metadata fields that will be used for tagging documents.
Each field must have:
- type: one of 'string', 'integer', 'float', 'date', 'enum'
- required: boolean indicating if the field is mandatory
- description: field description
- allowed_values: list of values (for enum type only)
- min_value/max_value: range limits (for numeric types only)"
required: true
default: {}
document_version_configurations: ["tagging_fields"]
metadata:
author: "System"
date_added: "2023-01-01"
description: "A Catalog with information in Evie's Library in which several Dossiers can be stored"

View File

@@ -0,0 +1,9 @@
version: "1.0.0"
name: "Standard Catalog"
description: "A Catalog with information in Evie's Library, to be considered as a whole"
configuration: {}
document_version_configurations: []
metadata:
author: "System"
date_added: "2023-01-01"
description: "A Catalog with information in Evie's Library, to be considered as a whole"

View File

@@ -14,7 +14,17 @@ class Config(object):
SECRET_KEY = environ.get('SECRET_KEY')
SESSION_COOKIE_SECURE = False
SESSION_COOKIE_HTTPONLY = True
SESSION_KEY_PREFIX = f'{environ.get('COMPONENT_NAME')}_'
COMPONENT_NAME = environ.get('COMPONENT_NAME')
SESSION_KEY_PREFIX = f'{COMPONENT_NAME}_'
# Database Settings
DB_HOST = environ.get('DB_HOST')
DB_USER = environ.get('DB_USER')
DB_PASS = environ.get('DB_PASS')
DB_NAME = environ.get('DB_NAME')
DB_PORT = environ.get('DB_PORT')
SQLALCHEMY_DATABASE_URI = f'postgresql+pg8000://{DB_USER}:{DB_PASS}@{DB_HOST}:{DB_PORT}/{DB_NAME}'
SQLALCHEMY_BINDS = {'public': SQLALCHEMY_DATABASE_URI}
WTF_CSRF_ENABLED = True
WTF_CSRF_TIME_LIMIT = None
@@ -65,17 +75,13 @@ class Config(object):
# supported LLMs
# SUPPORTED_EMBEDDINGS = ['openai.text-embedding-3-small', 'openai.text-embedding-3-large', 'mistral.mistral-embed']
SUPPORTED_EMBEDDINGS = ['mistral.mistral-embed']
SUPPORTED_LLMS = ['openai.gpt-4o', 'anthropic.claude-3-5-sonnet', 'openai.gpt-4o-mini',
'mistral.mistral-large-latest', 'mistral.mistral-small-latest']
SUPPORTED_LLMS = ['openai.gpt-4o', 'openai.gpt-4o-mini',
'mistral.mistral-large-latest', 'mistral.mistral-medium-latest', 'mistral.mistral-small-latest']
ANTHROPIC_LLM_VERSIONS = {'claude-3-5-sonnet': 'claude-3-5-sonnet-20240620', }
# Annotation text chunk length
ANNOTATION_TEXT_CHUNK_LENGTH = {
'openai.gpt-4o': 10000,
'openai.gpt-4o-mini': 10000,
'anthropic.claude-3-5-sonnet': 8000
}
ANNOTATION_TEXT_CHUNK_LENGTH = 10000
# Environment Loaders
OPENAI_API_KEY = environ.get('OPENAI_API_KEY')
@@ -125,15 +131,6 @@ class Config(object):
"LLM": {"name": "LLM", "description": "Algorithm using information integrated in the used LLM"}
}
# flask-mailman settings
MAIL_SERVER = environ.get('MAIL_SERVER')
MAIL_PORT = int(environ.get('MAIL_PORT', 465))
MAIL_USE_TLS = False
MAIL_USE_SSL = True
MAIL_USERNAME = environ.get('MAIL_USERNAME')
MAIL_PASSWORD = environ.get('MAIL_PASSWORD')
MAIL_DEFAULT_SENDER = ('Evie', MAIL_USERNAME)
# Email settings for API key notifications
PROMOTIONAL_IMAGE_URL = 'https://askeveai.com/wp-content/uploads/2024/07/Evie-Call-scaled.jpg' # Replace with your actual URL
@@ -154,12 +151,30 @@ class Config(object):
COMPRESSION_PROCESS_DELAY = 1
# WordPress Integration Settings
WORDPRESS_PROTOCOL = os.environ.get('WORDPRESS_PROTOCOL', 'http')
WORDPRESS_HOST = os.environ.get('WORDPRESS_HOST', 'host.docker.internal')
WORDPRESS_PORT = os.environ.get('WORDPRESS_PORT', '10003')
WORDPRESS_PROTOCOL = environ.get('WORDPRESS_PROTOCOL', 'http')
WORDPRESS_HOST = environ.get('WORDPRESS_HOST', 'host.docker.internal')
WORDPRESS_PORT = environ.get('WORDPRESS_PORT', '10003')
WORDPRESS_BASE_URL = f"{WORDPRESS_PROTOCOL}://{WORDPRESS_HOST}:{WORDPRESS_PORT}"
EXTERNAL_WORDPRESS_BASE_URL = 'localhost:10003'
# Prometheus PUSH Gateway
PUSH_GATEWAY_HOST = environ.get('PUSH_GATEWAY_HOST', 'pushgateway')
PUSH_GATEWAY_PORT = environ.get('PUSH_GATEWAY_PORT', '9091')
PUSH_GATEWAY_URL = f"{PUSH_GATEWAY_HOST}:{PUSH_GATEWAY_PORT}"
# Scaleway parameters
SW_EMAIL_ACCESS_KEY = environ.get('SW_EMAIL_ACCESS_KEY')
SW_EMAIL_SECRET_KEY = environ.get('SW_EMAIL_SECRET_KEY')
SW_EMAIL_SENDER = environ.get('SW_EMAIL_SENDER')
SW_EMAIL_NAME = environ.get('SW_EMAIL_NAME')
SW_PROJECT = environ.get('SW_PROJECT')
# Entitlement Constants
ENTITLEMENTS_MAX_PENDING_DAYS = 5 # Defines the maximum number of days a pending entitlement can be active
# Content Directory for static content like the changelog, terms & conditions, privacy statement, ...
CONTENT_DIR = '/app/content'
class DevConfig(Config):
DEVELOPMENT = True
@@ -167,14 +182,6 @@ class DevConfig(Config):
FLASK_DEBUG = True
EXPLAIN_TEMPLATE_LOADING = False
# Database Settings
DB_HOST = environ.get('DB_HOST', 'localhost')
DB_USER = environ.get('DB_USER', 'luke')
DB_PASS = environ.get('DB_PASS', 'Skywalker!')
DB_NAME = environ.get('DB_NAME', 'eveai')
SQLALCHEMY_DATABASE_URI = f'postgresql+pg8000://{DB_USER}:{DB_PASS}@{DB_HOST}:5432/{DB_NAME}'
SQLALCHEMY_BINDS = {'public': SQLALCHEMY_DATABASE_URI}
# Define the nginx prefix used for the specific apps
EVEAI_APP_LOCATION_PREFIX = '/admin'
EVEAI_CHAT_LOCATION_PREFIX = '/chat'
@@ -237,7 +244,6 @@ class DevConfig(Config):
class ProdConfig(Config):
DEVELOPMENT = False
DEBUG = False
DEBUG = False
FLASK_DEBUG = False
EXPLAIN_TEMPLATE_LOADING = False
@@ -246,24 +252,6 @@ class ProdConfig(Config):
WTF_CSRF_SSL_STRICT = True # Set to True if using HTTPS
# Database Settings
DB_HOST = environ.get('DB_HOST')
DB_USER = environ.get('DB_USER')
DB_PASS = environ.get('DB_PASS')
DB_NAME = environ.get('DB_NAME')
DB_PORT = environ.get('DB_PORT')
SQLALCHEMY_DATABASE_URI = f'postgresql+pg8000://{DB_USER}:{DB_PASS}@{DB_HOST}:{DB_PORT}/{DB_NAME}'
SQLALCHEMY_BINDS = {'public': SQLALCHEMY_DATABASE_URI}
# flask-mailman settings
MAIL_SERVER = 'mail.askeveai.com'
MAIL_PORT = 587
MAIL_USE_TLS = True
MAIL_USE_SSL = False
MAIL_DEFAULT_SENDER = ('Evie Admin', 'evie_admin@askeveai.com')
MAIL_USERNAME = environ.get('MAIL_USERNAME')
MAIL_PASSWORD = environ.get('MAIL_PASSWORD')
# Define the nginx prefix used for the specific apps
EVEAI_APP_LOCATION_PREFIX = '/admin'
EVEAI_CHAT_LOCATION_PREFIX = '/chat'
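
The Config/DevConfig/ProdConfig hierarchy above centralises settings per environment. A minimal sketch of how such classes are typically selected at startup (the config module name and the use of FLASK_ENV here are assumptions, not taken from this diff):

# Illustrative app factory; the config module and class names are assumed.
from os import environ
from flask import Flask
from config import DevConfig, ProdConfig

def create_app():
    app = Flask(__name__)
    env = environ.get('FLASK_ENV', 'development')
    # Pick the configuration class based on the runtime environment
    app.config.from_object(ProdConfig if env == 'production' else DevConfig)
    return app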

View File

@@ -0,0 +1,43 @@
version: "1.0.0"
name: "Chat Client Customisation"
configuration:
"primary_color":
name: "Primary Color"
description: "Primary Color"
type: "color"
required: false
"secondary_color":
name: "Secondary Color"
description: "Secondary Color"
type: "color"
required: false
"background_color":
name: "Background Color"
description: "Background Color"
type: "color"
required: false
"text_color":
name: "Text Color"
description: "Text Color"
type: "color"
required: false
"sidebar_color":
name: "Sidebar Color"
description: "Sidebar Color"
type: "color"
required: false
"sidebar_text":
name: "Sidebar Text"
description: "Text to be shown in the sidebar"
type: "text"
required: false
"welcome_message":
name: "Welcome Message"
description: "Text to be shown as Welcome"
type: "text"
required: false
metadata:
author: "Josako"
date_added: "2024-06-06"
changes: "Initial version"
description: "Parameters allowing to customise the chat client"

View File

@@ -13,6 +13,26 @@ GRAYLOG_PORT = int(os.environ.get('GRAYLOG_PORT', 12201))
env = os.environ.get('FLASK_ENV', 'development')
def pad_string(s, target_length=100, pad_char='-'):
"""
Pads a string with the specified character until it reaches the target length.
Args:
s: The original string
target_length: The desired total length
pad_char: Character to use for padding
Returns:
The padded string
"""
current_length = len(s)
if current_length >= target_length:
return s
padding_needed = target_length - current_length - 1
return s + " " + (pad_char * padding_needed)
class TuningLogRecord(logging.LogRecord):
"""Extended LogRecord that handles both tuning and business event logging"""
@@ -27,6 +47,7 @@ class TuningLogRecord(logging.LogRecord):
self._tuning_specialist_id = None
self._tuning_retriever_id = None
self._tuning_processor_id = None
self._session_id = None
self.component = os.environ.get('COMPONENT_NAME', 'eveai_app')
def getMessage(self):
@@ -67,16 +88,18 @@ class TuningLogRecord(logging.LogRecord):
'tuning_specialist_id': self._tuning_specialist_id,
'tuning_retriever_id': self._tuning_retriever_id,
'tuning_processor_id': self._tuning_processor_id,
'session_id': self._session_id,
}
def set_tuning_data(self, tenant_id=None, catalog_id=None, specialist_id=None,
retriever_id=None, processor_id=None):
retriever_id=None, processor_id=None, session_id=None,):
"""Set tuning-specific data"""
object.__setattr__(self, '_tuning_tenant_id', tenant_id)
object.__setattr__(self, '_tuning_catalog_id', catalog_id)
object.__setattr__(self, '_tuning_specialist_id', specialist_id)
object.__setattr__(self, '_tuning_retriever_id', retriever_id)
object.__setattr__(self, '_tuning_processor_id', processor_id)
object.__setattr__(self, '_session_id', session_id)
class TuningFormatter(logging.Formatter):
@@ -100,6 +123,12 @@ class TuningFormatter(logging.Formatter):
identifiers.append(f"Catalog: {record.catalog_id}")
if hasattr(record, 'processor_id') and record.processor_id:
identifiers.append(f"Processor: {record.processor_id}")
if hasattr(record, 'specialist_id') and record.specialist_id:
identifiers.append(f"Specialist: {record.specialist_id}")
if hasattr(record, 'retriever_id') and record.retriever_id:
identifiers.append(f"Retriever: {record.retriever_id}")
if hasattr(record, 'session_id') and record.session_id:
identifiers.append(f"Session: {record.session_id}")
formatted_msg = (
f"{formatted_msg}\n"
@@ -129,22 +158,93 @@ class GraylogFormatter(logging.Formatter):
'specialist_id': record.specialist_id,
'retriever_id': record.retriever_id,
'processor_id': record.processor_id,
'session_id': record.session_id,
}
return super().format(record)
class TuningLogger:
"""Helper class to manage tuning logs with consistent structure"""
def __init__(self, logger_name, tenant_id=None, catalog_id=None, specialist_id=None, retriever_id=None, processor_id=None):
def __init__(self, logger_name, tenant_id=None, catalog_id=None, specialist_id=None, retriever_id=None,
processor_id=None, session_id=None, log_file=None):
"""
Initialize a tuning logger
Args:
logger_name: Base name for the logger
tenant_id: Optional tenant ID for context
catalog_id: Optional catalog ID for context
specialist_id: Optional specialist ID for context
retriever_id: Optional retriever ID for context
processor_id: Optional processor ID for context
session_id: Optional session ID for context and log file naming
log_file: Optional custom log file name to use
"""
self.logger = logging.getLogger(logger_name)
self.tenant_id = tenant_id
self.catalog_id = catalog_id
self.specialist_id = specialist_id
self.retriever_id = retriever_id
self.processor_id = processor_id
self.session_id = session_id
self.log_file = log_file
# Determine whether to use a session-specific logger
if session_id:
# Create a unique logger name for this session
session_logger_name = f"{logger_name}_{session_id}"
self.logger = logging.getLogger(session_logger_name)
def log_tuning(self, tuning_type: str, message: str, data=None, level=logging.DEBUG):
# If this logger doesn't have handlers yet, configure it
if not self.logger.handlers:
# Determine log file path
if not log_file and session_id:
log_file = f"logs/tuning_{session_id}.log"
elif not log_file:
log_file = "logs/tuning.log"
# Configure the logger
self._configure_session_logger(log_file)
else:
# Use the standard tuning logger
self.logger = logging.getLogger(logger_name)
def _configure_session_logger(self, log_file):
"""Configure a new session-specific logger with appropriate handlers"""
# Create and configure a file handler
file_handler = logging.handlers.RotatingFileHandler(
filename=log_file,
maxBytes=1024 * 1024 * 3, # 3MB
backupCount=3
)
file_handler.setFormatter(TuningFormatter())
file_handler.setLevel(logging.DEBUG)
# Add the file handler to the logger
self.logger.addHandler(file_handler)
# Add Graylog handler in production
env = os.environ.get('FLASK_ENV', 'development')
if env == 'production':
try:
graylog_handler = GELFUDPHandler(
host=GRAYLOG_HOST,
port=GRAYLOG_PORT,
debugging_fields=True
)
graylog_handler.setFormatter(GraylogFormatter())
self.logger.addHandler(graylog_handler)
except Exception as e:
# Fall back to just file logging if Graylog setup fails
fallback_logger = logging.getLogger('eveai_app')
fallback_logger.warning(f"Failed to set up Graylog handler: {str(e)}")
# Set logger level and disable propagation
self.logger.setLevel(logging.DEBUG)
self.logger.propagate = False
def log_tuning(self, tuning_type: str, message: str, data=None, level=logging.DEBUG):
"""Log a tuning event with structured data"""
try:
# Create a standard LogRecord for tuning
@@ -153,7 +253,7 @@ class TuningLogger:
level=level,
pathname='',
lineno=0,
msg=message,
msg=pad_string(message, 100, '-'),
args=(),
exc_info=None
)
@@ -166,6 +266,7 @@ class TuningLogger:
record.specialist_id = self.specialist_id
record.retriever_id = self.retriever_id
record.processor_id = self.processor_id
record.session_id = self.session_id
if data:
record.tuning_data = data
@@ -202,10 +303,10 @@ LOGGING = {
'backupCount': 2,
'formatter': 'standard',
},
'file_chat': {
'file_chat_client': {
'level': 'DEBUG',
'class': 'logging.handlers.RotatingFileHandler',
'filename': 'logs/eveai_chat.log',
'filename': 'logs/eveai_chat_client.log',
'maxBytes': 1024 * 1024 * 1, # 1MB
'backupCount': 2,
'formatter': 'standard',
@@ -250,14 +351,6 @@ LOGGING = {
'backupCount': 2,
'formatter': 'standard',
},
'file_mailman': {
'level': 'DEBUG',
'class': 'logging.handlers.RotatingFileHandler',
'filename': 'logs/mailman.log',
'maxBytes': 1024 * 1024 * 1, # 1MB
'backupCount': 2,
'formatter': 'standard',
},
'file_security': {
'level': 'DEBUG',
'class': 'logging.handlers.RotatingFileHandler',
@@ -339,8 +432,8 @@ LOGGING = {
'level': 'DEBUG',
'propagate': False
},
'eveai_chat': { # logger for the eveai_chat
'handlers': ['file_chat', 'graylog', ] if env == 'production' else ['file_chat', ],
'eveai_chat_client': { # logger for the eveai_chat
'handlers': ['file_chat_client', 'graylog', ] if env == 'production' else ['file_chat_client', ],
'level': 'DEBUG',
'propagate': False
},
@@ -369,11 +462,6 @@ LOGGING = {
'level': 'DEBUG',
'propagate': False
},
'mailman': { # logger for the mailman
'handlers': ['file_mailman', 'graylog', ] if env == 'production' else ['file_mailman', ],
'level': 'DEBUG',
'propagate': False
},
'security': { # logger for the security
'handlers': ['file_security', 'graylog', ] if env == 'production' else ['file_security', ],
'level': 'DEBUG',
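
The expanded TuningLogger above takes an optional session_id and, when one is given, writes to a per-session file such as logs/tuning_<session_id>.log. A usage sketch based on the constructor and log_tuning signatures shown in this diff; the IDs and the import location are made up for illustration:

import logging
# assuming TuningLogger is importable from the project's logging utilities module

tuning_logger = TuningLogger(
    'eveai_app.tuning',
    tenant_id=42,
    specialist_id=7,
    session_id='abc123',   # routes records to logs/tuning_abc123.log
)
tuning_logger.log_tuning(
    'retriever_call',
    'Retrieved 5 chunks for query',
    data={'query': 'opening hours', 'chunks': 5},
    level=logging.DEBUG,
)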

View File

@@ -0,0 +1,27 @@
version: "1.0.0"
name: "Management Service"
configuration:
billing_partner:
name: "Billing Partner"
type: "boolean"
description: "Billing of assigned Tenants is done through the partner."
required: true
default: false
permissions:
can_create_tenant:
name: "Can Create Tenant"
type: "boolean"
description: "The Partner Admin can create new Tenants, linked to the partner"
required: true
default: false
can_assign_license:
name: "Can Assign License"
type: "boolean"
description: "The Partner Admin can assign licenses to Tenants, linked to the partner"
required: true
default: false
metadata:
author: "Josako"
date_added: "2025-04-02"
changes: "Initial version"
description: "Initial definition of the management service"

View File

@@ -0,0 +1,14 @@
version: "1.0.0"
name: "Management Service"
configuration:
specialist_denominator:
name: "Specialist Denominator"
type: "string"
description: "Name defining the denominator for the specialist. Needs to be unique."
required: false
permissions: {}
metadata:
author: "Josako"
date_added: "2025-04-02"
changes: "Initial version"
description: "Initial definition of the management service"

View File

@@ -0,0 +1,9 @@
version: "1.0.0"
name: "AUDIO Processor"
file_types: "mp3, mp4, ogg"
description: "A Processor for audio files"
configuration: {}
metadata:
author: "System"
date_added: "2023-01-01"
description: "A Processor for audio files"

View File

@@ -0,0 +1,59 @@
version: "1.0.0"
name: "DOCX Processor"
file_types: "docx"
description: "A processor for DOCX files"
configuration:
chunking_patterns:
name: "Chunking Patterns"
description: "A list of Patterns used to chunk files into logical pieces"
type: "chunking_patterns"
required: false
chunking_heading_level:
name: "Chunking Heading Level"
type: "integer"
description: "Maximum heading level to consider for chunking (1-6)"
required: false
default: 2
extract_comments:
name: "Extract Comments"
type: "boolean"
description: "Whether to include document comments in the markdown"
required: false
default: false
extract_headers_footers:
name: "Extract Headers/Footers"
type: "boolean"
description: "Whether to include headers and footers in the markdown"
required: false
default: false
preserve_formatting:
name: "Preserve Formatting"
type: "boolean"
description: "Whether to preserve bold, italic, and other text formatting"
required: false
default: true
list_style:
name: "List Style"
type: "enum"
description: "How to format lists in markdown"
required: false
default: "dash"
allowed_values: ["dash", "asterisk", "plus"]
image_handling:
name: "Image Handling"
type: "enum"
description: "How to handle embedded images"
required: false
default: "skip"
allowed_values: ["skip", "extract", "placeholder"]
table_alignment:
name: "Table Alignment"
type: "enum"
description: "How to align table contents"
required: false
default: "left"
allowed_values: ["left", "center", "preserve"]
metadata:
author: "System"
date_added: "2023-01-01"
description: "A processor for DOCX files"

Some files were not shown because too many files have changed in this diff.