* feat(web): add PageEmpty component
* feat(web): add PageTabs component
* feat(web): add PageEmpty component
* feat(web): add PageTabs component
* feat(prompt): add history tracking for prompt releases
* feat(web): add prompt menu
* refactor: The PageScrollList component supports two generic parameters
* feat(web): BodyWrapper component updates PageLoading
* feat(web): add Ontology menu
* feat(web): add scene to memory management
* feat(tasks): add celery task configuration for periodic jobs
- Add ignore_result=True to prevent storing results for periodic tasks
- Set max_retries=0 to skip failed periodic tasks without retry attempts
- Configure acks_late=False for immediate acknowledgment in beat tasks
- Add time_limit and soft_time_limit to regenerate_memory_cache task (3600s/3300s)
- Add time_limit and soft_time_limit to workspace_reflection_task (300s/240s)
- Add time_limit and soft_time_limit to run_forgetting_cycle_task (7200s/7000s)
- Improve task reliability and resource management for scheduled jobs
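The per-task settings above can be summarized in one place. The sketch below is illustrative only: the task names and limits are taken from the bullets above, but the dict layout and the `task_options` helper are hypothetical (the project presumably applies these options via Celery task decorators or `app.conf`).

```python
# Options shared by every periodic (beat) task, per the commit description
COMMON_OPTIONS = {
    "ignore_result": True,   # do not store results for periodic tasks
    "max_retries": 0,        # skip failed runs without retry attempts
    "acks_late": False,      # acknowledge immediately in beat tasks
}

# Per-task time limits (hard / soft, in seconds) from the commit bullets
PERIODIC_TASK_OPTIONS = {
    "regenerate_memory_cache": {"time_limit": 3600, "soft_time_limit": 3300},
    "workspace_reflection_task": {"time_limit": 300, "soft_time_limit": 240},
    "run_forgetting_cycle_task": {"time_limit": 7200, "soft_time_limit": 7000},
}


def task_options(name: str) -> dict:
    """Merge the shared beat-task options with a task's own time limits."""
    return {**COMMON_OPTIONS, **PERIODIC_TASK_OPTIONS[name]}
```

With Celery, the soft limit raises `SoftTimeLimitExceeded` inside the task before the hard limit kills the worker process, which is why each soft limit above sits below its hard limit.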
* feat(sandbox): add Node.js code execution support to sandbox
* Release/v0.2.2 (#260)
* [modify] migration script
* [add] migration script
* fix(web): change form message
* fix(web): the memoryContent field is compatible with numbers and strings
* feat(web): hide code node
* fix(model):
1. Add a basic model check for duplicate name and provider.
2. Fix erroneous models in results caused by the provider creating API keys for all matching models.
---------
Co-authored-by: Mark <zhuwenhui5566@163.com>
Co-authored-by: zhaoying <yzhao96@best-inc.com>
Co-authored-by: yingzhao <zhaoyingyz@126.com>
Co-authored-by: Timebomb2018 <18868801967@163.com>
* Feature/ontology class clean (#249)
* [add] Complete ontology engineering feature implementation
* [add] Add ontology feature integration and validation utilities
* [add] Add OWL validator and validation utilities
* [fix] Add missing render_ontology_extraction_prompt function
* [fix] Add dependencies, fix functionality
* [add] migration script
* feat(celery): add dedicated periodic tasks worker and queue (#261)
* fix(web): conflict resolve
* Fix/v022 bug (#263)
* [fix] Fix the issue of inconsistent language in explicit and episodic memory.
* [fix] Fix the issue of inconsistent language in explicit and episodic memory.
* [add] Add scene_id
* [fix] Fix the code based on the AI review
* Fix/develop memory reflex (#265)
* Missed history mappings
* Missed history mappings
* Handle reflection background errors
* [add] migration script
* fix: chat conversation_id add node_start
* feat(web): show code node
* fix(web): restructure the CustomSelect component; fix the interface being called multiple times when the form updates
* feat(web): RadioGroupCard support block mode
* feat(web): create space add icon
* feat(app and model): token consumption statistics
* Add/develop memory (#264)
* Missed history mappings
* Missed history mappings
* Missed history mappings
* Missed history mappings
* Missed history mappings
* Missed history mappings
* Missed history mappings
* Missed history mappings
* Missed history mappings
* Add long-term memory feature
* Add long-term memory feature
* Add long-term memory feature
* Remove redundant fields from knowledge base retrieval
* Long-term
* feat(app and model): token consumption statistics of the cluster
* memory_BUG_fix
* fix(web): prompt history remove pageLoading
* fix(prompt): remove hard-coded import of prompt file paths (#279)
* Fix/develop memory bug (#274)
* Missed history mappings
* Missed history mappings
* fix_timeline_memories
* fix(web): update retrieve_type key
* Fix/develop memory bug (#276)
* Missed history mappings
* Missed history mappings
* fix_timeline_memories
* fix_timeline_memories
* write_graph/bug_fix
* write_graph/bug_fix
* write_graph/bug_fix
* chore(celery): disable periodic task scheduling
* fix(prompt): remove hard-coded import of prompt file paths
---------
Co-authored-by: lixinyue11 <94037597+lixinyue11@users.noreply.github.com>
Co-authored-by: zhaoying <yzhao96@best-inc.com>
Co-authored-by: yingzhao <zhaoyingyz@126.com>
Co-authored-by: Ke Sun <kesun5@illinois.edu>
* fix(web): remove delete confirm content
* refactor(workflow): relocate template directory into workflow
* feat(memory): add long-term storage task routing and batching
* fix(web): PageScrollList loading update
* fix(web): PageScrollList loading update
* Ontology v1 bug (#291)
* [changes] Add 'id' as the secondary sort key; 'scene_id' now returns a UUID object
* [fix] Sort the "end_user" return by update time.
* [fix] Set the memory configuration model defaults based on the space model.
* [fix] Remove the entity extraction check for the combined model, read the configuration list, and add scene_id to the return
* [fix] Sort the "end_user" return by update time.
* [fix]
* fix(memory): add Redis session validation
- Add macOS fork() safety configuration in celery_app.py to prevent initialization issues
- Add null/False checks for Redis session queries in term_memory_save to handle missing sessions gracefully
- Add null/False checks in memory_long_term_storage to prevent processing empty Redis results
- Add null/False checks in aggregate_judgment before format_parsing to avoid errors on missing data
- Initialize redis_messages variable in window_dialogue for consistency
- Add debug logging when no existing session found in Redis for better troubleshooting
- Add TODO comments for magic numbers (scope=6, time=5) to be extracted as constants
- Improve error handling when Redis returns False or empty results instead of crashing
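The null/False checks described above can be captured by a small guard. This is a hypothetical sketch, not the project's code: `safe_session_messages` and its JSON decoding step are assumptions, but they mirror the pattern the bullets describe (redis-py returns None for absent keys, and a failed lookup may surface as False or an empty value).

```python
import json


def safe_session_messages(raw):
    """Return session messages from a raw Redis result, tolerating missing
    data instead of crashing. Hypothetical helper; the JSON decoding step
    is an assumption about the stored format.
    """
    if not raw:  # None, False, b"", "", and [] all mean "no session"
        return []
    if isinstance(raw, (bytes, str)):
        try:
            parsed = json.loads(raw)
        except (ValueError, TypeError):
            return []
        return parsed if isinstance(parsed, list) else []
    return list(raw)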
* fix(web): PageScrollList style update
* fix(workflow): fix argument passing in code execution nodes
* fix(web): prompt add disabled
* fix(web): space icon required
* feat(app): modify the key of the token
* fix(app): fix the key of the app's token
* fix(workflow): switch code input encoding to base64+URL encoding
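A minimal sketch of the encoding scheme that commit describes: URL-safe base64 round-trips arbitrary source text without shell-quoting or charset issues when code is passed between processes. The helper names here are hypothetical.

```python
import base64


def encode_code_input(source: str) -> str:
    """Encode workflow code input as URL-safe base64 (no '+' or '/')."""
    return base64.urlsafe_b64encode(source.encode("utf-8")).decode("ascii")


def decode_code_input(encoded: str) -> str:
    """Recover the original source text from its URL-safe base64 form."""
    return base64.urlsafe_b64decode(encoded.encode("ascii")).decode("utf-8")
```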
* [add] The main project adds multi-API-key load balancing.
* [changes] Safe attribute access, safe numeric conversion, consistent use of local variables
* fix(web): save add session update
* fix(web): language editor support paste
* [changes] Active-status filtering logic and API key selection strategy
* memory_BUG
* memory_BUG_long_term
* [changes]
* memory_BUG_long_term
* memory_BUG_long_term
* Fix/release memory bug (#306)
* memory_BUG_fix
* memory_BUG
* memory_BUG_long_term
* memory_BUG_long_term
* memory_BUG_long_term
* knowledge_retrieval/bug/fix
* knowledge_retrieval/bug/fix
* knowledge_retrieval/bug/fix
* [fix] 1. The "read_all_config" interface returns "scene_name"; 2. Memory configuration for lightweight-query ontology scenarios
* fix(web): replace code editor
* [changes] Modify the recent event's time description
* [changes] Modify the code based on the AI review
* feat(web): update memory config ontology api
* fix(web): ui update
* knowledge_retrieval/bug/fix
* knowledge_retrieval/bug/fix
* knowledge_retrieval/bug/fix
* feat(workflow): add token usage statistics for question classifier and parameter extraction
* feat(web): move prompt menu
* Merge multiple independent transactions into a single transaction
* Merge multiple independent transactions into a single transaction
* Merge multiple independent transactions into a single transaction
* Merge multiple independent transactions into a single transaction
* Write Missing None (#321)
* Write Missing None
* Write Missing None
* Write Missing None
* Apply suggestion from @sourcery-ai[bot]
Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>
* Write Missing None
---------
Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>
* Fix/release memory bug (#324)
* Write Missing None
* Write Missing None
* Write Missing None
* Apply suggestion from @sourcery-ai[bot]
Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>
* Write Missing None
* redis update
* redis update
* redis update
* redis update
---------
Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>
* Fix/writer memory bug (#326)
* [fix] Fix the bug
* [fix] Fix the bug
* [fix] Correct the direction indication.
* fix(web): markdown table ui update
* Fix/release memory bug (#332)
* Write Missing None
* Write Missing None
* Write Missing None
* Apply suggestion from @sourcery-ai[bot]
Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>
* Write Missing None
* redis update
* redis update
* redis update
* redis update
* writer_dup_bug/fix
---------
Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>
* Fix/fact summary (#333)
* [fix] Disable the fact_summary-related content
* [fix] Disable the fact_summary-related content
* [fix] Modify the code based on the AI review
* Fix/release memory bug (#335)
* Write Missing None
* Write Missing None
* Write Missing None
* Apply suggestion from @sourcery-ai[bot]
Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>
* Write Missing None
* redis update
* redis update
* redis update
* redis update
* writer_dup_bug/fix
* writer_graph_bug/fix
* writer_graph_bug/fix
---------
Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>
* Revert "feat(web): move prompt menu"
This reverts commit 9e6e8f50f8.
* fix(web): ui update
* fix(web): update text
* fix(web): ui update
* fix(model): change the "vl" model type of dashscope to "chat"
* fix(model): change the "vl" model type of dashscope to "chat"
---------
Co-authored-by: zhaoying <yzhao96@best-inc.com>
Co-authored-by: Eternity <1533512157@qq.com>
Co-authored-by: Mark <zhuwenhui5566@163.com>
Co-authored-by: yingzhao <zhaoyingyz@126.com>
Co-authored-by: Timebomb2018 <18868801967@163.com>
Co-authored-by: 乐力齐 <162269739+lanceyq@users.noreply.github.com>
Co-authored-by: lixinyue11 <94037597+lixinyue11@users.noreply.github.com>
Co-authored-by: lixinyue <2569494688@qq.com>
Co-authored-by: Eternity <61316157+myhMARS@users.noreply.github.com>
Co-authored-by: lanceyq <1982376970@qq.com>
Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>
from typing import List
import logging

# Use the new repository layer
from app.repositories.neo4j.neo4j_connector import Neo4jConnector
from app.repositories.neo4j.add_nodes import add_chunk_nodes
from app.repositories.neo4j.cypher_queries import (
    DIALOGUE_NODE_SAVE,
    CHUNK_NODE_SAVE,
    STATEMENT_NODE_SAVE,
    STATEMENT_ENTITY_EDGE_SAVE,
    ENTITY_RELATIONSHIP_SAVE,
    EXTRACTED_ENTITY_NODE_SAVE,
    CHUNK_STATEMENT_EDGE_SAVE,
)
from app.core.memory.models.graph_models import (
    DialogueNode,
    ChunkNode,
    StatementChunkEdge,
    StatementEntityEdge,
    StatementNode,
    ExtractedEntityNode,
    EntityEntityEdge,
)

logger = logging.getLogger(__name__)


def _entity_edge_to_dict(edge: EntityEntityEdge) -> dict:
    """Serialize an EntityEntityEdge for the ENTITY_RELATIONSHIP_SAVE query."""
    return {
        'source_id': edge.source,
        'target_id': edge.target,
        'predicate': edge.relation_type,
        'statement_id': edge.source_statement_id,
        'value': edge.relation_value,
        'statement': edge.statement,
        'valid_at': edge.valid_at.isoformat() if edge.valid_at else None,
        'invalid_at': edge.invalid_at.isoformat() if edge.invalid_at else None,
        'created_at': edge.created_at.isoformat() if edge.created_at else None,
        'expired_at': edge.expired_at.isoformat() if edge.expired_at else None,
        'run_id': edge.run_id,
        'end_user_id': edge.end_user_id,
    }


async def save_entities_and_relationships(
    entity_nodes: List[ExtractedEntityNode],
    entity_entity_edges: List[EntityEntityEdge],
    connector: Neo4jConnector,
):
    """Save entities and their relationships using graph models."""
    all_entities = [entity.model_dump() for entity in entity_nodes]
    all_relationships = [_entity_edge_to_dict(edge) for edge in entity_entity_edges]

    # Save entities
    if all_entities:
        entity_uuids = await connector.execute_query(EXTRACTED_ENTITY_NODE_SAVE, entities=all_entities)
        if entity_uuids:
            logger.info("Successfully saved %d entity nodes to Neo4j", len(entity_uuids))
        else:
            logger.warning("Failed to save entity nodes to Neo4j")
    else:
        logger.info("No entity nodes to save")

    # Create relationships
    if all_relationships:
        relationship_uuids = await connector.execute_query(ENTITY_RELATIONSHIP_SAVE, relationships=all_relationships)
        if relationship_uuids:
            logger.info("Successfully saved %d entity relationships (edges) to Neo4j", len(relationship_uuids))
        else:
            logger.warning("Failed to save entity relationships to Neo4j")
    else:
        logger.info("No entity relationships to save")


async def save_chunk_nodes(
    chunk_nodes: List[ChunkNode],
    connector: Neo4jConnector,
):
    """Save chunk nodes using graph models."""
    if not chunk_nodes:
        logger.info("No chunk nodes to save")
        return

    chunk_uuids = await add_chunk_nodes(chunk_nodes, connector)
    if chunk_uuids:
        logger.info("Successfully saved %d chunk nodes to Neo4j", len(chunk_uuids))
    else:
        logger.warning("Failed to save chunk nodes to Neo4j")


async def save_statement_chunk_edges(
    statement_chunk_edges: List[StatementChunkEdge],
    connector: Neo4jConnector,
):
    """Save statement-chunk edges using graph models."""
    if not statement_chunk_edges:
        return

    all_sc_edges = [
        {
            "id": edge.id,
            "source": edge.source,
            "target": edge.target,
            "end_user_id": edge.end_user_id,
            "run_id": edge.run_id,
            "created_at": edge.created_at.isoformat() if edge.created_at else None,
            "expired_at": edge.expired_at.isoformat() if edge.expired_at else None,
        }
        for edge in statement_chunk_edges
    ]

    try:
        await connector.execute_query(
            CHUNK_STATEMENT_EDGE_SAVE,
            chunk_statement_edges=all_sc_edges,
        )
    except Exception:
        # Log instead of silently swallowing the failure
        logger.exception("Failed to save statement-chunk edges to Neo4j")


async def save_statement_entity_edges(
    statement_entity_edges: List[StatementEntityEdge],
    connector: Neo4jConnector,
):
    """Save statement-entity edges using graph models."""
    if not statement_entity_edges:
        logger.info("No statement-entity edges to save")
        return

    all_se_edges = [
        {
            "source": edge.source,
            "target": edge.target,
            "end_user_id": edge.end_user_id,
            "run_id": edge.run_id,
            "connect_strength": edge.connect_strength,
            "created_at": edge.created_at.isoformat() if edge.created_at else None,
            "expired_at": edge.expired_at.isoformat() if edge.expired_at else None,
        }
        for edge in statement_entity_edges
    ]

    try:
        await connector.execute_query(
            STATEMENT_ENTITY_EDGE_SAVE,
            relationships=all_se_edges,
        )
    except Exception:
        # Log instead of silently swallowing the failure
        logger.exception("Failed to save statement-entity edges to Neo4j")


async def save_dialog_and_statements_to_neo4j(
    dialogue_nodes: List[DialogueNode],
    chunk_nodes: List[ChunkNode],
    statement_nodes: List[StatementNode],
    entity_nodes: List[ExtractedEntityNode],
    entity_edges: List[EntityEntityEdge],
    statement_chunk_edges: List[StatementChunkEdge],
    statement_entity_edges: List[StatementEntityEdge],
    connector: Neo4jConnector,
) -> bool:
    """Save dialogue nodes, chunk nodes, statement nodes, entities, and all relationships to Neo4j using graph models.

    Args:
        dialogue_nodes: List of DialogueNode objects to save
        chunk_nodes: List of ChunkNode objects to save
        statement_nodes: List of StatementNode objects to save
        entity_nodes: List of ExtractedEntityNode objects to save
        entity_edges: List of EntityEntityEdge objects to save
        statement_chunk_edges: List of StatementChunkEdge objects to save
        statement_entity_edges: List of StatementEntityEdge objects to save
        connector: Neo4j connector instance

    Returns:
        bool: True if successful, False otherwise
    """

    # Define a transaction function that runs all writes in a single transaction
    async def _save_all_in_transaction(tx):
        """Run all save operations in one transaction to avoid deadlocks."""
        results = {}

        # 1. Save all dialogue nodes in batch
        if dialogue_nodes:
            dialogue_data = [node.model_dump() for node in dialogue_nodes]
            result = await tx.run(DIALOGUE_NODE_SAVE, dialogues=dialogue_data)
            dialogue_uuids = [record["uuid"] async for record in result]
            results['dialogues'] = dialogue_uuids
            logger.info("Dialogues saved to Neo4j with UUIDs: %s", dialogue_uuids)

        # 2. Save all chunk nodes in batch
        if chunk_nodes:
            chunk_data = [node.model_dump() for node in chunk_nodes]
            result = await tx.run(CHUNK_NODE_SAVE, chunks=chunk_data)
            chunk_uuids = [record["uuid"] async for record in result]
            results['chunks'] = chunk_uuids
            logger.info("Successfully saved %d chunk nodes to Neo4j", len(chunk_uuids))

        # 3. Save all statement nodes in batch
        if statement_nodes:
            statement_data = [node.model_dump() for node in statement_nodes]
            result = await tx.run(STATEMENT_NODE_SAVE, statements=statement_data)
            statement_uuids = [record["uuid"] async for record in result]
            results['statements'] = statement_uuids
            logger.info("Successfully saved %d statement nodes to Neo4j", len(statement_uuids))

        # 4. Save entities
        if entity_nodes:
            entity_data = [entity.model_dump() for entity in entity_nodes]
            result = await tx.run(EXTRACTED_ENTITY_NODE_SAVE, entities=entity_data)
            entity_uuids = [record["uuid"] async for record in result]
            results['entities'] = entity_uuids
            logger.info("Successfully saved %d entity nodes to Neo4j", len(entity_uuids))

        # 5. Create entity relationships
        if entity_edges:
            relationship_data = [_entity_edge_to_dict(edge) for edge in entity_edges]
            result = await tx.run(ENTITY_RELATIONSHIP_SAVE, relationships=relationship_data)
            rel_uuids = [record["uuid"] async for record in result]
            results['entity_relationships'] = rel_uuids
            logger.info("Successfully saved %d entity relationships to Neo4j", len(rel_uuids))

        # 6. Save statement-chunk edges
        if statement_chunk_edges:
            sc_edge_data = [
                {
                    "id": edge.id,
                    "source": edge.source,
                    "target": edge.target,
                    "created_at": edge.created_at.isoformat() if edge.created_at else None,
                    "expired_at": edge.expired_at.isoformat() if edge.expired_at else None,
                    "run_id": edge.run_id,
                    "end_user_id": edge.end_user_id,
                }
                for edge in statement_chunk_edges
            ]
            result = await tx.run(CHUNK_STATEMENT_EDGE_SAVE, chunk_statement_edges=sc_edge_data)
            sc_uuids = [record["uuid"] async for record in result]
            results['statement_chunk_edges'] = sc_uuids
            logger.info("Successfully saved %d statement-chunk edges to Neo4j", len(sc_uuids))

        # 7. Save statement-entity edges
        if statement_entity_edges:
            se_edge_data = [
                {
                    "source": edge.source,
                    "target": edge.target,
                    "created_at": edge.created_at.isoformat() if edge.created_at else None,
                    "expired_at": edge.expired_at.isoformat() if edge.expired_at else None,
                    "run_id": edge.run_id,
                    "end_user_id": edge.end_user_id,
                    "connect_strength": getattr(edge, "connect_strength", "strong"),
                }
                for edge in statement_entity_edges
            ]
            result = await tx.run(STATEMENT_ENTITY_EDGE_SAVE, relationships=se_edge_data)
            se_uuids = [record["uuid"] async for record in result]
            results['statement_entity_edges'] = se_uuids
            logger.info("Successfully saved %d statement-entity edges to Neo4j", len(se_uuids))

        return results

    try:
        # Execute all operations in an explicit write transaction to avoid deadlocks
        results = await connector.execute_write_transaction(_save_all_in_transaction)
        summary = {
            key: len(value)
            for key, value in results.items()
            if isinstance(value, (list, tuple, set))
        }
        logger.info("Transaction completed. Summary: %s", summary)
        logger.debug("Full transaction results: %r", results)
        return True

    except Exception as e:
        logger.error(f"Neo4j integration error: {e}", exc_info=True)
        logger.warning("Continuing without database storage...")
        return False