Volume 16, Number 5
Memory Architecture in S-AI-GPT: From Contextual Adaptation to Hormonal Modulation
Authors
Said Slaoui, Mohammed V University, Morocco
Abstract
This article presents a biologically inspired memory architecture embedded within the Sparse Artificial Intelligence – Generative Pretrained Transformer (S-AI-GPT) conversational framework. To address the limitations of stateless Large Language Models (LLMs), the system integrates three complementary components: a Dynamic Contextual Memory (DCM) for short-term working retention, a GPTMemoryAgent for long-term personalized storage, and a GPT-MemoryGland for encoding and modulating affective traces. These components are orchestrated by a hormonal engine that enables adaptive forgetting, emotional persistence, and context-aware prioritization of memory traces. Unlike typical passive memory modules, this architecture introduces an active, symbolic, and controllable memory system: memory traces can trigger internal hormonal signals, are stored in a structured and interpretable form, and can be selectively reinforced, inhibited, or reorganized by the GPT-MetaAgent. The proposed model provides a promising foundation for building frugal, adaptive, and explainable lifelong memory systems in conversational AI.
Keywords
Memory in AI, Conversational Agents, Sparse Activation, Hormonal Modulation, Personalized Dialogue, Emotional Trace, Dynamic Memory, Modular Architecture, S-AI-GPT.
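
To make the interplay summarized in the abstract more concrete, the following minimal Python sketch illustrates how the three stores and their hormonal orchestration might fit together. The component names (DCM, GPTMemoryAgent, GPT-MemoryGland, GPT-MetaAgent) come from the article; the methods, salience update rule, and thresholds are hypothetical assumptions for illustration, not the paper's implementation.

```python
# Hypothetical sketch of the three memory components and their hormonal
# orchestration; method names and formulas are illustrative only.
import time
from dataclasses import dataclass, field

@dataclass
class MemoryTrace:
    content: str
    emotional_weight: float = 0.0   # assigned by the GPT-MemoryGland analogue
    salience: float = 1.0           # decays unless hormonally reinforced
    created_at: float = field(default_factory=time.time)

class DynamicContextualMemory:
    """Short-term working store: a small, recency-ordered buffer (DCM analogue)."""
    def __init__(self, capacity: int = 8):
        self.buffer: list[MemoryTrace] = []
        self.capacity = capacity

    def add(self, trace: MemoryTrace) -> None:
        self.buffer.append(trace)
        self.buffer = self.buffer[-self.capacity:]   # keep only the most recent traces

class GPTMemoryAgent:
    """Long-term personalized store: keeps traces whose salience survives modulation."""
    def __init__(self):
        self.store: list[MemoryTrace] = []

    def consolidate(self, trace: MemoryTrace, hormone_level: float) -> None:
        # The hormonal signal modulates whether a trace persists (adaptive forgetting):
        # low levels inhibit the trace, high levels reinforce it.
        trace.salience *= (0.5 + hormone_level)
        if trace.salience >= 1.0:
            self.store.append(trace)

class GPTMemoryGland:
    """Affective encoder: emits a hormone level from the emotional weight of a trace."""
    def release(self, trace: MemoryTrace) -> float:
        return min(1.0, abs(trace.emotional_weight))

# Orchestration loop (GPT-MetaAgent analogue): route each new trace through the
# working memory, tag it affectively, and decide on long-term consolidation.
dcm, agent, gland = DynamicContextualMemory(), GPTMemoryAgent(), GPTMemoryGland()
for text, emotion in [("user prefers concise answers", 0.2),
                      ("user reported a stressful deadline", 0.9)]:
    trace = MemoryTrace(content=text, emotional_weight=emotion)
    dcm.add(trace)
    agent.consolidate(trace, hormone_level=gland.release(trace))

print([t.content for t in agent.store])   # only strongly modulated traces persist
```

In this sketch, every trace passes through the short-term buffer, but only traces whose hormonal signal is strong enough are consolidated into long-term storage, which is one simple way to read the abstract's claim of adaptive forgetting and emotional persistence.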