
mcp-titan
A memory engine that combines real-time learning, persistent three-tier context awareness, and plug-and-play LLM integration to continuously evolve and enrich your AI's knowledge.
Titan Memory MCP Server
I'm aware it's broken right now; I'll fix it! Ideally this just runs in yolo mode in Cursor (or Claude Desktop) without human intervention and creates a "brain" that is available independently of the LLM version.
A neural memory system for LLMs that can learn and predict sequences while maintaining state through a memory vector. This MCP (Model Context Protocol) server provides tools for Claude 3.7 Sonnet and other LLMs to maintain memory state across interactions.
Features
- Perfect for Cursor: Now that Cursor automatically runs MCP in yolo mode, you can take your hands off the wheel with your LLM's new memory
- Neural Memory Architecture: Transformer-based memory system that can learn and predict sequences
- Memory Management: Efficient tensor operations with automatic memory cleanup
- MCP Integration: Fully compatible with Cursor and other MCP clients
- Text Encoding: Convert text inputs to tensor representations
- Memory Persistence: Save and load memory states between sessions
Installation
# Clone the repository
git clone https://github.com/yourusername/titan-memory.git
cd titan-memory
# Install dependencies
npm install
# Build the project
npm run build
# Start the server
npm start
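Once the project is built, you can register the server with an MCP client. Below is a minimal sketch of a Cursor configuration entry; the config location (.cursor/mcp.json) and the dist/index.js entry point are assumptions, so adjust them to match your setup and build output (Claude Desktop uses the same mcpServers schema in its own config file):
{
  "mcpServers": {
    "titan-memory": {
      "command": "node",
      "args": ["/absolute/path/to/titan-memory/dist/index.js"]
    }
  }
}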
Available Tools
The Titan Memory MCP server provides the following tools:
help
Get help about available tools.
Parameters:
- `tool` (optional): Specific tool name to get help for
- `category` (optional): Category of tools to explore
- `showExamples` (optional): Include usage examples
- `verbose` (optional): Include detailed descriptions
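For example, asking for verbose help (with usage examples) on a single tool might look like this; the snippet follows the callTool style used in the usage section below, and the exact response shape depends on the server:
// Ask for detailed help on the forward_pass tool
const helpInfo = await callTool("help", {
  tool: "forward_pass",
  showExamples: true,
  verbose: true,
});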
init_model
Initialize the Titan Memory model with custom configuration.
Parameters:
- `inputDim`: Input dimension size (default: 768)
- `hiddenDim`: Hidden dimension size (default: 512)
- `memoryDim`: Memory dimension size (default: 1024)
- `transformerLayers`: Number of transformer layers (default: 6)
- `numHeads`: Number of attention heads (default: 8)
- `ffDimension`: Feed-forward dimension (default: 2048)
- `dropoutRate`: Dropout rate (default: 0.1)
- `maxSequenceLength`: Maximum sequence length (default: 512)
- `memorySlots`: Number of memory slots (default: 5000)
- `similarityThreshold`: Similarity threshold (default: 0.65)
- `surpriseDecay`: Surprise decay rate (default: 0.9)
- `pruningInterval`: Pruning interval (default: 1000)
- `gradientClip`: Gradient clipping value (default: 1.0)
forward_pass
Perform a forward pass through the model to get predictions.
Parameters:
- `x`: Input vector or text
- `memoryState` (optional): Memory state to use
train_step
Execute a training step to update the model.
Parameters:
- `x_t`: Current input vector or text
- `x_next`: Next input vector or text
get_memory_state
Get the current memory state and statistics.
Parameters:
- `type` (optional): Memory type filter
manifold_step
Update memory along a manifold direction.
Parameters:
- `base`: Base memory state
- `velocity`: Update direction
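A sketch of a call, assuming currentMemory is a memory state returned by an earlier tool call and updateDirection is a vector of matching dimensionality (both variable names are illustrative):
// Nudge the memory state along an update direction
const stepped = await callTool("manifold_step", {
  base: currentMemory,
  velocity: updateDirection,
});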
prune_memory
Remove less relevant memories to free up space.
Parameters:
- `threshold`: Pruning threshold (0-1)
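For example, a moderately aggressive prune (the threshold value is chosen purely for illustration):
// Remove memories whose relevance score falls below 0.4
await callTool("prune_memory", {
  threshold: 0.4,
});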
save_checkpoint
Save memory state to a file.
Parameters:
- `path`: Checkpoint file path
load_checkpoint
Load memory state from a file.
Parameters:
- `path`: Checkpoint file path
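A round-trip sketch combining save_checkpoint and load_checkpoint; the checkpoint path and file extension are illustrative assumptions:
// Persist the current memory state, then restore it later
await callTool("save_checkpoint", { path: "./checkpoints/memory.json" });
const restored = await callTool("load_checkpoint", { path: "./checkpoints/memory.json" });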
reset_gradients
Reset accumulated gradients to recover from training issues.
Parameters: None
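For example, after a training step produces NaN losses or other instability:
// Clear accumulated gradients and continue training from the current weights
await callTool("reset_gradients", {});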
Usage with Claude 3.7 Sonnet in Cursor
The Titan Memory MCP server is designed to work seamlessly with Claude 3.7 Sonnet in Cursor. Here's an example of how to use it:
// Initialize the model
const result = await callTool("init_model", {
inputDim: 768,
memorySlots: 10000,
transformerLayers: 8,
});
// Perform a forward pass
const { predicted, memoryUpdate } = await callTool("forward_pass", {
x: "const x = 5;", // or vector: [0.1, 0.2, ...]
memoryState: currentMemory,
});
// Train the model
const result = await callTool("train_step", {
x_t: "function hello() {",
x_next: " console.log('world');",
});
// Get memory state
const state = await callTool("get_memory_state", {});
Memory Management
The Titan Memory MCP server includes sophisticated memory management to prevent memory leaks and ensure efficient tensor operations:
- Automatic Cleanup: Periodically cleans up unused tensors
- Memory Encryption: Securely stores memory states
- Tensor Validation: Ensures tensors have the correct shape
- Error Recovery: Handles tensor errors gracefully
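The automatic-cleanup idea follows the standard scoped-disposal pattern used by tensor libraries. The sketch below illustrates that general pattern with TensorFlow.js (tf.tidy); this README does not state which tensor library the server uses internally, so treat it as an illustration of the technique rather than the server's actual code:
import * as tf from "@tensorflow/tfjs";
// tf.tidy disposes every intermediate tensor created inside the callback,
// keeping only the returned tensor alive
const normalized = tf.tidy(() => {
  const x = tf.tensor1d([1, 2, 3, 4]);
  const centered = x.sub(x.mean());
  return centered.div(centered.norm());
});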
Architecture
The Titan Memory MCP server is built with a modular architecture:
- TitanMemoryServer: Main server class that registers tools and handles requests
- TitanMemoryModel: Neural memory model implementation
- VectorProcessor: Handles input processing and text encoding
- MemoryManager: Manages tensor operations and memory cleanup
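A rough sketch of how these components could relate; the class names come from the list above, but the method names and signatures are assumptions made purely for illustration:
// Illustrative only: component relationships, not the actual implementation
interface MemoryState {
  vector: number[];
}
class VectorProcessor {
  encodeText(text: string): number[] {
    return Array.from(text, (c) => c.charCodeAt(0) / 255); // toy encoding
  }
}
class MemoryManager {
  cleanup(): void {
    // dispose unused tensors, validate shapes, recover from errors
  }
}
class TitanMemoryModel {
  forward(x: number[], state: MemoryState): { predicted: number[]; memoryUpdate: MemoryState } {
    return { predicted: x, memoryUpdate: state }; // placeholder
  }
}
class TitanMemoryServer {
  constructor(
    private model: TitanMemoryModel,
    private vectors: VectorProcessor,
    private memory: MemoryManager,
  ) {}
  // Registers the MCP tools (init_model, forward_pass, ...) and routes each
  // request through VectorProcessor and TitanMemoryModel, with MemoryManager
  // handling tensor cleanup.
}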
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
This project is licensed under the MIT License - see the LICENSE file for details.