View AggmGPT-2
Read the AggmGPT-2 research paper
AggmGPT is a lightweight foundational AI language model developed by Adolfo GM, designed to generate human-like text by combining n-gram models with self-attention mechanisms. The project is licensed under the MIT License, making it open source and free to modify and distribute.
- Implements Self-Attention and Multi-Head Attention for improved context understanding.
- Includes positional encoding to retain sequence structure.
- Features a Feed-Forward Neural Network to refine predictions.
- Provides tokenization and embedding functions for handling input text.
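To make the attention and positional-encoding features above concrete, here is a minimal NumPy sketch of sinusoidal positional encoding and single-head scaled dot-product self-attention. This is an illustrative simplification, not AggmGPT's actual implementation; the function names and shapes are assumptions for the example.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # Sinusoidal positional encoding: even dims get sin, odd dims get cos.
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    enc = np.zeros((seq_len, d_model))
    enc[:, 0::2] = np.sin(angles[:, 0::2])
    enc[:, 1::2] = np.cos(angles[:, 1::2])
    return enc

def self_attention(x):
    # Scaled dot-product self-attention with Q = K = V = x (no learned
    # projections, for clarity). Softmax is computed row-wise over scores.
    d_k = x.shape[-1]
    scores = x @ x.T / np.sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x

# Toy input: 5 token embeddings of width 8, plus positional information.
x = np.random.rand(5, 8) + positional_encoding(5, 8)
out = self_attention(x)
print(out.shape)  # (5, 8)
```

Multi-head attention repeats this computation with several independent learned projections of `x` and concatenates the results.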
Ensure you have Python installed on your system. You can download it from the official website.
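Before running the model, you can confirm a working Python 3 interpreter is on your path (the `python3` command name is an assumption; on some systems it is `python`):

```shell
# Verify Python 3 is available before running AggmGPT.
python3 --version
```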
AggmGPT processes text through the following stages:
- Tokenization & Embedding
- Positional Encoding
- Self-Attention & Multi-Head Attention
- Feed-Forward Network
- N-gram Models
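The n-gram stage of the pipeline can be sketched as a simple bigram (2-gram) model: count which token follows each token in the training text, then generate by repeatedly emitting the most frequent successor. This is a hedged illustration of the idea, not AggmGPT's actual n-gram code; the helper names are hypothetical.

```python
from collections import defaultdict, Counter

def train_bigram(tokens):
    # Count successor frequencies for each token (a 2-gram model).
    model = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        model[prev][nxt] += 1
    return model

def generate(model, start, length=5):
    # Greedy decoding: always pick the most frequent successor.
    out = [start]
    for _ in range(length):
        followers = model.get(out[-1])
        if not followers:
            break
        out.append(followers.most_common(1)[0][0])
    return out

tokens = "the cat sat on the mat the cat ran".split()
model = train_bigram(tokens)
print(generate(model, "the"))
```

In a hybrid design like the one described here, such n-gram statistics can complement the attention layers' predictions over longer contexts.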
This project is licensed under the MIT License. See the LICENSE file for details.
Created by Adolfo GM.


