# Justin Huang

> A personal blog by Justin Huang about AI, technology, writing, and life.

## About

A personal blog by Justin Huang about AI, technology, writing, and life.

- Site: https://justinhuangai.github.io/
- RSS: https://justinhuangai.github.io/rss.xml
- Full content: https://justinhuangai.github.io/llms-full.txt
- Individual posts: append .md to any post URL (e.g. /posts/hello-world.md)
- JSON API: https://justinhuangai.github.io/api/posts.json

## Blog Posts

- [Technical Report Reading: Attention Residuals](https://justinhuangai.github.io/posts/attention-residuals/): A reading of Kimi Team's Attention Residuals technical report: why residual connections should become attention-like too, and how Full AttnRes / Block AttnRes turn that idea into a trainable, deployable system
- [Paper Reading: Training Compute-Optimal Large Language Models](https://justinhuangai.github.io/posts/training-compute-optimal-large-language-models/): The Chinchilla paper — why most large models were undertrained, and how to spend your compute budget wisely, with real Python code examples
- [Paper Reading: Scaling Laws for Neural Language Models](https://justinhuangai.github.io/posts/scaling-laws-for-neural-language-models/): The mathematics of scale — why bigger models are predictably better, with real Python code examples
- [OpenClaw Deep Dive: Architecture Analysis 🦞](https://justinhuangai.github.io/posts/openclaw-architecture/): Dissecting the engineering skeleton of a self-hosted AI assistant, based on v2026.3.8 source code
- [Paper Reading: Language Models are Few-Shot Learners](https://justinhuangai.github.io/posts/language-models-are-few-shot-learners/): Larger models are better at eliciting abilities from context, with real Python code examples
- [OpenClaw Deep Dive: Ecosystem Analysis 🦞](https://justinhuangai.github.io/posts/openclaw-ecosystem/): From a single open-source project to a full AI assistant ecosystem
- [Paper Reading: BERT — Pre-training of Deep Bidirectional Transformers for Language Understanding](https://justinhuangai.github.io/posts/bert/): Establishing the pre-training paradigm, with real Python code examples
- [Paper Reading: Sequence to Sequence Learning with Neural Networks](https://justinhuangai.github.io/posts/sequence-to-sequence-learning-with-neural-networks/): Establishing the encoder-decoder paradigm, with real Python code examples
- [Clawdbot: A Decentralized Open-Source AI Project Worth Watching](https://justinhuangai.github.io/posts/clawdbot/): A self-hosted platform that connects all your chat channels to an AI Agent
- [Paper Reading: Neural Machine Translation by Jointly Learning to Align and Translate](https://justinhuangai.github.io/posts/neural-machine-translation-by-jointly-learning-to-align-and-translate/): The origin of the attention mechanism, with real Python code examples
- [Paper Reading: Attention Is All You Need](https://justinhuangai.github.io/posts/attention-is-all-you-need/): Sharing my understanding of the Transformer paper, with real Python code examples
- [👋 Hello World](https://justinhuangai.github.io/posts/hello-world/): Welcome to Astro-Theme-Aither — an AI-native Astro theme that believes text itself is beautiful.
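
The "append .md to any post URL" convention described above can be sketched as a small helper. This is an illustrative snippet, not part of the site itself; the function name `markdown_url` is hypothetical.

```python
def markdown_url(post_url: str) -> str:
    """Return the raw-Markdown URL for a post, per the site's
    convention of appending .md to any post URL.
    (Hypothetical helper name, for illustration only.)"""
    # Drop a trailing slash, if present, before appending the extension.
    return post_url.rstrip("/") + ".md"

print(markdown_url("https://justinhuangai.github.io/posts/hello-world/"))
# → https://justinhuangai.github.io/posts/hello-world.md
```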