RNN vs. CNN vs. Autoencoder vs. Attention/Transformer

RNN vs. CNN vs. Autoencoder vs. Attention/Transformer: A Practical Guide with PyTorch

Deep learning has evolved rapidly, offering a toolkit of neural architectures for various data types and tasks. Among the most influential are Recurrent Neural Networks (RNNs), Convolutional Neural Networks (CNNs), Autoencoders, and modern Attention/Transformer models. But how do …

AI Engineering by Chip Huyen: Chapter 7: RAG and Agents

Detailed Notes: RAG, Agents, and Memory in AI Applications

1. Task Solving and Context

Instructions vs. Context:
- Instructions: general and static for the application (how to solve).
- Context: specific to each query (what information to use).
- Missing context → the AI is more likely to hallucinate or err.

Two Major Context Patterns: RAG …

AI Engineering by Chip Huyen: Chapter 2 Notes and Summary

Chapter 2: Understanding Foundation Models

Overview

Foundation model design choices (training data, architecture/size, post-training) are increasingly opaque. The training process splits into pre-training (which makes the model capable) and post-training (which aligns the model to human preferences). Sampling (how outputs are chosen from all possibilities) is a crucial, often underestimated factor impacting model behavior and …