This document summarizes a research paper on memory-augmented neural networks (MANNs) for meta-learning. The paper proposes a MANN that classifies samples from previously unseen classes by storing sample-class bindings in an external memory rather than in network weights. The MANN outperforms a k-nearest-neighbors baseline, and an LSTM controller performs better than a feedforward one. Experiments show the MANN rapidly learns from small amounts of data: wiping the memory between tasks forces each new task's bindings to be formed quickly in memory, while the slowly updated network weights retain relevant background knowledge across tasks.
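To make the sample-class binding idea concrete, here is a minimal toy sketch (not the paper's actual architecture, which uses a differentiable NTM-style memory with learned read/write heads). It shows the core mechanism the summary describes: class labels are looked up from an external memory via content-based addressing over stored sample encodings, and the memory can be wiped between tasks while any "background knowledge" would live elsewhere. All names here (`ExternalMemory`, `write`, `read`, `wipe`) are illustrative, not from the paper.

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors (with a small epsilon for safety)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb + 1e-9)

class ExternalMemory:
    """Toy key-value memory: rows hold sample encodings (keys) and class labels."""

    def __init__(self):
        self.keys = []
        self.labels = []

    def write(self, key, label):
        # Bind a sample encoding to its class label in memory
        # (the paper learns this binding; here it is stored directly).
        self.keys.append(key)
        self.labels.append(label)

    def read(self, query):
        # Content-based addressing: softmax over cosine similarities,
        # then predict the label with the highest total attention weight.
        sims = [cosine(query, k) for k in self.keys]
        m = max(sims)
        weights = [math.exp(s - m) for s in sims]
        total = sum(weights)
        scores = {}
        for w, lab in zip(weights, self.labels):
            scores[lab] = scores.get(lab, 0.0) + w / total
        return max(scores, key=scores.get)

    def wipe(self):
        # Cleared between tasks so bindings do not leak across episodes;
        # in the real model, slowly trained weights carry background knowledge.
        self.keys.clear()
        self.labels.clear()

# Usage: bind two samples to classes, then classify a nearby query.
mem = ExternalMemory()
mem.write([1.0, 0.0], "A")
mem.write([0.0, 1.0], "B")
print(mem.read([0.9, 0.1]))  # closest stored sample belongs to class "A"
```

The design point this illustrates is why memory helps one-shot learning: adding a new class requires only one `write`, whereas encoding the same binding in network weights would require many gradient steps.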