There are two challenging problems in applying standard Deep Neural Networks (DNNs) to incremental learning with few examples: (i) DNNs do not perform well when little training data is available; (ii) DNNs suffer from catastrophic forgetting when used for incremental class learning. To address both problems simultaneously, we propose Meta Module Generation (MetaMG), a meta-learning method that enables a module generator to rapidly produce a category module from a few examples, allowing a scalable classification network to recognize a new category. Old categories are not forgotten after new categories are added. Comprehensive experiments on 4 datasets show that our method is promising for fast incremental learning in the few-shot setting. Further experiments on the miniImageNet dataset show that, even though it is not specifically designed for the N-way K-shot learning problem, MetaMG can still perform relatively well, especially in the 20-way K-shot setting.