Facebook AI Open-Sources RAG, An Innovation in Intelligent NLP Models

Facebook AI, in collaboration with Hugging Face, has open-sourced a natural language processing model known as RAG (Retrieval-Augmented Generation). RAG lets NLP models bypass the retraining step, access and draw on up-to-date information, and then use a state-of-the-art seq2seq generator to output the results. The result is an NLP model that researches and contextualizes its answers, as opposed to the more traditional, general-purpose NLP model. This innovation is an important step in teaching computers to understand how to write and speak like a human.

RAG allows researchers and engineers to quickly develop and deploy solutions to their knowledge-intensive tasks with as little as five lines of code. It is available as a component of the Hugging Face Transformers library and integrates with the new Datasets library, which supplies RAG’s indexed knowledge source. Given an input sequence, RAG retrieves a relevant set of documents from a source such as Wikipedia and conditions its output on them, as sketched in the example below.

Model card: https://huggingface.co/facebook/rag-token-nq
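
For illustration, here is a minimal sketch along the lines of the usage example on the model card above; the exact calls may differ between versions of the Transformers library, and the dummy index is used only to keep the example lightweight.

```python
from transformers import RagTokenizer, RagRetriever, RagTokenForGeneration

# Load the pretrained RAG components released by Facebook AI.
tokenizer = RagTokenizer.from_pretrained("facebook/rag-token-nq")

# The retriever needs the `datasets` and `faiss` packages; use_dummy_dataset=True
# downloads a small toy index instead of the full Wikipedia index.
retriever = RagRetriever.from_pretrained(
    "facebook/rag-token-nq", index_name="exact", use_dummy_dataset=True
)
model = RagTokenForGeneration.from_pretrained("facebook/rag-token-nq", retriever=retriever)

# Ask a knowledge-intensive question: the retriever fetches supporting documents
# and the seq2seq generator produces the answer conditioned on them.
inputs = tokenizer("who holds the record in 100m freestyle", return_tensors="pt")
generated = model.generate(input_ids=inputs["input_ids"])
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```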

RAG uses a form of “late fusion” to integrate knowledge from the retrieved documents: it makes individual answer predictions for each document-question pair and then aggregates those per-document scores into the final prediction. RAG’s performance improves further when it has access to documents containing clues to the solution.
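
To make the late-fusion idea concrete, here is a small hypothetical sketch; the document scores, answer probabilities, and variable names are made up purely for illustration, not taken from the RAG implementation.

```python
import torch

# Made-up retriever scores for three retrieved documents.
doc_scores = torch.tensor([2.1, 0.7, -0.3])
doc_probs = torch.softmax(doc_scores, dim=0)  # p(document | question)

# Made-up per-document probabilities that a candidate answer is correct,
# i.e. p(answer | question, document) for each retrieved document.
answer_given_doc = torch.tensor([0.90, 0.40, 0.05])

# Late fusion: marginalize over documents, weighting each per-document
# prediction by how relevant the retriever thinks that document is.
final_prob = torch.sum(doc_probs * answer_given_doc)
print(float(final_prob))
```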

Facebook also explored RAG’s strength on knowledge-intensive natural language generation by having it produce questions inspired by Jeopardy. Owing to RAG’s ability to synthesize responses from disparate pieces of information drawn from multiple sources, the Jeopardy questions it generated were more diverse and factual than those from comparable models. The company plans to scale RAG to multimodal settings and to operate over multiple knowledge sources at once.

Source: https://ai.facebook.com/blog/retrieval-augmented-generation-streamlining-the-creation-of-intelligent-natural-language-processing-models/

Paper: https://arxiv.org/pdf/2005.11401.pdf
