IBL News | New York
New York-based Hugging Face, a startup first known for a 2017 app that let users chat with an artificial digital friend, recently open-sourced its natural language processing (NLP) library, Transformers. The framework has been a massive success, with over a million downloads and more than 1,000 companies using it, including Microsoft’s Bing.
Transformers can be leveraged for text classification, information extraction, summarization, text generation, and conversational artificial intelligence.
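As a minimal sketch of one of those tasks, the library's `pipeline` helper wraps a pretrained model behind a one-line interface. This assumes the `transformers` package and a backend such as PyTorch are installed; the first call downloads a default model.

```python
# Sentiment classification with the Transformers pipeline API.
# Assumes `transformers` plus a backend (e.g. PyTorch) are installed;
# the default model is downloaded on first use.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Hugging Face's Transformers library is remarkably easy to use.")
print(result)
```

The same `pipeline` entry point covers other tasks the library supports, such as summarization and text generation, by changing the task name.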
On Tuesday, Hugging Face, a company with just 15 employees, announced the close of a $15 million funding round, which adds to a previously raised $5 million.
The round, which will fund a tripling of Hugging Face’s headcount in New York and Paris and the release of new software libraries, was led by Lux Capital, with participation from Salesforce chief scientist Richard Socher and OpenAI CTO Greg Brockman, as well as Betaworks and A.Capital.
“Tech giants are not taking a truly open-source approach on NLP, and their research and engineering teams are totally disconnected,” Hugging Face CEO Clément Delangue told VentureBeat.
“On one hand, they provide black-box NLP APIs — like Amazon Comprehend or Google APIs — that are neither state-of-the-art nor flexible enough. On the other hand, they release science open source repositories that are extremely hard to use and not maintained (BERT’s last release is from May and only counts 27 contributors).”