In natural language processing, foundation models such as Google's BERT have driven rapid progress. A foundation model (FM) is a model trained on large-scale data that can be adapted to a wide variety of tasks. However, making full use of such models is not easy, and developing them separately in individual departments would be wasteful.
Therefore, we are developing AutoFM, an automated platform for the training and inference of foundation models, on our in-house AI platform.
Specifically, AutoFM makes it possible to train models without writing any code, simply by preparing training data, and we also promote the sharing of a foundation model trained on large-scale in-house text.
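As an illustration of the "prepare training data only" workflow, a user might supply labeled examples in a simple tabular file like the following. This format is a hypothetical sketch, not the platform's actual specification:

```
query	label
weather tomorrow tokyo	weather
cheap sneakers sale	shopping
buy wireless earphones	shopping
```

The platform would then handle tokenization, fine-tuning of the shared foundation model, and deployment without further coding by the user.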
We describe a use case of AutoFM at Yahoo, focusing on the classification of search queries.
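To make the search-query classification task concrete, the sketch below trains a tiny naive Bayes classifier over word counts on a handful of labeled queries. This stands in for the real system only: in production a fine-tuned BERT-style foundation model would do the classification, and all names, queries, and labels here are invented for illustration.

```python
from collections import Counter, defaultdict
import math

# Hypothetical miniature of the query-classification task.
# Labeled training queries a user might prepare (invented examples).
train = [
    ("weather tomorrow tokyo", "weather"),
    ("rain forecast osaka", "weather"),
    ("cheap sneakers sale", "shopping"),
    ("buy wireless earphones", "shopping"),
]

def tokenize(query):
    return query.lower().split()

# Count word frequencies per label.
word_counts = defaultdict(Counter)
label_counts = Counter()
for query, label in train:
    label_counts[label] += 1
    word_counts[label].update(tokenize(query))

vocab = {w for counts in word_counts.values() for w in counts}

def classify(query):
    """Return the most probable label under naive Bayes with add-one smoothing."""
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        # Log prior for the label.
        score = math.log(label_counts[label] / len(train))
        total = sum(word_counts[label].values())
        # Add smoothed log likelihood of each query word.
        for w in tokenize(query):
            score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

print(classify("forecast for tokyo"))  # → weather
print(classify("sneakers discount"))  # → shopping
```

A foundation-model classifier generalizes far beyond this word-count baseline, but the interface is the same: labeled query-label pairs in, a predicted category out.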