Google TW-BERT Demonstrates Improvements On Search

A Google research paper describes TW-BERT (Term Weighting Bidirectional Encoder Representations from Transformers), a new framework that improves search rankings without requiring major changes to existing systems, because it integrates with existing query expansion models.

BERT (Bidirectional Encoder Representations from Transformers) is a family of AI language models. TW-BERT learns to predict the weight for individual n-grams …
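To make the idea concrete, here is a minimal illustrative sketch of n-gram term weighting in query-document matching. The weights, query, and scoring function below are hypothetical placeholders, not the paper's actual model or values; they only show how per-n-gram weights could change which documents score highest.

```python
from collections import Counter

def ngrams(tokens, n):
    """Return the list of n-grams (as tuples) in a token sequence."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def weighted_match_score(document, weights):
    """Score a document by summing, for each weighted n-gram, its
    (hypothetical) learned weight times its frequency in the document."""
    doc_tokens = document.lower().split()
    doc_counts = Counter(ngrams(doc_tokens, 1) + ngrams(doc_tokens, 2))
    return sum(w * doc_counts[gram] for gram, w in weights.items())

# Hypothetical weights a term-weighting model might assign for the
# query "nike running shoes": the bigram matters more than the unigrams.
weights = {
    ("nike",): 0.4,
    ("running",): 0.2,
    ("shoes",): 0.3,
    ("running", "shoes"): 0.9,
}

doc = "nike makes running shoes and running apparel"
print(weighted_match_score(doc, weights))  # 0.4 + 0.2*2 + 0.3 + 0.9 = 2.0
```

A document that matches the weighted bigram "running shoes" outscores one that only matches the individual words, which is the kind of behavior learned term weights are meant to produce.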

