Microsoft announced in a blog post titled “Bing delivers its largest improvement in search experience using Azure GPUs” that its search engine Bing has been using BERT-style transformer models since April 2019.
“Recently, there was a breakthrough in natural language understanding with a type of model called transformers (as popularized by Bidirectional Encoder Representations from Transformers, BERT). Unlike previous deep neural network (DNN) architectures that processed words individually in order, transformers understand the context and relationship between each word and all the words around it in a sentence.
Starting from April of this year, we used large transformer models to deliver the largest quality improvements to our Bing customers in the past year. […] Our search powered by these models can now understand the user intent and deliver a more useful result. More importantly, these models are now applied to every Bing search query globally making Bing results more relevant and intelligent.”
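To illustrate the contextual mixing the quote describes, the sketch below shows a toy self-attention step (the core operation of transformers) in plain NumPy. This is only an illustration of the general mechanism, not Bing’s actual model: the embeddings are random, and real systems add learned projections, multiple heads, and many layers.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Each row of X is one word's embedding. The output row for each
    word is a weighted mix of ALL words, so every word's representation
    depends on its full context, not just the words before it."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)    # similarity of every word pair
    weights = softmax(scores)        # attention weights; each row sums to 1
    return weights @ X               # context-aware word representations

# toy "sentence" of 4 words with 8-dimensional embeddings
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
out = self_attention(X)
print(out.shape)  # one contextualized vector per word
```

Unlike a left-to-right model, nothing here privileges word order: each word attends to all others at once, which is the “bidirectional” context BERT’s name refers to.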
Further information on how Google uses BERT can be found here. According to Google, web pages cannot be specifically optimized for BERT.