Short answer: no. If they did, every one of the growing number of LLMs would need to maintain its own massive web index and constantly store and refresh huge amounts of page data. That is not realistic, and it is not how an LLM works.
It helps to separate what a search engine is from what an LLM is. A search engine crawls the web, builds an index of pages, and retrieves documents that match a query. An LLM is different: it is trained on vast amounts of text and learns to predict the next word from the context that came before. By itself, it has no fresh data and no constantly updated view of the internet.
When an LLM provides up-to-date answers, it is usually because it is connecting to external web sources. To do that, it can break your prompt into multiple searches, fetch information from the web, and then combine that retrieved data with its language capabilities to generate a response. That retrieval step is where query fan-out comes in.
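To make that concrete, here is a minimal sketch of what query fan-out might look like in Python. The `generate_subqueries` and `web_search` functions are hypothetical stand-ins for a real LLM API and a real search API; they return canned data so the example runs on its own.

```python
import asyncio

async def generate_subqueries(prompt: str) -> list[str]:
    # Stand-in: a real system would ask an LLM to decompose the prompt
    # into narrower searches. Here we fake two sub-queries.
    return [f"{prompt} latest news", f"{prompt} background"]

async def web_search(query: str) -> list[str]:
    # Stand-in: a real system would call a search API and return
    # snippets. Here we fake one result per query.
    return [f"snippet for: {query}"]

async def answer(prompt: str) -> str:
    # 1. Fan the prompt out into several narrower searches.
    subqueries = await generate_subqueries(prompt)
    # 2. Run those searches concurrently and collect the snippets.
    results = await asyncio.gather(*(web_search(q) for q in subqueries))
    snippets = [s for batch in results for s in batch]
    # 3. Hand the retrieved snippets back to the model as context,
    #    so the response is grounded in fresh data rather than
    #    whatever the model memorized during training.
    context = "\n".join(snippets)
    return f"LLM response grounded in:\n{context}"

if __name__ == "__main__":
    print(asyncio.run(answer("current mortgage rates")))
```

The point of fanning out is that one user question rarely maps to one good search query; several smaller, targeted searches run in parallel tend to surface better source material for the model to draw on.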