Great that you’re looking into this.
Currently, search runs entirely in the client using lunr.js. The Base docs are by far the largest documentation set, and therefore the most painful performance-wise: building the index takes several seconds. In Documenter.jl/#560 there is some discussion of improving this, for example by prebuilding the index at the cost of a bigger download, or by building it in a web worker. But especially for the Base docs it might be better to look at other options such as Algolia, since doing all this work on the client is heavy, particularly on mobile.
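To make the two options from #560 concrete, here is a rough TypeScript sketch of a search worker that keeps index construction off the main thread. The message shapes, field names, and the `search_index.json` URL are invented for illustration and are not what Documenter currently ships; the same worker could instead fetch a prebuilt, serialized index and rehydrate it with `lunr.Index.load`, trading build time for download size.

```typescript
// search.worker.ts -- illustrative only; message shapes and file names are invented.
import lunr from "lunr";

interface SearchDoc {
  location: string; // used as the document ref
  title: string;
  text: string;
}

let index: lunr.Index | undefined;

self.onmessage = async (e: MessageEvent) => {
  const msg = e.data;

  if (msg.kind === "build") {
    // Option 1: build the index in the worker, so the several-second
    // tokenization pass no longer blocks the page.
    const docs: SearchDoc[] = msg.docs;
    index = lunr(function () {
      this.ref("location");
      this.field("title");
      this.field("text");
      docs.forEach((d) => this.add(d));
    });
  } else if (msg.kind === "load") {
    // Option 2: fetch an index serialized at docs-build time
    // (bigger download, no client-side build).
    const serialized = await (await fetch(msg.url)).json();
    index = lunr.Index.load(serialized);
  } else if (msg.kind === "query" && index) {
    self.postMessage({ kind: "results", query: msg.query, hits: index.search(msg.query) });
    return;
  }

  self.postMessage({ kind: "ready" });
};
```

Either way, the main page only posts messages and renders results, so typing in the search box stays responsive even while the index is being prepared.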
On Algolia’s pricing page I see that the free Community plan allows up to 10K records/100K operations, and if you apply for the free-for-open-source Essential plan you can get more, with the standard volume of the AOS plan being 100K records and 200K operations monthly. At least for the Base docs, applying for that plan would probably be wise. No idea how quickly those limits run out, though, and if they do, will search just stop working altogether?
What you are showing is both a new search provider and a new search result UI. How tightly are these coupled? Could we also use your UI with the existing search, or vice versa?
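To make the question concrete, I am imagining an interface along these lines. None of these names exist in Documenter or in what you are showing; they are just a sketch of what "decoupled" would look like, with the current lunr search and an Algolia client both able to drive the same result UI.

```typescript
// Hypothetical shapes, only to illustrate the provider/UI split being asked about.
import lunr from "lunr";

interface SearchHit {
  title: string;
  url: string;
}

interface SearchProvider {
  search(query: string): Promise<SearchHit[]>;
}

// The existing lunr-based search wrapped behind the same interface an
// Algolia-backed provider would implement, so either could feed the new UI.
class LunrProvider implements SearchProvider {
  constructor(
    private index: lunr.Index,
    private docs: Map<string, SearchHit> // keyed by the lunr ref
  ) {}

  async search(query: string): Promise<SearchHit[]> {
    return this.index.search(query).flatMap((r) => {
      const hit = this.docs.get(r.ref);
      return hit ? [hit] : [];
    });
  }
}
```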