All Blocks Are Equal, Some More Ranked
The team at Google DeepMind published a new paper earlier this month called "Scalable In-context Ranking with Generative Models". This paper proposes a new AI ranking algorithm called BlockRank.
This paper... hurt my head. I've seen a lot of genAI posts about it on LinkedIn. Here are 13 takeaways from the migraine.
Key learnings about BlockRank
- BlockRank makes advanced semantic ranking up to 4.7x faster (and therefore more affordable at scale)
- The performance improvements come from scaling the "deep read" of documents from quadratic to linear complexity
- Think of deep reading as looking at the whole page of a textbook rather than just a sentence. There might be illustrations or charts that help improve your understanding. A callout might reinforce an insight by presenting it in a new way. These elements work together
- Deep reading is made possible by an emerging strategy called In-context Ranking (ICR)
- In-context Ranking (ICR) is a framework where an item’s relevance is determined by both its features and the context provided through co-occurring items or demonstration examples
- BlockRank prompts the model with the task instructions, the candidate documents, and the search query (a minimal prompt sketch follows this list)
- ICR holds the promise of considering the query and all candidates simultaneously while performing relevance judgements
- Also known as Blockwise In-context Ranking, BlockRank is a novel method that adapts the attention operation in an LLM. The attention mechanism allows the model to selectively focus on the most relevant parts of the input text when generating a response, dynamically assigning different levels of importance (weights) to each token based on context (see the toy attention-mask sketch after this list).
- It is a proposed technology, which means it isn't yet in play. Keep in mind that MUM went from proposed technology to live in SERPs in less than a year, and that was before the AI race
- This matters because faster ranking means less power usage. Power grid demand is a critical blocker for many big AI promises. Google has a monetary incentive to implement BlockRank, meaning we likely have less than a year before its production rollout.
- The paper's authors are working on converting their internal code to an openly supported codebase and expect it to be available in a week or so
- This is not the same as "valuing keywords over meaning". Some folks on LinkedIn are clearly asking untrained models to spew expert insights. It is not 2013. Get back in your shame closet.
- If you were hiding in the shame closet because you're overwhelmed or afraid you're not smart enough, get out of there. That's not for you. Deep breath. Learn five minutes a day. Be discerning on your sources. I'm rooting for you.
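For the code-inclined, here is a minimal sketch of what an ICR-style prompt could look like, following the structure described above: task instructions, then candidate documents, then the query. The function name, delimiters, and example strings are my own illustration, not the paper's actual code.

```python
# Illustrative sketch of an In-context Ranking (ICR) prompt.
# The layout follows the paper's description (instructions, candidates,
# query); the exact formatting here is a hypothetical stand-in.

def build_icr_prompt(instruction: str, documents: list[str], query: str) -> str:
    """Assemble one prompt containing every candidate at once."""
    doc_blocks = "\n".join(f"[{i + 1}] {doc}" for i, doc in enumerate(documents))
    return f"{instruction}\n\nCandidates:\n{doc_blocks}\n\nQuery: {query}"

prompt = build_icr_prompt(
    instruction="Rank the candidate passages by relevance to the query.",
    documents=[
        "BlockRank restructures attention for efficient in-context ranking.",
        "A sourdough recipe with a long, cold overnight proof.",
    ],
    query="How does BlockRank make LLM ranking faster?",
)
print(prompt)
```

Because the model sees the query and every candidate in a single context, it can compare documents against each other instead of scoring each one in isolation. That is the "deep read" described in the takeaways above.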
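And here is a toy numpy sketch of the blockwise attention idea: each document block attends only to itself and the shared instructions, while the query block attends to the full context. The block layout and sizes are simplified assumptions for illustration, not the paper's implementation, but they show why cost grows roughly linearly with the number of documents instead of quadratically with total context length.

```python
import numpy as np

# Toy blockwise attention mask (a simplified assumption of the BlockRank
# idea, not the paper's code). Sequence layout:
#   [instruction tokens][doc 1][doc 2]...[query tokens]
# Dense attention fills the whole matrix (quadratic in sequence length);
# here, inter-document entries stay zero, so the number of attended
# entries grows roughly linearly with the number of documents.

def block_mask(instr_len: int, doc_lens: list[int], query_len: int) -> np.ndarray:
    total = instr_len + sum(doc_lens) + query_len
    mask = np.zeros((total, total), dtype=bool)

    # Instruction tokens attend among themselves.
    mask[:instr_len, :instr_len] = True

    # Each document block attends to the instructions and to itself only.
    start = instr_len
    for length in doc_lens:
        end = start + length
        mask[start:end, :instr_len] = True   # doc -> instructions
        mask[start:end, start:end] = True    # doc -> its own block
        start = end

    # Query tokens attend to everything (the deep read across candidates).
    mask[start:, :] = True
    return mask

m = block_mask(instr_len=4, doc_lens=[6, 6, 6], query_len=4)
print(f"dense entries: {m.size}, blockwise entries: {int(m.sum())}")
# dense entries: 676, blockwise entries: 300 -- the gap widens as you add docs
```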
If your head doesn't hurt from that quick summary and you're deeply nerdy, check out librarian-bot's recommended reading on Hugging Face. If that hurt your head but you're still holding on, check out Search Engine Journal's 6-minute summary.
Resources
Given that we likely have less than a year before this technology is in production, here are some resources to help you plot, scheme, and ponder how this may impact your search strategies:
- Be Multi-Modal Ready: Make Your PDPs, Products & Packaging Machine-Readable
- Making all content machine-readable: GEO beyond text optimization
- What is an attention mechanism?
- How Attention Mechanism Works in Transformer Architecture
Published on 11/3/2025 by Jamie Indigo