The deprecation of MV2 is scheduled for June 2024, with enterprise support ending in 2025 (link). One thing to keep in mind is that deprecation doesn’t mean immediate removal; if the past roadmap is anything to go by, they’ll stop accepting new MV2-based extensions in the store while helping developers gradually migrate to MV3. The other thing to keep in mind is that MV3 is still very much a ‘work in progress’ if you have a look at the latest meeting minutes (link). As noted in the first link, there is a section titled ‘June 2024 + 1-X months’, which tells me there are still features missing from MV3 that developers depend on, and suddenly removing support could cause a user backlash.
Some of the missing features include CSS support in the userScripts API and a high-precision timer, which raises questions of its own: if the user puts the computer to sleep, does the timer pause, or does it keep counting so that when the computer wakes up the time spent asleep counts as part of the timer? I hope they take their time, because many of the features under discussion are either critical to the functioning of content blockers or will make content blockers a lot more powerful. There is also the issue of the limits on static and dynamic rules, the number of rulesets, etc. (link), which appear to be open to change as more telemetry data comes in. What matters is that DNR (Declarative Net Request) works as reliably as the Safari implementation, which, if I remember correctly, sits on top of the native content-blocking API (link) that existed prior to the standardisation of declarative net request.
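For those who haven’t looked at DNR, here is a rough sketch of what a static ruleset looks like: a JSON file an extension ships and references from its manifest, with each rule declaring a condition and an action that the browser evaluates natively rather than running extension code per request. The hostname below is a made-up example:

```json
[
  {
    "id": 1,
    "priority": 1,
    "action": { "type": "block" },
    "condition": {
      "urlFilter": "||ads.example.com^",
      "resourceTypes": ["script", "image", "xmlhttprequest"]
    }
  }
]
```

Dynamic rules follow the same shape but are registered at runtime via `chrome.declarativeNetRequest.updateDynamicRules`, which is why the per-category rule limits and ruleset counts mentioned above matter so much to filter-list-based blockers.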
It is interesting to hear the rumours regarding how Apple is going to make use of AI technology in their products (link); from the looks of it, Apple’s focus isn’t on the novelty of generating images or a chatbot but rather on integrating AI into their products to improve the user experience. It’ll be interesting to see whether the rumoured Web Eraser results in Declarative Net Request becoming more powerful, so that one could theoretically create an AI model built on the existing filters and rules that adapts more quickly to websites that try to work around content-blocking rules and filters – imagine an evolving AI model. I don’t see Google ever adopting it as part of Chrome, given that it is against their business interests, but I could imagine Firefox maybe learning from it.