You're right: a simple filter algorithm that decides which bits of data to keep and which to discard would have to be adapted almost every time the site operators change the website.
But if you build the thing as a ranking algorithm based on Markov chains that assigns probabilities to bits of data, it can be made adaptive, and it can even use user signals to help it adjust. A user signal can be as simple as a try-again button the user clicks when a page doesn't render correctly.
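A minimal sketch of what that ranker could look like, assuming the scraper sees a page as a sequence of element tags (all names and the feedback mechanism here are hypothetical, not any particular library's API):

```python
from collections import defaultdict

class MarkovRanker:
    """Rank candidate data fragments by how 'page-like' their tag sequence
    looks, using transition probabilities learned from past good renders."""

    def __init__(self):
        # counts[a][b] = weighted number of times tag b followed tag a
        self.counts = defaultdict(lambda: defaultdict(float))

    def observe(self, sequence, weight=1.0):
        # Learn transitions from a page that rendered correctly.
        for a, b in zip(sequence, sequence[1:]):
            self.counts[a][b] += weight

    def score(self, sequence):
        # Average transition probability; unseen transitions score 0,
        # so layout changes degrade the score gradually instead of
        # breaking a hard-coded filter outright.
        if len(sequence) < 2:
            return 0.0
        total = 0.0
        for a, b in zip(sequence, sequence[1:]):
            row = self.counts[a]
            denom = sum(row.values())
            total += row[b] / denom if denom else 0.0
        return total / (len(sequence) - 1)

    def penalize(self, sequence, factor=0.5):
        # "Try again" feedback: decay the transitions that produced
        # the bad extraction so alternatives rank higher next time.
        for a, b in zip(sequence, sequence[1:]):
            if self.counts[a][b]:
                self.counts[a][b] *= factor
```

The point of `penalize` is the adaptive part: a try-again click doesn't rewrite any rules, it just shifts probability mass toward the alternatives.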
User signal inputs have to be grouped into user bias groups, so that if bots are used to inject noise into the ranking signals, they just cluster together in noise groups while the regular users cluster together in a human group.
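One cheap way to get those bias groups, sketched under the assumption that each signal is a per-item yes/no vote (the function and threshold are made up for illustration): merge users whose votes mostly agree, and coordinated bots end up isolated in their own group because they agree with each other but not with the humans.

```python
from itertools import combinations

def bias_groups(votes, threshold=0.7):
    """Cluster users by pairwise vote agreement.

    votes: {user: {item: bool}} - each user's yes/no signal per item.
    Returns groups of users, largest first; the big group is presumed
    human, smaller tight clusters are presumed coordinated noise.
    """
    users = list(votes)
    parent = {u: u for u in users}  # union-find forest

    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]  # path halving
            u = parent[u]
        return u

    def agreement(a, b):
        # Fraction of shared items on which a and b voted the same.
        shared = set(votes[a]) & set(votes[b])
        if not shared:
            return 0.0
        same = sum(votes[a][i] == votes[b][i] for i in shared)
        return same / len(shared)

    # Merge every pair that agrees often enough.
    for a, b in combinations(users, 2):
        if agreement(a, b) >= threshold:
            parent[find(a)] = find(b)

    groups = {}
    for u in users:
        groups.setdefault(find(u), []).append(u)
    return sorted(groups.values(), key=len, reverse=True)
```

With that split, the ranker above would only take `observe`/`penalize` updates from the largest group, so the bot clusters' signals never touch the transition weights.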
This is two technical generations before the machine-learning stuff, so it's a twenty-year-old trick.
Maybe that's doable.