YouTube Video Recommendations Lead to More Extremist Content for Right-Leaning Users, Researchers Suggest
(lettersandscience.ucdavis.edu)
They optimize recommendations largely to induce anger and rage, because those emotions are the most effective drivers of platform engagement.
Facebook does the same.
We also have no idea what measures, if any, they take to stop the system from being manipulated.
The far right could be working to ensure their content is recommended as often as possible, and if that just shows up as "engagement" or "impressions" in the stats, YouTube is unlikely to fight it with much enthusiasm.