To be fair, in that specific case it is almost certainly not YouTube directly censoring the phrase. They aren’t known to do any kind of editing like that on uploaded videos.
What is happening is that the person who uploaded that video censored themselves…because of YouTube’s policy around monetization. YouTube will demonetize videos containing certain no-no words. Part of that is YouTube itself, and part of it is advertisers demanding their ads not be placed on content they find objectionable.
Indirectly, YouTube and advertisers are censoring our content. A lot of it is also TikTok, which will ban you for no-no words. This seeps over into YouTube where something that might be fine on YouTube but is banned on TikTok gets censored anyway in case it gets clipped for TikTok.
Genuinely, the power TikTok and its advertisers have over how we communicate is pretty scary. Consider how often you hear “unalive” instead of “suicide” these days, or “pdf” (among others) instead of “pedophile.” The list goes on.
So they progressively increase the closing force if the car keeps detecting an obstruction but the owner keeps trying to close it. I can vaguely see the reasoning, but only if they aren’t confident in the frunk sensor for some reason. I mean, garage doors solved this problem ages ago without resorting to anything like that.
I wonder if Musk’s “vision-based everything” mandate applies outside of the autonomous driving features? It would make sense not to be confident in the sensor if it’s just a camera…