It appears YouTube has listened to human rights activists and freedom lovers everywhere. The site will now offer an in-house face-blurring feature that lets users blur out people's faces in their videos, and it is entirely optional.
Citing footage of children and potentially dangerous protests, Google believes it can give YouTube a layer of anonymity the service has never had before, making it a less harsh and unforgiving corner of social media.
The tool is found under Additional Features: locate the Blur All Faces option and click the Apply button below it. The process is automatic, detecting and blurring faces using facial recognition software; you will not have to, or be able to, blur faces manually.
Unfortunately, there is currently no way to select which faces to blur; it is either everyone or no one. The feature could also prove a double-edged sword, protecting those in need of anonymity while letting others get away with things they shouldn't. More controversy is sure to follow as a result of YouTube's new face-blurring feature.
YouTube says this is an emerging technology, and the automatic software may have trouble picking up faces depending on variables such as lighting, obstructions, angle and source quality. Users are advised to preview their videos first.
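For readers curious how this kind of automatic face blurring works in principle, here is a minimal sketch using OpenCV's stock Haar cascade face detector. This is not YouTube's actual implementation; the cascade file, blur strength, video codec and file names below are assumptions chosen purely for illustration.

```python
# Illustrative sketch of automatic face blurring (not YouTube's implementation).
import cv2

def blur_faces(input_path: str, output_path: str) -> None:
    # Load the frontal-face Haar cascade that ships with OpenCV.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    video = cv2.VideoCapture(input_path)
    fps = video.get(cv2.CAP_PROP_FPS)
    width = int(video.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(video.get(cv2.CAP_PROP_FRAME_HEIGHT))
    writer = cv2.VideoWriter(
        output_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height)
    )

    while True:
        ok, frame = video.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Detection quality depends on lighting, obstructions, angle and
        # source quality, which is why some faces can be missed.
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            # Blur every detected face; there is no per-face selection here,
            # mirroring the all-or-nothing behaviour described above.
            frame[y:y + h, x:x + w] = cv2.GaussianBlur(
                frame[y:y + h, x:x + w], (51, 51), 0
            )
        writer.write(frame)

    video.release()
    writer.release()

# Hypothetical file names for the example.
blur_faces("protest_clip.mp4", "protest_clip_blurred.mp4")
```

As the comments note, a detector like this can miss poorly lit or partially obscured faces, which is exactly why previewing the processed video before publishing matters.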