A YouTube spokesperson confirmed that free membership to the third-party captioning service for eligible creators will now last for a year.
In a statement to this website, they added that the extension is designed to ensure the platform’s new captions editor role is made available to channels before the subscription ends, though they are confident it will be rolled out before then.
The news comes after a comment on YouTube’s Help Centre article about the deprecation of community contributions was edited last week to include this change.
The reply now reads: “YouTube will be covering the cost of a 12 month subscription of Amara Community for all creators who have used the Community Contribution feature for at least 3 videos in the last 60 days (more details coming soon).
“Creators who don’t qualify for the subsidy are still able and encouraged to use Amara’s tools, including their free subtitle editor.”
The news also comes after YouTube’s Vice President of Product Management, Ariel Bardin, said in an unlisted video: “We’ve read a lot of the feedback and one of the areas that resonated with me was what happens at the end of the six months, especially if we don’t have these [new] features?
“Let’s say it takes you guys a year to build those features, what about the six-month thing?
“We are thinking about potentially making that period of time longer, and obviously, as we get towards the end of the six months, we’ll know where we are with these different features.”
The update was made last week, on the same day that YouTube announced a new default setting aimed at minimising mistakes in automatic captions.
The function, which does not apply to manual captions, sees “potentially inappropriate words” replaced with square brackets and two underscores (‘[ __ ]’).
In a YouTube Help Center post on the new feature, a YouTube employee said: “Because our automatic captions can make mistakes, we want to be extra careful not to caption certain words incorrectly.
“So to better avoid these mistakes, viewers will now see “[ __ ]” appear instead of a potentially inappropriate word (when auto-generated subtitles/closed captions are turned on during playback).
“Creators – if you don’t want this setting for auto-generated subtitles/closed captions on your videos, you can opt out and turn the setting off.”
It follows reports of the n-word being inserted into the automatic captions, which rely on speech recognition, when the word wasn’t said in the video itself.
UK YouTuber and podcaster Jack Dean, known online as JaackMaate, tweeted earlier this month: “Your auto captions have just picked up a racial slur, which was NEVER said, and because of this, the entire podcast has been demonetised.
“Also, it’s super unfair on our show, and our guest. Please can somebody contact me about this ASAP.”
In an exclusive interview with this website, YouTube explained that professionally captioned videos are used to train their systems, which is largely why a video’s captions may contain the n-word or other inappropriate words.
They went on to add that they are working to minimise the appearance of offensive words in captions by improving how the models are trained through ‘context biasing’ – steering the machines towards a specific context and away from potentially offensive language.
YouTube also said that the n-word has been reclaimed by some people, and that it is important to them that communities are supported when they choose to reappropriate a word.
On the issue of monetisation, a spokesperson told Liam O’Dell that while automatic captions are used in YouTube’s trust and safety process when analysing content, any errors in the subtitles do not cause high levels of demonetisation on creator videos.
The comments formed part of a wider discussion with this website, in which YouTube said they want to improve the accuracy of automatic captions and the user experience, as well as make them safer for viewers.
Other announcements included a partnership with Google Translate to improve auto-translations, fixes to the grammar of automatic captions, and an expansion into more languages – the first time YouTube has done so since 2008.
The YouTube spokesperson also revealed that the organisation had been experimenting with a new captions button on mobile.
At the moment, viewers have to select the ‘three-dot overflow’ menu on the video player to access captions on mobile – two extra taps to bring up more options and select the relevant subtitle track.
However, the platform is currently testing having a captions button present on the player itself, for viewers to turn on captions with one touch.
The experiment, along with YouTube’s other plans for automatic captions, comes just weeks before the platform looks to scrap its community contributions tool because of spam and low usage.
At the time of writing, more than 505,000 people have signed a petition calling on YouTube to reverse its decision, before the feature is due to be taken offline on 28 September.
Update – 18.09.20 – 18:45 (GMT): Commenting on the announcement of the new automatic captions setting, Deaf activist Andrew Parsons tweeted: “Oh wow, YouTube enforcing censoring swear words in the automatic captions feels more infantilizing than non-disabled parents pushing back against caption users complaining to Netflix about censorship in PG-13+ TV shows.
“One video I know the person uses F-bomb regularly and automatic captions is “[___]”, “[____] off”, “[___] ton”, and “[___] ing” all over the place. There is a huge difference between censoring yourself, and someone else doing the censoring.”
They concluded by telling YouTube that their new setting “is not a good feature”.
Meanwhile, deaf campaigner and content creator Rikki Poynter added: “Next YouTube title: YouTube thinks us deaf people are a bunch of babies.”
In a series of comments made to this website on background, a YouTube spokesperson did not say whether a specific list of words was being automatically replaced by the feature.
Instead, they said that their automatic speech recognition engine is used to identify a range of words which may be considered potentially inappropriate, and not just swear words.
When presented with Andrew’s tweets about the feature feeling “infantilizing”, the spokesperson responded by stressing that the tool is being implemented to prevent words which are potentially inappropriate from appearing in the site’s automatic captions by mistake.
They concluded by saying that the setting is being gradually rolled out this week, and will be made available to all platform users by the end of the day on Friday.
Update – 18.09.20 – 22:00 (GMT): In a statement to this website about their involvement in the development of the new automatic captions setting, Howard A. Rosenblum, CEO of the National Association of the Deaf (NAD), said: “The [NAD] appreciates the opportunity to collaborate with Google/YouTube as they work to continually improve their automated captioning system, including ensuring the avoidance of inadvertent displays of inappropriate words while doing so in a way that is immediately apparent to deaf and hard of hearing viewers.”