Days after Facebook shut down facial recognition on its platform, Instagram has evidently been asking users for video selfies to prove their accounts are authentic, a report by The Verge reveals. To be sure, Instagram said on Twitter that these videos are reviewed by its teams and that facial recognition is not used for the process. The feature came to light after several users took to Twitter to express their surprise at being asked to verify themselves via video selfies.

Earlier this month, Facebook had announced it was stopping the use of Facial Recognition Technology (FRT), citing societal concerns. The company has previously been fined in relation to FRT-based tag suggestions for not obtaining users' permission and for not informing them how long their data would be stored.

Instagram has said that it has struggled for quite some time to gauge the prevalence of spam and fake accounts on its platform. In Meta's latest transparency report, Instagram said it could not provide numbers on the prevalence of spam and fake accounts. In the case of the former, it said that it has been detecting spam through manual methods, which it said do not 'fully capture this type of highly adversarial violation'.

When does Instagram ask for videos?

Instagram had rolled out the feature last year; according to reports, it was rolled back in the interim due to technical troubles but now appears to be back. According to Instagram, the feature kicks in if an account shows suspicious behaviour, leading the company to check if…
