Meta admits wrongly suspending Facebook Groups
Meta adds that accounts may be disabled after a single severe violation, such as posting child sexual exploitation content.
"We take action on accounts that violate our policies, and people can appeal if they think we've made a mistake," a spokesperson added.
The social media giant also told the BBC it uses a combination of technology and people to find and remove accounts that break its rules, and shares data about what action it takes in its Community Standards Enforcement Report.
In the most recent edition, covering January to March this year, Meta said it took action on 4.6m instances of child sexual exploitation - the lowest figure since the early months of 2021. The next edition of the transparency report is due to be published in a few months.
Meta says its child sexual exploitation policy relates to children and "non-real depictions with a human likeness", such as art, content generated by AI or fictional characters.
Meta also told the BBC it uses technology to identify potentially suspicious behaviours, such as adult accounts being reported by teen accounts, or adults repeatedly searching for "harmful" terms.
This could result in those accounts being barred from contacting young people in future, or being removed completely.