Apple removed Telegram from the App Store over “inappropriate content” involving children

The Cupertino giant may still be in hot water over the iPhone slowdown issue, but that hasn’t made Apple any less strict about its policies: last week, the secure messaging app Telegram was removed from the App Store over some “inappropriate content”.

Telegram, which recently planned a $1.2 billion ICO for its chat cryptocurrency, and its more streamlined counterpart, Telegram X, were both mysteriously pulled from the App Store. At first, it was unclear what kind of content had prompted this course of action. However, an email written by Phil Schiller himself has since revealed that the messaging apps were, in fact, being used to distribute child pornography.

The email, published by 9to5Mac, states that Apple was alerted to the illegal content, which prompted the company to take down the apps, inform Telegram’s developers, and notify the proper authorities, including the NCMEC (National Center for Missing and Exploited Children). Unsurprisingly, Apple did not disclose who reported the content.

Every data-sharing platform is responsible for the content its users share through its services, and it is the platform’s duty to detect and prevent the spread of such heinous media. Many social networks and sharing services employ digital protections for this, including automated systems that match uploaded files against databases of known illegal material; a rough sketch of the simplest form of such matching follows below. Telegram, evidently, had failed to prevent it in this case.
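To illustrate, here is a minimal Python sketch of the simplest form of such matching: hashing an uploaded file and checking it against a set of known-bad hashes. The `KNOWN_BAD_HASHES` set and both function names are hypothetical, and production systems (such as Microsoft’s PhotoDNA, used by NCMEC partners) rely on perceptual hashes that survive resizing and re-encoding, not the exact cryptographic hash shown here.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known illegal files. Real systems
# use perceptual hashes (e.g., PhotoDNA) that tolerate re-encoding;
# a plain cryptographic hash only catches byte-for-byte copies.
KNOWN_BAD_HASHES: set[str] = set()


def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, streaming to bound memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_illegal(path: Path) -> bool:
    """Return True if the file's hash matches the known-bad database."""
    return sha256_of_file(path) in KNOWN_BAD_HASHES
```

In practice, a platform would run a check like this (with a far more robust hashing scheme) at upload time, blocking the file and flagging the account before the content ever reaches other users.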

In any case, after being notified by Apple, Telegram’s developers acted swiftly: they reportedly banned the users who had posted the content and updated the apps with new safeguards.

As a result, the Telegram apps returned to the iOS App Store after serving only a few hours in App Store jail.

Below is Phil Schiller’s full email, via 9to5Mac:

The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).

The App Store team worked with the developer to have them remove this illegal content from the apps and ban the users who posted this horrible content. Only after it was verified that the developer had taken these actions and put in place more controls to keep this illegal activity from happening again were these apps reinstated on the App Store.

We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity. Most of all, we have zero tolerance for any activity that puts children at risk – child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral.

I hope you appreciate the importance of our actions to not distribute apps on the App Store while they contain illegal content and to take swift action against anyone and any app involved in content that puts children at risk.

For more from the tech world, stay tuned.

