The Cupertino giant may still be in hot water over the iPhone slowdown issue, but that hasn't made Apple any less strict with its policies. It recently came to light that the secure messaging service's apps – Telegram – were removed from the App Store a week back over "inappropriate content".
Telegram, which recently planned a $1.2 billion ICO for its chat cryptocurrency, and its more efficient counterpart, Telegram X, were both mysteriously pulled from the App Store. At first, it was unclear what kind of content had led to this course of action. However, an email written by Phil Schiller himself has since revealed that the messaging app was, in fact, distributing "child pornography".
The email, published by 9to5Mac, stated that Apple was alerted to the illegal content – sadly, child pornography – which led the company to take down the apps, notify Telegram's developers, and inform authorities including the NCMEC (National Center for Missing and Exploited Children). Unsurprisingly, Apple did not disclose who reported the content.
Every data-sharing platform is responsible for the content its users share through its services, and it is the platform's duty to detect and prevent the spread of such hideous media. Many social networks and sharing services use digital protections, including content-detection algorithms, to keep their networks clean. Telegram, at the time, failed at that prevention.
After being notified by Apple, Telegram's developers acted swiftly: they reportedly banned the users who posted the content and updated the apps with new safeguards.
As a result, the Telegram apps returned to the iOS App Store after serving only a few hours in jail.
Below is Phil Schiller's full email, via 9to5Mac:
“The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).
The App Store team worked with the developer to have them remove this illegal content from the apps and ban the users who posted this horrible content. Only after it was verified that the developer had taken these actions and put in place more controls to keep this illegal activity from happening again were these apps reinstated on the App Store.
We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity. Most of all, we have zero tolerance for any activity that puts children at risk – child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral.
I hope you appreciate the importance of our actions to not distribute apps on the App Store while they contain illegal content and to take swift action against anyone and any app involved in content that puts children at risk.”
For more on the tech world, stay tuned.