In a development that has thrust the contentious terrain of platform governance into the spotlight, Google recently announced it will purge low-quality Android apps next month to improve the Play Store’s quality and user experience.
This decision has sparked a range of reactions from users. Some see it as a necessary step to maintain the quality of the Play Store, while others worry about the impact on various apps and potential censorship. Amid this backdrop, Google’s latest transparency report touts a staggering 2.28 million apps expelled from its digital ecosystems over the preceding year. This eye-catching statistic, a surge from the 1.43 million removals cataloged in the prior reporting cycle, projects an aura of decisive action to safeguard user interests.
Yet lurking beneath these top-line figures lies a more nuanced reality begging scrutiny. As digital platforms grow in size and influence, mere numbers alone provide an incomplete picture, disconnected from the substantive policy discussions and fact-based assessments needed to address the challenges facing our online spaces. Are these app removals representative of a success story of an “investment in new and improved security features, policy updates, and advanced machine learning and app review processes,” as Google put it, or are we rather observing another case where Google uses its influence and high market share to dictate the global app environment?
Questioning the Validity of the Numbers
We can already observe a notable trend in how major media outlets have covered Google’s app removal statistics—many reports simply restate the figures provided without in-depth analysis or critical examination, potentially lending them a degree of credibility disconnected from substantive scrutiny.
For instance, one article headlines with “Google Prevented 2.28 Million Malicious Apps from Reaching Play Store in 2023” before proceeding to list the official data breakdowns provided by Google. Similarly, CNBC TV18’s coverage leads with the figure of more than 2.2 million apps blocked from the Play Store in 2023, without further exploration. While efficiently conveying the top-line figures, this style of straightforward reporting omits deeper context around the nature and implications of these removals.
The lack of critical examination of these figures enables Google to promote a self-laudatory narrative unchallenged. Amid the 59% spike in removed apps, Google’s transparency report provides scant new information illuminating the drivers behind the increase. The company offers little insight into whether the escalation stems from proliferating threats across the digital ecosystem or from heightened stringency in its own enforcement protocols. Details on the nature of these threats and the specific protective measures taken remain sparse. This raises the concerning possibility that the removal figures may not represent proportional progress in addressing digital vulnerabilities, but simply an increase in the stringency with which Google polices its Play Store environment.
Moreover, the absence of detailed data on the nature of the removed apps is another cause for concern. The language in both the transparency report and media coverage simply describes ‘malicious’ or ‘privacy-violating’ apps, without a full breakdown of which categories the removed apps fall into or the reasons for their removal. This, along with the lack of independent verification of the figures, allows Google to avoid any real accountability to the public for its unilateral decisions about which apps are approved and which are not.
Google is clearly not the only firm getting away with this: Apple presents a similar problem, deliberately foregrounding data of little relevance in its Transparency Reports and thereby obscuring what actually happens inside the App Store.
To help curb this and foster a culture of public accountability around app removals, news media should apply the same scrutiny to these tech companies’ reports that they apply to government figures on issues such as immigration. Independent analysis of government data has shown that increased immigration is linked to economic growth and job creation, contradicting anti-immigration arguments that it harms the economy and keeps US workers out of jobs. The impact of that reporting testifies to the importance of the media using rigorous inquiry, and their influence over public opinion, to overturn a long-held narrative. The same scrutiny should be applied to the figures in big tech firms’ reports.
Furthermore, when those in power cite false figures, watchdog organizations swiftly call them out, allowing civil society to remain active in holding its leaders accountable. In a similar vein, watchdog projects that collect their own statistics, such as AppleCensorship, are pivotal in reining in otherwise sparsely accountable big tech, especially given the visible absence of any independent statistical office for a company like Google.
Understanding the Mass Removals and Rejections
Google has rejected or remediated around 200,000 app submissions “to ensure proper use of sensitive permissions such as background location or SMS access,” according to a company blog post. While Google’s Developer Policy Center does list its guidelines for accepting new apps, the issue lies in the subjective interpretation of some of these criteria, such as what falls under the umbrella of ‘inappropriate content.’
This ambiguity allows Google to refuse an app at its own discretion, citing a vague reason or guideline. As the company itself states, “removal or administrative notices may not indicate each and every policy violation present in your account, app, or broader app catalog.” Further, Google warns that “repeated or serious violations” will result in “termination” of the associated Google Play Developer account, without specifying how many violations, or of what severity, cross that threshold.
This clearly leaves far too much room for interpretation on the side of Google, which holds all the cards vis-à-vis the developer. Transparency and fairness are therefore not an inherent part of the process when it comes to app removals, refusals, and appeals.
The Shadow of Censorship
The ambiguity with which Google exercises app takedowns becomes all the more insidious in the context of political app removals, most recently in the case of the Boycat app, which Google said it removed over misinformation concerns. Another recent example is the NoThanks app, temporarily suspended in 2023 because its description referenced the Israel-Hamas conflict. Both cases showcase Google’s inclination toward hasty and arbitrary action, taken without explanation or transparency.
These cases exemplify the thin line between security concerns and outright censorship. With digital app stores like Google’s Play Store wielding considerable power over what apps can reach consumers, their decisions on censorship and app removals have far-reaching implications. This underscores the need for clear, unbiased, and transparent enforcement of security policies that respect user rights and promote a diverse digital environment.
Critical Approach vs. Lack of Transparency
Without a comprehensive account of what is being taken down and detailed reasoning behind it, opaque figures and data points do not allow the press to critically analyze these companies’ curation policies. Such figures should therefore not be reported at face value.
It is paramount for media outlets to adopt a critical approach toward big tech’s figures, especially in the absence of transparency from these companies. This critical approach is essential to truly shed light on their processes and the nature of the apps removed or rejected from their respective app stores.
This is precisely what drives GreatFire and its App Censorship Project: offering a deeper, more comprehensive portrait of the data and information surrounding big tech’s practices, one that puts the companies’ own figures into question. Given these companies’ monopolistic grip on the digital sphere, consistently questioning their practices and demanding greater transparency is the only way to restore public scrutiny proportional to big tech’s power. To achieve this, however, the companies must be compelled to remain open through regulations such as the EU’s Digital Services Act, which could be built upon to ensure compliance from Google and Apple and provide a solid foundation from which the media can assess the content and app censorship in which they engage.
References
- Google Transparency Report. Retrieved from https://transparencyreport.google.com/?hl=en.
- Google Security Blog. “How We Fought Bad Apps and Bad Actors in 2023.” April 2024. Retrieved from https://security.googleblog.com/2024/04/how-we-fought-bad-apps-and-bad-actors-in-2023.html.
- The Hacker News. “Google Prevented 2.28 Million Malicious Apps from Reaching Play Store in 2023.” April 2024. Retrieved from https://thehackernews.com/2024/04/google-prevented-228-million-malicious.html.
- CNBC TV18. “Google Blocked Apps for Play Store Policy Violations in 2023.” Retrieved from https://www.cnbctv18.com/technology/google-blocked-apps-play-store-policy-violation-2023-19404419.htm.
- Android Police. “Google Play Store Malicious Apps 2023 Report.” Retrieved from https://www.androidpolice.com/google-play-store-malicious-apps-2023-report/.
- Anderson, Stuart. “US Job Numbers Show Immigration Opponents Wrong About the Economy.” Forbes, April 8, 2024. Retrieved from https://www.forbes.com/sites/stuartanderson/2024/04/08/us-job-numbers-show-immigration-opponents-wrong-about-the-economy/.
- The Guardian. “Sunak Used Incorrect Asylum Backlog Figures, Statistics Watchdog Says.” March 25, 2023. Retrieved from https://www.theguardian.com/uk-news/2023/mar/25/sunak-used-incorrect-asylum-backlog-figures-statistics-watchdog-says.
- The Guardian. “Boris Johnson Repeatedly Used Inaccurate Child Poverty Figures.” July 30, 2020. Retrieved from https://www.theguardian.com/politics/2020/jul/30/boris-johnson-repeatedly-used-inaccurate-child-poverty-figures.
- European Commission. “Digital Services Act: Transparency Reports – Detailed Rules and Templates.” Retrieved from https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/14027-Digital-Services-Act-transparency-reports-detailed-rules-and-templates-_en.
- Google Play Store. Boycott Pro. Retrieved from https://play.google.com/store/apps/details?id=com.bashsoftware.boycott&hl=fr.
- Google Play Support. “Suspensions.” Retrieved from https://support.google.com/googleplay/android-developer/answer/2477981?hl=en#zippy=%2Csuspensions.
- Google Play Support. “Examples of Violations.” Retrieved from https://support.google.com/googleplay/android-developer/answer/9878810#zippy=%2Cexamples-of-violations.
- Google Play Developer Policy. Retrieved from https://play.google/developer-content-policy/.
- Newsweek. “Google Removes App Boycott Pro-Israel Companies.” Retrieved from https://www.newsweek.com/google-removes-app-boycott-pro-israel-companies-1848904.