Corporate Colonization, Geopolitical Power Struggles, and Hypernudge - How Social Media Engineers Minds

Neuhaus, Till and Curley, Lee J. (2024). Corporate Colonization, Geopolitical Power Struggles, and Hypernudge - How Social Media Engineers Minds. In: Shei, Chris and Schnell, James eds. The Routledge Handbook of Language and Mind Engineering. Abingdon: Routledge, pp. 169–182.




Emerging media technology has often been suspected of having a corrosive effect on society, and especially on the youth. While such concerns could, at least historically speaking, be pushed aside as hysteria or unreasonable fears, new technologies and (social media) networks – powered by artificial intelligence (AI) and almost omni-available via smartphones – may actually be a game changer in that regard. While these technological novelties and the associated platforms have been suspected of being the root cause of a range of undesirable dynamics, one commonly articulated concern is that, through the exposure of minds to specific applications, human behavior may be altered in predictable ways – a process which could be labeled as mind engineering. Such mind engineering is suspected of being conducted by extremist groups in attempts to recruit new followers (cf. Weimann and Masri 2020), by the platforms themselves to prolong usage time or, as the Cambridge Analytica scandal (cf. Berghel 2018) and the suspected meddling in the 2016 presidential election (cf. Gavra and Slutskiy 2021) have shown, by domestic and foreign political actors. Such a perspective is supported by the fact that India has banned the China-based application TikTok, and then-president Trump threatened to do the same in the USA (cf. Kuhn 2020) – a threat which, as of 2023, is being negotiated in the federal and state-level judicial realms, so far without a conclusive result. Tentatively summarizing, it can be argued that there is something angst-inducing in the ways social media and the employed technologies are suspected to change user behavior and thought, such as the overconsumption of certain apps (cf. Fasoli 2021), their addictive potential, and the personal and societal side effects thereof (cf. Worsley et al. 2018). Because the algorithms employed by Facebook, TikTok, and others can be considered black boxes (cf. Rahwan et al. 2019), and because of the novelty of the phenomena as such, this field is not just legally underregulated but also scientifically underinvestigated.

This chapter addresses the topic of mind engineering via social media applications by presenting two cases in which applications employed techniques that could qualify as mind engineering (section “The Application of Hypernudges – The Cases of Facebook and TikTok”), yet with different motives. The first is Facebook and its attempts to prolong screen time for economically motivated reasons (section “The Attentional Merchants at Facebook”); the second is TikTok, which operates with similar mechanisms, yet also with a potentially geopolitical notion (section “TikTok and the Educational Cold War, Revisited”). Taken together, these cases provide a relatively clear picture of what social media applications are able to do regarding minds and behaviors. Yet, before discussing these cases in depth, this chapter lays out its theoretical presuppositions (section “From Nudge to Hypernudge”) by illustrating the school of thought known as nudge (Thaler and Sunstein 2017) (section “A Very Short Introduction to Nudging”). Nudges can be considered consciously made changes in decision architectures – all aspects relevant in a decision context – which contribute to a predictable change in behavior (cf. ibid.: 15), a fact which makes nudging potentially compatible with the field of mind engineering. These psychologically informed interventions have already been employed by governments globally (cf. Neuhaus and Curley 2022) to nudge citizens in the “right” direction and, as Brodmerkel (2019) showed, are also regularly employed by the private sector. This chapter aims at systematically connecting insights from the field of nudging with the decision architectures set up by social media companies. To that end, the proposed nudge lens is expanded by the concept of hypernudge (Yeung 2017), which combines nudging’s psychological insights with big data and AI (section “Big Data, Psychological Insights, and Hypernudging”). The chapter ends with a summary and reflection on key insights.
