Digital Colonialism in Taiwan: The Crisis of Subjectivity under Dual Technological Violence from the U.S. and China

Article by He, Zixuan

Abstract:

In an age of accelerated global digitalization and intensifying geopolitical conflict—a conjuncture of “polycrisis”—colonialism reemerges as a digital spectre, haunting the world and materializing as the infiltration and control of sovereign nations by transnational technological power. Yet current discussions, often preoccupied with the influence of a single hegemonic power, seldom interrogate how particular nations, especially those precariously positioned along geopolitical fault lines, simultaneously endure heterogeneous pressures from multiple external digital powers. This study accordingly turns to the case of Taiwan, which faces a crisis of subjectivity under the digital domination of both the United States and China. Through an analysis of the operational logics and violent effects of this dual digital power regime, this paper reveals how Taiwan’s public sphere and civic subjectivity have been insidiously and profoundly eroded. The core contention involves two distinct forms of technological domination. The United States imposes symbolic violence through market mechanisms that extract data, shape consumer culture, and discipline everyday life. Meanwhile, China deploys epistemic violence as part of a digital united-front strategy oriented toward political unification, utilizing information manipulation and cognitive warfare to undermine Taiwan’s democratic processes and national identity. The study finds that these two heterogeneous forms of technological violence converge in Taiwan to produce a destructive synergistic effect: the disintegration of the public sphere, the disorientation of individual identity, and the erosion of social trust and cohesion, thereby precipitating a severe crisis of subjectivity. The theoretical contribution of this inquiry lies in distinguishing and interrelating these two modes of power—symbolic violence and epistemic violence—thereby deepening our understanding of digital colonialism. Furthermore, by using Taiwan as an exemplar, it unveils novel forms of colonial domination emergent in the digital age and their threats to the Global South and small democracies, while also advancing preliminary strategies for decolonization. Through these analyses, the study seeks to broaden the critical horizon on digital colonial rule and to offer insights for other countries and societies in analogous predicaments.

Keywords: digital colonialism, symbolic violence, epistemic violence, polycrisis, subjectivity, violence and technology

Header image “Digital Colonialism in Taiwan” by the author is an AI-generated image created using ChatGPT o3.


I. Introduction 

We inhabit a moment defined by what might be termed a “polycrisis”—a period where escalating geopolitical conflicts, democratic backsliding, and the pervasive influence of digital technologies converge to produce multiple, mutually reinforcing crises (Tooze, 2021). Within this conjuncture, the colonial power relations of an earlier moment have not simply vanished but have undergone a dialectical transformation, reconstituting themselves within the digital sphere as what we must now recognize as “digital colonialism” (Kwet, 2019). This new form of domination operates not through the direct occupation of physical territory but through the control of data, networks, and the cognitive frameworks that shape social reality.

Taiwan, as an active democratic polity, serves as a compelling case study of this emergent phenomenon. On one hand, its highly digitized social structure embeds it deeply within global technological networks; on the other, its unique geopolitical positioning—perpetually situated at the frontline of Chinese influence projection while simultaneously subjected to the overwhelming dominance of US-led global digital ecosystems—makes it a critical site for observing this new form of digital colonial subjugation. Taiwan’s predicament cannot be reduced to being merely “caught between two superpowers.” Rather, it is acted upon concurrently by two distinct types of digital power, each functioning under its own rationale and producing its own manifestations of violence. The distinction between these two modes rests not merely on their origin but on their operational logics and effects. Characterizing the US’s impact as symbolic violence and China’s as epistemic violence illustrates that Taiwan is under attack on both its socio-cultural framework and its cognitive-existential base.

The first is a US-led digital capitalism that utilizes market mechanisms for data extraction and algorithmic governance, thereby inflicting a covert and ostensibly mild form of “symbolic violence” upon Taiwanese society (Bourdieu & Wacquant, 1992), which subtly shapes cultural norms while perpetuating economic dependency. The second is a form of cognitive warfare originating from China, motivated by the stated political purpose of integrating Taiwan under a singular narrative. This operation utilizes information manipulation and AI-generated content to perpetrate “epistemic violence” against Taiwan’s democratic processes and national identity, a strategy aimed at dissolving its capacity to speak as an agentive subject (Spivak, 1994).

This essay seeks to dissect the operational processes of this dual techno-violence within Taiwan and to analyze the multiple crises of “subjectivity” it precipitates. The “subjectivity” at stake here is not the isolated consciousness of atomized individuals, but rather a collective political capacity within the Taiwanese context—the capacity of a political community to define itself and pursue autonomy. This collective capacity is continuously generated through what Jürgen Habermas terms “intersubjectivity”—a shared consciousness and mutual understanding forged within communicative contexts (Habermas, 1985; 1990).

II. The Symbolic Violence of the American Digital Empire

In the landscape of the global digital economy, American technology corporations have leveraged their technological innovation, first-mover market advantages, and enormous financial resources to construct a vast and hegemonic digital empire. This imperial order operates not through traditional territorial occupation, but through the control of crucial digital infrastructures that have become the very ground upon which contemporary social life unfolds. While seemingly transnational, this order is profoundly anchored in American corporate and legal power, inflicting a subtle yet pervasive form of symbolic violence upon societies like Taiwan.

This symbolic violence, however, is not an end in itself. It serves as the primary instrument of a deeper logic of domination and extraction, the digital colonialism of our era. While the concept of “digital hegemony” captures the dynamics of influence and ideological leadership, it fails to illuminate the systematic resource appropriation at the core of this relationship. At the same time, digital colonialism distinguishes itself from its historical analogue by shifting the primary object of extraction: from natural resources and physical labor to the raw material of human experience itself (Couldry & Mejias, 2019). Under this regime, domination is no longer exercised primarily through administrative dictate but through infrastructural and algorithmic governance. The power lies in owning and designing the digital rails of search engines, social networks, and app stores upon which communication and culture now travel. This violence is particularly effective because it is cloaked in the ideologically irresistible guises of “convenience,” “progress,” and “free choice,” allowing market logic to secure the internalization of dominant relations (Bourdieu & Wacquant, 1992: 140-174; Bourdieu, 1984: 503-519). 

In Taiwan, the operational logic of this digital colonialism is founded upon three interconnected mechanisms: First, infrastructural control of discovery and distribution is established through near-total market dominance. The search-social adtech stack reveals this order with particular clarity. On the information retrieval layer, Google commands roughly four-fifths of Taiwan’s search, with concrete indicators showing it accounted for 78.55 percent of queries in August 2025. On the device and distribution layer, iOS (at 57.62 percent) and Android (at 41.82 percent) envelop almost the entire market, ensuring App Store and Google Play policies become a private law of circulation. On the attention layer, platforms like YouTube and Instagram intermediate the bulk of cultural consumption, reaching tens of millions in a society with near-universal online participation. These figures do not merely describe popularity; they index a systemic reliance on US-domiciled infrastructures for access to knowledge, culture, and commerce (DataReportal, 2025; StatCounter, 2025a; StatCounter, 2025b). Together, these layers convert technical standards into the conditions of publicity (Zuboff, 2019), creating a vast surveillance network where individuals’ seemingly autonomous actions in fact reinforce the platforms’ power (Stiegler, 2017).

Second, this infrastructural control enables the algorithmic governance of visibility. Building on this structural dominance, what in a democratic imaginary would be a commons becomes a metered arena in which auction dynamics and opaque ranking privatize access to collective attention. As media law and communication research has shown, recommenders now function as core institutions of media governance, deciding exposure with minimal transparency (Helberger, 2019; Leerssen, 2020). In such a system, “public interest” competes with “ad relevance,” forcing local news and civic information to purchase prominence or adapt to engagement heuristics. This structural bias is then amplified by behavioral dynamics. Peer-reviewed evidence shows that when engagement-maximizing recommenders optimize for signals like moral-emotional language, sensational and affectively charged content is preferentially surfaced, crowding out deliberative communication (Vosoughi, Roy, & Aral, 2018; Brady et al., 2017). The result is not overt censorship but a habituation to modes of attention that erode critical discrimination, rewarding content architectures calibrated for advertising yield rather than civic value.

Third, this system institutionalizes the asymmetric extraction of data and advertising rents. Economic flows mirror these informational effects. Taiwanese public agencies, businesses, and cultural producers must buy access to audiences through foreign ad exchanges; app distribution is taxed by commission regimes; and user data feeds distant analytics stacks. This circuit subordinates the domestic media ecology to transnational intermediaries, a high degree of dependency that not only stifles local innovation but also ensures a continuous outflow of economic value (Lin, 2022; Liu et al., 2023). This process is further documented by cultural and communications surveys that register the platformization of distribution channels and the concomitant weakening of local bargaining power and revenue capture (TAICCA, 2024; NCC, 2025).

The operational logic of this US-anchored symbolic violence is thus revealed: it begins with the infrastructural control of discovery and distribution (search, app stores, recommender systems), which in turn enables the algorithmic governance of visibility that sets the operative rules of access, and culminates in the asymmetric extraction of data and advertising rents. Crucially, these mechanisms are not free-floating; they are underwritten by ownership, jurisdiction, and centralized governance in the United States. Terms of service, moderation standards, API access rules, and ad-review pipelines are authored in US corporate-legal contexts and projected globally as default operating conditions. It is these external standards that ultimately determine findability, circulation, and monetization, compelling Taiwanese users, creators, and institutions to operate within rule-sets that can be unilaterally adjusted to serve external commercial interests.
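To make the middle step of this circuit concrete, the following toy sketch contrasts an engagement-maximizing ordering with a public-interest ordering. It is purely illustrative: the posts, feature scores, and weights are invented for demonstration and do not describe any platform’s actual ranking system; the sketch only shows how optimizing for arousal and click signals, as the studies cited above suggest, pushes affectively charged content above deliberative content.

```python
# Purely illustrative toy model of engagement-weighted ranking.
# The posts, feature scores, and weights below are invented for
# demonstration; they do not describe any real platform's algorithm.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    civic_value: float       # hypothetical editorial/civic relevance, 0-1
    emotional_charge: float  # hypothetical moral-emotional intensity, 0-1
    predicted_clicks: float  # hypothetical engagement prediction, 0-1

def engagement_score(p: Post) -> float:
    # Engagement-maximizing heuristic: rewards arousal and clicks,
    # ignores civic value entirely.
    return 0.6 * p.predicted_clicks + 0.4 * p.emotional_charge

def civic_score(p: Post) -> float:
    # A public-interest ordering, shown for contrast.
    return p.civic_value

posts = [
    Post("Local budget hearing explained", civic_value=0.9, emotional_charge=0.2, predicted_clicks=0.3),
    Post("Outrage clip goes viral", civic_value=0.1, emotional_charge=0.9, predicted_clicks=0.8),
    Post("Fact-check of election rumor", civic_value=0.8, emotional_charge=0.4, predicted_clicks=0.4),
]

print("Ranked by engagement:", [p.title for p in sorted(posts, key=engagement_score, reverse=True)])
print("Ranked by civic value:", [p.title for p in sorted(posts, key=civic_score, reverse=True)])
```

Under the engagement ordering the outrage clip tops the feed while the budget hearing and the fact-check fall behind; this is the crowding-out dynamic described above, rendered in miniature.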

III. The Epistemic Violence of Chinese Information Operations

If the influence of the American digital sphere is primarily symbolic, China’s digital penetration of Taiwan is driven by direct political motives, constituting a distinct but equally consequential mode of power. Longstanding doctrinal materials in Beijing treat information operations as integral to “Chinese Unification” (NIDS, 2023). In this context, epistemic violence denotes the undermining of a community’s standing as a knowing subject through narrative manipulation and credibility displacement (Spivak, 1994: 76; Dotson, 2011: 237-238). Its central mechanism is the negation of the other as a “knowing subject,” aiming to produce a state of “agnosis” or pernicious ignorance by systematically silencing and distorting the target’s voice. Crucially, in an open media system like Taiwan’s, this does not require removing speech. Instead, the operation leverages this very openness to produce epistemic flooding: high‑volume, coordinated content saturates attention, raises verification costs, and relocates credible speech to the margins, rendering truth present yet inaudible at scale (Anderau, 2023; Roberts, 2018).
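The flooding mechanism can be rendered as a stylized simulation rather than an empirical model: assume a reader with a fixed attention budget sampling items from a feed in which credible content is never removed but coordinated low-quality volume keeps growing. All quantities below are invented for illustration.

```python
# Stylized illustration of "epistemic flooding": credible content is not
# removed, but as coordinated low-quality volume grows, the chance that a
# reader with a fixed attention budget encounters it shrinks. All numbers
# are invented for demonstration.
import random

def chance_of_seeing_credible(credible: int, flood: int,
                              attention_budget: int = 20,
                              trials: int = 10_000) -> float:
    feed = ["credible"] * credible + ["flood"] * flood
    hits = 0
    for _ in range(trials):
        sample = random.sample(feed, min(attention_budget, len(feed)))
        if "credible" in sample:
            hits += 1
    return hits / trials

for flood_volume in (100, 1_000, 10_000):
    p = chance_of_seeing_credible(credible=50, flood=flood_volume)
    print(f"flood={flood_volume:>6}: P(reader sees any credible item) ~ {p:.2f}")
```

As the flood volume rises from hundreds to tens of thousands of items, the probability that the reader encounters any credible item collapses even though nothing credible has been censored; this is the sense in which truth remains present yet becomes inaudible at scale.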

The advent of generative AI has fundamentally altered both the cost dynamics and plausibility of such operations. Microsoft Threat Intelligence reported that the China‑linked cluster “Spamouflage” used suspected AI‑generated audio on Taiwan’s election day, including a clip mimicking Foxconn founder Terry Gou to suggest endorsement of a rival; YouTube removed the content, but the case illustrates low‑cost, rapid fabrication and redeployment across platforms (Microsoft Threat Intelligence, 2024). Taiwan FactCheck Center’s election‑cycle analysis likewise documented AI video/voice fabrications targeting candidates and institutions, requiring reactive verification under time pressure (Li, 2024). These individual instances reflect a broader trend: Taiwan’s National Security Bureau testified to parliament that generative AI had intensified disinformation, with over 500,000 contentious messages detected in 2025 to date, often synchronized with salient political moments (Reuters, 2025).

Operationally, interference with Taiwan’s elections has been the primary arena. China‑linked operators exploit cross‑platform affordances to implement this shift in visibility and credibility. Research ahead of the 2024 election documented a sustained, coordinated effort to manipulate Taiwanese political conversations across Facebook, YouTube, and TikTok since 2022, using deceptive behaviors, inauthentic accounts, and meme/video packages to steer agenda salience (Graphika, 2023). Independent media coverage and takedown reporting further noted networks of hundreds of likely fake Facebook accounts that recycled Chinese‑language TikTok/YouTube clips within minutes, a posting cadence consistent with orchestration (Bond, 2023). At the audience level, pre‑ and post‑election surveys indicate correlational patterns between platform reliance and narrative alignment: frequent TikTok users in Taiwan were more likely to agree with pro‑China narratives or to express lower confidence in Taiwan’s democratic efficacy and US support, relative to infrequent users (Hsu, 2024). While correlational rather than causal, these patterns point to a measurable association between information sources and political alignment.
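The posting-cadence pattern noted in such takedown reporting can be sketched as a simple heuristic: flag a clip when many distinct accounts re-share it within a narrow time window. The accounts, timestamps, window, and threshold below are hypothetical examples for illustration, not data drawn from the cited reports.

```python
# Minimal sketch of detecting a posting cadence consistent with coordination:
# many accounts re-sharing the same clip within a narrow time window.
# Timestamps, account names, and thresholds are hypothetical examples,
# not data from the reports cited above.
from collections import defaultdict
from datetime import datetime, timedelta

# (account, clip_id, posted_at) triples -- invented for illustration.
posts = [
    ("acct_01", "clip_A", datetime(2024, 1, 5, 9, 0, 12)),
    ("acct_02", "clip_A", datetime(2024, 1, 5, 9, 1, 3)),
    ("acct_03", "clip_A", datetime(2024, 1, 5, 9, 2, 47)),
    ("acct_04", "clip_B", datetime(2024, 1, 5, 14, 30, 0)),
    ("acct_05", "clip_A", datetime(2024, 1, 6, 18, 5, 9)),
]

WINDOW = timedelta(minutes=5)   # hypothetical "within minutes" window
MIN_ACCOUNTS = 3                # hypothetical burst-size threshold

def flag_bursts(posts):
    """Group posts by clip and flag clips re-posted by many distinct accounts
    within a short window -- a cadence unlikely under organic sharing."""
    by_clip = defaultdict(list)
    for account, clip, ts in posts:
        by_clip[clip].append((ts, account))
    flagged = []
    for clip, events in by_clip.items():
        events.sort()
        for i in range(len(events)):
            window = [a for t, a in events[i:] if t - events[i][0] <= WINDOW]
            if len(set(window)) >= MIN_ACCOUNTS:
                flagged.append((clip, sorted(set(window))))
                break
    return flagged

print(flag_bursts(posts))  # -> [('clip_A', ['acct_01', 'acct_02', 'acct_03'])]
```

Real attribution work combines such behavioral signals with content and infrastructure evidence; the point here is only that a burst of near-simultaneous re-posts by distinct accounts leaves a measurable trace of orchestration.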

Specifically, four narrative frames recurred consistently during the campaign: First, portraying Taiwan’s elected leadership as inept or reckless, thereby questioning democratic competence; for example, Chinese state media amplified the unsubstantiated allegation that President Tsai Ing‑wen would “flee in a US plane” if war erupted in the run‑up to the January 2024 election (Lee & Pomfret, 2024). 

Second, amplifying war fears, often coupling military maneuvers with sensational claims. The Thomson Foundation’s analysis of election-period disinformation documents a dominant “peace vs war” framing and concurrent skepticism toward US support, positioning “peace” as acquiescence to Beijing and “war” as continued self-rule (Hung et al., 2024). 

Third, distorting foreign relations to depict Taiwan as isolated or abandoned. The Thomson Foundation report also highlights narrative spikes casting doubt on Washington’s reliability and reframing cross-Strait geopolitics to suggest diplomatic marginalization (Hung et al., 2024). 

Fourth, diminishing “grievability” (Butler, 2016): lowering the recognitional threshold for Taiwan’s identity and history in global narratives. This is achieved via discursive minimization of the Republic of China (ROC) in international fora—disputing nomenclature and participation rules, pressuring organizations and firms to adopt PRC-preferred labeling, and relegating key strands of ROC historical memory (e.g., the War of Resistance and diplomatic history) to the margins.

These operations seek not merely to contest the discursive authority over specific events but, more fundamentally, to transform the basic frameworks through which Taiwanese citizens understand their situation, constructing what Michel Foucault terms a “regime of truth” (Foucault, 1995). Executed remotely on foreign platforms, they undermine trust and recalibrate the boundaries of political feasibility by continuously injecting putative “facts” and “narratives” that contradict Taiwan’s established cognitive schema, thus constructing an “alternative truth.” In this configuration, cognitive manipulation achieves what physical coercion once required: it re-anchors plausibility, normalizes previously unacceptable options, and constitutes epistemic violence that progressively disqualifies Taiwan’s claims as knowable and authoritative.

IV. The Crisis of Subjectivity in Taiwan

The digital oppression confronting Taiwan is not a simple sum of two distinct forces but a “polycrisis” born of weaponized interdependence under digital-colonial conditions. In this configuration, the symbolic violence of US platform capitalism entrenches structural dependencies that enable China’s cognitive operations to penetrate Taiwan’s information sphere; this interaction ultimately produces a severe crisis of political subjectivity among Taiwanese citizens.

The pathway of this crisis is clear. First, the symbolic violence of American platforms, through their profit-driven algorithmic logic, produces public sphere fragmentation, echo chamber effects, and the dispersion of citizen attention. Taiwanese society’s overall information environment comes to resemble an immunocompromised body: its capacity for rational discourse and collective deliberation has been preemptively weakened. Subsequently, China’s epistemic violence precisely exploits this trust vacuum and cognitive vulnerability manufactured by platform commercial logic, efficiently implanting its political agenda within Taiwan’s information space, disseminating massive amounts of disinformation, and stoking social antagonism. This malignant synergy reveals a fundamental contradiction in the global digital order: the infrastructure of liberal-democratic “free speech” (US platforms) has become the most effective delivery mechanism for authoritarian cognitive warfare. The very features that make these platforms successful in a capitalist sense—scalability, engagement algorithms, global reach—are the same features that make them vulnerable to political manipulation.

The most immediate consequence is the erosion of “digital subjectivity”—a decline in the individual’s capacity and will to make autonomous judgments, form a stable identity, and participate in public life (Goriunova, 2019; Skeggs & Yuill, 2019). This predicament represents the individual-level manifestation of the subjectivity crisis: individuals’ information judgment capacities suffer severe disruption, and critical thinking becomes increasingly dulled. On an emotional and identificatory level, some citizens either sink into political apathy or turn toward simplified and exclusionary extremist positions. At the level of active practice, the willingness and capacity to participate in public affairs also decline accordingly. When citizens remain exposed over long periods to a high-intensity, polarized information environment amplified by algorithms, their attention becomes highly diluted. As a result, heterogeneous viewpoints are filtered out and behavior trends toward predictability. Consequently, rational judgment further deteriorates and enthusiasm for public participation gradually cools (Han, 2017; Forest, 2021; Miller, 2023).

What follows is the comprehensive decay of the public sphere. When commercial platforms determine “what people see” and political manipulation determines “what people believe,” a highly polarized and trust-deficient information environment is the necessary result. Research confirms that Taiwanese social media users face increasingly homogenized information sources (IORG, 2025). In such a context, the Habermasian ideal of a public sphere oriented toward rational communication and consensus becomes unsustainable, as the digital platforms of this democratic society are increasingly characterized by uncertainty—what might be termed a “tension-ridden combination” (Patberg, 2025). 

More fundamentally, subjectivity is constituted through the “recognition” of others in a social context (Butler, 2004). When the public sphere is so polluted by information interference that it can no longer provide a stable frame of reference, this process of mutual recognition breaks down. The individual is left unmoored, unable to find their experience reflected or validated, and is thus reduced to the status of a passive object. The intensified political polarization, social fragmentation, and erosion of national trust so evident in contemporary Taiwan are the most direct manifestations of this complex crisis of subjectivity.

A normally functioning public sphere should be a social space freely accessible to all citizens, where individuals form public will through assembly and discussion. Through exchange with others, individuals reference others, reflect upon themselves, and confirm their values and beliefs, thereby shaping a stable political identity and eventually coalescing into a collective will (Habermas, 1991). When this domain collapses, individuals lose the social coordinates for positioning themselves. The collapse of the space in which a collective public will can take form, brought about by the joint effects of US–China dual violence under digital colonialism, constitutes the root of Taiwan’s crisis of subjectivity.

Image “How Does Digital Technology Affect You?” by schopie1 is licensed under CC BY-SA 2.0.

V. Conclusion

This analysis of the pivotal case of Taiwan uncovers the covert operations of contemporary technological violence and the distinct modes of digital colonialism being implemented by the US and China. This dual pressure from a global capitalist empire and a geopolitical force has transformed Taiwan’s digital landscape into a contested battlefield, posing an unprecedented challenge to its democratic society. 

Viewed through the lens of “digital colonialism,” Taiwan’s predicament serves as a precursor to a wider global phenomenon. Its experience indicates that, amid the polycrisis and US-China rivalry, other small democracies and nations of the Global South will face similar struggles to maintain their (digital) sovereignty, resist external interference, and protect the well-being of their citizens.

To escape the digital colonial matrix, it is essential to reconstruct an autonomous public sphere and enhance civic subjectivity. This project depends both on national policies capable of asserting sovereign control over key digital infrastructures and on a civil society dedicated to enhancing critical media and technological literacy. Only through a dual strategy of state-level policy and grassroots civic engagement can Taiwan find pathways to survive within the fissures of imperial domination in digital modernity and redefine its subjective stance and digital democratic practices within the global network order.

References:

Anderau, G. (2023). Fake news and epistemic flooding. Synthese, 202, Article 106.

Bond, S. (2023, December 13). Fake social media accounts are targeting Taiwan’s presidential election. NPR. Available online: https://www.npr.org/2023/12/13/1219080681/fake-social-media-accounts-are-targeting-taiwans-presidential-election (Access: 09/10/2025).

Bourdieu, P. (1984). Distinction: A social critique of the judgement of taste (R. Nice, Trans.). Harvard University Press.

Bourdieu, P., & Wacquant, L. J. D. (1992). An invitation to reflexive sociology. University of Chicago Press.

Brady, W. J., Wills, J. A., Jost, J. T., Tucker, J. A., & Van Bavel, J. J. (2017). Emotion shapes the diffusion of moralized content in social networks. Proceedings of the National Academy of Sciences, 114(28), 7313–7318.

Butler, J. (2004). Undoing gender. Routledge.

Butler, J. (2016). Frames of war: When is life grievable? (Rev. ed.). Verso.

Couldry, N., & Mejias, U. A. (2019). The costs of connection: How data is colonizing human life and appropriating it for capitalism. Stanford University Press.

DataReportal. (2025, March 3). Digital 2025: Taiwan. Available online: https://datareportal.com/reports/digital-2025-taiwan (Access: 09/10/2025).

Dotson, K. (2011). Tracking epistemic violence, tracking practices of silencing. Hypatia, 26(2), 236–257.

Forest, J. J. F. (2021). Political warfare and propaganda: An introduction. Journal of Advanced Military Studies, 12(1), 13–33.

Foucault, M. (1995). Discipline and punish: The birth of the prison (A. Sheridan, Trans.). Vintage Books.

Goriunova, O. (2019). Digital subjects: An introduction. Subjectivity, 12(1), 1–11.

Graphika. (2023, December 13). Agitate the debate: Online manipulation of Taiwanese politics. Available online: https://graphika.com/reports/agitate-the-debate (Access: 09/10/2025).

Habermas, J. (1985). The theory of communicative action: Reason and the rationalization of society (Vol. 1, T. McCarthy, Trans.). Beacon Press.

Habermas, J. (1990). The philosophical discourse of modernity: Twelve lectures (F. G. Lawrence, Trans.). MIT Press.

Habermas, J. (1991). The structural transformation of the public sphere: An inquiry into a category of bourgeois society (T. Burger & F. Lawrence, Trans.). MIT Press.

Han, B.-C. (2017). Psychopolitics: Neoliberalism and new technologies of power (E. Butler, Trans.). Verso.

Helberger, N. (2019). On the democratic role of news recommenders. Digital Journalism, 7(8), 993–1012.

Hsu, E. (2024, January 15). 2024 Taiwan election: The increasing polarization of Taiwanese politics—Reinforcement of conspiracy narratives and cognitive biases. Doublethink Lab. Available online: https://medium.com/doublethinklab/2024-taiwan-election-the-increasing-polarization-of-taiwanese-politics-reinforcement-of-2e0e503d2fe2  (Access: 09/10/2025)

Hung, C.-L., Fu, W.-C., Liu, C.-C., & Tsai, H.-J. (2024). AI disinformation attacks and Taiwan’s responses during the 2024 presidential election. Thomson Foundation. Available online: https://www.thomsonfoundation.org/media/268943/ai_disinformation_attacks_taiwan.pdf (Access: 09/15/2025).

Information Operations Research Group. (2025, January 21). 2024 Taiwan information environment survey: Annual national opinion poll. Available online: https://iorg.tw/_ua/a/survey-2024  (Access: 09/10/2025)

Kwet, M. (2019). Digital colonialism: US empire and the new imperialism in the Global South. Race & Class, 60(4), 3–26. 

Lee, Y., & Pomfret, J. (2024, April 1). Chinese state media stoked allegation Taiwan’s president would flee war. Reuters. Available online: https://www.reuters.com/ (Access: 09/15/2025).

Leerssen, P. (2020). The soap box as a black box: Regulating transparency in social media recommender systems. European Journal of Law and Technology, 11(2).

Li, W.-P. (2024, February 19). Seeing is not believing (part II) – AI videos spread during the 2024 presidential election in Taiwan. Taiwan FactCheck Center. Available online: https://en.tfc-taiwan.org.tw/en_tfc_294/ (Access: 09/10/2025).

Lin, C.-C. (2022). Negative impact from platforms on print newspaper: The alert for journalism and democracy in Taiwan. Communication, Culture & Politics, 16, 139–163.

Liu, C.-C., Tsai, H.-J., Hung, C.-L., & Chang, C.-Y. (2023). Revenue destruction and distribution dependency: The economic impact of digital platforms on newspapers and magazines in Taiwan. Chinese Journal of Communication Research, 43, 7–55.

Microsoft Threat Intelligence. (2024, April 4). Same targets, new playbooks: East Asia threat actors employ unique methods. Microsoft Security Insider. Available online: https://www.microsoft.com/en-us/security/security-insider/threat-landscape/east-asia-threat-actors-employ-unique-methods (Access: 09/10/2025).

Miller, S. (2023). Cognitive warfare: An ethical analysis. Ethics and Information Technology, 25, Art. 46.

National Communications Commission. (2025). Communications market report 2024. Available online: https://www.ncc.gov.tw/english/news_detail.aspx?sn_f=6039  (Access: 09/10/2025)

National Institute for Defense Studies. (2024). China’s quest for control of the cognitive domain and gray zone situations (China security report 2023). Available online: https://www.nids.mod.go.jp/english/publication/chinareport/index.html  (Access: 09/10/2025)

Patberg, M. (2025). What is social media’s place in democracy? The Review of Politics, 87(2), 236–257. 

Reuters. (2025, April 8). Taiwan says China using generative AI to ramp up disinformation and “divide” the island. Available online: https://www.reuters.com/world/asia-pacific/taiwan-says-china-using-generative-ai-ramp-up-disinformation-divide-island-2025-04-08/  (Access: 09/10/2025)

Roberts, M. E. (2018). Censored: Distraction and diversion inside China’s Great Firewall. Princeton University Press.

Spivak, G. C. (1994). Can the subaltern speak? In P. Williams & L. Chrisman (Eds.), Colonial discourse and post-colonial theory: A reader (pp. 66–111). Routledge.

Skeggs, B., & Yuill, S. (2019). Subjects of value and digital personas: Reshaping the bourgeois subject, unhinging property from personhood. Subjectivity, 12(1), 82–99.

StatCounter. (2025a, August). Search engine market share in Taiwan. Available online: https://gs.statcounter.com/search-engine-market-share/all/taiwan (Access: 09/10/2025).

StatCounter. (2025b, August). Mobile OS market share in Taiwan. Available online: https://gs.statcounter.com/os-market-share/mobile/taiwan (Access: 09/10/2025).

Stiegler, B. (2017). Automatic society (Vol. 1): The future of work (D. Ross, Trans.). Polity.

Taiwan Creative Content Agency. (2024). 2023 Taiwan cultural content industries survey report. Available online: https://taicca.tw/intelligence/industry_research/detail/66  (Access: 09/10/2025)

Taiwan Network Information Center. (2024, October). 2024 Taiwan Internet Report. Available online: https://report.twnic.tw/2024/  (Access: 09/10/2025)

Tooze, A. (2021). Shutdown: How Covid shook the world’s economy. Viking.

Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151. https://doi.org/10.1126/science.aap9559

Watts, C. (2024, April 4). China tests US voter fault lines and ramps AI content to boost its geopolitical interests. Microsoft Threat Analysis Center. Available online: https://blogs.microsoft.com/on-the-issues/2024/04/04/china-ai-influence-elections-mtac-cybersecurity/  (Access: 09/10/2025)

Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.
