Algorithmic Audiences: Navigating Identity, Influence, and Power in the Age of Platformized Media

Authors

  • Nodira R. Rustamova, ISFT International School of Finance Technology and Science (Private University), Tashkent, 100140, Uzbekistan; National Institute of Pedagogical Education named after Qori Niyoziy, Tashkent, 100095, Uzbekistan

DOI:

https://doi.org/10.6000/2818-3401.2025.03.09

Keywords:

Algorithmic Audiences, Digital Identity, Media Platforms, Influence Mechanisms, Data Privacy, Datafied Hybrid Entities, Media Ecosystems

Abstract

Purpose: This article investigates the formation and operation of algorithmic audiences within platformized media environments, focusing on how processes of identity, influence, and power intersect to shape audience behaviour. It seeks to theorise the algorithmically produced publics that emerge from data-driven engagement on social media, streaming services, and online gaming platforms.

Methods: The study employs a critical conceptual synthesis of current literature in media studies, platform capitalism, and communication theory, supported by illustrative case studies of user-platform interactions. Through thematic analysis of secondary sources (2017–2023), it maps how algorithmic recommendation systems, identity performances, and influence mechanisms reinforce one another to establish dynamic audience configurations.

Results: Findings reveal that algorithmic audiences are neither passive recipients nor purely autonomous actors, but datafied hybrid entities produced through the interplay of user self-presentation, platform logics, and commercial surveillance. Identity construction increasingly depends on visibility metrics, while influence is redistributed through opaque recommendation architectures that produce echo chambers and filter bubbles. Power asymmetries deepen as platforms consolidate control over information flows, data extraction, and behavioural manipulation, raising serious ethical and regulatory concerns.

Conclusion: Algorithmic audiences represent a paradigm shift in the understanding of contemporary media publics. Their emergence compels scholars and policymakers to move beyond traditional audience theories and to confront new questions surrounding data ownership, platform governance, and audience agency in the age of automated curation. Future research must address how regulatory frameworks and ethical design interventions can protect user autonomy while ensuring transparency and accountability within platformized media ecosystems.

Published

2025-10-07

How to Cite

Rustamova, N. R. (2025). Algorithmic Audiences: Navigating Identity, Influence, and Power in the Age of Platformized Media. International Journal of Mass Communication, 3, 135–144. https://doi.org/10.6000/2818-3401.2025.03.09

Section

Articles