Are You Living Your Life, or the Algorithm’s?
How social feeds and recommendations quietly curate your desires, moods, and daily choices, without you noticing.
You believe you are an autonomous being, clutching your overpriced cold brew while scrolling your meticulously curated feed, convinced you are in control of your intellectual diet. Harvard’s Daniel Gilbert (2006) might remind you that humans are poor predictors of their own happiness, but one suspects he did not anticipate TikTok’s infinite scroll or Spotify’s Discover Weekly. Each time you open your phone “for a moment,” an algorithm executes a predictive model, ensuring you will remain precisely where it wants you, under the illusion of choice (Zuboff, 2019).
Consider your playlists. You did not find that indie band; the algorithm found you and whispered it into your morning run. You did not decide to learn about productivity hacks at two in the morning; YouTube suggested them, assessing your insomnia as a monetisable opportunity (Noble, 2018). Your Amazon cart does not reflect your desires; it shapes them (Wu, 2016).
Here is the comedic tragedy: while you congratulate yourself for your taste in music, your dedication to self-improvement, or your interest in minimalism, the algorithm refines your cage, learning your weaknesses and ensuring your continued engagement (Pariser, 2011). You are not a user; you are a data point in a vast laboratory, where your attention is the variable under experiment.
Of course, the convenience is delightful. We are all far too busy to discover music manually or wander aimlessly in bookstores, and we would rather trust the algorithm to surface the “best” choices for us (Carr, 2020). It is efficiency masquerading as agency, a trade so subtle you barely notice the price.
You may protest, citing your free will and intellectual sovereignty. That is adorable. In the age of algorithmic curation, your desires are optimised by design, your curiosity funnelled into neat, monetisable packages, and your boredom, once a birthplace of creativity, eliminated for your own good (Odell, 2019).
The illusion of choice
You wake each day convinced you are in control of your choices. You may even believe that your decisions are the distilled essence of your identity, a testament to your carefully constructed tastes and preferences. It is an attractive narrative, one that fuels the Western obsession with individualism and personal agency. Yet this confidence in your autonomy might require a closer inspection, particularly in the era of algorithmic curation.
You open Instagram, intending to check one message, and find yourself consuming twenty short videos that you did not seek. You open YouTube for a single tutorial, only to watch a recommended documentary you never planned to view. Spotify feeds you artists you have never heard of, but you come to love them, convinced that you have excellent taste. You believe you discovered them, but the algorithm discovered you first (Pariser, 2011).
What you perceive as “choices” are often the outputs of opaque systems designed to predict, influence and monetise your behaviour with precision. These algorithms are not neutral pipelines of information but systems engineered to prioritise content that maximises engagement and monetisation (Zuboff, 2019). Your sense of discovery is, in many cases, an engineered moment of delight, carefully orchestrated to ensure your continued participation in a system that values your attention above your autonomy (Wu, 2016).
The illusion is particularly compelling because it presents itself as personalised empowerment. Your feed feels uniquely yours, curated to reflect your personality and interests. However, the underlying architecture of these platforms is not designed to support your intellectual independence but to anticipate and guide your attention toward profitable outcomes (Noble, 2018). This curation creates a self-reinforcing cycle where the algorithm shows you what it predicts you will like, you engage with it, and the system takes your engagement as confirmation of its accuracy, further narrowing your informational landscape (Carr, 2020).
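To make the mechanics of that cycle concrete, here is a deliberately crude sketch in Python. Every topic, weight and threshold in it is invented for illustration; no real platform publishes its ranking code, and this is not it. It simulates a recommender that treats each engagement as confirmation and, over a few hundred cycles, collapses its model of you onto whatever you happened to linger on first.

```python
import random

# Toy model of the feedback loop described above. All topics, weights and
# probabilities are illustrative assumptions, not any platform's real values.
TOPICS = ["minimalism", "productivity", "indie music", "politics", "cooking"]

def recommend(weights):
    """Sample a topic in proportion to the system's current belief about you."""
    return random.choices(TOPICS, weights=weights, k=1)[0]

def update(weights, topic, engaged, boost=1.5, decay=0.97):
    """Treat engagement as confirmation; let everything unseen slowly starve."""
    new = [w * (boost if (t == topic and engaged) else decay)
           for t, w in zip(TOPICS, weights)]
    total = sum(new)
    return [w / total for w in new]

weights = [1 / len(TOPICS)] * len(TOPICS)  # the system starts with no opinion of you
for _ in range(200):
    topic = recommend(weights)
    engaged = random.random() < 0.8        # you usually watch what you are shown
    weights = update(weights, topic, engaged)

for topic, w in sorted(zip(TOPICS, weights), key=lambda pair: -pair[1]):
    print(f"{topic:>12}: {w:.2f}")         # one topic tends to dominate the "you" it built
```

Run it a few times and notice that which topic wins is close to arbitrary; what is not arbitrary is that one of them always does. The narrowing is a property of the loop, not of you.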
The intellectual consequence of this illusion of choice is significant. It reduces your exposure to ideas, tastes and perspectives outside your predictable patterns of interest. You may believe you are exploring the world through your devices, but you are often circling within the safe, familiar boundaries the algorithm has observed in your past behaviour. This phenomenon, described by Pariser (2011) as the “filter bubble,” systematically limits your intellectual diversity while convincing you that you are freely exploring.
Perhaps you find comfort in this familiarity, mistaking it for efficiency. In a culture that prioritises speed and productivity, it feels beneficial to have algorithms streamline your choices. Yet this efficiency comes at the cost of serendipity, a quality essential for creativity and critical thought. The philosopher Byung-Chul Han (2017) argues that a society dominated by algorithmic efficiency erodes the friction necessary for meaningful experiences. Friction creates pauses that enable reflection, while endless streams of recommended content eliminate the space needed for independent thought.
Consider also the moral dimension of this illusion. You may believe you are immune to influence, but research in behavioural science indicates that subtle changes in your informational environment can shift your decisions and beliefs without your conscious awareness (Thaler and Sunstein, 2008). In a system where algorithms are designed to extract maximum engagement, your preferences are not only predicted but also gently manipulated to align with the interests of the platforms that profit from your attention (Zuboff, 2019).
Satirically speaking, it is almost charming how fiercely we defend our sense of choice while we allow invisible systems to determine what we consume, what we think about and even what we desire. We celebrate our unique tastes while the algorithm quietly smiles, confident in its ability to guide us wherever it wishes, provided we remain entertained and engaged.
This illusion of choice would be a harmless curiosity if it did not shape your worldview, influence your desires and determine the opportunities you see and pursue. In an environment governed by algorithmic curation, your sense of exploration becomes confined, your intellectual development becomes stunted and your ability to question the system itself becomes impaired.
The next time you find yourself scrolling endlessly, discovering new interests and congratulating yourself on your eclectic taste, consider whether you are truly choosing what you consume or if you are simply the recipient of choices made by systems optimised to know you better than you know yourself (Odell, 2019). In recognising the illusion of choice, you may begin the uncomfortable but necessary process of reclaiming your attention from the systems that profit from your passivity.
The subtle shaping of desires
It is endearing how convinced you are that your desires are entirely your own. You insist you have a clear sense of what you want, what you like and what you plan to pursue next. You argue that your tastes reflect your authentic self, carefully honed over years of personal growth and experience. Yet in the glow of your phone screen, while you scroll through a feed designed to optimise your engagement, your desires are gently, consistently and invisibly shaped by algorithms whose sole purpose is to predict what will keep you consuming (Zuboff, 2019).
Spotify introduces you to new artists you never sought but now claim as central to your identity. YouTube suggests topics you did not intend to learn but now consider essential to your self-improvement journey. Instagram quietly places aspirational lifestyles before your eyes, not to broaden your perspective but to cultivate a hunger for the products and experiences advertised between your moments of passive scrolling (Wu, 2016). These desires feel natural, arising spontaneously within you, but they are often the products of algorithmic nudging.
Algorithms excel at transforming curiosity into consumption. They monitor your taps, your scrolling speed and your pause duration to detect what triggers your interest, then amplify it in your feed (Noble, 2018). You may believe you have discovered a passion for minimalist aesthetics, but perhaps it emerged because the algorithm detected that your engagement increased when minimalist interiors appeared on your timeline. What you interpret as a natural preference may in fact be an engineered desire.
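What might such monitoring look like in code? A minimal sketch follows, assuming invented event fields, weights and caps; real telemetry schemas are proprietary, and nothing here is taken from one. The point is only how little you need to do, no click, no comment, for your behaviour to become a score.

```python
from dataclasses import dataclass

# Hedged sketch of implicit-signal scoring; every field and weight is assumed.
@dataclass
class ViewEvent:
    topic: str
    dwell_seconds: float     # how long the post stayed on screen
    scroll_slowdown: float   # 0..1, how sharply your scrolling decelerated
    replayed: bool           # did you scroll back up to look again?

def interest_score(event: ViewEvent) -> float:
    """Convert passive behaviour into a proxy for desire, no click required."""
    score = 0.4 * min(event.dwell_seconds / 10.0, 1.0)  # dwell time, capped at 10 s
    score += 0.4 * event.scroll_slowdown
    score += 0.2 * (1.0 if event.replayed else 0.0)
    return score

events = [
    ViewEvent("minimalist interiors", dwell_seconds=8.2, scroll_slowdown=0.9, replayed=True),
    ViewEvent("sports highlights", dwell_seconds=1.1, scroll_slowdown=0.1, replayed=False),
]
for event in events:
    print(event.topic, round(interest_score(event), 2))
# The high scorer is amplified in your feed, and a "passion" is born.
```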
Consider the marketplace that benefits from this dynamic. The attention economy, as described by Wu (2016), does not merely capture your time; it cultivates your wants. In this environment, your desires are monetised long before you realise you have them, transforming your inner landscape into a fertile ground for the consumption of products, services and ideologies that align with the interests of those who control the platforms (Carr, 2020).
This shaping of desires is subtle, which is precisely why it is effective. You rarely notice the moments when a suggestion becomes a preference or when a casual interest becomes a deep-seated aspiration. The philosopher Byung-Chul Han (2017) describes how the digital environment fosters a form of psychopolitical control in which individuals internalise external prompts as personal desires, leading them to believe they are acting freely while they follow preordained scripts.
The moral implications are far from trivial. When your desires are shaped by invisible algorithms, your autonomy is compromised, and your capacity for genuine self-determination is weakened (Odell, 2019). It becomes difficult to distinguish between what you truly value and what you have been guided to value through repeated exposure and targeted content delivery.
You may believe you are immune to these influences, confident in your ability to discern your true desires from those presented to you. This belief is comforting, but it is also a convenient myth. Research in behavioural economics demonstrates how subtle environmental cues can significantly influence decision-making, often without conscious awareness (Thaler and Sunstein, 2008). The platforms you use are designed by experts in behavioural science who understand precisely how to engineer these cues to guide your desires toward outcomes that benefit their business models.
There is a certain tragicomedy in this arrangement. You congratulate yourself on your unique tastes and carefully curated interests, unaware that these preferences often serve as revenue streams for corporations that profit from your engagement. Your desires are moulded to align with the algorithm’s objectives, and you participate in this process willingly, convinced you are making independent choices (Pariser, 2011).
What is lost in this process is the possibility of genuine exploration and authentic desire formation. When your curiosity is constantly intercepted and redirected, you lose the opportunity to cultivate interests organically, to encounter the unfamiliar, and to engage with the world without the filter of algorithmic priorities (Carr, 2020).
If you wish to test the integrity of your desires, consider stepping outside the curated ecosystem. Read a book chosen without a recommendation engine. Explore music without the guidance of auto-generated playlists. Observe what interests emerge when your attention is not commodified and manipulated. Only then can you begin to reclaim your capacity to desire freely in a digital environment engineered to shape you without your awareness (Odell, 2019).
The manipulation of your emotions
You believe your emotions belong to you. It is a comforting belief, one that aligns neatly with your commitment to self-awareness and personal responsibility. You imagine your moods arise organically, a natural response to the environment and your inner reflections. Yet as you scroll through your curated feed, your emotions are carefully and consistently orchestrated by algorithms that understand your triggers better than you do (Zuboff, 2019).
Consider the moment you encounter a video that outrages you. You tell yourself you are simply reacting to an issue that matters, but it is worth questioning why this particular video appeared on your feed at this exact moment. Algorithms prioritise content that provokes emotional responses because emotions drive engagement and engagement drives profit (Wu, 2016). The platform does not care whether you are outraged, amused or afraid. It cares that you are engaged.
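To see how indifferent such a system can be to which emotion it provokes, consider a toy ranking rule like the one below, in which predicted arousal, whatever its flavour, simply outweighs relevance. The feature names and weights are assumptions invented for this sketch, not a description of any platform's actual formula.

```python
# Illustrative engagement-first ranking; all features and weights are assumed.
def engagement_rank(item: dict) -> float:
    """Score a feed item purely by predicted engagement, not by its value to you."""
    return (0.6 * item["predicted_arousal"]      # outrage, fear and envy score alike
            + 0.3 * item["predicted_relevance"]  # match to your past behaviour
            + 0.1 * item["recency_boost"])

feed = [
    {"title": "Calm, careful explainer", "predicted_arousal": 0.2,
     "predicted_relevance": 0.9, "recency_boost": 0.5},
    {"title": "Outrage bait", "predicted_arousal": 0.95,
     "predicted_relevance": 0.4, "recency_boost": 0.5},
]
for item in sorted(feed, key=engagement_rank, reverse=True):
    print(item["title"])  # the provocation wins, exactly as this section argues
```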
Emotions are not peripheral in this architecture. They are central commodities in the attention economy, which is designed to convert your feelings into predictable patterns of consumption (Odell, 2019). Each time you react, comment or share, you provide data points that refine the algorithm’s understanding of your emotional landscape, enabling it to serve you content that will reliably elicit further responses (Noble, 2018).
You may consider yourself too discerning to be manipulated in this manner, but research in behavioural psychology indicates that emotional stimuli are among the most effective tools for influencing decision-making, often bypassing rational deliberation entirely (Thaler and Sunstein, 2008). The algorithm knows that outrage keeps you scrolling, fear keeps you alert and envy keeps you striving for the lifestyles presented in curated images (Pariser, 2011).
Platforms have no interest in your emotional well-being. Their concern is your continued presence, and if your continued presence requires you to remain in a state of low-level anxiety or perpetual outrage, that is the price the system is willing to extract from you (Carr, 2020). This extraction of emotional energy is often framed as participation in important conversations, but the conversations themselves are designed to perpetuate engagement rather than resolution.
There is a particular irony in how you seek connection and community within these platforms, only to find yourself emotionally drained, disoriented and often lonelier than before. The philosopher Byung-Chul Han (2017) argues that the digital environment fosters a form of emotional capitalism, where feelings become products and users become producers of emotional content, sustaining the cycle of attention and profit.
The shaping of your emotional world extends beyond your individual experience. When large groups of people are exposed to the same emotionally charged content, collective moods are engineered, influencing social discourse and even political processes (Zuboff, 2019). The algorithm does not care about the content of your collective outrage or the direction of your fear. It cares only that the engagement metrics continue to rise.
You may believe your emotional responses are aligned with your values, that your outrage is justified, your fear reasonable and your aspirations authentic. However, when these emotions are consistently elicited by algorithmically curated content, your emotional responses become predictable and your sense of agency erodes (Wu, 2016).
The consequences are profound. Your capacity for reflection diminishes as you are drawn into reactive cycles of emotional consumption. Your ability to focus on long-term goals and values weakens as immediate emotional stimuli dominate your attention. Your sense of stability is undermined by the constant fluctuations in your emotional environment, fluctuations engineered to keep you engaged (Carr, 2020).
It is almost comedic how the system persuades you that your emotional reactions are signs of your independence and moral commitment while it harvests these reactions to refine its capacity to keep you scrolling. You congratulate yourself for staying informed, for caring deeply and for remaining vigilant, unaware that these commendable qualities have been converted into mechanisms for your own emotional exploitation (Odell, 2019).
If you wish to test the integrity of your emotional autonomy, consider stepping away from the feed long enough to observe how your emotional state stabilises in the absence of algorithmic triggers. Notice which concerns remain and which dissipate without the constant stimulation provided by the platforms. Only then can you begin to reclaim your emotions from systems that profit from your reactivity, transforming you from a participant in your own emotional life into a commodity within someone else’s business model (Zuboff, 2019).
What is lost when you outsource curiosity
You like to think of yourself as curious. You proudly describe yourself as a lifelong learner, open-minded, eager to explore new ideas and perspectives. It is an attractive identity, one that provides a comforting sense of intellectual vitality. Yet as you open your phone to begin your morning scroll, you are outsourcing your curiosity to systems that monetise your attention while limiting the horizons you believe you are exploring (Carr, 2020).
Curiosity has historically been a disruptive force, propelling individuals to challenge norms, question assumptions and seek knowledge that transforms their worldview. It requires friction, the discomfort of encountering unfamiliar ideas and the willingness to dwell in uncertainty while understanding develops (Han, 2017). The algorithm, however, has no interest in your discomfort or your intellectual development. It is designed to present you with content that aligns with your predictable interests, providing a steady drip of easy, familiar stimulation (Pariser, 2011).
You scroll through recommended videos, carefully curated playlists and trending articles, convinced you are learning, unaware that the scope of your exploration has been defined for you by a system optimised to keep you entertained and engaged (Odell, 2019). You may believe you are expanding your knowledge, but in many cases you are moving in circles within the boundaries of your existing preferences, gently reinforced by content that confirms what you already believe (Noble, 2018).
The cost of outsourcing curiosity is not simply the loss of exposure to new ideas but the erosion of your capacity to seek them actively. The philosopher Byung-Chul Han (2017) argues that the digital environment cultivates a passive mode of engagement in which individuals consume information as a form of entertainment, confusing the appearance of learning with the reality of intellectual growth. The algorithm feeds your desire for stimulation while quietly suppressing the discomfort necessary for genuine curiosity.
Consider the architecture of these platforms, designed to minimise friction and maximise seamless engagement. The next video plays automatically, the next article appears with a swipe, the next post loads instantly. This design erodes the pauses necessary for reflection, the silences in which questions arise, and the slow processes by which ideas are integrated into your understanding (Carr, 2020).
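That architecture fits, as caricature, in a few lines of Python; everything in the sketch is illustrative. The generator below has no end condition and asks no question, which is precisely the design: the pause in which a choice might occur is engineered out of existence.

```python
import itertools
import time

# Caricature of autoplay / infinite scroll; purely illustrative.
def next_recommendation(n: int) -> str:
    return f"recommended item #{n}"   # stands in for the entire ranking pipeline

def autoplay_feed():
    for n in itertools.count(1):      # no end condition: the scroll is infinite
        yield next_recommendation(n)  # the next item is always already queued

for item in itertools.islice(autoplay_feed(), 3):
    print(item)
    time.sleep(0.1)                   # the only gap is load time, never a choice point
```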
Outsourcing curiosity to algorithmic systems transforms your intellectual life into a commodity. Your attention becomes a product, your preferences become data and your exploration becomes a predictable pattern to be monetised (Wu, 2016). The system rewards you with the illusion of learning, while ensuring that your intellectual activity remains within parameters that benefit the platform’s business model.
There is a certain comedy in how you congratulate yourself for your intellectual openness while your feeds systematically narrow your informational environment. You may believe you are well-informed because you consume large quantities of content, but quantity does not equate to breadth, and engagement does not equate to understanding (Pariser, 2011).
The moral dimension of this outsourcing is significant. When your curiosity is directed by systems that prioritise engagement over truth, your ability to discern credible information is compromised. You are more susceptible to misinformation, ideological manipulation and superficial knowledge that fosters certainty without depth (Zuboff, 2019). You become confident in your opinions while losing the humility that genuine inquiry demands.
You may believe you are too sophisticated to fall into these patterns, but the system is designed precisely to bypass your sophistication, appealing instead to your desire for ease, novelty and stimulation (Thaler and Sunstein, 2008). In the process, your intellectual agency is eroded, replaced by a passive consumption of algorithmically curated content that feels like curiosity while providing none of its transformative potential.
If you wish to reclaim your curiosity, you will need to reintroduce friction into your intellectual life. Read a book that challenges your assumptions. Seek out perspectives that discomfort you. Allow yourself to dwell in the confusion of not knowing, resisting the urge to replace uncertainty with the quick answers the feed provides (Odell, 2019). Curiosity requires effort, discomfort and the willingness to pursue questions without immediate gratification.
Only by reclaiming your curiosity from the systems that have commodified it can you begin to explore the world on your own terms. Only then can you transform your intellectual life from a passive engagement with curated content into an active pursuit of knowledge that challenges, disrupts and ultimately liberates your mind from the limitations imposed by algorithmic curation (Carr, 2020).
Reclaiming your autonomy
It is touching how confidently you declare your autonomy. You assert that you are in control of your choices, that your consumption habits reflect your independent judgement and that your values guide your actions. You may even believe that your relationship with technology is healthy, that you use it intentionally and that you can step away whenever you wish. This belief is comforting, but it deserves scrutiny in an environment designed to erode autonomy while providing the illusion of control (Zuboff, 2019).
Reclaiming autonomy in the age of algorithmic persuasion requires more than minor adjustments to your notification settings. It demands the recognition that your behaviours are systematically shaped by architectures of capture, engineered by platforms optimised to maximise engagement and data extraction (Carr, 2020). Autonomy cannot flourish in environments structured to undermine your capacity for sustained attention, reflective thought and intentional action (Han, 2017).
Consider how your daily rituals have been colonised by systems designed to intercept your attention. You reach for your phone before you reach for your thoughts in the morning. You scroll through feeds curated to keep you reactive before you have even established your priorities for the day. You congratulate yourself for productivity while interrupting your focus to respond to the next digital prompt engineered to capture your attention (Odell, 2019).
To reclaim autonomy, you must first reclaim your time. This may sound quaint, but the deliberate allocation of attention is a radical act in a culture that profits from your distraction (Wu, 2016). Time is the substrate of autonomy, and without the ability to direct your time, your claims to independence remain rhetorical. Creating spaces in your day free from algorithmic interruption is a prerequisite for recovering your agency (Thaler and Sunstein, 2008).
Autonomy also requires the cultivation of intentionality. The philosopher Byung-Chul Han (2017) argues that contemporary digital culture encourages a reactive mode of existence in which individuals lose the capacity to act deliberately. Your autonomy weakens each time you surrender your choices to the invisible guidance of recommendation systems, confusing algorithmic suggestion with genuine desire (Pariser, 2011). Recovering intentionality involves learning to pause before reacting, to interrogate the origins of your impulses and to examine whether your actions align with your values rather than your algorithmically conditioned preferences (Odell, 2019).
A commitment to reclaiming autonomy must extend to your intellectual and emotional life. Algorithms thrive on predictability, and your emotions become tools for manipulation when they are easily triggered and channelled towards consumption or engagement (Noble, 2018). To resist this, you must develop emotional resilience, learning to notice when your emotional state is being manipulated for profit and choosing to disengage from stimuli that undermine your well-being (Carr, 2020).
You may find it challenging to disentangle yourself from systems that have made convenience synonymous with dependence. It is easier to allow the algorithm to choose your entertainment, curate your news and guide your learning than it is to cultivate discernment and actively seek knowledge and experiences aligned with your authentic interests (Odell, 2019). Yet convenience is a poor substitute for autonomy, and the comforts it offers often conceal the costs it exacts from your agency.
This work is not glamorous. It involves discomfort, discipline and the willingness to confront the limitations of your current habits. It requires acknowledging the extent to which your behaviours have been shaped by systems indifferent to your flourishing and recognising that reclaiming autonomy will often mean choosing friction over seamlessness, uncertainty over algorithmic certainty and effort over passivity (Zuboff, 2019).
You may wish to believe there is a simple solution, a productivity hack that will allow you to remain in the system while retaining your autonomy. This belief is understandable, but it is a fantasy. Systems designed to capture and monetise your attention are not neutral tools; they are environments optimised to undermine your autonomy while providing the comforting illusion of control (Wu, 2016).
If you are serious about reclaiming your autonomy, begin by observing your patterns without judgement. Notice when you reach for your device reflexively. Notice how often you seek distraction rather than engage with your priorities. Notice which desires emerge from within and which are seeded by algorithmic prompts. These observations will provide the foundation for deliberate change, allowing you to transition from a reactive participant in a system of capture to an intentional actor capable of directing your attention, desires and actions according to your values rather than the imperatives of the platform (Han, 2017).
Autonomy is not a given in the digital environment. It is a practice, one that must be cultivated with intention, defended with discipline and valued above the comforts of algorithmic ease. It is only by reclaiming your autonomy that you can begin to live your life rather than the algorithm’s (Odell, 2019).
You have arrived at the end of this reflection, which you may believe is a sign of your commitment to your intellectual development. You may even feel a mild sense of accomplishment, a comforting reassurance that you are different from the masses who mindlessly scroll through content without pausing to consider its implications. This sentiment is pleasant, but it deserves examination, for the illusion of awareness is often the final layer of the algorithm’s capture (Zuboff, 2019).
You have learned that your attention is not yours by default, that your curiosity is easily outsourced to platforms designed to limit its scope, and that your emotions are carefully harvested to fuel a system of engagement and profit (Wu, 2016). You now understand that your autonomy, that precious concept you protect in theory, is routinely undermined by systems designed to bypass your rational deliberation while providing you with the comforting illusion of choice (Carr, 2020).
You may wish to congratulate yourself for recognising these dynamics, believing that awareness is sufficient to reclaim your agency. It is not. Awareness is necessary but insufficient. The systems designed to capture your attention are indifferent to your intellectual understanding of their mechanisms. They are designed to exploit the gaps between your knowledge and your behaviour, thriving in the dissonance between what you know and what you do (Odell, 2019).
The question you must confront is not whether you understand the nature of algorithmic capture, but whether you are prepared to change your relationship with these systems in light of this understanding. It is easy to read about the erosion of attention while continuing to structure your life in ways that maximise your distraction. It is easy to express concern about surveillance capitalism while providing a steady stream of data to platforms in exchange for convenience (Han, 2017).
Reclaiming your life from the algorithm requires the cultivation of practices that protect your attention, your curiosity and your autonomy. This is not a one-time decision but a continual process of discernment, discipline and reflection. It requires the willingness to prioritise your long-term well-being over the immediate gratification provided by the feed, the courage to tolerate the discomfort of friction and the patience to reorient your habits away from reactivity towards intentional action (Pariser, 2011).
There is no universal prescription for this reclamation, but certain practices are foundational. You will need to reclaim your time, establishing spaces in your day free from algorithmic intrusion where your thoughts can develop without interruption (Carr, 2020). You will need to cultivate intentional curiosity, seeking knowledge beyond the confines of your feed, pursuing ideas that challenge your assumptions and expanding your intellectual horizons through deliberate exploration (Odell, 2019).
You will need to reclaim your emotional autonomy, learning to recognise when your emotional responses are being manipulated for engagement and choosing to disengage from stimuli that undermine your well-being (Noble, 2018). You will need to develop the capacity for stillness, resisting the compulsion to fill every moment with stimulation, allowing space for reflection and the integration of your experiences into a coherent understanding of your values and priorities (Han, 2017).
These practices will not make you more efficient within the system of algorithmic capture, and they are not intended to do so. They are designed to help you step outside of that system, to create the conditions necessary for a life oriented towards your values rather than the imperatives of platforms optimised for profit (Zuboff, 2019).
There is an irony in how easily the discourse of productivity and optimisation is co-opted by the systems that undermine your autonomy, persuading you that you can hack your way to freedom while remaining embedded in environments designed to erode it. Reclaiming your life from the algorithm is not a matter of becoming more productive within these systems but of recognising the need to create alternative structures that support your capacity for attention, reflection and intentional living (Wu, 2016).
You may find this work inconvenient, uncomfortable and at times discouraging. It will require you to confront the reality of your dependence on systems that undermine your well-being while providing comfort and convenience. It will require you to resist the immediate gratification of seamless digital experiences in favour of the slower, often more challenging, processes necessary for genuine autonomy and intellectual growth (Odell, 2019).
Yet if you wish to live a life that is genuinely yours, a life directed by your values, guided by your curiosity and sustained by your attention, this work is necessary. It is only through the deliberate cultivation of practices that protect your autonomy that you can resist the quiet colonisation of your life by systems that would prefer you remain a predictable source of engagement and data (Carr, 2020).
The question remains, and it is a question only you can answer through your actions rather than your declarations: are you living your life, or the algorithm’s?
Works Cited
Carr, N. (2020) The Shallows: What the Internet Is Doing to Our Brains. New York, W. W. Norton. Available at: https://www.wwnorton.com/books/9780393357820 (Accessed: 7 July 2025).
Gilbert, D. (2006) Stumbling on Happiness. New York, Alfred A. Knopf.
Han, B. C. (2017) Psychopolitics: Neoliberalism and New Technologies of Power. London, Verso. Available at: https://www.versobooks.com/products/2505-psychopolitics (Accessed: 7 July 2025).
Noble, S. U. (2018) Algorithms of Oppression: How Search Engines Reinforce Racism. New York, NYU Press. Available at: https://nyupress.org/9781479837243/algorithms-of-oppression/ (Accessed: 7 July 2025).
Odell, J. (2019) How to Do Nothing: Resisting the Attention Economy. New York, Melville House. Available at: https://www.mhpbooks.com/books/how-to-do-nothing/ (Accessed: 7 July 2025).
Pariser, E. (2011) The Filter Bubble: What the Internet Is Hiding from You. New York, Penguin Press. Available at: https://www.penguinrandomhouse.com/books/307190/the-filter-bubble-by-eli-pariser/ (Accessed: 7 July 2025).
Thaler, R. H. and Sunstein, C. R. (2008) Nudge: Improving Decisions About Health, Wealth, and Happiness. New Haven, Yale University Press. Available at: https://yalebooks.yale.edu/book/9780300122237/nudge/ (Accessed: 7 July 2025).
Wu, T. (2016) The Attention Merchants: The Epic Scramble to Get Inside Our Heads. New York, Knopf. Available at: https://www.penguinrandomhouse.com/books/317862/the-attention-merchants-by-tim-wu/ (Accessed: 7 July 2025).
Zuboff, S. (2019) The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York, PublicAffairs. Available at: https://www.publicaffairsbooks.com/titles/shoshana-zuboff/the-age-of-surveillance-capitalism/9781610395694/ (Accessed: 7 July 2025).