Unit 3: Individual and Social Rights, Choices and Technology

Last update: 2 April 2024

Key Topics

  • Understanding the concepts of rights: social rights and individual rights
  • The role of technology in prioritizing social and individual rights, and choices
  • Technology as a potential platform for intercultural dialogue on social rights, social justice and social movements, as well as for potential surveillance, control and manipulation
  • Media and Information Literacy competencies and digital skills for individual and social rights, and choices
  • Technological determinism and agenda-setting theories as determinants of social rights and individual choices

Learning Objectives

After completing this unit, educators and learners should be able to:

  • Understand the role of technology in prioritizing social and individual rights, and choices
  • Understand the importance of technology and its owners for negotiating rights and choices
  • Describe the media and information literacy competencies needed in making choices
  • Understand how technology in the context of its controllers can sensitize or polarize citizens based on religious sentiments, financial gain, cultural affiliations, gender inequalities and political affiliations
  • Understand the role of technology and its controllers in democracy especially as it relates to political institutions, political choices, electoral processes, voting, accountability in governance and transparency in society

AI and Content Sharing

The concept of technological determinism implies that digital means of communication influence whose voices and what content appear prominently on platforms. But this concept reifies technology as if it were a thing in and of itself: it ignores that all technology has social roots, and it creates a blind spot around the fact that particular business purposes, inter alia, shape the development and deployment of technologies. Control over architecture and engineering decisions informs how a technology is designed, such as what can be shared within or outside of a walled-garden platform, and what data gathering, storage and use cases are operational.

To understand who and what content are shared on platforms, therefore, requires understanding why platforms develop particular curational objectives and content moderation strategies and practices. It is these which underpin the computer algorithms powered by Artificial Intelligence (AI), which in turn wield power over how content is ranked and otherwise treated (e.g. blocked from upload, deleted, labelled, referred to human moderators, or sent to fact-checkers). The consequences can affect what people see as items on the agenda, influence their individual choices and consequently their rights, and shape public opinion. Computer algorithms have been known to change citizens' online experiences and the factors that inform their decisions, and can thereby change public opinion and perceptions over time. The impact may also be on individual identities: their sense and value of who they are, and indeed of what they are or should be becoming. This is especially relevant to young people who are still in the process of consolidating their identities.
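
For educators who wish to make the idea of algorithmic treatment more concrete, the minimal sketch below imagines how ranking and moderation decisions of the kind listed above might be encoded. The post attributes, score names and thresholds are entirely hypothetical assumptions for illustration, not a description of any real platform's system.

```python
# Illustrative sketch only: a toy content-treatment pipeline loosely modelling the
# kinds of decisions described in the text. All attributes, thresholds and labels
# are hypothetical assumptions, not drawn from any real platform's policy or API.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    toxicity: float          # hypothetical model score, 0.0-1.0
    misinfo_risk: float      # hypothetical model score, 0.0-1.0
    engagement_score: float  # predicted engagement, used for ranking

def decide_treatment(post: Post) -> str:
    """Return one possible treatment, mirroring the options named in the text:
    blocking, deletion/labelling, referral to human moderators, or fact-checking."""
    if post.toxicity > 0.95:
        return "block upload"              # never appears on the platform
    if post.toxicity > 0.8:
        return "refer to human moderator"  # ambiguous cases get human review
    if post.misinfo_risk > 0.7:
        return "send to fact-checkers"     # flagged claims are checked externally
    if post.misinfo_risk > 0.4:
        return "apply warning label"       # shown, but with a label attached
    return "rank normally"                 # eligible for full algorithmic ranking

def rank_feed(posts: list[Post]) -> list[Post]:
    """Rank only the posts that survive moderation, by predicted engagement."""
    visible = [p for p in posts if decide_treatment(p) == "rank normally"]
    return sorted(visible, key=lambda p: p.engagement_score, reverse=True)
```

Even in this toy form, the sketch shows where editorial power sits: whoever chooses the scores, thresholds and ranking criterion decides what is seen, in what order, and what never appears at all.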

The phenomenon known as the "filter bubble" has given some insight into online experiences. It highlights how, by design, algorithms can filter out content that does not reinforce existing preferences, tastes and information habits. The result is a closed universe in which individual biases go unchallenged because the system has shut out other narratives. Some research, such as by the Reuters Institute for the Study of Journalism, suggests that many people are actually exposed to a greater diversity of content than the "filter bubble" model implies. At the same time, another concept, the "echo chamber", suggests that even with diverse content people can still occupy a narrow interpretative community. In this concept, individuals may indeed be exposed to information that in and of itself contradicts their beliefs and assumptions, but this information is discounted in terms of its significance. This is because it is placed within the context of cumulative, repeated social "echoes" which precede it and which provide a sense of security that familiar framings and meanings remain intact.

Algorithmic filter bubbles can reinforce "echo chambers", although the latter can exist on their own. In extreme cases, wholly separate and parallel universes of meaning may result, wherein different (and relatively closed) communities operate with different facts and falsehoods on politics, health, climate change, etc., and with different narratives about reality more broadly.
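
As a purely illustrative aid, the toy simulation below sketches the reinforcement dynamic behind the "filter bubble" idea: a feed that weights topics by past clicks, paired with a simulated user who favours one topic, gradually narrows exposure. The topics, probabilities and weighting scheme are assumptions chosen for clarity, not empirical parameters.

```python
# A deliberately simplified simulation of the "filter bubble" dynamic described above.
# The feed recommends topics in proportion to past clicks, and the simulated user is
# more likely to click topics they already prefer; over many rounds the feed narrows.
# All numbers are illustrative assumptions, not empirical parameters.
import random
from collections import Counter

TOPICS = ["politics", "health", "climate", "sports", "culture"]

def recommend(history: Counter, k: int = 5) -> list[str]:
    """Weight each topic by 1 + past clicks, so past behaviour is reinforced."""
    weights = [1 + history[t] for t in TOPICS]
    return random.choices(TOPICS, weights=weights, k=k)

def simulate(rounds: int = 200, preferred: str = "politics") -> Counter:
    history: Counter = Counter()
    for _ in range(rounds):
        for topic in recommend(history):
            # The user clicks the preferred topic 80% of the time, others 20%.
            if random.random() < (0.8 if topic == preferred else 0.2):
                history[topic] += 1
    return history

if __name__ == "__main__":
    random.seed(1)
    print(simulate())  # clicks concentrate heavily on the preferred topic over time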

Pedagogical Approaches and Activities

As discussed earlier in this Curriculum (Part 1), various pedagogical approaches are possible. Please review the list in Part 1 and decide which approach to apply to the suggested activities below and others that you may formulate.

  • Watch the popular TED Talk "Beware Online Filter Bubbles" by Eli Pariser. The video was prepared close to 10 years ago. Guide the discussion: How relevant is this video today? Ask educators to make a list of the issues raised. Small groups can do research to see more recent discourse on these issues. Does this phenomenon affect how social rights are prioritized, individual choices are made and public opinion is formed? Can AI-driven algorithms alter narratives and thus set the social, development and political agenda?
  • Discuss Eli Pariser's comparison between the editorial processes for validation in traditional media (the 'broadcast society') and the algorithmically controlled flow of information in digital communications. Do you agree with his stance? Why or why not? What new arguments and evidence can the educators or learners add? Do the research needed to underpin your arguments and ideas.
  • The paradox of digital communications is that what appears harmful to one person can turn out to be an advantage to others, or vice versa. The algorithms that influence content ranking and targeting can cause objective harm to human rights. At the same time, they could, whether by design or as an unintended effect, also bring to light voices that are usually silenced or repressed. For instance, issues such as gender inequalities, the underrepresentation of women, gender-based violence, human trafficking and racial discrimination, which have long been underreported and invisible, have now become more prominent in global discourse. Through technology-mediated engagement with other cultures and new information, some persons have changed their stance on certain traditionally anchored practices that harm the persons involved, such as female genital mutilation (FGM). Do you think engagement with technology services in this regard has given impetus to social rights and to a shift in social opinion and beliefs? What other factors are at work, other than the technologies themselves? How can you put your MIL competencies to work in this discussion?
  • Put educators or learners in groups to research existing case studies about the positive and negative use of technology to advance or hinder the right to association, the right to freedom of religion, the right to freedom of opinion, and the right to freedom of expression. Organize a series of presentations and discussions. Focus the research analysis and discussion on individual choices in questioning historical fundamental assumptions, rejecting questionable belief systems, affirming natural identity, negotiating cultural values and realigning value systems, all of which have become increasingly feasible as a result of technology.
  • MIL advocates have argued for media and information literacy education that emphasizes intercultural dialogue as a necessary set of skills for balancing the two sides of the coin of AI-driven algorithmic platforms. Explore the following questions in the context of individual and social rights, in the light of citizens' choices and enabling technology:
    • What is your understanding of one's rights? Carry out interviews, discuss, and collate divergent views of people's understanding of different rights (individual and social) and how these rights influence their perception of public opinion. Capture, curate and share the best short video clips on the topic.
    • What are the different social rights people are confronted with in the 21st century? Which content providers make them aware of such social rights? Consider the contrasting social rights in today's digital environment.
    • In light of how institutions can act to shape society, do you think the result has enabled social rights and individual choices? Make a deliberate effort to interview different people to learn their perceptions of technology-enabled social rights and the implications for individual choices.
  • Consider the media and information literacy competencies needed for the peaceful coexistence and tolerance necessary to balance private views and individual choices with contemporary social rights. You may employ focus group discussions with different groups of people in order to come up with innovative MIL practices.
  • Discuss the need for measures that would enable digital communications companies to provide equitable access to online information and narratives, with a view to enhancing citizens' online experiences and addressing the challenges associated with AI-driven algorithmic platforms, such as the "filter bubble" and the amplification of potentially harmful content.
  • Consider how some Internet companies are undermining individual rights and choices through the engineering design and architecture of their services. Discuss and proffer practicable solutions to these problems.
  • Discuss how MIL education can influence the quality of information that citizens access through online search engines, and the impact of MIL on citizens' rights and choices.

Suggested Assessment & Recommendations

  • Develop a skills matrix and competency evaluation guide to determine the MIL skills needed in balancing individual rights and choices in the digital environment
  • Draft a legislative bill that your political representatives could consider on human rights in digital contexts
  • Design and launch a small survey on people's understanding of the concept of different types of rights in a democratic setting and the influence of current Internet communications companies on these rights
  • Carry out a focus group discussion and interviews on citizens' rights in today's digital environment

Topics for Further Consideration

  • Principles of individual choices
  • Individual choices vs collective choices
  • Ethical use of technology
  • Future of digital elections