Ethics in Technology

6. Technology, Justice, and Social Equity

Tech in Education; Healthcare Access and Tech; Tech for Accessibility and Inclusion; Maslow’s Hierarchy of Needs and Tech; Digital Divide

Tech in Education

The integration of technology in education has transformed how students learn, educators teach, and institutions deliver knowledge. From digital textbooks and online learning platforms to adaptive learning software and virtual classrooms, technology can help bridge gaps in access to quality education, particularly for students in remote or underserved areas. However, the ethical considerations are complex: unequal access to devices and reliable internet can reinforce existing educational disparities, and the use of student data for algorithmic personalization raises questions about privacy, informed consent, and potential bias.
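To make the personalization concern concrete, here is a minimal, purely illustrative Python sketch of how an adaptive learning platform might assign difficulty. Every name, signal, and threshold below is invented for illustration; no real platform is being described. The point is that a seemingly neutral signal, such as response time, can silently encode unequal access:

```python
# Illustrative sketch only: a toy "adaptive difficulty" rule for a
# hypothetical learning platform. All names and numbers are invented.

from dataclasses import dataclass

@dataclass
class StudentSession:
    student_id: str
    avg_response_seconds: float   # mixes thinking time with network delay
    correct_rate: float           # fraction of answers correct

def next_difficulty(session: StudentSession) -> str:
    """Naive personalization: slow responses are read as low mastery.

    The flaw: avg_response_seconds conflates real thinking time with
    connectivity lag, so students on slow or shared internet can be
    routed to easier material regardless of actual ability.
    """
    if session.correct_rate > 0.8 and session.avg_response_seconds < 20:
        return "advanced"
    if session.correct_rate > 0.8:
        return "standard"   # penalized for slowness alone
    return "remedial"

# Two equally strong students, one on a slow rural connection:
rural = StudentSession("s1", avg_response_seconds=45.0, correct_rate=0.9)
urban = StudentSession("s2", avg_response_seconds=12.0, correct_rate=0.9)
print(next_difficulty(rural))   # standard  (held back by latency)
print(next_difficulty(urban))   # advanced
```

Here the rural student is routed away from advanced material purely because network lag inflates response time, one way personalization can reinforce, rather than bridge, existing disparities.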

Educators and policymakers must grapple with the responsibility to ensure that technology enhances learning equitably, rather than exacerbating divides. This involves not only providing hardware and connectivity but also supporting digital literacy, offering accessible content for students with disabilities, and critically evaluating the impact of educational technologies on student well-being and autonomy. Ultimately, the ethical deployment of technology in education requires ongoing reflection on who benefits, who may be left behind, and how to foster a more just and inclusive learning environment.

Healthcare Access and Tech

Technology has revolutionized healthcare delivery through telemedicine, electronic medical records (EMR), wearable health monitors, and AI-driven diagnostics. These advances can increase access to care for rural populations, streamline patient management, and enable earlier detection of disease. Yet, ethical challenges persist: not all patients have equal access to the internet or smart devices, and the digital skills needed to navigate modern healthcare tools are unevenly distributed.

There is also the risk that algorithmic decision-making in healthcare may reflect or amplify existing biases, leading to disparities in diagnosis or treatment. Protecting patient privacy and ensuring informed consent are paramount as more sensitive health data is collected and shared across digital platforms. Ethically, healthcare providers and technologists must work to ensure that technological innovation does not widen the gap between those who can and cannot access high-quality care, but instead promotes justice by making healthcare more inclusive, affordable, and responsive to the needs of all communities.
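As a hedged illustration of how such disparities can be detected, the sketch below computes a false-negative rate (missed diagnoses) per demographic group for a hypothetical diagnostic model. The records are invented toy data, not clinical results:

```python
# Illustrative sketch only: group-wise false-negative rates for a
# hypothetical diagnostic model. The records below are invented to
# show the arithmetic, not real clinical data.

from collections import defaultdict

# (group, truly_has_condition, model_flagged_condition)
records = [
    ("group_a", True, True), ("group_a", True, True),
    ("group_a", True, True), ("group_a", True, False),
    ("group_b", True, True), ("group_b", True, False),
    ("group_b", True, False), ("group_b", True, False),
]

missed = defaultdict(int)   # condition present but not flagged
total = defaultdict(int)    # condition present

for group, has_condition, flagged in records:
    if has_condition:
        total[group] += 1
        if not flagged:
            missed[group] += 1

for group in sorted(total):
    fnr = missed[group] / total[group]
    print(f"{group}: false-negative rate = {fnr:.0%}")
# group_a: false-negative rate = 25%
# group_b: false-negative rate = 75%
```

A model can look accurate overall while missing far more cases in one group than another, which is why group-wise error analysis is a common first step in auditing diagnostic algorithms.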

Tech for Accessibility and Inclusion

One of the most promising aspects of technology is its potential to empower individuals with disabilities and promote broader social inclusion. Assistive technologies – such as screen readers, voice recognition software, and adaptive hardware – can enable people with visual, auditory, motor, or cognitive impairments to participate more fully in education, employment, and civic life. The ethical imperative is to design technology that is accessible by default, not as an afterthought, and to involve people with disabilities in the design and evaluation process.

At the same time, barriers remain: not all digital content is accessible, and some emerging technologies (like AI-powered interfaces) may introduce new obstacles if not thoughtfully implemented. Promoting digital inclusion means addressing affordability, usability, and cultural relevance, while also challenging stereotypes and assumptions about disability. Ethically, the goal is to create a digital world where everyone can participate with dignity and autonomy, regardless of ability.

Maslow’s Hierarchy of Needs and Tech

Maslow’s Hierarchy of Needs provides a useful framework for considering the ethical implications of technology’s role in fulfilling human needs.

Figure 10: Maslow's Hierarchy of Needs

At the base level, technology can support physiological needs – such as food, water, and shelter – through innovations in agriculture, clean water delivery and clean air maintenance, and smart housing. Safety needs are addressed through security systems, health monitoring and health care, and emergency communication tools. As we move up the hierarchy, technology supports belonging and esteem through social media, online communities, and platforms for self-expression.

It has often been said that we now live in a time when all of the basic needs at the lowest levels of Maslow’s hierarchy could be assured for every person in the world. We already have the technology, for production, distribution and redistribution, monitoring of need, and all of the associated communication, to eradicate food insecurity (hunger, thirst, and starvation), meet clothing and housing needs (shelter and safety), assure clean air, land, and water (pollution removal), deliver health care (diagnostics and treatment), and provide the peace of mind and mental wellness that comes from having all of these other needs met. The technology is not what stands in the way of achieving this.

Rather, it is the insistence on preserving the status quo of man-made economic systems that favor one economic group to the detriment of all others. That insistence is the only thing preventing these achievements. It is not the tech that is lagging; it is the socioeconomic constructs that are the root cause preventing the ethical implementation of the tech.

Even worse, additional ethical tensions arise when technology is used in ways that exacerbate rather than address needs in Maslow’s hierarchy: for example, when farmers are paid not to produce food in order to keep prices at an artificial level, all while people go hungry. Or when drug companies are allowed to charge obscene prices for life-saving treatments, creating millionaire-class executives while poor people die without those treatments. Or when surveillance systems undermine privacy (a component of safety), or when social media algorithms foster isolation or harm self-esteem. The challenge for technologists and society is to ensure that digital tools are designed and deployed in ways that genuinely enhance human flourishing at every level of Maslow’s hierarchy, being mindful of unintended consequences and the needs of the most vulnerable.

Digital Divide

The digital divide refers to the gap between those who have ready access to computers and the internet and possess digital literacy, and those who do not. It is most readily recognized along lines of income, geography, age, and ability. This divide often limits opportunities for education, employment, healthcare, and civic participation, reinforcing cycles of disadvantage. Ethically, bridging the digital divide is not just a matter of providing hardware or connectivity, but also of addressing affordability, digital skills, and culturally relevant content.

Consider this hypothetical case:

A 24-hour convenience store in New York has been robbed multiple times in the last several years. Fortunately, no employees were ever physically harmed, although most have experienced some amount of psychological trauma and many have subsequently quit. In fact, it is very hard to find employees willing to work there because of the continuing threat of being robbed at knife- or gunpoint.

So, the business owner decides that an ideal solution is simply to stop accepting cash. Instead, the owner posts signs that say, “No CASH accepted and No CASH on premises” and “Credit or Debit Cards only accepted.” The owner tells the local news team that they decided to go this route because, “… nobody ever robbed a store and asked them to hand over all of their credit card receipts! The cash is the problem.” The owner says the technology will allow them to have a safe store once again, and this will allow the store to stay in business.

But there is one problem with this plan… in 2020, New York passed a law making it illegal for a business to refuse to accept cash.

The law stated that its aim was to protect the rights of the ‘unbanked’ and ‘underbanked’ populations. As part of its assertion, the state used a picture of a $1 bill, noting the inscription found on every bill of US currency:

Figure 11: US $1 bill highlighting 'legal tender' phrase

The state argues that it is illegal for anyone to refuse cash. The currency itself, the state points out, says “all debts, public and private”, not just the debts that a business chooses to accept.

Note that there have actually been multiple legal cases on this very topic, and not just in New York. But for the moment, let’s not focus on the legal aspects of the case; instead, let’s think about its ethical aspects.

If we look at the situation through the store owner’s perspective and lenses, we can potentially see the following:

  • The owner is trying to make a living and provide for their family while providing goods for their community at reasonable prices. (Maslow’s hierarchy level 1)
  • The owner needs workers to supplement their own work and keep the store open 24 hours, as there are many customers in the neighborhood who work all three shifts. (Maslow’s hierarchy level 1)
  • The workers need to feel safe, and the risk of being robbed for cash prevents this. (Maslow’s hierarchy level 2)
  • A tech solution (cashless payments) exists that addresses the needs just mentioned.
  • The owner wants everyone to have a cashless payment option and says it is the banks’ responsibility to give a card to anyone who has cash, letting the banks take on the exclusive risk of being robbed for that cash.

But now, let’s look at it from the bank’s perspective and lenses:

  • The bank will only allow a new account to be opened if there is a minimum opening deposit, a minimum maintained balance, and/or a recurring direct deposit.
  • The bank claims that this is necessary because the cost of maintaining an account requires these balances.
  • Meanwhile, the bank has reported that, once again this year, it has earned record profits and its executives are receiving multi-million-dollar bonuses.
  • The bank points to the statement emblazoned on the currency itself and declares: the cash is already there, so it’s not our problem.

Finally, let’s look at the situation from one particular customer’s perspective and lenses:

  • I was laid off from my previous job and I am currently freelancing odd jobs just to get by.
  • The jobs I can get all pay me in cash.
  • I had to close my bank account because I couldn’t maintain a minimum balance and I no longer have direct deposit. Since I had to file for bankruptcy, I can’t get a credit card anymore.
  • I am just barely getting by – literally – and pretty much all that I have to my name is right here in my pocket.
  • I just need to get some food before I drive over to the park to crash for a few hours in my car.

As we look at this kind of situation through the various lenses of the different individuals involved, it quickly becomes obvious that the issues causing concern are not the tech. Rather, the issues surround the fact that the socioeconomic and legal systems have not kept pace with the changes brought about by tech advancements. And rather than focusing on the ethical considerations of the situation, our current society tends to put greater focus on the legal considerations, which too often foster adherence to the status quo.

And it is this perpetuation of the status quo that continues to exacerbate the socioeconomic and technological divides, feeding an ever-widening and unsustainable downward spiral.

Efforts to close the digital divide must be intentional and sustained, involving collaboration among governments, the private sector, educators, and community organizations. There is also an ethical obligation to consider the environmental and social impacts of technology deployment, ensuring that solutions are sustainable and respect the needs and voices of marginalized communities. In a world increasingly shaped by digital technology, promoting justice and social equity means ensuring that everyone has the opportunity to participate fully and fairly in the digital society.

Textbook Definitions – Tech, Justice and Social Equity

  • Algorithmic personalization – The use of computer algorithms to tailor digital content, services, or experiences to individual users based on their data and behaviors, raising ethical questions about fairness, autonomy, and the reinforcement of social inequalities.
  • Privacy – The right of individuals to control access to their personal information and data, particularly regarding how it is collected, used, and shared by technology platforms.
  • Informed consent – The process by which individuals are provided with clear, understandable information about how their data will be used by technology systems, enabling them to make voluntary and knowledgeable decisions about their participation.
  • Digital Divide – The gap between those who have ready access to computers and the internet and possess digital literacy, and those who do not. The digital divide is most readily recognized along socioeconomic lines.
  • Bias – Systematic and unfair discrimination that can be embedded in technological systems, such as algorithms, which may perpetuate or amplify existing social inequalities and injustices.
  • Digital literacy – The ability to critically understand, evaluate, and effectively use digital technologies, which is essential for individuals to navigate, question, and challenge the impacts of technology on justice and social equity.
  • Autonomy – The capacity for individuals to make self-directed, informed choices in digital environments, which can be threatened by technologies that manipulate or constrain decision-making without transparency or consent.
  • Telemedicine – The remote delivery of healthcare services and clinical information using telecommunications technology, which expands access to care but raises ethical concerns about patient privacy, confidentiality, and the quality of the patient-provider relationship.
  • Electronic medical records (EMR) – Digital versions of patients’ medical histories maintained by healthcare providers, designed to improve care coordination and efficiency while presenting challenges related to data security, privacy, and equitable access.
  • Wearable health monitors – Technology-enabled devices worn on or in the body that continuously collect and transmit health data, offering opportunities for proactive health management but also raising issues of data privacy, consent, and potential disparities in access.
  • AI-driven diagnostics – The use of AI systems to analyze medical data and assist in diagnosing health conditions, which can enhance diagnostic accuracy and efficiency but may also introduce algorithmic bias and lack transparency and accountability.
  • Patient privacy – The ethical and legal obligation to protect individuals’ health information from unauthorized access or disclosure, a critical concern heightened by the use of digital health technologies such as telemedicine, wearable health monitors and EMRs.
  • Screen readers – Software applications that convert digital text into synthesized speech or braille, enabling people with visual impairments to access and interact with digital content and promoting greater accessibility and inclusion.
  • Voice recognition – Technology that interprets and processes spoken language, allowing users – especially those with mobility or dexterity challenges – to control devices and input information hands-free, thereby enhancing digital accessibility.
  • Adaptive hardware – Specialized physical devices designed to accommodate the needs of individuals with disabilities, such as modified keyboards or alternative input devices, which help remove barriers and foster inclusive participation in technology use.

CC BY NC SA