May 12, 2025

You’re all becoming powerful creators of technology, learning to build apps and even use Artificial Intelligence to solve problems in our communities. This ability to create is amazing, but it also comes with a big responsibility.

Today, we need to pause and think critically about the Ethics of the technology you are building. It’s not just about making an app that works, but making sure your app is good, fair, and truly helpful without causing unintended harm.

Lesson Topic: Ethics of Technology and AI

Section 1: What Do We Mean by “Ethics” in Technology?

Ethics are the principles of right and wrong that guide our choices and actions. When we talk about ethics in technology, we mean carefully considering the impact our creations have on people, communities, society, and the environment.

It involves asking questions like:

  • Is this technology truly beneficial?
  • Could it be used in harmful ways?
  • Is it fair to everyone it affects?
  • Are we respecting people’s rights and privacy?
  • What are the long-term consequences?

Think of it like inventing a new tool, maybe a machine to help farm faster. That sounds good! But ethical thinking means also asking: Will this machine take away jobs from farm workers? Does it require expensive fuel that farmers can’t afford? Does it pollute the environment? We need to look at the bigger picture.

Section 2: Why is Ethics Especially Important with AI?

Artificial Intelligence adds another layer of complexity because AI systems can learn and make decisions or predictions that have significant real-world consequences. Here are some key ethical areas for AI:

  • Bias Amplification: AI learns from the data we give it. If that data contains existing biases present in society (e.g., biases based on tribe, gender, location, income level, disability), the AI will learn those biases and might even make them worse!
    • Ugandan Example: Imagine an AI app designed to identify qualified candidates for a job. If it was trained mostly on data from past hires who were predominantly male and from urban areas, it might unfairly score down equally qualified female candidates or those from rural areas like ours in Jinja, simply because their profiles look different from the biased training data.
  • Fairness: Related to bias, we need to ensure the AI’s outcomes are fair and equitable across different groups.
  • Data Privacy: AI models, especially complex ones, often need vast amounts of data. This raises serious questions:
    • What data are you collecting from your users?
    • Is it sensitive information?
    • How are you getting informed consent (clear permission) to collect and use it?
    • How are you storing it securely to prevent breaches?
    • Are you complying with laws like Uganda’s Data Protection and Privacy Act, 2019?
  • Transparency (The “Black Box” Problem): Sometimes, it can be difficult to understand exactly why an AI made a specific prediction or decision. This lack of transparency makes it hard to identify errors, correct biases, or allow users to appeal unfair outcomes.
  • Accountability: Who is responsible if an AI makes a harmful mistake? The developers? The users? The organization deploying it? Clear lines of responsibility are needed.
  • Impact on Society: Consider potential impacts like job displacement if an AI automates tasks currently done by people, or the potential for AI to be used for surveillance or spreading misinformation.
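
To make the bias idea concrete, here is a tiny Python sketch of one way a team might check an AI screener for the hiring bias described above. The data, the field names (`region`, `approved`), and the 0.2 gap threshold are all invented for illustration:

```python
from collections import Counter

def selection_rates(records):
    """Compare how often candidates were approved, broken down by group."""
    approved, total = Counter(), Counter()
    for r in records:
        total[r["region"]] += 1
        if r["approved"]:
            approved[r["region"]] += 1
    return {group: approved[group] / total[group] for group in total}

# Hypothetical screening results, invented for illustration.
results = [
    {"region": "urban", "approved": True},
    {"region": "urban", "approved": True},
    {"region": "urban", "approved": False},
    {"region": "rural", "approved": True},
    {"region": "rural", "approved": False},
    {"region": "rural", "approved": False},
]

rates = selection_rates(results)
print(rates)  # urban ~0.67 vs rural ~0.33 -> a large gap worth investigating
if max(rates.values()) - min(rates.values()) > 0.2:
    print("Warning: approval rates differ a lot between groups - check for bias")
```

A gap like this does not prove bias by itself, but it tells you where to look — for example, at whether the training data underrepresented rural candidates.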

Section 3: Who Are Our Stakeholders? (Thinking Beyond the User)

When considering ethics, we need to think about everyone who might be affected by our project. These people and groups are called Stakeholders. They include:

  • Direct Users: The primary people who will use your app.
  • Indirect Users: People affected by the direct users’ actions (e.g., family members, customers of a business using your app).
  • The Community: How does your app impact local culture, the environment in Jinja/Uganda, the local economy, social connections?
  • Your Team: Ethical behaviour matters within the team too – respect, fair workload distribution, honest communication.
  • Partners: Any organizations you might collaborate with.
  • Non-Users: People who cannot use your app. Does your app create disadvantages for those without smartphones, internet access, electricity, digital literacy, or people with certain disabilities? This is the Digital Divide.
  • The Environment: Does your app encourage sustainable practices or could it indirectly lead to environmental harm (e.g., increased travel, resource consumption)?

Think broadly about all the potential ripples your project could create.

Section 4: Asking the Tough Questions (Ethical Checklist for Your App)

Now, let’s apply this thinking to your specific Technovation project. Discuss these questions honestly as a team:

1. Purpose & Impact:
  • What is the core positive change you hope to achieve?
  • Who benefits most?
  • What are potential negative consequences or ways your app could be misused? (Don’t ignore these!)

2. Fairness & Bias:
  • Could your app unintentionally disadvantage or discriminate against any group (based on tribe, gender, age, location, disability, income, etc.)?
  • If using AI: Is your training data diverse and representative of all the people you intend to serve in Uganda? How can you actively check for and reduce bias in your data and model?
  • Is your app accessible? Can people with visual impairments use it (e.g., with screen readers)? Is the language simple enough for those with lower literacy? Does it work reasonably well on less powerful phones or slower internet connections common here?

3. Data Privacy & Security:
  • What specific user data does your app really need to function? Collect only what is necessary.
  • How are you explaining data usage and getting clear, informed consent from users? Is it easy to understand?
  • How will you store user data securely? Who on your team has access?
  • Will you comply with Uganda’s Data Protection and Privacy Act, 2019?

4. Transparency & Accountability:
  • If using AI: Can you provide a basic explanation of how it works? Is it clear to users when AI is involved?
  • How can users report problems, give feedback, or challenge decisions made by your app (especially if AI is involved)? Who is responsible for addressing these issues?

5. Overall Impact:
  • Considering everything, do you believe your app will have a net positive impact on your stakeholders and community?

Section 5: Making Your App a Force for Good (Action Steps)

Based on your answers to the questions above, identify concrete steps your team can take during development to address potential ethical issues:

  • Mitigate Negative Impacts: Can you change a feature to prevent misuse? Add warnings or guidelines?
  • Reduce Bias: Actively seek out more diverse training data. Test your app with users from different backgrounds. Consider alternative approaches if bias seems unavoidable with AI.
  • Enhance Privacy: Collect less data. Implement stronger security. Make privacy settings clear and easy to use. Write a simple privacy policy.
  • Improve Accessibility: Use clear fonts and contrast. Test with accessibility tools if possible. Simplify language.
  • Increase Transparency: Explain how key features work within the app. Provide contact information for support or feedback.
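
The “Enhance Privacy” step can start in code. Below is a minimal, hypothetical sketch of data minimisation: keeping only the fields the app needs and replacing a phone number with a one-way hash. All field names and values are invented for illustration:

```python
import hashlib

def minimize_user_record(raw):
    """Keep only the fields the app actually needs, and replace the
    phone number with a one-way hash so the app can recognise a
    returning user without storing the number itself."""
    return {
        "user_id": hashlib.sha256(raw["phone"].encode()).hexdigest()[:12],
        "district": raw["district"],  # coarse location is enough
        "crop": raw["crop"],
        # deliberately dropped: name, exact GPS location, date of birth
    }

# Hypothetical signup data, invented for illustration.
raw = {
    "name": "Example Farmer",
    "phone": "+256700000000",
    "district": "Jinja",
    "gps": (0.4244, 33.2041),
    "date_of_birth": "2008-01-01",
    "crop": "maize",
}
stored = minimize_user_record(raw)
print(stored)  # no name, no GPS, no birth date - only what the app needs
```

Note that a plain hash of something short like a phone number can still be guessed by brute force, so a real system would add extra protection (such as a secret salt); the point here is simply to store less than you collect.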

Making ethical choices now is much easier than trying to fix problems after launch!

Section 6: Documenting Your Ethical Considerations

Thinking about ethics is crucial, and documenting your thoughts shows responsibility. In your Technovation submission (perhaps in your business plan or reflections), briefly discuss:

  • Potential ethical challenges you identified for your project.
  • Steps you took during design and development to address these challenges (e.g., how you tried to ensure fairness, protect privacy, gather diverse data).
  • How you plan to ensure your app has a positive impact.

This demonstrates maturity and a deep understanding of responsible technology development.

Section 7: Quick Review (Key Concepts)

  • Ethics (in Tech/AI): Thinking about the right/wrong impacts of technology on people and the world.
  • Stakeholders: Anyone affected by your project (users, community, etc.).
  • Bias (in AI): Unfair outcomes due to unrepresentative data or flawed algorithms.
  • Data Privacy: Protecting users’ personal information.
  • Informed Consent: Getting clear permission for data use.
  • Transparency: How easily the technology’s workings can be understood.
  • Accountability: Responsibility for the technology’s outcomes.
  • Digital Divide: Gap in access to technology/internet.

Conclusion

Mwebaale kwetegereza! (Thank you for paying close attention!) Considering the ethics of your project is not just an extra task; it’s fundamental to creating technology that truly serves humanity well. By thinking critically about the potential impacts, addressing biases, protecting users, and being transparent, you can build apps that you are not only proud of technically, but also ethically. Strive to make your technology a positive force for your community here in Jinja and beyond! Musigale Butebenkevu! (Stay Peaceful/Responsible!)

ETHICS AND ACTIONS

What is Ethics?

  • Moral principles about what is right and wrong
  • Important in the world of technology and AI
  • You do not want to cause harm, even if by accident
  • Your project must help people and society

What is Algorithmic Bias?

  • When a programmer causes their program to treat some users better than others,
  • or when an AI model is trained without using diverse samples,
  • creating unfair outcomes for some users
  • It may not be intentional, but it can still affect users negatively

Check out this video to learn more about algorithmic bias.

TECHNOLOGY’S NEGATIVE IMPACT

We interact with technology daily, often without fully understanding how it affects us, how we respond to it, or the impact it has on our mental well-being.

In October 2021, it came to light that Facebook (now Meta) was aware through its own internal research of the negative effects it was having on some users but chose not to take action to address the issue.

“Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram (owned by Facebook) made them feel worse,” the researchers reportedly wrote. Facebook also reportedly found that 14% of boys in the U.S. said Instagram made them feel worse about themselves.

Wall Street Journal – the Facebook Files

DO NO HARM

Following the Wall Street Journal article, Facebook whistleblowers have provided further information that Facebook:

  • promoted disinformation
  • ignored hate speech and illegal activity from certain users.

While you are just starting out and do not have the reach of Facebook or Instagram, you still need to think carefully about who will use your product and how they might be affected by it.

STAKEHOLDERS

As you develop your project, you need to consider your stakeholders. 

Stakeholders are people or entities that are affected by decisions or actions taken by your project. 

Direct Stakeholders

  • your users

Indirect Stakeholders

  • people or organizations that may be affected by your technology, but not necessarily directly. 

For example, TikTok’s users are its direct stakeholders.

But it has some indirect stakeholders, such as:

  • music artists whose music is played on TikTok
  • Influencers who benefit from being on TikTok
  • Schools, because students are influenced by what they view on TikTok

This video explains stakeholders in more detail, as well as other key points for making ethical technology.

Here are some things to think about as you develop your app. 

  • Does your app collect user data? 
  • Do your users know it? 
  • Do you have permission to collect the data? 
  • What steps are you taking to keep their data private? 
  • Is the advice or information you provide in your app accurate? 
  • How do you know?

[Infographic: developing a mobile app for good]

APP EXAMPLE

Imagine a mobile app called the Weed Detector that predicts if a plant is a weed or not. Let’s step through what you would consider regarding ethics with the Weed Detector app.


Removing bias

Make sure the dataset is representative of the objects being classified.

[Images: a desert weed and a water weed]

The data should represent the actual population where the app will be used. If the dataset is biased, the AI model might produce inaccurate results.

For example, if the dataset only includes images of weeds from desert environments, the app may not perform well when identifying plants near water. This is because it wasn’t trained on those types of plants.
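
A simple audit of the training data can reveal this kind of gap before the model is ever trained. Here is a minimal sketch — the environment tags and counts are invented for illustration:

```python
from collections import Counter

def audit_dataset(labels):
    """Report what share of the training images comes from each
    environment, so gaps (e.g., almost no water-side plants) are
    visible before training."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {env: counts[env] / total for env in counts}

# Hypothetical environment tags for a small weed-image dataset.
environments = ["desert"] * 180 + ["water"] * 20
shares = audit_dataset(environments)
print(shares)  # {'desert': 0.9, 'water': 0.1} -> water plants underrepresented
```

If the audit shows a skew like this, the fix is to gather more images from the missing environments before training, not after users start getting wrong answers.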

Making decisions

Consider every decision your app is programmed to make.

Are there any outcomes it might produce that you didn’t plan for?

For instance, what if the Weed Detector mistakenly labels a tomato plant as a weed? That could harm people relying on it for food and disrupt beneficial insects in the garden.

Can you tell which of these plants is a weed? What would you do if you weren’t sure?
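
One honest answer to “what would you do if you weren’t sure?” is to build that uncertainty into the app itself. Here is a minimal sketch, using a made-up stand-in for a real model and an invented 0.8 confidence threshold:

```python
def classify_with_caution(predict, image, threshold=0.8):
    """Only act on a prediction when the model is confident enough;
    otherwise ask the user instead of guessing."""
    label, confidence = predict(image)
    if confidence >= threshold:
        return label
    return "unsure - please confirm with a local expert"

# A stand-in for a real model: always 65% confident it sees a weed.
def fake_predict(image):
    return ("weed", 0.65)

print(classify_with_caution(fake_predict, "tomato_photo.jpg"))
# -> unsure - please confirm with a local expert
```

Refusing to answer is often the more ethical choice: an app that sometimes says “I’m not sure” is safer than one that confidently tells a farmer to pull up a tomato plant.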

Sharing sensitive data

The app might gather data about users’ home locations when used in their gardens, which could unintentionally reveal private information – and that’s a serious concern!

It’s important to think carefully about the data your app shares. Users need to trust that their information is handled respectfully and not misused. Imagine how you’d feel if someone you didn’t know had access to your personal secrets.
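
One common way to reduce this risk is to coarsen location data before storing it. Here is a minimal sketch (the coordinates are invented):

```python
def coarsen_location(lat, lon, decimals=1):
    """Round GPS coordinates before storing them. One decimal place of
    latitude is roughly an 11 km grid - enough to know the general
    area without revealing which garden or home a reading came from."""
    return (round(lat, decimals), round(lon, decimals))

# Precise (hypothetical) reading vs. what the app actually stores.
precise = (0.4478, 33.2026)
print(coarsen_location(*precise))  # (0.4, 33.2)
```

The app still learns enough to give locally relevant advice, but a data breach would reveal only a rough area, not anyone’s exact home.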

As developers, we bear responsibility for how our technology interacts with people and we should always be mindful of the potential impact our inventions can have.

ETHICAL JOURNAL

Ensure your app has only positive impact

Follow the instructions in the worksheet to:

  1. Identify who your stakeholders are, both direct and indirect.
  2. Follow the flowchart of questions, and check that you are doing everything you can to make your project have only positive impact.

REVIEW OF KEY TERMS

  • Ethics – a set of moral principles that affect how people decide what’s right or wrong
  • Bias – preconceived ideas somebody has that are often unfair to some people or groups
  • Stakeholders – people or entities that will affect or be affected by decisions or actions taken by an organization or business
