Creating ethical software, Part 2
Software engineers are often Frankenstein when they should be Goodall
Have you read the previous blog yet? Creating ethical software, Part 1
Alex waits patiently while the team looks at each other anxiously. People are eyeing the Product Owner especially. After convincing the Public API team to commit to implementing many drastic changes to both their code base and their processes, Alex has been visiting the other software teams in the Etrain company and asking them to consider making changes as well. A renewed focus on security, privacy and compliance in a bottom-up manner will not only help the quality of the software, but also help foster a more professional and even inspiring work environment. At least, that’s what Alex has been repeating, over and over.
Finally, after another hour filled with both awkward silences and harsh words, the team agrees with Alex’s suggestions as well. Alex leaves but doesn’t feel very satisfied. In fact, it is anxiety that dominates.
“To be fair, working in multiple teams for the past months, and now taking on this new level of responsibility, convincing an entire company of changing for the better… it’s been a lot. Maybe I’ve overstepped a bit,” Alex admits to Kim, the Etrain confidant. “Maybe I am taking this whole thing more personally and seriously than I should. Maybe I am crusading when I should focus on being an engineer.”
Kim nods. “That’s possible, I guess. But maybe you’re doing the right thing, you’re just trying to do too much of it. My advice would be to delay your crusade and take a leave for a week or two. When you return, you’ll know how you feel about all this.”
“But if I leave, all the teams I am a part of won’t be able to function!” Alex objects defensively.
Kim shakes her head in response. “If that’s really true, then it is even more important that you take a leave. I urge you to take my advice. If needed, I’ll discuss it with your various managers.”
Hard and soft ethics
Last time, we discussed the… let’s call them the hard aspects of ethical software engineering: the ones few people will argue with. Secure, legal software that takes privacy and standards into account is preferable to software that fails to do all that. If you word your arguments carefully, even the customer you are developing the software for will agree that those make for better software. But now I am approaching topics where agreement is less common. The soft aspects. The parts where your management – even reasonable management – may disagree with my take. We are going to discuss what I personally believe is necessary for software development to be ethical.
Now, it is virtually impossible to discuss ethical software engineering without drawing in aspects of philosophy, politics, and morality. I will try to accomplish the impossible in the next few paragraphs anyway. Still, get ready for some hot takes.
We will explore several ethical considerations that I feel all software engineers should embrace when developing applications. As a software engineer, you have responsibilities to your users, to fellow programmers, and even to mankind. We will discuss addictiveness, inclusivity and bias, transparency, transferability, cultural impact, and finally, environmental impact. That’s a lot of discussion points, so let’s get started.
Your responsibility to your users
I aim to create applications that users love. Usability is the hallmark of great UX design and we’ve discussed it before. But here's the flip side: sometimes, this drive is taken to the extreme, and we’re making apps that people become obsessed with. That's where ethics come into play, especially in the world of digital platforms like social media. I refer to this as addictive design. Addiction is a net negative, even if it’s making your company money.
I think it’s not controversial of me to point out the risks that exist when companies profit from addictive design. Apps that try to trigger user outrage, confusion, anxiety, and even depression to increase engagement are bad. So, as developers, we need to find out: are we creating a lovable product, or are we building software that destroys users' lives? Software engineers should often be asking themselves some essential questions:
- Who benefits from my software?
- How do they benefit? Am I giving them superpowers or just a little boost?
- To what extent do they benefit? Am I transforming lives or just adding a touch of magic?
- Do I have safeguards to protect user well-being and sanity?
- How transparent are our monetization and data collection practices, including the use of AI and machine learning?
Companies like Netflix and Nintendo are putting effort into designing their products with (some of) these considerations in mind. Netflix has the famous “Are you still watching?” check, and Nintendo has a similar “Take a break!” pop-up. But I don’t think that’s enough.
So, let's talk about usage limits—the measures to prevent addiction. On one side, we have the consistency champion, pushing users to meet daily goals and nudging them toward success. Think of it as a personal coach who knows when to let you go. On the other side, there's the irresistible temptress, offering a never-ending stream of captivating content. It's like an endless buffet you can't resist!
Transparency is crucial. Users need to understand what's happening behind the scenes. So, here's a concrete action plan when you discover addictive design in your app:
- Evaluate the benefits: Assess who benefits from your app and how. Strive for a healthy balance of empowerment without addiction.
- Set sensible limits: Implement usage boundaries that promote well-being. Encourage healthy engagement rather than dependency.
- Be transparent: Clearly communicate your monetization methods and data collection practices. Let users see what's happening.
- Keep improving: Continuously evaluate and enhance your app's design. Aim for an experience users love while prioritizing their well-being.
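As a sketch of the “set sensible limits” step, here is what a minimal session watcher could look like in Python. The `SessionWatcher` name and the 45-minute threshold are hypothetical, not taken from any real framework; a production version would also persist state across sessions and devices:

```python
from datetime import datetime, timedelta

class SessionWatcher:
    """Tracks how long a session has run and suggests a break past a limit."""

    def __init__(self, limit_minutes=45):
        self.limit = timedelta(minutes=limit_minutes)
        self.started_at = datetime.now()

    def should_prompt_break(self, now=None):
        """Return True once the session exceeds the configured limit."""
        now = now or datetime.now()
        return now - self.started_at >= self.limit

watcher = SessionWatcher(limit_minutes=45)
later = watcher.started_at + timedelta(minutes=50)
print(watcher.should_prompt_break(now=later))  # session over the limit -> True
```

The interesting design choice is not the timer itself but what you do when it fires: a dismissible nudge respects the user, while a hard lockout may just push them to a competitor.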
Another crucial ethical issue we face regarding our users is bias. It's like an invisible virus that can sneak into our systems if we're not vigilant. Computers themselves have no moral compass; they can only learn from the data and training we provide. Therefore, as developers and data scientists, it's our duty to scrub bias from both the training data and the algorithms we build.
Bias in software systems can perpetuate systemic racism and create disparities against specific populations. It's a lose-lose situation: lost opportunities, compromised medical care, or even alienation: a lack of user-experienced inclusiveness causing users to start disliking or even abandoning your application. It’s like quiet quitting. Or maybe even like regular quitting, sometimes. In a way, it’s the opposite of addictive design.
Let’s take AI for example. It’s not for nothing that both Microsoft and Google explicitly state that bias and inclusiveness are imperative design goals in their Responsible AI Guidelines and Responsible AI Practices respectively. Bias exists, and like async code, it spreads like an oil spill once it gets in. To combat bias, you must be intentional in your work. Ask critical questions about how requirements were determined, how data was collected, and which assumptions have been made. If you're unaware of any of the conditions surrounding your data, your software could inadvertently perpetuate biases.
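Being intentional can start with something as simple as comparing outcome rates across groups. Here is a minimal sketch; the 0.8 threshold borrows from the “four-fifths rule” heuristic sometimes used as a first screening check, and the data and function names are made up for illustration:

```python
def selection_rates(outcomes):
    """outcomes: dict mapping group name -> list of 0/1 decisions."""
    return {group: sum(vals) / len(vals) for group, vals in outcomes.items()}

def disparate_impact_ratio(outcomes):
    """Ratio of the lowest group selection rate to the highest (1.0 = parity)."""
    rates = selection_rates(outcomes)
    return min(rates.values()) / max(rates.values())

# Hypothetical approval decisions for two groups of applicants.
decisions = {
    "group_a": [1, 1, 1, 0, 1, 1, 0, 1],  # 75% approved
    "group_b": [1, 0, 0, 0, 1, 0, 0, 1],  # 37.5% approved
}

ratio = disparate_impact_ratio(decisions)
print(f"{ratio:.2f}")  # 0.50 -- well below the common 0.8 heuristic
```

A check like this proves nothing on its own, but a ratio this low is exactly the kind of signal that should trigger those critical questions about requirements, data collection, and assumptions.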
Eliminating bias and centralizing inclusivity in software isn't just a noble pursuit; it's an ethical imperative. I argue that a quality software engineer embraces this challenge and ensures that their software serves all users equitably, regardless of their background. You owe it to your users.
Your responsibility to other programmers
What better way to help your fellow programmers than by letting them help you? Open-source software refers to software released under a license that allows anyone to access, modify, and distribute the source code freely. It is characterized by its transparency and collaborative nature, as the source code is openly available for anyone to view and contribute to. This stands in contrast to proprietary software, where the source code is kept private and controlled by its creators.
Open-source software encourages community involvement, enabling developers from around the world to collaborate, improve, and customize the software according to their specific needs. This collective effort often leads to rapid innovation, enhanced security, and increased stability of the software. And potentially, that goes not just for your software, but for all software. It’s like a hive mind.
However, the open nature of the code also presents challenges that software engineers must navigate. One challenge is ensuring the security and integrity of open-source components. While the collaborative nature of open-source development can lead to rapid bug fixes and improvements, it also exposes the code to potential exploitation and security risks. Software engineers need to carefully vet and select open-source components, regularly update them, and actively contribute back to the community to maintain a healthy ecosystem.
For the second challenge, let’s again look at enabling other developers. That hive mind I was talking about sounds great, but in practice, it’s not how human society works. Society is competitive by design. Open-source software can give your competitor a way to look at your code and copy it. Take, for example, Twitter’s open-sourcing of part of their code. The fact that competitors sprang out of the ground not long after cannot be a coincidence.
Open-source can also introduce licensing complexities. Software engineers need to be aware of the different open-source licenses, their requirements, and how they may impact the distribution and use of their applications. By adhering to open-source best practices, such as providing proper attributions and complying with license obligations, engineers can ensure legal and ethical compliance.
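As a starting point for managing license obligations, you could inventory the licenses of your installed Python dependencies. This sketch uses the standard-library `importlib.metadata`; the deny list is purely illustrative (not legal advice), and a real audit would also need to check license classifiers and transitive dependencies with dedicated tooling:

```python
from importlib.metadata import distributions

# Illustrative example: licenses your product hypothetically cannot ship with.
DENY = {"GPL-3.0", "AGPL-3.0"}

def license_inventory():
    """Map each installed distribution's name to its declared license string."""
    inventory = {}
    for dist in distributions():
        name = dist.metadata.get("Name", "unknown")
        license_ = dist.metadata.get("License") or "unknown"
        inventory[name] = license_
    return inventory

def flag_denied(inventory):
    """Return only the packages whose declared license is on the deny list."""
    return {name: lic for name, lic in inventory.items() if lic in DENY}

inv = license_inventory()
print(f"{len(inv)} packages inventoried; flagged: {flag_denied(inv)}")
```

Running something like this in CI turns license compliance from an afterthought into a cheap, automated gate.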
Another very important responsibility you have for your fellow software developers is transparency. Transparency is a fundamental principle that should guide software engineering practices. If we’re talking about transparency regarding our users, we’re often talking about compliance and privacy. But regarding other programmers, transparency is a very different beast. Transparency now refers to software that makes it possible for other (new) programmers to learn about the choices made during its design and implementation. Without insight into those choices, it becomes impossible for programmers who were not there when the decisions were made to ever take actual, full ownership of your software. And ownership – with the accountability that comes with it – is, as should be obvious by now, key to ethical software design.
Similar to transparency is transferability. It refers to the ability of software engineers other than you to be able to continue building and maintaining your software in your absence. I’ll admit that, as a consultant, this concern tends to come up in my line of work more often than not, but it’s very important for anyone who plans to eventually change jobs, retire, and/or remain mortal. In practice, you can increase transferability by… well, by following all my previous recommendations, actually! Write testable code. Write code with clear intent. Write tests. Write documentation. By making your code easily understandable and maintainable, you enable other software engineers to seamlessly step in and continue working on the project.
Furthermore, consider using well-established frameworks, libraries, and open-source technologies. By leveraging widely adopted and supported tools, you reduce the risk of reliance on proprietary or niche solutions that may become difficult to transfer or maintain in the future. Open-source software promotes transferability by allowing the community to contribute and ensure its longevity.
Lastly, foster a collaborative and inclusive development culture within your team and/or company. Encourage code reviews, pair programming, and knowledge sharing among team members. By fostering a culture of collective ownership and shared understanding, you create an environment where knowledge is distributed, and software engineering skills are transferable.
Your responsibility to mankind
Finally, we reach the most Sesame Street-ish part of this blog. Heal the World, do it for the children, a better world starts with you, all that good stuff. Before I dive into it, let me paint you a picture of a very, very idealistic company and how they treat software development.
Often, the success of a development team is measured solely by its rate of feature development, and the ethical implications of a particular implementation may not be at the forefront of anyone's mind. Butterfly Inc., therefore, sees it as imperative that they set the tone for ethical standards in their software. Ethical priorities are integrated throughout the software lifecycle, from design to operation.
Training staff on ethical choices is essential. This includes educating developers, architects, testers, and other team members about data management practices that comply with regulations and user expectations. Developers may not be aware of the latest legislative actions in the jurisdictions where users make use of their software. It's the responsibility of the business to ensure that developers are well-informed.
To avoid ethical shortcomings, collaboration between engineering leadership and legal teams is treated as vital. For example, businesses should prioritize users' personal data access and retention. Implementing data access controls and logging mechanisms during the software development phase is crucial. It's common for developers, who focus on creating a functional and user-friendly product, to view data access restrictions as the responsibility of another team. However, it's essential to embed data protection as a fundamental feature in the software design, inherently safeguarding against unauthorized access.
Basically, the company is aware that its software has a profound cultural impact. It shapes the way people communicate, access information, and interact with the world. They know they must ensure their software respects different cultures, languages, and perspectives. They know they may intentionally or inadvertently upend entire businesses with overzealous automation. The company is fully accountable for all its software decisions, even if it will cost them all their profits. Therefore, they ask their software engineers to be extremely mindful of their impact, and each new feature request starts with a moment of reflection.
Does that sound overly idealistic? Of course. A company would never think like that, right? Unfortunately, if they’re not going to do it, I reckon we need to.
Software engineers are among the few people who have even a slight chance of predicting the cultural impact of their software. I am not saying that you need to be the one who’ll be sued or vilified when things go wrong, but I am saying that you are in a unique position to escalate potential ethical problems before they leave the design phase. Good luck.
Finally, let’s consider another kind of impact: environmental impact. Software engineers have a responsibility to consider the environmental impact of the apps they build. From energy consumption to electronic waste, the carbon footprint of digital products can be significant.
Implementing LEAN principles in software development can significantly reduce waste and lessen environmental impact. The concept of LEAN originated in the manufacturing industry and was developed by the Toyota Motor Corporation in Japan. By focusing on delivering value to users and eliminating non-essential activities, LEAN practices streamline the development process. This results in shorter development cycles, reduced resource consumption, and minimized waste. With LEAN, software engineers can optimize workflows, improve efficiency, and ultimately contribute to a more sustainable software development ecosystem. As Toyota might say: LEAN is like driving an electric car instead of a diesel.
Your code choices also have an environmental footprint. Optimizing code for performance and energy efficiency can significantly reduce the environmental impact of applications. This includes practices such as minimizing unnecessary computations, optimizing data storage and retrieval, and adopting energy-efficient coding patterns. Furthermore, engineers can leverage cloud-based infrastructure to scale resources dynamically, enabling better utilization of computing power and reducing energy waste. It’s like taking the car only when taking a bike is not feasible.
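A concrete example of “minimizing unnecessary computations” is caching a pure, expensive function so repeated calls reuse the first result instead of burning CPU again. The `expensive_report` function below is a stand-in for real work, and the call counter exists only to make the effect visible:

```python
from functools import lru_cache

CALLS = {"count": 0}  # tracks how often the real computation actually runs

@lru_cache(maxsize=None)
def expensive_report(month):
    """Stand-in for a heavy, deterministic computation keyed by month."""
    CALLS["count"] += 1
    return sum(i * i for i in range(10_000))

for _ in range(100):
    expensive_report("2024-01")  # 99 of these 100 calls hit the cache

print(CALLS["count"])  # 1 -- the computation ran only once
```

The same idea scales up: materialized views, CDN caches, and precomputed aggregates are all the cached function pattern applied at architecture level, and all of them trade a little storage for a lot of saved energy.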
And finally, cloud computing also plays a pivotal role in reducing the environmental footprint of software. Cloud providers can leverage economies of scale, pooling resources to achieve higher energy efficiency and lower carbon emissions compared to traditional on-premises infrastructure. By migrating applications to the cloud, software engineers can take advantage of the provider's data centers, which are designed to be energy-efficient and environmentally friendly. The elasticity and scalability of cloud services allow for better resource utilization, reducing energy waste that would occur with underutilized on-premises infrastructure.
Additionally, cloud computing enables virtualization and server consolidation, enabling more efficient use of hardware resources. By leveraging cloud-based platforms, software engineers can optimize resource allocation, scale resources based on demand, and implement energy-saving measures such as automatic resource provisioning and load balancing. This dynamic resource management not only improves performance but also contributes to overall energy efficiency and reduces the environmental impact of software systems.
To continue my awesome car analogy, cloud computing is like car sharing. No sense in having your car parked around all the time if you’re only using it part-time.
The great thing about all of this is that within the context of software development, all these environmentally friendly choices tend to be cheaper as well, at least in the long run. So whether you want to save the planet or just save some money, a software engineer should always take environmental impact into account.
Battle of the scientists
I’ve mentioned before that software can be seen as a science of sorts. Unfortunately, some software engineers embody the characteristics of Dr. Frankenstein rather than Dr. Goodall. Like Frankenstein (the scientist, not the creation), they may be or become solely fixated on pushing the boundaries of technology, prioritizing innovation and technical prowess over ethical considerations. They may lose sight of the potential impact their creations can have on society, neglecting the responsibility to ensure that their software is developed with ethical considerations in mind.
In contrast, we should strive to be more like Dr. Jane Goodall, who – to me, at least – is the epitome of an ethical and engaged scientist. You probably know Goodall from her work studying chimpanzees in Tanzania. Goodall's approach to research and conservation demonstrates a deep concern for the well-being of the natural world and a commitment to understanding and mitigating the impact of her work. As software engineers, we too have the power to shape the digital landscape and influence the lives of countless individuals.
So, I urge you to reflect upon your own approach as a software engineer. Are you more aligned with the cold, scientific pursuit of Dr. Frankenstein, or the ethical, engaged perspective of Jane Goodall?
Returning from leave
After two-and-a-half weeks of much-needed time off, Alex walks back into the bustling Etrain office, feeling refreshed and rejuvenated. The break has given Alex a chance to unwind and gain a fresh perspective on the challenges faced in the company. But something else has changed within Alex as well – a surge of new ideas and insights that have been brewing during the time away.
As Alex reenters the office, colleagues and team members approach with curiosity and excitement. They have not only been able to continue their work in Alex's absence, but some have even picked up the ethical initiatives that Alex was so passionately defending before leaving. Alex is surprised by their enthusiasm. Kim was on to something.
Alex takes this opportunity to reconnect with the teams, addressing their inquiries and discussing the new ideas that have sparked during the time off. Through open conversations and shared experiences, Alex reaffirms the importance of not only utilizing technical skills but also connecting with people on a personal level. Software engineering is about measuring, science, and math – but it is also fundamentally about working with people.
Reflecting on the impact each software engineer can have, Alex decides to have a second conversation with Kim, the trusted confidant. Alex relays the exciting news of how the teams continued their work seamlessly and how their dedication and enthusiasm have inspired Alex further. With the teams managing so well on their own, Alex has an idea.
“I am sure I want to keep working with the Booking System team. I have some great new ideas for increasing the quality of that system. But I will take a step back from the other teams. The Booking System will receive my main focus. I’ve learned I need that.”
Kim nods. “But what about the responsibilities you mentioned last time? Are you going to abandon your crusade?”
“No,” Alex says. “Not abandon it. But instead of joining the crusade across all the different teams, I’ll help facilitate it. That should be a lot less… stressful. And probably more effective as well.”
“Facilitate? What do you mean by that?”
Alex looks at the clock and gets up. “Oh, I have some great ideas. But let’s save those for next time.”
An overview
Here is a full overview of what has already been posted and what is still to come:
Contact me
As always, I ask you to contact me. Send me your corrections, provide suggestions, ask your questions, deliver some hot takes of your own, or share a personal story. I promise to read it, and if possible, will include it in the series.