Creating ethical software, Part 1
Why YOU are responsible for ethical software, not your management
Have you read the previous blog yet?: Building Quality Software, Part 2
The project manager of the Public API team rubs his temples. Alex can taste his anxiety as he prepares to address the team. Seated next to him is the Departmental Manager of the E-train software development and operations department. The fact that he’s here as well makes Alex feel uncomfortable. It probably means this audit is serious business.
“Everyone, I’ve asked you to join me here to discuss a plan of attack. Three auditors will be going through not only our entire application but also our communication channels and our processes. If we are found to be lacking, we may lose departmental funding or even be shut down. We must make sure that that doesn’t happen. The auditors cannot find any major flaws. What can you tell me about how we stand on security, privacy, and compliance?”
The team looks sheepishly at each other. “Not good, probably,” one of the developers finally says. “Management has never given us time to investigate those aspects of the APIs. New features and revenue were more important.”
“I mean, we’ve done the basics,” another developer chimes in. “But you need to understand that security, privacy, and ALL those different compliance standards… that’s a whole separate expertise. You can’t expect us software developers to know all that.”
“We’re being audited by our own company,” the project manager says. “So clearly, they DO expect us to know all that.”
“Even though they don’t understand it either,” Alex says with a frown.
“The auditors are coming in tomorrow. Alex, I’d like you to provide them with anything they ask for.”
“And when they inevitably find something?” Alex asks.
The Departmental Manager coughs and then finally speaks up. “Don’t worry, everyone. I’ve got your backs. Just let the auditors do their thing and publish their findings. I’ll handle upper management. Whatever report they churn out, I’m sure I can convince whoever that it is mostly meaningless nitpicking.”
Expanding the scope
In parts 2 and 3 of this series, we focused on the topic of writing code. Then, we increased our scope in parts 4 and 5 and included the quality of software. Now, we increase the scope again. Testable code and quality software are great, but just because something is easy to develop and of high quality does not mean it is good. And here, I mean good as in the philosophical Good. A high-quality product may still be illegal, immoral, or simply useless. In other words, high-quality software is not necessarily ethical software. It is ethical software that we should be creating. And my argument is that if the software engineer is not responsible for ethics, nobody can be.
Why not someone else, you may ask? You may feel that the ethicality of software is not your specialization, but rather that of the product owner, or of middle or upper management. In this blog, I’ll be discussing three topics that belong in the realm of ethics: security, privacy, and compliance. Those are complex topics. Would it not be preferable if someone else were responsible for them? Are they not way too time-consuming? Too different from “actual” software development for a software engineer to master them in any meaningful way?
As a software engineer, you have two options. Either you are responsible for security, privacy, and compliance, or you delegate those responsibilities to other people. The second option sounds great, but in practice, it means that during every step of software development, you and your team depend on the input of those other people. In my opinion, it is better to include security, privacy, and compliance in your skillset.
Software engineers are usually among the best-informed people on what the software they’re working on actually does. They are also among the most knowledgeable regarding software principles. This means that you, the software engineer, are the most logical person to turn to when a more complex software question comes up, like: ‘How could a hacker access our data?’, or: ‘What kind of personal data do we store, and where, and for how long?’, or: ‘How much work would it be to comply with ISO 9001?’. The way-too-common answer (‘I don’t know, and I don’t know who would know’) undermines your expertise.
So yes, it is a lot of work and adds a lot of complexity. Which leads us to the final question: why care about ethics at all? Why not just ignore it and let the chips fall where they may? Besides the moral argument and the legal argument, there is also the financial argument. Creating unethical software is a risk. If your malpractice is detected, the software’s owners will lose money, popularity, and sympathy. So, when I describe security, privacy, and compliance below, take it as a starting point. If my descriptions leave you wanting, get more information. It is – quite literally – your responsibility to do so.
Keep it secret, keep it safe
Security is a crucial component of ethical software development. Software security involves protecting the software and its users from malicious attacks. So, how do you make software secure? Security by design is the key phrase here. It is essential to design software with security in mind from the beginning rather than as an afterthought.
But even if you start taking security into account during the design phase, if your knowledge of security is limited, it won’t be enough. One widely used resource in the software security field is the Open Web Application Security Project (OWASP), a non-profit organization that provides tools, knowledge, and best practices for web application security. OWASP offers tons of resources, including top-ten lists of security risks, common mitigation tactics, and up-to-date information. I’ll link to their site (and a few more) near the end of this blog.
More specifically, when estimating your stories or writing your planning proposal, take the following effort into account:
- Encrypting sensitive data such as passwords, credit card information, and personal information.
- Validating input data to prevent injection attacks, where attackers exploit vulnerabilities to inject malicious code into the software.
- Limiting access privileges to restrict unauthorized access to sensitive areas of the software.
- Establishing and configuring (virtual) networks, firewalls, and private endpoints.
These should all be considered, built, tested, and monitored.
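To make the first two bullets a bit more tangible, here is a minimal sketch in Python. It uses only standard-library modules; the `users` table and column names are made up for the example, and a production system would likely use a dedicated password-hashing library and framework-level validation instead.

```python
import hashlib
import os
import sqlite3
from typing import Optional


def hash_password(password: str, salt: Optional[bytes] = None) -> tuple[bytes, bytes]:
    """Store a salted hash instead of the password itself."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, 600_000)
    return salt, digest


def find_user(conn: sqlite3.Connection, username: str):
    # Parameterized query: the driver treats the value as data, so input like
    # "'; DROP TABLE users; --" cannot change the meaning of the SQL statement.
    cursor = conn.execute(
        "SELECT id, username FROM users WHERE username = ?",
        (username,),
    )
    return cursor.fetchone()
```

Neither of these measures costs much once they are a habit; the expensive part is retrofitting them after an auditor, or an attacker, finds the plain-text passwords and string-concatenated queries first.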
Moreover, hack yourself. Or better yet, authorize others to hack you. Whether through a pentest or initiatives like hackerone.com, you must cover your blind spots through rigorous attempts to break your application. And finally, stay up to date. I mean this both in the sense of updating packages and other dependencies of your software and, more generally, in the sense of keeping up with security news and potential security breaches. You can’t expect to be informed if you spend no time on it.
What you are in the dark
Privacy is another critical component of ethical software development. Privacy is the ability of an individual or group to keep their personal information, activities, and communications confidential and free from unauthorized access or surveillance. It encompasses the right to control what information is collected about oneself, how it is used, and who has access to it.
Privacy can be viewed as a fundamental human right, as it allows individuals to maintain autonomy, protect their personal lives, and preserve their dignity. It enables people to make choices without fear of judgment or interference, and it fosters a sense of security and trust in personal relationships, institutions, and society.
Software engineers must ensure that their software respects user privacy and complies with relevant laws and regulations. One significant law related to user privacy is the General Data Protection Regulation (GDPR), which protects the privacy of individuals within the European Union. Software engineers can design software that considers privacy by limiting the amount of data collected, obtaining user consent before collecting personal data, and ensuring that data is stored securely.
To protect user privacy, you should consider the following four practices. First, like with security, adopt a privacy-by-design approach, integrating privacy considerations into every stage of the software development process. For instance, personal data should be removed from your databases sometime after it has served its purpose. How will you manage this?
Second, provide transparent and easily understandable privacy policies, informing users about the types of data collected, how it is used, and the measures in place to protect it. And if the policy ever changes, version it so you know which version applies to which data.
Third, minimize the collection and retention of personally identifiable information to the necessary extent, ensuring that data is used only for its intended purpose. I’d advise you to tag your data (a database column often already suffices) with the agreed-upon use of the data and the version number of the privacy statement that applies to it. This way, you can automate update requests or deletion of data in a personalized yet private way.
Finally, implement strong data protection measures, such as encryption and anonymization, to prevent unauthorized access or unintended disclosure of sensitive information. This is where security and privacy meet. An application cannot guarantee privacy if it is not secure.
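As a minimal sketch of the third practice (and the retention question from the first), consider something like the following, again in Python with only standard-library modules. The `personal_data` table, the purpose tags, and the retention periods are assumptions for illustration, not a prescribed schema.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Agreed-upon purpose -> how long the data may be kept (assumed values).
RETENTION = {
    "order_fulfilment": timedelta(days=90),
    "newsletter": timedelta(days=365),
}


def store_email(conn: sqlite3.Connection, email: str, purpose: str, policy_version: str) -> None:
    """Tag every piece of personal data with its purpose and the privacy-statement version."""
    conn.execute(
        "INSERT INTO personal_data (email, purpose, policy_version, collected_at) "
        "VALUES (?, ?, ?, ?)",
        (email, purpose, policy_version, datetime.now(timezone.utc).isoformat()),
    )


def purge_expired(conn: sqlite3.Connection) -> int:
    """Delete rows whose retention period has passed; run this on a schedule."""
    deleted = 0
    for purpose, period in RETENTION.items():
        cutoff = (datetime.now(timezone.utc) - period).isoformat()
        cursor = conn.execute(
            "DELETE FROM personal_data WHERE purpose = ? AND collected_at < ?",
            (purpose, cutoff),
        )
        deleted += cursor.rowcount
    return deleted
```

Because every row carries its purpose and the policy version under which it was collected, questions like ‘which records were collected under version 1.2 of the privacy statement?’ become a single query, which is exactly the kind of question an auditor (or a data subject) will ask.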
It’s all in the manual
Compliance in this context means two things: adhering to legal and regulatory requirements, and following industry standards. In addition to technical considerations, software engineers must navigate the legal landscape surrounding software development. Compliance with applicable laws and regulations is vital to uphold ethical standards. As with security and privacy failures, a failure to meet compliance obligations can result in legal consequences, fines, and reputational harm.
Granted, many compliance standards overlap strongly with security and privacy. However, being compliant is very industry-specific. There are specific health standards, financial standards, governmental standards, school standards, military standards, and so on. Besides all these, there are software-specific standards as well. The first step in becoming and staying compliant is knowing which standards apply. Familiarize yourself with relevant legislation and regulations specific to your software’s domain, such as data protection laws (e.g., GDPR) or industry-specific regulations (e.g., HIPAA for healthcare software).
I’d say it is imperative to collaborate with legal professionals or consultants who specialize in technology law to understand and interpret the legal requirements that apply to your software. And this is not a one-time thing. You need to regularly review and update your software to address any changes in the legal landscape, ensuring ongoing compliance with evolving standards.
This is often the part where I hear software engineers ask: “How is this my job?”. To which I’ll repeat what I said a few paragraphs ago: if you are not responsible, who can be? However, being responsible does not mean knowing all the things and doing all the work. You’ll need help. Firstly, it is important to inform management where the software currently stands, what change is needed, and why. Secondly, it is your job to make sure that the issue of software compliance is being safeguarded by someone, and that this someone knows how you can help. This collaboration will help create the resources and awareness needed to become (and stay) compliant.
Saving you some time
Just as a little motivator slash time saver, here are some links that may get you started.
- https://owasp.org: Largest nonprofit organization dedicated to software security.
- https://www.wired.com: General technology news outlet with a dedicated security section.
- https://thehackernews.com: Updates on the world of Infosec.
- https://www.troyhunt.com: Blog of security expert and Microsoft MVP Troy Hunt, creator of Have I Been Pwned.
- https://iapp.org: The International Association of Privacy Professionals. Like OWASP, but for privacy instead of security.
Next, I’ll leave you with some standards and regulations I think might be relevant. If you don’t know one or more of them, look them up. You should also ask around to find out whether any more specific standards apply to your software.
- ISO/IEC 27001: This international standard focuses on information security management systems and provides guidelines for protecting sensitive information.
- Payment Card Industry Data Security Standard (PCI DSS): PCI DSS is a set of security standards designed to ensure that companies that process, store, or transmit credit card information maintain a secure environment.
- General Data Protection Regulation (GDPR): GDPR is a regulation in the European Union that addresses the protection of personal data and privacy for EU citizens. It has implications for any organization that handles personal data of EU residents.
- Health Insurance Portability and Accountability Act (HIPAA): HIPAA sets standards for the protection of sensitive patient health information, requiring healthcare organizations to implement security measures to ensure confidentiality, integrity, and availability of patient data.
- ISO/IEC 20000: This standard focuses on IT service management and provides guidelines for establishing and maintaining an effective service management system.
- System and Organization Controls (SOC) 2: SOC 2 is an auditing standard that focuses on the security, availability, processing integrity, confidentiality, and privacy of service providers' systems. It is often used to assess cloud service providers and other organizations that handle sensitive data.
- International Organization for Standardization (ISO) 9001: ISO 9001 is a quality management system standard that sets criteria for implementing and maintaining effective quality management practices.
- Web Content Accessibility Guidelines (WCAG): WCAG provides guidelines for creating accessible web content, ensuring that websites and digital services are usable by people with disabilities.
And finally, if you do learn something new, don’t keep it to yourself. Make others join you in your quest for knowledge and ethics!
A culture of learning
One of the auditors sits down next to Alex. "So, we'll be doing a NIST checklist next, followed by ISO 9001, a pentest, and some checks regarding PCI DSS. Do you have some API documentation available, or perhaps even a Postman collection for us?"
Alex hesitates, feeling a little overwhelmed. But, after a long, deep breath, Alex decides to be honest. "Look, I want to help, and I have some knowledge, but I'm afraid I may not understand half of what you're checking. I did not want to tell you that. I’m scared you may think I have no idea what I am doing as a software engineer if I can’t even understand what you’re saying. But I think it’s important that I do understand, and so I ask you to go a bit slower."
The second auditor looks up from his forms. "Don't worry. We're here to support you. Let's start by reviewing the basics and then dive into the specifics. We'll guide you along the way. My name is Riley, by the way."
Motivated by the encouragement, Alex starts asking more and more questions. Not surprisingly, Alex has some major knowledge gaps regarding security, privacy, and especially compliance for the public APIs. Very surprisingly, though, the Public API team has intuitively done a lot of things right, and most things that are not up to standard can be fixed quickly. Each day, Alex gets more enthusiastic about the material and feels lucky to be able to learn from the auditors. Riley especially has a knack for explaining concepts clearly.
As the audit progresses, Alex realizes there is an opportunity to make a lasting impact on E-train's culture. "Before we started, the Public API team seemed to regard you as the enemy. I am sure they are not unique in that regard. I think it would be beneficial for the entire company to understand the importance of security, privacy, and compliance. Could we organize a workshop to educate everyone?"
Riley thinks about it, before grinning. "Absolutely! Let's design a workshop that covers the essentials. Your enthusiasm and expertise will inspire others to prioritize these aspects in their work."
After a few weeks, the report is published. There are quite a few findings and a relatively small timeframe in which to fix them before funding is cut. Despite this, Alex asks the Departmental Manager not to downplay the importance of the findings to upper management, and hands in a companion piece, stating the intended ways to fix the findings, including rough estimations. “It’s clear what needs to be done,” Alex says. “And more importantly, it is our responsibility to do it. We should have been doing all this since the start.”
The Departmental Manager sighs. “Are you sure? How does the rest of the team feel about this?”
“I’m not sure yet. We’ll discuss it in the upcoming Sprint Retrospective.”
The Departmental Manager looks at Alex for a while, shakes his head, and places the report in his briefcase. Then, he hands the companion piece back to Alex. “All I can say is good luck. I don’t think this act will make you very popular within the team.”
Alex nods. “Probably not, no.”
An overview
Here is a full overview of what has already been posted in this series and what is still to come:
Contact me
As always, I ask you to contact me. Send me your corrections, provide suggestions, ask your questions, deliver some hot takes of your own, or share a personal story. I promise to read it, and if possible, will include it in the series.
Writing Quality Code, Part 1