Imagine a vending machine refusing to sell you a snack because your health insurer has identified you and knows the snack will increase your risk of a heart attack. A simple example, but one with far-reaching consequences. Driven by the Fourth Industrial Revolution, technology and Artificial Intelligence (AI) bring us endless opportunities and are key to businesses’ success in maintaining a footprint in our future lives. At the same time, the boundaries between business, technology and the role of IT are blurring. This raises a discussion on ethics that is relevant for politicians, companies and society as a whole. This article sheds light on the importance of digital ethics and the impact of technology on human beings and society, and translates these into why, how and where your organisation needs to take a leadership role in the transition towards a connected society.
Importance of digital ethics
“the systems of values and moral principles for the conduct of electronic interactions among people, business and things”.
Over two millennia ago, Aristotle introduced eudaimonia, a concept that helps define how we wish to live. This idea is captured in the Oxford definition of ethics: “a set of moral principles, especially ones relating to or affirming a specified group, field, or form of conduct.” The concept of ethics is obviously not new; this definition has served as the ground for discussing dilemmas in our society for many years. Importantly, that ground has now shifted to a new place: a digital world driven by the Fourth Industrial Revolution. Bearing in mind the impact of technology on people, their everyday lives and their enjoyment of human rights, digital ethics has become very important. Gartner defines digital ethics as “the systems of values and moral principles for the conduct of electronic interactions among people, business and things”. Such a framework, for example, ensures that technology driven by AI acts in an ethical manner towards living things. Because of the pace at which these technological and societal changes take place, different initiatives have emerged to boost the implementation of digital ethics. Amongst them are:
- Partnership on AI, backed by Apple, Facebook, Amazon and many more
- AI4People, launched at the European Parliament
- OpenAI, funded by, among others, Elon Musk and Peter Thiel
- DigiTransCope, an EU Declaration of Cooperation on Artificial Intelligence
- The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems.
Technological impact on human beings
A good example of the impact of technology on human beings is the deepfake video created by researchers at the University of Washington, in which Barack Obama appears to deliver a speech that is fake, but very realistic.
This example is the start of a discussion that goes to the heart of our democracy and the role of the media: a discussion on moral standards within an ethical framework that currently does not hold in these situations. With every company digitalising and embedding technology into its business model, endless opportunities can be seized. Look at clickworkers, for instance: people who accept mind-numbing tasks on digital platforms such as Amazon Mechanical Turk. To earn a fair income, these people need to click for hours on end, on a freelance basis, surrounded by fierce competition, which can lead to mental and physical problems. Beyond being mind-numbing, these tasks also put a lot of pressure on the everyday lives of the clickworkers. Companies such as Amazon can easily say it is a worker’s own responsibility to watch their mental and physical state, but the system is built in such a way that it mostly benefits Amazon, not the people trying to earn a living from it.
A couple of years ago we could not have imagined the big impact of something as simple as clickwork. Thousands of such scenarios on the impact of future technology have been, or will yet be, written. The main difficulty is that companies face situations they have never seen before: ethical dilemmas and moral discussions on topics one cannot, or does not want to, answer right away. These challenges also include, for example, robotics taking over parts of the workforce as artificial intelligence advances, limitations on privacy and freedom of expression, how we deal with genetic technologies, and a growing digital divide across the world population.
Now that we have highlighted the importance of digital ethics and the technological impact on human beings, it is worth looking at how our society feeds this discussion. Which barriers do we need to overcome to create a shared agreement on how we wish to live together?
“the citizens will divide between those who prefer convenience and those who prefer privacy.” 
The above statement highlights a very important aspect in the development of digital ethics: privacy. The question arises whether we can live our digital lives without giving up our privacy, since we are heading for a world where we entrust our lives, economies and societies to technological innovations. Although trust is a difficult concept, its value is significant, perhaps the most important business value of all. Trust from consumers, the public, investors, regulators, suppliers and other stakeholders is fundamental for a healthy organisation. Moreover, a lack of consumer confidence forms a barrier to the acceptance of digital innovations and services. This has created a group of consumers that can be classified as ‘reluctant sharers’. According to the Trust in Digital Life Institute, these are “people who are concerned about or would prefer not to be, sharing their personal details, but feel compelled to do so in order to gain access to a service or product”. With the introduction of the GDPR this group keeps growing, since these consumers are now empowered with rights over ‘their’ personal data. As a consequence, there is a rebalancing of rights and powers between the individual and the supplier. Any development of digital ethics therefore needs to take societal concerns regarding privacy, and the trust that follows from them, into account to create a shared agreement on how we wish to live together.
Taking a leadership role
Why a leadership role brings great business value
Currently, consumer trust and data-security issues are mainly handled by the marketing, sales or IT department. But when did you last see customer trust in a marketing or IT person’s job description? Probably never, and that is why the topic has not received the attention it deserves. It is, after all, a company-wide topic and needs to be addressed top-down with the right leadership.
Simply put: if your organisation operates in a sustainable and responsible manner, it is in a position to succeed in the long term. We all know that new legislation, such as the GDPR, has a major impact on every business holding customer data. The ultimate objective of this legislation is to increase trust in digital services, and thereby accelerate the adoption of technology that brings efficiency and economic prosperity. By getting your digital ethics right, both you and the customer win. Giving consumers control and transparency over the data you hold can be rewarded with higher trust and thus loyalty. That loyalty leads to consumers sharing more data, which improves your products and services and opens up chances to intensify the customer relationship and cross-selling. In other words: a continuous loyalty loop. Taking a leadership position in getting your digital ethics right will therefore bring the greatest benefit to your company, and sooner than to the competition.
How to form an ethical framework: getting things right
With a growing group of reluctant sharers, trust issues and new moral questions raised by technology, the foundation of an ethical framework can be written down by questioning why you collect certain data, what good it brings and how it helps the individual or group. Conducting a data audit can help you answer these questions [10,13]:
- Identify all the personal data you are responsible for
- Monitor for unintended consequences by building feedback loops
- Question yourself on transparency: if your mindset is “we don’t want our customers to know that we track this about them”, it might be an indication that your processes raise an ethical discussion
- Validate if you are allowed to process data the way you do (in line with GDPR)
- Confirm with third parties if they hold the same compliance
- Apply the golden rule: how would you like to be treated as a human being, citizen or customer?
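The audit questions above can be sketched as a simple checklist structure. The following Python fragment is purely illustrative; the class and field names are assumptions made for this sketch, not part of any established audit tool or the GDPR itself:

```python
from dataclasses import dataclass


@dataclass
class DataAuditItem:
    """One category of personal data under review (illustrative model)."""
    name: str                      # e.g. "browsing history"
    purpose: str                   # why the data is collected
    benefits_customer: bool        # does holding it help the individual?
    lawful_basis: str              # GDPR basis, e.g. "consent" or "contract"
    disclosed_to_customer: bool    # are we transparent about tracking it?
    third_parties_compliant: bool  # do processors hold the same compliance?


def ethical_concerns(item: DataAuditItem) -> list[str]:
    """Flag the audit questions this data category fails to answer."""
    concerns = []
    if not item.purpose:
        concerns.append("no documented purpose for collection")
    if not item.benefits_customer:
        concerns.append("no clear benefit to the individual")
    if not item.lawful_basis:
        concerns.append("no lawful basis under the GDPR")
    if not item.disclosed_to_customer:
        concerns.append("customer is not told this is tracked")
    if not item.third_parties_compliant:
        concerns.append("third parties do not hold the same compliance")
    return concerns


# Example: a tracked field with no customer benefit and no transparency
item = DataAuditItem(
    name="browsing history",
    purpose="ad targeting",
    benefits_customer=False,
    lawful_basis="consent",
    disclosed_to_customer=False,
    third_parties_compliant=True,
)
print(ethical_concerns(item))
```

A category that returns an empty list of concerns passes the checklist; anything else is a prompt for the ethical discussion described above, not an automated verdict.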
If the answers to these questions do not reveal any ethical benefit to the business or the consumer, one could question why you hold this data in the first place. This is fully in line with an article we wrote earlier: the mere fact that the capability to store data exists is not a good reason to store it.
After writing down what you stand for, it is time to adapt your products or services accordingly. In putting up the ethical framework, two interconnected concepts should come at the top of the list: Privacy by Design and Design Ethics. The concept of Privacy by Design was developed by Dr. Ann Cavoukian back in the 1990s, outlining seven principles that every product or service should embody. According to her, privacy must be incorporated by default, meaning that it should be an integral part of organisational priorities, project objectives, design processes and planning operations. This is strongly connected to the concept of Design Ethics as advocated by Mike Monteiro of Mule Design in ‘A Designer’s Code of Ethics’. He highlights: “as the products and services we use every day become more clever and deeply infused in our lives, there has never been a more important time to consider what we are designing and how.”
The above-mentioned questions and principles are just a small beginning to put you on the right track. But, before you start, there’s one last thing.
What to bear in mind?
Forced by legislation and by technological developments rapidly changing our society, you cannot get around digital ethics. Sooner or later you have to take responsibility and question yourself on ethical issues. What happens if you do not take a leadership role? You either watch competitors sail away with the customers who value their basic rights, or you might even face bankruptcy due to the high fines and brand damage that come with data breaches. Neither scenario is desirable, which leaves you one option: taking the leadership role.
- The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems. Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Autonomous and Intelligent Systems, Version 2. IEEE, 2017. http://standards.ieee.org/develop/indconn/ec/autonomous_systems.html
- Oxford Dictionaries. (n.d.). Definition of Ethics in English by Oxford Dictionaries. Retrieved December 6, 2018, from https://en.oxforddictionaries.com/definition/ethic
- Sutcliffe, H., & Allgrove, A. (2018, November 7). How do we build an ethical framework for the Fourth Industrial Revolution? Retrieved December 6, 2018, from https://www.weforum.org/agenda/2018/11/ethical-framework-fourth-industrial-revolution/
- Buytendijk, F. (2017). A Short Introduction to Digital Ethics – Gartner [Presentation]. Retrieved December 6, 2018, from https://vm.fi/documents/10623/8012612/A+Short+Introduction+to+Digital+Ethics+260418.pdf/4fc0910b-0c73-47c1-b93b-5303b814b8ef/A+Short+Introduction+to+Digital+Ethics+260418.pdf.pdf
- Maldonado, J. (2018, October 23). Legal Ethics: The Ethical Dilemma of Artificial Intelligence. Retrieved December 6, 2018, from https://www.natlawreview.com/article/legal-ethics-ethical-dilemma-artificial-intelligence
- Reijner, T., & De Bruijn, W. (2018, September 28). Future Shock #11: In de krochten van de klikeconomie [Podcast]. Retrieved December 6, 2018, from https://www.vpro.nl/programmas/tegenlicht/kijk/Podcast/Future-Shock-11.html
- Anderson, J., & Rainie, L. (2014). The Future of Privacy. Retrieved December 6, 2018 from PewResearchCenter: http://www.pewinternet.org/2014/12/18/future-of-privacy/
- Cowling, J., Lacy, P., Piotroski, S., & Teo, L. (2011). Trust: Managing the Scarcest Commodity of All for High Performance. Boston: Accenture.
- European Union Agency for Fundamental Rights. (2014). Handbook on European data protection law. Brussels: European Union.
- Revill, G., & Goodman, D. (May 2017). Privacy: the competitive advantage (A TDL Working Group Publication- Version 1.0). Retrieved from https://trustindigitallife.eu/wp-content/uploads/Privacy-The-Competitive-Advantage.pdf
- World Economic Forum. (September 2016). Values and the Fourth Industrial Revolution Connecting the Dots Between Value, Values, Profit and Purpose (Global Agenda Council on Values 2014-2016). Retrieved from http://www3.weforum.org/docs/WEF_Values_and_the_Fourth_Industrial_Revolution_WHITEPAPER.pdf
- Daley, S. (2018, April 26). The importance of digital ethics. Retrieved December 6, 2018, from https://www.techuk.org/insights/opinions/item/12941-the-importance-of-digital-ethics
- Buytendijk, F., & Shetty, S. (2017, June 6). Getting Digital Ethics Right. Retrieved December 6, 2018, from https://www.gartner.com/smarterwithgartner/getting-digital-ethics-right/
- Cavoukian, A. (2010). Privacy by Design: The 7 Foundational Principles (Implementation and Mapping of Fair Information Practices). Retrieved from http://www.ontla.on.ca/library/repository/mon/24005/301946.pdf
- Monteiro, M. (2018, June 20). A Designer’s Code of Ethics. Retrieved December 6, 2018, from https://deardesignstudent.com/a-designers-code-of-ethics-f4a88aca9e95?gi=ef76d5c007f3