Sustainability

Rethinking Product Development and Usage to Make a Tech Business Ethical

By Professor Michael Huth, Chief Technology Officer and Co-Founder of Xayn

It’s hard to believe that the term “Corporate Social Responsibility” was first coined in the mid-1800s, because it seems to have become pervasive throughout the financial world only in the past decade or two. Every day seems to bring another company’s announcement of its latest Corporate Social Responsibility report.

The much-needed shift from “shareholder capitalism” to “stakeholder capitalism” in the early 21st century has meant that companies now track performance in many non-financial areas, such as customer experience, employee experience, and environmental sustainability. Businesses understand that a good product or a useful service is no longer sufficient to win customers and keep them coming back.

Today, corporate culture calls for a business to take a stand on issues such as its carbon footprint, the ethical integrity of its supply chain, and a whole host of other ethical or societal values. And investors have seen that sustainable and ethical index funds perform as well as -- or even better than -- traditional funds.

The importance of ethical considerations may be obvious for manufacturers of consumer products such as cars. For example, factories with a low environmental footprint can also save on operating costs, and using sustainable raw materials increases the availability of production inputs, making production more robust. What is good for the planet is therefore also good for those businesses -- and that makes ethics and sustainability drivers of strategy and R&D.

Companies that create digital technologies as services appear to be different. There is no raw sewage that could be poured into rivers, no toxic smoke coming out of laptops, and services are always up and running without driving up end users’ utility bills. Yet this view forgets that computations use matter and consume energy. The cryptocurrency Bitcoin, for example, is highly innovative as a store of value and an alternative to gold, but its carbon footprint is huge compared to mining and reusing gold.

Tech businesses can address such ethical and environmental issues during development, as well as in deployment.

Incorporating ethics in the development stage

Firstly, tech businesses must ensure -- right from the development stage -- that technologies offer genuine value to consumers and users. A technologically flawless product is no longer sufficient. Users want to be reassured that ethics, sustainability, or other positive values informed the R&D of a product or service. Since digital products often use algorithms to support decision-making, users also want such algorithms to be fair and transparent.

Realizing this, and communicating it effectively to users, is often challenging. For example, more foundational research and better-developed best practices are needed to free AI models of bias when they are applied for decision support.

Tech companies should also think of the control users have over their own personal data as an ethical value worth embedding in system designs and product development. Engineers, researchers, and product owners should ask important questions, such as “How much data do we really need from our users?” and “How will the user data we do get be used, where will it be stored, and when will it be disposed of securely after use?”

Naturally, this will not happen if a tech company builds its entire business model around gathering as much user data as possible for open-ended purposes, inviting mission creep in how that data is used.

It will happen, though, when companies design products that give users substantial control over their personal data and over how algorithms interact with that data. There is a tension between the desire for new products and functions and the demand to give users more control, but accepting that tension has intrinsic value in moving the tech industry forward.

Facebook’s founders likely never imagined the range of activities their platform would come to host when it was originally developed as an online student directory with photos and some biographical information. Yet the asymmetry of control between users and the platform has been preserved throughout that platform’s remarkable evolution.

Ethical issues in the usage stage

A tech company should also be accountable, in a moral sense, for how its digital products are used and what effect that has on the world we all live in. Such effects may only become apparent over time, but when they do, companies must act decisively. A recent example: Facebook, Twitter, and other social media platforms have been called upon to halt the rapid spread of fake news on their own networks.

Taking corrective action may be harder than first imagined. For tech companies, this often means going back to the R&D lab to look for new technical approaches and solutions to the issue at hand -- for example, AI approaches to the automatic detection of hate speech. Businesses that want to be ethical can never be complacent about such issues -- a technology cannot be developed, set loose upon the world to generate profits, and then forgotten.

Returning to Facebook as an example, the company has faced many issues over its lifetime -- and still does. The tech giant has had to redevelop its platform, algorithms, data security architectures, and many other system aspects as its user base grew. What such continued refinements did not appear to deliver is more user control over personal data on the platform. The recent WhatsApp controversy is significant because it made everyone realize that such platforms treat users as the product, not as the customer.

The role of governments

Businesses do not function in a vacuum. Policymakers and regulators must therefore also act to ensure that tech companies operate ethically and help make our world sustainable. Laws and regulations are, in a sense, products for public service. Just like commercial products, they need to evolve to reflect major changes in technologies and societal values.

For example, on February 8, 1996, U.S. President Bill Clinton signed the Telecommunications Act, which replaced a law from the 1930s “New Deal” years in order to accommodate changes in technology and policy. Two years later, Google was founded and the World Wide Web experienced its first major growth phase. The law also removed limits on how many media outlets a single owner could hold in one market, which led to massive consolidation -- and a reduction in the diversity of viewpoints expressed in local news media. Because the law deregulated the sector, media ownership became concentrated in the hands of a few very wealthy corporations, laying the foundation for the significant impact of unbalanced reporting, such as that practiced by Fox News.

People like President Trump have understood very well how these new technologies and media, and the deregulation of them, provide a means of directly and polemically manipulating a mass audience for political and other purposes. Digital technology is rapidly evolving, offering new tools such as deepfake videos for weaponizing effective disinformation campaigns.

Governments and regulatory agencies must remain ever-vigilant to ensure that laws and regulations keep up with the fast pace of technological change -- not only on social media platforms, but also for cloud technologies, AI, algorithms, cybersecurity, and the many other issues swirling within the digital sphere. How does one regulate well? Clinton’s bill above and its negative effects suggest that the answer is far from trivial.

Values must be reflected in both technologies and regulations

That digital communication tools and social platforms are currently being turned into weapons of disinformation should be a wake-up call for regulators and legislators around the world. The planned EU Digital Services Act and EU Digital Markets Act can play a role in creating a “New Deal” that reflects our values in digital technology. Following the violence at the U.S. Capitol building in January 2021, regulators are taking a close look at Section 230 of the Communications Decency Act that has thus far protected internet companies from liability for content posted online in the U.S.

Technological innovation should not be stifled in the R&D laboratories simply because of what “might” happen with those technologies years -- if not decades -- later. Despite the best intentions, ethical businesses will never be able to foresee every potential use of the digital technologies they develop. This is why legislators around the globe must follow such progress and adapt regulations accordingly. While it is imperative that technological innovation be promoted, innovation cannot become a golden calf that makes us forget we live in a world we need to preserve for our children. Digital tech must first and foremost focus on providing real value to humanity and our biosphere, and it should contribute to a balanced and discursive information culture.

About Michael

Professor Michael Huth (Ph.D.) is Co-Founder and CTO of the technology company Xayn AG and teaches at Imperial College London. His research focuses on cybersecurity, cryptography, and mathematical modeling, as well as on security and privacy in machine learning. He served as the technical lead of the Harnessing Economic Value theme at the PETRAS IoT Cybersecurity Research Hub in the UK. In 2017, he founded Xayn AG together with Leif-Nissen Lundbæk and Felix Hahmann. The Berlin-based company aims to solve the challenge of combining AI with privacy, with an emphasis on Federated Learning. Xayn won the first Porsche Innovation Contest and has already worked successfully with Porsche AG, Daimler AG, Deutsche Bahn, and Siemens.

Professor Huth studied Mathematics at TU Darmstadt and obtained his Ph.D. at Tulane University in New Orleans. He worked at TU Darmstadt and Kansas State University and spent a research sabbatical at The University of Oxford. Huth has authored numerous scientific publications and is an experienced speaker on international stages.

The views and opinions expressed herein are the views and opinions of the author and do not necessarily reflect those of Nasdaq, Inc.