Türk Addresses Digital Tech & Human Rights at World Standards Meeting

OHCHR

Warm thanks for the opportunity to speak here today. I am delighted to open a new chapter in the joint work of our organisations to uphold human rights in the world of digital technology.

Just one generation ago, we could never have imagined that our lives would become as enriched and empowered as they are today by our highly digital and interconnected daily reality.

Technical standards play a crucial role in this reality.

They hold up our digital infrastructure and keep it functional.

They determine how information flows, ensure that our calls reach the person we want to talk to, and keep online banking safe.

As experts in this room, you know this well. You know the significant impact that standards have on the daily lives of billions of people.

But I wasn't asked here to talk about what you already know.

Rather, I would like to suggest that human rights and technical standards have much more in common than we may think.

Human rights are, in effect, a set of standards that bind humanity together. They constitute shared values, they are an inspiring force for change, and they aim to ensure human dignity for everyone, everywhere.

Human rights also represent a common language, one that pushes back against injustice, repression, greed, and abuse of power.

It is a language that keeps us safe, and requires others, including those in power, to protect us where necessary.

The unprecedented scale of progress in the digital world puts this safety at grave risk.

We see technological breakthroughs every day - progress in artificial intelligence (AI) is evolving at breathtaking speed; technology capable of reading and manipulating the mind is no longer far-fetched science fiction; and augmented and virtual reality technologies are improving by the day.

With these trends, the human rights challenges will only grow more acute.

It is time to incorporate the common language of human rights into the way we regulate, manage, design, and use new and emerging technologies. Safeguards to protect human rights must be firmly in place at the conception phases of technology and throughout its entire life cycle. These guardrails are a sine qua non for technology that serves humanity and advances the common good.

Colleagues,

We know that when digital infrastructure is designed without legal and technical standards that protect and promote human rights, it can (intentionally or not) be used or misused to facilitate serious human rights violations and abuses.

The right to privacy is being violated in ways and at rates we have never seen before.

Data is collected on our personal lives and stored and exploited in a variety of ways, many of which we are - frighteningly - not even aware of.

Lives are being turned upside down by criminals accessing this sensitive information.

Using sophisticated tools and spyware, State and non-State actors are subjecting individuals to arbitrary surveillance.

Too often new and emerging technologies are designed by men, for men, excluding women's lived experiences and priorities or failing to take into account the harm to women that such technologies can enable.

The explosion of AI technologies, without sufficient guardrails, has already led to chilling setbacks for human rights. Algorithmic bias has denied people social services or employment. Innocent people have been accused of crimes because of inaccurate facial recognition systems. And people of African descent have been denied medical care because of data that reflect deep-seated racist assumptions.

And while AI can also act as a force for good, many services and tools are designed in ways that make it hard or even impossible for people living with disabilities to use them, a clear violation of the human right to access information. We know that standards to render webpages readable and usable for people with disabilities exist, so let's ensure we are using them.

Colleagues, friends,

I deeply hope that we will not look back on 2023 as the watershed year when AI and the Robotocene stamped out human thought, imagination and inquiry.

The organisations who shape the digital infrastructure we all use have an enormous responsibility to act in the interests of their user-bases, and to protect their well-being and dignity.

Standard-developing organizations are at the heart of this. Your standards define the limits on surveillance, the protection of sensitive data, and the ease with which censorship can be carried out.

But we have seen time and again that it is not enough to ensure that standards are technologically sound and economically viable. That lens is narrow, simplistic and ultimately ineffective in delivering a normative framework that fully respects and enhances fundamental human rights.

They must also be designed with an end goal of improving lives, communities, and societies, through the rights and freedoms binding them together.

We have recognised that the need to work together is urgent.

In 2021, the Human Rights Council asked my Office to analyse technical standards for new and emerging digital technologies and their relationship to human rights. We conducted wide-ranging consultations, speaking with your organisations, academic experts, activists, standard-developing organizations, and civil society.

That alone was an important step, breaking silos and building bridges to connect human rights experts and technical standard experts.

We have since had further opportunity, including a large public consultation nine days ago, where representatives from my Office, ITU, the International Organization for Standardization, the International Electrotechnical Commission, and other experts (business, civil society, and experts working with the Internet Engineering Task Force, the Internet Research Task Force, and the World Wide Web Consortium) gathered to discuss how we can best integrate human rights into standard setting.

I am very pleased with this progress. Our two worlds - the world of technological expertise, long the domain of standard-developing organisations, and the world of human rights - are moving closer.

It is a trend we see elsewhere, too. Much is being undertaken to strengthen the ethics of AI, including the 2021 Recommendation by UNESCO, urging Member States to place human rights at the centre of regulatory frameworks and legislation on the development and use of AI. I also want to mention the Institute of Electrical and Electronics Engineers' work on their Ethically Aligned Design principles, which provides strong human rights guidance to AI design. In addition, my Office is closely following the European Union's Artificial Intelligence Act, and how it will incorporate the human rights imperative.

These are just first steps, and they are important ones. But what we must ask ourselves today is how can we harness this collective willingness to do better? How can we work together so we can ensure technical standards support, protect and even promote human rights?

There is no magic potion that will provide a solution to this complex challenge. Developing standards is hard work, as is understanding human rights and translating them into practice.

But it is abundantly clear that overcoming these challenges will require consistent and joined-up efforts by all our different expert communities.

At the foundation of our efforts to incorporate human rights into standard setting, there must be transparency. Your organizations have already done extraordinary work in improving access to information.

But as our expert consultations brought to light, much more can be done.

By proactively reaching out to communities likely to be affected by a new standard.

By making documentation, including during the drafting process, easily accessible to the public - without prohibitive fees.

By establishing or strengthening public consultation processes.

By including a greater diversity of voices, especially women, young people and those from the Global South, not leaving out the most vulnerable and sidelined in our societies.

The involvement of women and girls, in particular, is critical. Renowned author and women's rights activist Caroline Criado Pérez said "The fact is that worth is a matter of opinion, and opinion is informed by culture. And if that culture is as male-biased as ours is, it can't help but be biased against women. By default." This is no less true in the world of tech and digital technologies.

And we must fight actively against this culture. While the participation of women has grown, not least thanks to support programmes initiated by some standard-developing bodies, we need to turbo-charge efforts to achieve gender equality.

Human rights due diligence and human rights impact assessments will also need to be an integral part of standards development processes, informed by the UN Guiding Principles on Business and Human Rights and other human rights standards.

By using these tools - together - we can develop innovative approaches to standard development, with human rights at the core. For example, mechanisms such as the Standardization Programme Coordination Group could integrate human rights screenings and impact assessments in a systematic way.

Colleagues, friends,

I firmly believe that merging our expertise is one of the smartest steps we can take to address all of the challenges that your discussion here today will address. If we work hand in hand to strengthen and uphold human rights in new and emerging technologies - starting now - I know we can mitigate the mountain of risks that we face today, and will face in the future.

Our collective responsibility to current and future generations is to limit the harms that digital technologies can bring and to harness their enormous potential for good - with dignity, safety and the firm protection of human rights as our guiding light.
