Burton Rast, Google - on the ethics of AI, the human right to privacy, and gender-neutral voice assistants

Burton Rast is a writer, teacher, mentor, former art director at the design agency IDEO, and today a lead UX Designer at Google.

He focuses on a human-centered approach to the development and design of products within the Google Assistant ecosystem. He holds the view that developers of machine learning systems bear responsibility, both individually and as an industry, to ensure that human needs are met equitably and that privacy is respected as a fundamental human right.

“In the US, not everyone knows about the GDPR”

- Can you talk about the projects you are working on now?

"The projects I'm working on right now won't launch until 2020, so I can't talk about them. But I recently joined the privacy and data protection team. We are working on different things, for example, making our products compliant in Europe with the GDPR, the General Data Protection Regulation.

- Yes, I know what GDPR is.

“Well, not everyone in the States knows about it!” There are also things we announced at I/O, for example, federated machine learning. This is secure, distributed machine learning: it lets us improve models without uploading data from user devices, and the improved model is then pushed back to those same devices.
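The idea described above can be sketched as federated averaging: each device computes a model update on its own private data, and only the updated weights, never the raw data, are sent to the server, which averages them and redistributes the result. This is a minimal illustrative sketch; the function names and the tiny one-parameter model are my own, not Google's actual implementation.

```python
# Minimal sketch of federated averaging: devices train locally,
# the server only ever sees model weights, never raw data.

def local_update(weights, data, lr=0.1):
    """One gradient-descent step on a device's private data
    for a one-parameter linear model y = w * x."""
    w = weights[0]
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return [w - lr * grad]

def federated_average(updates):
    """Server side: average the weights received from all devices."""
    n = len(updates)
    return [sum(u[i] for u in updates) / n for i in range(len(updates[0]))]

# Two devices, each with a private dataset that never leaves it.
device_a = [(1.0, 2.0), (2.0, 4.0)]   # roughly y = 2x
device_b = [(1.0, 2.2), (3.0, 6.1)]

weights = [0.0]
for _ in range(50):
    updates = [local_update(weights, device_a),
               local_update(weights, device_b)]
    weights = federated_average(updates)  # improved model pushed back out

print(round(weights[0], 2))  # converges near w = 2
```

In production systems the averaging is combined with secure aggregation, so the server cannot even inspect an individual device's update, only the encrypted sum.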

Photo: Off Festival

- Google recently shut down its AI ethics council...

"I'm not connected to it.

- But can you talk generally about ethics in AI?

- Speaking for myself, yes, but not on behalf of Google.

- The ethics council was dissolved a week after its creation, after Google was criticized for including certain people on it. In principle, do you consider it normal for a small group of people to decide ethical questions in AI for everyone?

- Personally, no, I do not think a small group of people can articulate this ethic. Last year we published a list of AI principles. They define our work in this area: we will no longer work for the military, on anything intended for weapons, or on anything that may violate human rights law. I think this can and should inspire other companies. The document should evolve over time, because AI is also evolving. We need to start somewhere, but in general this should be a much more serious effort than the attempt of one company or, as you said, one group of people.

“All of our data has human bias embedded in it”

- Can we even get rid of bias in AI?

"It's an evolving and quite complex problem. Designing systems that understand how bias arises and remove new kinds of bias and discrimination as they emerge is very difficult. These systems need to be overseen by people; we need to build in the possibility of human intervention. Can anyone design a perfect system that cleans bias out of the data? Probably not. All of our data has human bias embedded in it. Essentially, we need to create a system that is better than people.

- Is it possible?

- Systems can be improved in that direction. Today we see many smart people thinking not only about how to develop the systems themselves, but also about general rules of design and development that others can use. I think this is a good start: we can learn from the mistakes of the past and gradually create systems in which bias in the data has no place.

Photo: Off Festival

- UNESCO recently published a report on gender discrimination in voice assistants: all of them have female voices. Among the necessary measures, the report's authors named the development of a machine-generated, gender-neutral voice. What do you think about that?

- It's a difficult question: should people be able to choose the voice of their assistant? Or should a single decision be made that will affect billions of people? Can three companies make that decision? And the problem is not only gender: there are almost no voices representing people of different cultures and ethnic groups. There is still a lot of work in this area, and I'm not sure there is a simple answer.

The UNESCO report is titled "I'd Blush If I Could", after the answer Siri gave to the numerous insults users directed at it. The problem, the report argues, is that all voice assistants use the voices of young women and, among other things, do not push back against abuse. UNESCO also calls for greater gender balance in the teams that develop AI: today only 12% of AI researchers are women.

“I want more privacy”

- Returning to the issue of privacy: don't you think people have been panicking too much lately?

- No. I have long been a data-protection activist; that's partly why I joined the team. I'm not the kind of person who thinks people panic too much. I'm the kind of person whose family and friends consider him crazy, at tinfoil-hat level. It seems to me people should absolutely be concerned about their privacy. If you are not concerned, then at least ask questions. And people should expect that any company using their data will tell them about it.

- Companies are now already trying to explain more concretely what happens with data. What are the next steps in this direction?

- I do not think people are obliged to understand all the complexity of how their data is used on the internet. People have a general understanding; our research confirms this. For example, most services are free because data is used for advertising. But when it comes to questionable people and companies that want access to data, say by using a device's digital fingerprint, I don't think people should have to figure that out. Protecting users is the companies' job.

- So you can't say anything specific about new projects?

"In general terms: users should be able to say "I want more privacy" in a very simple way. We are working to make this a reality across all of our core products.

- How? Press the "make everything private" button?

- Of course, we are not going to put a giant "Privacy" switch in Google Search. It's quite hard to get teams working on products with audiences of over a billion users (Search, Maps, Android, YouTube) to make changes that will significantly affect their designs. That doesn't happen in a day. But we are working on it.