Amazon Web Services is adamant it won’t pick and choose which machine learning applications run on its infrastructure, despite growing concerns that the emerging technology can create ethical problems and amplify biases.
The world’s largest public cloud provider will, however, remove any users who violate people’s civil liberties or constitutional rights, as per its service agreements.
During a media Q&A session at the AWS re:Invent conference last week, AWS CEO Andy Jassy fielded several questions from journalists on the topic of machine learning ethics. In each of his responses he stressed that AWS was a service provider and that the apps customers developed were their responsibility. He said it was incumbent on governments to set ethical standards for machine learning, and when they do, AWS will abide by them.
“We provide services that people can build on top of. We have a set of terms and the way people are allowed to use them. If people violate those terms they’re not allowed to use those services anymore… We give very strong guidance and recommendations on how people should use that technology,” Jassy said.
“But at the end of the day we don’t tell our customers these are the laws or this is how you can use the technology or not.”
AWS will only remove users if their actions encroach on people’s civil liberties or constitutional rights, Jassy said. That arguably leaves several grey areas, and even a void, as governments have established few standards and AWS, by its own account, innovates and offers emerging technology at a rapid pace.
“We will work with governments. We don’t control the laws of different lands. I think some governments are more interested in having that collaboration with us than others. We are very interested and enthusiastic about participating.
“But at the end of the day if countries are going to set rules around how this technology is going to be used they will end up setting them,” Jassy said.
Machine Learning Barriers To Entry Falling
Last year AWS launched SageMaker, a service designed to open up machine learning applications to customers without the expertise to develop models or apps themselves. This year AWS announced improvements to SageMaker that further lower the skills barrier to entry, with new levels of service management.
SageMaker provides “guardrails” for developing machine learning applications, according to Jassy.
“There’s a bunch of capabilities in there that allow you to know the confidence levels. We share what they should be [for certain applications] and give guidance.”
Jassy said it is possible AWS will allow customers to set standards and rules that restrict the use of machine learning models when they do not meet accuracy thresholds for certain applications.
“At the end of the day we build services. We ensure that they are secure, they work right and that they’ll be accurate, and it will give you confidence levels.”
“And then our customers are building applications on top of these services and they get to make the decisions for how they want to use our platform. And we give a lot of guidance and if we think people are using them in a way that violates our terms of service we will suspend and disable people from using them.”
“But I think that if societies as a whole or countries as a whole want to actually make rules around how things must be used then they should make those rules. And we will participate and abide by them.”