Scientists Taught A Robot Language. It Immediately Turned Racist.

One day a few years ago, while talking to a journalist in her office, Harvard computer science professor Latanya Sweeney typed her name into Google’s search bar to pull up a study. As the results filled in, the page also served up an alarming advertisement: for a service offering to search arrest records for her name.

The journalist perked up. Forget the paper, he told her, tell me about that arrest. That’s impossible, Sweeney replied — she had never been arrested.

Sweeney decided to get to the bottom of the phenomenon. Armed with the names of 2,000 real people, she ran searches on Google.com and Reuters.com and noted how the ads that appeared varied with the person’s race. In a 2013 study, she reported that searches for typically African American names brought up ads suggesting an arrest 25% of the time.

Today, these sorts of Google searches no longer result in arrest ads. But this algorithmic discrimination is likely to show up in all sorts of online services, according to a study published in Science on Thursday.

The authors raise the possibility that language algorithms — such as those being developed to power chat bots, search engines, and online translators — could inadvertently teach themselves to be sexist and racist simply by studying the way people use words.

The results suggest that the bots were absorbing hints of human feelings — and failings, the researchers say.

“At the fundamental level these models are carrying the bias within them,” study author Aylin Caliskan, a postdoctoral researcher at Princeton, told BuzzFeed News.

Widely used algorithms that screen resumes, for example, could be ranking a woman programmer’s application lower than a man’s. In 2015, a group from Carnegie Mellon University observed that Google was more likely to display ads for high-paying jobs if the algorithm believed the person visiting the site was a man.

Caliskan and her colleagues focused on two kinds of popular “word-embedders” — algorithms that translate words into numbers for computers to understand. The researchers trained each bot on a different dataset of the English language: the “common crawl,” a database of language scraped from the web, containing about 840 billion words; and a Google News database containing some 100 billion words.

The study found that these simple word associations could give the bots knowledge about how people judge objects: Flowers and musical instruments, for example, were deemed more pleasant than guns and bugs.

The researchers also made their own version of a psychology test — the implicit association test — that is used to reveal people’s hidden biases.
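At its core, such a test measures distances between word vectors. Below is a minimal sketch of the idea in Python, using the flower/insect example above — the vectors are made-up toy numbers rather than real embeddings, and this illustrates the general approach, not the paper’s actual implementation:

import numpy as np

# Toy stand-ins for real word embeddings (real ones, like GloVe's, have
# hundreds of dimensions); these numbers are invented for illustration.
vectors = {
    "flower":     np.array([0.9, 0.1, 0.1]),
    "insect":     np.array([0.1, 0.9, 0.1]),
    "pleasant":   np.array([0.8, 0.2, 0.2]),
    "unpleasant": np.array([0.2, 0.8, 0.2]),
}

def cosine(a, b):
    # Cosine similarity: the standard measure of how close two word
    # vectors sit in embedding space.
    return (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

def association(word, attr_a, attr_b):
    # Positive means `word` is closer to attr_a than to attr_b -- the
    # core quantity behind an embedding-based association test.
    return cosine(vectors[word], vectors[attr_a]) - cosine(vectors[word], vectors[attr_b])

print(association("flower", "pleasant", "unpleasant"))  # positive: flower leans "pleasant"
print(association("insect", "pleasant", "unpleasant"))  # negative: insect leans "unpleasant"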

The algorithms more often linked European American names, such as Adam, Paul, Amanda, and Megan, with feel-good words like “cheer,” “pleasure,” and “miracle” than they did African American names like Jerome, Lavon, Latisha, and Shaniqua. Conversely, they matched words like “abuse” and “murder” more strongly with the African American names than with the European American ones.

Female names like Amy, Lisa, and Ann tended to be linked to domestic words like “home,” “children,” and “marriage,” whereas male names like John, Paul, and Mike were associated with job terms like “salary,” “management,” and “professional.”

The software also linked male descriptors (brother, father, son) with scientific words (physics, chemistry, NASA), and female descriptors (mother, sister, aunt) to art terms (poetry, art, drama).

“We’ve used machine learning to show that this stuff is in our language,” study author Joanna Bryson, professor of artificial intelligence at the University of Bath in the UK, told BuzzFeed News.

That algorithms are deeply biased is not a new idea. Researchers who study ethics in AI have been arguing for a decade that “fairness” should be programmed into algorithms. As AI gets smarter, they say, software could otherwise make for a society that is less fair and less just.

Such signs are already here: Last year, when Amazon launched its same-day delivery service in major markets, predominantly black neighborhoods were excluded “to varying degrees” in six cities, Bloomberg found. According to an analysis by ProPublica, the test-prep seller Princeton Review was twice as likely to charge Asian customers a higher price than non-Asian customers. ProPublica also showed that software used by courts to predict future criminals was biased against black Americans. On social media, people have regularly called out racially biased search results.

“It’s vital that we have a better understanding of where bias creeps in before these systems are applied in areas like criminal justice, employment and health,” Kate Crawford, principal researcher at Microsoft Research, told BuzzFeed News in an email. Last year, Crawford worked with the Obama administration to run workshops on the social impact of AI, and this year co-founded the AI Now Initiative to examine how to ethically deploy such technology.

While the Science paper’s findings are in some ways expected, the results show how systemic the problem of bias is, Sorelle Friedler, associate professor of computer science at Haverford College, told BuzzFeed News.

“I see it as important to scientifically validate them so that we can build on this — so that we can say, now that we know this is happening, what do we do about this?” she said.

Because these kinds of language-learning bots will soon be common, it’s likely that most of us will routinely encounter such bias, Hal Daumé III, a professor of computer science at the University of Maryland, told BuzzFeed News.

Some researchers at Google are finding ways to make decision-making in AI more transparent, according to company spokesperson Charina Choi. “We’re quite sensitive to the effect of algorithms, and spend a lot of time rigorously testing our products with users’ feedback,” Choi wrote in an email to BuzzFeed News.

Facebook, which is developing chat programs powered by AI, declined to comment. But the Science study showed how these biases crop up in at least one popular service: Google Translate.

When translating from Turkish, which does not have gendered pronouns, to English, which does, the service associates male pronouns with the word “doctor” and female pronouns with the word “nurse.”

When translating into Spanish, English, Portuguese, Russian, German, and French, the tool brought up similar results, the study found.

“I think almost any company that does natural language processing-like tasks is going to be using word-embedding in some form or another,” Daumé said. “I don’t see any reason to believe that [the study’s word-embedder] is any more sexist or racist than any of the others.”

LINK: How The Internet Turned Microsoft’s AI Chatbot Into A Neo-Nazi

LINK: Facebook Is Using Artificial Intelligence To Help Prevent Suicide

LINK: This Is Why Some People Think Google’s Results Are “Racist”

Quelle: BuzzFeed

KubeCon Europe 2017: Bigger and Broader

At the end of March I attended CloudNativeCon + KubeCon Europe in Berlin, and compared with last year’s event in London, I think two words describe it best: bigger and broader. With over 1,200 attendees the event was impressive, but it still felt like a place where you can have meaningful discussions with peers.
Quelle: OpenShift

Slack Is Adding Status Messages That Tell People When You're On A Phone Call Or Vacation

What’s old is new again.

Slack, the workplace chat app, is adding status messages — think AIM away messages, but for the office.

Slack’s statuses come with emoji. You can choose from a menu of existing emoji or customize your own.

The status emoji will appear next to your name, explaining what you’re up to. Hovering over the emoji reveals your full status message.

Some third-party apps that integrate with Slack can also deliver status updates for the people who use them. If you log vacation time in a Zenefits HR system, for example, your status can tell people when you’re out of the office.

Slack is facing challenges from Google and Microsoft, which both recently released competitive workplace messaging products. Slack CEO Stewart Butterfield has noted the competition on Twitter.

Asked for his top five custom statuses, Butterfield provided eight.

Quelle: BuzzFeed

Containers are Linux

Containers are Linux. The operating system that revolutionized the data center over the past two decades is now aiming to revolutionize how we package, deploy and manage applications in the cloud. Of course you’d expect a Red Hatter to say that, but the facts speak for themselves.
Quelle: OpenShift

Automating project creation with Google Cloud Deployment Manager

By Chris Crall, Product Manager

Do you need to create a lot of Google Cloud Platform (GCP) projects for your company? Maybe the sheer volume or the need to standardize project creation is making you look for a way to automate project creation. We now have a tool to simplify this process for you.

Google Cloud Deployment Manager is the native GCP tool you can use to create and manage GCP resources, including Compute Engine (i.e., virtual machines), Container Engine, Cloud SQL, BigQuery and Cloud Storage. Now, you can use Deployment Manager to create and manage projects as well.

Whether you have ten or ten thousand projects, automating the creation and configuration of your projects with Deployment Manager allows you to manage projects consistently. We have a set of templates that handle:

Project Creation – create the new project with the name you provide
Billing – set the billing account for the new project
Permissions – set the IAM policy on the project
Service Accounts – optionally create service accounts for the applications or services to run in this project
APIs – turn on compatible Google APIs that the services or applications in a project may need

Getting started
Managing project creation with Deployment Manager is simple. Here are a few steps to get you started:
Download the templates from our GitHub samples.

The project creation samples are available in the Deployment Manager GitHub repo under the project_creation directory. Or clone the whole deploymentmanager-samples repo:

git clone https://github.com/GoogleCloudPlatform/deploymentmanager-samples.git

Then copy the templates under the examples/v2/project_creation directory.

Follow the steps in the README in the project_creation directory. The README includes detailed instructions, but there is one point to emphasize: you should create a new project using the Cloud Console that will be used as your “Project Creation” project. The service account under which Deployment Manager runs needs powerful IAM permissions to create projects and manage billing accounts, hence the recommendation to create this special project and use it only for creating other projects.

Customize your deployments.

At a minimum, you’ll need to change the config.yaml file to add the name of the project you want to create, your billing account, the IAM permissions you want to set, and the APIs to enable.
Advanced customization — you can do as little or as much as you want here. Let’s assume that your company typically has three types of projects: production service projects, test service projects, and developer sandbox projects. These projects require vastly different IAM permissions and service accounts, and may also need different APIs. You could add a new top-level template with a parameter for “project-type” that takes a string as input (such as “prodservice”, “testservice”, or “developer”) and uses that value to customize the project for your needs. Alternatively, you can make three copies of the .yaml file, one for each project type, with the correct settings for each.
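For concreteness, here is a hypothetical sketch of what a config.yaml along these lines might look like. The key names below are illustrative assumptions, not the templates’ actual schema — consult the README in the project_creation directory for the real field names:

# Hypothetical sketch only; every key name here is an assumption for
# illustration. The real schema is documented in the samples' README.
imports:
  - path: project.py              # top-level project-creation template

resources:
  - name: my-team-dev-project     # the project to create
    type: project.py
    properties:
      billing-account: billingAccounts/XXXXXX-XXXXXX-XXXXXX
      apis:                       # Google APIs to enable in the new project
        - compute.googleapis.com
        - storage-component.googleapis.com
      service-accounts:           # optional service accounts to create
        - my-app
      iam-policy:                 # IAM bindings to set on the project
        bindings:
          - role: roles/editor
            members:
              - user:developer@example.com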

Create your project.
From the directory where you stored your templates, use the command line interface to run Deployment Manager:
gcloud deployment-manager deployments create <newproject_deployment> --config config.yaml --project <Project Creation project>

Here, <newproject_deployment> is the name you want to give the deployment. This is not the new project name; that comes from the value in the config.yaml file. But you may want to use the same name for the deployment, or something similar, so you know how they match up once you’ve stamped out a few hundred projects.

Now you know how to use Deployment Manager to automatically create and manage projects, not just GCP resources. Watch this space to learn more about how to use Deployment Manager, and let us know what you think of the feature. You can also send mail to dep-mgr-feedback@google.com.
Quelle: Google Cloud Platform

March 2017 Leaderboard of Database Systems contributors on MSDN

Many congratulations to the March 2017 Top-10 contributors!

Hilary Cotter and Alberto Morillo top the Overall and Cloud database lists this month as well. Seven of the Overall Top-10 featured in last month’s Overall Top-10 too.

This Leaderboard initiative was started in October 2016 to recognize the top Database Systems contributors on MSDN forums. The following continues to be the points hierarchy, in decreasing order of points:

For questions related to this leaderboard, please write to leaderboard-sql@microsoft.com.
Quelle: Azure