Twitter Troll Arrested For Allegedly Tweeting Seizure-Inducing Strobe GIF At Journalist

A Twitter troll who allegedly tried to cause Newsweek writer Kurt Eichenwald to have a seizure by tweeting a strobing graphic at him has been arrested by the FBI on federal charges, the journalist announced Friday.

The Dallas FBI office confirmed to The Verge that a suspect had been arrested. Eichenwald's lawyer, Steven Lieberman, told Newsweek that agents arrested the suspect at his home in Salisbury, Maryland, on Friday morning.

The suspect, who was not immediately identified, is expected to appear in Baltimore federal court on Friday afternoon.

BuzzFeed News reached out to the FBI, which is expected to release more information on the case later in the day.

On Dec. 15, Twitter user @jew_goldstein tweeted a strobing GIF at Eichenwald, who has previously written about his epilepsy, with text reading: “You deserve a seizure for your posts.”

The tweet has since been deleted and the @jew_goldstein account was suspended.

Eichenwald’s wife responded on Twitter from his account that the strobing image had caused a seizure and they had alerted police.

In an interview with Good Morning America, Eichenwald said other Twitter users followed suit, sending him strobing GIFs.

“I can't look at my Twitter feed anymore,” Eichenwald said on GMA. “Apparently, a lot of people find this very funny. A lot of people who identify themselves as Trump supporters are loading up my feed with more strobes.”

On Friday, Eichenwald tweeted that more than 40 people had sent him strobing images after learning that such images could induce seizures.

Source: BuzzFeed

Bluemix, Watson and bot mania: The cognitive era plays hard at SXSW 

The IBM activation this past week in downtown Austin earned it the number three slot in AdWeek’s compilation of the top eight topics that had attendees buzzing at South by Southwest.
No wonder. At SXSW 2017, IBM enticed developers into the golden age of cognitive computing by amping up its Bluemix services offerings, specifically the APIs that help Watson engage more naturally with humans. IBM gave SXSW attendees access to Watson to create a bot, remix a song, design a t-shirt, or get a beer recommendation.
With no required badge, a full open bar and DJs on its mega roof deck, the IBM activation was fueled by a regular flow of deep dives at the Maker’s Garage and live talks with IBM heavyweights including CEO Ginni Rometty and Bob Sutor, IBM vice president of cognitive, blockchain and quantum solutions.
Throughout the activation, the conversation elevated from the cloud infrastructure layer to services. With Bluemix gaining recognition thanks to the Watson platform, the event spoke directly to developers looking for a platform to build on and for ways to pull together advanced applications.
Demo areas struck the right tone with non-developers, showing not only how Watson is making the world healthier, more secure, personal, creative and engaged, but also how Watson can now respond to human emotions, preferences and taste palates.
With SXSW interactive dovetailing into the mainstay SXSW music event, the creative aspects of Watson got lots of attention, giving musicians and enthusiasts an opportunity to collaborate and, even better, play with one of the world’s most advanced APIs.
Watson Beat, a research project born out of IBM's Austin Research Lab, uses cognitive technology to compose and transform music: it remixes any piece of music using a mood-driven palette to create a personal piece that suits the user's emotional space.
Meanwhile, TJBot, an open source project, is designed to enable users to play with Watson Services such as Speech to Text, which teaches it to understand human speech. TJBot also demonstrates how Watson can hold a conversation and even respond to different moods using Personality Insights, which can analyze the emotive content of speech in real time.

Capitalizing on the year of the bot
IBM may indeed have had the edge on SXSW’s fever pitch around bots, thanks to Watson and Bluemix.
In one SXSW featured session, the IBM events team got together with Vancouver’s Eventbase, creators of SXSW’s Go Mobile App, to share perspectives on how mobile apps and, more broadly, human experience can be enhanced with augmented intelligence.
This year, both SXSW and IBM’s Events mobile app (debuting the week of 19 March) feature intelligent, conversational user interfaces that act as personal concierge services.
“IBM sits in a unique position to provide a platform for bots and other customer experiences,” said Ted Conroy, IBM Events lead. “The appetite for bespoke, personalized experiences is voracious, and IBM's cognitive services definitely can feed it.”
As Conroy pointed out, bots today use a simple, cognitive service to respond to sets of questions. When a service can’t answer, it defaults to scripted answers. Soon, bots will be proactive and able to choose the optimal cognitive service to best answer a broad set of questions without the current context limitations.
Check out how to build a bot in 15 minutes with Watson Conversation in this demo.
Learn how to build a TJBot of your own here.
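As a rough illustration of the kind of call a simple bot makes, here is a minimal, hypothetical PHP sketch that posts a single user utterance to a Watson Conversation workspace over its REST API. This is not the demo referenced above; the username, password, workspace ID, and API version date are placeholders you would replace with values from your own Bluemix Conversation service instance.

<?php
// Hypothetical sketch: send one message to a Watson Conversation workspace.
// Credentials and workspace ID below are placeholders, not real values.
$username    = 'YOUR_SERVICE_USERNAME';
$password    = 'YOUR_SERVICE_PASSWORD';
$workspaceId = 'YOUR_WORKSPACE_ID';

$url = 'https://gateway.watsonplatform.net/conversation/api/v1/workspaces/'
     . $workspaceId . '/message?version=2017-02-03';

$payload = json_encode(['input' => ['text' => 'What sessions are on today?']]);

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_USERPWD, $username . ':' . $password);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $payload);
curl_setopt($ch, CURLOPT_HTTPHEADER, ['Content-Type: application/json']);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

$response = json_decode(curl_exec($ch), true);
curl_close($ch);

// Print whatever response text the dialog returned, if any.
foreach ($response['output']['text'] ?? [] as $line) {
    print($line . PHP_EOL);
}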
Source: Thoughts on Cloud

Deep Learning AMI release v1.2 for Ubuntu and Updated AWS CloudFormation Template Now Available

You can now use upgraded versions of Apache MXNet, TensorFlow, CNTK, and Caffe, as well as Keras, on the AWS Deep Learning AMI v1.2 for Ubuntu, available in the AWS Marketplace. The Deep Learning AMI v1.2 for Ubuntu continues to provide a stable, secure, and high-performance execution environment for deep learning applications running on Amazon EC2. The latest MXNet release (v0.9.3), included with AMI v1.2, adds several enhancements, including a faster image processing API that enables parallel processing, improved multi-GPU performance, and support for new operators. The AMI includes the following framework versions: Apache MXNet v0.9.3, TensorFlow v1.0.0, Caffe rc5, Theano 0.8.2, Keras 1.2.2, CNTK v2.0 beta 12.0, and Torch (master branch). The Deep Learning AMI also includes Jupyter notebooks with Python 2.7 and Python 3.4 kernels; the Matplotlib, scikit-image, CppLint, Pylint, pandas, Graphviz, and Bokeh Python packages; Boto and Boto 3; and the AWS CLI. It also comes packaged with the Anaconda 2 and Anaconda 3 data science platforms. You can start using the Deep Learning AMI release v1.2 in the AWS Marketplace today.
Source: aws.amazon.com

Digging deep on PHP 7.1 for Google App Engine

By Brent Shaffer, Developer Programs Engineer, Google Cloud Platform

Developers love to build web applications and APIs with PHP, and we were delighted to announce last week at Google Cloud Next '17 that PHP 7.1 is available on Google App Engine. App Engine is our easy-to-use platform for building, deploying, managing and automatically scaling services on Google's infrastructure. The PHP 7.1 runtime is available on the App Engine flexible environment, and is currently in beta.

Getting started

To help you get started with PHP on App Engine, we’ve built a collection of getting started guides, samples, codelabs and interactive tutorials that walk through creating your code, using our APIs and services, and deploying to production.

When running PHP on App Engine, you can use the tools and databases you already know and love, including Laravel, Symfony, WordPress, or any other web framework. You can also use MongoDB, MySQL, or Cloud Datastore to store your data. And while the runtime is flexible enough to manage most applications and services, if you want more control over the underlying infrastructure, you can easily migrate to Google Container Engine or Google Compute Engine.

Deploying to App Engine on PHP 7.1

To deploy a simple application to App Engine on PHP 7.1, download and install the Google Cloud SDK. Once you’ve done this, run the following commands:

$ echo '<?php echo "Hello, World!";' > index.php
$ gcloud app deploy

This generates an app.yaml with the following values:

env: flex
runtime: php
runtime_config:
  document_root: .

Once the application is deployed, you can view it in the browser, or go to the Cloud Console to view the running instances.

Installing dependencies

For dependency management, we recommend using Composer. Dependencies declared in composer.json are installed automatically when you deploy to the App Engine flexible environment, and the deployment uses the PHP version specified in composer.json.

$ composer require "php:7.1.*" --ignore-platform-reqs
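
For reference, after running this command the require section of composer.json should look roughly like this (illustrative; the constraint is whatever version you requested):

{
    "require": {
        "php": "7.1.*"
    }
}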

Using Google’s APIs and services

Using the Google Cloud client library, you can take advantage of our advanced APIs and services such as our scalable NoSQL database Google Cloud Datastore, Google Cloud Pub/Sub, and Google BigQuery. To use the Google Cloud client library, install the code using Composer (this example assumes composer is installed globally):

composer require google/cloud

This creates (or updates) composer.json with the most recent version of the Google Cloud PHP library (currently 0.24.0):

{
    "require": {
        "google/cloud": "^0.24.0"
    }
}

App Engine detects the project ID of the instance and authenticates using the App Engine service account. That means you can run, say, a BigQuery query with a few lines of code, with no additional authentication! For example, add the following code to index.php to call BigQuery:

<?php
require_once __DIR__ . '/vendor/autoload.php';

// Run a legacy-SQL query against the public Shakespeare sample dataset
// and print the ten corpora with the highest word counts.
$client = new Google\Cloud\BigQuery\BigQueryClient();
$query = 'SELECT TOP(corpus, 10) as title, COUNT(*) as unique_words '
    . 'FROM [publicdata:samples.shakespeare]';
$queryResults = $client->runQuery($query);
foreach ($queryResults->rows() as $result) {
    print($result['title'] . ': ' . $result['unique_words'] . PHP_EOL);
}

Add this to a directory with the above composer.json file, and deploy it to the App Engine flexible environment:

gcloud app deploy
gcloud app browse

The second command will open your browser window to your deployed project, and you will see a printed list of BigQuery results!
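
The same google/cloud package covers the other services mentioned above. As a rough, hypothetical sketch (not from the original post), writing an entity to Cloud Datastore and reading it back might look like this, assuming the Cloud Datastore API is enabled for your project; the entity kind, key name, and properties are illustrative placeholders:

<?php
require_once __DIR__ . '/vendor/autoload.php';

// Hypothetical example: store a single Task entity and read it back.
$datastore = new Google\Cloud\Datastore\DatastoreClient();

$key = $datastore->key('Task', 'sample-task');
$task = $datastore->entity($key, [
    'description' => 'Try the PHP 7.1 runtime',
    'done' => false,
]);
$datastore->upsert($task);

$fetched = $datastore->lookup($key);
print($fetched['description'] . PHP_EOL);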

Use your favorite framework

The PHP community uses a myriad of frameworks. We have code samples for setting up applications in Laravel, Symfony, Drupal, WordPress, and Silex, as well as a WordPress plugin that integrates with Google Cloud Storage. Keep an eye on the tutorials page as we add more frameworks and libraries, and be sure to create an issue for any tutorials you’d like to see.

Commitment to PHP and open source

At Google, we’re committed to open source. As such, the new core PHP Docker runtime, google-cloud composer package and Google API client are all open source:

https://github.com/GoogleCloudPlatform/php-docker

https://github.com/GoogleCloudPlatform/google-cloud-php

https://github.com/google/google-api-php-client

https://github.com/GoogleCloudPlatform/wordpress-plugins

We’re thrilled to welcome PHP developers to Google Cloud Platform, and we’re committed to making further investments to help make you as productive as possible. This is just the start — stay tuned to the blog and our GitHub repositories to catch the next wave of PHP support on GCP.

We can’t wait to hear from you. Feel free to reach out to us on Twitter @googlecloud, or request an invite to the Google Cloud Slack community and join the PHP channel.
Source: Google Cloud Platform