Amazon Simple Queue Service (SQS) Server-Side Encryption is Now Available in the US East (N. Virginia) Region

You can now use Amazon Simple Queue Service (SQS) server-side encryption (SSE) integrated with the AWS Key Management Service (KMS) in the US East (N. Virginia) region. Amazon SQS is a fully managed message queuing service for reliably communicating between distributed software components and microservices – at any scale. You can use Amazon SQS to take advantage of the scale, cost, and operational benefits of a managed messaging service. The addition of server-side encryption allows you to transmit sensitive data with the increased security of using encrypted queues. 
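
A minimal sketch of what this looks like with boto3 (the queue name below is a placeholder; the AWS-managed key "alias/aws/sqs" is used here, but you can supply your own KMS key):

import boto3

# Assumes credentials/region are configured for us-east-1, where SSE is now available
sqs = boto3.client("sqs", region_name="us-east-1")

# Create a queue that encrypts message bodies with the AWS-managed SQS key;
# the data key reuse period controls how long data keys are cached to reduce KMS calls
response = sqs.create_queue(
    QueueName="my-encrypted-queue",  # hypothetical queue name
    Attributes={
        "KmsMasterKeyId": "alias/aws/sqs",
        "KmsDataKeyReusePeriodSeconds": "300",
    },
)
print(response["QueueUrl"])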
Source: aws.amazon.com

Amazon Inspector expands security assessment support for RHEL and CentOS

Amazon Inspector has expanded its security assessment support to Red Hat Enterprise Linux 6 and 7 and CentOS 6 and 7 on Amazon EC2, covering versions 6.2 through 6.9, 7.2, and 7.3. You can now run assessments for Common Vulnerabilities & Exposures (CVE), Amazon Security Best Practices, and Runtime Behavior Analysis on these supported versions. To run a security assessment, simply install the Amazon Inspector AWS Agent on the desired EC2 instance, configure your assessment in the Amazon Inspector console, and run it.
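
The same workflow can also be driven through the API. The sketch below uses boto3 and assumes the agent is already installed on instances carrying a hypothetical Inspect=true tag; the CVE rules package ARN is region-specific and left as a placeholder:

import boto3

inspector = boto3.client("inspector")

# Group the EC2 instances to assess by tag (tag key/value are illustrative)
group = inspector.create_resource_group(
    resourceGroupTags=[{"key": "Inspect", "value": "true"}]
)
target = inspector.create_assessment_target(
    assessmentTargetName="rhel-centos-hosts",
    resourceGroupArn=group["resourceGroupArn"],
)

# Look up the actual rules package ARNs for your region with list_rules_packages()
template = inspector.create_assessment_template(
    assessmentTargetArn=target["assessmentTargetArn"],
    assessmentTemplateName="cve-assessment",
    durationInSeconds=3600,
    rulesPackageArns=["<CVE_RULES_PACKAGE_ARN>"],
)
inspector.start_assessment_run(assessmentTemplateArn=template["assessmentTemplateArn"])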
Source: aws.amazon.com

AWS Storage Gateway now provides retrieval of archived virtual tapes in as little as 3-5 hours, adds additional virtual tape information, and adds cached volume cloning

Tape gateway users can now retrieve archived virtual tapes into their Virtual Tape Library (VTL) in as little as 3-5 hours. Prior to this launch, tape retrieval could take up to 24 hours. This faster access can help you speed up recovery of archived data. Additionally, to help you manage your virtual tapes, we have enhanced the tape gateway console and API to show the date and time a tape was created, and the amount of data written to each virtual tape.
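
A rough boto3 sketch (all ARNs below are placeholders) of retrieving an archived tape and reading the newly exposed per-tape fields:

import boto3

sgw = boto3.client("storagegateway")
gateway_arn = "arn:aws:storagegateway:us-east-1:123456789012:gateway/sgw-EXAMPLE"  # placeholder

# Pull an archived virtual tape back into the gateway's Virtual Tape Library
sgw.retrieve_tape_archive(
    TapeARN="arn:aws:storagegateway:us-east-1:123456789012:tape/TEST0001",  # placeholder
    GatewayARN=gateway_arn,
)

# DescribeTapes now also surfaces when each tape was created and how much data it holds
for tape in sgw.describe_tapes(GatewayARN=gateway_arn)["Tapes"]:
    print(tape["TapeBarcode"], tape.get("TapeCreatedDate"), tape.get("TapeUsedInBytes"))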
Source: aws.amazon.com

Deploy Cognitive Toolkit model to Azure Web Apps

Azure offers several ways of deploying a deep-learning model, including Windows Web App, Linux Web App, and Azure Container Services. For those less experienced with Linux environments and containers, Windows Web Apps offers familiar territory. In this post we will deploy a ResNet-18 model to Azure Web Apps and then submit some test pictures to it, both through a sample HTML interface and via Python.

Demo results

HTML interface

Python

The above screenshot is taken from this notebook. If you wish to run some speed tests, this notebook on GitHub shows how to submit asynchronous requests to the created API to get an idea of how long it takes to classify images in bulk. In this example we get 0.86 seconds per image.
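
As a rough illustration only (the endpoint path and payload format here are assumptions rather than the exact scheme used by the notebook), bulk classification can be timed with concurrent requests along these lines:

import base64
import time
from concurrent.futures import ThreadPoolExecutor

import requests

# Assumed endpoint: a POST route on the deployed web-app that accepts a base64-encoded image
URL = "https://<app_name>.azurewebsites.net/api/uploader"

def classify(path):
    with open(path, "rb") as f:
        payload = {"image": base64.b64encode(f.read()).decode()}
    return requests.post(URL, json=payload).json()

images = ["cat.jpg", "dog.jpg"] * 10  # hypothetical test images
start = time.time()
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(classify, images))
print((time.time() - start) / len(images), "seconds per image")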

Replicate demo

1. Download the contents of the repo and open a Command Prompt in the folder.

2. Run the following commands to check you have git and azure-cli installed:

az --version # time of writing: 2.0.1
pip install azure-cli # otherwise install azure-cli
git --version # time of writing: 2.9.2.windows.1

3. Set your username and password for local git deployment. Please note, you only need to do this once. For example:

set uname=<username_for_local_git_deployment>
set pass=<password_for_local_git_deployment>
# Create a user-name and password for git deployment of all your apps
az appservice web deployment user set --user-name %uname% --password %pass%

4. Create your web-app by running the commands below:

# Name for your web-app
set appn=<app_name>
# Name for resource-group containing web-app
set rgname=<name_for_resource_group_that_contains_app>
# Login to azure
az login
# Create a resource-group
az group create --location westeurope --name %rgname%
# Create a paid 'S2' plan to support your app
# The standard paid plans are: S1, S2, S3
az appservice plan create --name %appn% --resource-group %rgname% --sku S2
# Create the web-app
az appservice web create --name %appn% --resource-group %rgname% --plan %appn%
# Configure for local git deployment (SAVE URL)
az appservice web source-control config-local-git --name %appn% --resource-group %rgname% --query url --output tsv
# Initialise your git repo
git init
# Add the azure endpoint
git remote add azure <PASTE_URL_FROM_ABOVE>
# e.g. git remote add azure https://ilia2ukdemo@wincntkdemo.scm.azurewebsites.net/wincntkdemo.git

5. We will now install Python. Navigate to your web-app in the Azure Portal, scroll down to the "Extensions" blade, and select it:

6. Then, click on "Add", locate "Python 3.5.3 x64", and add it. Please note, you must use this extension.

Make sure you get a notification that this has installed successfully.

7. (Optional) Under the "Application settings" blade, set "Always On" to "On" to reduce response time, since your model will be kept loaded.

8. Deploy this demo by running:

git add -A
git commit -m "init"
git push azure master

If everything has gone successfully, you should see the following lines in the script output:

remote: Successfully installed cntk-2.0rc1
remote: ..
remote: 2.0rc1

You should now be able to navigate to your web-app address and upload a photo that will be classified according to the CNN: ResNet-18.

Advanced modifications (run your own)

You can include references to other modules (e.g. pandas or opencv) in your model.py file; however, you must add the module to the "requirements.txt" file so that it gets installed during deployment. If the module needs to be built, you can download the pre-built wheel file into the wheels folder. Don't forget to add the wheel path to the "requirements.txt" file at the root of the directory. Note: the NumPy, SciPy, and CNTK wheels are installed automatically by the "deploy.cmd" script; to change this, edit deploy.cmd to point to whichever NumPy wheel you require.
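
For illustration (the module names and wheel filename are made up for this example), the corresponding "requirements.txt" entries might look like:

pandas
wheels/opencv_python-3.2.0-cp35-cp35m-win_amd64.whl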

Editing deploy.cmd: The install script automatically adds the binaries for CNTK v2.0 RC1. However, if you want to use Python 3.6 or a CNTK release newer than v2.0 RC1, alter the variables below in the "deploy.cmd" script:

:: VARIABLES
echo "ATTENTION"
echo "USER MUST CHECK/SET THESE VARIABLES:"
SET PYTHON_EXE=%SYSTEMDRIVE%\home\python353x64\python.exe
SET NUMPY_WHEEL=https://azurewebappcntk.blob.core.windows.net/wheels/numpy-1.12.1+mkl-cp35-cp35m-win_amd64.whl
SET SCIPY_WHEEL=https://azurewebappcntk.blob.core.windows.net/wheels/scipy-0.19.0-cp35-cp35m-win_amd64.whl
SET CNTK_WHEEL=https://azurewebappcntk.blob.core.windows.net/cntkrc/cntk-2.0rc1-cp35-cp35m-win_amd64.whl
SET CNTK_BIN=https://azurewebappcntk.blob.core.windows.net/cntkrc/cntk.zip

To create the 'cntk.zip' file you just need to extract the cntk/cntk folder (i.e. the folder that contains 'CNTK.exe' and the DLLs; you can remove the python sub-folder, which contains the wheels, if it exists), zip it, and then reference it with the %CNTK_BIN% environment variable above.

You can also install a different Python extension if you wish; however, make sure to reference it properly (and also to get the NumPy, SciPy, and CNTK wheels for it). For example, the "Python 3.5.3 x64" extension is installed in the directory "D:\home\python353x64", and thus the script references: SET PYTHON_EXE=%SYSTEMDRIVE%\home\python353x64\python.exe

Finally, alter the "model.py" script in the "WebApp" folder as desired, along with the HTML template "index.html" in "templates", and then push your changes to the repo:

git add -A
git commit -m "modified some script"
git push azure master

Source: Azure

Amazon Elastic Block Store (Amazon EBS) Now Supports Cost Allocation for Snapshots

Today we are announcing that Amazon EBS now supports cost allocation tags for EBS snapshots within the AWS Billing and Cost Management Dashboard. Using cost allocation tags, you can now classify snapshot costs to match the needs of your business for better transparency and visibility. You can use this feature to allocate snapshot costs to internal teams or to provide accurate reports for customer billing. With this additional information, you can better manage your snapshot costs.
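
A minimal boto3 sketch (the snapshot ID and tag are placeholders); note that the tag key still has to be activated as a cost allocation tag in the Billing and Cost Management console before it appears in cost reports:

import boto3

ec2 = boto3.client("ec2")

# Tag an EBS snapshot with a cost-allocation dimension such as the owning team
ec2.create_tags(
    Resources=["snap-0123456789abcdef0"],  # placeholder snapshot ID
    Tags=[{"Key": "Team", "Value": "analytics"}],
)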
Source: aws.amazon.com