Azure Backup Server now supports SQL 2017 with new enhancements

V3 is the latest upgrade for Microsoft Azure Backup Server (MABS). Azure Backup Server can now be installed on Windows Server 2019 with SQL 2017 as its database. MABS V3 brings key enhancements in the areas of storage and security.

Security

Preventing critical volumes’ data loss

When selecting storage volumes that MABS should use for backups, a user may accidentally select the wrong volume. Selecting a volume that contains critical data could result in unexpected data loss. With MABS V3, you can prevent this by making such volumes unavailable for backup storage, keeping your critical data safe from accidental deletion.

TLS 1.2

Transport Layer Security (TLS) is the cryptographic protocol that secures communication over a network. With TLS 1.2 support, MABS V3 provides more secure communication for backups. MABS now uses TLS 1.2 for communication between Azure Backup Server and the protected servers, for certificate-based authentication, and for cloud backups.
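For reference, TLS 1.2 is typically enabled on Windows Server through SCHANNEL registry settings. The fragment below is an illustrative sketch in .reg format, not taken from the announcement; confirm the exact keys required for MABS against Microsoft's official TLS guidance before applying. It enables TLS 1.2 for the client and server roles and tells .NET Framework applications to use strong cryptography:

```reg
Windows Registry Editor Version 5.00

; Enable TLS 1.2 for outbound (client) connections
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Client]
"Enabled"=dword:00000001
"DisabledByDefault"=dword:00000000

; Enable TLS 1.2 for inbound (server) connections
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Server]
"Enabled"=dword:00000001
"DisabledByDefault"=dword:00000000

; Have .NET Framework 4.x applications prefer strong cryptography (TLS 1.2)
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319]
"SchUseStrongCrypto"=dword:00000001
```

A reboot is generally required for SCHANNEL changes to take effect.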

Storage

Volume migration

MABS V3 provides the flexibility to move your on-premises backup data sources to other storage for more efficient resource utilization. For example, during a storage upgrade, you can move data sources such as frequently backed-up SQL databases to higher-performing storage for better results. You can also migrate your backups and configure them to be stored on a different target volume when the current volume is nearly exhausted and cannot be extended.

Optimized consistency checks for RCT VMs

With the resilient change tracking (RCT) mechanism in Hyper-V VMs, MABS optimizes network and storage consumption by transferring only the changed data during consistency check jobs. This reduces the need for time-consuming consistency checks, making incremental backups faster and more efficient.

Related links and additional content:

If you are new to Azure Backup Server, refer to the Azure Backup Server documentation.
Want more details? Check out what’s new in MABS.
Need help? Reach out to the Azure Backup forum for support.

Source: Azure

Azure Functions now supported as a step in Azure Data Factory pipelines

Azure Functions is a serverless compute service that enables you to run code on-demand without having to explicitly provision or manage infrastructure. Using Azure Functions, you can run a script or piece of code in response to a variety of events. Azure Data Factory (ADF) is a managed data integration service in Azure that allows you to iteratively build, orchestrate, and monitor your Extract Transform Load (ETL) workflows. Azure Functions is now integrated with ADF, allowing you to run an Azure function as a step in your data factory pipelines.

Simply drag an “Azure Function activity” to the General section of your activity toolbox to get started.

You need to set up an Azure Function linked service in ADF to create a connection to your Azure Function app.

Provide the Azure Function name, method, headers, and body in the Azure Function activity inside your data factory pipeline.

You can also parameterize your function name using rich expression support in ADF. Get more information and detailed steps on using Azure Functions in Azure Data Factory pipelines.
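As a sketch, an Azure Function activity inside a pipeline's JSON definition looks roughly like the following. The activity, linked service, and parameter names here are illustrative rather than taken from the announcement; note how the function name can be supplied as an ADF expression instead of a literal string:

```json
{
    "name": "CallAzureFunction",
    "type": "AzureFunctionActivity",
    "linkedServiceName": {
        "referenceName": "AzureFunctionLinkedService",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "functionName": {
            "value": "@pipeline().parameters.functionName",
            "type": "Expression"
        },
        "method": "POST",
        "headers": { "Content-Type": "application/json" },
        "body": { "triggeredBy": "ADF" }
    }
}
```

The linked service referenced above holds the Function app URL and key, so the activity definition itself stays free of secrets.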

Our goal is to continue adding features and improve the usability of Data Factory tools. Get started building pipelines easily and quickly using Azure Data Factory. If you have any feature requests or want to provide feedback, please visit the Azure Data Factory forum.
Source: Azure

Automate Always On availability group deployments with SQL Virtual Machine resource provider

We are excited to share that a new, automated way to configure high availability solutions for SQL Server on Azure Virtual Machines (VMs) is now available using our SQL VM resource provider.

To get started today, follow the instructions in the table below.

High availability architectures are designed to continue functioning even when there are database, hardware, or network failures. Azure Virtual Machine instances that use Premium Storage for all operating system and data disks offer 99.9 percent availability. This SLA accounts for three scenarios: unplanned hardware maintenance, unexpected downtime, and planned maintenance.

To provide redundancy for your application, we recommend grouping two or more virtual machines in an Availability Set so that at least one virtual machine remains available during a planned or unplanned maintenance event. Alternatively, to protect against datacenter failures, two or more VM instances can be deployed across two or more Availability Zones in the same Azure region, which guarantees Virtual Machine connectivity to at least one instance at least 99.99 percent of the time. For more information, see the “SLA for Virtual Machines.”

These mechanisms ensure high availability of the virtual machine instance. To get the same SLA for SQL Server on Azure VM, you need to configure high availability solutions for SQL Server on Azure VM. Today, we are introducing a new, automated method to configure Always On availability groups (AG) for SQL Server on Azure VMs with SQL VM resource provider (RP) as a simple and reliable alternative to manual configuration.

SQL VM resource provider automates Always On AG setup by orchestrating the provisioning of various Azure resources and connecting them to work together. With SQL VM RP, Always On AG can be configured in three steps as described below.

Step 1 – Windows Failover Cluster
SQL VM RP resource type: SqlVirtualMachineGroup
Method to deploy: Automated – ARM template
Prerequisites: VMs should be created from SQL Server 2016 or 2017 Marketplace images, should be in the same subnet, and should be joined to an AD domain.

Step 2 – Availability group
SQL VM RP resource type: N/A
Method to deploy: Manual
Prerequisites: Step 1

Step 3 – Availability group listener
SQL VM RP resource type: SqlVirtualMachineGroup/AvailabilityGroupListener
Method to deploy: 3.1 Manual – create an internal Azure Load Balancer resource; 3.2 Automated – ARM template to create and configure the AG listener
Prerequisites: 3.1 Manual – none; 3.2 Automated – Step 2

Prerequisites

Start by deploying the SQL VM instances that will host the Always On AG replicas from Azure Marketplace SQL Server VM images. Today, the SQL VM resource provider supports automated Always On AG setup only for SQL Server 2016 and SQL Server 2017 Enterprise edition.

Each SQL VM instance should be joined to an Active Directory domain, either hosted on an Azure VM or extended from on-premises to Azure via virtual network peering. VM instances can be joined to the Active Directory domain manually or by running the Azure quickstart domain join template.

All SQL VM instances that will host Always On AG replicas should be in the same VNet and the same subnet.

1. Configure a Windows Failover Cluster

The Microsoft.SqlVirtualMachine/SqlVirtualMachineGroup resource defines the metadata about the Windows Failover Cluster, including the version and edition, the fully qualified domain name, the AD accounts that manage the cluster, and the storage account used as the cloud witness. Joining the first SQL VM to the SqlVirtualMachineGroup bootstraps the Windows Failover Cluster service and joins the VM to the cluster. This step can be automated with an ARM template available in Azure Quick Starts as 101-sql-vm-ag-setup.
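A hedged sketch of what such a resource definition can look like in an ARM template follows. All values (API version, group name, domain, accounts, witness storage account) are placeholders for illustration; check the exact schema against the quickstart template itself:

```json
{
    "type": "Microsoft.SqlVirtualMachine/sqlVirtualMachineGroups",
    "apiVersion": "2017-03-01-preview",
    "name": "contoso-sqlvm-group",
    "location": "[resourceGroup().location]",
    "properties": {
        "sqlImageOffer": "SQL2017-WS2016",
        "sqlImageSku": "Enterprise",
        "wsfcDomainProfile": {
            "domainFqdn": "contoso.com",
            "clusterOperatorAccount": "operator@contoso.com",
            "clusterBootstrapAccount": "bootstrap@contoso.com",
            "sqlServiceAccount": "sqlservice@contoso.com",
            "storageAccountUrl": "https://cloudwitnessstorage.blob.core.windows.net/",
            "storageAccountPrimaryKey": "[parameters('witnessStorageKey')]"
        }
    }
}
```

The wsfcDomainProfile captures everything the RP needs to bootstrap the cluster: the AD domain, the accounts that operate it, and the cloud witness storage account.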

2. Configure an Always On AG

Because the Windows Failover Cluster service is configured in the first step, an Always On AG can simply be created via SSMS on the primary Always On AG replica. This step must be performed manually.

3. Create an Always On AG listener

The Always On AG listener requires an Azure Load Balancer (LB). The Load Balancer provides a “floating” IP address for the AG listener, which allows quicker failover and reconnection. If the SQL VMs that are part of the availability group are in the same availability set, you can use a Basic Load Balancer; otherwise, you need a Standard Load Balancer. The Load Balancer should be in the same VNet as the SQL VM instances. SQL VM RP supports an internal load balancer (ILB) for the AG listener, and you should create the ILB manually before provisioning the AG listener.

Provisioning a Microsoft.SqlVirtualMachine/SqlVirtualMachineGroups/AvailabilityGroupListener resource with the ILB name, availability group name, cluster name, SQL VM resource ID, and the AG listener IP address and name creates and configures the AG listener. SQL VM RP handles the network settings, configures the ILB back-end pool and health probe, and finally creates the AG listener with the given IP address and name. As a result of this step, any VM within the same VNet can connect to the Always On AG via the AG listener name. This step can be automated with an ARM template available in Azure Quick Starts as 101-sql-vm-aglistener-setup.
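A rough sketch of the listener resource is shown below. Every value (API version, names, IP addresses, resource IDs, ports) is an illustrative placeholder rather than a value from the announcement; the authoritative schema is in the 101-sql-vm-aglistener-setup quickstart template:

```json
{
    "type": "Microsoft.SqlVirtualMachine/sqlVirtualMachineGroups/availabilityGroupListeners",
    "apiVersion": "2017-03-01-preview",
    "name": "contoso-sqlvm-group/ag1-listener",
    "properties": {
        "availabilityGroupName": "ag1",
        "port": 1433,
        "loadBalancerConfigurations": [
            {
                "loadBalancerResourceId": "[resourceId('Microsoft.Network/loadBalancers', 'contoso-ilb')]",
                "privateIpAddress": {
                    "ipAddress": "10.0.0.10",
                    "subnetResourceId": "[variables('subnetResourceId')]"
                },
                "probePort": 59999,
                "sqlVirtualMachineInstances": [
                    "[resourceId('Microsoft.SqlVirtualMachine/sqlVirtualMachines', 'sqlvm1')]",
                    "[resourceId('Microsoft.SqlVirtualMachine/sqlVirtualMachines', 'sqlvm2')]"
                ]
            }
        ]
    }
}
```

Given this single resource, the RP wires up the ILB back-end pool and health probe on the referenced SQL VMs, so no manual load balancer rule configuration is needed.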

Automated Always On AG setup with SQL VM RP simplifies configuring Always On availability groups by handling the infrastructure and network configuration details. It offers a reliable deployment method with the right resource dependency settings and internal retry policies. Try deploying automated Always On availability groups with SQL VM RP today to improve high availability for SQL Server on Azure Virtual Machines.

Start taking advantage of these expanded SQL Server Azure Virtual Machine capabilities enabled by our resource provider today. If you have a question or would like to make a suggestion, you can contact us through UserVoice. We look forward to hearing from you!
Source: Azure

Streamlined IoT device certification with Azure IoT certification service

For over three years, we have helped customers find devices that work with Azure IoT technology through the Azure Certified for IoT program and the Azure IoT device catalog. In that time, our ecosystem has grown to one of the largest in the industry with more than 1,000 devices and starter kits from over 250 partners.

Today, we are taking steps to further grow our device partner ecosystem with the release of the Azure IoT certification service (AICS), a new web-based test automation workflow that is now generally available. AICS significantly reduces the operational overhead and engineering costs for hardware manufacturers to get their devices certified for the Azure Certified for IoT program and showcased in the Azure IoT device catalog.

Over the past year, we’ve made significant improvements to the program such as improving the discovery of certified devices in the Azure Certified for IoT device catalog and expanding the program to support Azure IoT Edge devices. The goal of our certification program is simple – to showcase the right set of IoT devices for our customers’ industry specific vertical solutions and simplify IoT device development.

AICS is designed and engineered to help achieve these goals, delivering on four key areas listed below:

Consistency

AICS is a web-based test automation workflow that works on any operating system and web browser. AICS communicates with its own set of Azure IoT Hub instances to automatically validate devices’ bi-directional connectivity to Azure IoT Hub and other IoT Hub primitives.

Previously, hardware manufacturers had to instantiate their own IoT Hub using their Azure subscription in order to get certified. AICS not only eliminates Azure subscription costs for hardware manufacturers, but also streamlines the certification process through automation. These changes drive higher quality and consistency compared with the manual processes that were in place before.

Additional tests

The certification program for IoT devices has always validated bi-directional connectivity between the device and the IoT Hub cloud service (namely device-to-cloud and cloud-to-device). As IoT devices become more intelligent and support more capabilities, we have expanded our program to also validate the device twins and direct methods IoT Hub primitives. AICS validates these capabilities, and the Azure IoT device catalog showcases them accordingly, making it easy for device seekers to build IoT solutions on these rich capabilities.

The screenshot below shows customizable test cases. By default, device-to-cloud is the required test and all others are optional. This new requirement allows constrained devices such as microcontrollers to be certified.

The screenshot below shows how tested capabilities are shown on the device description page in the device catalog.

Flexibility

Previously, hardware manufacturers were required to use the Azure IoT device SDK to build an app that establishes connectivity from their devices to the cloud managed by Azure IoT Hub services. Based on partners’ feedback, we now support devices that do not use the Azure IoT device SDK to establish connectivity to Azure IoT Hub: for example, devices that use the IoT Hub resource provider REST API to create and manage Azure IoT Hub programmatically, or devices whose manufacturers opt to use an equivalent SDK to establish connectivity.

In addition, AICS allows hardware manufacturers to configure the necessary parameters for customized test cases, such as the number of telemetry messages sent from the devices.

The screenshot below illustrates an example page that shows the ability to configure each test case.

Simplicity

Finally, we have made investments to design a user experience that is simple and intuitive for hardware manufacturers. For example, in the device catalog, we have streamlined the process from device registration to running the validations in AICS through a simple, wizard-driven flow. Hardware developers can easily troubleshoot failed tests through detailed logs that improve diagnosability.

Because it’s a web-based workflow, AICS is simple to service: hardware manufacturers are not required to deploy any standalone test kits (no .exe, .msi, etc.) locally on their devices; such kits tend to become outdated over time.

The screenshot below shows each test case run. Log files show the test pass/fail status along with the raw data sent from device to cloud. The submit button appears only when all selected test cases pass. Once the tests are complete, we will review the results and notify the submitter of the remaining steps to complete the certification process.

Next steps

Go to Partner Dashboard to start your submission.

Effective immediately, all new incoming submissions for certification must be validated via AICS. We also highly recommend that existing certified IoT devices re-certify using AICS because doing so allows us to showcase your additional hardware capabilities.

You can learn more about AICS in this demo video.

If you have any questions, please contact the Azure Certified for IoT team at iotcert@microsoft.com.
Source: Azure

Creating a smart grid with technology and people

This blog post was authored by Peter Cooper, Senior Product Manager, Microsoft IoT.

It’s 1882. Thomas Edison has just followed up his breakthrough invention—the first incandescent lightbulb—by collaborating with J.P. Morgan to open the first industrial-scale power station in the United States.

Flash forward to today: Power generation, distribution, transmission, and consumption now drive business and modern life around the globe. The industry operates on a vast scale, with a complex web of relationships and technology that enables instant, reliable delivery throughout much of the developed world.

And that grid that got its start back in the 19th century? It’s sorely in need of a massive update. Utilities and their partners are searching for new solutions that can meet 21st-century energy challenges: surging demand for electricity, two-way energy flow, increased use of clean energy sources, and stairstep approaches to creating a smart grid to tackle the thorniest challenges first. Here’s a look at the digital transformation of the power and utilities industry that is picking up steam.

The need to make electrical power more sustainable

The current model of power production and delivery won’t sustain fast-paced business and human population growth. Power systems are already coping with spikes, surges, and even blackouts. Who can forget the Northeast blackout of 2003?

Moreover, the existing grid is wasteful, with 285 percent more power loss today than in 1984. Such inefficiency has highly negative consequences for consumers’ need for reliability, climate change, and businesses’ bottom lines.

A 150-year-old industry goes high tech

Fortunately, new technologies are emerging to manage demand, reduce waste, and harness new energy sources and producers. Here are just a few:

Smart meters that communicate their condition via wireless networks, providing consumers with real-time data and aiding in faster resolution of power disruption issues.
Connected home systems that use sensor-tagged equipment and AI-powered smart assistants to fine-tune energy use throughout the house, even achieving “zero net” energy use.
State-of-the-art batteries that store energy, for future use or sale back to the grid.
Microgrids that combine solar panels, fuel cells, and battery energy storage to power neighborhoods.
Connected cars that reduce energy use, can be charged systematically, and serve as movable energy storage devices.
Next-generation distribution and transmission infrastructures that will enable two-way power flow.
The smart grid, which combines multiple innovations to enable systematic load balancing, peak leveling programs, and full leverage of sustainable energy sources.

All of these developments—and more—are making it possible to deliver electricity to the right customer at the right time and in the right manner. Power companies now also can accommodate the two-way flow of energy, as grassroots producers, both businesses and individuals, deploy their own small-scale energy production. These capabilities are being amplified by a new IoT platform, Azure Digital Twins, that uses spatial intelligence to model complex relationships between people, places, and devices in the energy value chain. Let’s take a closer look.

A grid made smarter by digital technology

Without question, the legacy grid needs to be modernized with state-of-the-art infrastructure to improve effectiveness and ensure a reliable flow of continuous power. Creating a smart grid is slated to cost between $476 billion and $880 billion, and it will take years to achieve. But new Internet of Things (IoT) digital technology can “smarten” today’s grid faster and at a lower cost. It can also connect all the players—electricity producers, customers, and transmission and distribution companies—providing continuous feedback to help them make more sustainable choices now.

Agder Energi, a hydropower company in Norway, already has used sensor-linked equipment and Microsoft technology, including Microsoft Azure, Power BI, and Azure IoT Hub, to improve energy forecasting, adapt energy production to changing needs, and empower consumers with insights to manage their energy usage. Now, with Azure Digital Twins, Agder Energi is taking those capabilities to a new level. The technology enables Agder Energi to model grid assets and distributed energy resources and optimize them where needed.

Why is this important? Azure IoT enables power companies like Agder Energi to rapidly identify and address sources of waste, right-size production, prioritize investments, and incorporate new energy producers and sources. For example, if a power company finds that a substation is a major source of energy leakage, the company could fast-track upgrades. Or if demand unexpectedly surges, the power company may elect to add more resources, such as wind or solar energy, to ensure continuous electrical delivery.

The rise of new energy “prosumers”

Where will generation companies harness new energy sources? Meet the new prosumers: businesses and individuals who are both consumers and producers of energy.

Businesses may elect to lease land to a wind farm, use solar panels across company buildings, or run fleets of electric vehicles that both use and store energy. Similarly, consumers are increasingly buying solar panels and electric cars to be more sustainable, as well as using smart meters and analytics to monitor and reduce consumption. Both of these groups are likely to store and sell excess energy back to the grid. While in its infancy, the prosumer market is expected to take off in the near future. Mass adoption of autonomous cars could really galvanize this movement.

Allego is a European provider of smart charging and electric vehicle cloud solutions. The company uses Azure IoT to model all key participants in the charging network, such as regions, charging stations, and vehicles, to simplify the business complexity of planning and executing charging. The solution enables charging stations to more precisely plan energy delivery, prioritize public vehicles such as buses over others, and charge consumer vehicles according to driver preference, among other benefits.

Smart grid technology means new choices

In the very near future, power generation companies will have greater options in how they run their businesses, using IoT-enabled insights to strategically stairstep their way to creating a smart grid and ensure business continuity. Meanwhile, prosumers will be able to align their values and behavior and benefit financially from sustainable choices, encouraging others to do the same.

Learn about Microsoft’s work on sustainable energy management.
Source: Azure

Azure Marketplace new offers – Volume 27

We continue to expand the Azure Marketplace ecosystem. From November 1 to November 16, 2018, 61 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

Virtual machines

CIS Ubuntu Linux 18.04 LTS Benchmark L1: This image of Ubuntu Linux 18.04 is preconfigured by CIS to the recommendations in the associated CIS Benchmark. CIS Benchmarks are vendor-agnostic, consensus-based security configuration guides.

Couchbase Enterprise: Modernize your technology environment with Couchbase's full-featured engagement database. It's adaptive, responsive, scalable, intelligent, highly available, and easy to manage.

CyberPosture Intelligence 1-Month – Trial: Cavirin CyberPosture Intelligence combines automated discovery, monitoring, infrastructure risk scoring, and auto-remediation to help organizations of all sizes leverage the cost savings and agility of the cloud.

CyberPosture Intelligence Annual BYOL: Cavirin CyberPosture Intelligence combines automated discovery, monitoring, infrastructure risk scoring, and auto-remediation to help organizations of all sizes leverage the cost savings and agility of the cloud.

FlashGrid SkyCluster for Oracle RAC: FlashGrid SkyCluster is an engineered cloud system for database high availability. SkyCluster comes as a fully integrated Infrastructure-as-Code template that you can customize and deploy to your IaaS cloud account with a few clicks.

Flexify.IO – Azure Blob/Amazon S3 Data Migration: Migrate data between Azure Blob storage, Amazon S3, and Google Cloud Storage accounts. Flexify.IO validates checksums at every step of migration and retries errors, ensuring your data arrives the same as it was at the source.

Flexify.IO – Multi-Cloud Storage: Flexify.IO enables cloud-agnostic and multi-cloud storage deployments by combining Azure Blob storage, Amazon S3, Google Cloud Storage, and others into a single virtual storage repository and making it available via the S3 API.

Oncore!DOCS, Powered by Nextcloud: Oncore!DOCS is an open-source file sync and share solution designed to be easy to use. It's powered by Nextcloud and optimized for Microsoft Azure.

ont_dev_platform: This is a decentralized application development platform. Users and developers can develop, compile, deploy, and invoke intelligent contracts using a mirror platform composed of Ontology, SmartX (IDE), and Explorer.

Panzura Freedom CloudFS 7.2.2.0 (EARLY ACCESS): New features in 7.2.2.0 include support for virtual NIC and multibyte character support, enabling long path and file names for Japanese and Chinese characters.

Phishing Frenzy on Ubuntu Server: Phishing Frenzy is a software tool for penetration testers that’s built with the Ruby on Rails web-application framework.

RSA NetWitness Platform 11.2: RSA NetWitness Platform v11.2 adds a new User and Entity Behavior Analytics (UEBA) offering, with increased capabilities for detection, response, and forensics activities, as well as added contextual information from RSA Archer.

Scai Analytics & Database Management Web Platform: Scai is a business intelligence and database management web platform for SQL databases and Azure SQL Database. Scai was built for companies that need a powerful, affordable, simple analytics and management tool.

scalearc: The ScaleArc database load balancing software enables an agile data tier that eliminates planned and unplanned downtime, enables failover that avoids application disruption, and delivers instant scalability with no changes to the app or database.

Serverless360: Serverless360 offers powerful message processing for real-time business requirements, integration with major external notification channels, and extensive serverless monitoring for a complete integration solution.

SFTP Secure Server Windows 2016 OpenSSH: This solution is an FTP/FTPS/SFTP server that enables users to access remote files over TCP/IP networks, such as the internet. Unlike FTP, the FTPS and SFTP protocols provide security and strong data encryption.

Topicus KeyHub: Topicus KeyHub provides a complete access management solution for team-based single sign-on, real-time provisioning, and password management.

Ubuntu Server 16.04 LTS + Azure IoT Edge runtime: Azure IoT Edge is a fully managed service that delivers cloud intelligence locally by deploying and running artificial intelligence, Azure services, and custom logic directly on cross-platform IoT devices.

VisualBase: VisualBase is a dynamic business application platform with powerful development tools at runtime. Organizations turn to VisualBase for its capability, flexibility, and reliability.

VisualRM: VisualRM is a risk management tool that empowers organizations to identify, score, categorize, classify, and rank risks, then put in place short-term and long-term mitigation actions and plans.

Web applications

Couchbase Server Enterprise Container: Modernize your technology environment with Couchbase Server Enterprise Container. Built on powerful NoSQL technology, Couchbase Server gives you the flexibility to constantly reinvent the customer experience.

DigiCert Code-Signing Certificates Listing: Code-signing certificates are used by software developers to digitally sign apps, drivers, and software programs as a way for end users to verify that the code they receive has not been altered or compromised by a third party.

Exact Lightweight Integration Server: Exact Lightweight Integration Server offers a central management console to install, update, manage, and monitor all data integrations within your Exact environment.

QuotaGuard Static IPs: QuotaGuard Static IPs are enterprise-ready inbound/outbound proxied static IP services complete with dual-static IP provisions, health monitoring, load balancing, and automated failover for each account.

Container solutions

Git Container Image: Git is an open-source distributed version control system that can handle both small and large projects with speed and efficiency.

Consulting services

1 Week CIO Assessment: CrucialLogics will perform an assessment to determine the current IT landscape and generate an executive deliverable with action plans and next steps for the IT road map on your Microsoft platform (Office 365, Azure, Dynamics).

Accelerating Digital Transformation Assessment: In this one-day workshop, Sopra Steria will assess your technology and application transformational journey in line with your business strategy, then provide a report advising either rebuilding, rehosting, or refactoring your apps.

Azure & Microsoft 365 Security:2-Wk Implementation: Protect your business data, operational infrastructure, and business identity against cyber threats and data breaches with Steeves and Associates’ Enhanced Security + Data Protection Service for Microsoft 365 and Azure.

Azure Backup and Disaster Recovery:1-Hour Briefing: Disasters can happen to any business. In this briefing by Communication Square, you'll learn how you can leverage Azure Backup and Disaster Recovery options to ensure your business keeps progressing.

Azure Backup: 2-Week POC: Get a portion of your workload backed up on Azure and see all the tools and features in action in this proof of concept from Communication Square. Azure Backup enables you to secure your data in the cloud without the extra costs of infrastructure.

Azure Backup: 4-Week Implementation: In this four-week implementation, Communication Square will back up your workload with Azure Backup. Experience lower operational costs, added security, and pay-as-you-go storage.

Azure Data Center Migration: 4 Week Implementation: Spend less and achieve more by moving your datacenter to the cloud. Migrate your data to Azure with the help of Communication Square's Microsoft-certified experts.

Azure Datapath 10-Weeks Implementation: A challenge for many organizations on their cloud journey is how to efficiently migrate large amounts of data to Azure. Azure Datapath is a service provided by Servent to help accelerate and expedite your migration.

Azure Disaster Recovery: 2-Week POC: With Azure Disaster Recovery, you can create a customized plan to keep your business running in any situation. Get a portion of your workload replicated on Azure in this proof of concept from Communication Square.

Azure Disaster Recovery: 4-Week Implementation: In this four-week implementation by Communication Square, you can get your workload replicated on Azure and receive a comprehensive disaster recovery plan for the continuity of your business.

Azure Done Right Migration: Developed by Netsurit, Azure Done Right is a predefined set of planning and migration procedures and tools to assist you in the successful migration of workloads to Microsoft Azure.

Azure Pathway 10-Weeks Implementation: Azure Pathway by Servent helps simplify your journey to Microsoft Azure. Accelerate your migration with step-by-step guidance and technical resources customized for your workloads.

Backup and Recovery: 5-Day POC: In this engagement from Catapult Systems, customers will learn cloud backup best practices and receive a proof of concept that sends two workloads into Azure Backup.

Cloud Coaching Service 8 Week Workshop: This offer from Compositional IT will deliver one-hour coaching sessions with your developers to help keep your cloud applications on the right track.

Cloud Tech Accelerator 5 Day Proof of Concept: Compositional IT's Tech Accelerator is designed to let us solve those specific or edge-case technology problems that would be prohibitively expensive or time-consuming for you to solve yourself.

Database DevOps Jumpstart: 2-Wk Proof of Concept: DevOps unifies software development and software operation, enabling shorter development cycles and increased deployment frequency. This proof of concept from Coeo will reveal the benefits of Azure DevOps Services.

DB Shield-DB Compliance. 4 week Implementation: The DB Shield Azure service protects databases with a set of preconfigured defenses and helps build a custom security policy for your environment.

DevOps Pipeline Assessment: 3 Day Assessment: This assessment by DevOpsGroup provides an in-depth analysis of your automation technology to create best-practice continuous integration/continuous deployment (CI/CD) pipelines.

DevOps Transformation Discovery: 3 Day Assessment: This assessment by DevOpsGroup will help you understand operational and technical challenges, explore technological or policy-related constraints, and establish the needs of your organization.

DevOps with Azure Automation Jumpstart: 2-Wk PoC: Implement DevOps practices to improve agility, collaboration, and productivity in this two-week proof of concept from Coeo.

Empowering Remote Workers: 1 Hour Briefing: In this free briefing, you’ll learn how Communication Square can help you secure devices and protect sensitive data using Microsoft Azure. Empower your remote workforce with secure access to your organization’s data.

Empowering Remote Workers: 4-Week Implementation: Communication Square will meet with you to determine your deployment goals and go over use-case scenarios, then implement Microsoft Intune for mobile device management, with testing and validation.

Enterprise Blockchain Discovery: 1-Day Assessment: In this free assessment from Envision Blockchain Solutions, participants will learn the benefits and values of conducting their business with blockchain solutions.

Envision Workshop SAP on Azure: Cooperation between Microsoft and SAP facilitates the efficient and secure migration of central IT processes to the cloud for business customers. This workshop by SYCOR will detail the benefits of SAP on Azure.

GDPR Compliant Cloud Solutions: 1-Hour Briefing: In this one-hour briefing by Communication Square, learn about GDPR compliance and how to quickly and cost-effectively achieve it. Microsoft Compliance Manager and other Microsoft services will be discussed.

GDPR Compliant Cloud Solutions: 4-Week Imp.: Communication Square will help your organization achieve GDPR compliance, and you'll receive Microsoft Compliance Manager training and insights on data protection capabilities.

Modern Workplace Enablement: 4-Week Implementation: Increase productivity and empower your workforce with the latest collaboration and productivity tools. Communication Square will take care of all aspects of the modern workplace enablement program.

Modern Workplace for Firstline Workers: 4-Wk Imp.: This implementation from Communication Square will help you digitize work, modernize teamwork, and improve security.

Modern Workplace: 1-Hour briefing: Do you want to retire paper and pencil? Do you want a better way to communicate with your team members? Communication Square's one-hour briefing will help you understand what you can achieve and optimize with Microsoft Azure.

SAP Azure 10-Week Implementation: SAP Azure from Servent is our framework to help you migrate SAP to Azure. With this offering, we simplify the deployment and migration process and enable your journey to use SAP on Azure.

Secure Azure 10-Weeks Implementation: The Secure Azure implementation by Servent is a comprehensive set of security solutions. These solutions ensure that when you migrate workloads to Azure, they will be secured following Microsoft and industry standards.

Secure your Data implementation: At Netsurit, our Secure Your Data offer mitigates any threats by implementing effective and proven measures to ensure your data is protected at all times.

Secure your Devices implementation: Netsurit’s team of experts will work with you to create a customized plan to implement tools and processes so you can effectively manage and secure your company devices.

Secure your Identity Implementation: Netsurit’s Secure Your Identity solution enables automated user lifecycle management across HR systems both on-premises and in the cloud.

Setup SentryOne Monitoring Software monitor 5 Svrs: In this implementation, Denny Cherry & Associates Consulting will set up SentryOne monitoring software on a repository server and configure monitoring on up to five servers.

TFS to Azure DevOps Migration: 2 weeks: In this two-week implementation, DevOpsGroup will migrate your current work items, history, and code repository from Team Foundation Version Control (TFVC) to Git for source control.

Source: Azure

Microsoft previews neural network text-to-speech

Applying the latest in deep learning innovation, the Speech Service, part of Azure Cognitive Services, now offers a neural network-powered text-to-speech capability. The preview is available today.

Neural Text-to-Speech makes the voices of your apps nearly indistinguishable from the voices of people. Use it to make conversations with chatbots and virtual assistants more natural and engaging, to convert digital texts such as e-books into audiobooks, to upgrade in-car navigation systems with natural voice experiences, and more.

This release includes significant enhancements since we first revealed Neural Text-to-Speech at Ignite earlier this year.

Enhanced voice quality

The voices sound more robust and natural across a wider variety of user scenarios, achieved by harnessing the following:

Large-scale supervised training with transfer learning across diverse speakers
More features from unsupervised pretraining
A more robust neural model design

Accelerated runtime performance

Runtime performance of the Neural Text-to-Speech engine is now near-instantaneous, achieved through extensive code optimization on hardware accelerators, parallel inference, and model simplifications that balance sound quality against performance. The real-time factor has improved from the previous version to less than 0.05x, meaning 1 second of audio can be generated in under 50 milliseconds. Producing the first byte of audio now runs 6 times faster than before.
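In concrete terms, the real-time factor (RTF) is the ratio of generation time to audio duration. A minimal sketch of the arithmetic:

```python
def synthesis_time(audio_seconds: float, rtf: float) -> float:
    """Time needed to generate `audio_seconds` of audio at a given
    real-time factor (RTF): generation time = RTF * audio duration."""
    return rtf * audio_seconds

# At an RTF of 0.05, 1 second of audio takes 50 ms to generate,
# and a full minute of audio takes about 3 seconds.
print(synthesis_time(1.0, 0.05))   # 0.05 seconds, i.e. 50 ms
print(synthesis_time(60.0, 0.05))  # ~3 seconds
```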

Greater service availability

Neural Text-to-Speech has since expanded to three datacenters across the US, Europe, and Asia. Wherever you are in the world, you can integrate neural voices with reduced latency overhead.

 

With these updates, the Speech Service's Neural Text-to-Speech capability offers a far more natural-sounding voice experience for your users than traditional and hybrid synthesis approaches.

You can use this capability starting today with two pre-built neural voices in English – meet Jessa and Guy. Hear what they sound like.
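A minimal sketch of the SSML body you would send to the text-to-speech endpoint to select a neural voice. The voice identifier "en-US-JessaNeural" is an assumption here; check the Speech Service documentation for the exact names of the Jessa and Guy neural voices.

```python
def build_ssml(text: str, voice: str = "en-US-JessaNeural") -> str:
    """Build a minimal SSML payload selecting a neural voice.

    NOTE: the default voice name is an assumed identifier for the
    Jessa neural voice; verify it against the service's voice list.
    """
    return (
        "<speak version='1.0' "
        "xmlns='http://www.w3.org/2001/10/synthesis' xml:lang='en-US'>"
        f"<voice name='{voice}'>{text}</voice>"
        "</speak>"
    )

print(build_ssml("Hello from a neural voice!"))
```

The resulting string is what goes in the request body of a synthesis call; the surrounding HTTP request and authentication are handled by the Speech SDK or your own REST client.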

Discounts are available during the preview. Visit the Speech Services pricing page for more details.

If you would like to access this capability in Chinese or German, please submit your request.
Source: Azure

Static websites on Azure Storage now generally available

Today we are excited to announce the general availability of static websites on Azure Storage, which is now available in all public cloud regions.

What is a static website?

Static websites are websites served from a pre-defined set of files, with no server-side code. You can now build a static website using HTML, CSS, and JavaScript files that are hosted on Azure Storage. In contrast, if you want to host a dynamic website with the ASP.NET, Java, or Node runtime, use Azure Web Apps and rely on the runtime to generate and serve your web content dynamically.

Static websites can be powerful when combined with client-side JavaScript. You can build a web app using popular frameworks like React.js and Angular and host it on Azure Blob storage. If you need to manipulate or process data on the server side, call the relevant managed Azure service, such as Azure Cognitive Services, or run your own server-side code on Azure Functions.

Get started now

Azure Storage makes hosting websites easy and cost-efficient. When you enable static website hosting on your Azure Storage account, a container named ‘$web’ is automatically created for you. You can then upload your static content to this container for hosting. Your content will be available through a web endpoint (for example, myaccount.z20.web.core.windows.net) and can serve a default page and a 404 page of your choice.

You can enable static website hosting using the Azure portal, Azure CLI, or Azure PowerShell. If you prefer a guided experience, follow the tutorial series on hosting your website on Azure Storage and configuring a custom domain with an SSL certificate.
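As a sketch of the CLI route, the two commands below enable static website hosting and upload a local site folder (the account name and file names are placeholders for your own values):

```shell
# Enable static website hosting on an existing storage account.
az storage blob service-properties update \
    --account-name mystorageaccount \
    --static-website \
    --index-document index.html \
    --404-document 404.html

# Upload your site content to the automatically created $web container.
az storage blob upload-batch \
    --account-name mystorageaccount \
    --source ./site \
    --destination '$web'
```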

A sample website – your own file browser for Azure Storage

One scenario where you might use static website hosting is to build a website to interact with your data in Azure Storage. You can secure your data by protecting your files via RBAC roles and Azure Active Directory authentication, and manipulate the data using the Azure JavaScript SDKs. This example uses the new Azure Storage SDK for JS to list files in Blob storage and render them in a file browser on a statically hosted website. The example uses anonymous authentication to interact with Azure Blob storage, but you can also use Azure AD authentication to restrict access to your data.

Get the sample on GitHub, and try the demo.

There are many other use cases for static websites with today’s distributed architectures. A common use is building a serverless application in the cloud and creating a front end for it using a static website. Watch our photo gallery demo, “Serverless compute architectures with Azure Blob Storage,” from Ignite 2018, or follow the tutorial, “Build a serverless web app in Azure,” to learn more about building serverless architectures fronted with a static website.

Pricing

There are no additional charges for enabling static websites on Azure Storage. The pricing for storing and accessing your data applies and can be viewed on the pricing page. In addition to the storage costs, data egress charges apply, as described in the bandwidth pricing details.

Additionally, you might want to enable Azure CDN to use a custom domain with an SSL certificate, or to make use of features like custom rewrite rules. If you do so, Azure CDN charges will apply; depending on your usage pattern, caching through the CDN may even lower your overall costs.

Feedback

Thank you to everyone who participated in the preview of the static website feature. In the upcoming months we plan to make many enhancements to the feature based on your feedback. Continue providing feedback by posting on Azure Feedback.
Source: Azure

Know exactly how much it will cost for enabling DR to your Azure VMs

Azure offers a built-in disaster recovery (DR) solution for Azure Virtual Machines through Azure Site Recovery (ASR). In addition to the broadest global coverage, Azure has the most comprehensive resiliency strategy in the industry: mitigating rack-level failures with Availability Sets, datacenter failures with Availability Zones, and large-scale events with failover to a separate region using ASR. A common question we get is about the costs associated with configuring DR for Azure virtual machines. We have listened and prioritized.

Configuring disaster recovery for Azure VMs using ASR incurs the following charges:

ASR licensing cost per VM.
Network egress costs to replicate data changes from the source VM disks to another Azure region. ASR uses built-in compression to reduce the data transfer requirements by approximately 60 percent.
Storage costs on the recovery site. This is typically the same as the source region storage plus any additional storage needed to maintain the recovery points as snapshots for recovery.

You can look at this sample cost calculator to estimate DR costs for a three-tier application using six virtual machines. All of the services are pre-configured in the cost calculator. The six virtual machines have 12 Standard SSD disks and 6 Premium SSD disks, so 18 disks will be created in the DR region. Each Standard SSD disk is expected to see about 10 GB of data change per day, and each Premium SSD disk about 20 GB per day. The daily data change rate determines the storage snapshot charges, assuming recovery point retention of 24 hours, which is the default value.

ASR uses compression to reduce the data transferred from the source region to the target region; the compression ratio is usually around 60 percent. So, about 40 percent of the total data changes will be transferred between regions, which comes out to roughly 3 TB per month across all six VMs. The sample cost calculator has all these charges listed.
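The "roughly 3 TB per month" figure can be reproduced with a quick back-of-the-envelope calculation, using the disk counts and churn rates from the sample above:

```python
# Disk counts and daily churn from the sample deployment above.
standard_disks, standard_churn_gb = 12, 10  # Standard SSD: ~10 GB/day each
premium_disks, premium_churn_gb = 6, 20     # Premium SSD: ~20 GB/day each

daily_change_gb = (standard_disks * standard_churn_gb
                   + premium_disks * premium_churn_gb)

# ASR's built-in compression means only ~40% of the changed data
# actually crosses regions.
transfer_factor = 0.4
monthly_egress_gb = daily_change_gb * transfer_factor * 30

print(daily_change_gb)    # 240 GB of raw change per day
print(monthly_egress_gb)  # 2880.0 GB per month, i.e. roughly 3 TB
```

Swap in your own disk counts, churn rates, and retention window to approximate the egress component of your DR bill before opening the full calculator.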

To see how the pricing would change for your particular use case, adjust the appropriate variables. Key in the number of VMs for the ASR license cost. Use the number of managed disks, along with their type and the total data change rate expected across all the VMs, to estimate storage costs in the DR region. Finally, apply the compression factor of 0.4 to the total monthly data change to estimate the bandwidth costs for transferring data between regions.

In addition to the above, you will incur compute costs for the VMs created during a disaster recovery drill (a test failover) or an actual failover. You also pay for any resources, such as load balancers and public IPs, created beforehand that the application and its VMs need in order to work properly after failover.

Disaster recovery between Azure regions is available in all Azure regions where ASR is available. Get started with ASR today. Follow us on Twitter to get the latest updates and share your feedback.

Related links and additional content

Get started by configuring disaster recovery for Azure VMs.
Learn more about the supported configurations for replicating Azure VMs.
Need help? Reach out to the ASR forum for support.
Tell us how we can improve ASR by contributing new ideas and voting up existing ones.

Source: Azure

Taking a closer look at Python support for Azure Functions

Azure Functions provides a powerful programming model for accelerated development and serverless hosting of event-driven applications. Ever since we announced the general availability of the Azure Functions 2.0 runtime, support for Python has been one of our top requests. At Microsoft Connect() last week, we announced the public preview of Python support in Azure Functions. This post gives an overview of the newly introduced experiences and capabilities made available through this feature.

What's in this release?

With this release, you can now develop your Functions using Python 3.6, based on the open-source Functions 2.0 runtime and publish them to a Consumption plan (pay-per-execution model) in Azure. Python is a great fit for data manipulation, machine learning, scripting, and automation scenarios. Building these solutions using serverless Azure Functions can take away the burden of managing the underlying infrastructure, so you can move fast and actually focus on the differentiating business logic of your applications. Keep reading to find more details about the newly announced features and dev experiences for Python Functions.

Powerful programming model

The programming model is designed to provide a seamless and familiar experience for Python developers, so you can import existing .py scripts and modules and quickly start writing functions using code constructs you're already familiar with. For example, you can implement your functions as asynchronous coroutines using the async def keyword, or send monitoring traces to the host using the standard logging module. Additional dependencies to pip install can be configured using the requirements.txt format.
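A minimal sketch of those constructs, using only the standard library as a stand-in. In a real function app the entry point's parameter and return types come from the azure.functions package rather than plain strings, and asyncio.run requires Python 3.7+ (on 3.6, drive the coroutine with an event loop directly):

```python
import asyncio
import logging

logging.basicConfig(level=logging.INFO)

# Sketch of an async Functions entry point; a real app would accept
# azure.functions.HttpRequest and return azure.functions.HttpResponse.
async def main(name: str) -> str:
    logging.info("Handling request for %s", name)
    await asyncio.sleep(0)  # stand-in for awaited I/O such as a binding call
    return f"Hello, {name}!"

print(asyncio.run(main("Azure")))  # Hello, Azure!
```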

With the event-driven programming model in Functions, based on triggers and bindings, you can easily configure the event that'll trigger the function execution and any data sources that your function needs to orchestrate with. Common scenarios such as ML inferencing and automation scripting workloads benefit from this model as it helps streamline the diverse data sources involved, while reducing the amount of code, SDKs, and dependencies that a developer needs to configure and work with at the same time. The preview release supports binding to HTTP requests, timer events, Azure Storage, Cosmos DB, Service Bus, Event Hubs, and Event Grid. Once configured, you can quickly retrieve data from these bindings or write back using the method attributes of your entry point function.
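For example, an HTTP-triggered Python function is wired up declaratively in its function.json; the binding names below are illustrative, and "req" must match the parameter name of the entry point function:

```json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["get", "post"]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ]
}
```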

Easier development

As a Python developer, you don't need to learn any new tools to develop your functions. In fact, you can quickly create, debug, and test them locally using a Mac, Linux, or Windows machine. The Azure Functions Core Tools (CLI) enable you to get started using trigger templates and publish directly to Azure, automatically handling the build and configuration for you.

What's even more exciting is that you can use the Azure Functions extension for Visual Studio Code for a tightly integrated GUI experience that helps you create a new app, add functions, and deploy, all within a matter of minutes. The one-click debugging experience lets you test your functions locally against real-time Azure events, set breakpoints, and evaluate the call stack at the press of F5. Combine this with the Python extension for VS Code, and you have a best-in-class auto-complete, IntelliSense, linting, and debugging experience for Python development, on any platform!

Linux based hosting

Functions written in Python can be published to Azure in two different modes: the Consumption plan and the App Service plan. The Consumption plan automatically allocates compute power based on the number of incoming events. Your app is scaled out when needed to handle load, and scaled back in when events become sparse. Billing is based on the number of executions, execution time, and memory used, so you don't have to pay for idle VMs or reserved capacity in advance.

In an App Service plan, dedicated instances are allocated to your function which means that you can take advantage of features such as long-running functions, premium hardware, Isolated SKUs, and VNET/VPN connectivity while still being able to leverage the unique Functions programming model. Since using dedicated resources decouples the cost from the number of executions, execution time, and memory used, the cost is capped to the number of instances you've allocated to the plan.

Under the covers, both hosting plans run your functions in a Docker container based on the open-source azure-functions/python base image. The platform abstracts away the container, so you're only responsible for providing your Python files and don't need to worry about managing the underlying Azure Functions and Python runtime.

Next steps – get started and give feedback

To get started, follow the links below:

Build your first serverless function using the Azure Functions Python quickstart.
Find the complete Azure Functions Python developer reference.
Follow upcoming features and design discussion on our GitHub repository.
Learn about all the great things you can do with Python on Azure.
See the Python development experience with Azure Functions in action, applied to Machine Learning workloads in the webinar, “Streamline Machine Learning with Python in Azure Functions.”

This release lays the groundwork for various other exciting features and scenarios. With so much being released now and coming soon, we’d sincerely love to hear your feedback. You can reach the team on Twitter and on GitHub. We actively monitor StackOverflow and UserVoice, so feel free to ask questions or leave your suggestions. We look forward to hearing from you!
Source: Azure