Cloud innovations empowering IT for business transformation

Leading the Azure engineering organization over the past several years has been an incredible experience, and one that has taught me a great deal. My discussions with enterprise customers have given me an intimate understanding of the challenges IT teams (including our own) are facing today – How do I innovate with greater agility and faster time to market? How do I modernize our app portfolio? How do I maintain and optimize what I have? How do I manage and secure it all? Amidst all these challenges, however, lies a unique opportunity to align IT with business strategy. And the cloud is the enabling technology that makes this more possible than ever.

Azure is the cloud platform designed for enterprises, built on the fundamental tenets of being global, trusted and hybrid. With Azure infrastructure spanning 34 global regions, we offer twice as many regions as AWS to run your applications, along with unique data sovereignty capabilities. With 47 compliance certifications and attestations, Azure is the most compliant hyperscale cloud on the planet. Our hybrid IT depth ensures your on-premises investments work consistently with Azure, because true hybrid IT means consistency across your entire environment, not just connectivity between your datacenter and the cloud. But what’s exciting and humbling is to see the tremendous value customers are getting by betting on Azure. From Fortune 500 organizations like Wal-Mart, BMW and EcoLab to startups like Soluto and Linukury, our customers save on costs, are more agile and can transform their businesses.

This week at Ignite, we’re unveiling many new Azure capabilities, and you’ll see a common theme across them – enabling IT with cloud infrastructure, security capabilities, holistic management, and world-class support for open source.

Infrastructure for IT innovation

In Azure’s global datacenters lies incredible compute capacity that you can tap into. We want to ensure you can run every workload – meaning all the performance and scale you need, regardless of what you are running. To that end, we are making several announcements today.

New compute offerings – storage optimized, fastest CPU, SAP HANA

We are introducing new Virtual Machine and compute offerings to address your unique application needs – the L-series, H-series and general availability of special-purpose large instances for SAP HANA. This expands upon the recently released N-series, available in preview and offering best-in-class GPU compute VMs.

L-Series – L-series are storage-optimized VMs specially designed for applications requiring low latency, high throughput and large local disk storage, such as NoSQL databases (e.g. Cassandra, MongoDB, Cloudera and Redis) and data warehousing. Built on Intel Haswell processors (Intel® Xeon® processor E5 v3), L-series supports up to 6 TB of local SSD and offers unmatched storage performance. We will be rolling out L-series in the coming weeks.
H-series – Aligned with our commitment to deliver the best-performing technology to market, the H-series sports the fastest CPUs in the public cloud as well as RDMA with InfiniBand, so you can run high performance computing (HPC) applications like computational fluid dynamics, automotive crash testing, and genome and molecular research. H-series VMs provide customers the ability to easily get on-demand HPC infrastructure and use it for faster insight.
Large Instances for SAP HANA – Continuing our commitment to run the largest enterprise applications, I am delighted to announce the general availability of large instances specifically designed for SAP HANA workloads. These purpose-built hardware configurations can run the largest SAP HANA workloads in the public cloud, accommodating SAP HANA OLTP scenarios of up to 3 TB and large scale-out OLAP deployments of up to 32 TB of RAM.
N-series – Earlier in August, we announced the preview release of our new GPU-powered N-series VM sizes. With both a visualization SKU and a compute-focused SKU, these VM sizes offer unparalleled performance for desktop graphical modeling/rendering and deep learning computational models.
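
If you want to check which of these sizes have reached a particular region, the sketch below (not part of the announcement) lists the VM sizes available in a location with the Azure SDK for Python; the packages, subscription ID and region name are assumptions you would replace with your own.

```python
# Hedged sketch: list the VM sizes offered in a region and filter for the H-, L- and
# N-series described above. Assumes the azure-identity and azure-mgmt-compute packages
# and an identity that can read the subscription; IDs and the region are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

subscription_id = "<your-subscription-id>"
region = "eastus"

compute = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

for size in compute.virtual_machine_sizes.list(location=region):
    if size.name.startswith(("Standard_H", "Standard_L", "Standard_N")):
        print(f"{size.name}: {size.number_of_cores} vCPUs, {size.memory_in_mb} MB RAM")
```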

More openness options

Azure is an open platform, deeply committed to leading open source support. Today, nearly one in three VMs deployed on Azure runs Linux. The strong momentum for Linux and open source on Azure is driven by our ongoing innovation and demonstrated commitment. Customers like Johnson Controls and KPMG are using Azure for open source workloads and building modern application architectures, including containers and big data solutions. Two weeks ago, we released a preview of our microservices platform, Service Fabric on Linux, for creating highly scalable, cloud-native applications. Customers can now also provision Service Fabric clusters in Azure using Linux as the host OS and deploy Java applications in these clusters.

This week, we are expanding our open source support with further regional availability of Linux and open source solutions, including on-demand Red Hat Enterprise Linux (RHEL) in Azure Government. We are also adding RHEL support for SAP applications (NetWeaver and HANA). We are committed to great experiences for developers and system admins and to meeting customers where they are, from platform stacks to management tools.

New networking capabilities

Azure provides a rich set of networking features so you have the most performant network and a diverse set of options for your applications. To that end, we are introducing several new capabilities today.

IPv6 support – With the explosive growth of devices powered by the Internet of Things (IoT), compliance regulations and the need to future-proof applications, IPv6 has become more essential than ever before. Today, we’re introducing support for IPv6 for applications within virtual machines on Azure.
Azure DNS – We introduced Azure DNS to provide you the speed, reliability, and convenience of having your DNS services hosted close to your applications and cloud infrastructure. We are excited to make this networking service generally available today.
Accelerated Networking – As applications demand faster performance than ever before, we are previewing the ability for VMs to tap into incredible network performance (up to 25 Gbps). Powered by FPGAs, the network throughput and low latency offered by this capability are unparalleled in the public cloud today.
Web Application Firewall (WAF) – With the addition of WAF capabilities to the Application Gateway service, we’re significantly enhancing your ability to manage application security.
New Virtual Network capabilities like peering, active-active VPN gateways and a new ultra-performance gateway for ExpressRoute all significantly improve how you define network topologies and connect different network environments.

Hybrid cloud enablement

Over 80% of enterprises today have a hybrid cloud strategy – this is the real world for organizations. With decades of experience partnering closely with enterprises, we understand the importance of true hybrid that provides a consistent and comprehensive approach for your entire IT estate. Today, we announced the next step in delivering the power of Azure in your datacenter with the release of Azure Stack Technical Preview 2, bringing more proven cloud innovation such as the Azure Marketplace and security capabilities like Key Vault directly to your datacenter.

We are also bringing cloud-first innovation to your on-premises datacenters and other clouds with the general availability of Service Fabric for Windows Server. With this release, microservice-based Service Fabric applications gain portability and flexibility. This standalone offering provides a runtime that can be installed on Windows Server on-premises or even in other clouds.

Many of our customers turn to Azure for backup and disaster recovery across the enterprise. In addition to Azure integration with offerings from our rich storage partner ecosystem, we now have the Azure StorSimple Virtual Array for remote and branch offices and the existing Azure StorSimple 8000 series hybrid cloud storage offerings for the datacenter. We are also providing data transformation services that make data backed up using StorSimple available in native Azure formats like blobs and disks. This makes it easy for customers to address business needs with services such as Media Services, Search, Analytics and even custom applications.

New Azure SQL Database enhancement

We are announcing general availability of the Temporal Tables feature of Azure SQL Database. Temporal Tables are designed to improve your productivity when you develop applications. They let you focus data analysis on a specific point in time and use a declarative cleanup policy to control retention of historical data. They also enable you to track the full history of data changes in Azure SQL DB without custom coding.
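
As a concrete illustration of the feature (table, column and connection details below are illustrative, not from the announcement), here is a hedged sketch that creates a system-versioned temporal table and then reads it as of a point in time, using pyodbc against an Azure SQL database.

```python
# Hedged sketch: create a temporal (system-versioned) table and query it "as of" an
# instant. Names and the connection string are placeholders.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:<your-server>.database.windows.net,1433;"
    "Database=<your-db>;Uid=<user>;Pwd=<password>;Encrypt=yes;"
)
cur = conn.cursor()

# SYSTEM_VERSIONING keeps every prior row version in the history table automatically.
cur.execute("""
CREATE TABLE dbo.Employee (
    EmployeeId   INT IDENTITY PRIMARY KEY,
    Name         NVARCHAR(100) NOT NULL,
    Position     NVARCHAR(100) NOT NULL,
    SysStartTime DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL,
    SysEndTime   DATETIME2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (SysStartTime, SysEndTime)
) WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.EmployeeHistory))
""")
conn.commit()

# Later: look at the table exactly as it stood at a given moment, no custom change tracking needed.
cur.execute("SELECT Name, Position FROM dbo.Employee FOR SYSTEM_TIME AS OF '2016-09-26T00:00:00'")
for row in cur.fetchall():
    print(row.Name, row.Position)
```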

Securing infrastructure

Security is a primary adoption concern for customers embracing the cloud. We know that staying ahead of sophisticated and ever-evolving cyber threats is a challenging and ongoing process. We can firmly say that trust and security are cornerstones of the Azure platform. We have the largest compliance portfolio, and with it, organizations in even highly regulated industries like financial services can use Azure; nearly 85 percent of the world’s largest banks are Azure customers. We are investing heavily in building a cloud that you can trust, and today we are announcing key enhancements that further bolster the security of our platform.

Compliance portfolio expansion

We are further strengthening our robust compliance portfolio with these new additions.

ISO 22301 Certification – Azure is the only hyperscale cloud service provider to receive a formal certification for business continuity management, demonstrating comprehensive internal guidelines for prevention of, response to, and recovery from disruptive incidents.
EU-US Privacy Shield Framework – Microsoft is also the first cloud vendor to be certified under the new EU-US Privacy Shield Framework for the protection of EU citizens’ personal data, the latest example of the company’s commitment to privacy.
IT-Grundschutz Workbook – Azure has also made available a new security and compliance workbook, IT-Grundschutz for Azure, for customers who are subject to the German Federal Office for Information Security (BSI) information protection standards.

Azure Security Center enhancements

Since its general availability earlier this year, we have continued to enhance Azure Security Center, which offers unmatched security monitoring and management for your cloud resources.

Using Security Center, customers benefit from ongoing security research, resulting in new analytics released today that are designed to detect insider threats, attempts to persist within a compromised system, and the use of compromised systems to mount additional attacks such as DDoS and brute force.
Security Incidents, currently available in preview, have been enriched to correlate alerts from different sources, including alerts from connected partner solutions.
Threat attribute reports are now built in to provide valuable information about attackers, which can be used to remediate threats more quickly.
Security Center also released support for integrated vulnerability assessment from partners like Qualys, along with security assessment of Web Apps and Storage accounts.

Azure Key Vault support for certificates

To better secure your cloud resources and data, Azure Key Vault now extends support to certificates, helping simplify tasks associated with SSL/TLS certificates. This service helps customers enroll and automatically renew certificates from supported third-party Certificate Authorities while providing audit trails within the same environment. Aligning with our approach of working with industry partners, the following Certificate Authorities are supported at GA: DigiCert, GlobalSign and WoSign.
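
A hedged sketch of what certificate management looks like from code, using the current azure-keyvault-certificates package for Python (which post-dates this post); the vault URL and certificate details are placeholders, and enrolling with a third-party CA additionally requires configuring that issuer in the vault first.

```python
# Hedged sketch: create (and let Key Vault manage) a certificate. For a CA-issued
# certificate, replace issuer_name with an issuer you have configured in the vault
# (e.g. via create_issuer); "Self" below produces a self-signed certificate.
from azure.identity import DefaultAzureCredential
from azure.keyvault.certificates import CertificateClient, CertificatePolicy

client = CertificateClient(
    vault_url="https://<your-vault>.vault.azure.net",
    credential=DefaultAzureCredential(),
)

policy = CertificatePolicy(
    issuer_name="Self",
    subject="CN=www.contoso.com",
    validity_in_months=12,
)

poller = client.begin_create_certificate(certificate_name="contoso-tls", policy=policy)
certificate = poller.result()
print(certificate.name, certificate.properties.expires_on)
```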

General Availability of Encryption Services

With the general availability of the following encryption services, we help customers protect and safeguard their data and meet their organizational security and compliance requirements.

With the availability of Azure Disk Encryption for both Windows and Linux Standard VMs, customers can protect and safeguard their OS disks and data disks at rest using industry-standard encryption technology.
Two weeks ago, we announced the general availability of Storage Service Encryption for Azure Blob Storage. For accounts that have encryption enabled, data is encrypted with Microsoft-managed keys using the industry-leading 256-bit Advanced Encryption Standard (AES-256) algorithm.
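
For reference, here is a hedged sketch of what enabling blob encryption on a storage account looks like through the management SDK for Python; names are placeholders, and note that on newer accounts storage service encryption is enabled by default.

```python
# Hedged sketch: turn on Storage Service Encryption for the Blob service of an account.
# Resource names are placeholders; model/parameter names may vary slightly by SDK version.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import (
    Encryption, EncryptionService, EncryptionServices, StorageAccountUpdateParameters,
)

storage = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

storage.storage_accounts.update(
    "<resource-group>",
    "<storage-account>",
    StorageAccountUpdateParameters(
        encryption=Encryption(
            services=EncryptionServices(blob=EncryptionService(enabled=True)),
            key_source="Microsoft.Storage",  # Microsoft-managed keys, AES-256 under the hood
        )
    ),
)
```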

Management from the cloud

Cloud intelligence and cloud scale give you new options when it comes to management. As you take advantage of the agility of Azure, you need to closely monitor and analyze the utilization and performance of your Azure resources. Today, we announced the preview of Azure Monitor, which provides better insights by enabling you to collect performance and utilization data, activity and diagnostics logs, and notifications from your Azure resources. With workloads on-premises, in Azure or spanning both, you need a unified view and the tools to drill down deep when required. Delivered from Azure, Operations Management Suite gives you a comprehensive hybrid cloud management platform. With Azure Monitor and Operations Management Suite Insight & Analytics, all data from your workloads and applications in Azure, on-premises, in AWS, and on VMware is now at your fingertips.

Empowering the IT cloud journey

While we’re committed to continuous innovation to deliver the world’s leading cloud platform, staying abreast of rapidly changing technologies can be difficult. Microsoft is therefore providing free resources to help IT professionals through their cloud career journey, from planning a career path to getting started with Azure to hands-on practice with the latest cloud technology.

The Microsoft IT Pro Cloud Essentials program will help you get started with $300 in Azure credits, a free support incident, free Pluralsight courses and certification discounts. Today we are expanding availability of the Microsoft IT Pro Cloud Essentials and IT Pro Career Center programs to 25 languages.
Microsoft IT Pro Career Center can help you navigate the skills needed to transition to a cloud role. 
Microsoft Tech Community provides a modern digital community where you can ask questions, exchange ideas, and build connections with Microsoft Most Valuable Professionals (MVPs), Microsoft engineers and peers. Finally, to stay current with the latest Microsoft cloud technologies, subscribe to the Microsoft Mechanics YouTube channel for weekly IT-focused videos.

I strongly believe that these investments and innovations on Azure are contributing in a significant way to business transformation. I thank you for being a part of Azure’s incredible growth and am very interested in hearing your feedback on the new releases we have on Azure!
Source: Azure

Azure Service Fabric for Windows Server now GA

Enterprises today need to walk a fine line between innovation and delivering reliable services. Firms need to be able to rapidly create and run mission-critical enterprise applications that can capture new areas of growth, increase their exposure in the market and meet changing customer needs. At the same time, system reliability is equally important, since application downtime has a real cost to a business’s reputation, finances, and customer loyalty. For example, customers expect an online banking or e-commerce site to be up and running any time of day across any browser, device, or app. A company that doesn’t meet these 24/7 availability expectations is at risk of losing customers to its competitors. Increasingly, businesses are turning to the cloud to develop and manage their applications at scale and with high availability.

Microsoft’s Azure Service Fabric, our microservices application platform for developing and managing cloud-scale applications, was released last year.

I am excited to announce today that Azure Service Fabric for Windows Server will be generally available for download at no cost. With today’s announcement, customers can now provision Service Fabric clusters in their own data centers or other cloud providers and run production workloads with the option to purchase support for ultimate confidence. One such customer is Owners.com, an online platform that gives consumers a convenient and cost-effective way to buy or sell a home.

"Our on-premise installation of Azure Service Fabric is a robust and highly scalable platform on which we&;ve been able to build very complex software as a collection of easily manageable modules.  This new paradigm of service development allows us to rapidly develop, test, and deploy (with zero downtime), all while meeting tight SLAs for our production environment." Marion Denny, Director of Engineering at Owners.com

We unveiled Service Fabric preview on Linux earlier this month, furthering our vision to enable developers to build Service Fabric applications on the OS of their choice and run them wherever they want. Battle-hardened internally at Microsoft for almost a decade, Service Fabric has been powering highly scalable services like Cortana, Intune, Azure SQL Database, Azure DocumentDB, and Azure’s infrastructure. We’ve seen tremendous response from our customers and great momentum since our recent GA at Build 2016.

Azure Service Fabric allows the creation of clusters on any machine running Windows Server or Linux, which means that you can deploy and run Service Fabric applications in any environment that contains a set of interconnected computers, be it on-premises or with any cloud provider. Azure Service Fabric for Windows Server enables you to create clusters on Windows Server machines, with a particular focus on running Service Fabric in your own datacenters. This means you get benefits such as:

Using data center resources you already own and developing microservice architectures on-premises before moving to the cloud.
Creating clusters on other cloud providers if you choose.
Deploying Service Fabric applications to any cluster with minimal to no changes, which adds a layer of reliability because you can move your applications to another deployment environment.
Developer knowledge of building Service Fabric applications and the operational experience of running and managing Service Fabric clusters carries over from one hosting environment to another.

We’re excited that with our continuous updates to Service Fabric, more businesses can take advantage of our innovations to develop and power their applications. Learn more about how to get started with Service Fabric.
Source: Azure

Announcing the public preview of Azure Monitor

Today we are excited to announce the public preview of Azure Monitor, a new service making built-in monitoring available to all Azure users. This preview release builds on some of the monitoring capabilities that already exist for Azure resources. With Azure Monitor, you can consume metrics and logs within the portal and via APIs to gain more visibility into the state and performance of your resources. Azure Monitor provides the ability to configure alert rules to get notified or to take automated actions on issues impacting your resources. It enables analytics, troubleshooting, and a unified dashboarding experience within the portal, in addition to enabling a wide range of product integrations via APIs and data export options. In this blog post, we will take a quick tour of Azure Monitor and discuss some of the product integrations.

Quick access to all monitoring tasks

With Azure Monitor, you can explore and manage all your common monitoring tasks from a single place in the portal. To access Azure Monitor, click on the Monitor tab in the Azure portal. You can find Activity logs, metrics, diagnostics logs, and alert rules as well as quick links to the advanced monitoring and analytics tools. Azure Monitor provides these three types of data – Activity Log, Metrics, and Diagnostics Logs.

Activity Log

Operational issues are often caused by a change in the underlying resource. The Activity Log keeps track of all the operations performed on your Azure resources. You can use the Activity Log section in the portal to quickly search and identify operations that may impact your application. Another valuable feature of the portal is the ability to pin Activity Log queries to your dashboard to keep tabs on the operations you are interested in. For example, you can pin a query that filters Error-level events and keep track of their count in the dashboard. You can also perform instant analytics on the Activity Log via Log Analytics, part of Microsoft Operations Management Suite (OMS).

Metrics

With the new Metrics tab, you can browse all the available metrics for any resource and plot them on charts. When you find a metric that you are interested in, creating an alert rule is just a single click away. Most Azure services now provide out-of-the-box, platform-level metrics at 1-minute granularity and 30-day data retention, without the need for any diagnostics setup. The list of supported resources and metrics is available here. These metrics can be accessed via the new REST API for direct integration with 3rd party monitoring tools.
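
As a hedged sketch of that REST access (the resource ID, metric name and api-version below are illustrative; check the current reference for your resource type), metrics can be read with a plain HTTP GET:

```python
# Hedged sketch: read platform metrics for a resource via the Azure Monitor REST API.
import requests
from azure.identity import DefaultAzureCredential

resource_id = (
    "/subscriptions/<sub-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Compute/virtualMachines/<vm-name>"
)
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

resp = requests.get(
    f"https://management.azure.com{resource_id}/providers/microsoft.insights/metrics",
    params={"api-version": "2018-01-01", "metricnames": "Percentage CPU"},
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()
for metric in resp.json().get("value", []):
    print(metric["name"]["value"], metric["unit"])
```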

Diagnostics logs

Many Azure services provide diagnostics logs, which contain rich information about operations and errors that are important for auditing as well as troubleshooting purposes. In the new Diagnostic logs tab, you can manage diagnostics configuration for your resources and select your preferred method of consuming this data.

Alerts & automated actions

Azure Monitor provides you the data to quickly troubleshoot issues. But you want to be proactive and fix issues before they impact your customers. With alert rules, you can get notified whenever a metric crosses a threshold. You can receive email notifications or kick off an Automation runbook or webhook to fix the issue automatically. You can also configure your own metrics using the custom metrics and events APIs to send data to the Azure Monitor pipeline and create alert rules on them. With the ability to create alert rules on platform, custom and app-level metrics, you now have more control over your resources. You can learn more about alert rules here.
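
To show the webhook side, here is a minimal hedged sketch of a receiver for alert notifications; the payload fields accessed below are assumptions (the exact shape depends on the alert type), and Flask is just one convenient way to host the endpoint.

```python
# Hedged sketch: a tiny HTTP endpoint that an Azure alert rule's webhook action can call.
# Field names in the payload are illustrative; a real handler would validate and act on them.
from flask import Flask, request

app = Flask(__name__)

@app.route("/azure-alert", methods=["POST"])
def azure_alert():
    payload = request.get_json(force=True, silent=True) or {}
    status = payload.get("status")                       # e.g. "Activated" / "Resolved" (assumed)
    alert_name = payload.get("context", {}).get("name")  # assumed location of the rule name
    print(f"Alert webhook received: status={status}, rule={alert_name}")
    # ...kick off remediation, open a ticket, post to chat, etc.
    return "", 200

if __name__ == "__main__":
    app.run(port=8080)
```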

Single monitoring dashboard

Azure provides you a unique single dashboard experience to visualize all your platform telemetry, application telemetry, analytics charts and security monitoring. You can share these dashboards with others on your team or clone a dashboard to build new ones.

Extensibility

The portal is a convenient way to get started with Azure Monitor. However, if you have a lot of Azure resources and want to automate the Azure Monitor setup, you may want to use a Resource Manager template, PowerShell, the CLI, or the REST API. Also, if you want to manage access permissions to your monitoring settings and data, look at the monitoring roles.

Product integrations

You may need to consume Azure Monitor data but want to analyze it in your favorite monitoring tool. This is where the product integrations come into play – you can route the Azure Monitor data to the tool of your choice in near real time. Azure Monitor enables you to easily stream metrics and diagnostic logs to OMS Log Analytics to perform custom log search and advanced alerting on the data across resources and subscriptions. Azure Monitor metrics and logs for Web Sites and VMs can be easily routed to Visual Studio Application Insights, unlocking deep application performance management within the Azure portal.

The product integrations go beyond what you see in the portal. Our partners bring additional monitoring experiences, which you may wish to take advantage of. We are excited to share that there is a growing list of partner services available on Azure to best serve your needs. Please visit the supported product integrations list and give us feedback.

To wrap up, Azure Monitor helps you bring together the monitoring data from all your Azure resources and combine it with the monitoring tool of your choice to get a holistic view of your application. Here is a snapshot of a sample dashboard that we use to monitor one of our applications running on Azure. We are excited to launch Azure Monitor and look forward to the dashboards that you build. Review the Azure Monitor documentation to get started and please keep the feedback coming.

Source: Azure

Azure Networking announcements for Ignite 2016

This week we are announcing several new Azure networking services and features to provide customers greater performance, higher availability, better security and more operational insights. We will continue to innovate to make it even easier and more seamless for customers to run their services in the public cloud. For an overview of all our exciting Azure Ignite announcements please see Jason Zander’s blog post.

Higher Performance

Azure has been developing Microsoft’s cloud-scale datacenter infrastructure for over nine years. Early on we realized that building network infrastructure for mega-scale datacenters and ever-increasing data transfer rates required fundamental shifts in networking technology. We have been working across the industry to promote and develop cutting-edge networking solutions, including Microsoft-developed hardware.

Today we are announcing breakthrough advancements across our entire global server fleet that will improve networking bandwidth performance by 33% to 50%. This is achieved by utilizing hardware technologies such as NVGRE offload, which harnesses the network processing capabilities of the hardware. Windows and Linux VMs will experience these performance improvements while returning valuable CPU cycles to the application. Our worldwide deployment will complete in 2016, and once it does we will update our VM Sizes table to reflect these new performance benefits.

To provide even more performance, we are very excited to announce the public preview of Accelerated Networking. Accelerated Networking provides up to 25 Gbps of throughput and reduces network latency by up to 10x. Applications will benefit from a new generation of hardware technologies including SR-IOV, allowing VMs to communicate directly with the hardware NIC, completely bypassing the hypervisor’s virtual switch. Along with higher bandwidth and lower latency, applications will experience reduced jitter and improved packets per second (PPS) performance. With Accelerated Networking, Azure SQL DB In-Memory OLTP transaction performance improved 1.5x. Also with this preview, DS15v2 and D15v2 VM sizes provide up to 25 Gbps of network throughput. More details on regional availability and a link to sign up for the preview are available at Accelerated Networking for a virtual machine.
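
For completeness, a hedged sketch of how you opt a NIC into Accelerated Networking with today's Azure SDK for Python (during the 2016 preview the mechanism was a sign-up, so treat this as the current shape rather than the preview flow); names are placeholders, and the attached VM must be deallocated and of a supported size.

```python
# Hedged sketch: enable Accelerated Networking on an existing NIC. Older SDK versions
# expose create_or_update instead of begin_create_or_update.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

network = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

nic = network.network_interfaces.get("<resource-group>", "<nic-name>")
nic.enable_accelerated_networking = True
network.network_interfaces.begin_create_or_update("<resource-group>", "<nic-name>", nic).result()
```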

In addition, Azure Storage users will benefit from substantially increased IOPS performance based on these advancements, combined with newly developed storage-specific offloads. Hardware now efficiently performs data transfers up to the line rate of the NIC. The rollout for Storage will also complete in 2016.

We are announcing the general availability of Virtual Network Peering (VNet Peering). VNet Peering connects Virtual Networks (VNets) in the same region, enabling direct full mesh connectivity. VMs in the peered VNets communicate with each other as if they are part of the same VNet, thus benefiting from high bandwidth and low latency. Hub & Spoke topologies are supported with Transit Routing through gateways. The VNet without a gateway still has cross-premises connectivity via the gateway in the peered VNet. VNet Peering works across subscriptions allowing for simplified service management.

This allows consolidation of VPN gateways and network virtual appliances in the same region, simplifying management and reducing costs. User-defined routes (UDRs) and Network Security Groups (NSGs) provide fine-grained control over traffic between the peered VNets. VNet Peering also enables co-existence of “classic” VNets and Azure Resource Manager VNets, allowing for incremental adoption of the Azure Resource Manager model.
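
A hedged sketch of setting up one direction of a peering with the Azure SDK for Python (names and IDs are placeholders; the reverse peering must be created on the remote VNet as well):

```python
# Hedged sketch: peer "hub-vnet" to a spoke VNet. Traffic flows only once a matching
# peering exists in the opposite direction.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import SubResource, VirtualNetworkPeering

network = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

remote_vnet_id = (
    "/subscriptions/<sub-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Network/virtualNetworks/spoke-vnet"
)

network.virtual_network_peerings.begin_create_or_update(
    "<rg>", "hub-vnet", "hub-to-spoke",
    VirtualNetworkPeering(
        remote_virtual_network=SubResource(id=remote_vnet_id),
        allow_virtual_network_access=True,  # VMs communicate as if on one VNet
        allow_forwarded_traffic=False,
        allow_gateway_transit=True,         # let the spoke use the hub's gateway (hub side)
    ),
).result()
```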

Many enterprise customers use ExpressRoute to connect their private networks to Microsoft. ExpressRoute is supported by a large ecosystem of global telecom providers, cloud exchanges and service providers in over 35 locations. Today, we are introducing the UltraPerformance Gateway SKU for ExpressRoute that supports up to 10 Gbps throughput. This is a 5x improvement over the existing ExpressRoute HighPerformance gateway with a 99.95% availability SLA. With the UltraPerformance Gateway, customers can deploy even more networking intensive services and workloads into their virtual networks.

Cloud applications with demanding networking and massive real time data access requirements will greatly benefit from these new performance enhancements. We are ready for your workload.

IPv6

Azure now supports native IPv6 network connectivity for applications and services hosted on Azure Virtual Machines. The demand for IPv6 has never been greater, with the explosive growth in mobile devices, billions of Internet of Things (IoT) devices entering the market, and new compliance regulations. IPv6 has been used by internal Microsoft services such as Office 365 for over three years. We are now offering this feature to all Azure customers. Native IPv6 connectivity to the virtual machine is available for both Windows and Linux VMs.

Higher Availability

Our new Active-Active Virtual Private Network (VPN) Gateway, available for the High-Performance VPN gateway SKU, is recommended for production workloads. Availability requires a complete end-to-end perspective that includes the customer’s on-premises VPN devices and the use of different service providers to connect to the Active-Active VPN gateway. Each VPN gateway has two active instances. Customers can now implement dual redundancy for cross-premises VPN connections, increasing the availability of their VPN connections to their Azure VNets. All customers should consider adopting the new Active-Active VPN Gateway.

Our customers need more degrees of freedom for their Azure Load Balancer configurations. Today, we are making several announcements to increase design flexibility, enable new scenarios, and allow efficient resource consolidation.

We are announcing general availability of multiple VIPs on internal load balancers and new port reuse options across public and internal load balancers. In the coming week, we will be previewing two additional capabilities in specific regions: multiple IP addresses on a network interface card (NIC) and enabling all NICs on a VM to have a public IP address, either directly on the NIC or through the load balancer. Check the service update page for the availability of these capabilities.

Network Virtual Appliances (NVA) can now offer more flexible configurations. A firewall appliance can expose an Internet facing service on NIC 1 and an internal management service on NIC 2 using the same backend machines. In addition, an NVA can use a single NIC to host multiple services by securing individual private IP addresses per customer/service. Security can be further enhanced using NSG rules targeted at individual IP addresses.

Another use case is SQL AlwaysOn with multiple listeners, which is now available in preview. You can also host multiple availability groups on the same cluster and optimize the number of active replicas.

Azure DNS

We are announcing the GA release of Azure DNS. Customers can now host domains in Azure DNS and manage DNS records using the same credentials, APIs, tools, billing and support as other Azure services. Azure DNS also benefits from Azure Resource Manager’s enterprise-grade security features, enabling role-based access control and detailed audit logs. Azure DNS supports multiple record types, including A, AAAA, CNAME, MX, NS, PTR, SOA, SRV and TXT, and comes with a 99.99% availability SLA.

Azure DNS uses a global network of name servers, providing exceptionally high availability, even in the event of a multi-region failure or network partitioning.  DNS queries are answered by the closest available DNS server for the fastest possible query performance.

With Azure DNS, IT pros can manage DNS zones and records using either the Azure Portal or scripting with Azure PowerShell or the cross-platform Azure CLI. Developers can use the Azure DNS REST API or SDK to automate DNS record provisioning as part of their application workflows. In both cases, fast DNS record provisioning avoids the need to wait for new DNS records to propagate to the name servers.
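
As a hedged sketch of that SDK path (zone, record and address values are placeholders, and attribute names can differ slightly between azure-mgmt-dns versions):

```python
# Hedged sketch: create a "www" A record in a zone hosted in Azure DNS.
from azure.identity import DefaultAzureCredential
from azure.mgmt.dns import DnsManagementClient
from azure.mgmt.dns.models import ARecord, RecordSet

dns = DnsManagementClient(DefaultAzureCredential(), "<subscription-id>")

dns.record_sets.create_or_update(
    "<resource-group>",
    "contoso.com",   # zone name
    "www",           # relative record set name
    "A",             # record type
    RecordSet(ttl=3600, arecords=[ARecord(ipv4_address="203.0.113.10")]),
)
```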

More Secure

Last year we introduced Application Gateway, an Application Delivery Controller (ADC) offering Layer 7 load balancing as a service. This complements Azure Traffic Manager (DNS load balancing) for load balancing across geographical regions and Azure Load Balancer for Layer 4 load balancing within a region (availability set). Over the past year we have enhanced Application Gateway to better address web application requirements. These capabilities include SSL termination, round-robin load distribution, cookie-based session affinity, URL path-based routing, the ability to host multiple web applications on the same load balancer, rich diagnostics with access and performance logs, WebSocket support, VM scale set support and user-configurable health probes.

In our continued effort to provide enhanced application security, Application Gateway now supports end-to-end SSL encryption and user-configurable SSL policies. Customers can secure end-to-end communication from user requests to the backend using SSL/TLS, while taking advantage of routing rules set on the Application Gateway. The user’s SSL request is terminated at the gateway, which applies user-configured routing rules and then re-encrypts the request before sending it to the backend. User-configurable SSL policies allow customers to selectively disable older SSL/TLS protocol versions, further strengthening the security profile of the applications behind the Application Gateway.

To provide even more advanced security to protect web applications from common vulnerabilities like SQL injection or cross-site scripting attacks, we are announcing the public preview of Web Application Firewall (WAF) as part of the Application Gateway service.

Application Gateway WAF offers simplified manageability of application security and comes preconfigured with protection from the most prevalent web vulnerabilities, as identified by the Open Web Application Security Project (OWASP) top 10. Customers can run Application Gateway WAF in either protection or detection-only mode. Application Gateway WAF also provides real-time metrics and alert reporting to continuously monitor web applications against exploits. Security rule customization and integration with Azure Security Center will be available soon.
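
A hedged sketch of enabling the WAF configuration on an existing WAF-SKU Application Gateway with the Azure SDK for Python (names are placeholders; rule set values follow the OWASP core rule set exposed by the service):

```python
# Hedged sketch: switch an Application Gateway's WAF into detection mode.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import ApplicationGatewayWebApplicationFirewallConfiguration

network = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

appgw = network.application_gateways.get("<resource-group>", "<appgw-name>")
appgw.web_application_firewall_configuration = ApplicationGatewayWebApplicationFirewallConfiguration(
    enabled=True,
    firewall_mode="Detection",   # or "Prevention" to block matching requests
    rule_set_type="OWASP",
    rule_set_version="3.0",
)
network.application_gateways.begin_create_or_update("<resource-group>", "<appgw-name>", appgw).result()
```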

Network Monitoring and Diagnostics

As the cloud matures, it is important to offer not only the same networking performance but also the same level of visibility and insight as on-premises solutions. We are committed to continually enhancing our monitoring and diagnostics capabilities, empowering you to more easily manage your networks.

Our earlier announced monitoring and diagnostics capabilities include:

Network Security Group events and counters
SLB resource exhaustion event and Probe health status
Application Gateway performance and access logs
Audit logs for all Networking resources

We are now releasing a series of new capabilities:

Customers can view performance metrics for an Application Gateway in the Azure Portal. The metrics require no additional configuration. The current release supports continuously aggregated throughput statistics, and additional metrics support will be available soon.

Customers can configure threshold-based alerts on metrics to proactively monitor the network. An alert can send an email notification or invoke a webhook that can integrate with third-party messaging services.

ExpressRoute users can get operational insights into routing configurations and network peering statistics.

Improved diagnostics for Network Security Groups (NSGs) and routes enable you to better diagnose complex network connectivity problems. Effective Routes provides an aggregated view of the user-defined routes (UDRs), system routes and BGP routes that affect a VM’s network traffic flow.

The ability to easily verify the correctness of network security settings is critical. The effective security rules view offers a comprehensive yet simplified and intuitive way to understand the security rules configured on a VM or NIC.

You can expect a lot more capabilities in the coming months.

Looking forward

The cloud is ever evolving, and customers are deploying more demanding and complex workloads. This evolution means our mission and commitment are not an endpoint but a journey. We hope you spend time exploring the new services, features and capabilities and provide your valuable feedback as we continue to create, enhance, and deploy new networking technologies to meet your needs.
Source: Azure

Announcing Azure DNS General Availability

Today, we are excited to announce the General Availability of Azure DNS. As a global service, it is available for use in all public Azure regions.

We announced the Public Preview of Azure DNS at the Ignite conference in May of last year. Since then the service has been used by thousands of customers, whose valuable feedback has helped drive engineering improvements and mature the service.

With this announcement, Azure DNS can now be used for production workloads. It is supported via Azure Support, and is backed by a 99.99% availability SLA.

As with other Azure services, Azure DNS offers usage-based billing with no up-front or termination fees. Azure DNS pricing is based on the number of hosted DNS zones and the number of DNS queries received (in millions). General availability pricing applies from 1 November 2016; until then, the 50% preview pricing discount continues to apply.

Key Features and Benefits

Azure DNS enables you to host your DNS domains and manage your DNS records in Azure.

Hosting and managing your DNS in Azure provides the following benefits:

Reliability – Azure DNS has the scale and redundancy built-in to ensure high availability for your domains—and is backed by our 99.99% uptime SLA. As a global service, Azure DNS is resilient to multiple Azure region failures and network partitioning for both its control plane and DNS serving plane.
Performance – Our global network of name servers uses ‘anycast’ networking to ensure your DNS queries are always routed to the closest server for the fastest possible response.
Ease of use – Your DNS zones and records in Azure DNS can easily be managed via the Azure Portal, Azure PowerShell, or cross-platform Azure CLI. Application integration is supported via our SDK or REST API.
Security – Azure DNS benefits from the same authentication and authorization features as other Azure services, including the ability to configure multi-factor authentication and role-based access controls.
Convenience – Hosting your DNS in Azure enables you to manage your Azure applications and their DNS records in one place, using a single set of credentials, with a single bill and with end-to-end support.

Key features of Azure DNS include:

All common DNS record types—Azure DNS supports all of the DNS record types most commonly used in customer domains: A, AAAA, CNAME, MX, NS, PTR, SOA, SRV and TXT.  (SPF records are supported via the TXT record type, as per the DNS RFCs.)
Easy migration – Migrating your existing domain hosting to Azure DNS is quick and easy using our zone file import feature, which enables existing DNS zones to be imported into Azure DNS in a single command. This feature is available via Azure CLI on Windows, Mac and Linux.
Fast record propagation – When you create a new DNS record in Azure DNS, that record is available at our name servers in just a few seconds. You can verify name resolution and move on to your next task without having to wait.
Record-level access control – As you would expect, Azure Resource Manager’s role-based access controls can be applied to restrict which users and groups are able to manage each DNS domain. In addition, these permissions can be applied on individual DNS record sets. This is particularly useful for large enterprises, in which shared zones are common, enabling different teams to self-manage their own DNS records without having access to records owned by other teams.

Third-Party Support

Men & Mice is a leading supplier of DNS, DHCP and IPAM (DDI) software solutions for enterprise networks operating on diverse and distributed platforms. Deployed on top of existing network infrastructure, the Men & Mice Suite consolidates hybrid environments and offers comprehensive tools for managing large portfolios of DNS domains and IP address blocks, via a powerful user interface or via the robust Men & Mice APIs.

The Men & Mice team has been working with the Azure DNS team during the Azure DNS Preview to implement full support for Azure DNS in the Men & Mice Suite. This combination offers customers with large domain portfolios the flexibility and power of the Men & Mice Suite for large-scale DNS management together with the low cost, convenience, availability and performance of hosting their domains in Azure DNS.

Magnus E. Bjornsson, CEO of Men & Mice, says: “We are proud partners of Microsoft and embrace the opportunity to join forces with this leader in the field of IT. Our mutual collaboration enhances the value of the open and adaptable Men & Mice Suite to our customers. We look forward to continuing the development of third-party support products in close cooperation with Microsoft.”
Source: Azure

I Went To My Own Digital Funeral

BuzzFeed News

A few weeks ago, I went to my own funeral. Or at least a simulation of my own funeral. I was sitting in an auditorium, alone except for a trim young man in a black suit, who walked up to a lectern and began speaking. “Good evening,” he said. “We are here to honor the memory of Doree Shafrir. Doree was a beloved friend, daughter, and wife. Our thoughts go out to her loved ones on this day.”

It was more than a little jarring, sitting there listening to this guy talk about me. Doree, he said, was “committed to her work, to social justice and to literature. She showed support to women she’d never even met, and gave platforms to voices of color.” He went on like this for another minute or so, talking about how I’d passed away and “left an empty place” in the hearts of my loved ones. Next, there was a video — all my tweets, scrolling on a huge screen in front of me — and it was only then that I truly started recoiling. My legacy was going to be my tweets about Justin Bieber’s fling with Bronte Blampied, my neighbors’ love of Project Runway, my excitement about wearing a dress with pockets to a wedding.

I was at LACMA, the LA County Museum of Art, for an interactive exhibit put on by an organization called the Hereafter Institute, which was started by the 34-year-old artist Gabriel Barcia-Colombo. The pitch was vague: The Hereafter Institute, I was told, “evaluates a person’s digital afterlife using new technologies.” The “funeral” was the culmination of a half-hour personal tour through a series of exhibits meant to inspire reflection and conversation on our digital afterlives.


For centuries, people have been trying to figure out how to achieve immortality — or at least extend their lifespans. Today, billionaires like Larry Ellison, Peter Thiel, and Sergey Brin are spending part of their fortunes on research that they hope will allow them to extend their lifespans. Perhaps the most radical ideas are coming out of Dmitry Itskov’s 2045 Initiative, an organization that hopes to eventually be able to meld human heads with robot bodies. For the non-billionaires among us, digital immortality will have to do.

I’ve long been fascinated by the posthumous digital lives of others, but I’d never really thought about what would happen to my own self-created online presence after I’m dead — and more important, how it could be manipulated, even by people with the best of intentions. As someone who likes to maintain a modicum of control over her online presence (don’t we all?), this notion started to feel more than a little bit scary. What would someone who doesn’t know me infer about who I was based solely on my online presence? At least when I’m alive, my social media is a constantly updated, organically changing thing; once I’m dead, it’s all frozen in amber. Would that same online presence serve as a comfort to people who knew me, a kind of poignant memorial? Or, most terrifyingly of all, would no one care?

A “funeral” at the Hereafter Institute, an installation at LACMA.

Courtesy Gabriel Barcia-Colombo

I’m not proud of the fact that when I hear about a celebrity dying, I check to see what their last tweet was. I obsessively read the Last Message Received Tumblr, which posts the last communication (usually texts) that people got from exes, or family and friends who died; the ones that are the most painful to read are the mundane ones from friends who were then killed by drunk drivers.


These transmissions can appear cruelly unremarkable, but after death, even the most ordinary dribs and drabs of communication feel poignant to their loved ones. Like the Hereafter Institute’s project, the Last Message Received is saying: You matter. You matter, and the world you lived in matters, and the people you loved — they matter too.

Still, I can’t help but think I’ll want to keep everything away from the prying eyes of people like me when someone I’m close to dies.

Aren’t we really just expressing anxieties about our own mortality when we voraciously consume the digital afterlives of others? When I think about it in this light, I’m more forgiving of my morbid, voyeuristic habit. If there is an upside to my obsession with these inadvertent social media memorials, it’s that they have made me more aware of the permanence of my online presence, which, in the moment, can seem deceptively ephemeral. In 2016, the human condition is marked by existential despair in thinking about being remembered for a few lackluster, dashed-off tweets and silly photos. What if the last thing I ever tweet is a complaint about how much Time Warner Cable sucks? And so, whether we like it or not, life now requires no small degree of constant self-examination about our own legacies, online and off.

Courtesy Gabriel Barcia-Colombo

When I arrived at the entrance of the Hereafter Institute’s exhibit, I was greeted by a young blonde woman (an actor, I later learned) in a lab coat, who began by asking me a series of questions about my online presence, including which social networks I had accounts on and which dating apps I’d used. I was left, by that simple exercise, with the uncomfortable knowledge that my digital legacy goes far beyond a bunch of photos on Instagram. It’s a LinkedIn profile where I’ll always be working at BuzzFeed, a Clue profile where my next period is always just a few weeks away, my Discover Weekly playlist on Spotify updating until the end of time. I sat there wondering if my Apple ID would exist forever and if new episodes of Who? Weekly would keep downloading well after I was gone.

Then I stood on a platform while another Hereafter Institute guide took a 3D scan of my body — a scan I would later see animated at my “funeral” — and led me to another building at the museum, where the exhibit continued. There, I saw a record player on a stand where tweets by a man named Fernando Rafael Heria Jr. scrolled on a black screen. (I later found out he had been hit by a car and killed in 2010 while riding his bike in Miami; he was 25.) “Ever wanted to kick someone in the throat?” said one tweet, from March 20, 2010. “Fernando Rafael Heria Jr. shared a link: Brian Piccolo: Thursday Night Criterium Series,” said another from March 25 of that year.

Next, I was led over to a different part of the same room, where I put on a virtual reality headset and found myself engulfed in the separate worlds of three people who had died. It was like a video game, with voiceovers by friends and family (and in one case, a reading by one of the deceased). Barcia-Colombo explained that his intention was to create a memorial to the dead that would allow people a small window into their lived experiences.

A few days after I went through the exhibit, I spoke with Barcia-Colombo by phone. “I was really interested in this sort of bizarre thing that’s happening now, where people pass away on the internet and there’s no real virtual practice put in place for what we do with this data,” he said. “I’ve had friends that have passed away, and yet people don’t really know, and they still wish them happy birthday. Or people tweet after they’ve died because they’ve set up auto-tweeting. I thought it was a really sort of interesting time in our culture, and our conversation about death is really changing.”


Last year, Facebook instituted a policy that allows you to designate a person to maintain your Facebook page after you die; your page lives on, but is changed to a “memorial” page. But what happens when that person dies? And so on? “At some point there’s going to be more people who’ve passed away on Facebook than there are alive people on Facebook,” Barcia-Colombo said. “What is that going to mean?”

We don’t know the answer to that question yet. But what does it mean when even the most off-the-cuff content that we produced when we were alive has the potential to become a posthumous representation of ourselves? It’s exhausting enough to maintain a digital presence while we’re alive. Now are we expected to also be mindful of how our digital selves will be perceived after death?

Today’s teenagers are enamored with pointedly ephemeral social media like Snapchat, where posts disappear quickly and (seemingly) forever, and maybe they’re onto something. Maybe the next generation is so conscious of digital legacies that they’ve decided not to create one at all. But I’m too far gone, I think, to make my social media presence disappear; I am a self-archivist by nature, and erasing everything is scarier to me than the idea that someone might piece together a contextless version of me after I die.

All of this awareness adds another complicated layer to the notion of the digital self — one that a quick perusal of my Twitter feed tells me I am definitely not ready for. We may not be sentient beings in death, but whether we like it or not, we will continue to exist long after our bodies are dead and gone.


Source: BuzzFeed

Microsoft server hosting on IBM Cloud

Did you know that tens of thousands of Microsoft workloads are running on IBM Cloud? Here are some of the reasons why organizations of all sizes are choosing cloud to support their Microsoft servers.
Why choose cloud
Businesses are looking for new ways to engage customers, drive digital transformation and make operations faster and more flexible. With cloud, it’s easier to design and implement these ideas to create competitive advantage. Choose from multiple models – public, private, and hybrid cloud – that deliver choice and flexibility as the competitive landscape changes and your business needs evolve.

Across public, private and hybrid cloud, IBM Cloud can provide seamless integration and support for the latest versions of applications such as Microsoft SQL Server 2016. The infrastructure is secure, scalable, and flexible, providing the solid foundation that has made IBM Cloud the hybrid cloud market leader.
Success factors

Configure the cloud your way – Can you trust the cloud with critical Microsoft workloads? One secure and widely used approach is to implement bare metal servers, creating a custom, dedicated cloud. With bare metal, the server is designed to your specifications. You select and approve what goes on it.
Stay in control – When your Microsoft workloads are running in the cloud, you want to manage them like an extension of your data center. Look for ways to use APIs and a single management system across workloads.
Go global – No matter the size of your business, you need to consider the flexibility of global data access and storage when choosing a cloud provider to support your growth plans.
Manage costs – Check to understand cost visibility across the software and server resources. Evaluate each element, from how Microsoft workloads are hosted to infrastructure. IBM Cloud offers clear, competitive pricing on hourly or monthly terms for cloud services and Microsoft software so you can easily meet all of your Microsoft Windows workload requirements.

Also, each time the software inside your core applications is headed for end of life, see if cloud can help you move to the newest version. For example, moving from an older version of SQL Server onto SQL Server 2016 may be faster using cloud hosting.
Get started
Stop by the IBM Booth at Microsoft Ignite 2016 in Atlanta, Georgia, September 26-30, to speak to advisors about Microsoft on IBM Cloud.
Learn more about IBM Cloud and Microsoft workload hosting.
Source: Thoughts on Cloud