Windows Server 2019 support now available for Windows Containers on Azure App Service

The Azure App Service engineering team is always striving to improve the efficiency and overall performance of applications on our platform. Today, we are happy to announce Windows Server 2019 Container support in public preview.

To our customers, this expanded support translates into clear efficiencies:

Reduced container size enables you to be more cost effective by running more applications/slots within your App Service Plan. For example, the Windows Server Core 2019 LTSC base image is 4.28 GB, compared to 11 GB for the Windows Server Core 2016 LTSC image, a decrease of 61 percent!
You will also benefit from faster startup times for your application, because the container images are smaller.

The container hosts have been updated to support Windows Server 2019, which means we can now support Windows Containers based on:

Windows Server Core 2019 LTSC
Windows Server Nano 1809
Windows Server Core 2016 1803
Windows Server Core 2016 1709
Windows Server Core 2016 LTSC

Windows Container support is available in our West US, East US, West Europe, North Europe, East Asia, and Australia East regions. Windows Containers are not currently supported in App Service Environments.

Faster app startup times with new, cache-based images

App Service caches several base images and we advise customers to use those images as the base of their containers to enable faster application startup times. Customers are free to use their own base images, though using non-cached base images will lead to longer application startup times.

Customers deploying .NET Framework Applications must choose a base image based on the Windows Server Core 2019 Long Term Servicing Channel release or older, and customers deploying .NET Core Applications must choose a base image based on Windows Server Nano 1809.

Cached base images:

mcr.microsoft.com/dotnet/framework/aspnet:4.7.2-windowsservercore-ltsc2019
mcr.microsoft.com/windows/nanoserver:1809
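
As an illustration, a minimal Dockerfile for an ASP.NET Framework app built on the cached base image might look like the following (the application folder shown here is a placeholder):

```dockerfile
# Build on the cached base image so App Service can start the container
# without pulling the large Windows base layers at deployment time
FROM mcr.microsoft.com/dotnet/framework/aspnet:4.7.2-windowsservercore-ltsc2019

# Copy the published ASP.NET application into the default IIS site folder
COPY ./published/ /inetpub/wwwroot
```

Because the base layers are already cached on the container hosts, only your application layer needs to be pulled at startup.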

Resources

Run a custom Windows container in Azure (Preview)
Migrate an ASP.NET app to Azure App Service using a Windows container (Preview)
Windows Containers on Azure App Service Wiki, which contains example Dockerfiles for various application scenarios

We want to hear from you!

Windows Container support for Azure App Service provides you with even more ways to build, migrate, deploy, and scale enterprise-grade web and API applications running on the Windows platform. We are planning to add even more capabilities during the public preview and are very interested in your feedback as we move toward general availability.
Source: Azure

Unlock dedicated resources and enterprise features by migrating to Service Bus Premium

Azure Service Bus has been the Messaging as a Service (MaaS) option of choice for our enterprise customers. We’ve seen tremendous growth in our customer base and in usage of existing namespaces, which inspires us to bring more features to the service.

We recently expanded Azure Service Bus to support all Azure regions with Availability Zones to help our customers build more resilient solutions. We also expanded the Azure Service Bus Premium tier to more regions to enable our customers to leverage many enterprise ready features on their Azure Service Bus namespaces while also being closer to their customers.

The Azure Service Bus Premium tier is a relatively new offering, made generally available in September 2015, that allows our customers to provision dedicated resources for their Azure Service Bus namespaces. This in turn provides reliable throughput and predictable latency, along with production- and enterprise-ready features, at a fixed price per Messaging Unit. This is a major improvement over the Azure Service Bus Standard tier, a multi-tenant system optimized for lower-throughput workloads using a pay-as-you-go model.

Our Azure Service Bus Premium tier has resonated well with customers, who have been excited to get on board and enjoy the value it provides. However, until now, we haven’t had a way to upgrade existing Azure Service Bus Standard namespaces to the Premium tier. That is about to change.

Today, we’re happy to announce tooling, both in the Azure portal and via the command-line interface (CLI) and PowerShell, that enables our customers to upgrade their existing Azure Service Bus Standard namespaces to the Premium tier. This tooling ensures that no configuration changes are required on the sender and receiver applications, while enabling our customers to adopt the best offering for their use case with minimal downtime.

To learn more about this feature and the finer details of what happens under the hood, please read the documentation.

You can access the portal tool by selecting the “Migrate to Premium” menu option in the left navigation pane of the Service Bus Standard namespace that you want to migrate.
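
As a sketch of the CLI route, the flow looks roughly like the following. The resource names are placeholders, and the `az servicebus migration` command group and its parameters are assumptions about the current Azure CLI surface, so check `az servicebus migration --help` for the exact syntax:

```shell
# Create the target Premium namespace (1 Messaging Unit)
az servicebus namespace create \
    --resource-group my-rg \
    --name my-premium-ns \
    --location westus2 \
    --sku Premium \
    --capacity 1

# Start migrating the existing Standard namespace; entities and settings
# are copied while the Standard namespace keeps serving traffic
az servicebus migration start \
    --resource-group my-rg \
    --name my-standard-ns \
    --target-namespace $(az servicebus namespace show \
        --resource-group my-rg --name my-premium-ns \
        --query id --output tsv) \
    --post-migration-name my-standard-ns-old

# Complete the migration: the Standard namespace's DNS name now points
# at the Premium namespace, so senders and receivers need no changes
az servicebus migration complete \
    --resource-group my-rg \
    --name my-standard-ns
```
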
Source: Azure

Installing OpenShift 4 from Start to Finish

You have probably heard about all the great engineering work going on to get the next release of OpenShift 4 ready for prime time. OpenShift 4 marks an incredible advancement for enterprise Kubernetes, as it includes great new features such as over-the-air updates and integration with OperatorHub. One of […]
The post Installing OpenShift 4 from Start to Finish appeared first on Red Hat OpenShift Blog.
Source: OpenShift

Top 10 tremendous app dev sessions to attend at Next ‘19

Whether you develop frontends or backends; on App Engine, Firebase, or Kubernetes; for mobile, the web, or an embedded device, there’s no lack of options for application development on Google Cloud Platform (GCP), and we’ll be talking about all of them at Google Cloud Next ‘19. To help get you situated, we crowdsourced a list from our Google Cloud peers of the top app dev sessions you can’t afford to miss. Here are our picks.

1. Google Cloud Platform 101
Maybe you need an introduction, maybe you just need a refresher. Whatever your reason, check out this session to figure out exactly what all of GCP’s tools are, and get guidance on how best to solve your application development problems.

2. Super-Charge Your GKE Developer Workflow in Visual Studio Code and IntelliJ
Kubernetes is portable, extensible, and powerful, but getting started, configuration management, deployments, and debugging can be painful. In this session, we’ll explore Visual Studio Code and IntelliJ IDE extensions that simplify these Kubernetes workflows.

3. Large-Scale Multiplayer Gaming on Kubernetes
Building the next fast-paced, online, multiplayer game? You’re going to want to use Kubernetes for that, which, when combined with open-source projects like Open Match and Agones, can make the hard work of building a matchmaking platform and coordinating game server orchestration that much easier. Game on!

4. Build Mobile Apps with Flutter and Google Maps
Just like the real-estate business, mobile app development is all about location, location, location. Come to this session and learn how to build a location-aware mobile app using a powerful combination of Flutter, Firestore, and Google Maps Platform.

5. Dead Easy Kubernetes Workflows With VS Code
If you’ve ever worked with a Kubernetes application, you know that there are multiple configuration files to edit, a lot of moving parts to deploy, and that debugging is a pain. Luckily, there is a new Visual Studio Code extension to simplify the Kubernetes development workflow. Come check it out.

6. The “Why” and the “How” of Testing Games in the Cloud
True statement: testing games is a difficult, imperfect, manual process. In Google’s Firebase Test Lab, we’ve been working on ways to perform more sophisticated automated game testing, so you can find problems in your game before your users do.

7. What’s New in Firebase for Development Teams
Firebase is a popular application development platform on GCP, and recent changes to the Firebase backend-as-a-service tooling make it an even better fit for building large-scale applications. Come to this session to learn more about testing, continuous integration, global roll-out, and more.

8. Building Secure Mobile Apps With Firebase
Speaking of Firebase, it’s important to understand the potential attack vectors that arise from Firebase’s direct-from-mobile access, and how to thwart them. We’ll introduce you to the tools Firebase and Google Cloud provide to help you build secure apps, and also tell you about all the things we automatically do for you.

9. Serverless Payment Processing with Firebase
Collecting payments in your mobile application is within easy reach, thanks to Firebase and GCP. In this talk, we’ll walk you through how to manage payments in your app using the Stripe API and Cloud Functions for Firebase, and discuss common solutions for payment management, including how to handle refunds and ensure security.

10. How Retailers Prepare for Black Friday on Google Cloud Platform
Major industry events, like Black Friday, can test every facet of a system. In this session, we’ll discuss how cloud-based retailers successfully navigate the holiday peak season, including monitoring tactics, infrastructure designs, and application architectures, to help you prepare for your next peak event!

To learn more about these sessions and others, and to register, visit the Next ‘19 website. Until then, happy coding!
Source: Google Cloud Platform

Announcing the Azure Functions Premium plan for enterprise serverless workloads

We are very excited to announce the Azure Functions Premium plan in preview, our newest Functions hosting model! This plan enables a suite of long requested scaling and connectivity options without compromising on event-based scale. With the Premium plan you can use pre-warmed instances to run your app with no delay after being idle, you can run on more powerful instances, and you can connect to VNETs, all while automatically scaling in response to load.

Huge thanks to everyone who participated in our private preview! Symantec Corporation and Volpara Solutions are just a few of the companies that will benefit from the new features of the Premium plan.

See below for a comparison of how the Premium plan improves on our existing dynamically scaling plan, the Consumption plan.

Advanced scale controls enable customized deployments

Instance size can now be specified with the Premium plan. You can select up to four D-series cores and 14 GB of memory. These instances are substantially more powerful than the A-series instances available to functions on the Consumption plan, allowing you to run much more CPU- or memory-intensive workloads in individual invocations.

Available Instance sizes

Maximum Instances can now also be specified with the Premium plan. This is one of the most highly requested features and allows you to limit the maximum scale out of your Premium plan. Restricting max scale out can protect downstream resources from being overwhelmed by your functions and allows you to predict your maximum possible bill each month.

Minimum Instances can be specified in the Premium plan, allowing you to pre-scale your application ahead of predicted demand. If you expect an email campaign, sale, or other time-gated event to make your app scale faster than it can replenish pre-warmed instances, you can increase your minimum instances to pre-load capacity.
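
Put together, creating a Premium plan with these scale controls might look like the following Azure CLI sketch. Resource names are placeholders, and the EP SKU names and the `--min-instances`/`--max-burst` flags are assumptions to verify against `az functionapp plan create --help`:

```shell
# Create an Azure Functions Premium plan with explicit scale bounds:
# EP1 is the smallest SKU; EP3 provides the four-core / 14 GB instances
az functionapp plan create \
    --resource-group my-rg \
    --name my-premium-plan \
    --location westus2 \
    --sku EP1 \
    --min-instances 1 \
    --max-burst 10
```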

We’ve built a sample Durable Function that will move any function between the Consumption and Premium plan with pre-warmed instances on a schedule, allowing you to optimize for the best cost.

Connect Functions to VNET

The Premium plan allows dynamic scaling functions to connect to a VNET and securely access resources in a private network. This feature was previously only available by running Functions in an App Service Plan or App Service Environment, and is now available in a dynamically scaling model by using the Premium plan. Read more about VNET integration.
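
Assuming a function app already running on a Premium plan, wiring it into a subnet might look like this (the names are placeholders; verify the command against `az functionapp vnet-integration add --help`):

```shell
# Join the function app to a virtual network subnet so it can reach
# resources, such as databases or VMs, that have no public endpoint
az functionapp vnet-integration add \
    --resource-group my-rg \
    --name my-function-app \
    --vnet my-vnet \
    --subnet my-subnet
```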

Pre-warmed Instances let you avoid cold start

With the Functions Premium plan we are offering a solution to the delay that occurs when calling a serverless application for the first time: pre-warmed instances. This delay is commonly referred to as cold start, and it’s one of the most common problems among serverless developers. For more details on what cold start is and why it happens, please refer to the blog post, “Understanding serverless cold start.”

In the Premium plan, we offer you the ability to specify a number of pre-warmed instances that are kept warm with your code ready to execute. When your application needs to scale, it first uses a pre-warmed instance with no cold start. Your app immediately pre-warms another instance in the background to replenish the buffer of pre-warmed instances. This model allows you to avoid any delay on the execution for the first request to an idle app, and also at each scaling point.

Today we allow only one pre-warmed instance per site, but we expect to raise that limit in the coming weeks.

Keeping a pool of pre-warmed instances to scale into is one of the core advantages beyond existing workarounds. Today in the Consumption plan many developers work around cold start by implementing a “pinger” to constantly ping their application to keep it warm. While this does work for the first request, apps with pingers will still experience cold start as they scale out, since the new instances pulled to run the application won’t be ready to execute the code immediately. We always keep the number of pre-warmed instances you’ve requested ready as a buffer, so you’ll never see cold-start delays so long as you’re scaling slower than we can warm up instances.

Try it out and learn more!

The Azure Functions Premium plan is available in preview today. Here’s what you can do to learn more about it:

Check out how to get started with the Premium plan.
Learn how to switch functions between Consumption and Premium plans.
Sign up for an Azure free account if you don’t have one yet, and try out the Azure Functions Premium plan.
Troubleshoot with the community and file any issues you run into on our GitHub repo.
Learn more about the Premium plan and other enterprise serverless features in the Mechanics Show below:

Source: Azure