New Lower Prices on Azure Virtual Machines and Blob Storage

At Azure, we strive to offer you the best value in the public cloud. We believe in providing a comprehensive cloud platform that not only enables customers to innovate rapidly, but to do so at the best possible prices. To that end, today we are happy to announce significant price reductions on several Azure Virtual Machine families and Storage types. We hope this will further lower the barrier to entry for our customers and accelerate cloud transformation.

Azure Virtual Machines:

We have reduced prices on compute-optimized instances (F-series) by up to 24% and on general-purpose instances (A1 Basic) by up to 61%.

The table below shows an example of the VM price reductions in UK South. Please refer to the VM pricing page for all the regions and details.

 

| Azure VMs | Price reduction (Linux VM) | Price reduction (Windows VM) |
| --- | --- | --- |
| F1 to F16 | -23% | -18% |
| A1 Basic | -42% | -51% |

We will also be announcing price reductions specifically for our D-series General-purpose instances in the near future.

Azure Blob Storage:

We have reduced prices on Azure Storage offerings: Hot Block Blob Storage by up to 31% and Cool Block Blob Storage by up to 38%. These new prices are only available to customers using Azure Blob Storage accounts. Customers on General Purpose storage accounts can take advantage of these prices by moving their blob data to a Blob Storage account using tools such as AzCopy.
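As a sketch, assuming the source is a General Purpose account and the destination is a Blob Storage account (the account names, container name, and keys below are placeholders), a recursive copy with the Windows AzCopy tool might look like:

```
AzCopy /Source:https://mygpaccount.blob.core.windows.net/mycontainer ^
       /Dest:https://myblobaccount.blob.core.windows.net/mycontainer ^
       /SourceKey:<general-purpose-account-key> ^
       /DestKey:<blob-storage-account-key> ^
       /S
```

The /S flag copies all blobs in the container recursively; after verifying the copy, you can point your applications at the new account.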

The table below shows an example of the Storage price reductions in UK South. Please refer to the Storage pricing page for all the regions and details.

 

| Azure Storage | Price reduction |
| --- | --- |
| Hot Block Blob | -26% |
| Cool Block Blob | -38% |

All of the new reduced prices take effect today. We are excited about these new lower prices and how they help our customers accomplish more. For more details, please visit Linux Virtual Machines Pricing.
Source: Azure

Event Hubs .NET Standard client is now generally available

After several months of testing, both internally and by our users (thank you), we are releasing our newest Event Hubs clients to general availability. This means that these new libraries are production ready and fully supported by Microsoft.

What new libraries are available?

Consistent with our past design decisions, we are releasing two new NuGet packages:

Microsoft.Azure.EventHubs – This library comprises the Event Hubs specific functionality that is currently found in the WindowsAzure.ServiceBus library. In here you will be able to do things like send and receive events from an Event Hub.
Microsoft.Azure.EventHubs.Processor – Replaces functionality of the Microsoft.Azure.ServiceBus.EventProcessorHost library. This is the easiest way to receive events from an Event Hub, and keeps you from having to remember things such as offsets and partition information between receivers.
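As a minimal sketch of sending an event with the new Microsoft.Azure.EventHubs package (the connection string below is a placeholder, and running this requires a live Event Hubs namespace, so treat it as illustrative):

```csharp
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.EventHubs;

class Sender
{
    static void Main() => SendAsync().GetAwaiter().GetResult();

    static async Task SendAsync()
    {
        // Placeholder connection string pointing at an existing Event Hub.
        var connectionString =
            "Endpoint=sb://<namespace>.servicebus.windows.net/;" +
            "SharedAccessKeyName=<key-name>;SharedAccessKey=<key>;" +
            "EntityPath=<event-hub-name>";

        // Create a client and send a single UTF-8 encoded event.
        var client = EventHubClient.CreateFromConnectionString(connectionString);
        await client.SendAsync(new EventData(Encoding.UTF8.GetBytes("Hello, Event Hubs!")));
        await client.CloseAsync();
    }
}
```

Because the package targets .NET Standard, this same code compiles and runs on .NET Framework 4.6 and on .NET Core.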

What does this mean for you?

Releasing these new libraries provides three major benefits:

Runtime portability – Using .NET Standard, we now have the ability to write a single code base that is portable across different .NET runtimes, including .NET Core, .NET framework, and the Universal Windows Platform. You can take this library and run it on Windows Server with .NET Framework 4.6, or on a Mac/Linux machine using .NET Core.
Open source – Yes! We are very excited that these new libraries are open source and available on GitHub. We love the interactions that we have with our customers, whether it be an issue or pull request.
Event Hubs now has its own library – while Event Hubs and Service Bus have been seemingly joined in the past, the use cases between these two products are often times different. Previously, you needed to download a Service Bus library in order to use Event Hubs. These new libraries are specific to Event Hubs, so we hope that they will make things more clear for our new users.

What's next?

For those of you currently using the WindowsAzure.ServiceBus library, we will continue to support Event Hubs workloads on this library for the foreseeable future. With that said, we currently have a .NET Standard Service Bus library in preview!

For more information on getting started with these new libraries, check out our updated getting started documentation.

So take the new libraries for a spin, and let us know what you think!
Source: Azure

Enhanced Automated Backup for SQL Server 2016 in Azure Virtual Machines

We are excited to announce some great enhancements to our Automated Backup feature, which greatly extends your control over backups when running SQL Server 2016 in Azure Virtual Machines. You can now control the schedule of your backups and backup system databases. You can easily enable this feature through the Azure Portal or PowerShell on Azure Virtual Machines running SQL Server 2016 Enterprise, Standard, or Developer.

For those of you not familiar with Automated Backup, this feature allows you to automatically backup all the databases in a SQL Server VM running in Azure to one of your storage accounts. Automated Backup ensures a consistent backup chain at all times, so you can always recover your databases to any point in time. Even better, it manages the desired retention for the backups, keeping them only for the time you specify. If you’re curious, Automated Backup is implemented on top of the SQL Server IaaS Agent Extension and the SQL Server Managed Backup feature.

New capabilities

Backup system databases

Automated Backup now gives you the option to schedule backups for System databases in addition to User databases. If you choose to enable this option, your System databases, and all their important instance-level objects, will be backed up on the same schedule as your User databases.

Scheduling backups

Automated Backup now allows you to schedule a time window and frequency, daily or weekly, for full backups so that these don’t impact performance during business hours. In addition, you can specify how often to take log backups.
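As a hedged sketch, this is roughly how such a schedule can be configured with the Azure PowerShell cmdlets for the SQL Server IaaS Agent Extension (resource names and values below are illustrative):

```powershell
# Build an Automated Backup v2 configuration: weekly full backups in a
# fixed window, 30-minute log backups, and system databases included.
$storage = Get-AzureRmStorageAccount -ResourceGroupName "myrg" -Name "mybackupstorage"

$config = New-AzureRmVMSqlServerAutoBackupConfig -Enable `
    -RetentionPeriodInDays 30 `
    -StorageContext $storage.Context `
    -ResourceGroupName "myrg" `
    -BackupSystemDbs `
    -BackupScheduleType Manual `
    -FullBackupFrequency Weekly `
    -FullBackupStartHour 20 `
    -FullBackupWindowInHours 2 `
    -LogBackupFrequencyInMinutes 30

# Apply the configuration to the SQL Server VM.
Set-AzureRmVMSqlServerExtension -AutoBackupSettings $config `
    -VMName "mysqlvm" -ResourceGroupName "myrg"
```

Setting -BackupScheduleType to Manual is what enables the explicit window and frequency settings; the default Automated schedule picks these values for you.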

Remember that Azure Storage keeps three copies of every VM disk to guard against data loss; the purpose of these backups is therefore to recover from human errors (e.g., deleting a table).

For disaster recovery or compliance reasons, consider storing these backups in a geo-replicated storage account, preferably a readable one (RA-GRS). This also makes the backups available in a remote Azure region.

How to find Automated Backup v2

For new SQL Server VMs

When creating a new virtual machine running SQL Server 2016 in the Azure portal, you will be presented with several SQL Server configuration options. Here you can enable and configure Automated Backup according to your requirements.

For existing SQL Server VMs

If you have an existing virtual machine running SQL Server 2016, you can enable and configure Automated Backup v2 from the Azure Portal or PowerShell. In the Azure Portal, navigate to your SQL Server virtual machine and look for Automated Backup under SQL Server configuration.

To learn more, check out the documentation for this feature.

Try out this feature today in the Azure Portal!

If you do not have an Azure subscription, you can easily sign up for a free trial!
Source: Azure

Azure Search and Dynamics 365: Multitenancy at scale

In late 2016, Dynamics 365 launched their new Relevance Search functionality as a result of a partnership between the Dynamics and Azure Search teams. Relevance Search is generally available worldwide to every customer on the December 2016 Update for Dynamics 365 (online) release.

​Azure Search at scale

Dynamics 365 is one of the largest deployments of Azure Search. About 2,000 organizations have opted into CRM Relevance Search, and that number is growing by dozens each day. Members of these organizations can search through a variety of records spanning a number of entity types, including accounts, contacts, emails (including the content of email attachments), faxes, invoices, and contracts, among many others.

As of the date of this posting, thousands of organizations have indexed more than 160 million records using Dynamics 365 Relevance Search. These organizations have issued almost 8 million search queries to Azure Search with an average latency of under 90 milliseconds.

How Dynamics 365 configured Azure Search

Dynamics 365 has provisioned separate search services in each Azure geography where Dynamics 365 is offered. Within each region’s service, each Dynamics customer is assigned an individual Azure Search index. This index-per-tenant model is described in the whitepaper Design patterns for multitenant SaaS applications and Azure Search.

Using Azure Search’s comprehensive APIs, the Dynamics team built a robust infrastructure to handle index lifecycle management and index allocation. Additionally, Dynamics 365 implemented their complex row level security trimming using Azure Search filter expressions to ensure that document security was not compromised across each organization that uses Relevance Search.
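As an illustration of this security-trimming pattern (the index name, field name, and group values below are hypothetical, not Dynamics 365's actual schema), each indexed document carries a field listing the groups allowed to see it, and every query appends a filter on the caller's groups:

```
POST https://<service>.search.windows.net/indexes/tenant-index/docs/search?api-version=2016-09-01
{
  "search": "contract renewal",
  "filter": "allowedGroups/any(g: search.in(g, 'sales, marketing'))"
}
```

Because the filter is applied server-side on every request, a user can never receive a document whose allowedGroups field does not intersect their own group memberships.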

S3 High Density

Dynamics 365’s Relevance Search is successfully deployed using Azure Search’s S2, S3, and S3 High Density (HD) pricing tiers. S3 HD was designed specifically for multitenant scenarios built on the index-per-tenant model, supporting up to 3000 indexes in a single service. With this configuration, Dynamics 365 is able to effectively and cost-efficiently serve search traffic for not only their largest and most active customers, but also their long tail of small and medium-sized tenants – all at a global scale.

Read more

You can learn more about Azure Search and its capabilities and find our documentation. Please visit our pricing page to learn about the various tiers of service to fit your needs. You can also read more about modeling multitenancy in this whitepaper.
Source: Azure

Expanded subscription offers for StorSimple Virtual Array

Until now, StorSimple Virtual Array has been available exclusively to Microsoft Enterprise Agreement (EA) customers. So that more customers can leverage the StorSimple Virtual Array hybrid cloud offering, we are now expanding this solution to customers who are on MSDN, Pay-as-you-go, and other subscriptions. For more details, please visit StorSimple Solution Pricing.

With the flexible pay-as-you-go subscription, StorSimple Virtual Array can be used by Small and Medium Business (SMB) owners. MSDN subscribers can now run POCs or development and testing workloads. In all these cases, you can configure StorSimple Virtual Array as a file server (NAS) or as an iSCSI server (SAN) in the new Azure portal.

Everything about the StorSimple Virtual Array experience in the Azure portal is designed to be easy. Create a StorSimple Device Manager service to manage all your StorSimple Virtual Arrays. Remember to select the StorSimple Virtual Device Series while creating the service.

For more information, go to the StorSimple product documentation. Visit the StorSimple MSDN forum to find answers, ask questions, and connect with the StorSimple community.

Your feedback is important to us, please send any feedback or feature requests using the StorSimple User Voice. Should you need any assistance, Microsoft Support is there to help you along the way!

 
Source: Azure

Announcing the new Azure Marketplace experience

This post was co-authored by Vybava Ramadoss, Senior Program Manager, Azure Marketplace.

Azure Marketplace provides a rich catalog of thousands of products and solutions from independent software vendors (ISVs) that have been certified and optimized to run on Azure. While customers love the breadth of our offerings, which range from open source to enterprise applications, one piece of feedback we’ve heard consistently is that navigating such a huge catalog is difficult. Today, we are excited to announce a new interactive experience for the Azure Marketplace that makes it easy to navigate the product catalog and find the right solution for your cloud application without having to log in to the Azure portal.

Launch the new Azure Marketplace. Let’s go over a couple of scenarios.

Find and deploy your favorite product

Let’s say you are looking for a specific product. For example, you may be a blogger who wants a WordPress environment. You can start typing “WordPress” in the top search button and pick the WordPress option that best suits you from the list.

Click “Get It Now,” which will prompt you to log in to the Azure portal. Follow the instructions, and you can have your WordPress environment up and running in a few minutes.

Discover and deploy a new product

Let’s say you are looking for a solution, but you aren’t sure which product best fits your needs.  For example, you need a storage appliance for your cloud application, but you want to look at the available options and learn about the products before deciding.

Discover new products – The categories in Azure Marketplace are a good place to start. You can click on the Storage category (notice that categories are consistent with the Azure portal) to see the top recommended products or filter to a subcategory such as Backup and Recovery.

But in this case, you are looking for appliances, and it isn’t a subcategory. Don’t worry; you can type Appliances in the search area to filter for appliances within the storage offerings. The search result shows you the brief description and the starting price for each available appliance. 
 

Deep dive to learn more – So, now you know the storage appliances available in Marketplace, but you need more information on the products to make your decision. The new product pages make it easier and more convenient to deep dive into a product. Let’s look at the NetApp and SoftNAS product pages. Click on the product tiles to open the product pages. You will see two sections. The Overview section contains the detailed technical documentation, product features, screenshots, etc.
 

 

The Plans + Pricing section contains the different SKUs, pricing options, and publisher recommendations.

 

You will notice that while the pages have product-specific information, the fields that are important for your decision making are prominent and consistent across product pages. These fields include:

Select a software plan – Shows the available pricing plans. The Download table as CSV option also enables you to export the pricing plan.
Publisher recommendations – Tells you the recommended VMs to use to deploy a product based on the region.

Test drive the products – Now that you have gone through the potential candidates, wouldn’t it be great to try out some of these appliances before making your final decision? Azure Marketplace Test Drives let you do just that. Test Drives are ready-to-go environments that allow you to experience a product for free without even needing an Azure subscription. You can access a Test Drive from the product page itself or by clicking Test Drives on the left navigation pane. Both SoftNAS and NetApp products offer Test Drives – so go ahead and test drive them to get hands-on experience before deciding.
 

The new marketplace experience makes it seamless to find and deploy your favorite product. We hope that the consistent and easy-to-use navigation structure along with the hands-on experience of Test Drives will make finding and learning about new products a fun experience. Try out the new Azure Marketplace and let us know your feedback.

If you are a publisher interested in Azure Marketplace, visit the Sell on Azure Marketplace page and get started today.
Source: Azure

New in Azure Stream Analytics: Geospatial functions, Custom code and lots more!

Today, we are pleased to announce the roll-out of several compelling capabilities in Azure Stream Analytics. These include native support for geospatial functions, custom code with JavaScript, low latency dashboarding with Power BI and preview of Visual Studio integration and Job diagnostic logs. Additionally, effective today there will be no more ingress data throttling.

Native support for Geospatial functions

Starting today, customers can easily build solutions for scenarios such as connected cars, fleet management, and mobile asset tracking using Azure Stream Analytics. Developers can now leverage powerful built-in geospatial functions in their stream-processing logic to define geographical areas, evaluate incoming geospatial data for containment, proximity, and overlap, and then generate alerts or kick off the necessary workflows. These geospatial capabilities are aligned with the GeoJSON specification.
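For example, a geofencing query might look like the following sketch (the input, output, and field names are assumptions; ST_WITHIN returns 1 when the first geometry lies inside the second, and the polygon's vertex list must close on its first point):

```sql
-- Alert whenever a vehicle reports a position inside a defined zone.
WITH VehiclePositions AS (
    SELECT
        vehicleId,
        CreatePoint(latitude, longitude) AS position
    FROM vehicleTelemetry
)
SELECT
    v.vehicleId,
    System.Timestamp AS alertTime
INTO alerts
FROM VehiclePositions v
WHERE ST_WITHIN(
    v.position,
    CreatePolygon(
        CreatePoint(47.60, -122.35),
        CreatePoint(47.65, -122.35),
        CreatePoint(47.65, -122.30),
        CreatePoint(47.60, -122.35))) = 1
```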

We had more than 100 customers using these Geospatial capabilities in preview, including NASCAR. Established in 1947, NASCAR has grown to become the premier motorsports organization. Currently, NASCAR sanctions more than 1,200 races in more than 30 U.S. states, Canada, Mexico and Europe. NASCAR has been a pioneer in using geospatial capabilities in Azure Stream Analytics.

“We use real-time geospatial analytics with Azure Stream Analytics for analyzing race telemetry during and after the race,” said NASCAR’s Managing Director of Technology Development, Betsy Grider.

Custom code with JavaScript user defined functions

With Azure Stream Analytics, customers can now combine the power of JavaScript with the simplicity and pervasiveness of SQL. Historically, Azure Stream Analytics let developers express their real-time query logic using a simple SQL-like language. That said, customers have also been asking us to support more expressive custom code to implement advanced scenarios. Today, in our journey to offer richer custom code support, we are pleased to announce support for user-defined functions written in JavaScript in Azure Stream Analytics. With this new feature, customers can write custom code in JavaScript and easily invoke it as part of their real-time stream-processing query.

Invoking JavaScript UDF from a Stream Analytics Query
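A minimal sketch of what this looks like (the UDF alias and field names are illustrative): the function body is entered in the portal, must be named main, and is invoked from the query with the udf. prefix.

```javascript
// Hypothetical JavaScript UDF, as it would be entered in the Azure portal.
// Stream Analytics requires the entry point to be named main; the alias
// used in queries (here "parsePayload") is configured separately.
function main(payload) {
    // Parse a JSON string carried in an event field and return the
    // temperature reading, or null if the payload is malformed.
    try {
        return JSON.parse(payload).temperature;
    } catch (e) {
        return null;
    }
}

// The UDF is then invoked from the Stream Analytics query, for example:
//   SELECT deviceId, udf.parsePayload(rawPayload) AS temperature
//   FROM input
```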

Visual Studio tools for Azure Stream Analytics

To help maximize end-to-end developer productivity across authoring, testing, and debugging Stream Analytics jobs, we are rolling out a public preview of Azure Stream Analytics tools for Visual Studio. One of the key capabilities is local testing on client machines, which enables a true offline query building and testing experience. Additionally, features such as IntelliSense (code completion), syntax highlighting, error markers, and source control integration are designed to offer a best-in-class developer experience.

Stream Analytics jobs in Visual Studio

Low-latency dashboarding with Power BI

In our quest to continually test the boundaries of performance and latencies to serve our customer needs better, we’ve worked closely with our Power BI engineering team to improve dashboarding experiences for solutions built using Azure Stream Analytics. Azure Stream Analytics jobs can now output to the new Power BI streaming datasets. This will enable rich visual and dynamic dashboards with a lot lower latency than what was possible until now.

Dashboards powered by streaming data from Azure Stream Analytics

Job Diagnostics logs

Building on a series of ongoing investments designed to improve the self-service troubleshooting experience, today we are announcing the preview of Azure Stream Analytics’ integration with Azure Monitoring. This provides customers a systematic way to deal with lost, late or malformed data while enabling efficient mechanisms to investigate errors caused by bad data.

Having immediate access to the actual data that causes errors helps customers quickly address problems. Users will be able to control how the job acts when errors occur in data, and persist relevant event data and operational metadata (e.g., occurrence time and counts) in Azure Storage or Azure Event Hubs. This data can be used for diagnostics and troubleshooting offline. Furthermore, data routed to Azure Storage can be analyzed using the rich visualization and analytics capabilities of Azure Log Analytics.

Key examples of data handling errors include: data conversion and serialization errors caused by schema mismatch; incompatible types and constraint violations (such as nullability or duplicates); and truncation of strings or loss of precision during conversion.

Link to Diagnostics logs on Azure portal

Keep the feedback and ideas coming

The Azure Stream Analytics team is committed to listening to your feedback and letting the user voice dictate our future investments. We welcome you to join the conversation and make your voice heard via our UserVoice.

Please visit our pricing page to review the latest pricing.

 

 
Source: Azure

Connecting Power BI to an Azure Analysis Services server

Last October we released the preview of Azure Analysis Services, which is built on the proven analytics engine in Microsoft SQL Server Analysis Services. With Azure Analysis Services, you can host semantic data models in the cloud. Users in your organization can then connect to your data models using tools like Excel, Power BI, and many others to create reports and perform ad-hoc data analysis. This blog will focus on everything you need to know to use Power BI Desktop to build reports against an Azure Analysis Services server and deploy that report to PowerBI.com.

Before getting started, you’ll need:

A data model deployed to an Azure Analysis Services server – see Creating your first data model in Azure Analysis Services.
Power BI Desktop – Download the latest version for free.
Power BI account – Sign up for Power BI.

Connect and create

1. Open Power BI Desktop.
2. Click Get Data.
3. Select Databases/SQL Server Analysis Services, and then click Connect.
4. Enter your Azure AS server name, and click OK.
5. On the Navigator screen, select your model, and click OK.

You’ll now see your model displayed in the field list on the side. You can drag and drop the different fields onto your page to build out interactive visuals.

Publish to Power BI

Now that you have created your report, you can publish it to Power BI. When published, you can view it online and share it with others.

1. Save the report locally.
2. Click the Publish button on the Home tab. If this is your first time publishing, you’ll be asked to sign in to Power BI.
3. Select the destination for your report. This can either be your personal workspace or a group that you are a member of.
4. Once publishing is complete, click the blue link to view the report in Power BI.

The report will open in Power BI and will automatically connect to your Azure Analysis Services server. When connecting from Power BI to Azure Analysis Services, you are connected as your Azure Active Directory identity. This is the same identity you would have used to sign in to Power BI. If you share the report with other users, you must ensure those users have access to your model.

In addition to the report being published, you’ll also see a dataset that has been published. You can use this dataset to create new reports against your Azure Analysis Services model directly in Power BI.

Learn more about Azure Analysis Services.
Source: Azure

Connect industrial assets with ProSoft, powered by the Azure IoT Gateway SDK

For businesses around the world, connecting existing assets and devices to the cloud is a first step towards realizing the benefits of the industrial Internet of Things. Yet while the opportunity for operational efficiencies and productivity improvements is clear, the logistics of connecting legacy industrial devices and systems are often not as easy to tackle. For starters, industrial equipment is often built to last decades, and older devices tend not to be cloud aware. They also don’t have the capability to perform the encryption necessary to securely traverse the internet, and in many cases they are not even TCP/IP enabled.

We’re working with industry leaders to make connecting legacy devices simpler with the Azure IoT Gateway SDK. ProSoft, a leader at integrating disparate devices into a unified system, is one company that is seamlessly connecting existing industrial devices into IoT solutions. The ProSoft PLX gateway communicates directly with Azure IoT Hub through the Azure IoT Gateway SDK in a highly secure manner that scales to the millions of devices a customer may wish to connect.

Using gateways like ProSoft PLX, not only can multiple devices be connected to the cloud in minutes, but they can also become a true IoT solution by leveraging the remote monitoring capabilities of the Azure IoT Suite.  This preconfigured solution allows businesses to easily monitor, analyze, report on, and create alarms based on the data previously un-connectable devices send to the cloud. These devices can also receive updates from the cloud, for example, to change configuration or settings. Working together, ProSoft and Microsoft Azure IoT are helping to lay the groundwork required to unlock the value of data produced in industrial systems.

Figure 1 – Diagram showing how existing business assets can be connected to the Remote Monitoring Preconfigured Solution using a ProSoft PLX running the Azure IoT Gateway SDK.

Connecting existing devices to cloud solutions is the first step towards realizing the potential of the Internet of Your Things. Soon Microsoft and ProSoft will show even more advanced scenarios like edge analytics and responding to device events in real time. For now, you can learn more about what IoT can do for your business at www.internetofyourthings.com and hardware to power your IoT solution at www.prosoft-technology.com.
Source: Azure

Understanding Azure troubleshooting and support

The Microsoft Azure team is committed to helping you achieve more with the power of the cloud. The Azure support teams are here to help you build, deploy, and run your Azure solutions with confidence. We are continuously expanding troubleshooting and support areas to help you accelerate your cloud journey. This post provides an overview of key self-help tools and a quick glance at Azure support.

Self-help and troubleshooting

Comprehensive documentation and get-started resources are a good start for all Azure customers. As you begin to design your solutions and look for help from community experts, Azure Forums is a great option. While you are building your solution, self-help and optimization tools are built into the Azure Portal. Can’t connect to your VM? Just click Diagnose and solve problems to follow troubleshooting guidance right in the Azure Portal. This troubleshooting experience is included with all Azure services.

Once your solutions are established, the Azure Advisor preview provides personalized recommendations based on best practices to further optimize your environment. Azure Advisor is a recommendation engine that provides proactive, actionable, and personalized best practices to help you improve the high availability, security, performance, and cost effectiveness of your Azure resources. Check the Advisor dashboard to get your list of recommendations.

These are some of the resources and tools that you can utilize on your own. If you need in-depth help and assistance for your mission-critical applications, the right Azure technical support plan can help you.

Get the right level of technical support

Microsoft Azure offers flexible support options that give you direct access to Microsoft technical engineers. You will receive the best available expertise aligned to the level you need, allowing you to focus on your business outcomes.

Microsoft Azure support plans are designed to help everyone – from individual developers to multi-national organizations, covering a range of environments from trial to business critical. Premier support, the top-tier plan, is ideal for large and global customers who need comprehensive support across multiple Microsoft products, including Azure. Non-technical subscription and billing support for Azure is included with every Azure subscription and can be accessed from the Help + support blade in the Azure portal.

Next steps

Please let us know your comments and ideas about Azure support and troubleshooting, as they will help us continue to evolve these areas. Submit your feedback on the Azure portal or via @AzureSupport on Twitter, where you can also get answers from Microsoft Azure experts. To ensure your Microsoft Azure support plan is aligned with your business requirements, visit the Azure support page. Having access to the right support and resources is an investment that can help you get the most out of your cloud assets, save you money, and make your organization more productive.
Source: Azure