Azure Blockchain Workbench 1.7.0 integration with Azure Blockchain Service

We’re excited to share the release of Microsoft Azure Blockchain Workbench 1.7.0, which, along with our new Azure Blockchain Service, can further enhance your blockchain development and projects. You can deploy a new instance of Blockchain Workbench through the Azure portal or upgrade your existing deployments to 1.7.0 using the upgrade script.

This update includes the following improvements:

Integration with Azure Blockchain Service

With the Azure Blockchain Service now in preview, you can develop directly with Blockchain Workbench on Azure Blockchain Service as the underlying blockchain. For those of you who have been on this blockchain journey with Microsoft, there are now templates in Azure which make it faster to configure and deploy a private blockchain network, but it’s still up to you to maintain and run your blockchain nodes, including upgrading to new versions, installing security patches, and more. Azure Blockchain Service simplifies the maintenance of the underlying blockchain network by running a fully managed blockchain node for you.

Blockchain Workbench helps with building the scaffolding needed on top of a blockchain network to quickly iterate and develop blockchain solutions. Workbench 1.7.0 enables you to deploy Azure Blockchain Service directly from Workbench. To deploy Workbench from the Azure Marketplace, navigate to the Advanced settings blade and select Create new blockchain network under Blockchain settings.

Selecting this option will automatically deploy an Azure Blockchain Service node for you. Note that if you rotate the primary API key on the primary transaction node on your Azure Blockchain Service, you need to change the key of the configured RPC endpoint on Blockchain Workbench. Update the Key Vault with the new key and reboot the VMs.
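
If you script this key rotation, a minimal sketch using the Azure SDK for Python might look like the following. This is an assumption-laden illustration, not Workbench's own tooling: the vault URL and secret name are placeholders for the values in your deployment.

```python
# Sketch: update the RPC key that Blockchain Workbench reads from Key Vault
# after rotating the transaction node's primary API key. The vault URL and
# secret name below are hypothetical; use your deployment's actual values.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

credential = DefaultAzureCredential()
client = SecretClient(
    vault_url="https://<workbench-vault>.vault.azure.net",
    credential=credential,
)

# Store the newly rotated API key; Workbench VMs pick it up after a reboot.
client.set_secret("<rpc-endpoint-key-secret>", "<new-primary-api-key>")
```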

Enhanced compatibility with Quorum

One of the most requested features from customers is compatibility with additional blockchain network protocols. In previous releases of Blockchain Workbench, the default network configured was an Ethereum Proof-of-Authority (PoA) network. With Blockchain Workbench 1.7.0, we have added compatibility with the Quorum blockchain network.

For customers looking to build blockchain applications on top of Quorum, you can now develop and build your Quorum-based applications directly with Blockchain Workbench.
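
Because Quorum is Ethereum-based, applications can reach the network through standard Ethereum tooling. As a hedged illustration (not Workbench-specific code), here is a minimal web3.py sketch that connects to a transaction node's RPC endpoint; the endpoint URL is a placeholder, and the method names assume web3.py v5.

```python
# Sketch: connect to an Ethereum-compatible (e.g., Quorum) RPC endpoint with
# web3.py (v5 naming). The URL below is a placeholder for the endpoint your
# network or Azure Blockchain Service transaction node exposes.
from web3 import Web3

rpc_url = "https://<member>.blockchain.azure.com:3200/<access-key>"  # hypothetical
w3 = Web3(Web3.HTTPProvider(rpc_url))

print(w3.isConnected())    # verify connectivity to the node
print(w3.eth.blockNumber)  # latest block on the network
```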

You can stay up to date on Azure Blockchain Service by following the team on Twitter @MSFTBlockchain. Please use the Blockchain UserVoice to provide feedback and suggest features and ideas. Your input is helping make this a great service. We look forward to hearing from you.
Source: Azure

A solution to manage policy administration from end to end

Legacy systems can be a nightmare for any business to maintain. In the insurance industry, carriers struggle not only to maintain these systems but to modify and extend them to support new business initiatives. The insurance business is complex: every state and nation has its own unique set of rules, regulations, and demographics. Creating a new product such as an automobile policy has traditionally required the coordination of many different processes, systems, and people. The monolithic systems traditionally used to create new products are inflexible, and creating a new product can be an expensive proposition.

The Azure platform offers a wealth of services for partners to enhance, extend, and build industry solutions. Here we describe how one Microsoft partner, Sunlight Solutions, uses Azure to solve a unique problem.

Monolithic systems and their problems

Insurers have long been restricted by complex digital ecosystems created by single-service solutions. Those tasked with maintaining such legacy, monolithic systems struggle as the system ages and becomes more unwieldy. Upgrades and enhancements often require significant new development, large teams, and long-term planning, which are expensive, unrealistic, and a drain on morale. Worse, they restrict businesses from pursuing new and exciting opportunities.

A flexible but dedicated solution

An alternative is a single solution provider that is well versed in the insurance business but able to create a dedicated and flexible solution, one that overcomes the problems of a monolith. Sunlight is such a provider. Its platform gives insurance carriers the benefits of end-to-end insurance administration functionality from a single vendor. At the same time, the solution provides greater flexibility, faster speed-to-market, and fewer relationships to manage, with lower integration costs.

Sunlight’s solution is a single system which manages end-to-end functionality across policy, billing, claims, forms management, customer/producer CRM, reporting and much more. According to Sunlight:

“We are highly flexible, managed through configuration rather than development. This allows for rapid speed to market for the initial deployment and complete flexibility when you need to make changes or support new business initiatives. Our efficient host and continuous delivery models address many of the industry’s largest challenges with respect to managing the cost and time associated with implementation, upgrades, and product maintenance.”

To achieve their goals of being quick but pliable, the architecture of the solution is a mixture of static and dynamic components. Static components are fields that do not change. Dynamic components, such as lists, populate at run time. As the graphic below conveys, the solution uses static elements but lets users configure dynamic parts as needed. The result is a faster cycle that maintains familiarity but allows a variety of data types.

In the figure above, the data that appears depends on the product. When products are acquired, for example through mergers, the static data can be mapped. A tab appears only if it exists for the product; “benefits” and “deductibles,” for example, are not part of every product.
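
The idea of combining fixed fields with configuration-driven parts can be made concrete with a small sketch. The example below is purely illustrative (it is not Sunlight's actual implementation), and the field names and product types are hypothetical.

```python
# Illustrative sketch: a product screen built from static fields plus
# dynamic components that are configuration data populated at run time.
STATIC_FIELDS = ["policy_number", "effective_date", "insured_name"]

# Dynamic components are configuration, not code; adding a product
# or a tab is a configuration change rather than new development.
PRODUCT_CONFIG = {
    "auto": {"tabs": ["coverages", "deductibles"], "lists": {"state": ["CA", "NY"]}},
    "life": {"tabs": ["benefits"], "lists": {"term": ["10yr", "20yr"]}},
}

def build_screen(product: str) -> dict:
    config = PRODUCT_CONFIG[product]
    return {
        "fields": STATIC_FIELDS,   # unchanged across products
        "tabs": config["tabs"],    # a tab appears only if configured
        "lists": config["lists"],  # lists populate at run time
    }

print(build_screen("auto"))
```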

Benefits

In brief, here are the key gains made by using Sunlight:

End-to-end functionality: Supports all products/coverages/lines of business
Cloud-based and accessible anywhere
Supports multiple languages and currencies
Globally configurable for international taxes and regional regulatory controls
Highly configurable by non-IT personnel
Reasonable price-point

Azure services

Azure Virtual Machines are used to implement the entire project life cycle quickly.
Azure Security Center continuously assesses the deployed infrastructure and recommends improvements to its security posture.
Azure Site Recovery plans are simple to implement for the production layer.
Azure Functions is used to quickly replicate environments.
Azure Storage keeps the application light, offering a range of storage options whose access characteristics vary by storage type.

Next steps

To learn more about other industry solutions, go to the Azure for insurance page. To find more details about this solution, go to Sunlight Enterprise on the Azure Marketplace and select Contact me.
Source: Azure

New PCI DSS Azure Blueprint makes compliance simpler

I’m excited to announce our second Azure Blueprint for an important compliance standard with the release of the PCI-DSS v3.2.1 blueprint. The new blueprint maps a core set of policies for Payment Card Industry (PCI) Data Security Standards (DSS) compliance to any Azure deployed architecture, allowing businesses such as retailers to quickly create new environments with compliance built into the Azure infrastructure.

Azure Blueprints is a free service that enables customers to define a repeatable set of Azure resources that implement and adhere to standards, patterns, and requirements. Azure Blueprints allow customers to set up governed Azure environments that can scale to support production implementations for large-scale migrations.

Azure Blueprints is another reason why Azure is a strong platform for compliance, with the industry’s broadest and deepest portfolio of 91 compliance offerings. Azure is built using some of the most rigorous security and compliance standards in the world, and includes multi-layered security provided by Microsoft across physical datacenters, infrastructure, and operations. Azure is also built for the specific compliance needs of key industries, including over 50 compliance offerings specifically for the retail, health, government, finance, education, manufacturing, and media industries.

Compliance with regulations and standards such as ISO 27001, FedRAMP, and SOC is increasingly necessary for all types of organizations, making control mappings to compliance standards a natural application for Azure Blueprints. Azure customers, particularly those in regulated industries, have expressed strong interest in compliance blueprints to help ease their compliance burdens. In March, we announced the ISO 27001 Shared Services blueprint sample, which maps a set of foundational Azure infrastructure, such as virtual networks and policies, to specific ISO controls.

The PCI DSS is a global information security standard designed to prevent fraud through increased control of credit card data. Organizations that accept payment cards from the five major credit card brands must follow PCI DSS standards, as must any organization that stores, processes, or transmits payment and cardholder data.

The PCI-DSS v3.2.1 blueprint includes mappings to important PCI DSS controls, including:

Segregation of duties. Manage subscription owner permissions.
Access to networks and network services. Implement role-based access control (RBAC) to manage who has access to Azure resources.
Management of secret authentication information of users. Audit accounts that don't have multi-factor authentication enabled.
Review of user access rights. Audit accounts that should be prioritized for review, including deprecated accounts and external accounts with elevated permissions.
Removal or adjustment of access rights. Audit deprecated accounts with owner permissions on a subscription.
Secure log-on procedures. Audit accounts that don't have multi-factor authentication enabled.
Password management system. Enforce strong passwords.
Policy on the use of cryptographic controls. Enforce specific cryptographic controls and audit use of weak cryptographic settings.
Event and operator logging. Diagnostic logs provide insight into operations that were performed within Azure resources.
Administrator and operator logs. Ensure system events are logged.
Management of technical vulnerabilities. Monitor missing system updates, operating system vulnerabilities, SQL vulnerabilities, and virtual machine vulnerabilities in Azure Security Center.
Network controls. Manage and control networks and monitor network security groups with permissive rules.
Information transfer policies and procedures. Ensure information transfer with Azure services is secure.

We are committed to helping our customers leverage Azure in a secure and compliant manner. Over the next few months we will release new built-in blueprints for HITRUST, UK National Health Service (NHS) Information Governance (IG) Toolkit, FedRAMP, and Center for Internet Security (CIS) Benchmark. If you would like to participate in any early previews, please sign up with this form; if you have a suggestion for a compliance blueprint, please share it via the Azure Governance Feedback Forum.

Learn more about the Azure PCI-DSS v3.2.1 blueprint in our documentation.
Source: Azure

Solving the problem of duplicate records in healthcare

As the U.S. healthcare system continues to transition away from paper to a more digitized ecosystem, the ability to link all of an individual’s medical data together correctly becomes increasingly challenging. Patients move, marry, divorce, change names, and visit multiple providers throughout their lifetimes; each visit creates new records, and the potential for inconsistent or duplicate information grows. Duplicate medical records often occur as a result of multiple name variations, data entry errors, and lack of interoperability—or communication—between systems. Poor patient identification and duplicate records in turn lead to diagnosis errors, redundant medical tests, skewed reporting and analytics, and billing inaccuracies.

The Azure platform offers a wealth of services for partners to enhance, extend, and build industry solutions. Here we describe how one Microsoft partner, NextGate, uses Azure to solve a unique problem.

Patient matching

The process of reconciling electronic health records is called “patient matching,” and it is a major obstacle to improving care coordination and patient safety. Further, duplicate records are financially crippling, costing the average hospital $1.5 million and the U.S. healthcare system over $6 billion annually. As data sharing matures and the industry pivots toward value-based care, an enterprise view of patient information is essential for informed clinical decision-making, effective episodic care, and a seamless patient-provider experience during every encounter.

As more data is generated and more applications are introduced into the health IT environment, today’s organizations must engage in more comprehensive patient matching approaches.

The puzzle of disjointed electronic health records

While electronic health records (EHRs) have become commonplace, the disjointed, competitive nature of IT systems contributes to a proliferation of siloed, disconnected information. Many EHR systems make sharing data arduous, even in a single-system electronic medical record environment. Further, master patient indexes (MPI) within EHR systems were designed for a single vendor-based environment and lack the sophisticated algorithms for linking data across various settings of care and disparate systems. When sent downstream, duplicate and disjointed patient demographics trigger further harm, including increased waste and inefficiencies, suboptimal outcomes, and lost revenue. Without common technical standards in place, EHR systems continue to collect information in various formats that only serve to exacerbate the issue of duplicate record creation.

Solution

NextGate’s Enterprise Master Patient Index (EMPI) platform is a significant step towards improving a health system’s data management and governance framework. This solution manages patient identities for more than two-thirds of the U.S. population, and one-third of the U.K. population. It empowers clinicians and their organizations to make informed, life-saving decisions by seamlessly linking medical records from any given system and reconciling data discrepancies across multiple sites of care. The automated identity matching platform uses both probabilistic and deterministic matching algorithms to account for minor variations in patient data to generate a single best record that follows the patient throughout the care journey.
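
To make the two matching styles concrete, here is a deliberately simplified sketch that combines a deterministic rule with a probabilistic similarity score. It is illustrative only: NextGate's production algorithms are proprietary and far more sophisticated, and the fields, weights, and threshold below are hypothetical.

```python
# Illustrative sketch (not NextGate's algorithm): deterministic plus
# probabilistic matching of two patient records.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Fuzzy string similarity in [0, 1], tolerant of name variations."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_match(rec1: dict, rec2: dict, threshold: float = 0.85) -> bool:
    # Deterministic rule: an identical SSN (when present) is a match.
    if rec1.get("ssn") and rec1.get("ssn") == rec2.get("ssn"):
        return True
    # Probabilistic rule: weighted name similarity plus birth-date agreement.
    score = 0.6 * similarity(rec1["name"], rec2["name"]) \
          + 0.4 * (1.0 if rec1["dob"] == rec2["dob"] else 0.0)
    return score >= threshold

a = {"name": "Catherine Smith", "dob": "1980-04-02", "ssn": None}
b = {"name": "Kathryn Smith",   "dob": "1980-04-02", "ssn": None}
print(is_match(a, b))  # name variation tolerated when other fields agree
```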

Benefits

Enhanced clinical decision-making.
Improved patient safety (or reduced medical errors).
Decreased number of unnecessary or duplicate testing/procedures.
Improved interoperability and data exchange.
Trusted and reliable data quality.
Reduced number of denied claims and other reimbursement delays.
Improved administrative efficiencies.
Higher patient and provider satisfaction.

Azure services

Azure Security Center reinforces the security posture of the NextGate solution against threats, and provides recommendations to harden the security.
Azure Monitor provides telemetry data about the NextGate application to ensure its health.
Azure Virtual Machines provide compute power, enabling auto-scaling and supporting Linux and open-source services.
Azure SQL Database and Azure Database for PostgreSQL enable NextGate solutions to easily scale with more compute power (scale-up) or more database units (scale-out).

Next steps

To find out more about this solution, go to NextGate EMPI and click Contact me.
To see more about Azure in the healthcare industry, see Azure for health.

Source: Azure

Event-driven analytics with Azure Data Lake Storage Gen2

Most modern-day businesses employ analytics pipelines for real-time and batch processing. A common characteristic of these pipelines is that data arrives at irregular intervals from diverse sources. This adds complexity: the pipeline must be orchestrated so that data gets processed in a timely fashion.

The answer to these challenges lies in coming up with a decoupled event-driven pipeline using serverless components that responds to changes in data as they occur.

An integral part of any analytics pipeline is the data lake. Azure Data Lake Storage Gen2 provides secure, cost effective, and scalable storage for the structured, semi-structured, and unstructured data arriving from diverse sources. Azure Data Lake Storage Gen2’s performance, global availability, and partner ecosystem make it the platform of choice for analytics customers and partners around the world. Next comes the event processing aspect. With Azure Event Grid, a fully managed event routing service, Azure Functions, a serverless compute engine, and Azure Logic Apps, a serverless workflow orchestration engine, it is easy to perform event-based processing and workflows responding to the events in real-time.

Today, we’re very excited to announce that Azure Data Lake Storage Gen2 integration with Azure Event Grid is in preview! This means that Azure Data Lake Storage Gen2 can now generate events that can be consumed by Event Grid and routed to subscribers with webhooks, Azure Event Hubs, Azure Functions, and Logic Apps as endpoints. With this capability, individual changes to files and directories in Azure Data Lake Storage Gen2 can automatically be captured and made available to data engineers for creating rich big data analytics platforms that use event-driven architectures.

The diagram above shows a reference architecture for the modern data warehouse pipeline built on Azure Data Lake Storage Gen2 and Azure serverless components. Data from various sources lands in Azure Data Lake Storage Gen2 via Azure Data Factory and other data movement tools. Azure Data Lake Storage Gen2 generates events for new file creations, updates, renames, and deletes, which are routed via Event Grid and an Azure Function to Azure Databricks. A Databricks job processes the file and writes the output back to Azure Data Lake Storage Gen2. When this happens, Azure Data Lake Storage Gen2 publishes a notification to Event Grid, which invokes an Azure Function to copy data to Azure SQL Data Warehouse. Data is finally served via Azure Analysis Services and Power BI.

The events that will be made available for Azure Data Lake Storage Gen2 are BlobCreated, BlobDeleted, BlobRenamed, DirectoryCreated, DirectoryDeleted, and DirectoryRenamed. Details on these events can be found in the documentation “Azure Event Grid event schema for Blob storage.”
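
As a sketch of the consuming side, the Azure Function below (Python) handles a BlobCreated event routed by Event Grid. It assumes an Event Grid trigger binding named "event" is configured in the accompanying function.json; the logging fields and downstream action are illustrative.

```python
# Sketch: an Azure Function that reacts to an Event Grid event raised when
# a file lands in Azure Data Lake Storage Gen2.
import logging

import azure.functions as func

def main(event: func.EventGridEvent):
    payload = event.get_json()
    if event.event_type == "Microsoft.Storage.BlobCreated":
        # event.subject has the form:
        # /blobServices/default/containers/<filesystem>/blobs/<path>
        logging.info("New file: %s (%d bytes)",
                     payload.get("url"), payload.get("contentLength", 0))
        # Kick off downstream processing here, e.g., trigger a Databricks job.
```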

Some key benefits include:

Seamless integration to automate workflows enables customers to build an event-driven pipeline in minutes.
Enable alerting with rapid reaction to creation, deletion, and renaming of files and directories. A myriad of scenarios would benefit from this, especially those associated with data governance and auditing. For example, alert and notify of all changes to high business impact data, set up email notifications for unexpected file deletions, and detect and act upon suspicious activity from an account.
Eliminate the complexity and expense of polling services, and use webhooks to integrate events coming from your data lake with third-party applications such as billing and ticketing systems.

Next steps

Azure Data Lake Storage Gen2 Integration with Azure Event Grid is now available in West Central US and West US 2. Subscribing to Azure Data Lake Storage Gen2 events works the same as it does for Azure Storage accounts. To learn more, see the documentation “Reacting to Blob storage events.” We would love to hear more about your experiences with the preview and get your feedback at ADLSGen2QA@microsoft.com.
Source: Azure

Announcing the general availability of Azure premium files

Highly performant, fully managed file service in the cloud!

Today, we are excited to announce the general availability of Azure premium files for customers optimizing their cloud-based file shares on Azure. Premium files offers a higher level of performance built on solid-state drives (SSD) for fully managed file services in Azure.

The premium tier is optimized to deliver consistent performance for IO-intensive workloads that require high throughput and low latency. Premium file shares store data on the latest SSDs, making them suitable for a wide variety of workloads like databases, persistent volumes for containers, home directories, content and collaboration repositories, media and analytics, highly variable and batch workloads, and enterprise applications that are performance sensitive. Our existing standard tier continues to provide reliable performance at a low cost for workloads less sensitive to performance variability, and is well suited for general purpose file storage, development/test, backups, and applications that do not require low latency.

Through our initial introduction and preview journey, we’ve heard from hundreds of our customers from different industries about their unique experiences. They’ve shared their learnings and success stories with us and have helped make premium file shares even better.

“Working with clients that have large amounts of data that is under FDA or HIPAA regulations, we always struggled in locating a good cloud storage solution that provided SMB access and high bandwidth… until Azure Files premium tier. When it comes to a secure cloud-based storage that offers high upload and download speeds for cloud and on-premises VM clients, Azure premium files definitely stands out.”

– Christian Manasseh, Chief Executive Officer, Mobius Logic

“The speeds are excellent. The I/O intensive actuarial CloudMaster software tasks ran more than 10 times faster in the Azure Batch solution using Azure Files premium tier. Our application has been run by our clients using 1000’s of cores and the Azure premium files has greatly decreased our run times.”

– Scott Bright, Manager Client Data Services, PolySystems

Below are the key benefits of the premium tier. If you’re looking for more technical details, read the previous blog post “Premium files redefine limits for Azure Files.”

Performant, dynamic, and flexible

With the premium tier, performance is what you define. Premium file shares’ performance can instantly scale up and down to fit your workload characteristics. Premium file shares can massively scale up to 100 TiB of capacity and 100K IOPS, with a target total throughput of 10 GiB/s. Not only do premium shares include the ability to dynamically tune performance, they also offer bursting capability to meet highly variable workload requirements with short peak periods of intense IOPS.

"We recently migrated our retail POS microservices to Azure Kubernetes Service with premium files. Our experience has been simply amazing – premium files permitted us to securely deploy our 1.2K performant Firebird databases. No problem with size or performance, just adapt the size of the premium file share to instantly scale. It improved our business agility, much needed to serve our rapidly growing customer base across multiple retail chains in France."

– Arnaud Le Roy, Chief Technology Officer, Menlog

We partnered with our internal Azure SQL and Microsoft Power BI teams to build solutions on premium files. As a result, Azure Database for PostgreSQL and Azure Database for MySQL recently opened a preview of increased scale, offering 16 TiB databases with 20,000 IOPS powered by premium files. Microsoft Power BI announced a preview of an enhanced dataflows compute engine, built on Azure Files premium tier, that is 20 times faster.

Global availability with predictable cost

Azure Files premium tier is currently available in 19 Azure regions globally. We are continually expanding regional coverage. You can check the Azure region availability page for the latest information.

The premium tier provides the most cost-effective way to create highly performant and highly available file shares in Azure. Pricing is simple and cost is predictable: you pay a single price per provisioned GiB. Refer to the pricing page for additional details.

Seamless Azure experience

Customers receive all features of Azure Files in this new offering, including snapshot/restore, Azure Kubernetes Service and Azure Backup integration, monitoring, hybrid support via Azure File Sync, Azure portal, PowerShell/CLI/Cloud Shell, AzCopy, Azure Storage Explorer support, and the list goes on. Developers can leverage their existing code and skills to migrate applications using familiar Azure Storage client libraries or Azure Files REST APIs. The opportunities for future integration are limitless. Reach out to us if you would like to see more.

With the availability of premium tier, we’re also enhancing the standard tier. To learn more, visit the onboarding instructions for the standard files 100 TiB preview.

Get started and share your experiences

It is simple and takes two minutes to get started with premium file shares. Please see detailed steps for how to create a premium file share.
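
If you prefer to script the setup, a minimal sketch using the Azure SDK for Python might look like the following, assuming a recent azure-mgmt-storage package. The resource names, region, and subscription ID are placeholders; the portal steps linked above remain the authoritative walkthrough.

```python
# Sketch: create a FileStorage (premium tier) account and a provisioned
# premium file share. All names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Premium file shares require a FileStorage account with a Premium_LRS SKU.
client.storage_accounts.begin_create(
    "<resource-group>", "<account-name>",
    {
        "location": "westus2",
        "kind": "FileStorage",
        "sku": {"name": "Premium_LRS"},
    },
).result()

# Provision the share; performance scales with the provisioned quota (GiB).
client.file_shares.create(
    "<resource-group>", "<account-name>", "myshare", {"share_quota": 1024}
)
```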

Visit Azure Files premium tier documentation to learn more. As always, you can share your feedback and experiences on the Azure Storage forum or email us at azurefiles@microsoft.com. Post your ideas and suggestions about Azure Storage on our feedback forum.
Source: Azure

Using natural language processing to manage healthcare records

The next time you see your physician, consider the times you fill in a paper form. It may seem trivial, but the information could be crucial to making a better diagnosis. Now consider the other forms of healthcare data that permeate your life—and that of your doctor, nurses, and the clinicians working to keep patients thriving. Forms and diagnostic reports are just two examples. The volume of such information is staggering, yet fully utilizing this data is key to reducing healthcare costs, improving patient outcomes, and other healthcare priorities. Now, imagine if artificial intelligence (AI) could be used to help the situation.

The Azure platform offers a wealth of services for partners to enhance, extend, and build industry solutions. Here we describe how SyTrue, a Microsoft partner focused on healthcare, uses Azure to empower healthcare organizations to improve efficiency, reduce costs, and improve patient outcomes.

Billions of records

Valuable insights remain locked in unstructured medical records, such as scanned documents in PDF format, that, while human-readable, present a major obstacle to automation and analytics. Over four billion medical notes are created every year. The clinical and financial insights embodied within these records are needed by an average of 20+ roles and processes downstream of record generation. Currently, healthcare providers and payors require an army of professionals to read, understand, and extract healthcare data from the flood of clinical documents generated every day. But success has been elusive.

It's not for lack of trying. In the last decade, an effort was made to accumulate and upload data into electronic health record (EHR) systems. Meaningful Use is a government-led incentive program that aims to accelerate the movement from hard-copy filing systems to electronic health records. Still, the problem remains: the sheer volume of data and the lack of time and resources to assimilate it.

Note: the Meaningful Use program has a number of goals. An important one is, “Ensure adequate privacy and security protection for personal health information.” Data security is a prime value for Azure services. Data services such as Azure SQL Database encrypt data at rest and in transit.

Moving the needle on healthcare

As costly and extensive as this effort was, many believe that we have yet to see evidence of any significant impact from the digitization of healthcare data on the quality or cost of care. One way to radically improve this is using AI for natural language processing (NLP)—specifically to automate reading of the documents. That enables subsequent analytics, surfacing the most relevant actionable information from mountains of documents to medical professionals in near real-time. It empowers them to deliver better quality care, more efficiently, at lower cost.

In action

A Microsoft partner, SyTrue is leading the way. In the words of their Founder and CEO, Kyle Silvestro, “At SyTrue, the next big challenge is accessing this vast pool of accumulated patient data in a serviceable way. We’ve created a platform that transforms healthcare documentation into actionable information. The focus is on three main features: speed, context, and adaptability. Our technology consumes thousand-paged medical records in sub-seconds. The innovation is built on informational models that can ingest data from multiple types of clinical and financial health care organizations. This allows diverse healthcare stakeholders to use the system. The main objective for the technology is to present key clinical and financial insights to healthcare stakeholders in order to reduce waste and improve clinical outcomes.”

Informed by natural language processing and machine learning

SyTrue relies on NLP and machine learning (ML) as the underlying technology. Using their own proprietary methods, they perform “context-driven information extraction.” In other words, they connect the dots. The graphic below shows their processes.
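
To make "context-driven information extraction" concrete, here is a toy rule-based sketch. It bears no resemblance to SyTrue's production NLP OS (which is proprietary and ML-driven); the note text and patterns are invented purely for illustration, including the negation rule that shows why context matters.

```python
# Toy sketch (not SyTrue's technology): pull simple clinical facts out of
# a free-text note, including negated findings, with regular expressions.
import re

NOTE = "Pt is a 67 y/o male. BP 142/90. Denies chest pain. Takes lisinopril 10mg daily."

PATTERNS = {
    "age": re.compile(r"(\d{1,3})\s*y/o"),
    "blood_pressure": re.compile(r"BP\s*(\d{2,3}/\d{2,3})"),
    "medication": re.compile(r"Takes\s+([a-z]+\s+\d+mg)", re.IGNORECASE),
}

# Context matters: "denies chest pain" must not be read as "has chest pain".
NEGATION = re.compile(r"Denies\s+([a-z ]+?)\.", re.IGNORECASE)

facts = {name: m.group(1) for name, rx in PATTERNS.items() if (m := rx.search(NOTE))}
facts["negated_findings"] = NEGATION.findall(NOTE)
print(facts)
```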

Improving healthcare

SyTrue offers the NLP OS (Operating System) for healthcare. It aids in several ways.

It unlocks healthcare records and enables healthcare professionals to interact with medical record data and its clinical and financial implications. Specifically, it eliminates the need for professionals to hunt for the same key observations. This enables professionals to spend more time focused on patient care.
NLP OS also bridges the communication between a specialist provider and a primary care physician regarding the care of a shared patient. The system extracts and highlights continuity of care recommendations generated within the patient’s care team.
A large healthcare organization installed SyAudit, powered by SyTrue NLP OS, at the front of their medical chart review process. Before the charts reach a nurse-reviewer, they are processed through this solution. The system interprets the documentation to determine if a nurse review is in fact needed, or if the documentation lacks actionable information. This potentially decreases the time spent by nurse reviewers.
A healthcare provider used SyReview, another SyTrue solution powered by the SyTrue NLP OS, for their quality capturing and reporting process. The particular process is related to an incentive program which directly ties quality to Medicare payment. Automating the quality-capturing process strengthens the feedback loop to providers that need to show improvement. The organization also eliminated its manual quality-capture process, which was slow, expensive, and often inaccurate.

Next steps

To see more about Azure in the healthcare industry, see Azure for health.

To find out more about this solution, go to the Azure Marketplace listing for NLP OS™ for Healthcare and click Contact me.
Source: Azure

Azure.Source – Volume 88

News and updates

Announcing native backup for SQL Server 2008 end of support in Azure

With SQL Server 2008 and 2008 R2 approaching end of support, many customers are moving to Azure. They see this milestone as an opportunity to reimagine and transform their infrastructure with the power of cloud computing. Azure’s offer of free extended security updates for three years provides a new lease on life to these servers while giving organizations time to upgrade. Learn how easy it is to protect your SQL databases in Azure.

Microsoft and Truffle partner to bring a world-class experience to blockchain developers

Last month, Microsoft released Azure Blockchain Service, making it easy for anyone to quickly set up and manage a blockchain network and providing a foundation for developers to build a new class of multi-party blockchain applications in the cloud. To enable end-to-end development of these new apps, we’ve collaborated with teams from Visual Studio Code to Azure Logic Apps and Microsoft Flow to Azure DevOps to deliver a high-quality experience that integrates Microsoft tools developers trust and open-source tools they love. Now we have doubled down on our relationship by announcing an official partnership between our organizations to bring Truffle blockchain tools for developer experience and DevOps to Microsoft Azure.

Now available

Azure and Office 365 generally available today, Dynamics 365 and Power Platform available by end of 2019

Microsoft Azure and Microsoft Office 365 are taking a major step together to help support the digital transformation of our customers. Both Azure and Office 365 are now generally available from our first cloud datacenter regions in the Middle East, located in the United Arab Emirates (UAE). Dynamics 365 and Power Platform, offering the next generation of intelligent business applications and tools, are anticipated to be available from the cloud regions in UAE by the end of 2019.

In preview

Introducing next generation reading with Immersive Reader, a new Azure Cognitive Service

We’re unveiling the preview of Immersive Reader, a new Azure Cognitive Service in the Language category. Developers can now use this service to embed inclusive capabilities into their apps for enhancing text reading and comprehension for users regardless of age or ability. No machine learning expertise is required. Based on extensive research on inclusivity and accessibility, Immersive Reader’s features are designed to read the text aloud, translate, focus user attention, and much more. Immersive Reader helps users unlock knowledge from text and achieve gains in the classroom and office.

Announcing the preview of Microsoft Azure Bastion

For many customers around the world, securely connecting from the outside to workloads and virtual machines on private networks can be challenging. Exposing virtual machines to the public Internet to enable connectivity through Remote Desktop Protocol (RDP) and Secure Shell (SSH) increases the perimeter, rendering your critical networks and attached virtual machines more open and harder to manage. To connect to their virtual machines, most customers either expose them to the public Internet or deploy a bastion host, such as a jump-server or jump-box. So we’re excited to announce the preview of Azure Bastion, a new managed PaaS service that provides seamless RDP and SSH connectivity to your virtual machines over Secure Sockets Layer (SSL).

Virtual machine scale set insights from Azure Monitor

In October 2018 we announced the public preview of Azure Monitor for Virtual Machines (VMs). At that time, we included support for monitoring your virtual machine scale sets from the at-scale view under Azure Monitor. Today we are announcing the public preview of monitoring your Windows and Linux VM scale sets from within the scale set resource blade. This blog highlights several enhancements.

Technical content

Using Azure Search custom skills to create personalized job recommendations

The Microsoft Worldwide Learning Innovation lab is an idea incubation lab within Microsoft that focuses on developing personalized learning and career experiences. One of the recent experiences that the lab developed focused on offering skills-based personalized job recommendations. Research shows that job search is one of the most stressful times in someone’s life. Everyone remembers at some point looking for their next career move and how stressful it was to find a job that aligns with their various skills. Harnessing Azure Search custom skills together with our library of technical capabilities, we were able to build a feature that offers personalized job recommendations based on identified capabilities from resumes.

Azure Stack IaaS – part ten

One of the best things about running your VMs in Azure or Azure Stack is you can begin to modernize around your virtual machines (VMs) by taking advantage of the services provided by the cloud. Platform as a Service (PaaS) is the term often applied to the capabilities that are available to your application to use without the burden of building and maintaining these capabilities yourself. In fact, cloud IaaS is itself a PaaS, since you do not have to build or maintain the underlying hypervisors, software-defined network and storage, or even the self-service API and portal. Furthermore, Azure and Azure Stack give you PaaS services which you can use to modernize your application. In this article we will explore how you can modernize your application with web apps, serverless functions, blob storage, and Kubernetes as part of your Journey to PaaS.

Getting Started with Azure Machine Learning service with Visual Studio Code | Azure Tips and Tricks

In Azure, you can create complex machine learning models and train them with data in a Machine Learning service workspace. This is a workspace where you can manage all of your machine learning tools and assets, like experiments, models, scripts, and model deployments, and you can use it to share your machine learning work with other data scientists on your team. Let’s get started with Azure Machine Learning for VS Code and see how the Azure Machine Learning service works.

Azure Shows

Azure tips and tricks for Visual Studio 2019 | Azure Friday

Learn Michael Crump’s latest Azure tips and tricks that will help you be more productive working with Azure in Visual Studio 2019.

.NET Core 3.0 with Scott Hunter | On .NET

.NET Core 3 will be a major milestone with tons of new features, performance updates and support for new workloads. In this episode, Richard Lander and Scott Hunter get together to discuss some of the highlights that developers can look forward to in this new release.

Server-side Blazor in .NET Core 3.0 | On .NET

In this episode, Shayne Boyer sits down with Daniel Roth to get an understanding of what Blazor is and what benefits it brings to the table for building web applications.

Five things you didn’t know Python could do | Five Things

This week, Python (the language, not the snake) aficionado Nina Zakharenko joins us for Five Things that you didn’t know that Python can do. And don’t worry, there are plenty of snake references and even a free potato joke. Also, Burke finds snake facts on the internet and Nina tries her first Goo Goo Cluster.

All about Rust in real life: Linkerd 2.0 | The Open Source Show

Oliver Gould, CTO at Buoyant and one of the creators of Linkerd, joins Lachie Evenson to talk Rust, one of Stack Overflow’s most loved programming languages for the fourth year running. Specifically, how and why Linkerd rewrote 2.0 in Rust, what’s changed over the years, plus Oliver’s tips for navigating tooling, package management, release channels, and more.

Azure IoT Edge development with Azure DevOps | Internet of Things Show

The Internet of Things is a technology paradigm that involves the use of internet-connected devices to publish data, often in conjunction with real-time data processing, machine learning, and/or storage services. We will examine IoT Edge solutions using Azure DevOps, Application Insights, Azure Container Registries, containerized IoT edge devices, and Azure Kubernetes Service to create an end-to-end pipeline which deploys, smoke tests, and allows for scalable integration testing using replica sets in k8s.

Eric Fleming on middle-of-the-day deployments | Azure DevOps podcast

Today’s episode is all about middle-of-the-day deployments: how teams such as Netflix, Facebook, and even the Azure DevOps Product Team are doing them, and how other teams can achieve that for themselves!

Source: Azure

Azure Cosmos DB: A competitive advantage for healthcare ISVs

This blog was co-authored by Shweta Mishra, Senior Solutions Architect, CitiusTech and Vinil Menon, Chief Technology Officer, CitiusTech

CitiusTech is a specialist provider of healthcare technology services which helps its customers accelerate innovation in healthcare. CitiusTech used Azure Cosmos DB to simplify the real-time collection and movement of healthcare data from a variety of sources in a secure manner. With the proliferation of patient information from established and emerging sources, accompanied by stringent regulations, healthcare systems today are gradually shifting toward near real-time data integration. To realize such performance, healthcare systems not only need low latency and high availability, they must also be highly responsive. Furthermore, they need to scale effectively to manage the inflow of high-speed, large volumes of healthcare data.

The situation

The rise of the Internet of Things (IoT) has enabled ordinary medical devices, wearables, and traditional hospital-deployed medical equipment to collect and share data. Within a wide area network (WAN), there are well-defined standards and protocols, but with the ever increasing number of devices getting connected to the internet, there is a general lack of standards compliance and consistency of implementation. Moreover, data collation and generation from IoT-enabled medical and mobile devices need specialized applications to cope with increasing volumes of data.

This free-form approach provides a great deal of flexibility, since different data can be stored in document-oriented stores as business requirements change. Relational databases aren’t efficient at performing CRUD operations on such data, but are essential for handling transactional data where consistent data integrity is necessary. Different databases are designed to solve different problems; using a single database engine for multiple purposes usually leads to non-performant solutions, whereas managing multiple types of databases adds operational overhead.

Development of distributed, global-scale solutions is challenged by the capability and complexity of scaling databases across multiple regions without compromising performance, all while complying with data sovereignty needs. This often leads to inefficient management of multiple regional databases and/or underperformance.

Solution

Azure Cosmos DB supports polyglot persistence, which allows it to use a mix of data store technologies without compromising on performance. It is a multi-model, highly available, globally scalable database which supports proven low-latency reads and writes. Azure Cosmos DB has enterprise-grade security features and keeps all data encrypted at rest.

Azure Cosmos DB is suited for distributed global scale solutions as it not only provides a turnkey global distribution feature but can geo-fence a database to specific regions to manage data sovereignty compliance. Its multi-master feature allows writes to be made and synchronized across regions with guaranteed consistency. In addition, it supports multi-document transactions with ACID guarantees.

Use cases in healthcare

Azure Cosmos DB works very well for the following workloads.

1. Global scale secure solutions

Organizations like CitiusTech that offer a mission-critical, global-scale solution should consider Azure Cosmos DB a critical component of their solution stack. For example, an ISV developing a non-drug treatment for patients through a medical device at a facility can develop web or mobile applications which store the treatment information and medical device metadata in Azure Cosmos DB. Treatment information can be pushed to medical devices at global facilities for the treatment. ISVs can meet data residency requirements by using the geo-fencing feature.

Azure Cosmos DB can also be used as a multi-tenant database with a carefully designed strategy. For instance, if a tenant has different scaling requirements, different Azure Cosmos containers can be created for such tenants. In Azure Cosmos DB, containers serve as logical units of distribution and scalability. Multi-tenancy may be possible at a partition level within an Azure Cosmos container, but it needs to be designed carefully to avoid creating hot spots and compromising overall performance.

2. Real-time location system, Internet of Things

Azure Cosmos DB is effective for building a solution for real-time tracking and management of medical devices and patients, as such a solution often requires rapid data velocity, scale, and resilience. Azure Cosmos DB supports low-latency writes and reads, and all data is replicated across multiple fault and update domains in each region for high availability and resilience. It supports session consistency, one of its five consistency levels, which is suitable for such scenarios: session consistency guarantees strong consistency within a session.

Azure Cosmos DB also allows processing power to be scaled, which is useful for burst scenarios, and provides elastic scale with petabytes of storage. Request units (RUs) can be adjusted programmatically to match the workload.
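
As a brief sketch of programmatic RU adjustment, assuming the azure-cosmos v4 Python SDK, with the account endpoint, key, and database/container names as placeholders:

```python
# Sketch: read and adjust a container's provisioned request units (RUs)
# to match the workload. Names and credentials below are placeholders.
from azure.cosmos import CosmosClient

client = CosmosClient("<account-endpoint>", credential="<account-key>")
container = client.get_database_client("health").get_container_client("vitals")

offer = container.read_offer()        # current provisioned throughput
print(offer.offer_throughput)

container.replace_throughput(2000)    # scale up for a burst window
```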

CitiusTech worked with a leading provider of medical grade vital signs and physiological monitoring solution to build a medical IoT based platform with the following requirements:

Monitor vitals with medical quality
Provide solutions for partners to integrate custom solutions
Deliver personalized, actionable insights
Messages and/or device-generated data don’t have a fixed structure and may change in the future
Data producer(s) to simultaneously upload data for at least 100 subjects in less than two seconds per subject, receiving no more than 40*21=840 data points per subject, per request
Data consumer(s) to read simultaneously, data of at least 100 subjects in less than two seconds, producing no more than 15,000 data points per data consumer
Data for most recent 14 days shall be ready to be queried, and data older than 14 days to be moved to a cold storage

CitiusTech used Azure Cosmos DB as hot storage for health data, since it enabled low-latency writes and reads of the health data generated continuously by the wearable sensor. Azure Cosmos DB provided schema-agnostic, flexible storage for documents of different shapes and sizes at scale, and offered enterprise-grade security with Azure compliance certifications.

The time to live (TTL) feature in Azure Cosmos DB automatically deleted expired items based on the TTL value. The database was geo-distributed, with the geo-fencing feature used to address data sovereignty compliance requirements.
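
A minimal sketch of the TTL pattern, again assuming the azure-cosmos v4 Python SDK (the database, container, and partition key names are hypothetical, chosen to mirror the 14-day hot-storage requirement above):

```python
# Sketch: a hot-storage container whose items expire automatically after
# 14 days via TTL, so only recent data stays queryable in Cosmos DB.
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient("<account-endpoint>", credential="<account-key>")
db = client.create_database_if_not_exists("health")

container = db.create_container_if_not_exists(
    id="vitals",
    partition_key=PartitionKey(path="/subjectId"),
    default_ttl=14 * 24 * 60 * 60,  # seconds; expired items are deleted automatically
)

# Individual items can override the container default with their own "ttl" field.
container.upsert_item({"id": "reading-1", "subjectId": "s-100", "bpm": 72})
```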

Solution architecture

Architecture of data flow in CitiusTech’s solution using Azure Cosmos DB

Key insights

Azure Cosmos DB unlocks the potential of polyglot persistence for healthcare systems to integrate healthcare data from multiple systems of record. It also addresses healthcare's need for flexibility, adaptability, speed, security, and scale, while maintaining low operational overhead and high performance.

About CitiusTech

CitiusTech is a specialist provider of healthcare technology services and solutions to healthcare technology companies, providers, payers and life sciences organizations. CitiusTech helps customers accelerate innovation in healthcare through specialized solutions, healthcare technology platforms, proficiencies and accelerators. Find out more about CitiusTech.
Source: Azure

Azure Security Expert Series: Learn best practices and Customer Lockbox general availability

With more computing environments moving to the cloud, the need for stronger cloud security has never been greater. But what constitutes effective cloud security, and what best practices should you be following?

While Microsoft Azure delivers unmatched built-in security, it is important that you understand the breadth of security controls and take advantage of them to protect your workloads.

We launched the Azure Security Expert Series, which will provide ongoing virtual content to help security professionals protect hybrid cloud environments. Ann Johnson, CVP of the Cybersecurity Solutions Group at Microsoft, kicked off the series and shared five cloud security best practices:

Strengthen Access Control
Increase your security posture
Secure apps and data
Manage networking
Mitigate threats

Make sure you are up to speed with each of these important best practices as you secure your own organization.

Customer Lockbox for Microsoft Azure

During her main talk, Ann announced the general availability of Customer Lockbox for Microsoft Azure. Customer Lockbox for Azure extends our commitment to customer privacy while also giving you help when you need it most. With Customer Lockbox for Microsoft Azure, customers can review and approve or reject requests from Microsoft engineers to access their data during a support case. Access is granted only if approved, and the entire process is audited, with records stored in the Activity Logs.

Customer Lockbox is now generally available and currently enabled for remote desktop access to virtual machines. To learn more, please see the Customer Lockbox for Microsoft Azure documentation.

What will you learn?

Missed the broadcast or want to dive deeper into SIEM, IoT, networking, or Security Center?

Check out the Azure Security Expert Series, which includes the best practices session with Ann and additional drill-down sessions, including:

Get started with Azure Sentinel, a cloud-native SIEM
What is cloud-native Azure Network Security?
Securing the hybrid cloud with Security Center
What makes IoT Security different?

Until June 26th, 2019, you will have a chance to win a Microsoft Xbox One S. To enter, watch the sessions and complete the knowledge check on the entry form and submit the entry.**

‘Ask Us Anything’ with Azure security experts

Have more questions? The Azure security team will be hosting an ‘Ask Us Anything’ session on Twitter on Monday June 24, 2019 from 10 am – 11:30 am PT (1 pm – 2:30 pm ET). Our product and engineering teams will be available to answer questions about Azure security services.

Post your questions to Twitter by mentioning @AzureSupport and using the hashtag #AzureSecuritySeries.

If there are follow-ups or additional questions that come up after the Twitter session, no problem! We’re happy to continue the dialogue afterward through Twitter or send your questions to Azuresecurityexpert@microsoft.com.

How do I learn more about Azure security and connect with the tech community?

There are several ways to stay connected and access new executive talks, on-demand sessions, or other types of valuable content covering a range of cloud security topics to help you get started or accelerate your cloud security plan.

Watch for content on Azure Security Expert Series.
Visit Microsoft Azure for product details.
Follow the social channel for Azure security news and updates on @Azure.
Join our security community to connect with the engineering teams and participate in previews, group discussions, give feedback, etc.
Accelerate your knowledge on security capabilities within Azure with hands-on training courses with Microsoft Professional Program for Cybersecurity, or watch out for new Azure security training sessions on Microsoft Learn.
Attend Microsoft Ignite for specialized security learning paths and other exclusive activities to learn from the experts and connect with your peers.

**The Sweepstakes will run exclusively from June 19 to June 26, 11:59 PM Pacific Time. No purchase necessary. To enter, you must be a legal resident of the 50 United States (including the District of Columbia) and be 18 years of age or older. You will need to complete all the Knowledge Check questions in the entry form to qualify for the sweepstakes. Please refer to our official rules for more details.
Source: Azure