Announcing Cloud Partner Portal: Public preview for single-virtual machine offers

Today, we are excited to announce the public preview of the Cloud Partner Portal for publishing single virtual machine offers. The Cloud Partner Portal enables our publisher partners to create, define, publish and get insights for their single virtual machine offerings on the Azure Marketplace.

With this announcement, new and existing publisher partners who wish to publish virtual machines onto Azure will be able to use the new Cloud Partner Portal to perform any of the above actions. This new portal will soon support all other offer types and will replace the current publishing portal in time.

The new improved Cloud Partner Portal

Today’s release has several new features that make publishing onto the Azure Marketplace a lot faster, simpler, and easier.

Features of Today’s Release:

1. Org Id login support – This has been an ask from our publisher partners for a long time, and we are adding support for Org Id to the Cloud Partner Portal. Additionally, the new publishing portal supports role-based access control (RBAC), so offers remain secure and publishers can grant contributors only the level of access they need rather than making them all co-admins.

2. Get it right the first time – Everyone hates do-overs. There is nothing worse than spending time defining an offer and thinking you are done, only to discover an issue with the offer downstream. To prevent this, your offer is validated as you type. This reduces unwanted surprises after publishing the offer.

Additionally, we anticipate an overall reduction in the time from starting to define an offer to actually publishing it.

We have spent a considerable amount of time writing validations for every field within the offer to ensure that when publishers click Publish, their offer will publish successfully. Even as we ship, we are adding new validations with every release, which makes the process a lot more predictable.

3. Simplified publishing workflow – The new publishing portal has a simplified publishing workflow providing one path to offer publishing. There are no separate production and staging slots exposed to publishers. Publishers just need to ‘Publish’ their offers, and we take care of the rest.

Before an offer goes live, publishers are given a chance to review it and ensure that everything is working as expected.

4. Be more informed – The new Cloud Partner Portal lets publishers know, even before they publish their offer, about the steps their offer will go through, along with estimated execution times. Along with the guidance around the workflows, we have notifications built into the portal which keep publishers informed of their offer’s progress toward getting listed on Azure.

5. Insights in the portal – The Cloud Partner Portal provides a direct link into the insights of an offer. These insights provide a quick glance and drilldowns into an offer’s health and performance on the Azure Marketplace. The insights portal also has an onboarding video and rich documentation that helps publishers familiarize themselves with its features.

6. Feedback is just a click away – The send a smile/frown button will be ubiquitous in the new portal. In a matter of clicks publishers can send feedback directly to the engineering team.

I could keep writing about the host of new features and capabilities of the new publishing portal; however, the best way to discover these features is to take the portal for a spin.

If you are an existing Azure publisher with a virtual machine offer, your account for the new publishing portal has already been created. Please visit the Cloud Partner Portal and log in using your current credentials. Please refer to our documentation if you need any help getting started.

Existing publishers can also let us know if they would like to have their offers migrated, following the steps available to registered publishers. We also have a brand new seller guide that can help you navigate the Azure Marketplace better and get the most value out of it.

If you are a new publisher looking to publish onto the Azure platform, please fill out the nomination form here and we will be in touch with you.

As you try out the new cloud partner portal, please keep the steady stream of feedback coming in. We hope you enjoy using the portal as much as we enjoyed creating it for you.
Source: Azure

Cloudera now supports Azure Data Lake Store

With the release of Cloudera Enterprise Data Hub 5.11, you can now run Spark, Hive, and MapReduce workloads in a Cloudera cluster on Azure Data Lake Store (ADLS). Running on ADLS has the following benefits:

Grow or shrink a cluster independent of the size of the data.
Data persists independently as you spin up or tear down a cluster. Other clusters and compute engines, such as Azure Data Lake Analytics or Azure SQL Data Warehouse, can execute workloads on the same data.
Enable role-based access controls integrated with Azure Active Directory and authorize users and groups with fine-grained POSIX-based ACLs.
Cloud HDFS with performance optimized for analytics workloads, supporting reading and writing hundreds of terabytes of data concurrently.
No limits on account size or individual file size.
Data is encrypted at rest by default using service-managed or customer-managed keys in Azure Key Vault, and is encrypted with SSL while in transit.
High data durability at lower cost, as data replication is managed by Data Lake Store and exposed through an HDFS-compatible interface rather than having to replicate data both in HDFS and at the cloud-storage infrastructure level.

To get started, you can use the Cloudera Enterprise Data Hub template or the Cloudera Director template on Azure Marketplace to create a Cloudera cluster. Once the cluster is up, use one or both of the following approaches to enable ADLS.

Add a Data Lake Store for cluster-wide access

Step 1: ADLS uses Azure Active Directory for identity management and authentication. To access ADLS from a Cloudera cluster, first create a service principal in Azure AD. You will need the Application ID, Authentication Key, and Tenant ID of the service principal.

Step 2: To access ADLS, assign the permissions for the service principal created in the previous step. To do this, go to the Azure portal, navigate to the Data Lake Store, and select Data Explorer. Then navigate to the target path, select Access and add the service principal with appropriate access rights. Refer to this document for details on access control in ADLS.

Step 3: Go to Cloudera Manager -> HDFS -> Configuration. Add the following configurations to core-site.xml:

Use the service principal property values obtained from Step 1 to set these parameters:

<property>
  <name>dfs.adls.oauth2.client.id</name>
  <value>Application ID</value>
</property>
<property>
  <name>dfs.adls.oauth2.credential</name>
  <value>Authentication Key</value>
</property>
<property>
  <name>dfs.adls.oauth2.refresh.url</name>
  <value>https://login.microsoftonline.com/<Tenant ID>/oauth2/token</value>
</property>
<property>
  <name>dfs.adls.oauth2.access.token.provider.type</name>
  <value>ClientCredential</value>
</property>
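For context, the ClientCredential provider behind these settings performs a standard Azure AD OAuth2 client-credentials exchange against the refresh URL. The following Python sketch shows roughly what that exchange looks like; it is purely illustrative (Hadoop does this for you), and the function name and placeholder values are our own, not part of the Hadoop configuration:

```python
# Illustrative sketch of the OAuth2 client-credentials exchange that the
# ClientCredential token provider performs against Azure AD. Hadoop does
# this automatically; the tenant/app values below are placeholders.

def build_token_request(tenant_id, client_id, client_secret):
    """Return the token endpoint URL and form payload for the exchange."""
    url = "https://login.microsoftonline.com/%s/oauth2/token" % tenant_id
    payload = {
        "grant_type": "client_credentials",
        "client_id": client_id,          # dfs.adls.oauth2.client.id
        "client_secret": client_secret,  # dfs.adls.oauth2.credential
        "resource": "https://datalake.azure.net/",
    }
    return url, payload

url, payload = build_token_request("my-tenant-id", "my-app-id", "my-key")
print(url)  # → https://login.microsoftonline.com/my-tenant-id/oauth2/token
```

The access token returned by a POST to this endpoint is what Hadoop attaches to each ADLS request.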

Step 4: Verify you can access ADLS by running a Hadoop command, for example:

hdfs dfs -ls adl://<your adls account>.azuredatalakestore.net/<path to file>/

Specify a Data Lake Store in the Hadoop command line

Instead of, or in addition to, configuring a Data Lake Store for cluster-wide access, you can also provide ADLS access information on the command line of a MapReduce or Spark job. With this method, if you use an Azure AD refresh token instead of a service principal and encrypt the credentials in a .JCEKS file under a user’s home directory, you gain the following benefits:

Each user can use their own credentials instead of a cluster-wide credential
Nobody can see another user’s credential, because it’s encrypted in a .JCEKS file in the user’s home directory
No need to store credentials in clear text in a configuration file
No need to wait for someone who has rights to create service principals in Azure AD

The following steps illustrate an example of how you can set this up by using the refresh token obtained by signing in to the Azure cross platform client tool.

Step 1: Sign in to the Azure CLI by running the command "azure login", then get the refreshToken and _clientId from .azure/accessTokens.json under the user’s home directory.

Step 2: Run the following commands to set up credentials to access ADLS:

export HADOOP_CREDSTORE_PASSWORD=<your encryption password>
hadoop credential create dfs.adls.oauth2.client.id -value <_clientId from Step 1> -provider jceks://hdfs/user/<username>/cred.jceks
hadoop credential create dfs.adls.oauth2.refresh.token -value '<refreshToken from Step 1>' -provider jceks://hdfs/user/<username>/cred.jceks

Step 3: Verify you can access ADLS by running a Hadoop command, for example:

hdfs dfs -Ddfs.adls.oauth2.access.token.provider.type=RefreshToken -Dhadoop.security.credential.provider.path=jceks://hdfs/user/<username>/cred.jceks -ls adl://<your adls account>.azuredatalakestore.net/<path to file>
hadoop jar /opt/cloudera/parcels/CDH/lib/hadoop-0.20-mapreduce/hadoop-examples.jar teragen -Dmapred.child.env="HADOOP_CREDSTORE_PASSWORD=$HADOOP_CREDSTORE_PASSWORD" -Dyarn.app.mapreduce.am.env="HADOOP_CREDSTORE_PASSWORD=$HADOOP_CREDSTORE_PASSWORD" -Ddfs.adls.oauth2.access.token.provider.type=RefreshToken -Dhadoop.security.credential.provider.path=jceks://hdfs/user/<username>/cred.jceks 1000 adl://<your adls account>.azuredatalakestore.net/<path to file>

Limitations of ADLS support in EDH 5.11

Only Spark, Hive, and MapReduce workloads are supported on ADLS. Support for ADLS in Impala, HBase, and other services will come in future releases.
ADLS is supported as secondary storage. To access ADLS, use fully qualified URLs in the form adl://<your adls account>.azuredatalakestore.net/<path to file>.

Additional resources

Cloudera documentation on ADLS support

Source: Azure

Microsoft Azure Platform for India GST Suvidha Providers (GSPs)

Goods and Services Tax (GST) is essentially one new indirect tax system for the whole nation, which will make India one unified common market, right from the manufacturer to the consumer. It is a broad-based, comprehensive, single indirect tax which will be levied concurrently on goods and services across India.

The Central and State indirect taxes that may be subsumed by GST include Value Added Tax (VAT), Excise Duty, Service Tax, Central Sales Tax, Additional Customs Duty and Special Additional Duty of Customs. GST will be levied at every stage of the production and distribution chains by giving the benefit of Input Tax Credit (ITC) of the tax remitted at previous stages; thereby, treating the entire country as one market.

Due to the federal structure of India, there will be two components of GST – Central GST (CGST) and State GST (SGST). Both the Centre and the States will simultaneously levy GST across the value chain. For interstate transactions, an Integrated GST (IGST) will be applicable, which will be settled between the Centre and the States.

Goods and Services Tax Network (GSTN), a non-government private limited company, has been formed to provide the IT infrastructure to State/Central governments and taxpayers. It has set up a central solution platform to technically enable the GST system, including registration of taxpayers, upload/download of invoices, filing of returns, State/Central Government reporting, IGST settlement, etc. This platform, the GST Platform, has been set up from day one as an open API (GST API) based platform, allowing various authorized parties to exchange information at scale.

The core IT strategy for GST was identified early on and is now publicly accessible.

GSTN has further identified GSPs who can wrap the GST Platform and offer various value-added services to their customers (i.e., taxpayers) and to further downstream sub-GSPs or registered application service providers (ASPs).

GSPs have limited time to understand the new set of rules, which are continuously evolving, develop the solution, and host and run it on a secure and scalable platform. The government has also adopted an ecosystem approach, which allows GSPs to further expose their APIs to downstream ASPs (Application Service Providers) who will cater to taxpayer needs.

GSPs and ASPs need to focus on solution capabilities and select a platform which provides most of the plumbing necessary to build an open yet secure, scalable, maintainable, and compliant solution, and to achieve all of this at a manageable cost.

With three large data centers in India offering a wide range of IaaS, PaaS, and SaaS services and supporting a host of open-source and commercial platforms, Azure offers the best platform to host GSP and GST-related ASP solutions.

The attached document, authored by Mandar Samant (Area Architect, Microsoft Services), provides a good overview of how Azure services can get GSPs and ASPs started very quickly and provide cutting-edge GST solutions to the taxpayer community in India.
Source: Azure

Azure IoT Suite connected factory now available

Getting Started with Industrie 4.0

Many customers tell us that they want to start with the digital transformation of their assets, for example production lines, as well as their business processes. However, many times they just don’t know where to start or what exactly Industrie 4.0 is all about. At Microsoft, we are committed to enabling businesses of all sizes to realize their full potential and today we are proud to announce our connected factory preconfigured solution and six-step framework to quickly enable you to get started on your Industrie 4.0 journey.

Azure IoT Suite preconfigured solutions are engineered to help businesses get started quickly and move from proof-of-concept to broader deployment. The connected factory preconfigured solution leverages Azure services including Azure IoT Hub and the new Azure Time Series Insights. Furthermore, it leverages the OPC Foundation’s cross-platform OPC UA .Net Standard Library reference stack for OPC UA connectivity, as well as a rich web portal with OPC UA server management capabilities, alarm processing, and telemetry visualizations. The web portal and Azure Time Series Insights can be used to quickly see trends in OPC UA telemetry data, as well as Overall Equipment Effectiveness (OEE) and several key performance indicators (KPIs) such as number of units produced and energy consumption.

This solution builds on the industry-leading cloud connectivity for OPC UA that we first announced at Hannover Messe a year ago. Since then, all components of this connectivity have been released cross-platform and open-source on GitHub in collaboration with the OPC Foundation, making Microsoft the largest open-source contributor to the OPC Foundation. Furthermore, the entire connected factory preconfigured solution is also published open-source on GitHub.

Azure IoT Suite is the best solution for Industrie 4.0

As we demonstrated at Hannover Messe 2016, we believe that the Azure IoT Suite is the best choice for businesses to cloud-enable industrial equipment — including already deployed machines, without disrupting their operation — to allow for data and device management, insights, machine learning capabilities and even the ability to manage equipment remotely.

To demonstrate this functionality, we have gone to great lengths to build real OPC UA servers into the solution, grouped into assembly lines where each OPC UA server is responsible for a “station” within the assembly line. Each assembly line produces simulated products. We even built a simple Manufacturing Execution System (MES) with an OPC UA interface, which controls each assembly line. The connected factory preconfigured solution includes eight such assembly lines, running in a Linux virtual machine on Azure. Our Azure IoT Gateway SDK is also used in each simulated factory location.

Secure by design, secure by default

As verified by the BSI Study, OPC UA is secure by default. Microsoft is going one step further and is making sure that the OPC UA components used in the connected factory solution are secure by default, giving you a secure base to build your own solution on top of. Secure by default means that all security features are turned on and already configured, so you don’t need to do this step manually and can see how an end-to-end solution is secured.

Easy to extend with real factories

We have made it as simple as possible to extend the connected factory preconfigured solution with real factories. For this, we have partnered with several industry leaders in the OPC UA ecosystem who have built turnkey gateway solutions that have the Azure connectivity used by this solution already built in and are close to zero-config. These partners include Softing, Unified Automation, and Hewlett Packard Enterprise. Please visit our device catalog for a complete list of gateways compatible with this solution. With these gateways, you can easily connect your on-premises industrial assets to this solution.

However, we have gone even further and additionally provide open-source Docker containers, as well as pre-built Docker container images on Docker Hub, for the Azure connectivity components (OPC Proxy and OPC Publisher). Both are integrated in the Azure IoT Gateway SDK and available on GitHub, making a PoC with real equipment achievable in hours and enabling you to quickly draw insights from your equipment and plan commercialization steps based on these PoCs.

The future is now

Get started on the journey to cloud-enable industrial equipment with Azure IoT Suite connected factory preconfigured solution and see the solution in action at Hannover Messe 2017. To learn more about how IoT can help transform your business, visit www.InternetofYourThings.com.

Learn more about Microsoft IoT

Microsoft is simplifying IoT so every business can digitally transform through IoT solutions that are more accessible and easier to implement. Microsoft has the most comprehensive IoT portfolio with a wide range of IoT offerings to meet organizations where they are on their IoT journey, including everything businesses need to get started — ranging from operating systems for their devices, cloud services to control them, advanced analytics to gain insights, and business applications to enable intelligent action. To see how Microsoft IoT can transform your business, visit www.InternetofYourThings.com.​
Source: Azure

Announcing Azure Stream Analytics on edge devices (preview)

Today, we are announcing Azure Stream Analytics (ASA) on edge devices, a new feature of Azure Stream Analytics that enables customers to deploy analytical intelligence closer to the IoT devices and unlock the full value of the device-generated data.

Azure Stream Analytics on edge devices extends all the benefits of our unique streaming technology from the cloud down to devices. With ASA on edge devices, we are offering the power of our Complex Event Processing (CEP) solution on edge devices to easily develop and run real-time analytics on multiple streams of data. One of the key benefits of this feature is the seamless integration with the cloud: users can develop, test, and deploy their analytics from the cloud, using the same SQL-like language for both cloud and edge analytics jobs. As in the cloud, this SQL language notably enables temporal joins, windowed aggregates, temporal filters, and other common operations such as aggregates, projections, and filters. Users can also seamlessly integrate custom code in JavaScript for advanced scenarios.
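To give a flavor of this language, a windowed-aggregate job could look like the following sketch (the input/output names, field names, and threshold are illustrative assumptions, not from the announcement):

```sql
-- Hypothetical ASA query: average temperature per device over a
-- 30-second tumbling window, emitting only windows above a threshold.
SELECT
    deviceId,
    System.Timestamp AS windowEnd,
    AVG(temperature) AS avgTemperature
INTO output
FROM input TIMESTAMP BY eventTime
GROUP BY deviceId, TumblingWindow(second, 30)
HAVING AVG(temperature) > 75
```

The same query text runs unchanged whether the job is deployed in the cloud or on an edge device.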

Enabling new scenarios

Azure IoT Hub, a core Azure service that connects, monitors, and updates IoT devices, has enabled customers to connect millions of devices to the cloud, and Azure Stream Analytics has enabled customers to easily deploy and scale analytical intelligence in the cloud for extracting actionable insights from device-generated data. However, multiple IoT scenarios require real-time response, resiliency to intermittent connectivity, handling of large volumes of raw data, or pre-processing of data to ensure regulatory compliance. All of this can now be achieved by using ASA on edge devices to deploy and operate analytical intelligence physically closer to the devices.

Hewlett Packard Enterprise (HPE) is an early preview partner who has demonstrated a working prototype of ASA on edge devices at Microsoft’s booth at Hannover Messe (April 24 to 28, Hall 7, Stand C40). A result of close collaboration between Microsoft, HPE and the OPC Foundation, the prototype is based on Azure Stream Analytics, the HPE Edgeline EL1000 Converged Edge System, and the OPC Unified Architecture (OPC-UA), delivering real-time analysis, condition monitoring, and control. The HPE Edgeline EL1000 Converged Edge System integrates compute, storage, data capture, control and enterprise-class systems and device management built to thrive in hardened environments and handle shock, vibration and extreme temperatures.

ASA on edge devices is particularly interesting for Industrial IoT (IIoT) scenarios that require reacting to operational data with ultra-low latency. Systems such as manufacturing production lines or remote mining equipment need to analyze and act in real-time to the streams of incoming data, e.g. when anomalies are detected.

In offshore drilling, offshore windfarms, or ship transport scenarios, analytics need to run even when internet connectivity is intermittent. In these cases, ASA on edge devices can run reliably to summarize and monitor events, react to events locally, and leverage connection to the cloud when it becomes available.

In industrial IoT scenarios, the volume of data can be too large to be sent to the cloud directly due to limited bandwidth or bandwidth cost. For example, the data produced by jet engines (a typical number is that 1TB of data is collected during a flight) or manufacturing sensors (each sensor can produce 1MB/s to 10MB/s) may need to be filtered down, aggregated or processed directly on the device before sending it to the cloud. Examples of these processes include sending only events when values change instead of sending every event, averaging data on a time window, or using a user-defined function.
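As a sketch of the "send only events when values change" pattern described above, an edge query could use LAG to compare each event with the previous one for the same device (the stream and field names here are illustrative assumptions):

```sql
-- Hypothetical ASA edge query: forward an event only when the sensor
-- value differs from the previous event for the same device.
-- Note: the very first event per device has no predecessor, so a
-- real job would also handle the NULL returned by LAG in that case.
SELECT deviceId, eventTime, sensorValue
INTO output
FROM input TIMESTAMP BY eventTime
WHERE LAG(sensorValue) OVER
      (PARTITION BY deviceId LIMIT DURATION(hour, 1)) <> sensorValue
```

Filters like this can drastically reduce the bandwidth needed to get device data into the cloud.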

Until now, customers with such requirements had to build custom solutions, and manage them separately from their cloud applications. Now, customers can use Azure Stream Analytics to seamlessly develop and operate their stream analytics jobs both on edge devices and in the cloud.

How to use Azure Stream Analytics on edge devices?

Azure Stream Analytics on edge devices leverages the Azure IoT Gateway SDK to run on Windows and Linux operating systems, and supports a multitude of hardware as small as single-board computers, to full PCs, servers or dedicated field gateways devices. The IoT Gateway SDK provides connectors for different industry standard communication protocols such as OPC-UA, Modbus and MQTT and can be extended to support your own communication needs. Azure IoT Hub is used to provide secured bi-directional communications between gateways and the cloud.

Azure Stream Analytics on edge devices is available now in private preview. To request access to the private preview, click here.

You can also meet with our team at Hannover Messe, the world’s biggest industrial fair, which takes place from April 24th to April 28th in Hannover, Germany. We are located at the Microsoft booth in the Advanced Analytics pod (Hall 7, Stand C40).
Source: Azure

Announcing Azure Time Series Insights

Today we are excited to announce the public preview of Azure Time Series Insights, a fully managed analytics, storage, and visualization service that makes it incredibly simple to interactively and instantly explore and analyze billions of events from sources such as the Internet of Things. Time Series Insights gives you a near real-time global view of your data across various event sources and lets you quickly validate IoT solutions and avoid costly downtime of mission-critical devices. It helps you discover hidden trends, spot anomalies, and conduct root-cause analysis in near real time, all without writing a single line of code, through its simple and intuitive user experience. Additionally, it provides rich APIs to enable you to integrate its powerful capabilities into your own existing workflow or application.

Today more than ever, with increasing connected devices and massive advances in the collection of data, businesses are struggling to quickly derive insights from the sheer volume of data generated from geographically dispersed devices and solutions. In addition to the massive scale, there is also a growing need for deriving insights from the millions of events being generated in near real time. Any delay in insights can cause significant downtime and business impact. Additionally, the need to correlate data from a variety of different sensors is paramount to debug and optimize business processes and workflows. Reducing the time and expertise required for this is essential for businesses to gain a competitive edge and optimize their operations. Azure Time Series Insights solves these and many more challenges for your IoT solutions.

Customers from diverse industry sectors like automotive, windfarms, elevators, smart buildings, manufacturing, etc. have been using Time Series Insights during its private preview. They have validated its capabilities with real production data load, already realized the benefits, and are looking for ways to cut costs and improve operations.

For example, BMW uses Azure Time Series Insights and companion Azure IoT services for predictive maintenance across several of their departments. Time Series Insights and other Azure IoT services have helped companies like BMW improve operational efficiency by reducing SLAs for validating connected device installation, in some cases realizing a reduction in time from several months to as little as thirty minutes.

Near real-time insights in seconds at IoT scale

Azure Time Series Insights enables you to ingest hundreds of millions of sensor events per day, and makes new data available to query for insights within one minute. It also enables you to retain this data for months. Time Series Insights is optimized to enable you to query over this combination of near real-time and historical terabytes of data in seconds. It does not pre-aggregate data, but stores the raw events and delivers the power of doing all aggregations instantly over this massive scale. Additionally, it enables you to upload reference data to augment or enrich your incoming sensor data. Time Series Insights enables you to compare data across various sensors of different kinds, event sources, regions, and IoT installations in the same query. This is what enables you to get a global view of your data, and lets you quickly validate, monitor, discover trends, spot anomalies, and conduct root-cause analysis in near real time.

“Azure Time Series Insights has standardized our method of accessing devices’ telemetry in real time without any development effort. Time to detect and diagnose a problem has dropped from days to minutes. With just a few clicks we can visualize the end-to-end device data flow, helping us identify and address customer and market needs,” said Scott Tillman, Software Engineer, ThyssenKrupp Elevator.

Easy to get started

With built-in integration to Azure IoT Hub and Azure Event Hubs, customers can get started with Time Series Insights in minutes. Just enter your IoT Hub or Event Hub configuration information through the Azure Portal, and Time Series Insights connects and starts pulling and storing real-time data from it within a minute. This service is schema adaptive, which means that you do not have to do any data preparation to start deriving insights. This enables you to explore, compare, and correlate a variety of sensors seamlessly. It provides a very intuitive user experience that enables you to view, explore, and drill down into various granularities of data, down to specific events. It also provides SQL-like filters and aggregates, the ability to construct, visualize, compare, and overlay various time series patterns and heat maps, and the ability to save and share queries. This is what enables you to get started and glean insights from your data using Azure Time Series Insights in minutes. You can also unleash the power of Time Series Insights using the REST query APIs to create custom solutions. Additionally, Time Series Insights is used to power the time series analytics experiences in Microsoft IoT Central and Azure IoT Suite connected factory preconfigured solutions. Time Series Insights is powered by the Azure platform and provides enterprise scale, reliability, Azure Active Directory integration, and operational security.

Codit, based in Belgium, is a leading IT services company providing consultancy, technology, and managed services in business integration. They help companies reduce operational costs, improve efficiency and enhance control by enabling people and applications to integrate more efficiently. “Azure Time Series Insights is easy to use, helping us to quickly explore, analyze, and visualize many events in just a few clicks.  It’s a complete cloud service, and it has saved us from writing custom applications to quickly verify changes to IoT initiatives,” said Tom Kerkhove, Codit. “We are excited to use Time Series Insights in the future.”

Azure Time Series Insights extends the broad portfolio of Azure IoT services, such as Azure IoT Hub, Azure Stream Analytics, Azure Machine Learning, and various other services to help customers unlock deep insights from their IoT solution. Currently, Time Series Insights is available in the US West, US East, EU West, and EU North regions. Learn more about Azure Time Series Insights and sign up for the Azure Time Series Insights preview today.

Source: Azure

Announcing new functionality to automatically provision devices to Azure IoT Hub

We’re announcing a great new service for Azure IoT Hub that allows customers to provision millions of devices in a secure and scalable manner. Azure IoT Hub Device Provisioning enables zero-touch provisioning to the right IoT hub without requiring human intervention, and is currently being used by early adopters to validate various solution deployment scenarios.

Provisioning is an important part of the lifecycle management of an IoT device, which enables seamless integration with an Azure IoT solution. Technically speaking, provisioning pairs devices with an IoT hub based on any number of characteristics such as:

Location of the device (geo-sharding)
Customer who bought the device (multitenancy)
Application in which the device is to be used (solution isolation)
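For example, the geo-sharding case could conceptually map a device's reported region to a regional IoT hub, as in this purely illustrative Python sketch (the hub hostnames, regions, and function are made-up examples, not the actual service logic):

```python
# Purely illustrative: route devices to a regional IoT hub by location
# (geo-sharding). The hub hostnames and regions are made-up examples.

REGIONAL_HUBS = {
    "us": "hub-us.azure-devices.net",
    "eu": "hub-eu.azure-devices.net",
    "asia": "hub-asia.azure-devices.net",
}

def assign_hub(device_region, default="hub-us.azure-devices.net"):
    """Pick the IoT hub a device should be provisioned to."""
    return REGIONAL_HUBS.get(device_region.lower(), default)

print(assign_hub("EU"))  # → hub-eu.azure-devices.net
```

The same lookup shape applies to the other characteristics: keying on the purchasing customer gives multitenancy, and keying on the target application gives solution isolation.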

The Azure IoT Hub Device Provisioning service is made even better thanks to some security standardization work called DICE and will support multiple types of hardware security modules such as TPM. In conjunction with this, we announced hardware partnerships with STMicro and Micron.

Without IoT Hub Device Provisioning, setting up and deploying a large number of devices to work with a cloud backend is hard and involves a lot of manual work. This is true today for Azure IoT Hub. While customers can create a lot of device identities within the hub at a time using bulk import, they still must individually place connection credentials on the devices themselves. It’s hard, and today customers must build their own solution functionality to avoid the painful manual process. Our commitment to strong security best practices is partly to blame. IoT Hub requires each device to have a unique identity registered to the hub in order to enable per-device access revocation in case the device is compromised. This is a security best practice, but like many security-related best practices, it tends to slow down deployment.
 
Not only that, but registering a device to Azure IoT Hub is really only half the battle. Once a device is registered, physically deployed in the field, and hooked up to the device management dashboard, customers still have to configure the device with the proper desired twin state and firmware version. This extra step adds time during which the device is not a fully functioning member of the IoT solution. We can do better with the IoT Hub Device Provisioning service.

Hardcoding endpoints with credentials in mass production is operationally expensive, and on top of that the device manufacturer might not know how the device will be used or who the eventual device owner will be, or they may not care. In addition, complete provisioning may involve information that was not available when the device was manufactured, such as who purchased the device. The Azure IoT Hub Device Provisioning service contains all the information needed to provision a device.

Devices running Windows 10 IoT Core operating systems will enable an even easier way to connect to Device Provisioning via an in-box client that OEMs can include in the device unit. With Windows 10 IoT Core, customers can get a zero-touch provisioning experience, eliminating any configuration and provisioning hassles when onboarding new IoT devices that connect to Azure services. When combined with Windows 10 IoT Core support for Azure IoT Hub device management, the entire device life cycle management is simplified through features that enable device reprovisioning, ownership transfer, secure device management, and device end-of-life management. You can learn more about Windows IoT Core device provisioning and device management details by visiting Azure IoT Device Management.

Azure IoT is committed to offering our customers services which take the pain out of deploying and managing an IoT solution in a secure, reliable way. The Azure IoT Hub Device Provisioning service is currently in private preview, and we’ll make further announcements when it becomes available to the public. In the meantime, you can learn more about Azure IoT Hub’s device management capabilities. We would love to get your feedback on secure device registration, so please continue to submit your suggestions through the Azure IoT User Voice forum or join the Azure IoT Advisors Yammer group.

Source: Azure

Azure IoT supports new security hardware to strengthen IoT security

Microsoft’s commitment to leadership in IoT security continues as Azure IoT improves the level of trust and confidence in securing IoT deployments. Azure IoT now supports the Device Identity Composition Engine (DICE) and many different kinds of Hardware Security Modules (HSMs). DICE is an upcoming standard at the Trusted Computing Group (TCG) for device identification and attestation which enables manufacturers to use silicon gates to create device identities based in hardware, making security part of the DNA of new devices from the ground up. HSMs are the core security technology used to secure device identities and provide advanced functionality such as hardware-based device attestation and zero-touch provisioning.

In addition, the Azure IoT team is working with standards organizations and major industry partners to apply the latest security best practices and roll out support for a wide variety of Hardware Security Modules (HSMs). HSMs offer a resistant and resilient hardware root of trust in IoT devices. The Azure IoT platform transparently integrates HSM support with platform services like Azure IoT Hub Device Provisioning and Azure IoT Hub Device Management, thereby enabling customers and developers to focus more on identifying the specific risks associated with their applications and less on security deployment tactics.

IoT device deployments can be remote, autonomous, and open to threats like spoofing, tampering, and displacement. In these conditions, HSMs offer a major layer of defense to raise trust in authentication, integrity, confidentiality, privacy, and more. The Azure IoT team is working directly with major HSM manufacturers to make a wide variety of HSMs easily accessible, so that customers and developers can accommodate deployment-specific risks.

The Azure IoT team leverages open standards to develop best practices for secure and robust deployments. One such upcoming standard is the Device Identity Composition Engine (DICE) from the Trusted Computing Group (TCG), which offers a scalable security framework that requires a minimal HSM footprint to anchor trust, from which to build security solutions like authentication, secure boot, and remote attestation. DICE is a response to the new reality of constrained computing that increasingly characterizes IoT devices. Its minimalist approach is an alternate path to more traditional security frameworks like the Trusted Computing Group’s Trusted Platform Module (TPM), which is also supported on the Azure IoT platform. As of this writing, the Azure IoT platform has HSM support for DICE in HSMs from silicon vendors like STMicroelectronics and Micron, as well as support for TPM 1.2. There is also support for HSMs with vendor-specific protocols, like Spyrus’ Rosetta.

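The DICE idea above can be sketched in a few lines: a Compound Device Identifier (CDI) is derived from a hardware-protected Unique Device Secret (UDS) and a measurement of the first mutable code, so any change to that code yields a different identity. This is an illustrative approximation, not the TCG-specified implementation:

```python
# Illustrative DICE-style derivation. HMAC-SHA256 stands in for the
# one-way function; real implementations follow the TCG DICE specification.
import hashlib
import hmac

def compound_device_identifier(uds: bytes, firmware_image: bytes) -> bytes:
    """Derive a CDI from the device secret and a measurement of the code."""
    measurement = hashlib.sha256(firmware_image).digest()
    return hmac.new(uds, measurement, hashlib.sha256).digest()

uds = b"\x01" * 32  # burned into silicon, never exported off-device
cdi_a = compound_device_identifier(uds, b"firmware v1")
cdi_b = compound_device_identifier(uds, b"firmware v1 (tampered)")
assert cdi_a != cdi_b  # changed first mutable code yields a new identity
```

Because the identity is anchored in silicon and bound to the measured code, tampered firmware cannot impersonate the healthy device.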
Finally, high-level guidance on risk assessment helps solution architects make the right security decisions, including the choice of HSM. While it is possible to overengineer a security solution until it becomes too expensive to adopt, it is also possible to shortcut solution security engineering for cost reasons. There is therefore a need to understand the interplay between security and cost to arrive at an optimal solution. To this end, the Azure IoT team offers the Security Program for Azure IoT to help customers and solution architects assess the security of their IoT infrastructure and find the right security approach for their IoT deployments.

The security journey is one the Azure IoT team is committed to helping customers and developers navigate, to achieve the highest trust and confidence in securing their IoT deployments. This involves supporting a wide range of hardware-based security and security standards to secure the hardware root of trust for IoT devices.
Source: Azure

Introducing H2O.ai on Azure HDInsight

We are excited to announce that H2O’s AI platform is now available on the Azure HDInsight Application Platform. Customers can now use H2O.ai’s open source solutions on Azure HDInsight, which provides reliable open source analytics backed by an industry-leading SLA.

To learn more about H2O integration with HDInsight, register for the webinar held by H2O and Azure HDInsight team.

HDInsight and H2O to make data science on big data easier

Azure HDInsight is the only fully-managed cloud Hadoop offering that provides optimized open source analytical clusters for Spark, Hive, MapReduce, HBase, Storm, Kafka, and R Server backed by a 99.9% SLA. Each of these big data technologies and ISV applications, such as H2O, are easily deployable as managed clusters with enterprise-level security and monitoring.

The data science ecosystem has grown rapidly in the last few years, and H2O’s AI platform provides an open source machine learning framework that works with Spark, sparklyr, and PySpark. H2O’s Sparkling Water allows users to combine the fast, scalable machine learning algorithms of H2O with the capabilities of Spark. With Sparkling Water, users can drive computation from Scala, R, or Python and utilize the H2O Flow UI, providing an ideal machine learning platform for application developers.

Setting up an environment to perform advanced analytics on top of big data is hard, but with H2O Artificial Intelligence for HDInsight, customers can get started with just a few clicks. This solution installs Sparkling Water on an HDInsight Spark cluster so you can exploit all the benefits of both Spark and H2O. The solution can access data from Azure Blob storage and/or Azure Data Lake Store in addition to all the standard data sources that H2O supports. It also provides Jupyter Notebooks with built-in examples for an easy jumpstart, and the user-friendly H2O Flow UI to monitor and debug applications.

Getting started

With the industry-leading Azure cloud platform, getting started with H2O on HDInsight takes just a few clicks. Customers can install H2O during the creation of a new HDInsight cluster by selecting custom applications when creating the cluster, choosing “H2O Artificial Intelligence for HDInsight”, and agreeing to the license terms.

Customers can also deploy H2O on an existing HDInsight Spark cluster by clicking the “Application” link.

Sparkling Water integrates H2O’s fast, scalable machine learning engine with Spark. It provides utilities to publish Spark data structures (RDDs, DataFrames) as H2O frames and vice versa, as well as a Python interface that enables the use of Sparkling Water directly from PySpark. The architecture for H2O on HDInsight is shown below:

After installing H2O on HDInsight, you can use Jupyter Notebooks, which are built into Spark clusters, to write your first H2O on HDInsight application. Simply open a Jupyter Notebook and you will see a folder named “H2O-PySparkling-Examples” with a few getting-started examples.

H2O Flow is an interactive web-based computational user interface where you can combine code execution, text, mathematics, plots, and rich media into a single document. It provides a richer visualization experience for machine learning models, with native support for hyper-parameter tuning, ROC curves, and more.

Together with this combined offering of H2O on HDInsight, customers can easily build data science solutions and run them at enterprise grade and scale. Azure HDInsight provides the tools for a user to create a Data Science environment with underlying big data frameworks like Hadoop and Spark, while H2O’s technology brings a set of sophisticated, fully distributed algorithms to rapidly build and deploy highly accurate models at scale.

H2O.ai is now available on the Microsoft Azure Marketplace and as an HDInsight application. For more technical details, please refer to the H2O documentation and this technical blog post on the HDInsight blog.

Resources

H2O press release
Learn more about Azure HDInsight
Learn more about H2O
H2O on Azure marketplace
Getting Started with H2O on HDInsight
Use H2O with Azure HDInsight

Summary

We are pleased to announce the expansion of HDInsight Application Platform to include H2O.ai. By deploying H2O on HDInsight, customers can easily build analytical solutions and run them at enterprise grade and scale.
Source: Azure

Announcing Azure Analysis Services general availability

Today at the Data Amp event, we are announcing the general availability of Microsoft Azure Analysis Services, the latest addition to our data platform in the cloud. Based on the proven analytics engine in SQL Server Analysis Services, Azure Analysis Services is an enterprise grade OLAP engine and BI modeling platform, offered as a fully managed platform-as-a-service (PaaS). Azure Analysis Services enables developers and BI professionals to create BI Semantic Models that can power highly interactive and rich analytical experiences in BI tools and custom applications.

Why Azure Analysis Services?

The success of any modern data-driven organization requires that information is available at the fingertips of every business user, not just IT professionals and data scientists, to guide their day-to-day decisions. Self-service BI tools have made huge strides in making data accessible to business users. However, most business users don’t have the expertise or desire to do the heavy lifting that is typically required – finding the right sources of data, importing the raw data, transforming it into the right shape, and adding business logic and metrics – before they can explore the data to derive insights. With Azure Analysis Services, a BI professional can create a semantic model over the raw data and share it with business users so that all they need to do is connect to the model from any BI tool and immediately explore the data and gain insights. Azure Analysis Services uses a highly optimized in-memory engine to provide responses to user queries at the speed of thought.
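
For illustration, a BI tool typically reaches such a model through a server name in the asazure URI scheme. The region, server, and model names below are hypothetical, and exact connection-string properties vary by client:

```text
Data Source=asazure://westus.asazure.windows.net/myserver;Initial Catalog=SalesModel
```

Business users paste only the server name into their tool of choice; the model's measures, hierarchies, and security are already defined server-side.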

Integrated with the Azure data platform

Azure Analysis Services is the latest addition to the Azure data platform. It integrates with many Azure data services enabling customers to build sophisticated analytics solutions.

Azure Analysis Services can consume data from Azure SQL Database and Azure SQL Data Warehouse. Customers can build enterprise data warehouse solutions in Azure using a hub-and-spoke model, with the SQL data warehouse at the center and multiple BI models around it targeting different business groups or subject areas.
With more and more customers adopting Azure Data Lake and HDInsight, Azure Analysis Services will soon offer the ability to build BI models on top of these big data platforms, enabling a similar hub-and-spoke model as with Azure SQL Data Warehouse.
In addition to the above, Azure Analysis Services can also consume data from on-premises data stores such as SQL Server, Oracle, and Teradata. We are working on adding support for several more data sources, both cloud and on-premises.
Azure Data Factory is a data integration service that orchestrates the movement and transformation of data, a core capability in any enterprise BI/analytics solution. Azure Analysis Services can be integrated into any Azure Data Factory pipeline by including an activity that loads data into the model. Azure Automation and Azure Functions can also be used for doing lightweight orchestration of models using custom code.
Power BI and Excel are industry leading data exploration and visualization tools for business users. Both can connect to Azure Analysis Services models and offer a rich interactive experience. In addition, third party BI tools such as Tableau are also supported.
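
As one concrete example of the orchestration mentioned above, a pipeline activity or an Azure Function can send the model a TMSL refresh command to load new data. The database name here is hypothetical:

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      { "database": "SalesModel" }
    ]
  }
}
```

A "full" refresh reloads all data and recalculates; other refresh types can be used to process only changed partitions.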

How are customers using Azure Analysis Services?

Since we launched the public preview of Azure Analysis Services last October, thousands of developers have been using it to build BI solutions. We want to thank all our preview customers for trying out the product and giving us valuable feedback. Based on this feedback, we have made several quality, reliability, and performance improvements to the service. In addition, we introduced Scale Up & Down and Backup & Restore to allow customers to better manage their BI solutions. We also introduced the B1, B2, and S0 tiers to offer customers more pricing flexibility.

Following are some customers and partners that have built compelling BI solutions using Azure Analysis Services.

Milliman is one of the world’s largest providers of actuarial and related products and services. They built a revolutionary, industry-first financial modeling product called Integrate, using Azure to run highly complex and mission-critical computing tasks in the cloud.

“Once the complex data movement and transformation processing is complete, the resulting data is used to populate a BI semantic model within Azure Analysis Services, that is easy to use and understand. Power BI allows users to quickly create and share data through interactive dashboards and reports, providing a rich immersive experience for users to visualize and analyze data in one place, simply and intuitively. The combination of Power BI and Azure Analysis Services enables users of varying skills and backgrounds to be able to deliver to the ever-growing BI demands needed to run their business and collaborate on mission critical information on any device.”

Paul Maher, Principal and CTO, Milliman Life Technology Solutions

“Another great use case for Azure Analysis Services is leveraging its powerful modeling capabilities to bring together numerous disparate corporate data sources. An initiative at Milliman is currently in design leveraging various Finance data sets in order to create a broader scope and more granular access to critical business information. Providing a cohesive and simple-to-access data source for all levels of users gives the business leaders a new tool – whether they use Excel or Power BI for their business analytics.”

Andreas Braendle, CIO, Milliman

Contidis is a company in Angola that is building the new Candando supermarket chain. They created a comprehensive BI solution using Power BI and Azure Analysis Services to help their employees deliver better customer service, uncover fraud, spot inventory errors, and analyze the effectiveness of store promotions.

“Since we implemented our Power BI solution with Azure Analysis Services and Azure SQL Data Warehouse, we’ve realized a big improvement in business insight and efficiency. Our continued growth is due to many factors, and Power BI with Azure Analysis Services is one of them.”

Renato Correia, Head of IT and Innovation, Contidis

DevScope is a Microsoft worldwide partner who is helping customers build solutions using Azure Analysis Services.

“One of the great advantages of using Azure Analysis Services and Power BI is that it gives us the flexibility to start small and scale up only as fast as we need to, paying only for the services we use. We also have a very dynamic security model with Azure Analysis Services and Azure Active Directory, and in addition to providing row-level security, we use Analysis Services to monitor report usage and send automated alerts if someone accesses a report or data record that they shouldn’t.”

Rui Romano, BI Team Manager, DevScope

Azure Analysis Services is now generally available in 14 regions across the globe: Southeast Asia, Australia Southeast, Brazil South, Canada Central, North Europe, West Europe, West India, Japan East, UK South, East US 2, North Central US, South Central US, West US, and West Central US. We will continue to add regions based on customer demand, including government and national clouds.

Please use the following resources to learn more about Azure Analysis Services, get your questions answered, and give us feedback and suggestions about the product.

Overview
Documentation
Pricing
MSDN forum
Ideas & suggestions

Join us at the Data Insights Summit (June 12-13, 2017) or at one of the user group meetings where you can hear directly from our engineers and product managers.
Source: Azure