ADF v2: Visual Tools enabled in public preview

ADF v2 public preview was announced at Microsoft Ignite on Sep 25, 2017. With ADF v2, we added flexibility to the ADF app model and enabled control flow constructs that facilitate looping, branching, conditional constructs, on-demand execution, and flexible scheduling across programmatic interfaces such as Python, .NET, PowerShell, the REST API, and ARM templates. One consistent piece of customer feedback we received was to enable a rich, interactive visual authoring and monitoring experience that lets users create, configure, test, deploy, and monitor data integration pipelines without friction. We listened to your feedback and are happy to announce the release of visual tools for ADF v2. The main goal of the ADF visual tools is to let you be productive with ADF by getting pipelines up and running quickly, without writing a single line of code. You can use a simple, intuitive, code-free interface to drag and drop activities onto a pipeline canvas, perform test runs, debug iteratively, and deploy and monitor your pipeline runs. With this release, we also provide guided tours on how to use the visual authoring and monitoring features, as well as the ability to give us valuable feedback.
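
For readers who prefer the programmatic interfaces, here is a rough sketch of what creating a pipeline looks like with the Python SDK. This is a minimal illustration only, assuming the azure-mgmt-datafactory package; the credentials, resource names, and datasets are placeholders and are not part of the original announcement.

```python
# Minimal sketch: create an ADF v2 pipeline with a single copy activity (names are placeholders).
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource, CopyActivity, DatasetReference, BlobSource, BlobSink
)

credentials = ServicePrincipalCredentials(
    client_id="<app-id>", secret="<app-secret>", tenant="<tenant-id>"
)
adf_client = DataFactoryManagementClient(credentials, "<subscription-id>")

# A copy activity that moves data between two blob datasets assumed to exist already.
copy = CopyActivity(
    name="CopyBlobToBlob",
    inputs=[DatasetReference(reference_name="InputBlobDataset")],
    outputs=[DatasetReference(reference_name="OutputBlobDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)

adf_client.pipelines.create_or_update(
    "<resource-group>", "<factory-name>", "CopyPipeline",
    PipelineResource(activities=[copy]),
)
```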

Our goal with visual tools for ADF v2 is to increase productivity and efficiency for both new and advanced users with intuitive experiences. You can get started by clicking the Author & Monitor tile in your provisioned v2 data factory blade.

 

Check out some of the exciting features enabled with the new visual tools in ADF v2. You can also watch the short video below.

 

Get Started Quickly

Create your first ADF v2 pipeline

Quickly copy data from numerous data sources using the Copy Wizard

Configure SSIS IR to lift and shift SSIS packages to Azure

Set up a code repo (VSTS GIT) for source control, collaboration, versioning, and more

Visual Authoring

Author Control Flow Pipelines

Create pipelines, drag and drop activities, and connect them on success, on failure, or on completion.

Create Azure & Self-Hosted Integration Runtimes

Create a self-hosted integration runtime for hybrid data movement, or an Azure-SSIS IR for lifting and shifting SSIS packages to Azure. Create linked service connections to your data stores or compute.

Support for all control flow activities running on Azure computes

Control Flow Activities:

HDInsight Hive, HDInsight Pig, HDInsight MapReduce, HDInsight Streaming, HDInsight Spark, U-SQL, Stored Procedure, Web, ForEach, Get Metadata, Lookup, Execute Pipeline

Support for Azure Computes:

HDI (on-demand, BYOC), ADLA, Azure Batch

Iterative development and debugging

Do test runs before attaching a trigger to the pipeline and running it on demand or on a schedule.

Parameterize pipelines and datasets

Parameterize using expressions and system variables.

Rich Validation Support

You can now validate your pipelines to catch missing or incorrect property configurations. Simply click the Validate button on the pipeline canvas. This generates the validation output in a side drawer; you can then click each entry to go straight to the location of the validation issue.

Trigger pipelines

Trigger on-demand, run pipelines on schedule.
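
Continuing the hedged Python SDK sketch from earlier, an on-demand run with parameter values might look roughly like this (the resource names and parameters are placeholders):

```python
# Sketch: trigger an on-demand pipeline run and check its status (names are placeholders).
run = adf_client.pipelines.create_run(
    "<resource-group>", "<factory-name>", "CopyPipeline",
    parameters={"inputFolder": "raw/2018/02", "outputFolder": "curated/2018/02"},
)

status = adf_client.pipeline_runs.get("<resource-group>", "<factory-name>", run.run_id)
print(status.status)  # e.g. InProgress, Succeeded, Failed
```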

Use VSTS GIT

VSTS GIT for source control, collaboration, versioning, etc.

Copy Data

Data Stores (65)

Support for 65 data stores: 18 stores with first-class support that require users to provide just configuration values, and 47 more that can be used with JSON.

18 stores with first class support:

Azure Blob, Azure CosmosDB, Azure Database for MySQL, Azure Data Lake Store, Amazon Redshift, Amazon S3, Azure SQL DW, Azure SQL, Azure Table, File System, HDFS, MySQL, ODBC, Oracle, Salesforce, SAP HANA, SAP BW, SQL Server

47 Stores with JSON support:

Search Index, Cassandra, HTTP file, Mongo DB, OData, Relational table, Dynamics 365, Dynamics CRM, Web table, AWS Marketplace, PostgreSQL, Concur, Couchbase, Drill, Oracle Eloqua, Google Big Query, Greenplum, HBase, Hive, HubSpot, Apache Impala, Jira, Magento, MariaDB, Marketo, PayPal, Phoenix, Presto, QuickBooks, ServiceNow, Shopify, Spark, Square, Xero, Zoho, DB2, FTP, GE Historian, Informix, Microsoft Access, MongoDB, SAP Cloud for customer

Use the Copy Wizard to quickly copy data from numerous data sources

The familiar ADF v1 Copy Wizard is now available in ADF v2 for quick, one-time imports. The Copy Wizard generates pipelines with copy activities on the authoring canvas. These copy activities can then be extended with other activities such as Spark, U-SQL, or stored procedures running on success, on failure, and so on, to build out the entire control flow pipeline.
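
To make the extension idea concrete, here is a hedged continuation of the earlier Python SDK sketch that chains a stored procedure activity to run only when the copy activity succeeds; the activity, procedure, and linked service names are placeholders.

```python
# Sketch: chain a stored procedure activity after the copy activity on success (names are placeholders).
from azure.mgmt.datafactory.models import (
    SqlServerStoredProcedureActivity, ActivityDependency, LinkedServiceReference
)

post_step = SqlServerStoredProcedureActivity(
    name="MarkLoadComplete",
    stored_procedure_name="usp_mark_load_complete",
    linked_service_name=LinkedServiceReference(reference_name="AzureSqlLinkedService"),
    depends_on=[ActivityDependency(activity="CopyBlobToBlob",
                                   dependency_conditions=["Succeeded"])],
)

adf_client.pipelines.create_or_update(
    "<resource-group>", "<factory-name>", "CopyPipeline",
    PipelineResource(activities=[copy, post_step]),
)
```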

 

Guided tour

Click the Information icon in the lower left, then click Guided tour to get step-by-step instructions on how to visually monitor your pipeline and activity runs.

Feedback

Click on the Feedback icon to give us feedback on various features or any issues that you may be facing.

Select data factory

Hover over the Data Factory icon at the top left. Click the arrow icon to see a list of Azure subscriptions and data factories that you can monitor.

Visual Monitoring

List View Monitoring

Monitor pipeline, activity & trigger runs with a simple list view interface. All runs are displayed in the local browser time zone. You can change the time zone, and all the date-time fields will snap to the selected time zone.

Monitor Pipeline Runs:

List view showcasing each pipeline run for your data factory v2 pipelines.

Monitor Activity Runs:

List view showcasing the activity runs corresponding to each pipeline run. Click the Activity Runs icon under the Actions column to view the activity runs for each pipeline run.

Important note: You need to click the Refresh icon at the top to refresh the list of pipeline and activity runs. Auto-refresh is currently not supported.

Monitor Trigger Runs:

Rich ordering and filtering

Order pipeline runs in descending or ascending order by Run Start, and filter pipeline runs by pipeline name, run start, and run status.

Add/Remove columns to list view

Right click the list view header and choose columns that you want to appear in the list view.

Resize column widths in list view

Increase and decrease the column widths in list view by simply hovering over the column header.

Monitor Integration Runtimes

Monitor the health of your self-hosted, Azure, and Azure-SSIS integration runtimes.

Cancel/Re-run your pipeline runs

Cancel a pipeline run or re-run a pipeline run with already defined parameters.

This is the first public release of the ADF v2 visual tools. We are continuously working to refresh the released bits with new features based on customer feedback. Get more information and detailed steps for using the ADF v2 visual tools.

Get started building pipelines easily and quickly using Azure Data Factory. If you have any feature requests or want to provide feedback, please visit the Azure Data Factory forum.
Source: Azure

3 key ideas to help drive compliance in the cloud

Deploying critical data and workloads in a cloud environment can drive numerous benefits, such as reduced costs and faster time to market for products and services.
When designing a strategy for regulatory compliance in cloud deployments, however, IT leaders must first make some big decisions.
For example, the choice of public, private or hybrid cloud may depend on whether your business is risk tolerant of sharing at the hypervisor level or if it requires dedicated physical servers. Also, how will your compliance strategy affect recovery and business continuity in the event of a disaster?
The CIA triad
To help navigate these decisions, start with the basics: three key components make up an effective strategy for information security.
I call it the “CIA triad.” CIA stands for:

Confidentiality through preventing access by unauthorized users.
Integrity from validating that your data is trustworthy and accurate.
Availability by ensuring data is available when needed.

Technology, procedures and auditing
I recommend a three-pronged approach to designing a compliance strategy that addresses each area of the triad.
The first prong is technology. An effective cloud infrastructure should include controls that enable you to manage user access to the environment, using software-defined architecture such as virtual or host-based firewalls to isolate, segment and protect data. The infrastructure should also help meet availability targets for critical data with service-level agreements (SLAs) that go up to the application layer.
The second prong consists of procedures and processes for successfully implementing this technology. This includes the use of operational plans and metrics to achieve the strategic and organizational goals set forth by management. These procedures should define the roles of each team member and outline security policies to help ensure the confidentiality of the data.
Once your infrastructure and procedures are in place, it’s a good idea to work with a third party who can audit your environment and policies. This auditing process should help determine what control framework will be used and an approach to validating successful implementation. A qualified auditor can also identify compliance practices that align with the core business. For example, if e-retail is a core business function, then Payment Card Industry (PCI) standards should be considered.
Compliance on IBM Cloud session at Think 2018
At Think 2018, I will host a Think Tank session to dive deeper into these topics, discussing how IBM Cloud can help businesses meet industry and regulatory compliance requirements such as PCI, FedRAMP and HIPAA.
Along with Barbara Davis, offering manager for managed hosting and application services, I will highlight ways to deploy SAP data and applications more efficiently in a managed cloud environment. To join our conversation, go to the Think 2018 website to register for the event and enroll in the session.
Learn more about Cloud Managed Application Services.
Source: Thoughts on Cloud

Accelerate your business revolution with IoT in Action

There’s a revolution underway that is positioning companies to take operational efficiency to new levels and inform the next generation of products and services. This revolution, of course, is the Internet of Things (IoT).

Here at Microsoft, we’re committed to helping our customers harness the power of IoT through our Azure IoT solutions, and to helping them take the first steps through our IoT in Action event series. The next event takes place on February 13, 2018, in San Francisco, and I’d encourage you to attend.

But first, I’d like to introduce you to some recent updates to Azure IoT Suite that are making IoT solutions easier and more robust than ever.

Azure IoT powers the business revolution

With our long history of driving business success and digital transformation for our customers, it’s no surprise that we’re also focused on powering the business revolution through our robust Azure IoT suite of products.

So how does Azure IoT benefit businesses?

First off, it’s a quick and scalable solution. Our preconfigured solutions can accelerate your development process, so you can get up and running quickly. You can connect existing devices and add new ones using our device SDKs for platforms including Linux, Windows, and real-time operating systems. Scaling is easy, whether you want to add a few devices or a million.
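
As a rough illustration only, connecting a device and sending telemetry with the Python device SDK could look like the sketch below. This assumes the current azure-iot-device package rather than the SDK that was current at the time of this post, and the connection string is a placeholder.

```python
# Sketch: connect a device to IoT Hub and send one telemetry message (connection string is a placeholder).
import json
from azure.iot.device import IoTHubDeviceClient, Message

conn_str = "HostName=<your-hub>.azure-devices.net;DeviceId=<device-id>;SharedAccessKey=<key>"
client = IoTHubDeviceClient.create_from_connection_string(conn_str)
client.connect()

# In a real device this would run inside the sensor/reporting loop.
payload = {"temperature": 21.5, "humidity": 60}
client.send_message(Message(json.dumps(payload)))

client.disconnect()
```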

Azure IoT Suite can easily integrate with your existing systems and applications like Salesforce, SAP, and Oracle. You can also enhance security by setting up individual identities and credentials for each of your connected devices. Plus, Azure comes complete with built-in artificial intelligence and machine learning.

Watch the following interview with Sam George, Director of Azure IoT at Microsoft, to learn how Azure IoT is accelerating the digital transformation for businesses.

So, what’s new with Azure IoT?

Microsoft continues to evolve its suite to offer you the world’s best IoT technology. Here are three notable releases that are smoothing the road to IoT.

Microsoft IoT Central

This highly scalable SaaS solution was recently released for public preview. It delivers a low-code way for companies to build production-grade IoT applications in hours without needing to manage backend infrastructure or hire specialized talent. Features include device authentication, secure connectivity, extensive device SDKs with multi-language support, and native support for IoT protocols. Learn more about Microsoft IoT Central.

Azure IoT Hub

Use the Azure IoT Hub to connect, monitor, and manage billions of IoT assets. The hub enables you to communicate securely with all your things, set up identities and credentials for individual connected devices, and quickly register devices at scale with our provisioning service. Learn more about Azure IoT Hub Device Provisioning Service.
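
For illustration, registering a device at scale through the Device Provisioning Service might look roughly like this, assuming the azure-iot-device Python package and symmetric-key attestation; the registration ID, ID scope, and key are placeholders.

```python
# Sketch: register a device through the Device Provisioning Service (symmetric-key attestation assumed).
from azure.iot.device import ProvisioningDeviceClient

provisioning_client = ProvisioningDeviceClient.create_from_symmetric_key(
    provisioning_host="global.azure-devices-provisioning.net",
    registration_id="<registration-id>",
    id_scope="<id-scope>",
    symmetric_key="<device-symmetric-key>",
)

result = provisioning_client.register()
if result.status == "assigned":
    # The device can now connect to the IoT hub it was assigned to.
    print("Assigned to hub:", result.registration_state.assigned_hub)
```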

Azure Stream Analytics on IoT Edge

This real-time analytics service is now available for your edge devices. Shifting cloud analytics and custom business logic closer to your devices, where the data is produced, is a great solution for customers who need low latency, resiliency, and efficient use of bandwidth. It also enables organizations to focus more on business insights and less on data management. Learn more about Azure Stream Analytics on IoT Edge.

Register for IoT in Action

To learn more about how Azure IoT can help you accelerate your business revolution, attend IoT in Action in San Francisco on February 13.

Get expert insights from IoT industry pioneers like James Whittaker and Sam George. Learn how to unlock the intelligent edge with Azure IoT. Take an in-depth exploration of two Microsoft approaches to building IoT solutions, Azure PaaS and SaaS. Find out how to design and build a cloud-powered AI platform with Microsoft Azure + AI. Plus, connect with partners who can help you take your IoT solution from concept to reality.

Register for this free one-day event today; space is limited.
Source: Azure

Compatibility Level 140 is now the default for Azure SQL Database

Database Compatibility Level 140 is now the default for new databases created in Azure SQL Database across almost all regions. At this point, 539,903 databases in Azure SQL Database are already running at Compatibility Level 140.

Frequently asked questions related to this announcement:

Why move to database Compatibility Level 140?

The biggest change is the enabling of the adaptive query processing feature family, but there are also query-processing-related fixes and batch mode improvements. For details on what Compatibility Level 140 specifically enables, see the blog post Public Preview of Compatibility Level 140 for Azure SQL Database.

What do you mean by "database Compatibility Level 140 is now the default"?

If you create a new database and don’t explicitly designate COMPATIBILITY_LEVEL, the database Compatibility Level 140 will be used.

Does Microsoft automatically update the database compatibility level for existing databases?

No, we do not update database compatibility level for existing databases. This is up to customers to do at their own discretion. With that said, we highly recommend customers plan on moving to the latest compatibility level in order to leverage the latest improvements.

My application isn’t certified for database Compatibility Level 140 yet. For this scenario, what should I do when I create new databases?

For this scenario, we recommend that database configuration scripts explicitly designate the application-supported COMPATIBILITY_LEVEL rather than rely on the default.
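
For example, a deployment script might pin the level explicitly. The following is a rough sketch using pyodbc (an assumption, not part of the original post); the connection details are placeholders, and 130 stands in for whatever level your application is certified for.

```python
# Sketch: explicitly pin the database compatibility level instead of relying on the default.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<your-server>.database.windows.net;DATABASE=<your-db>;"
    "UID=<user>;PWD=<password>",
    autocommit=True,
)
cursor = conn.cursor()

# Pin the level the application is certified for (130 here as an example).
cursor.execute("ALTER DATABASE CURRENT SET COMPATIBILITY_LEVEL = 130;")

# Verify the setting.
cursor.execute("SELECT name, compatibility_level FROM sys.databases WHERE name = DB_NAME();")
print(cursor.fetchone())
conn.close()
```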

I created a logical server before 140 was the default database compatibility level. What impact does this have?

The master database of your logical server will reflect the database compatibility level that was the default at the time the logical server was created. New databases created on a logical server whose master database has an older compatibility level will still use database Compatibility Level 140 if not explicitly designated. The master database's compatibility level cannot be changed without recreating the logical server. Having master at an older database compatibility level will not impact user database behavior.

I would like to change to the latest database compatibility level, any best practices for doing so?

For pre-existing databases running at lower compatibility levels, the recommended workflow for upgrading the query processor to a higher compatibility level is detailed in the article Change the Database Compatibility Mode and Use the Query Store. Note that this article refers to compatibility level 130 and SQL Server, but the same methodology applies for moves to 140 for SQL Server and Azure SQL DB.
Source: Azure