New undersea cable expands capacity for Google APAC customers and users

Posted by Brian Quigley, Director, Google Networking Infrastructure

Google’s mission is to connect people to the world’s information by providing fast and reliable infrastructure. From data centers to cables under the sea, we’re dedicated to building infrastructure that reaches more people than ever before.

Today, we announced that we will work with Facebook, Pacific Light Data Communication and TE SubCom to build the first direct, ultra-high-capacity submarine cable system between Los Angeles and Hong Kong.

The Pacific Light Cable Network (PLCN) will have 12,800 km of fiber and an estimated cable capacity of 120 Tbps, making it the highest-capacity trans-Pacific route, surpassing the record currently held by another Google-backed cable system, FASTER. In other words, PLCN will provide enough capacity for Hong Kong to hold 80 million concurrent HD video conference calls with Los Angeles. Capacity like this is one example of how Google Cloud Platform offers the largest network backbone of any public cloud provider.

This is the sixth submarine cable in which Google has an ownership stake — joining the ranks of the Unity, SJC, FASTER, MONET and Tannat projects. We anticipate that PLCN will be operational in 2018.

From the get-go, PLCN is designed to accommodate evolving infrastructure technology, allowing us to independently choose network equipment and refresh optical technology as it advances. Most importantly, PLCN will bring lower latency, more security and greater bandwidth to Google users in the APAC region. In addition to our existing investments in APAC cloud regions and the FASTER cable system, PLCN expands our ability to serve people in Asia, including Google Cloud and G Suite customers.

Nei Hou, Hong Kong! We can’t wait to link up with you!
Source: Google Cloud Platform

Temporal Tables are generally available in Azure SQL Database

Temporal Tables allow you to track the full history of data changes directly in Azure SQL Database, without the need for custom coding. With Temporal Tables you can see your data as of any point in time in the past and use a declarative cleanup policy to control retention of the historical data.
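
As a quick illustration, a point-in-time query simply adds the FOR SYSTEM_TIME clause to an ordinary SELECT. The table, columns and timestamp below are hypothetical, chosen to match the retention example later in this post:

-- Minimal sketch of a point-in-time query against a hypothetical
-- system-versioned table; the timestamp and filter are illustrative.
SELECT Url, ClickCount
FROM dbo.WebSiteClicks
FOR SYSTEM_TIME AS OF '2016-09-01T00:00:00'
WHERE Url = '/products';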

When should you use Temporal Tables?

Quite often you may find yourself asking fundamental questions: How did important information look yesterday, a month ago, or a year ago? What changes have been made since the beginning of the year? What were the dominant trends during a specific period of time? Without proper support in the database, questions like these have never been easy to answer.
Temporal Tables are designed to improve your productivity when you develop applications that work with ever-changing data and when you want to derive important insights from the changes.
Use Temporal Tables to:

Support data auditing in your applications
Analyze trends or detect anomalies over time
Easily implement the slowly changing dimension pattern
Perform fine-grained row repairs in case of accidental data errors made by humans or applications
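
To enable any of these scenarios, a table only needs a pair of period columns and SYSTEM_VERSIONING turned on. Here is a minimal sketch; the table and column definitions are hypothetical, reusing the WebSiteClicks name from the retention example below:

-- Minimal sketch of creating a system-versioned (temporal) table.
-- All names and column types here are illustrative.
CREATE TABLE dbo.WebSiteClicks
(
    ClickId INT NOT NULL PRIMARY KEY CLUSTERED,
    Url NVARCHAR(400) NOT NULL,
    ClickCount INT NOT NULL,
    SysStartTime DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL,
    SysEndTime DATETIME2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (SysStartTime, SysEndTime)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.WebSiteClicks_History));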

Manage historical data with an easy-to-use retention policy

Keeping a history of changes tends to increase database size, especially if historical data is retained for a long period of time. Hence, a retention policy for historical data is an important aspect of planning and managing the lifecycle of every temporal table. Temporal Tables in Azure SQL Database come with an extremely easy-to-use retention mechanism: applying a retention policy requires setting a single parameter during table creation or a table schema change, as shown in the following example.

ALTER TABLE [WebSiteClicks]
SET
(
    SYSTEM_VERSIONING = ON
    (
        HISTORY_TABLE = dbo.WebSiteClicks_History,
        HISTORY_RETENTION_PERIOD = 3 MONTHS
    )
);

You can alter retention policy at any moment and your change will be effective immediately.
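
For instance, the same statement can switch the hypothetical table above to unlimited retention (INFINITE is also the default when no retention period is set):

-- Illustrative change: keep history indefinitely.
ALTER TABLE [WebSiteClicks]
SET
(
    SYSTEM_VERSIONING = ON
    (
        HISTORY_TABLE = dbo.WebSiteClicks_History,
        HISTORY_RETENTION_PERIOD = INFINITE
    )
);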

Why should you consider Temporal Tables?

If you have requirements for tracking data changes, using Temporal Tables will give you multiple benefits over any custom solution. Temporal Tables will simplify every phase in the development lifecycle: object creation, schema evolution, data modification, point-in-time analysis and data aging.

Next steps

To learn how to integrate Temporal Tables into your application, read the article with step-by-step instructions. To utilize temporal retention, check out the Manage temporal history with retention policy article on Azure.com.
Visit Channel 9 to hear a real customer story and watch a live presentation with a demo. For more information, check out the MSDN documentation.
Source: Azure

Azure Stream Analytics query testing now available in the new portal

Azure Stream Analytics is a fully managed service allowing you to gain insights and run analytics in near real-time on your big data streaming workloads. The service was first deployed more than 2 years ago, long before the “new” Azure management portal, http://portal.azure.com, even existed.

For the past few months we’ve been hard at work adding exciting new features to the service, as well as transitioning the management user interface from the old https://manage.windowsazure.com to the new portal.

Today we want to announce that we’ve just added the ability to test queries in the “new” portal without needing to start or stop a job. Here’s a quick look at how this works.

Setup

You can set up a Stream Analytics job by following this simple tutorial: How to create a Stream Analytics job.

Once you have created a new Stream Analytics job, you would typically Create Inputs and then Create Outputs. Or you can skip ahead to building the query and, once your query is working, go back and define the Inputs and Outputs to match those used in the query. Both approaches work, giving you the flexibility to decide how you wish to work.

For the purposes of this blog post I have defined a job with one data stream input, called StreamInput, and one output, called Output. You can see these in the query editor blade above.

Open the Query editor blade from the job details screen by clicking on the query in the “Query” lens, or, in our case, the < > placeholder, because there is no query yet.

You will be presented with the rich editor, as before, where you create your query. This blade has now been enhanced with a new pane on the left. This new pane shows the Inputs and Outputs used by the query, as well as those defined for this job.

There are also one additional Input and one additional Output shown that I did not define. These come from the query template that every new job starts with. They will change, or even disappear altogether, as we edit the query, so you can safely ignore them for now.
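
For reference, the template is a simple pass-through query along the following lines (the exact placeholder aliases may vary):

-- Default pass-through query template with placeholder aliases.
SELECT
    *
INTO
    [YourOutputAlias]
FROM
    [YourInputAlias]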

A key requirement, and a common ask from our customers, is being able to test a query, and test it often, to ensure that the output is what it is expected to be for a given set of input data. Having to save the query after every edit, start the job, wait for incoming data, check the results, and then stop the job again for each small change to the query would be slow, and is sometimes not even possible. A way to test changes to a query quickly was needed.

I am happy to announce that with today’s latest release in the portal you can now test the query without going through this stop/start process. Here’s how…

Sample data and testing queries

To test with sample input data, right-click on any of your Inputs and choose Upload sample data from file.

Once the upload completes, you can use the Test button to test this query against the sample data you have just provided.

The output of your query is displayed in the browser, with a link to Download results should you wish to save the test output for later use. You can now easily and iteratively modify your query, and test repeatedly to see how the output changes.

In the diagram above you can see how I have changed the query inline to add a second output, called HighAvgTempOutput, to which I am writing only a subset of the data being received.
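
A sketch of that two-output query might look like the following; the Temperature field, window size and threshold are hypothetical stand-ins for whatever is in your sample data:

-- First statement: pass all incoming events to the main output.
SELECT *
INTO [Output]
FROM [StreamInput]

-- Second statement: write only windows with a high average temperature
-- to the second output. Field name and threshold are illustrative.
SELECT System.Timestamp AS WindowEnd, AVG(Temperature) AS AvgTemperature
INTO [HighAvgTempOutput]
FROM [StreamInput]
GROUP BY TumblingWindow(second, 30)
HAVING AVG(Temperature) > 75
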
With multiple outputs used in a query you can see the results for both outputs separately and easily toggle between them.
Once you are happy with the results in the browser, then you can save your query, start your job, sit back and watch the magic of Stream Analytics happen for you.

Feature Parity and the road ahead

With the long-awaited addition of sample data and query testing in the new portal, we are happy to announce that we have reached feature parity between the portals. Everything you could do before, and more, is now in the new portal. Going forward, all new development efforts will be concentrated on the new portal. The old portal will continue to work, and existing functionality will remain, until the end of the calendar year, when we plan to completely retire support for Stream Analytics in the old portal.
If you have not tried Stream Analytics in the new portal, we encourage you to head over and give it a try.

Next Steps

We’re really excited to bring query testing to the new portal and to take this final step toward feature parity across the two portals. We hope this makes your life much easier as you go about developing (and testing) your queries.

We invite you to provide feedback on our User Voice page about what you want added next to the service!

If you are new to either Microsoft Azure or Stream Analytics, try it out by signing up for a free Azure trial account and create your first Stream Analytics job.

If you need help or have questions, please reach out to us through the MSDN or Stack Overflow forums, or email the product team directly.
Source: Azure