State of the Word 2022

It’s almost time for State of the Word 2022! Join us for this live stream event on December 15th at 1pm ET.

State of the Word is the annual keynote address delivered by the WordPress project’s co-founder, Matt Mullenweg. Every year, the event shares reflections on the project’s progress and the future of open source. Expect this and more in this year’s edition.

This year’s event will take place in person in New York City and will be live-streamed via various WordPress.org social media platforms.

Join Matt as he provides a retrospective of 2022, covering the latest WordPress releases, Site Editor advancements, and the return of in-person events around the globe, among other topics.

How to Watch Live

What: State of the Word 2022

When: December 15, 2022, 1:00–2:30 p.m. EST (18:00–19:30 UTC)

How: The live stream will be embedded in this post at the time of the event and will also be available through the WordPress YouTube channel. Additionally, there are a number of locally organized watch parties happening around the world if you’d like to watch it in the company of other WordPressers.

Don’t worry, we’ll post the recorded event early next week if you aren’t able to catch it live.
Source: RedHat Stack

Minimal Downtime Migrations to Cloud Spanner with HarbourBridge 2.0

Spanner is a fully managed, strongly consistent, and highly available database providing up to 99.999% availability. It is also very easy to create your Spanner instance and point your application to it. But what if you want to migrate your schema and data from another database to Cloud Spanner? The common challenges with database migrations are ensuring high throughput of data transfer and high availability of your application with minimal downtime, all of it enabled by a user-friendly migration solution. Today, we are excited to announce the launch of HarbourBridge 2.0 (Preview) – an easy-to-use open source migration tool, now with enhanced capabilities for schema and data migrations with minimal downtime. This blog demonstrates the migration of schema and data for an application from MySQL to Spanner using HarbourBridge.

About HarbourBridge

HarbourBridge is an easy-to-use open source tool that gives you highly detailed schema assessments and recommendations and allows you to perform migrations with minimal downtime. It lets you point, click, and trigger your schema and data migrations. It provides a unified interface for the migration, giving users the flexibility to modify the generated Spanner schema and run an end-to-end migration from a single place. It offers capabilities for editing table details such as columns, primary keys, foreign keys, and indexes, and it provides insights on schema conversion performance while highlighting important issues and suggestions.

What’s new in HarbourBridge 2.0?

With this recent launch, you can now do the following:

Perform end-to-end, minimal-downtime, terabyte-scale data migrations
Get improved schema assessments and recommendations
Experience ease of access with gcloud integration

We’ll experience the power of some of these new add-ons as we walk through the application migration scenario in this blog.

Types of Migration

Data migration with HarbourBridge is of two types:

Minimal downtime migration
Migration with downtime

Minimal downtime migration is for real-time transactions and incremental updates in business-critical applications, ensuring business continuity with very minimal interruption. Migration with downtime is recommended only for POCs, test environment setups, or applications that can tolerate a few hours of downtime.

Connecting HarbourBridge to the source

There are three ways to connect HarbourBridge to your source database:

Direct connection to the database – for minimal downtime and continuous data migration over a certain time period
Data dump – for a one-time migration of the source database dump into Spanner
Session file – to load from a previous HarbourBridge session

Migration components of HarbourBridge

With HarbourBridge you can choose to migrate:

Schema only
Data only
Both schema and data

The image below shows, at a high level, the components involved behind the scenes for data migration. To manage a low-downtime migration, HarbourBridge orchestrates the following processes for you. You only have to set up connection profiles from the HarbourBridge UI on the migration page; everything else is handled by HarbourBridge under the hood:

HarbourBridge sets up a Cloud Storage bucket to store incoming change events on the source database while the snapshot migration progresses.
HarbourBridge sets up a Datastream job to bulk load a snapshot of the data and stream incremental writes.
HarbourBridge sets up a Dataflow job to migrate the change events into Spanner, which empties the Cloud Storage bucket over time.

You then validate that most of the data has been copied over to Spanner and stop writing to the source database so that the remaining change events can be applied. This results in a short downtime while Spanner catches up to the source database. Afterward, the application can be cut over to use Spanner as the main database.

The application

The use case we created to demonstrate this migration is an application that streams in live (near real-time) T20 cricket match data ball by ball and calculates the Duckworth-Lewis target score (also known as the par score) for Team 2 in the second innings, in case the match is disrupted mid-innings due to rain or other circumstances. The target is calculated using the famous Duckworth-Lewis-Stern (DLS) algorithm and is updated for every ball in the second innings; that way we always know the winning target in case the match gets interrupted and is not continued thereafter. There are several scenarios in cricket that use the DLS algorithm to determine the target or winning score; a toy illustration of the par-score idea follows.
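
To make the par-score concept concrete, here is a minimal, illustrative Python sketch of the standard-edition formula. It is not the application’s actual Cloud Function (which is written in Java), and the resource percentages below are placeholder values standing in for the real resource table (such as STD_DLS_RESOURCE):

    # Toy sketch of a Duckworth-Lewis-Stern (standard edition) par score.
    # The resource table below is an illustrative excerpt only; the real
    # application reads resource percentages from the STD_DLS_RESOURCE table.
    RESOURCES = {
        # (overs_remaining, wickets_lost) -> % of batting resources left
        (50, 0): 100.0,
        (30, 2): 67.3,
        (20, 5): 40.0,  # placeholder value
        (10, 7): 17.9,
    }

    def par_score(team1_score: int, overs_remaining: int,
                  wickets_lost: int, team1_resources: float = 100.0) -> int:
        """Par score for Team 2 at the point its innings is interrupted."""
        used = team1_resources - RESOURCES[(overs_remaining, wickets_lost)]
        return round(team1_score * used / team1_resources)

    # Example: Team 1 scored 250 using full resources; rain stops Team 2
    # with 20 overs remaining and 5 wickets down.
    print(par_score(250, 20, 5))  # 250 * (100 - 40) / 100 = 150
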
MySQL Database

In this use case, we are using Cloud SQL for MySQL to house the ball-by-ball data being streamed in. The DLS target client application streams data into MySQL database tables, which will be migrated to Spanner.

Application Migration Architecture

In this migration, our source data is sent in bulk and in streaming mode to the MySQL table that is the source of the migration. A Cloud Functions Java function simulates the ball-by-ball streaming, calculates the Duckworth-Lewis target score, and updates it in the baseline table. HarbourBridge reads from MySQL and writes the schema and data into Cloud Spanner. The diagram below represents a high-level architectural overview of the migration process.

Note: In our case the streaming process is simulated with data coming from a CSV into a landing table in MySQL, which then streams match data by pushing data row by row to the baseline MySQL table. This is the table used for further updates and DLS target calculations.

Migrating MySQL to Spanner with HarbourBridge

Set up HarbourBridge

Run the following two gcloud commands in Cloud Shell on the Google Cloud console.

Install the HarbourBridge component of gcloud:

    gcloud components install harbourbridge

Start the HarbourBridge UI:

    gcloud alpha spanner migration web

Your HarbourBridge application should now be up and running. Before proceeding with the migration:

Remember to enable the Datastream and Dataflow APIs from the Google Cloud console.
Ensure you have Cloud SQL for MySQL (or your own MySQL server) created as the source and a Spanner instance created as the target.
Ensure all source database instance objects are created. For access to the DB DDLs, DMLs, and the data CSV file, refer to this git repo folder.
For data validation (a post-migration step), SELECT queries for both the source and Spanner are in this git repo folder.
Ensure the Cloud Function is created and deployed (for streaming simulation and DLS target score calculation). For the source code, refer to the git repo folder. You can learn how to deploy a Java function to Cloud Functions here.
Also make sure your proxy is set up and running when connecting to the source from HarbourBridge. If you are using Cloud SQL for MySQL, you can check that the proxy is running by executing the following command in Cloud Shell, and with a quick connection test as sketched after this list:

    ./cloud_sql_proxy -instances=<<Project-id:Region:instance-name>>=tcp:<<3306>>
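
As a quick sanity check that the proxy is accepting connections before pointing HarbourBridge at it, a hedged sketch like the following can help; the credentials are placeholders, and the mysql-connector-python package is assumed:

    # Hypothetical connectivity smoke test against the Cloud SQL Auth proxy.
    # Requires: pip install mysql-connector-python
    import mysql.connector

    conn = mysql.connector.connect(
        host="127.0.0.1",       # the proxy listens locally
        port=3306,              # the tcp port passed to cloud_sql_proxy
        user="root",            # placeholder credentials
        password="password",
        database="cricket_db",  # the source database in this walkthrough
    )
    cursor = conn.cursor()
    cursor.execute("SELECT VERSION()")
    print(cursor.fetchone())    # a version string confirms connectivity
    conn.close()
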
Connect to the source

Of the three modes of connecting to the source, we will use the “Connect to database” method to establish the connection. Provide the connection credentials and hit connect. You are now connected to the source, and HarbourBridge takes you to the next step of the migration.

Schema Assessment and Configuration

At this point, you see both the source (MySQL) version of the schema and the target draft version on the “Configure Schema” page. The target draft version is the workspace for all edits you can perform on the schema of your destination database, that is, Cloud Spanner. HarbourBridge provides comprehensive assessment results and recommendations for improving the schema structure and performance:

As you can see in the image above, the icons to the left of each table represent the complexity of the conversion changes required as part of the schema migration.
In this case, the STD_DLS_RESOURCE table requires high-complexity conversion changes, whereas the other tables require minimal changes.
The recommendations on the right provide information about the storage requirements of specific columns; other warnings are indicated in the columns list as well.
You can change column types at this point.
Changes and suggestions related to primary keys, foreign keys, interleaved tables, indexes, and other dependencies are also available.
Once you make changes to the schema, HarbourBridge lets you review the DDL and confirm the changes.
Once you confirm, the schema changes are saved and take effect before the migration is triggered.

Prepare Migration

Click the “Prepare Migration” button in the top right corner of the HarbourBridge page.

1. Select the migration mode “Schema and Data”.
2. Select the migration type “Minimal Downtime Migration”.
3. Set up the target Cloud Spanner instance. Note: The HarbourBridge UI supports only the GoogleSQL dialect as a Spanner destination today. Support for the PostgreSQL dialect will be added soon.
4. Set up the source connection profile. This is your connection to the MySQL data source. Ensure the IP addresses displayed on the screen are allow-listed by your source.
5. Set up the target connection profile. This is the connection to your Datastream job destination, which is Cloud Storage. Select the instance and make sure you have allow-listed the necessary access.

Once done, hit Migrate at the bottom of the page and wait for the migration to start. HarbourBridge takes care of everything else, including setting up the Datastream and Dataflow jobs and executing them under the hood. You have the option to set these up on your own, but that is no longer necessary with the latest launch of HarbourBridge.

Wait until you see the message “Schema migration completed successfully” on the same page. Once you see it, head over to your Spanner database to validate the newly created (migrated) schema.

Validate Schema and Initial Data

Connect to the Spanner instance and head over to the database “cricket_db”. You should see the tables and the rest of the schema migrated over to the Spanner database. If you prefer to check from a script rather than the console, a quick query of Spanner’s INFORMATION_SCHEMA works too, as sketched below.
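
Here is a minimal sketch of that check using the google-cloud-spanner client library; the instance ID is a placeholder:

    # Minimal sketch: list migrated tables via Spanner's INFORMATION_SCHEMA.
    # Requires: pip install google-cloud-spanner
    from google.cloud import spanner

    client = spanner.Client()  # uses Application Default Credentials
    database = client.instance("my-instance").database("cricket_db")

    with database.snapshot() as snapshot:
        rows = snapshot.execute_sql(
            "SELECT table_name FROM information_schema.tables "
            "WHERE table_schema = ''"  # '' selects user-created tables
        )
        for (table_name,) in rows:
            print(table_name)
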
Set up Streaming Data

After the initial data is migrated, trigger the Cloud Function to kickstart data streaming into MySQL.

Validate Streaming Data

Check that the streaming data eventually migrates into Spanner as the streaming happens. The Cloud Function (a Java function) can be triggered by hitting the HTTPS URL in the Trigger section of the function’s detail page. Once the streaming starts, you should see data flowing into MySQL and the target DLS score for innings 2 getting updated in the DLS table. In the image above, you can see the record count go from 1705 to 1805 with the streaming. Also, the DLS target field has a calculated value of 112 for the most recent ball.

Now let’s check whether the Spanner database table received the updates during migration. Go to the Spanner table and query it. As you can see, Spanner’s record count increases as the migration proceeds. Also note the change in the target score field value ball after ball. Wait until you see all the changes migrated over.

For data validation, you can use the Data Validation Tool (DVT), a standardized data validation method built by Google that can be incorporated into existing Google Cloud tools and technologies. In our use case, we validated the migration of the initial set of records from the MySQL source to the Spanner table using Cloud Spanner queries, along the lines of the sketch below.
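
As a lightweight stand-in for DVT, a hedged sketch like this one compares row counts on both sides; the table name, credentials, and instance ID are placeholders:

    # Hypothetical row-count comparison between the MySQL source and Spanner.
    # Requires: pip install mysql-connector-python google-cloud-spanner
    import mysql.connector
    from google.cloud import spanner

    TABLE = "DLS_SCORES"  # placeholder table name

    # Source count, through the Cloud SQL Auth proxy on localhost.
    mysql_conn = mysql.connector.connect(
        host="127.0.0.1", port=3306, user="root",
        password="password", database="cricket_db")
    cursor = mysql_conn.cursor()
    cursor.execute(f"SELECT COUNT(*) FROM {TABLE}")
    source_count = cursor.fetchone()[0]
    mysql_conn.close()

    # Target count on Spanner.
    database = spanner.Client().instance("my-instance").database("cricket_db")
    with database.snapshot() as snapshot:
        target_count = list(
            snapshot.execute_sql(f"SELECT COUNT(*) FROM {TABLE}"))[0][0]

    print(f"source={source_count} target={target_count} "
          f"in_sync={source_count == target_count}")
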
End the Migration

When you complete all these validation steps, click End Migration. Follow the steps below to point your application to the Spanner database:

Stop writes to the source database. This initiates a period of downtime.
Wait for any remaining incremental writes to Spanner to catch up with the source.
Once you are sure the source and Spanner are in sync, update the application to point to Spanner.
Start your application with Spanner as the database.
Perform smoke tests to ensure all scenarios are working.
Cut the traffic over to your application with Spanner as the database. This marks the end of the downtime period.

Clean Up

Finally, hit the “Clean Up” button on the End Migration popup screen. This removes the migration jobs and dependencies that were created in the process.

Watch the migration in action

Minimal Downtime Migrations to Spanner with HarbourBridge 2.0

Next Steps

As you walked through this migration with us, you will have noticed how easy it is to point to your database, assess and modify your schema based on recommendations, and migrate your schema, your data, or both to Spanner with minimal downtime. You can learn more about HarbourBridge in the README, and learn how to install gcloud here.

Get started today

Spanner’s unique architecture allows it to scale horizontally without compromising the consistency guarantees that developers rely on in modern relational databases. Try out Spanner today for free for 90 days or for as low as $65 USD per month.
Source: Google Cloud Platform

How Vodafone Hungary migrated their data platform to Google Cloud

Vodafone is currently the second-largest telecommunications company in Hungary and recently acquired UPC Hungary to extend its previous mobile services with a fixed-line portfolio. Following the acquisition, Vodafone Hungary serves approximately 3.8 million residential and business subscribers. This story is about how Vodafone Hungary benefited from moving its data and analytics platform to Google Cloud.

To support this acquisition, Vodafone Hungary went through a large business transformation that required changes in many IT systems to create a future-ready IT architecture. The goal of the transformation was to provide future-proof services for customers in all segments of the Hungarian mobile market. During this transformation, Vodafone’s core IT systems changed, which created the challenge of building a new data and analytics environment quickly and effectively. Data had to be moved from the previous on-premises analytics service to the cloud. This was achieved by migrating existing data and merging it with data coming from the new systems in a very short timeframe of around six months. During the project there were several changes in the source system data structure that had to be adapted quickly on the analytics side to reach the go-live date.

Data and analytics in Google Cloud

To answer this challenge, Vodafone Hungary decided to partner with Google Cloud. The partnership was based on implementing a fully metadata-driven analytics environment in a multi-vendor project using cutting-edge Google Cloud solutions such as Data Fusion and BigQuery. The Vodafone Hungary data engineering team gained significant knowledge of the new Google Cloud solutions, which meant the team was able to support the company’s long-term initiatives. Based on data loaded by this metadata-driven framework, Vodafone Hungary built a sophisticated data and analytics service on Google Cloud that helped it become a data-driven company.

By analyzing data from throughout the company with the help of Google Cloud, Vodafone was able to gain insights that provided a clearer picture of the business. It now has a holistic view of customers across all segments. Along with these core KPIs, the advanced analytics and big data models built on top of this data and analytics service ensure that customers get more personalized offers than was previously possible. It used to be the case that a business requestor needed to define a project to send new data to the data warehouse. The new metadata-driven framework allows the internal data engineering team to onboard new systems and new data within days, speeding up BI development and the decision-making process.

Technical solution

The solution uses several technical innovations to meet the requirements of the business. The local data extraction layer is built on top of CDAP and Hadoop technologies, written as CDAP pipelines, PySpark jobs, and Unix shell scripts. In this layer, the system ingests data from several sources in several formats, including database extracts and different file types. The system needs to manage around 1,900 loads daily, with most data arriving in a five-hour time frame. Therefore, the framework needs to be highly scalable, handling the high loading peaks without generating unexpected cost during the low peaks. Once collected, the data from the extraction layer goes to the cloud in an encrypted and anonymized format.

In the cloud, the extracted data lands in a Google Cloud Storage bucket. The arrival of a file triggers the Data Fusion pipelines in an event-driven way using a log sink, Pub/Sub, Cloud Functions, and the REST API. After triggering the data load, Cloud Composer controls the execution of the metadata-driven, template-based, auto-generated DAGs. Data Fusion ephemeral clusters were chosen as they adapt to the size of each data pipeline while also controlling costs during low peaks. The principle of limited responsibility is important: each component has a relatively narrow scope, which means the Cloud Functions, DAGs, and pipelines contain only the minimum logic necessary to finish their own tasks.

After loading this data into a raw layer, several tasks are triggered in Data Fusion to build up a historical, aggregated layer. The Vodafone Hungary data team can use this to create their own reports in a Qlik environment (which also runs on Google Cloud) and to build big data and advanced analytical models using the Vodafone standard big data framework. The most critical point of the architecture is the custom triggering function, which handles scheduling and execution of processes. It triggers more than 1,900 DAGs per day, while moving and processing around 1 TB of anonymized data per day. A rough sketch of such a trigger follows.
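
As an illustration of that event-driven trigger, here is a minimal Python sketch (not Vodafone’s actual code; the web server URL, DAG ID, and IAM setup are assumptions) of a Cloud Function that starts a Composer/Airflow 2 DAG run when a file lands in a bucket:

    # Hypothetical sketch: a GCS-triggered Cloud Function that starts a
    # Cloud Composer (Airflow 2) DAG run through the Airflow REST API.
    # Assumes the function's service account can reach the Airflow web
    # server (e.g., it holds the Composer User role).
    import google.auth
    from google.auth.transport.requests import AuthorizedSession

    AIRFLOW_WEB_SERVER = "https://example.composer.googleusercontent.com"  # placeholder
    DAG_ID = "load_source_extract"  # hypothetical metadata-driven DAG

    def trigger_dag(event, context):
        """Background Cloud Function, fired when a file is finalized in GCS."""
        credentials, _ = google.auth.default(
            scopes=["https://www.googleapis.com/auth/cloud-platform"])
        session = AuthorizedSession(credentials)
        # Pass the landed file through to the DAG as run configuration.
        payload = {"conf": {"bucket": event["bucket"], "name": event["name"]}}
        response = session.post(
            f"{AIRFLOW_WEB_SERVER}/api/v1/dags/{DAG_ID}/dagRuns", json=payload)
        response.raise_for_status()
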
The way forward

After stabilization, optimization of the processes began, taking cost and efficiency into account. The architecture was upgraded to Airflow 2 and Composer 2 as these versions became available, which increased performance and manageability. Going forward, Vodafone Hungary will continue searching for even more ways to improve processes with the help of the Google support team. To support fast and effective processing, Vodafone Hungary recently decided to move the control tables to Cloud Spanner and keep only the business data in BigQuery, which delivered a great improvement in processing.

In the analytics area, Vodafone Hungary plans to move to more advanced and cutting-edge technologies, which will allow the big data team to improve performance using Google Cloud native machine learning tools such as AutoML and Vertex AI. These will further improve the effectiveness of targeted campaigns and offer the benefit of advanced data analysis.

To get started, we recommend you check out BigQuery’s free trial and BigQuery’s Migration Assessment.
Source: Google Cloud Platform

Carbon Health transforms operating outcomes with Connected Sheets for Looker

Everyone wants affordable, quality healthcare, but not everyone has it. A 2021 report by the Commonwealth Fund ranked the U.S. in last place among 11 high-income countries in healthcare access.1 Carbon Health is working to change that. We are doing so by combining the best of virtual care, in-person visits, and technology to support patients with their everyday physical and mental health needs.

Rethinking how data and analytics are accessed at Carbon Health

Delivering premium healthcare for the masses that’s accessible and affordable is an ambitious undertaking. It requires a commitment to operating the business in an efficient and disciplined way. To meet our goals, our teams across the company require detailed, daily insights into operating results.

In the last year, we realized our existing BI platform was inaccessible to most of our employees outside of R&D. Creating the analytics, dashboards, and reports needed by our clinic leaders and executives required direct help from our data scientists. However, this has all changed since deploying Looker as our new BI platform. We initially used Looker to build tables, charts, and graphs that improved how people could access and analyze data about our operating efficiency. As we continued to evaluate how our data and analytics should be experienced by our in-clinic staff, we learned about Connected Sheets for Looker, which has unlocked an entirely new way of sharing insights across the company.

A new way to deliver performance reporting and drive results

Connected Sheets for Looker gives Carbon Health employees who work in Google Sheets—practically everyone—a familiar tool for working with Looker data. For instance, one of our first outputs using the Connected Sheets integration has been a daily and weekly performance push-report for the clinics’ operating leaders, including providers. Essentially a scorecard, the report tracks the most important KPIs for measuring clinics’ success, including appointment volume, patient satisfaction metrics such as net promoter score (NPS), reviews, phone call answer rates, and even metrics about billing and collections. To provide easy access, we built a workflow through Google Apps Script that takes our daily performance report and automatically emails a PDF to key clinic leaders each morning.

Within the first 30 days of the report’s creation, clinic leaders were able to drive noticeable improvements in operating results. For instance, actively tracking clinic volume has enabled us to manage our schedules more effectively, which in turn drives more visits and enables us to better communicate expectations with our patients. Other clinics have dramatically improved their call answer rates by tracking inbound call volume, which has also led to better patient satisfaction.

Greater accountability, greater collaboration

As you can imagine, a report that holds people accountable for outcomes in such a visible way can create some anxiety. We’ve eased those concerns by using the information constructively, with the goal of using reporting as a positive feedback mechanism to bolster open collaboration and identify operational processes that need improvement. For example, data about our call answer rates initiated an investigation that led to an operational redesign of how phones are deployed and managed at more than 120 clinics across the U.S.

Looker as a scalable solution with endless applications

We’re now rolling out Connected Sheets for Looker to deliver performance push-reporting across all teams at Carbon Health.
Additionally, we continue to find new ways to leverage Connected Sheets for Looker to meet other needs of the business. For instance, we’ve recently been able to better understand our software costs by analyzing vendor spend from our accounting systems directly in Google Sheets. Going forward, this will allow us to build a basic workflow to monitor subscription spend and employee application usage, which will save money on unnecessary licenses and underutilized software.

We’ve come a long way in the last year. Between Looker and its integration with Google Sheets, we can meet the data needs of all our stakeholders at Carbon Health. Connected Sheets for Looker has been an impactful solution that’s going to help us drive measurable results in how we deliver premium healthcare to the masses.

1. Mirror, Mirror 2021: Reflecting Poorly
2. Meet The Immigrant Entrepreneurs Who Raised $350 Million To Rethink U.S. Primary Care
Source: Google Cloud Platform

Forrester study finds 228 percent ROI when modernizing applications on Azure PaaS

Using modern apps in the cloud to do more with less

There’s no denying the pivotal role developers play in today’s organizations. Whether you’re a high-tech company, a non-profit organization, or a fast-food restaurant, robust digital and online services are key to your customer success. Take the example of one of our customers, Jotun, a multinational chemical supplier—their customer-facing and sales applications are mission critical to their business. But with a small development team managing global applications on premises, the time and effort they spent on routine management and administration was extensive.

The company decided to embrace application modernization—ending investments in on-premises structures—and migrate their apps to Microsoft Azure with Azure App Service. In addition to eliminating routine maintenance tasks and increasing uptime, the new approach enabled them to scale developer expertise, deliver high application performance from anywhere in the world, and begin the transition to a modern development, security, and operations (DevSecOps) approach—all while lowering costs and accelerating time to market.

Platform-as-a-service (PaaS) represents one of the most cost-effective ways to strategically shift resources to application innovation, rather than spending time managing application infrastructure. Azure PaaS services like Azure App Service, Azure Spring Apps, and Azure Integration Services provide developers and IT professionals with a fully managed application platform for building, deploying, and managing applications of all kinds—from the simplest website to the most complex business solution. Developers focus on innovation, and the cloud platform takes care of everything else. A new, commissioned study conducted by Forrester Consulting on behalf of Microsoft, The Total Economic Impact™ (TEI) of Azure PaaS, details the significant business impact of this approach.

Through a series of customer interviews, Forrester finds that a composite organization—an anonymized aggregate profile of these customers—using Azure PaaS can realize:

A three-year 228 percent return on investment (ROI), with a payback period of 15 months.
A 50 percent increase in the speed of application development.
A 40 percent reduction in app-dev related infrastructure costs.

Speaking with Azure customers, Forrester observed several common factors that drove their organizations’ decision to adopt Azure PaaS for modernizing applications. These include being part of a broader strategic initiative, the potential for cost savings, limitations with existing architectures, wanting to take advantage of cloud capabilities, the tight market for tech talent, and prior experience with Azure.

Simply put, this set of fully managed services offers a powerful tool for enterprises to equip their developers in the rapidly changing application landscape.

Retire legacy infrastructure, reduce server costs, and deliver value faster with Azure

Whether your goal is to modernize applications in the cloud, integrate with modern databases and AI, rapidly build apps with low-code platforms, or future-proof existing applications, Azure helps you to provide your developers with the right tools for the right job.

Let me tell you about several ways Azure's fully managed services can transform your organization’s application development process:*

Tackle application development infrastructure costs

Whether you’re operating in an on-premises or hybrid environment, Azure PaaS supports your migration needs efficiently, enabling you to retire legacy infrastructure after applications are fully migrated.

Forrester estimates total application development-related infrastructure cost savings for the composite organization at USD19.1 million over three years during this process.

Rely on trusted cloud infrastructure and security management

While migrating to virtual machines is sometimes the simplest and fastest path for many organizations, application modernization provides the full benefits of the public cloud. PaaS makes this easier because companies benefit from the cloud provider managing the underlying infrastructure and software of the platform. The cost savings for the composite organization begin almost immediately, with Forrester research showing estimated savings of USD10.3 million in related administrative costs over three years.

Free developers to focus on innovation

Developers are at their best when they’re given time to focus on innovating and developing new applications. Modernizing with PaaS helps increase productivity through dev/test and staging environments, provides the ability to run on the latest versions of the OS, languages, and frameworks, and enables the use of modern DevOps practices such as continuous integration and continuous delivery. The potential savings for the composite organization are up to USD7.2 million over three years.

Prioritize application uptime

Offloading management of infrastructure not only reduces immediate spending but also makes the service provider responsible for maintaining a 99.95 percent uptime SLA. The resiliency inherent in the fully managed service approach provides peace of mind to the developers in the composite organization and an estimated USD3.8 million in avoided revenue losses over three years.

Reap immediate benefits for your business

With an efficient and reliable platform that works out of the box, developers increase the speed of application development by up to 50 percent. The improved time to market saves the composite organization USD2.8 million over three years and enhances the organization’s ability to serve customer needs better.

Excited? Learn more about how to modernize your enterprise applications today

Azure PaaS helps organizations confidently take the next step to modernizing applications—paving the way for maximizing IT budget and resources, aligning stakeholder priorities, supplementing cloud skillsets, and even unifying the security approach. The methodical analysis of business value in the Forrester TEI study reinforces that this approach has both tangible economic impact and unquantified benefits to help you become a digital leader.

At the recent Microsoft Ignite 2022 conference, we shared our unique approach to helping organizations modernize their .NET, Java, and other workloads. Learn more about Azure App Modernization and get started with free Azure credits.

If you are ready to begin, check out our partner portal, where you can learn about Microsoft partners who have specialized services for your application requirements.

Join us for this free webinar to learn more about the Forrester Total Economic Impact of Microsoft Azure PaaS study.

Bookmark the Apps on Azure blog to keep up with our expert coverage and product announcements.

Follow Microsoft Azure on Twitter for the latest news and updates.

*Disclaimer: In this study, Forrester provides the detailed assumptions and methodology used to arrive at these estimates. We encourage readers to use their own estimates within the framework provided in the study to determine the appropriateness of an investment in Azure PaaS.
Source: Azure

Microsoft Azure CLX: A personalized program to learn Azure

The rise of cloud computing has created demand for proven cloud experts. That’s why we’ve launched the Microsoft Azure Connected Learning Experience (CLX) program, designed to help aspiring learners and IT professionals become Microsoft Azure cloud pros. CLX is a personalized and self-paced journey that culminates in a certificate of completion—allowing you to maximize learning while minimizing time invested.

What is the CLX program?

The CLX program prepares you for the Microsoft Azure certification exams while optimizing your learning experience and minimizing time invested. The program, which is curated to meet every learner’s unique needs, consists of four steps:

A knowledge assessment
A Microsoft Learn study materials review
A practice test
A cram session

At the start of the program, you’ll take a knowledge assessment to test your skills and create a personalized learning path. You’ll then take only the Microsoft Learn courses that are useful to you—saving you time and ensuring that you learn the skills you need to accelerate your career.

What courses will I take?

The courses you take are up to you. The self-paced program is tailored to your skill set, and you can embark on one of six tracks: Microsoft Azure Fundamentals, Microsoft Azure AI Fundamentals, Microsoft Azure Data Fundamentals, Microsoft Azure Administrator, Administering Windows Server Hybrid Core Infrastructure, and Windows Server Hybrid Advanced Series—with more on the way. Learn more about these tracks below.

Microsoft Azure Fundamentals
Learner personas: Administrators, Business Users, Developers, Students, Technology Managers
This course strengthens your knowledge of cloud concepts and Azure services, workloads, security, privacy, pricing, and support. It’s designed for learners with an understanding of general technology concepts, such as networking, computing, and storage.

Microsoft Azure AI Fundamentals
Learner personas: AI Engineers, Developers, Data Scientists
This course, designed for both technical and non-technical professionals, bolsters your understanding of typical machine learning and artificial intelligence workloads and how to implement them for Azure.

Microsoft Azure Data Fundamentals
Learner personas: Database Administrators, Data Analysts, Data Engineers, Developers
The Data Fundamentals course instructs you on Azure core data concepts, Azure SQL, Azure Cosmos DB, and modern data warehouse analytics. It’s designed for learners with a basic knowledge of core data concepts and how they’re implemented in Azure.

Microsoft Azure Administrator
Learner personas: Azure Cloud Administrators, VDI Administrators, IT Operations Analysts
In Azure Administrator, you’ll learn to implement cloud infrastructure, develop applications, and perform networking, security, and database tasks. It’s designed for learners with a robust understanding of operating systems, networking, servers, and virtualization.

Administering Windows Server Hybrid Core Infrastructure
Learner personas: Systems Administrators, Infrastructure Deployment Engineers, Senior System Administrators, Senior Site Reliability Engineers
In this course, you’ll learn to configure on-premises Windows Server, hybrid, and Infrastructure as a Service (IaaS) platform workloads. It’s geared toward those with the knowledge to configure, maintain, and deploy on-premises Windows Server, hybrid, and IaaS platform workloads.

Windows Server Hybrid Advanced Series
Learner personas: System Administrators, Infrastructure Deployment Engineers, Associate Database Administrators
This advanced series, which is designed for those with deep administration and deployment knowledge, strengthens your ability to configure and manage Windows Server on-premises, hybrid, and IaaS platform workloads.

How do I get certified?

After you finish your personalized curriculum, you’ll complete a two-hour practice test that mimics the final certification exam. Next, you’ll attend a virtual, instructor-led cram session that dives deeply into the Microsoft Azure Certification Exam content. The four-hour session covers the entire course syllabus to ensure you’re well-prepared to pass with ease.

Once you’ve sharpened your understanding of the Azure platform and its solutions, you’ll receive your certificate of completion. You’ll also walk away with the skills to confidently pass the Microsoft Azure Certification Exams—and the proven expertise to advance your career and exceed your cloud computing goals today and in the future.

To learn more and register, visit the Microsoft Cloud Events Portal or check out our Microsoft Azure CLX introductory video.
Source: Azure