Why adopt Lake House for Phenomenal Data Management and Business Intelligence

Data Explosion and Importance of Analytics in Business

The whole world is experiencing unprecedented growth in the quantity of data generated through business activities and social media. Since their inception, business houses have accumulated terabytes of business data without a clear strategy for using it effectively and optimally. Now, these businesses expect to use that data to increase the precision of their decision making at every level. Even small and medium-sized organizations are aiming to inculcate the habit of “Data-Driven Decision Making”.

Businesses today have realized the significance of social media for understanding the sentiments of the general public and their customers, and for using those insights in business decision making, especially in an environment of cut-throat competition. Many social media platforms have become popular among the masses as tools for gathering opinions and thoughts on a wide range of subjects that reflect society's mindset.

Challenges before Industry

Gone are the days when enterprises made decisions on gut feeling. They are striving to become Data Driven Enterprises that make decisions on facts and figures, yet many are still pursuing that goal. A paradigm shift across the technology spectrum is needed to gain and sustain momentum in this arena. Industry poses plenty of challenges, and to meet them, technology is going through rigorous phases of evolution. A holistic view of these challenges is essential to understand the need for, and the process of, evolving new solutions. A glance at the following list gives a sense of how herculean the task of overcoming them is.

  • Data Formats: Unlike in the past, when data was mostly text and figures, it is now also generated as images, videos and audio, and in columnar, document, binary and many other formats such as JSON, Parquet, Avro and ORC.
  • Data Storages: Unlike in the past, when data was preserved in flat file systems and RDBMSs, a variety of storages is needed today to store big data (Volume), support high-speed reads and writes (Velocity) and preserve data in many formats (Variety), in addition to meeting basic needs such as cost, simplicity and security of data on storage.
  • Data Sources: Unlike in the past, when data was generated mostly by business transactions, today it is generated by mobile devices, IoT devices, logs, social media, B2B and B2C transactions and so on. Ingestion tools must therefore offer a wide variety of connectors to interact with such data sources.
  • Impedance in Data Formats and Structures: Ingesting data from many sources multiplies the complexity of dealing with impedances in data formats and structures. Data from different systems may carry different schemas, views, indexes and biases for the same type of data. Reconciling these differences and establishing the veracity of the data is a great challenge.
  • Dirty Data: In the past, when data was accumulated from a handful of sources, cleaning and wrangling it was relatively simple. Accumulating data from a variety of sources in a variety of formats with serious impedances has aggravated this problem to a serious level of complexity. Data normally arrives with noise, outliers, discrepancies, distortions, duplicates, missing values and so on.
  • Integration with other Systems: Today's applications are complex heterogeneous systems, in the sense that a single system integrates with other systems such as B2B, B2C, legacy, microservices, cloud services and hybrid applications. Such systems cannot be designed properly by sticking to old trends, tools, architectures and practices. A fundamentally new architectural approach is needed to arrive at solutions today.
  • Organizational Operations: Organizations struggle on many fronts, such as working in silos, high attrition rates, the effort of encouraging human resources to build multiple skill sets, unfriendly environments and complex, clumsy systems, all of which make it difficult to carry out even moderate shifts in approach.
  • Data Security and Privacy: Cyber crime has become a serious pain point for the whole IT industry, which must protect data and its privacy from being destroyed, stolen or misused. The threat grows with the number and skill of malicious actors in the world.

Evolution of Data Architecture for Lake House

It is now pretty clear that evolution on one front alone cannot solve all these challenges; it has to happen on many fronts, introducing new storage, compute, tools, services, practices, architectures, approaches and so on. Obviously, in this process of evolution, the industry must look for a versatile system with the following features:

a) Storage that handles humongous volumes of data diligently and reliably, with high R/W speed optimized for performance and cost.
b) Tools to integrate and ingest data from a variety of data sources seamlessly, capable of resolving data impedances easily and quickly.
c) Tools to apply high speed data cleaning, munging, wrangling to make big data suitable and ready for analytics processing.
d) Systems that process big data in minimal time to create alerts, predictions, reports and visuals in near real-time, even on devices such as mobiles.
e) Systems compliant with regional norms, with globalization/localization features built in.
f) Systems that are automated, fault tolerant, resilient, disaster safe, highly available, self-healing and fully consistent, yet optimized for performance and efficiency.

For the past few years, the IT industry has been evolving on many fronts and through many phases. Just a few of them are discussed here.

I) The Data Warehouse: It was introduced to overcome the weakness of databases in handling complex business processing. It had large storage with massively parallel processing units that could execute complex queries concurrently in a short span of time. It was also capable of responding to a large number of queries at peak hours with minimal latency. The product was generally well equipped with phenomenal data management and business intelligence capabilities. It served business needs diligently in its era, but later started falling short of newer needs: it could not represent data in different formats, offered no access to the outside world other than a SQL interface, often did not support advanced analytics, and lacked built-in support for a proper ecosystem around it to meet new data processing needs.

II) The Data Lake: As time passed, the demand of the industry changed, and it started looking for bulk-sized storage that could represent data in different formats, offer high R/W speed and integrate with a variety of tools and applications. Data Lake storage offered a solution and met these needs well: it could represent data in formats including images, audio, video and so on. It also offered interfaces to connect to other systems evolving in parallel, such as big data processing, advanced analytics and business intelligence tools (for example, Azure Data Lake Store with Azure Data Lake Analytics, HDInsight and so on).

Although it catered to most of the needs of the industry, the system soon started falling short on services such as a built-in data management platform and built-in support for compute and analytics, as well as on the industry's need to work with more complex data and data processing scenarios. Once again, changing demands and trends triggered the search for another system.

By this time, the industry had tried different alternatives, each with its own trade-offs between pain points and benefits.

a) Apache Hive on HDFS/S3: Hive is a data warehouse system built on top of Hadoop. It facilitates analyzing large data volumes, handling queries over structured data such as that collected in logs. Hive runs on top of a highly scalable, fault-tolerant, distributed cluster of nodes through MapReduce. The mole on the beautiful face was its inability to process data and produce results in real time.

b) The Lambda Architecture: It is an approach in which batch and stream pipelines prepare records concurrently and in coordination. The results are then blended to offer a complete answer. The industry took notice of this architecture because it meets strict latency requirements for processing both old and freshly produced events. But its biggest disadvantage turned out to be the development and operational burden of maintaining two independent systems.

III) The Lake House: The industry's desperation finally came to an end with the introduction of the Lake House architecture. The Lake House is an open data management architecture which unifies data warehousing and advanced analytics. Different vendors quickly implemented the concept in their products; to name a few, Databricks and Azure Databricks, Azure Synapse Analytics and so on. The Lake House combines the best parts of the Data Warehouse and the Data Lake and addresses almost all the concerns listed above.

The solution brings a wealth of features under one roof to manage batch and real-time data processing pipelines. The specification addresses the need for features such as:

  • Phenomenal Data Management and Business Intelligence
  • Features to cater to the needs of roles such as
    • Data Engineer for designing Data Processing Pipelines.
    • Data Scientist for building AI and ML processing pipelines.
    • Data Analyst for building BI and analysis pipelines
    • Data Administrator with Dashboard to manage administration.
  • Connectors to connect to variety of Data Sources and Data Sinks.
  • Support for a variety of data formats and humongous storage
  • Real-time processing of both historical data and stream data. 
  • Dramatically low latency to process data in real-time.
  • Processing engines for batch and stream data pipelines.

In addition to all the above features, the specification mandates the following:

  • Offers Data Management and ACID Transactions of Data Warehouse.
  • Enables Business Intelligence, Machine Learning and Deep Learning on all types of data.
  • Exists in 2-tier architecture
  • Support for third-party libraries such as TensorFlow, Keras and PyTorch.
  • Support of Audit History and Time Travel.
  • A separate, detachable Metadata Layer for Schema Enforcement, Table Versioning and Data Validations.

The Lake House brings a ray of hope: a combined batch and stream processing data pipeline can now be designed easily, something that is otherwise difficult to implement as a Lambda or Kappa architecture. Our next module will discuss the implementation of the Lake House in Azure Databricks and in Azure Synapse Analytics.

Blog by Mr. Chandrashekhar Deshpande (MCT)

Technology expertise: Analytics, Machine Learning and Artificial Intelligence, Hadoop spectrum and Azure HDInsight, Apache Spark-Azure Databricks-Azure ML Services-Azure Cognitive Services, Java-Python-Scala, GoF and JEE and Frameworks.

Delta Lake – Deliver reliability and high performance on your Data Lake

In my earlier blog, I discussed the need for and features of the Lake House architecture. Delta Lake is an open-source project that aims to implement the Lake House architecture with pluggable components for storage and compute.

In a modern data architecture, at least three types of systems are used in combination: the Data Warehouse, the Data Lake and a streaming system. This combination demands a large ecosystem, from Amazon Kinesis/Apache Kafka/Azure Event Hubs/IoT Hub, through distributed computing engines like Hadoop, Spark, Flink and Storm, to Data Lake sinks like HDFS, S3, ADLS and GCS, and even queryable systems like data warehouses, databases, NoSQL storage and so on.

Unfortunately, Data Lakes on their own are weak on the performance or quality required to support high-end business processes. As a result, until recently the only alternative that offered performance, security, scalability and concurrency was the Data Warehouse. But Data Warehouse solutions bring unwanted baggage with them, such as cost and an inability to deal with different data formats.

Delta Lake can be seen as a boon: a solution to the problems the industry faces in becoming 'Data-Driven Decision Making' enterprises.

The Delta Lake:

It is an open-source project that allows building the Lake House architecture on top of Data Lakes like HDFS, S3, ADLS and GCS, using compute engines like Spark, Flink, PrestoDB and Trino, through APIs in languages like Java, Python, Scala, SQL, Rust, Ruby and so on.

At the heart of Delta Lake, data is represented as a table. There can be multiple tables representing different kinds of data. These tables differ from database tables in the sense that they can accommodate streaming data on the fly, and operations on them may also change their results as fresh data is pumped in. The key point to notice is the real-time performance of these operations: these tables, along with their data, are managed diligently in the compute's memory despite the big data size.
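To make the table concept concrete, here is a minimal sketch, assuming the open-source delta-spark and pyspark packages are installed (for example via pip install delta-spark pyspark); the table path and sample rows are purely illustrative.

```python
# Minimal sketch: create, write and read a Delta table with PySpark.
# Assumes `pip install delta-spark pyspark`; the path below is illustrative.
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession

builder = (
    SparkSession.builder.appName("delta-table-sketch")
    # Register Delta Lake's SQL extension and catalog with Spark.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Write a small DataFrame as a Delta table (Parquet data files plus a
# transaction log that gives the table its ACID properties).
events = spark.createDataFrame(
    [(1, "login"), (2, "purchase")], ["event_id", "event_type"]
)
events.write.format("delta").mode("overwrite").save("/tmp/delta/events")

# Read it back like any other Spark data source.
spark.read.format("delta").load("/tmp/delta/events").show()
```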

The Delta Lake standard offers the key features discussed below.

  • ACID Transactions: A typical data processing lifecycle normally has multiple data pipelines ingesting data from one or more sources, running different pipelines concurrently and writing processed data to data sinks concurrently (ref. the Lambda architecture). This seems easy, but realizing it is challenging: implementation and maintenance are tedious, and ensuring data integrity is even more tedious due to the lack of transaction management. Delta Lake brings ACID transactions with the strongest isolation level to seamlessly address this concern.  That's great!
  • Open Formats: The Apache Parquet format natively offers efficient compression and encoding schemes. It is an open format supported by most big data solutions. Delta Lake uses it as its native format to preserve and work with all the data. Its columnar structure improves performance multi-fold when dealing with big data analytics.
  • Schema Enforcement and Evolution: Some data systems enforce and validate the schema on read but not on write, which lets corrupt data enter the system; if this goes unnoticed until irreversible damage is done, the system becomes vulnerable to inconsistency. Schema enforcement in Delta Lake safeguards data quality by rejecting writes of corrupt or disoriented data. The schema describes the column names, types and constraints; only data abiding by the schema enters the system, and anything else is rejected outright. It also assures that the schema does not change until you make an affirmative choice to change it, thereby preventing data dilution. Schema evolution lets you easily change the current schema of a table to accommodate data that changes over time; it can automatically adopt a schema with new columns while appending or overwriting data.
  • Delta Sharing: 'Delta Sharing' is the industry's first open protocol for secure sharing of data across tools, applications and organizations, irrespective of the compute and storage platforms. It facilitates sharing live data without any intermediate staging storage, and it can share terabyte-scale data reliably and efficiently, with easy governance, auditing and tracking of the shared data sets.
  • Scalable Metadata Handling: In big data, even the metadata is big and needs the same management as other data, including distributed and replicated storage. Delta Lake treats metadata as normal data, leveraging the distributed processing power of the compute and distributed, partitioned storage across files with ease. This naturally makes metadata handling highly scalable and available.
  • Unified Batch and Stream Sources and Sinks: A table in Delta Lake is a major building block in the Batch as well as a Stream data pipeline. A table gets populated with static data or automatically and dynamically populated with real-time data. Relationship among tables can be easily established to entertain even complex queries. Streaming data ingest, batch historic backfill, and interactive queries all just work out of the box.
  • Delta Everywhere: Connectors and plug-ins that connect to a variety of tools, applications and platforms through Delta Sharing are an important need of the hour for designing a collaborative and accommodating system. API support for multiple languages makes it possible for a single system to be developed by teams with different skill sets.
  • Audit History and Time Travel: Delta Lake transaction log records details about every change made to the data providing a full history of the audit trail of the changes. Data versioning enables rollbacks and reproducible machine learning experiments.
  • Upserts and Deletes: To support use cases like Slowly Changing Dimension (SCD) operations, change data capture and stream upserts, support for insert, merge, update and delete operations is essential. Delta Lake provides extensive support for these through its API (see the sketch after this list).
  • Compatibility with Apache Spark API: Delta Lake offers 100% compatibility with the Apache Spark API. Developers can quickly and easily reuse existing Spark data pipelines in the Delta Lake environment with minimal changes.
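As a quick illustration of the upsert and time-travel features listed above, here is a hedged sketch using the Delta Lake Python API; the path, column names and sample update rows are assumptions made only for illustration, and the session setup mirrors the earlier sketch.

```python
# Sketch: MERGE (upsert) into a Delta table, then inspect history and
# time-travel to an earlier version. Paths and columns are illustrative.
from delta import configure_spark_with_delta_pip
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

builder = (
    SparkSession.builder.appName("delta-upsert-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

target = DeltaTable.forPath(spark, "/tmp/delta/events")
updates = spark.createDataFrame(
    [(2, "refund"), (3, "signup")], ["event_id", "event_type"]
)

# Upsert: update rows that match on the key, insert the rest (SCD-style).
(target.alias("t")
    .merge(updates.alias("u"), "t.event_id = u.event_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())

# Audit history: every commit is recorded in the transaction log.
target.history().select("version", "operation", "timestamp").show()

# Time travel: read the table as it was at an earlier version.
spark.read.format("delta").option("versionAsOf", 0) \
    .load("/tmp/delta/events").show()
```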

In a nutshell, the features of Delta Lake together improve both the manageability and performance of working with data in storage objects, and they enable a "Lake House" paradigm that combines the key features of data warehouses and data lakes, augmented with high performance and scalability at favourable cost economics.

Delta Lake Data Maturity Layers:

In a typical data processing pipeline, data changes shape as it passes through different processing stages. After ingesting data from the source, operations like cleaning, de-duplicating, enriching, transforming and converting it into a dimensional model are inevitable steps before the data is brought in for actual analytical processing. This leads to layers of maturity across the data pipeline. Databricks Delta Lake has suggested and uses the following maturity levels.

The Bronze Layer: This layer contains the raw data ingested directly from source files, so the structure of these tables resembles the structure of the ingested data. The data here is in its pre-cleaning state, awaiting further processing. Bronze tables represent data in the Delta format irrespective of the format of the ingested data (JSON, Parquet, Avro, ORC). Tables in the Bronze layer are further enriched by cleaning, de-duplicating, transforming and wrangling to produce a new version of the data, which is called the Silver layer.

The Silver Layer: This layer contains tables with cleaned and enriched data, offering a refined view of the data. It may be necessary to perform look-ups, joins with other Bronze/Silver tables, updates and so on to prepare the data model in this layer. The Silver layer represents data that is up to date and ready for analytical processing.

The Gold Layer: Data from the Silver layer is converted into a dimensional model and aggregated to form the Gold layer. The aggregations produced here can be used for reporting, business processing and dashboard design by end-users or other applications.
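The following is a simplified PySpark sketch of this Bronze-to-Silver-to-Gold flow; the landing path, column names and cleaning rules are illustrative assumptions rather than a prescribed implementation, and the Delta-enabled SparkSession is configured as in the earlier sketches.

```python
# Sketch of a medallion (Bronze -> Silver -> Gold) pipeline on Delta Lake.
# Paths, schema and cleaning rules are illustrative assumptions.
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession, functions as F

builder = (
    SparkSession.builder.appName("medallion-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Bronze: land raw JSON as-is, but stored in the Delta format.
raw = spark.read.json("/tmp/landing/orders/")
raw.write.format("delta").mode("append").save("/tmp/delta/bronze/orders")

# Silver: clean and de-duplicate the bronze data into a refined view.
bronze = spark.read.format("delta").load("/tmp/delta/bronze/orders")
silver = bronze.dropDuplicates(["order_id"]).filter(F.col("amount").isNotNull())
silver.write.format("delta").mode("overwrite").save("/tmp/delta/silver/orders")

# Gold: aggregate into a reporting-ready table for BI and dashboards.
gold = (spark.read.format("delta").load("/tmp/delta/silver/orders")
        .groupBy("customer_id")
        .agg(F.sum("amount").alias("total_spend")))
gold.write.format("delta").mode("overwrite").save("/tmp/delta/gold/customer_spend")
```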

Delta Lake Implementation platforms:

  • Delta Lake on Spark: The Delta Lake distribution is available for Spark 2.x and 3.x. Delta Lake 0.7.0 works on Spark 3.x with the delta-core Scala 2.12 artifact (a configuration sketch follows this list).
  • Azure Databricks: A data analytics platform optimized for the Microsoft Azure cloud. It is an enterprise-level platform designed around the Spark engine to offer solutions for building analytics processing pipelines. Azure Databricks supports Delta Lake development around and on top of the Spark engine and the Databricks File System. It is one of the platforms offering complete support for Delta Lake within its storage and compute environments.
  • Azure Synapse Analytics: Azure Synapse Analytics is a managed, unified Azure service offering end-to-end solutions for all types of business analytics operations. One of its features is the Spark pool, which enables data engineers and data scientists to design data pipelines, either interactively or in batch mode, using languages like Scala, PySpark and .NET. Synapse Analytics also has a serverless SQL pool, which offers a queryable layer over otherwise non-queryable data formats. Interestingly, the serverless SQL pool can be queried to read Delta Lake files and serve them to reporting tools. A Delta Lake file has the same format irrespective of whether it was created using Apache Spark, Azure Databricks or any other product that uses the Delta Lake format. The Spark API of the Spark pool supports querying and modifying Delta Lake files programmatically. Full support for Delta Lake is yet to be introduced.
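For reference, here is a sketch of enabling Delta Lake on a plain Apache Spark 3.x installation by pulling the connector from Maven; the artifact version shown matches the 0.7.0 release mentioned above and is an assumption about your environment, so substitute the release that matches your Spark version.

```python
# Sketch: enable Delta Lake on stock Apache Spark 3.x by adding the Maven
# package at session start-up. The version is an assumption; match it to
# your Spark release.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("delta-on-spark")
    # Pull the Delta connector from Maven Central at start-up.
    .config("spark.jars.packages", "io.delta:delta-core_2.12:0.7.0")
    # Register Delta's SQL extension and catalog implementation.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Delta is now available as a regular Spark data source.
spark.range(5).write.format("delta").mode("overwrite").save("/tmp/delta/demo")
spark.read.format("delta").load("/tmp/delta/demo").show()
```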

Many organizations are using and contributing to Delta Lake today; a few names are Databricks, Alibaba Group, Viacom and Informatica. The structure of data changes over time as business concerns and requirements change; however, with the help of Delta Lake, adding new dimensions as the data changes is simple. Delta Lake improves the performance, reliability and manageability of Data Lakes.

Above all, Delta Lake has made it extremely easy for organizations to adopt 'Data-Driven Decision Making' and derive value from their data.

Blog by Mr. Chandrashekhar Deshpande (MCT)

Technology expertise: Analytics, Machine Learning and Artificial Intelligence, Hadoop spectrum and Azure HDInsight, Apache Spark-Azure Databricks-Azure ML Services-Azure Cognitive Services, Java-Python-Scala, GoF and JEE and Frameworks.

Why Cloud Computing could be an Exciting Career Option for Aspiring Professionals

Pic credit: Technology Advice

While cloud computing has been gaining importance for several years, the COVID-19 pandemic has accelerated its growth further.

At the simplest level, cloud computing refers to the on-demand availability of computer system resources and services such as storage, databases, software, and analytics. These are delivered via ‘clouds’ that are housed in data centres across multiple locations spread throughout the globe. Cloud computing offers several advantages such as scalability, flexibility, cost savings, remote access to infrastructure, mobility, etc. 

While cloud computing technologies have been steadily gaining significance for over a decade, the COVID-19 pandemic has considerably accelerated their growth across the globe. A Markets and Markets report estimates that the cloud computing market will grow at a Compound Annual Growth Rate (CAGR) of 17.5% to touch USD 832.1 billion by 2025. Similarly, Gartner expects worldwide end-user spending on public cloud services to grow 23.1% in 2021 to total $332.3 billion.

The COVID-19 crisis and accompanying social distancing norms forced professionals across the globe to stay home for extended periods of time. This created a greater demand for remote working capabilities and collaboration technologies in order to maintain business continuity.

 At the same time, there was growing demand for online entertainment through video streaming platforms. OTT platforms like Netflix and Hotstar were seen to disrupt traditional models of television-cable and movie-cinema ecosystems. This brought to the fore the role of cloud in not just making conventional IT networking architecture more efficient, but also giving rise to tech enabled business models that could disrupt status quo and serve customers much better with instant gratification and real time interactivity.

All these factors further helped spur the demand for cloud computing. With enterprises realizing the many advantages of cloud computing and seeing its adoption as critical to survival, this demand is only set to grow in the future.

Demand for Cloud Computing Professionals 

The rapid growth in cloud adoption has also thrown up a greater demand for trained cloud professionals. As per a recent NASSCOM report titled ‘Cloud Skills: Powering India’s Digital DNA’, India will need 20 lakh cloud professionals by 2025. With the current baseline growth, India is expected to produce only an estimated 14-15 lakh cloud professionals by this timeline. This showcases the need for a fairly aggressive skilling roadmap for cloud computing in India. 

The NASSCOM report also stated that job openings for cloud roles in India touched 380,000 in 2020, a 40 per cent growth over 2019. Already, the demand for cloud skills is far greater than the current availability of skilled professionals. Therefore, for aspiring professionals, cloud computing offers an exciting and fruitful career option. 

Roles in Cloud Computing 

Typically, a cloud engineer is responsible for all the technical aspects of cloud computing including planning and design, maintenance, and support. There are several paths that one can explore including: Cloud developer, front-end/back-end developer, solutions architect, cloud architect, data engineer, security engineer, development operations engineer, full-stack developer, SysOps administrator etc.

For aspiring cloud professionals, the first and foremost step towards building a career as a cloud engineer is to gain proficiency in at least one of the major cloud computing platforms such as AWS, Azure, or Google Cloud Platform (GCP). Also, it is useful to learn data-oriented programming languages, such as Python, Java, and Clojure that are used widely in cloud computing. Next, the focus can be on understanding the various specialty areas within cloud such as Storage and Networking, Virtualization and Operating Systems, Security and Disaster Recovery, Web Services and DevOps etc. Based on this understanding, one can dive deeper into the area that they find most interesting and engage in more structured study in that area. 

With rapid digitization and the growing awareness about the advantages of cloud and allied technologies, adoption is set to grow rapidly, especially in the new normal. With the right talent and skill sets, professionals can tap into the massive opportunities at hand. India is well positioned to build its identity as the global hub for cloud solutions. Talented and skilled professionals have a huge part to play in realizing this vision.

Source: https://www.dqindia.com/why-cloud-computing-could-be-an-exciting-career-option-for-aspiring-professionals/

Hybrid work marks a paradigm shift in how organizations think about security: Satyavrat Mishra, Godrej Industries

Pic credit: Microsoft India

“The lockdown in India happened quite suddenly and at that moment nobody knew what the future of work would look like,” says Satyavrat Mishra, assistant vice president – corporate IT, Godrej Industries Limited.

Despite the suddenness of the nation-wide lockdown last year due to the pandemic, the Godrej group, whose business interests span across industries like consumer products, diversified agri business, chemicals, real estate and housing finance sprang to its feet and rolled out a remote work model within days for its 12,000 employees spread across four continents.

Mishra, who oversees all things IT, attributes the success of this migration to early adaptability of technology.

“We implemented complete Enterprise Mobility Security (EMS) Suite solutions, along with Microsoft Defender for Office 365 (erstwhile Office 365 Advanced Threat Protection) for our email security in 2018,” he says.

When the pandemic forced governments to extend lockdowns constantly, Mishra’s job of securing the company’s networks and data became drastically different and much more challenging.

“The one thing that worked in our favor was that we’d already adopted Microsoft’s cloud-based solutions for secure connectivity earlier, but we were using it for a smaller user base. After the lockdown, all we had to do was roll it out for all our employees,” he says.

The cybersecurity leader recently spoke to Microsoft Stories India about sustainable hybrid work models. He also shared his thoughts on the changing face of cybersecurity during the pandemic, while insisting that the only way for companies to succeed is to invest in employees.

Edited excerpts from the conversation follow:

What were the biggest challenges for Godrej after pandemic forced employees to work from home?

Godrej Industries has 12,000 employees, most of whom are in India, but some are spread across the United States, Latin America, Africa, and Indonesia. When the lockdown was announced and it was decided that people will start working from home, nobody had imagined that it would last for over a year-and-a-half. We had to provide seamless connectivity for 12,000 people who are spread across four continents, almost overnight.

Our journey towards enabling employees to work remotely had started before the lockdown, but we still did not have a work from home culture. One of the biggest challenges we faced was related to identity and security. In an office, there are physical boundaries, so it was easy to secure the perimeter. Now that wasn’t the case.

How did you overcome these challenges?

We’ve been using Microsoft to provide multifactor authentication and advanced email security for some of our employees for several years. So, after the lockdown was announced, it did not take us a long time to roll it out on a large scale and it was business as usual for us.

We also use over 150 business applications across verticals, and we connected each one of these to Azure AD to enable Single Sign-On (SSO) authentication. This helped us to securely manage all our apps centrally, which saved a lot of time. We made sure that we enabled this solution not just for our employees, but also our vendors and consultants.

We also enabled conditional access to our networks, and only a few people could access sensitive information. We also began using Office 365 Data Loss Prevention (DLP) at Godrej Housing Finance and we plan to deploy it across the group companies next year.

The biggest plus point was that since these are SaaS solutions, we didn’t have to spend too much time or effort in training and deployment. We’ve implemented a Zero Trust framework to enable employees to access the tools and documents they need.

Why did Godrej Industries choose Microsoft?

We began using Microsoft Defender for Office 365 (erstwhile Office 365 Advanced Threat Protection) in 2018 after a couple of incidents at the organization. After the breach, when we did our security review meeting, it became very clear that we could not opt for multiple security solutions from different partners. So, we zeroed in on Microsoft, which gave us tools for multifactor authentication, SSO, and mobile security. No other solution could take care of all our needs.

Had we opted for different solutions, it would’ve taken us a couple of years just to roll them out. Then each solution would’ve required separate teams with expertise to deploy, manage, and monitor them.

With Microsoft, we could complete this in just 6 months, and we didn’t have to take care of the integration process because there were no hardware boxes involved. Everything was happening automatically and being stored on the cloud.


How has the job of IT security professionals changed because of the implementation of hybrid work model?

In the last two years there has been a sea change, because before that at least 90 percent of the workforce used to go to the office or some regional points. But now, except for our manufacturing units, which cannot be operated remotely, everyone is working from home. The way we access software has completely changed.

Earlier, there used to be discussions on how to secure the perimeter, how to put more security and network access control solutions. Now, we’re implementing Zero Trust frameworks. That is a major paradigm shift in the way that a security team would define resource access. There’s no hardware ensuring security in a remote work model. All of it is via SaaS-based solutions.

What advice would you give to organizations looking to secure their networks and data?

Our dependence on digital applications has increased because of the pandemic, which has led to an increase in cloud access and cloud usability. Traditional on-premises security solutions that companies used to deploy in their data centers won’t work anymore. Before embarking on a big journey towards cloud adoption, companies need to be aware of emerging technologies like cloud support plan management, cloud security, and posture management.

Apart from this, organizations need to conduct Digital Risk Management. They need to have a complete inventory of their digital resources. All the applications they use create a digital footprint. They need to identify this footprint and ensure they are secure. So, if any identity leak happens on the dark web, there must be a way for them to know and take remedial action.

Same goes for social media, as organizations have relied heavily on these networks for marketing activities. There can be fake social pages, which can hamper the brand’s image. So, we need to monitor social media and take proactive action in case there has been some impersonation.

How can employees contribute to the strengthening of cybersecurity?

There is a lot of awareness needed for employees to understand cyber hygiene. As it became clear that the hybrid workplace model will continue, we began to let our employees know if there had been a problem on their workstations. If a person was going to a malicious website, for example, they'd be notified about it. While all these incidents were recorded earlier too, we have now started capturing them to understand user behavior, and we have launched an employee user behavior scorecard similar in design to the CIBIL scorecard used for credit scores. It comprises a Threat Score and an Awareness Score. This has helped us create user groups based on overall scores, and we use those to run campaigns and simulations with Attack Simulator in Microsoft Defender for Office 365 for awareness. We also run a lot of bite-sized micro training programs to keep our employees updated with best practices.

Source: https://news.microsoft.com/en-in/features/hybrid-work-marks-a-paradigm-shift-in-how-organizations-think-about-security-satyavrat-mishra-godrej-industries/

Exam AI-102: An Exhaustive Guide to Microsoft Certification

The current wave of emerging technology has been highly revolutionary, with technology changing at a fast and highly agile pace. Technology has seeped into every sector of the economy, including education, manufacturing, retail, finance, automotive, health care, etc. The development and implementation of ML and AI within IT solutions have proved highly valuable.

With AI-based solutions likely to rule the roost, it is time for the IT workforce to acquire the essential skill sets as quickly as possible. In fact, the move to bridge the skills gap has to be taken up on a war footing, as more and more organizations are adopting the cloud, where AI plays a crucial role.

As per industry reports, the global AI market is likely to gain momentum, and the demand for AI-certified IT professionals is expected to grow manifold. Hence, acquiring an Azure AI certification is the right move to become an AI solution provider in the cloud computing world, and look forward to an evolving and engaging career path.

Table of Contents:

1. Introduction

2. Azure AI certification – Overview

3. How valuable are Azure certifications?

4. How can I prepare for Azure AI 102 Certification?

5. Can I enhance my job prospects with Azure Certification?

6. Azure AI Engineer Salary & Career Path

7. How can Synergetics assist you in acquiring AI-102 Certification?

Introduction

Cloud computing and data migration are the new normal for most IT departments. AI skills and certifications are the need of the hour for IT professionals who wish to stay relevant in the cloud environment. Acquiring the Microsoft Azure AI-102 certification is one step in the right direction to set a firm footing in the cloud space.

Azure AI certification – Overview

The two most sought-after Microsoft Azure certifications on AI are:

Exam AI-900: Microsoft Azure AI Fundamentals

The Microsoft AI-900 exam tests your understanding of artificial intelligence and popular machine learning workloads. In addition, the AI-900 tests your deployment skills on Azure. Both technical and non-technical candidates can appear for this certification.

Exam AI-102: Designing and Implementing a Microsoft Azure AI Solution

Candidates keen to appear for the AI-102 certification, Designing and Implementing a Microsoft Azure AI Solution, should be capable of:

  • Developing, managing, and implementing AI solutions using Azure Cognitive Search, Azure Cognitive Services, and the Microsoft Bot Framework.
  • Active involvement in all phases of AI solution development, from requirements definition and design to development, deployment, maintenance, performance tuning, and monitoring.
  • Working in collaboration with solution architects to build complete end-to-end AI solutions, communicating clearly about the final AI solution while working with IoT specialists, data scientists, data engineers, etc.

How valuable are Azure certifications?

Azure certifications are extremely valuable. The adoption of Microsoft Azure is on the rise, and cloud infrastructure is predicted to grow exponentially. Also, Microsoft is constantly making improvisations in Azure and working towards its adoption by organizations and companies worldwide.

Wide Skills Gap – The number of Microsoft-certified professionals is far lower than the current demand, which is likely to increase further in the coming years. Hence, organizations have to make do with a limited pool of AI-certified professionals.

A diverse range of certifications – Microsoft has devised many technology-specific certification examinations that help you acquire expertise in Azure, which you should take advantage of.

How can I prepare for Azure AI 102 Certification?

A comprehensive study plan for Azure AI-102 certification should include –

  • Refer to the Microsoft learning path and try to cultivate an intrinsic knowledge of the Azure concepts
  • Try to gain as much hands-on experience as possible with the concepts you have learned through lab practice
  • Refer to the Microsoft documentation and gather as much information as possible on the various use cases

Can I enhance my job prospects with Azure Certification?

Yes, certification proves to be an excellent add-on to your CV. Moreover, your CV is likely to stand out because it carries both professional experience and certification. Acquiring a professional certification has its share of challenges and requires a lot of dedication and commitment. You will need to make an effort to enroll in a course, study for the exam, and get certified. However, the main advantage of the Azure AI-102 certification is that you can earn a higher salary than your counterparts working in a similar profile and region.

Azure AI Engineer Salary & Career Path

The profile of an Azure AI engineer is full of challenges, as they have to interact and coordinate with various technical teams within the organization. A brief overview of their profile follows.

They work in a visionary capacity with solution architects, data scientists, data engineers, IoT specialists, and AI developers to devise and build complete end-to-end AI solutions.

As per 2021 data, an artificial intelligence engineer in the United States can earn upwards of $160,000. Factors such as experience, location, expertise, and education play a crucial role in determining the salary of an AI professional, which can easily go beyond $200,000.

How can Synergetics assist you in acquiring AI-102 Certification?

To begin with, Synergetics is a Microsoft Gold Partner and has been associated with Microsoft Learning services since 2000. Our intuitive study and learning offerings are comprehensive and provide the necessary knowledge and guidance to help you develop cloud aptitude and expertise.

Our learning offerings for Microsoft AI-102 certification include:

  • Get access to our Microsoft Official Courseware (MOC) upon your enrollment for the AI-102 certification training course.
  • Refine your knowledge and skills.
  • Interaction with our Microsoft certified and skilled trainers who will help, support and refine your Azure-AI knowledge throughout the training journey.
  • Gain practical knowledge and experience by availing our hands-on lab sessions, which are immensely valuable when appearing for an examination.
  • Attempt our practice and mock tests that boost your confidence when appearing for the exam.

Also, our enrolled students can:

  • Participate in study groups and online forums to acquire relevant guidance.
  • Refer to the whitepaper, eBooks, and relevant Microsoft documentation.
  • Gain access to Microsoft Learn Portal, Microsoft Official Courseware, and other self-learning portals.

All the best for your exam preparation and certification too!

More About Synergetics:

Synergetics is a learning solutions company that offers Microsoft certifications to individuals as well as corporates. Our bespoke learning solutions are well developed and comprehensive to ensure the best results. We also provide training, workshops, and webinars on these certifications on request. All our learning solutions are conducted by our well-trained, expert Microsoft-certified learning partners.

For more details, get in touch with us at info@synergetics-india.com, call or WhatsApp us on +91 8291362058, or visit our contact us page to discuss your learning requirements.

You can also read: 7 Reasons on why to seek Microsoft Certifications for accelerating career growth!

Cloud trends show customers increasing investments in hybrid and multicloud

Pic source: Microsoft

Across every industry and geography, companies are working hard to keep pace with evolving business needs and build on their existing digital investments. In my role leading the product team for the core of Azure, I spend a lot of time with customers learning what they need to be successful as they integrate cloud technologies into their business strategy to digitally transform.

As many have experienced firsthand, the pandemic has pushed more businesses to adopt wider use of cloud computing technology. A new survey finds it’s also driving more companies to deploy a hybrid (mix of on-premises and one or more public clouds) or multicloud (multiple public clouds) approach. The survey, conducted by The Harris Poll and sponsored by Microsoft, found 86% of all respondents plan to increase investment in hybrid or multicloud environments, and 95% say those technologies have already been critical to their success.

Results from the survey of business decision makers, IT professionals and IT decision makers in medium to large U.S. companies demonstrate the opportunity in 2022 for cloud technology to solve complex business needs.

Hybrid and multicloud fuel business outcomes

The survey and our own observations both underscore how companies are emphasizing remote work, increasing demands for efficiency and ever-present competitive pressures to reduce costs. While the pandemic has already accelerated use of cloud technology, respondents shared they will continue to proactively and strategically invest in cloud, specifically hybrid and multicloud, in the next few years.

It comes down to the bottom line: these technologies are critical to business success, enabling new scenarios, improving resource efficiency and increasing business agility. Companies that use these technologies were more likely to report increases in revenue over the past 12 months, with 83% of those who operate in both hybrid cloud and multicloud environments reporting revenue growth compared to just 58% of non-hybrid and non-multicloud users. In fact, nearly all professionals in organizations that use hybrid or multicloud said these solutions drove direct business outcomes.

While some companies may arrive at hybrid or multicloud organically, the survey shows most organizations are choosing these technologies with strategic intent. Companies are deploying additional clouds for specific purposes, such as a cloud just for AI.

Flexibility and choice emerged as clear needs from businesses that aren’t just adopting a single cloud, but rather the right mix of clouds and the tools that go with them. Nearly all survey respondents agreed they need to be able to adopt cloud in some areas of business while retaining other business information on premises, primarily for regulatory reasons. Nearly nine in 10 say they will increase spending on hybrid or multicloud in the next three years.

Why? To address business demands including scalability, risk management and the ability to govern data and digital sovereignty. The majority of Fortune 500 business respondents say they’re likely to make changes to their cloud infrastructure in the next year, including half who plan to increase the number of cloud providers they are using.

Empowering customers with flexibility

We are committed to customers’ need for flexibility, which is why we built Azure to be hybrid by design and continue to invest in enabling the use of cloud-native technologies anywhere. As part of our ongoing investment, in 2019 we delivered Azure Arc – the foundation of our current approach to hybrid and multicloud. Azure Arc provides a centralized way for customers to manage, govern and secure their entire digital landscape across datacenters, edge and multiple clouds. We also encourage customers to choose familiar cloud-native tools or use Azure services to build and deploy their applications to on-premises, edge and multicloud environments. And that flexibility is resonating – with thousands of new customers signing up, including KPMG, Nokia, Royal Bank of Canada and Siemens Healthineers. In fact, 78% of the Fortune 500 companies now use Microsoft hybrid cloud offerings.

For example, worldwide telecommunications company Nokia uses Azure to serve customers and keep data secure and compliant in every geography where it operates. We worked with the Nokia team to develop an approach that gives it a standard way to deploy, operate and monitor its cloud-native applications in whichever environment customers choose, including Azure, AWS, Google Cloud Platform, private clouds or on-premises environments.

We also understand readiness and enablement are critical to customer success with hybrid and multicloud, and we continue to invest in this area. Today, we’re announcing the new Landing Zone Accelerator for Azure Arc-enabled servers, making deployments easier and faster with guidance and automated reference implementations based on our experience with thousands of customers.

As The Harris Poll survey underscored, recent years have challenged many enterprises to evolve their approach to cloud, including renewing their focus on hybrid and multicloud. At Microsoft, we expect this trend to continue and are committed to delivering the solutions customers need across the cloud environment continuum. We encourage you to learn more about how customers are using Azure Arc and our approach to hybrid and multicloud.

Source: https://blogs.microsoft.com/blog/2022/01/27/cloud-trends-show-customers-increasing-investments-in-hybrid-and-multicloud/

Microsoft Security delivers new multicloud capabilities

Image source: Microsoft

In times of great change, challenges and opportunities can be found in many directions. This is certainly true in IT and cybersecurity.

Today, while navigating a pandemic, frequent supply chain shocks, and global talent shortages, organizations around the world are forced to confront sophisticated ransomware and nation-state attacks. They’re continually staying ahead of stricter compliance requirements, and they’re doing all of this while focusing on the strategic edge they obtain using technology as a transformational advantage.

Cloud, mobile, and edge platforms have driven unprecedented business innovation, adaptation, and resilience during this time, but this broad mix of technologies also introduces incredible complexity for security and compliance teams. The security operations center (SOC) must keep pace with safeguarding identities, devices, data, apps, infrastructure, and more. Further, they must take stock of evolving cyber risks in this multicloud, multi-platform world, and identify where blind spots may exist across a broad new set of users, devices, and destinations. 

When you combine these business needs and rising concerns, it’s clear that security is the defining opportunity and challenge of our time. At Microsoft, our mission of empowering every person and organization on the planet to achieve more means anticipating these needs, slashing security complexity, and protecting organizations across their entire digital estate. We do this by making multicloud support central to our security strategy.

Today, we’re announcing new advances to help customers strengthen visibility and control across multiple cloud providers, workloads, devices, and digital identities—all from a centralized management view. These new features and offerings are designed to secure the foundations of hybrid work and digital transformation.  

Delivering the future of multicloud security

According to the Flexera 2021 State of the Cloud Report, 92 percent of respondents are using a multicloud model, meaning they rely on apps and infrastructure from multiple cloud providers. Another recent survey sponsored by Microsoft shows that 73 percent of respondents say it’s challenging to manage multicloud environments. For organizations to fully embrace these multicloud strategies, it’s critical that their security solutions reduce complexity and deliver comprehensive protection.

Today, we’re taking another step in Microsoft’s journey to protect our customers across diverse cloud systems by extending the native capabilities of Microsoft Defender for Cloud to the Google Cloud Platform (GCP).

With GCP support, Microsoft is now the only cloud provider with native multicloud protection for the industry’s top three platforms: Microsoft Azure, Amazon Web Services (AWS) (announced at Ignite last November), and now Google Cloud Platform (GCP). Microsoft Defender for Cloud provides Cloud Security Posture Management and Cloud Workload Protection. It identifies configuration weak spots across these top providers to help strengthen the overall security posture in the cloud and provides threat protection across workloads, all from a single place. Support for GCP comes with out-of-the-box recommendations that allow you to configure GCP environments in line with key security standards like the Center for Internet Security (CIS) benchmark, as well as protection for critical workloads running on GCP, including servers, containers, and more. Find out more in today’s announcement blog.

Image source: Microsoft

Strengthening Zero Trust with identity security from CloudKnox

Despite all this innovation and change, security and compliance fundamentals begin with conclusively managing identity. Identities are the foundational piece that makes it possible to deliver apps, data, and services where they’re needed.

In a multicloud world, the number of platforms, devices, users, services, and locations multiplies exponentially, so securing those dynamically changing identities and permissions, wherever they are, is another core pillar of multicloud protection. 

A key pain point for many organizations here is the lack of visibility and control over their ever-evolving identities and permissions. To help address this, last year we acquired CloudKnox Security, a leader in Cloud Infrastructure Entitlement Management (CIEM), to accelerate our ability to help customers manage permissions in their multicloud environments and strengthen their Zero Trust security posture. Today, we’re announcing the public preview of CloudKnox Permissions Management. CloudKnox provides complete visibility into user and workload identities across clouds, with automated features that consistently enforce least privilege access and use machine learning-powered continuous monitoring to detect and remediate suspicious activities. Learn more in today’s blog post.

Image source: Microsoft

Reinventing the economics of security data with Microsoft Sentinel

To defend against today’s threats as well as tomorrow’s, security teams must have ready access to all security data. But as the volume of security data continues to grow exponentially, a one-size-fits-all model is no longer sufficient.

We’re working to reinvent the economics of working with security information and event management (SIEM) data and delivering new ways to access and analyze security data by embracing all data types, wherever they live, to provide the most comprehensive threat hunting solution. Today, we’re announcing new capabilities as the first step on this journey. We’re introducing basic logs, a new type of log that allows Microsoft Sentinel to sift through high volumes of data and find high-severity, low-visibility threats, and a new data archiving capability to extend data retention to seven years—beyond our current policy of two years—to enable our customers’ global data compliance needs. We’re also adding a new search experience to empower security analysts to hunt for threats more effectively. They can now search massive volumes of security data quickly and easily from all logs, analytics, and archives. Learn more about Microsoft Sentinel’s vision and new capabilities.

Delivering comprehensive protection

In today’s threat landscape, attacks are coming from anywhere and everywhere, including both inside and outside organizations. That’s why it’s critical to deliver comprehensive solutions that organize security, compliance, identity, endpoint management, and privacy as an interdependent whole while extending protection across platforms and clouds.

To that end, we’re announcing some updates across our portfolio that will help you better protect what’s most important to your business:

  • Secure workload identities with Azure Active Directory (Azure AD): We’re extending Azure AD beyond its core capabilities of protecting user identities to now also safeguarding workload identities for apps and services, as customers move more workloads into the cloud, and develop more cloud-native applications. We announced Conditional Access for workload identities last November, and now, Identity Protection can also be applied to workload identities. Learn more from our blog post.

  • Secure payment processing in the cloud with Azure Payment HSM: We recently launched a new service, Azure Payment HSM, in public preview, for payment card issuers and network and payment processors to securely process payments in the cloud. It provides the highest levels of protection for cryptographic keys and customer PINs for secure payment transactions. Learn more from the announcement blog.

Source: https://www.microsoft.com/security/blog/2022/02/23/microsoft-security-delivers-new-multicloud-capabilities/

Organizations need to secure innovations emerging from accelerated tech adoption: Vishal Salvi, Infosys

Pic source: Microsoft India

“The pandemic accelerated the rate of adoption of technology by companies, which also led to a lot of innovation in the digital world. But while we talk about innovation, there is also a need to secure those innovations,” said Vishal Salvi, chief information security officer and head of cybersecurity practice at Infosys, during Microsoft Future Ready.  

Infosys, the fourth largest IT services company in the world, successfully enabled its 280,000 employees across 50 countries to work remotely within weeks at the onset of the pandemic. Salvi attributed this agility to the organization’s ability to keep pace with changing trends. 

He emphasized the importance of thinking about security right at the beginning of building digital solutions for the future.  

“It’s imperative to find a balance between security, agility, and quality. Enabling employees to work from anywhere and simultaneously ensuring that the data is secure is every organization’s requirement,” the cybersecurity leader said while elaborating on the Microsoft solutions the IT services giant utilizes for security, compliance, and identity management. “Remote working has diminished the monolithic perimeter defense and I concur with one of Microsoft’s Zero Trust statements that identity is the new perimeter.”

Edited excerpts follow:  

Have instances of cybersecurity attacks increased during the pandemic? What are the new risks that have emerged? 

The number of cybersecurity incidents and breaches had been rising annually even before the pandemic, and the intensity and frequency of these attacks had also been increasing. But when the pandemic struck, organizations had to pivot towards working from home and from the cloud within a very short period. This created a sense of temporary vulnerability as organizations went through rapid and massive changes.  

From a cybersecurity perspective, the attack surface increased dramatically during the pandemic because of the rapid adoption of cloud, data, and analytics. This period therefore provided an opportunity for threat actors to really exploit vulnerabilities, so initially we saw amplified and accelerated attacks. 

We also saw unrelenting pandemic-related phishing attempts. We saw advanced malware, various versions of ransomware attacks, and advanced persistent threats like organized crime against nation states and financial institutions.  

The pandemic has led companies to realize the importance of digitalization. How can they ensure that everything they do is still safe? 

We are seeing an increased rate of digital adoption. In fact, any enterprise today that does not have a digital strategy is not innovating. What was earlier within the perimeter of your organization can now be anywhere on the internet, which is borderless and has no geographical control. All of these have created a different kind of threat perception for organizations to manage. While we are talking about innovation emerging from accelerated tech adoption, there is also a need to secure those innovations. 

How important is it to think about security while building solutions? 

Let me explain this through an analogy. When we drive a car, we need to have brakes to have things under control. In fact, the stronger the engine, the more powerful the brakes need to be. We have to look at all risk management functions, including security, in a similar way. 

It is very important to ensure that security is not an afterthought for organizations. If security is an afterthought, companies can be left vulnerable. It is very expensive to add security to your products and solutions after you have already built them.  

Imagine trying to fit brakes into a vehicle after the entire design is complete. That’s almost impossible.  

The same thing applies to technology, but people tend to ignore it because it’s difficult to visualize. So, democratizing technology and making sure security is included (in your solutions) by design are very important aspects of securing your digital presence.  

What are the top risks that security professionals should keep in mind to foster innovation in this digital world? 

The board of every company is worried right now, and rightly so. It’s not just that the risks are increasing but the ecosystem is making leadership accountable for the management of cybersecurity risks. So, it starts from the top and companies need to really set the tone for what they want to do and how they want to drive their strategy when it comes to setting up cybersecurity measures.  

Broadly, there are four goals that companies need to focus on.  

The first is finding a balance in the tradeoff between convenience and control; companies need to find an equilibrium.  

The second is to constantly keep upgrading their cybersecurity posture. Companies need to remember that they are really not competing with their peers or the industry; they are competing with their own past versions, and they need to ensure that they are better than they were the day before. 

Thirdly, companies need to build a sense of cyber resiliency within the organization. Attacks and breaches are inevitable, but companies need to ensure that their response to them is composed and calm, so that they can manage these incidents and then resume business quickly.  

The fourth would be to build a cybersecurity culture within organizations. Security should not be a problem only for the cybersecurity team or the IT team. It is everyone’s responsibility, and all employees need to be incentivized and to realize the role that they need to play.  

How is Microsoft enabling Infosys in achieving these four goals? 

It’s imperative to find a balance between security, agility, and quality. Enabling employees to work from anywhere and simultaneously ensuring that the data is secure is every organization’s requirement. Remote working has diminished the monolithic perimeter defense, and I concur with Microsoft’s Zero Trust statement that identity is the new perimeter. Zero Trust takes data as the central pivot, but it revolves around identity. 

Microsoft O365 services like Azure AD, Azure MFA, Microsoft Intune, and Conditional Access have enabled us to tag identity as the user plane along with the device and data planes, which translates to 360-degree security, striking the optimal balance between user convenience and control.  

With the advancement and sophistication of attack techniques, organizations need to be a step ahead in terms of technology and must keep upgrading their cybersecurity capabilities. The Microsoft compliance stack helps us perform gap assessments of our deployment and determine progress.  

Infosys uses the Microsoft Security and Compliance Center, and its Secure Score dashboards guide the IT and information security teams at Infosys in tracking and monitoring the latest and historical scores based on the security controls enabled. It also provides actionable insights to improve the organization’s data protection capabilities and overall compliance posture.  

Also, building a cybersecurity culture within organizations is important, and I think it needs to be the primary goal. The fact that security is the responsibility of each individual needs to be inculcated, and Infosys has adopted multiple training programs in partnership with Microsoft towards building that culture. We also run multiple phishing campaigns using Microsoft Attack Simulation to measure and enhance user awareness. Tools like Azure Information Protection (AIP) have increased awareness among employees of the significance of classifying and protecting information.

Infosys uses a defense-in-depth approach to build cyber resiliency, and Microsoft’s security stack plays an important role in identifying, detecting, and protecting at every phase of the program. Infosys uses Azure AD Premium, Microsoft ATA, O365 DLP, Exchange Online Protection, MCAS, and other services to help build that confidence. Microsoft’s worldwide network of datacenters provides a highly available network, detects adversaries, and proactively applies the required remediation strategies. 

How can businesses prepare to return to work? Will the future be all about hybrid work models? 

There are two important factors when it comes to the future of work: the role of technology as a driver, and the contribution of employees. Today, technology is not an issue at all because it can support all kinds of models, whether remote, hybrid, or completely back in the office.  

What we are seeing in the corporate world right now is a preference for the hybrid work model, where people are returning to work in a calibrated manner. But at the same time, we have the flexibility for employees to keep working from home to increase their productivity. So, I think the future is all about hybrid work models.  

One thing that is clear is that we will not go back to the old legacy architecture that we had. We will have a modern, pivoted technology architecture that will allow employees to work from anywhere, any time and from any device. And this will be possible 24X7 in a trusted and secure manner.

Source: https://news.microsoft.com/en-in/features/organizations-need-to-secure-innovations-emerging-from-accelerated-tech-adoption-vishal-salvi-infosys/
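
The Secure Score tracking described in the interview above can, in principle, also be done programmatically rather than only through dashboards. The snippet below is a minimal, purely illustrative sketch, not part of the interview or of Infosys’ setup: it assumes a hypothetical Azure AD app registration granted the SecurityEvents.Read.All application permission and the msal and requests Python packages, with the tenant ID, client ID, and secret shown as placeholders, and it simply reads recent Secure Score snapshots from the Microsoft Graph security/secureScores endpoint.

    # Illustrative sketch only: reading Microsoft Secure Score via Microsoft Graph.
    # TENANT_ID, CLIENT_ID, and CLIENT_SECRET are placeholders for a hypothetical
    # Azure AD app registration with the SecurityEvents.Read.All application permission.
    import requests
    from msal import ConfidentialClientApplication

    TENANT_ID = "<tenant-id>"
    CLIENT_ID = "<client-id>"
    CLIENT_SECRET = "<client-secret>"

    # Acquire an app-only token for Microsoft Graph.
    app = ConfidentialClientApplication(
        CLIENT_ID,
        authority=f"https://login.microsoftonline.com/{TENANT_ID}",
        client_credential=CLIENT_SECRET,
    )
    token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

    # Fetch a handful of Secure Score snapshots; each entry reports the tenant's
    # current score against the maximum achievable score at that point in time.
    response = requests.get(
        "https://graph.microsoft.com/v1.0/security/secureScores?$top=5",
        headers={"Authorization": f"Bearer {token['access_token']}"},
        timeout=30,
    )
    response.raise_for_status()

    for snapshot in response.json().get("value", []):
        print(snapshot["createdDateTime"], snapshot["currentScore"], "/", snapshot["maxScore"])

Feeding these snapshots into a simple time series is one possible way to monitor whether the posture is improving release over release, in the spirit of “competing with your own past versions” mentioned above.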

Microsoft announces intent to establish India datacenter region in Hyderabad


HYDERABAD, March 7, 2022 – Microsoft today announced its intent to establish its latest datacenter region in Hyderabad, Telangana. This strategic investment is aligned with Microsoft’s commitment to help customers thrive in a cloud and AI-enabled digital economy and will become part of the world’s largest cloud infrastructure.

Customer demand for cloud as a platform for digital transformation, driving economic growth and societal progress across India, is increasing. According to IDC, Microsoft datacenter regions in India contributed $9.5 billion in revenue to the economy between 2016 and 2020. Beyond the GDP impact, the IDC report estimated that 1.5 million jobs were added to the economy, including 169,000 new skilled IT jobs.[1]

Rajeev Chandrasekhar, Minister of State for Skill Development & Entrepreneurship and Electronics & Information Technology of India shared, “Today’s commitment to the people and businesses of India will position the country among the world’s digital leaders. A Microsoft datacenter region provides a competitive advantage to our digital economy and is a long-term investment in our country’s potential. The cloud is transforming every industry and sector. The investment in skilling will empower India’s workforce today and into the future.”

The Hyderabad datacenter region will be an addition to the existing network of three regions in India across Pune, Mumbai, and Chennai. It will offer the entire Microsoft portfolio across the cloud, data solutions, artificial intelligence (AI), productivity tools, and customer relationship management (CRM) with advanced data security, for enterprises, start-ups, developers, education, and government institutions.

To support customer needs for high availability and resiliency, Microsoft launched Azure Availability Zones in December 2021 in its Central India datacenter region. This forms the most extensive network of datacenters in the country with disaster recovery provisions and coverage of seismic zones.

Stephanie Trautman, Chief Growth Officer, Wipro said, “Wipro and Microsoft have been working together for more than two decades to help enterprises drive business growth, enhance customer experience, and provide connected insights. Microsoft’s new datacenter in India will help advance our collaboration and drive ongoing innovation for shared client relationships. As Indian enterprises continue to transition and expand their involvement in cloud computing, this facility will provide the critical infrastructure foundation for developers and organizations of all sizes to create new customer experiences, support their business, and harness innovation at scale.”

Creating opportunities for innovation in a digital economy through skilling

Telangana is emerging as a ‘challenger’ in the Indian IT sector, with software exports registering a seven percent year-over-year increase to reach Rs. 5 trillion ($67.4 billion) in FY21.[2]

In the city of Hyderabad and across the state of Telangana, Microsoft will enable opportunities for local businesses to innovate with Microsoft Cloud services. Microsoft is partnering closely with the government of Telangana to accelerate the adoption of cloud, AI, IoT and cybersecurity solutions for governance. This includes efforts to upskill government officials in next generation technology, supporting young girls to build careers in cybersecurity through the CyberShikshaa program, and partnering on skilling programs like DigiSaksham with the Ministry of Labour and Employment to equip jobseekers from rural areas with technical skills.

Shri. KT Rama Rao, Minister for Municipal Administration & Urban Development, Industries & Commerce, and Information Technology, Government of Telangana, said, “I am extremely delighted that Microsoft chose Hyderabad as the destination for its largest datacenter investment in India. This will also be one of the largest foreign direct investments (FDIs) the state has attracted. Microsoft and Telangana have a long history, with Hyderabad hosting one of the largest Microsoft offices in the world, and I am happy to see the relationship grow.”

Anant Maheshwari, President, Microsoft India said, “Cloud services are poised to play a critical role in reimagining the future of business and governance and enabling overall inclusion in the country. The new datacenter will augment Microsoft’s cloud capabilities and capacity to support those working across the country. It will also support new entrepreneurial opportunities while meeting critical security and compliance needs. The new datacenter region is a testament to our mission to empower the people and organizations of India to achieve more. We are pleased to be collaborating with the Government of Telangana on this major milestone and we deeply appreciate their support.”

Cloud for sustainable growth

With the expansion of its cloud datacenter footprint in India, Microsoft is empowering and co-innovating with customers and partners. Microsoft’s customers in India, including Jio, Inmobi, Infosys, TCS, ICICI, Bajaj Finserv, Apollo Hospitals, Mahindra, Dr. Reddy’s Labs, Piramal, State Bank of India, Flipkart, Pidilite, and Amity, are already benefitting from Microsoft’s global scale cloud services and the new datacenter region in Hyderabad will play an important role in meeting India’s burgeoning cloud requirements.

Microsoft is committed to responsible and sustainable datacenter operations. This includes a commitment to have a 100 percent renewable energy supply equivalent to the electricity consumed by Microsoft datacenters by 2025. The new datacenter region will be built with sustainable design and operations in mind, enabling Microsoft to responsibly deliver reliable and highly available cloud services at scale.

Rajiv Memani, Chairman, Ernst & Young India said, “Congratulations to Microsoft and the Telangana government on the upcoming datacenter that will bring comprehensive, intelligent, sustainable, and trusted cloud services to the region. It is heartening to see Microsoft’s commitment towards sustainability and reducing its impact on the environment by using green technology solutions.”

Rajiv Kumar, Managing Director, Microsoft IDC and Corporate Vice President, E+D India, said, “The new datacenter is a testament to the high-quality engineering work we are doing here in the India Development Center; a microcosm of Microsoft engineering, it drives some of the most impactful and innovative work across Cloud & AI, Gaming, Experiences & Devices, and Security.”

Source: https://news.microsoft.com/en-in/microsoft-india-hyderabad-data-center-region-intent/