How will artificial intelligence change the way you lead?


There have been countless books written about the tenets and principles of effective leadership. The lives of everyone from Gandhi to a coyote have been mined for insights into how to manage people for success. But what of the new world, where leaders will be required to manage both people and machines to thrive?

There’s no doubt that the breadth and depth of artificial intelligence, machine learning and robotic-process automation capabilities are growing fast. And, while there is some talk about the possibility of working for a robo-boss in the future, the reality is more likely to be a significant shift in the skills that leaders will require to succeed in this new digital workplace.

So, what does that mean?

A recent Avanade survey showed that 85 percent of executives agree that company leadership needs to be able to manage both humans and machines if they plan to successfully integrate artificial intelligence into their organizations. Indeed, more than half of the C-level executives surveyed believe that an understanding of new and emerging technologies will be more important for leaders than a deep specialization in strategy, sales and marketing. Accenture affirms the need for a balance of skills as it identifies three elements to an executive’s AIQ—technology, data and people.

Just like today, leaders will need to have a balance of intellectual (IQ) and emotional (EQ) intelligence to manage in the AI-infused workplace. On the IQ side, leaders will need to have a vision for the AI-first world in their organizations and know where AI can be used to free employees to spend more time on complex tasks and enhance productivity. But, even more important, EQ and people-centric skills will be critical to evangelize the positive impacts, keep people engaged, address anxiety around the changing workforce, and help people reskill to focus on new ways of working and thinking.

In fact, with advanced analytics producing insights far greater and faster than the human brain is capable of, the “softer” management skills will be more important than deep subject expertise or raw intelligence. Topics like digital ethics and trust will come to the forefront. According to Harvard Business Review, “Certain qualities, such as deep domain expertise, decisiveness, authority, and short-term task focus, are losing their cachet, while others, such as humility, adaptability, vision, and constant engagement, are likely to play a key role in more-agile types of leadership.”

Of course, this is not the first time that leaders have taken on new skills in response to technology advances. Executives of a certain age will recall when dictation machines and typing pools were replaced by personal computers and the gap between those who learned to type in school and those who didn’t became quickly apparent. We adjusted, and we adjusted again when typewritten memos gave way to emails, then blogs, then tweets.

What’s next?

Executives in all industries need to be open to expanding and pivoting their skillset along with the rest of their workforces. Understanding the capabilities of AI is important, but so is attending to the needs of people who are affected by changes to the workforce. Along with the wisdom of [insert-your-favorite-business-guru-here], learn about the technology that is changing the game within your organizations and on the larger competitive landscape.

With a solid understanding of the capabilities of humans and machines, leaders will be prepared to draw upon the strengths of each to grow and sustain a digital workplace.

Source: https://www.avanade.com/en/blogs/avanade-insights/artificial-intelligence/ai-and-leadership

How AI improves customer experience and drives business outcomes


So what is AI—artificial intelligence? I think it’s useful—at the risk of oversimplification—to state it this way: AI is machines that think and act like people. What that really means is that they need to understand and interpret data. To improve the customer experience, AI needs to be capable of processing data of various types—both structured and unstructured. For example, if AI sees a picture of an apple—unstructured data—it needs to know that it is an apple.

We have to add intelligence to AI to make that happen. With real intelligence in AI, it can make reasonable conclusions about data. But even if AI can make educated decisions based on available data, that will be of little use if it cannot communicate with people. Therefore, to improve the customer experience, AI must be capable of seamless interaction with people, for example, through chatbots.

What is the future of AI? How will we use AI to improve the customer experience?

We’ve only scratched the surface of AI. That leads some long-range thinkers to ask, “What is the future of AI?” But the more immediate question for in-the-field practitioners is: how will we use AI to improve the customer experience? The answer is that AI will be used to drive many company processes. In addition to the customer experience itself, there are three other primary processes in marketing for which AI will become increasingly instrumental:

  • Driving business outcomes
  • Differentiating customer segments
  • Enhancing the employee experience (as a subset of customer experience)

And as a subset of customer experience, employee experiences offer companies a way to increase engagement as well as defray costs. For example, a chatbot could speak to an employee about IT support, using natural language processing within this narrow domain to diagnose IT issues and either resolve them or reroute them to appropriate departments.

AI has increasingly become part of everyday life. For example, if you use a Kindle and buy ebooks from Amazon, you get book recommendations for your ongoing reading. But how do you get so many recommendations so seamlessly? As AI becomes more accessible, processing information becomes less expensive, increasing AI’s ability to scale across more data. For example, the algorithms that make these recommendations could be processing more than 100 million book-recommendation combinations, which, combined with your previous reading and viewing patterns, narrows the final recommendations to two or three books that you are likely to read.
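To make the mechanics concrete, here is a hedged, minimal item-based collaborative-filtering sketch in Python. The purchase matrix and book titles are invented for illustration; a production recommender would work at a vastly larger scale.

import numpy as np

# Item-based collaborative filtering on a toy purchase matrix.
books = ["Book A", "Book B", "Book C", "Book D"]
# Rows = readers, columns = books; 1 means the reader bought that book.
purchases = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [1, 0, 0, 1],
])

# How often each pair of books is bought together.
co_occurrence = purchases.T @ purchases
np.fill_diagonal(co_occurrence, 0)

def recommend(reader_row, top_n=2):
    # Score unread books by co-occurrence with books already read.
    scores = co_occurrence @ reader_row
    scores[reader_row == 1] = -1  # exclude already-purchased books
    best = np.argsort(scores)[::-1][:top_n]
    return [books[i] for i in best if scores[i] > 0]

print(recommend(purchases[0]))  # suggestions for the first reader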

Use AI to improve customer experience, not just for the sake of AI

As with any technology, the hype of what AI can accomplish sometimes outstrips the reality. Real benefits will come from applying it to defined use cases and putting customers and their experience at the heart of what you are trying to achieve. Some business leaders have jumped on the AI bandwagon but struggle to quantify the outcomes. It all comes down to the pragmatic principle that nobody should implement AI for the sake of saying they have AI. But if you have real business outcomes driven by AI, which improves the customer experience, then use it.

AI is like IoT: it’s here to stay, accept it or not. For example, many companies have access to social media data on the internet and mobile apps. If you have major operations in commerce, for example, you will want to be aware of social media conversations regarding customer sentiment as it pertains to your company, brand, and products. AI can analyze the sentiment of these social media conversations and use these insights to define customer interactions.
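As a hedged illustration of this kind of analysis, the sketch below scores a few invented social media posts with NLTK’s off-the-shelf VADER sentiment analyzer; a real pipeline would pull live posts from the platforms’ APIs.

import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

# Download the VADER lexicon on first run, then score brand mentions.
nltk.download("vader_lexicon", quiet=True)
analyzer = SentimentIntensityAnalyzer()

posts = [  # invented examples standing in for real social media data
    "Love the new checkout flow on your store, so fast!",
    "My order arrived two weeks late. Never again.",
    "Has anyone tried their new loyalty program?",
]

for post in posts:
    compound = analyzer.polarity_scores(post)["compound"]
    label = ("positive" if compound > 0.05
             else "negative" if compound < -0.05 else "neutral")
    print(f"{label:8} {compound:+.2f}  {post}")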

Start AI as a lean project: personalize a customer experience

Like any emerging technology making its way into the corporate enterprise, many people don’t know where to start with AI. I suggest you start lean to demonstrate how AI can drive value for the enterprise. For example, you could automate and personalize an individual customer experience, such as a microsite or tradeshow app. Experiment with different aspects using AI to automate and personalize the customer experience. However, work to keep the bigger picture and outcome in mind.

You’ll have to consider how AI will fit into your existing processes and to scale it for growth over an extended period of time. That’s because the long-term value of AI is in effectively scaling and embedding it into existing processes once the value proposition has been proven and received buy in from management. As another example, you could leverage your existing customer loyalty card in online commerce. Run a promotion online for those customers who supply their in-store customer loyalty card number. It should be relatively easy to capture that customer loyalty card data online with the right commerce and digital experience platform.

Use AI to understand personas and segments

Before AI can be deployed into a customer experience platform, it must first understand the relevant personas and segments. Among other things, the AI instance will need to know the level of customer involvement with your brand. For example, does a particular customer make comments online? By matching information about your customers from social media against your ERP, CRM, and other databases, you can complete your understanding of how they interact with the brand.

After connecting your customer data, AI can begin to draw insights and make recommendations. For example, if customers who share characteristics with your target customer liked “Thing A,” and “Thing B” also did well with those customers, but this specific customer hasn’t tried Thing B, AI could run a test to check, perhaps offering a 36-hour coupon that needs to be redeemed. Once you have run enough tests, AI will let you narrow down very precisely what will engage this particular customer, so much so that you may be able to create a customer segment of one. It’s all about reaching out with the right product at the right time.
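A hedged sketch of that data-matching and test-selection step follows; the column names and records are assumptions invented for illustration.

import pandas as pd

# Hypothetical CRM and social activity tables joined on a shared ID.
crm = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "bought_thing_a": [True, True, False],
    "bought_thing_b": [False, True, False],
})
social = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "comments_about_brand": [12, 0, 3],
})

profile = crm.merge(social, on="customer_id")

# Candidates for the coupon test: liked Thing A, never tried Thing B,
# and engaged with the brand online.
test_group = profile[
    profile["bought_thing_a"]
    & ~profile["bought_thing_b"]
    & (profile["comments_about_brand"] > 0)
]
print(test_group["customer_id"].tolist())  # send these the 36-hour coupon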

Broaden the data that can be used (unstructured and structured) to drive business outcomes

AI can be used to understand unstructured data as well as structured data to drive business outcomes. Traditionally, structured data exists in a database. For example, after a car accident, an adjuster assesses the damage to the car and fills out a standard form to capture the extent of the damage. AI can use that data to draw inferences. But it can also draw inferences from unstructured data. Today, anybody can take a picture of car damage, which is stored in the database as unstructured data. AI can tap this unstructured photographic data and fill in sections of the insurance claim form by itself, automatically extracting the information based on what it has learned from previous data about how different levels and kinds of car damage look. This helps automate and speed up the insurance claims process.
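As a hedged sketch of that photo-to-form step, the Python snippet below assumes a torchvision image classifier that has been fine-tuned on damage categories; the labels and the fine-tuned weights are hypothetical.

import torch
from PIL import Image
from torchvision import models, transforms

DAMAGE_LABELS = ["minor_dent", "broken_glass", "major_collision"]  # invented

# Backbone network; assume its weights were fine-tuned on damage photos.
model = models.resnet18()
model.fc = torch.nn.Linear(model.fc.in_features, len(DAMAGE_LABELS))
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(224),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
])

def fill_claim_field(photo_path):
    # Classify the damage photo and pre-fill one claim-form section.
    image = preprocess(Image.open(photo_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        predicted = model(image).argmax(dim=1).item()
    return {"damage_type": DAMAGE_LABELS[predicted]}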

Social media is another example of unstructured data. Businesses concerned about the customer experience will want to know how their brand is perceived. For example, consider the aggregate sentiment of buyers for luxury cars. Customers can judge different car brands by their creature comforts, engine power, and other criteria. AI can go online and see the sentiment of how product users rate these criteria (i.e., positive, neutral, negative). When you run a campaign and rate the sentiment of an entire group/population, AI helps derive deep insights into customer perceptions and views.

Implementing and starting to add value with AI

Before you start implementing AI to improve your customer experience, first you must establish your data priorities. This entails answering a series of high-level questions such as:

  • What is the expected value add from AI?
  • What delivered customer experiences will trigger the desired business outcomes?
  • Where does the data reside that AI needs to create these customer experiences?
  • If the requisite data does not reside internally, what additional external data do you need to feed AI?

With satisfactory answers, you can begin to implement AI. But how do you do it? As an entry point, use machine learning to derive value from the data. After that, you will need to know whether change enablement is necessary and how you can reach economies of scale with AI.

Every new amazing customer experience benchmarks the next

All said, AI can make a qualitative difference in moving from personalizing customer experiences to individualizing customer experiences. But to do it, AI needs the ability to:

  • Process both structured and unstructured data
  • Derive reasonable conclusions from the data processed
  • Automate processes
  • Operate the customer experience end to end

When your AI-driven customer experience has all this, you can make every new amazing customer experience the new benchmark for your customer’s next experience. You can gain additional competitive advantage by mining for the right data to drive the right customer experiences at the right time. The most difficult aspect of that is handling real-time customer information by recognizing the right information to integrate while weeding out noise. Most importantly, you need the right digital experience platform and the most qualified partner to help drive those most amazing customer experiences.

 

Source: https://www.avanade.com/en/blogs/avanade-insights/artificial-intelligence/how-ai-improves-customer-experience

Why AI is the transformational technology of the digital age


It wasn’t long ago that those of us working on “digital” solutions were almost entirely engulfed in a future focused on four key technologies – social, mobile, analytics and cloud. Organizations were trying to incorporate social into their customer service operations, deliver responsive experiences over every device and overcome the security concerns that paralyzed their decisions to move to the cloud. While all four remain incredibly important, it seems few business conversations these days focus exclusively on one of these domains. Now, artificial intelligence (AI) is the focus from boardrooms to basements. AI stands out as the transformational technology of the digital age.

There are many reasons why this shift has happened so quickly. Obviously, storage costs continue to fall, the proliferation of data and data sources continue to sky-rocket and compute power continues to increase. Just as important, public cloud providers continue to improve, and add to, the impressive machine learning and deep learning capabilities that they make available to the masses.

When you combine all of the technological improvements with the growing corporate investment in this space, it becomes clear why AI is expected to be the defining technology of our future. The number of AI use cases, from enhancing the client experience in call centers (improved language processing and speech recognition) to predictive maintenance (fixing equipment before failures) is resulting in another powerful wave of business improvement driven by technology. In a recent study, McKinsey estimates that AI has the potential to create between $3.5 trillion and $5.8 trillion in value, annually, across various business functions and industries.

So, with all this promise, why aren’t more firms adopting AI at scale and growing the number of AI solutions across their business processes?

  • There is a lack of skills in the data science discipline.
  • There are regulatory issues that have to be addressed.
  • There remains a trust issue (transparency in how AI decisions are reached).

However, from my experience, the primary reason for the lack of AI scale comes back to the quality of artificial intelligence “nutrients” that the algorithms require for ingestion. That is, many organizations just do not have their data in a state of readiness to take advantage of this AI-powered world.

The first step in creating value from any applied intelligence solution is accessing all the information relevant to a given problem. The concept underpinning all of machine learning is giving an algorithm a massive number of “experiences” (training data) and a generalized strategy for learning, and then letting the AI identify patterns, associations and insight from that data. But, if the data is siloed in an organization and inaccessible, or if it is difficult to obtain data sets sufficiently large and comprehensible to be used for training, then the AI value cannot be realized.

To overcome these challenges, many organizations need to get back to the basics before attempting the AI “leap.” There are three areas that must be addressed:

  • Data Strategy.

To build out the required data collection and data architecture, an organization must understand what the data (and associated analytics) will be used for. In many cases, executives worry about their ability to choose the most effective systems for their needs and get lost in a state of paralysis. Data is no longer about just measuring and managing. Data is core to a firm’s innovation. Defining the data strategy is a core organizational function.

  • Data Generation & Aggregation.

I have met with numerous firms that are sourcing and collecting large amounts of data but that still do not have a plan or a platform to consolidate the information in a useful way. Organizations struggle with creating the right structure for any meaningful synthesis to take place. This is why cloud platforms, such as Microsoft Azure, are fundamental. The ability to generate and aggregate becomes only more important with AI since the quantity of available data is core to the machine learning.

  • Driving Insight.

Driving insight is all about revealing the invisible and gleaning new, actionable information from data. While insight is obviously the output, understanding the business problem upfront is important. In understanding what insight is required, an organization can balance the requirements for traditional analytics against developing AI-powered solutions.

Artificial intelligence is here and advancing quickly. The technology can drive significant value and the opportunity is tremendous. For organizations wishing to deploy AI to realize that value, however, there are some basics that must be in place. Developing the data strategy, collecting and aggregating the information in a thoughtful manner and focusing on the insight required to address specific business problems are table stakes. From there, the value of AI can be mined. All companies have the opportunity in front of them. As Mark Twain wrote, “there’s gold in them thar hills!”

Source: https://www.avanade.com/en/blogs/avanade-insights/artificial-intelligence/ai-is-the-transformational-tech-of-digital-age

Practicing ‘No Code’ Data Science

Blog Insights by Ashvini Shahane, President – Learning Services, Synergetics Information Technology Services India Pvt. Ltd.

“This is a great article, and it is really fascinating to see how the world of Data Science and Machine Learning is becoming more democratized, with “no-code data science tools” enabling the growth of more “citizen data scientists.” As the article notes, “In advanced analytics and AI it’s about the shortage, cost, and acquisition of sufficient skilled data scientists.” What is needed is a way to build Machine Learning solutions faster and with more efficiency and consistency.

When I started out on the journey of AI and Data Science, coming from a Microsoft technologies background, tools like Azure ML Studio helped me build, train and operationalize ML models quickly with minimal Data Science background. Microsoft started with tools like Azure ML Studio for the budding, inexperienced Data Scientist and then went on to provide tools for the more experienced in the field with Microsoft ML Services.

Recently, Microsoft has grown its ML offerings with the addition of the “Automated ML” capability in Azure Machine Learning Services. Automated ML empowers customers, with or without data science expertise, to identify an end-to-end machine learning pipeline for any problem, achieving higher accuracy while spending far less of their time. It is like a recommender system for machine learning pipelines.


https://azure.microsoft.com/en-us/blog/announcing-automated-ml-capability-in-azure-machine-learning/

Really looking forward to the innovations in the “No-code Data Science” space making the creation and usage of Data Science and ML solutions easier, faster and more accurate.”
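To ground that, here is a hedged, minimal sketch of submitting an Automated ML run with the Azure ML v1 Python SDK (azureml-train-automl); the workspace config, dataset name and label column are assumptions for illustration.

from azureml.core import Workspace, Experiment, Dataset
from azureml.train.automl import AutoMLConfig

ws = Workspace.from_config()  # reads a local config.json for the workspace
training_data = Dataset.get_by_name(ws, "customer-churn")  # hypothetical dataset

automl_config = AutoMLConfig(
    task="classification",
    training_data=training_data,
    label_column_name="churned",       # hypothetical label column
    primary_metric="AUC_weighted",
    n_cross_validations=5,
    iterations=20,                     # number of pipelines to evaluate
)

# Automated ML tries many pipelines and returns the best one it found.
run = Experiment(ws, "automl-demo").submit(automl_config, show_output=True)
best_run, fitted_model = run.get_output()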

 

datascience

Summary:  We are entering a new phase in the practice of data science, the ‘Code-Free’ era.  Like all major changes this one has not sprung fully grown but the movement is now large enough that its momentum is clear.  Here’s what you need to know.

Barely a week goes by that we don’t learn about some new automated / no-code capability being introduced.  Sometimes these are new startups with integrated offerings.  More frequently they’re features or modules being added by existing analytic platform vendors.

I’ve been following these automated machine learning (AML) platforms since they emerged.  I wrote first about them in the spring of 2016 under the somewhat scary title “Data Scientists Automated and Unemployed by 2025!”.

Of course this was never my prediction, but in the last 2 ½ years the spread of automated features in our profession has been striking.

No Code Data Science


No-Code data science, or automated machine learning, or as Gartner has tried to brand this, ‘augmented’ data science offers a continuum of ease-of-use.  These range from:

Guided Platforms: Platforms with highly guided modeling procedures that still require the user to move through the steps (e.g. BigML, SAS, Alteryx). Classic drag-and-drop platforms are the basis for this generation.

Automated Machine Learning (AML): Fully automated machine learning platforms (e.g. DataRobot).

Conversational Analytics: In this last version, the user merely poses the question to be solved in common English and the platform presents the best answer, selecting data, features, modeling technique, and presumably even best data visualization.

This list also pretty well describes the developmental timeline.  Guided Platforms are now old hat.  AML platforms are becoming numerous and mature.  Conversational analytics is just beginning.

Not Just for Advanced Analytics

This smart augmentation of our tools extends beyond predictive / prescriptive modeling into the realm of data blending and prep, and even into data viz.  What this means is that code-free smart features are being made available to classical BI business analysts, and of course to power user LOB managers (aka Citizen Data Scientists).

The market drivers for this evolution are well known.  In advanced analytics and AI it’s about the shortage, cost, and acquisition of sufficient skilled data scientists.  In this realm it’s about time to insight, efficiency, and consistency.  Essentially doing more with less and faster.

However, in the world of data prep, blending, and feature identification, which is also important to data scientists, the real draw is the much larger data analyst / BI practitioner world.  In this world the ETL of classic static data is still a huge burden and time delay, and it is moving rapidly from an IT specialist function to self-service.

Everything Old is New Again

When I started in data science in about 2001, SAS and SPSS were the dominant players and were already moving away from their proprietary code toward drag-and-drop, the earliest form of this automation.

The transition in academia 7 or 8 years later to teaching in R seems to have been driven financially by the fact that although SAS and SPSS gave essentially free access to students, they still charged instructors, albeit at a large academic discount.  R however was free.

We then regressed back to an age, continuing to this day, in which being a data scientist means working in code.  That’s the way this current generation of data scientists has been taught, and expectedly, that’s how they practice.

There has also been an incorrect bias that working in a drag-and-drop system does not allow the fine-grained hyperparameter tuning that code allows.  If you’ve ever worked in SAS Enterprise Miner or its competitors you know this is incorrect; in fact, that fine tuning is made all the easier.

In my mind this was always an unnecessary digression back to the bad old days of coding-only which tended to take the new practitioner’s eye off the ball of the fundamentals and make it look like just another programming language to master.  So I for one both welcome and expected this return to procedures that are both speedy and consistent among practitioners.

What About Model Quality?

We tend to think of a ‘win’ in advanced analytics as improving the accuracy of a model.  There’s a perception that relying on automated No-Code solutions gives up some of this accuracy.  This isn’t true.

The AutoML platforms like DataRobot, Tazi.ai, and OneClick.ai (among many others) not only run hundreds of model types in parallel including variations on hyperparameters, but they also perform transforms, feature selection, and even some feature engineering.  It’s unlikely that you’re going to beat one of these platforms on pure accuracy.
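As a simplified, single-machine sketch of what these platforms do at far larger scale, the snippet below searches a couple of model families and hyperparameter variations with scikit-learn and keeps the best performer.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = load_breast_cancer(return_X_y=True)

# Candidate model families with small hyperparameter grids.
candidates = [
    (LogisticRegression(max_iter=5000), {"C": [0.1, 1.0, 10.0]}),
    (RandomForestClassifier(), {"n_estimators": [100, 300], "max_depth": [None, 5]}),
]

best_score, best_model = 0.0, None
for estimator, grid in candidates:
    search = GridSearchCV(estimator, grid, cv=5, scoring="accuracy")
    search.fit(X, y)
    if search.best_score_ > best_score:
        best_score, best_model = search.best_score_, search.best_estimator_

print(best_model, round(best_score, 3))  # keep the winning pipeline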

A caveat here is that domain expertise applied to feature engineering is still a human advantage.

Perhaps more importantly, when we’re talking about variations in accuracy at the second or third decimal place, are the many weeks you spent on development a good cost tradeoff compared to the few days or even hours these AutoML platforms offer?

The Broader Impact of No Code

It seems to me that the biggest beneficiaries of no-code are actually classic data analysts and LOB managers who continue to be most focused on BI static data.  The standalone data blending and prep platforms are a huge benefit to this group (and to IT whose workload is significantly lightened).

These no-code data prep platforms like ClearStory Data, Paxata, and Trifacta are moving rapidly to incorporate ML features into their processes that help users select which data sources are appropriate to blend, what the data items actually mean (using more ad hoc sources in the absence of good data dictionaries), and even extending into feature engineering and feature selection.

Modern data prep platforms are using embedded ML for example for smart automated cleaning or treatment of outliers.
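A hedged sketch of that kind of embedded ML: the snippet below flags anomalous transaction amounts with scikit-learn’s isolation forest; the numeric data is invented.

import numpy as np
from sklearn.ensemble import IsolationForest

# Mostly well-behaved amounts, plus two planted extreme values.
rng = np.random.default_rng(0)
amounts = np.concatenate([rng.normal(100, 15, 500), [950.0, -400.0]])

detector = IsolationForest(contamination=0.01, random_state=0)
flags = detector.fit_predict(amounts.reshape(-1, 1))  # -1 marks outliers

clean = amounts[flags == 1]
print(f"removed {np.sum(flags == -1)} outliers of {len(amounts)} rows")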

Others like Octopai, just reviewed by Gartner as one of “5 Cool Companies” focus on enabling users to quickly find trusted data through automation by using machine learning and pattern analysis to determine the relationships among different data elements, the context in which the data was created, and the data’s prior uses and transformations.

These platforms also enable secure self-service by enforcing permissions and protecting PII and other similarly sensitive data.

Even data viz leader Tableau is rolling out conversational analytic features using NLP and other ML tools to allow users to pose queries in plain English and return optimum visualizations.

What Does This Actually Mean for Data Scientists?

Gartner believes that within two years, by 2020, citizen data scientists will surpass data scientists in the quantity and value of the advanced analytics they produce.  They propose that data scientists will instead focus on specialized problems and embedding enterprise-grade models into applications.

I disagree.  This would seem to relegate data scientists to the role of QA and implementation.  That’s not what we signed on for.

My take is that this will rapidly expand the use of advanced analytics deeper and deeper into organizations thanks to smaller groups of data scientists being able to handle more and more projects.

Only a year or two ago, the data scientist’s most important skills included blending and cleaning the data and selecting the right predictive algorithms for the task.  These are specifically the areas that augmented/automatic no-code tools are taking over.

Companies that must create, monitor, and manage hundreds or thousands of models have been the earliest adopters, specifically insurance and financial services.

What does that leave?  It leaves the senior role of Analytics Translator.  That’s the role McKinsey recently identified as the most important in any data science initiative.  In short, the job of the Analytics Translator is to:

  • Lead the identification of opportunities where advanced analytics can make a difference.
  • Facilitate the process of prioritizing these opportunities.
  • Frequently serve as project manager on the projects.
  • Actively champion adoption of the solutions across the business and promote cost effective scaling.

In other words, translate business problems into data science projects and lead in quantifying the various types of risk and rewards that allow these projects to be prioritized.

What About AI?

Yes, even our most recent advancements in image, text, and speech with CNNs and RNNs are rapidly being rolled out as automated no-code solutions.  And they couldn’t come fast enough, because the shortage of data scientists with deep learning skills is even greater than the shortage of our more general practitioners.

Both Microsoft and Google rolled out automated deep learning platforms within the last year.  These started with transfer learning but are headed toward full AutoDL.  See Microsoft Custom Vision Services (https://www.customvision.ai/) and Google’s similar entry Cloud AutoML.

There are also a number of startup integrated AutoDL platforms.  We reviewed OneClick.AI earlier this year; it includes both a full AutoML and a full AutoDL platform.  Gartner recently nominated DimensionalMechanics as one of its “5 Cool Companies” with an AutoDL platform.

For a while I tried to personally keep up with the list of vendors of both No-Code AutoML and AutoDL and offer updates on their capabilities.  This rapidly became too much.

I was hoping Gartner or some other worthy group would step up with a comprehensive review, and in 2017 Gartner did a fairly lengthy report, “Augmented Analytics Is the Future of Data and Analytics”.  The report was a good broad brush but failed to capture many of the vendors I was personally aware of.

To the best of my knowledge there’s still no comprehensive listing of all the platforms that offer either complete automation or significantly automated features.  They do however run from IBM and SAS all the way down to small startups, all worthy of your consideration.

Many of these are mentioned or reviewed in the articles linked below.  If you’re using advanced analytics in any form, or simply want to make your traditional business analysis function better, look at the solutions mentioned in these.

Source: https://www.datasciencecentral.com/profiles/blogs/practicing-no-code-data-science

 

What is the difference between AI, machine learning and deep learning?


In the first part of this blog series, we gave you simple and detailed definitions of artificial intelligence (AI), machine learning and deep learning. In this second part of the series, we explain the difference between AI, machine learning, and deep learning.

You can think of artificial intelligence (AI), machine learning and deep learning as a set of a matryoshka doll, also known as a Russian nesting doll. Deep learning is a subset of machine learning, which is a subset of AI.

Artificial intelligence is any computer program that does something smart. It can be anything from a stack of complex statistical models to simple if-then statements. AI can refer to anything from a computer program playing chess to a voice-recognition system like Alexa. However, the technology can be broadly categorized into three groups — Narrow AI, artificial general intelligence (AGI), and superintelligent AI.

IBM’s Deep Blue, which beat chess grandmaster Garry Kasparov in 1997, or Google DeepMind’s AlphaGo, which beat Lee Sedol at Go in 2016, are examples of narrow AI — AI that is skilled at one specific task. This is different from AGI — the intelligence of a machine that could successfully perform any intellectual task that a human being can. Superintelligent AI takes things a step further. As Nick Bostrom describes it, this is “an intellect that is much smarter than the best human brains in practically every field, including scientific creativity, general wisdom, and social skills.” In other words, it is when the machines have outfoxed us.


Machine learning is a subset of AI. The theory is simple: machines take data and ‘learn’ for themselves. It is currently the most promising tool in the AI pool for businesses. Machine learning systems can quickly apply knowledge and training from large datasets to excel at facial recognition, speech recognition, object recognition, translation, and many other tasks. Machine learning allows a system to learn to recognize patterns on its own and make predictions, contrary to hand-coding a software program with specific instructions to complete a task.
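Here is a hedged, minimal scikit-learn example of that idea: the classifier learns a spam/ham pattern from a toy labeled dataset (the messages are invented) instead of relying on hand-written if-then rules.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy labeled examples; a real system would train on far more data.
messages = ["win a free prize now", "free cash offer", "lunch at noon?",
            "meeting moved to 3pm", "claim your free prize", "see you at lunch"]
labels = ["spam", "spam", "ham", "ham", "spam", "ham"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(messages, labels)  # the pattern is learned, not hand-coded

print(model.predict(["free prize inside", "lunch tomorrow?"]))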

While Deep Blue and AlphaGo are both types of AI, Deep Blue was rule-based and dependent on programming, so it was not a form of machine learning. AlphaGo, on the other hand, beat the world champion in Go by training itself on a large data set of expert moves.

That is, all machine learning counts as AI, but not all AI counts as machine learning.

Deep learning is a subset of machine learning. Deep artificial neural networks are a set of algorithms reaching new levels of accuracy for many important problems, such as image recognition, sound recognition, recommender systems, etc.

It uses machine learning techniques to solve real-world problems by tapping into neural networks that simulate human decision-making. Deep learning can be costly and requires huge datasets to train itself. This is because there is a huge number of parameters that the learning algorithm needs to tune, which can initially yield a lot of false positives. For example, a deep learning algorithm could be trained to ‘learn’ what a dog looks like. It would take an enormous dataset of images for it to understand the minor details that distinguish a dog from a wolf or a fox.
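For a concrete sense of what such a model looks like, here is a hedged, minimal Keras sketch of a binary dog/not-dog image classifier; in practice, training it well requires a very large labeled dataset and far more engineering.

import tensorflow as tf

# Tiny convolutional network: image in, probability of "dog" out.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # P(image is a dog)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=10)  # requires a real dataset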

Deep learning is part of DeepMind’s famous AlphaGo algorithm, which beat the former world champion Lee Sedol in 4 out of 5 games of Go using deep learning in early 2016. Google said, “the way the deep learning system worked was by combining Monte-Carlo tree search with deep neural networks that have been trained by supervised learning, from human expert games, and by reinforcement learning from games of self-play.”


Source: https://www.geospatialworld.net/blogs/difference-between-ai%EF%BB%BF-machine-learning-and-deep-learning/

 

Bot Framework – New perspective of Marketing Automation

Developing intelligent chat bots with Microsoft AI platform and Bot Framework


Nowadays, we all use different kinds of applications on different platforms and devices. Some of us use mobiles, others use desktops and laptops to manage their day-to-day activities and business. In our daily life we use many kinds of applications: social media applications, messengers, shopping and ticket booking applications, customer service applications and other business applications. What if you need help while using these applications? What if you get confused while choosing menu options or getting started? You definitely need some kind of assistance to go ahead. You can contact the customer support team with your queries, but you may need to send an email or call the customer service number and wait for a response. What if you need immediate assistance? That is where an intelligent online assistant comes in: one that can help you choose options, provide suggestions, and converse with you in your language.

A chat bot is an intelligent online assistant that can converse with you in your language. It can be programmed with a powerful AI backend that can understand your language and feelings, provide suggestions, collect data from the user, and respond quickly or later with the results you want. Chat bots can be programmed in different languages and can be hosted on various cloud platforms. A chat bot can be easily integrated with any kind of application of your choice. It could be a messenger application such as Skype, Facebook Messenger, Google Talk, WeChat or Kik, or a web application. There are various bot frameworks available for developers, such as Microsoft Bot Framework, Facebook’s Wit.ai and Google’s api.ai. You can host your bot applications on various platforms such as Azure Bot Services, Chatfuel, HubSpot’s Motion.ai etc.

Microsoft Bot Framework is one of the best and richest frameworks for developing intelligent bot applications on the Microsoft Azure cloud platform. The Bot Framework consists of three main components: the Bot Builder SDK, Channels, and the Bot Framework Directory. The Bot Builder provides an SDK, libraries, samples, and tools to help you build and debug bots. Microsoft Bot Builder provides SDKs for Node.JS and C#, and templates are also available for Java and Python, i.e. you can develop your bot applications using Node.JS, C#.NET, Java and Python.

Developing Bot applications using .NET

You can start creating your first bot application using Visual Studio. For that you need to install the project templates for Bot applications. Two templates are available for .NET, targeting the v3 and v4 versions of the SDK respectively. Both are available as VSIX packages in the Visual Studio marketplace. You can download them from the following links.

Bot Builder V3 template: https://marketplace.visualstudio.com/items?itemName=BotBuilder.BotBuilderV3

Bot Builder V4 template: https://aka.ms/Ylcwxk

You need Visual Studio 2015 or later versions to install and develop using these templates. Bot Builder SDK requires .NET framework version 4.6 or later.


Developing Bot Applications using Node.JS

You can develop your bot applications using Node.JS as well. To install the Bot templates for Node.JS, you need to install the latest version of Node.JS (8.5 or later) and Yeoman. You can download and install the latest version of Node.JS from the Node.JS web site. Install the latest version of Yeoman by running the following command.

npm install -g yo

Install the Node.JS project templates using the following npm command.

npm install generator-botbuilder

Developing Bot Applications using Java and Python

You can also install the Bot templates for Java and Python. You can use the following npm commands to install the Yeoman generators for the Java and Python project templates.

npm install generator-botbuilder-java

npm install generator-botbuilder-python

Run the Yeoman command to generate the project template you want.
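For example, for a Node.JS bot the generator is invoked as shown below; it then prompts for details such as the bot name and language before scaffolding the project.

yo botbuilder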


Making your bot intelligent

How can you create an intelligent bot that understands your language and responds to your queries? The Microsoft Azure AI platform provides a set of APIs that can be integrated with any of your applications. These APIs are called Cognitive Services. They include APIs for language processing, text-to-speech translation, suggestions, Search APIs, the Face API etc. You can integrate these APIs with your bot applications to make your application more intelligent.

The interaction between a bot and a user is free-form, so it is important for a bot application to understand the user’s language and the context. Microsoft Azure Cognitive Services provides the LUIS (Language Understanding Intelligent Service) API that helps the bot understand the user’s language and context. For that you need to create a LUIS app model and train your model to understand the utterances (what the user says) and the entities. Once the model starts processing input, LUIS begins active learning, allowing you to constantly update and improve the model.
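As a hedged illustration, the Python sketch below queries a trained LUIS app through the v3 prediction REST endpoint; the region, app ID and key are placeholders you would replace with your own values.

import requests

REGION = "westus"                    # your LUIS prediction resource region
APP_ID = "<your-luis-app-id>"        # placeholder
KEY = "<your-prediction-key>"        # placeholder

# v3 prediction endpoint for the production slot of the app.
url = (f"https://{REGION}.api.cognitive.microsoft.com/luis/prediction/v3.0/"
       f"apps/{APP_ID}/slots/production/predict")
params = {"subscription-key": KEY, "query": "book a flight to Paris tomorrow"}

prediction = requests.get(url, params=params).json()["prediction"]
print(prediction["topIntent"], prediction["entities"])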

Author: Sonu Sathyadas, Tech Lead, Synergetics

 

12 major Artificial Intelligence trends to watch for in 2018

Artificial Intelligence (AI) has the peculiar ability to simultaneously amaze, enthrall, leave us gasping and intimidate. The possibilities of AI are innumerable, and they easily surpass our most artistically fecund imaginations. What we read in science fiction novels or saw in movies like ‘The Matrix’ could someday materialize into reality. Bill Gates, the founder of Microsoft, recently said that ‘AI can be our friend’ and is good for society. From decision-making to computing to robotics to vehicles and even cosmetics, AI has left its mark everywhere, and it will usher in the grandest social engineering experiment in the history of the world.

CBInsights has prepared a list of the major AI trends to follow in 2018. Let’s have a look at the 12 trends in AI that will have a huge impact in years to come.

Robotic workforce

It is no longer a closely guarded secret that in the future much of the labor-intensive work in factory assembly lines will be done by AI-programmed robots, not human workers. This will bring down the cost of hiring workers and also reduce outsourcing and offshoring.

Recently, a Chinese T-shirt manufacturer Tianyuan Garments Company signed a Memorandum of Understanding (MoU) with the Arkansas government to employ 400 workers at $14/hr at its new garment factory in Arkansas. Operations were scheduled to begin by the end of 2017. Tianyuan’s factory in Little Rock, Arkansas, will use sewing robots developed by Georgia-based startup SoftWear Automation to manufacture apparel.

In Japan, by 2025, more than 80% of elderly care is expected to be provided by robots, not caregivers.

Ubiquitous Artificial Intelligence

Artificial Intelligence impacts multiple fields, even those we least expect it to. Machine learning, a crucial component of AI, refers to the training of algorithms on large data sets so that they learn how to identify the desired patterns and get better at their tasks.

The functioning of AI is getting more versatile with each passing day.

Uncle Sam vs The Dragon in the realm of AI

China is all set to prove its prowess in AI and outshine the US and other western countries. The Chinese government is investing a lot in this futuristic technology.

The Chinese government is promoting an intelligence plan. It includes everything from smart agriculture and intelligent logistics to military applications.

In 2017, China’s artificial intelligence startups took 48% of all dollars going to AI startups globally, more than those in the USA. China also publishes six times more deep learning patents than the US.

Battlefields in the age of AI

The wars of the future will rely on smart technology like never before. Drones are just the beginning. With the increasing convergence of conventional defense, surveillance, and reconnaissance with cybersecurity, the need for algorithm-based AI only expands.

Cyber security is a real opportunity area for AI, since attacks are constantly evolving and the main challenge is new forms of malware. Prima facie, AI would have an extra edge here given its ability to operate at scale and sift through millions of incidents to identify aberrations, risks, and signals of future threats.

The market is mushrooming with new cybersecurity companies trying to leverage machine learning to some extent.

Voice Assistants

Voice-enabled computing was everywhere at the Consumer Electronics Show in 2018. Barely any IoT device came without integration with Amazon Echo or Google Home.

Samsung is also working on its own voice assistant, Bixby. It wants all of its products to be internet-connected and have intelligence from Bixby by 2020.

AI to throw the gauntlet before professionals

Skilled professionals, including lawyers, consultants and financial advisors, will face the heat of artificial intelligence as much as unskilled and semi-skilled workers.

For instance, artificial intelligence has huge potential to reduce the time and improve efficiency in legal work. As AI platforms become more efficient, affordable and commercialized, this will influence the remuneration structure of external law firms that charge by the hour.

Decentralization and Democratization

Artificial Intelligence isn’t only limited to powerful supercomputers and big devices; it is also becoming a part and parcel of smartphones and wearable devices and equipment. Edge computing is emerging as the next big area in AI.

Apple released its A11 chip with a neural engine for iPhone 8 and X. Apple claims it can perform machine learning tasks at up to 600B operations per second.

Another case for edge AI would be training your personal AI assistant locally on your device to recognize your unique accent or identify faces.

Capsule Networks

Neural networks have myriad architectures. One of the most popular in deep learning these days is the convolutional neural network (CNN). Now a new architecture, the capsule network, has been developed, and it could outpace convolutional neural networks on multiple fronts.

CNNs have certain limitations that lead to lack of performance or gaps in security.

Capsule Networks would allow AIs to identify general patterns with less data and be less susceptible to false results.

Capsule Networks would take relative positions and orientation of an object into consideration without needing to be trained exhaustively on variations.

Dream salaries in AI talent hunt

As per a recent report, the approximate number of qualified researchers currently in the field of AI is 300,000, including students in relevant research areas. Meanwhile, companies require a million or more AI specialists for their engineering needs.

In the US, a Glassdoor search for “artificial intelligence” shows over 32,000 jobs currently listed, with several salary ranges well into the 6 digits. Companies are more than willing to pay handsome emoluments to intelligent AI experts.

Bigwigs of enterprise AI

Tech giants like Google, Amazon, Salesforce, and Microsoft continue to improve their enterprise AI capabilities.

AI medical diagnostics

Regulators in the US are moving toward approving AI for use in clinical settings. The advantage of AI in diagnostics is early detection and better accuracy.

Machine learning algorithms can compare a medical image with those of millions of other patients, picking up on nuances that a human eye may otherwise miss.

Consumer-focused AI monitoring tools like SkinVision — which uses computer vision to monitor suspicious skin lesions — are already in use. But a new wave of healthcare AI applications will lay the groundwork for machine learning capabilities in hospitals and clinics.

Build your own AI

Thanks to open source software libraries, hundreds of APIs and SDKs, and easy assembly kits from Amazon and Google, the barrier to entry in AI has never been lower. Google launched an “AI for all ages” project called AIY (Artificial Intelligence Yourself).

Source: https://www.geospatialworld.net/blogs/13-artificial-intelligence-trends-2018/