The rise of artificial intelligence: what does it mean for development?


Typically, there are two arguments against ICTs (information and communications technologies) for development. First, to properly reap the benefits of ICTs, countries need to be equipped with basic communication and other digital service delivery infrastructure, which remains a challenge for many of our low-income clients. Second, we need to be mindful of the growing divide between digital-ready groups and the rest of the population, and how it may exacerbate broader socio-economic inequality.

These concerns certainly apply to artificial intelligence (AI), which has recently re-emerged as an exciting frontier of technological innovation. In a nutshell, artificial intelligence is intelligence exhibited by machines. Unlike during the several “AI winters” of past decades, AI technologies really seem to be taking off this time. This may be promising news, but it challenges us to more clearly validate the vision of ICT for development while incorporating the potential impact of AI.

It is probably too early to tell whether AI will be a blessing or a curse for international development… or perhaps this type of binary framing is not the best approach. Rather than providing a definite answer, I’d like to share some thoughts on what AI means for ICT and development.

AI and the Vision of ICT for Development

Fundamentally, the vision of ICT for development is rooted in the idea that universal access to information is critical to development. That is why ICT projects at development finance institutions share the ultimate goal of driving down the cost of information. However, we have observed several notable features of the present information age: 1) there is a gigantic amount of data to analyze, which is growing at an unprecedented rate, and 2) for the highly complex challenges of our world, it is almost impossible to discover structures in raw data that can be described as simple equations, for example when finding cures for cancer or predicting natural disasters.

This calls for a powerful new tool to convert unstructured information into actionable knowledge, and artificial intelligence is expected to provide it. For instance, machine learning, one of the fastest-evolving subfields of AI research, delivers predictions with greatly enhanced accuracy at much lower cost. As an example, we can train a machine with a large set of labeled pictures so that it can later tell which photos contain dogs, without a human explicitly programming the rules.
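As a toy illustration of that supervised-learning idea, a tiny nearest-centroid classifier can "learn" a label purely from examples. The 2-D feature vectors below are made up; real image classifiers work on pixels or learned features, but the principle is the same:

```python
# Toy supervised learning: classify "photos" (here, made-up 2-D feature
# vectors) as dog / no-dog using a nearest-centroid rule learned from examples.

def train_centroids(examples):
    """Average the feature vectors for each label."""
    sums, counts = {}, {}
    for features, label in examples:
        s = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            s[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [x / counts[label] for x in s] for label, s in sums.items()}

def predict(centroids, features):
    """Assign the label of the closest centroid (squared Euclidean distance)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(c, features))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Invented training data: (features, label) pairs.
training = [([0.9, 0.8], "dog"), ([0.8, 0.9], "dog"),
            ([0.1, 0.2], "no dog"), ([0.2, 0.1], "no dog")]
centroids = train_centroids(training)
print(predict(centroids, [0.85, 0.9]))   # a new "photo" near the dog cluster
```

No human wrote a rule for what makes a dog; the decision boundary falls out of the labeled examples, which is the essence of the approach.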

To summarize, AI promises to achieve the vision of ICT for development much more effectively. What, then, are some practical areas of application?

AI for development: areas of application

Since AI research is progressing rapidly, it is challenging to get a clear sense of all the ways AI could be applied to development work in the future; nonetheless, the following are a couple of areas where current AI technologies are expected to provide significant added value.

First, AI allows us to develop innovative new solutions to many complex problems faced by developing countries. As an example, a malaria test traditionally requires a well-trained medical professional to analyze blood samples under a microscope. In Uganda, an experiment showed that real-time, high-accuracy malaria diagnoses are possible with machine learning models running on low-powered devices such as Android phones.

Second, AI could make significant contributions to designing effective development policies by enabling accurate predictions at lower cost. One promising example is the US-based startup Descartes Labs, which uses satellite imagery and machine learning to forecast corn yields in the US. The company uses spectral information to measure the chlorophyll levels of corn, which are then used to estimate corn production. Its projections have proven consistently more accurate than the survey-based estimates used by the US Department of Agriculture. This kind of revolution in prediction has great potential to help developing economies design more effective policies, including for mitigating the impact of natural disasters.
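The prediction step behind such forecasts can be sketched, in grossly simplified form, as fitting a line from a vegetation/chlorophyll index to past observed yields and applying it to the current season's imagery. All numbers below are invented for illustration:

```python
# Grossly simplified yield forecasting: fit yield = a * index + b by ordinary
# least squares on past (chlorophyll index, observed yield) pairs, then
# predict this season's yield from its index. All numbers are invented.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

index_history = [0.2, 0.4, 0.6, 0.8]   # mean chlorophyll index per past season
yield_history = [2.0, 4.0, 6.0, 8.0]   # observed yield (tonnes/hectare)
a, b = fit_line(index_history, yield_history)
forecast = a * 0.5 + b                 # this season's index, from new imagery
print(round(forecast, 2))
```

Real systems replace the single index with many spectral features and the line with a far richer model, but the pattern of learning from historical outcomes and projecting forward is the same.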


Looking forward – Toward the democratization of AI?

Many assume that it is too early to talk about AI in the developing world, but the mainstreaming of AI may happen sooner than most people expect. Years ago, some tech visionaries already predicted that AI would become a commodity like electricity. And this year, Google revealed TensorFlow Lite, a framework for running machine learning models directly on individual smartphones. Further, Google is working on the AutoML project, an initiative to use machine learning to automate the design of machine learning models themselves.

As always, new technology can be both liberating and disruptive, and the outcome will largely depend on our ability to use it wisely. Despite the uncertainty, AI provides another exciting opportunity for the ICT sector to leverage technological innovation for the benefit of the world’s marginalized populations.



Artificial Intelligence Trends 2018


Artificial intelligence (AI) continued to be a major driver of digital transformation in 2017, with the rapidly advancing technology affecting business strategy and operations, customer interactions and the workforce itself. While these are all general and broad impacts of AI, they will continue to be important for businesses trying to keep up with rapid technological advancements in 2018.

Embedded deep learning will become the focus of software product teams in the coming year, as buyers begin to inquire about the machine-learning capabilities of the tools they are purchasing. Many vendors are already including machine learning in products to enhance and automate certain functionality, and building marketing campaigns around those AI enhancements. As embedded AI becomes more standard in solutions, there will be less emphasis on the glitz and glamour of machine learning and more focus on how the embedded AI contributes to a business’s overall digital transformation.

There will also be a push to open data sources in 2018 for the benefit of machine learning developers. AI is only as good as the data that it has to learn from, so when building embedded AI applications or training machine and deep learning models, one needs as much data as possible. Enterprise companies, like Amazon and Google, among others, do not have a problem accessing mammoth data sets, because their everyday businesses are so large that they create a seemingly endless supply of data. However, small businesses or independent developers do not have that luxury; therefore, they will take advantage of open-source data sets, often made available by those same enterprise companies.

Similarly, businesses will begin to share their data with the software they work with instead of trying to hoard their own data in secrecy. As embedded AI becomes the norm, companies will have the option to share data with the vendor to increase the machine learning capabilities, and have the technology learn not just from the business’ data, but also from the data of the vendors’ entire customer base. Businesses using AI-enabled software will begin to realize that the benefits of data sharing outweigh the risks, which primarily center around data security.

Businesses and software vendors will also more frequently open up their data to partnership opportunities in the form of data swaps. This will be particularly helpful for AI and general automation. Software vendors will begin to trade valuable data to improve the embedded AI within their products. This will most likely happen across software categories, because the race to have the smartest and most intelligent application will be fiercely competitive; any edge a vendor can get will be crucial. This will also benefit businesses outside the software space that begin to implement AI into general business processes.

Adoption of AI in businesses will be driven by digital platform providers, the same way those enterprise service providers drove adoption of the cloud. Amazon Web Services (AWS), Google Cloud Platform and Microsoft Azure have created a number of machine and deep learning APIs and microservices that will make it easy for businesses to deploy AI for business operations and automation. These solutions will have the same advantages as the vendors’ other service offerings: they will be cost-effective, easy to set up and quick to deploy, making them attractive options for companies that do not have highly skilled in-house developers. This machine-learning-as-a-service (MLaaS) type of deployment will become much more mainstream in 2018.
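In practice, consuming such a service usually means sending JSON to a hosted endpoint and reading back a prediction. The endpoint URL and payload schema below are hypothetical, not any specific provider's API; each cloud vendor defines its own, but the overall shape is similar:

```python
# Sketch of what calling a machine-learning-as-a-service API typically looks
# like. The endpoint and payload schema here are hypothetical; real providers
# (AWS, Google Cloud Platform, Azure) each define their own.
import json

def build_sentiment_request(text, language="en"):
    """Assemble the JSON body a hosted sentiment-analysis service might expect."""
    return json.dumps({"document": {"content": text, "language": language}})

body = build_sentiment_request("Deployment was quick and painless.")
print(body)

# The body would then be POSTed with an API key, for example:
#   requests.post("https://ml.example.com/v1/sentiment", data=body,
#                 headers={"Authorization": "Bearer <API_KEY>"})
```

The appeal for businesses without in-house ML developers is exactly this thinness: no model training, no serving infrastructure, just a request and a response.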

Finally, robotic process automation (RPA) will make its emergence in the workplace. The technology is still in its infancy, but it will begin to have an impact on business process management. RPA creates software robots that access the applications a business already uses and automate mundane tasks, like data entry. The benefit of RPA systems is that they are very easy to build, set up and train. These solutions can eliminate human error and help IT teams focus on bigger, more important implementations instead of spending time and energy correcting the minor but necessary tasks of the business. Look out for more updates on RPA throughout 2018.
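A minimal flavor of what such a bot automates, with a made-up CSV export and an in-memory dictionary standing in for the real target application:

```python
# Minimal RPA-style sketch: a "bot" reads rows exported from one system and
# keys them into another, with validation replacing error-prone manual entry.
# The CSV data and the target "system" here are made up for illustration.
import csv, io

EXPORT = """invoice_id,amount
1001,250.00
1002,99.50
1003,not-a-number
"""

def enter_invoices(csv_text, system):
    """Validate each row and enter it; collect rejects instead of failing."""
    rejected = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        try:
            system[row["invoice_id"]] = float(row["amount"])
        except ValueError:
            rejected.append(row["invoice_id"])  # flag for human review
    return rejected

records = {}
bad = enter_invoices(EXPORT, records)
print(len(records), bad)   # two valid entries, one row flagged
```

Commercial RPA tools drive real application UIs rather than dictionaries, but the division of labor is the same: the bot handles the repetitive keying and flags only the exceptions for a human.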

Look for each of these trends to emerge as a focal point for AI in the coming year and have an impact on business modernization and digital transformation. Small businesses and enterprise companies alike will be adopting and embracing these intelligent trends, because the benefits will be so important that they will be unavoidable.

Open Data and Big Data Sharing

Enterprise companies, such as Amazon, Microsoft, Google and IBM, have been able to make the biggest strides in the AI space because they have access to enormous amounts of data. As businesses continue to accumulate and create massive amounts of data, there will be a need for data sharing unlike anything we have seen before. In the past, companies have kept data very close to the vest, with the exception of enterprise companies, but as the need to develop machine learning tools becomes more critical, companies will actively seek partnerships to share their data.

A number of enterprise companies have open-sourced specific data sets to help developers train machine learning applications. For example, Google opened up AudioSet, which “consists of an expanding ontology of 632 audio event classes and a collection of 2,084,320 human-labeled 10-second sound clips drawn from YouTube videos.” An AI developer could use this data set to train a machine learning application for audio recognition, helping it better understand human, animal, musical and everyday sounds.

Uber has opened data from more than 2 billion of its ride sharing trips to improve urban planning. Uber partners with cities to better understand people and transportation in a program it calls Uber Movement. According to Uber, “We’ve gotten consistent feedback from cities we partner with that access to our aggregated data will inform decisions about how to adapt existing infrastructure and invest in future solutions to make our cities more efficient.” All data is anonymous, but one can imagine the potential opportunities a city or business can pull out of that rich data set. As a resident of Chicago, I hope that Uber and the city planners can work together to build an AI tool to optimize street lights, because hitting every red light on the way to the office may be my demise.

Of course, not all businesses are willing to simply give away this data. As Uber states, it “partners” with cities, which one could speculate means the company gets something beneficial in return, while other businesses, like Yelp, have opened up data for academic purposes. The Yelp data set can be used “to teach students about databases, to learn NLP, or for sample production data while you learn how to make mobile apps.” If you are a hospitality management student trying to learn about restaurant trends or an aspiring developer, it could be extremely helpful to pull insights from Yelp’s data set, which spans 12 metropolitan areas and consists of 4.7 million reviews, 156,000 businesses and 200,000 pictures, among other data points.
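Working with a review data set of this kind often means processing JSON-lines files. The two sample records below are invented and the schema is simplified, but fields like a star rating and review text are the general idea:

```python
# Sketch of mining a review data set distributed as JSON lines. The two
# sample records are invented; a real data set's schema has many more
# fields, but a "stars" rating and review "text" capture the idea.
import json

SAMPLE = """\
{"business": "Cafe A", "stars": 5, "text": "Great espresso"}
{"business": "Cafe A", "stars": 3, "text": "Slow service"}
"""

def average_stars(jsonl_text):
    """Parse each line as one JSON record and average the star ratings."""
    ratings = [json.loads(line)["stars"] for line in jsonl_text.splitlines()]
    return sum(ratings) / len(ratings)

print(average_stars(SAMPLE))   # 4.0
```

From a few lines like these, a student can scale up to rating trends across cities, or feed the review text into an NLP model, which is precisely what open data sets make possible.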

In 2018 it will not just be these huge companies that are opening up their data, but software vendors and customers alike. More and more companies will begin to opt in to software AI tools, such as Salesforce’s Einstein, to better automate tasks for employees. Businesses will happily share their own private CRM data with Salesforce if it means they now have access to the data from the thousands of other customers utilizing the solution’s AI capabilities. This would create better lead scoring, provide automated prospecting tools based on what others have found successful and, ultimately, save sales employees time.

The examples are not limited to the CRM space; they are seemingly endless. The boom of data from the internet of things (IoT) will open up even more data-sharing opportunities for manufacturing and field service companies. Companies will conceivably be able to benchmark their machines’ performance and uptime against competitors within their industries by comparing IoT data. For these data-sharing opportunities to expand, there will be a greater emphasis on data security as well. All of these points will be critical to a company’s digital transformation.

In the coming year, other software companies will follow suit and allow their users to opt in to data sharing that will enhance business processes and offer growth opportunities that never before existed. It will also help pave the way for all software utilizing machine learning as the cornerstone of its solution.

Embedded AI

As more software companies discover ways to take advantage of their existing data sets, they will be able to strengthen their tools with embedded AI and make it the core of their products. Embedded AI is a blanket term for the use of machine and deep learning inside a software platform that improves aspects of an employee’s day-to-day. These machine learning advancements within the software may go unnoticed by most users, but they will be helpful for business strategy and operations, and relieve employees from mundane tasks with automation.

Over the past few years, many vendors have made large announcements notifying their customers, and their prospects, that they have added deep and machine learning to their current product offerings. Salesforce made Einstein the true focus of its major user conference, Dreamforce, back in 2016. That same year, when Microsoft transformed its products into the Dynamics 365 cloud suites, the company was sure to highlight the fact that AI was going to be a major part of the tool.

I’m not currently a user of either solution, but I do use Expensify for expense management, and when it added machine learning to its tool, all users were sent an email explaining how the updates would benefit them. While I know the AI capabilities are in the tool, I’ve never actually noticed them. That’s the point. For nearly all tools, users will not be aware that they are utilizing AI. But for the decision-makers in charge of purchasing tools, it will become a mandatory question when researching a software product and speaking with vendors: how does this product take advantage of AI to benefit my employees?

Vendors are aware of this, and most have been preparing for some time now, which is why Salesforce and Microsoft drove it home so heavily in their press releases. They have been looking toward the future and understand that, as businesses continue their digital transformations, AI will need to be embedded into all of their tools. Software vendors that have not started embedding AI into their products will certainly fall behind, and quickly. There is simply too much potential to improve their own products, and too many opportunities to help customers, not to make the conscious effort to make machine and deep learning the focal point of their products’ functionality.

Machine Learning as a Service (MLaaS)

In recent years, businesses have begun taking advantage of digital platforms and microservices to build their tech stacks (the holistic view of technology and software a company uses), utilizing the “as-a-service” model for everything from software to infrastructure. Businesses will lean into this strategy for AI, using the microservices from major enterprise vendors for “machine learning as a service.” Amazon Web Services, Google Cloud Platform and Microsoft Azure are a few of the digital platforms already providing this service to businesses.

Developers who understand how to build machine and deep learning applications are few and far between and, on top of that, expensive. Instead of trying to train algorithms with in-house resources, businesses can now purchase pre-built algorithms from these enterprise vendors and run their own company data through them, teaching the applications to do what is needed to better the business. Because of the data these vendors have access to, their machine learning tools are likely more advanced than anything that could be built in-house anyway, so why not save time, effort and budget by taking advantage of the available services? That is the question more and more companies will find themselves asking in 2018.

In the coming year, the algorithms and services provided by these large companies will only continue to expand and advance to the point where your business will be able to quickly and efficiently implement a natural language processing solution into your application or website. The major players will perfect these systems, so their AI offerings will rapidly consume data to be as effective as possible. As companies work through their digital transformations, these microservices will become the easiest and fastest way to progress rapidly with AI.


Artificial intelligence is possibly the most well-known aspect of digital transformation, due to constant media coverage and the fear that it will make humans obsolete, but in reality it is becoming a necessity for businesses. Whether companies are investing in software that uses embedded AI or deploying MLaaS offerings internally to automate processes, they should be taking advantage of AI to modernize. In 2018, CIOs and IT departments that have not yet adopted AI in some fashion will begin to feel the pressure, both externally and internally, to use the technology that is out there to improve traditional business functions.

Developers will continue to build important and useful machine learning tools with the help of open data. As companies begin to let go of the tight grip they have historically maintained around their proprietary data, more and more opportunities will present themselves — from data swap partnerships to AI enhancements — simply by sharing data. Each of these opportunities will help to modernize a business.