
AI Economy: How Do You Make Money With Machine Learning?

The AI Ecosystem has generated a multi-billion dollar industry, and it all starts from data. Going upward in the value chain, there are the chips (GPUs) that allow the processing of Big Data (a dominant player is NVIDIA).

That Big Data needs to be stored on platforms and infrastructure that most SMEs can’t afford to build. That is where players like Google Cloud, Amazon AWS, IBM Cloud, and Microsoft Azure come to the rescue.

At a large scale, a few corporations control the enterprise AI market, while nations like China, the USA, Japan, Germany, the UK, and France have bet heavily on it.

Yet, with the development of new AI companies, like OpenAI, the ecosystem has completely changed, and it’s now developing around three layers of AI. 

 

Understanding the AI ecosystem

ai-business-ecosystem

Beyond the buzz and hype that come with any new term entering the mainstream, AI is another discipline that has become critical to today’s economic landscape.

Far from being at an embryonic stage, the AI Ecosystem has become a multi-billion dollar enterprise, led by tech giants that go from IBM to Google, Microsoft, Amazon, and many others.

That doesn’t mean there is no opportunity for new entrants. Quite the opposite.

The AI ecosystem revolves around a few key elements that can also be thought of as the “Toolbox for AI:”

  • Data or Big Data.
  • Infrastructure.
  • Algorithms.

Let’s look more in depth at each of those key elements of an AI ecosystem. But before diving in, we need to understand who is making money with AI, and how.

Who is making money with AI?

Billions of dollars have been invested in the AI ecosystem, especially by large tech companies.

This is good news: those tech companies have created an ecosystem that is out there, ready to be understood, so that you can build your own company on top of it.

Indeed, understanding how this ecosystem works is the first step toward making money out of it.

And it all starts with data!

It all starts with data

Keep in mind that the whole point of AI is to handle and actually be able to do something useful with a massive amount of data.

In short, even though we like to talk about AI and machine learning as if they were ends in themselves, in reality the foundation of those technologies is data.

A curated data pipeline is the foundation for an AI ecosystem to work in the first place.

Companies like Google, Wolfram Alpha, Amazon, and many others spend billions on maintaining and curating their data. If anything, we can argue that for a company like Google, data is its main asset.

As already explained in the Blockchain Economy, in today’s economic world, built on digitalization, the rule is to keep that data proprietary. That makes sense, as this data is what eventually gets monetized through several strategies.


Let’s look at a couple of opposite examples of how data gets monetized:

  • Google data-freemium strategy: Google uses its proprietary data (collected from billions of user searches each day) to sell advertising.
  • Apple data-reversed-razor strategy: iPhones know a great deal about you, but Apple doesn’t share that data with marketers. Instead, it monetizes it by selling expensive devices (the iPhone is the primary one).

When Data reaches a critical mass, we can call it Big Data. There is no single definition of Big Data, and it might actually vary throughout the years.

The more the AI industry grows, the cheaper data collection and processing will become.

This, in turn, will allow the management of a larger and larger amount of data.

For the sake of this discussion, and as of the time of this writing, a petabyte is understood as the first unit of Big Data:

what-is-a-petabyte

Source: searchstorage.techtarget.com
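
As a quick back-of-the-envelope, here is a tiny Python snippet (assuming decimal, base-1000 prefixes; the binary convention uses 1,024) that puts a petabyte into more familiar units:

```python
# Rough sense of scale for "Big Data" units, assuming decimal (base-1000) prefixes.
one_pb = 1000 ** 5  # bytes in one petabyte

print(f"1 PB = {one_pb:,} bytes")
print(f"1 PB = {one_pb // 1000**4:,} TB = {one_pb // 1000**3:,} GB")
```

In other words, a petabyte is a thousand terabytes, or a million gigabytes.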

Chips: from CPU to GPU

nvidia-business-model
NVIDIA is a GPU design company, which develops and sells enterprise chips for industries spanning gaming, data centers, professional visualization, and autonomous driving. NVIDIA serves major corporations as enterprise customers, and it uses a platform strategy where it combines its hardware with software tools to enhance its GPUs’ capabilities.

In the past, you could handle most computational tasks with a simple CPU, until computers had to process much more substantial amounts of data. This is where the GPU came to the rescue.

A GPU or graphics processing unit is an electronic circuit able to manipulate a massive amount of data. The interesting part is that GPUs were designed for gaming. 

Yet, their ability to process games with very heavy graphics made these chips extremely useful for AI as well. 

Unlike a traditional CPU, a GPU can process large blocks of data in parallel, which is what makes it well suited for AI systems.
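
To illustrate the idea without a GPU at hand, here is a small Python sketch: the same arithmetic expressed as one bulk array operation (the style of workload a GPU spreads across thousands of cores, and that libraries like NumPy vectorize even on a CPU) versus an element-by-element Python loop. It is only an analogy, not a GPU benchmark:

```python
import time
import numpy as np

# The same arithmetic done two ways: one element at a time vs. one bulk
# array operation over the whole block of data.
x = np.random.rand(2_000_000)

start = time.perf_counter()
loop_result = sum(v * 2.0 for v in x)   # sequential, element by element
print("loop:      ", time.perf_counter() - start, "seconds")

start = time.perf_counter()
vector_result = (x * 2.0).sum()         # one bulk operation over the array
print("vectorized:", time.perf_counter() - start, "seconds")
```

Frameworks like TensorFlow and PyTorch express deep learning as exactly this kind of bulk array math, which is why dispatching it to a GPU pays off.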

In this space, NVIDIA is the critical player.

As pointed out in its Annual Report, already back in 2018:

Starting with a focus on PC graphics, NVIDIA invented the graphics processing unit, or GPU, to solve some of the most complex problems in computer science. We have extended our focus in recent years to the revolutionary field of artificial intelligence, or AI. Fueled by the sustained demand for better 3D graphics and the scale of the gaming market, NVIDIA has evolved the GPU into a computer brain at the intersection of virtual reality, or VR, high performance computing, or HPC, and AI.

And it continued:

Its parallel processing capabilities, supported by up to thousands of computing cores, are essential to running deep learning algorithms. This form of AI, in which software writes itself by learning from data, can serve as the brain of computers, robots and self-driving cars that can perceive and understand the world.

As of January 2018, NVIDIA had already recorded almost $10 billion in revenues, and over $8 billion came from GPU sales alone!

NVIDIA-revenues-GPU

By 2021 NVIDIA’s business model was ready to move beyond gaming.

Indeed, NVIDIA also developed a chip for companies like Tesla, which use it for self-driving. 

The NVIDIA V100 is used by Tesla (image source: NVIDIA corporate website).

Large tech companies, like IBM and Google, have been investing massive resources to get their GPU chips to process Big Data.

And just like NVIDIA, other players followed suit to position themselves as AI chipmakers. 

ai-chip-makers-revenues

Intel is also massively invested in AI chips, which are among its priorities.

intel-priorities

And below is how each of its chip products is used across various AI-powered industries.

intel-autonomous-driving-chip

Qualcomm also provides a stack of chips for various use cases.

qualcomm-products-by-applications

In general, tech giants have now brought chip design in-house.

One example is Apple, which finally started to design its own chips for both its phones and computers.

Google has followed suit by designing its own chips for the new generations of Pixel phones.

The new chip, designed in-house for the first time, was built as a premium system on a chip (SoC).

Source: Google

This chip architecture, Google claims, enables it to further power up its devices with machine learning, for example with live translation from one language to another.

Why are companies investing again in chips?

With the rise of AI, and the push to make everything smart, we live at the intersection of various new industries that are spurring the AI revolution (5G, IoT, machine learning models, libraries, and chip architectures).

As such, chip-making has once again become a core strategic asset for companies that make consumer hardware.

This isn’t meant to be a complete overview of the AI chip industry. Rather, it’s meant to show you why chips have once again become such a core strategic asset.

Algorithms and infrastructures: the Amazon/Google/Microsoft cloud war

To store a massive amount of data you need infrastructure, which is tough to build if you are a small or even a medium-sized business.

Therefore, you’ll need a third party able to store that data for you.

This has led to the cloud war between Amazon AWS, Google Cloud Platform, Microsoft Azure, and IBM Cloud. Amazon, Google, and Microsoft are the dominant players.

Google, in particular, is using a smart business strategy, which in a way is representative of how Google operates.

Indeed, if you work in programming, or operate in the machine learning field, you’re aware of TensorFlow, an open-source machine learning library.

Google leverages the open-source model as it allows anyone to use this library, which makes it better over time. 

But it also creates the need for larger and larger amounts of data to be stored. And guess what? Google has a product for that: Google Cloud Platform. 

Therefore, if you’re a programmer using TensorFlow, and you need a platform to store and process that data, chances are you’ll use Google Cloud infrastructure.
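
For a flavor of what this looks like in practice, here is a minimal TensorFlow 2.x sketch (assuming the tensorflow and numpy packages are installed) that fits a tiny linear model to synthetic data; at enterprise scale, the same workflow typically runs on managed cloud infrastructure like the platforms described above:

```python
import numpy as np
import tensorflow as tf

# Toy example: learn y ≈ 2x + 1 from noisy synthetic data.
x = np.linspace(-1.0, 1.0, 200).reshape(-1, 1).astype("float32")
y = (2.0 * x + 1.0 + np.random.normal(scale=0.1, size=x.shape)).astype("float32")

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(1,))])
model.compile(optimizer="sgd", loss="mse")
model.fit(x, y, epochs=100, verbose=0)

print(model.predict(np.array([[0.5]], dtype="float32")))  # ≈ 2 * 0.5 + 1 = 2.0
```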

The underlying cloud infrastructures offered by these tech giants have created a whole new industry of cloud business models, made of three primary models, IaaS, PaaS, and SaaS, and a few others in between (FaaS, DaaS). 

cloud-business-models
Cloud business models are all built on top of cloud computing, a concept that took over around 2006 when former Google CEO Eric Schmidt mentioned it. Most cloud-based business models can be classified as IaaS (Infrastructure as a Service), PaaS (Platform as a Service), or SaaS (Software as a Service). While those models are primarily monetized via subscriptions, they can also be monetized via pay-as-you-go revenue models and hybrid models (subscriptions + pay-as-you-go).

Enterprise, corporates, and nations

Both the enterprise and corporate AI industries are dominated by huge players that over the years have built massive infrastructure for large enterprise clients (take Salesforce and Oracle in the customer management industry).

At the same time, nations are investing in AI to generate long-lasting economic growth. China, the USA, Japan, France, the UK, and Germany are all investing in AI.

Let’s look instead at how you can make money with AI.

How do you make money with AI?

As Kevin Kelly mentioned in his book “The Inevitable”: 

The business plans of the next 10,000 startups are easy to forecast: Take X and add AI. Find something that can be made better by adding online smartness to it. 

There are a few ways to start building a business with AI:

  • Start a Startup
  • Contract Work 
  • Job or Internship
  • Write a Book
  • Educational Content
  • Automated Trading Bot
  • Competitions

And a few core pricing models: 

Subscription/retainer

Here the AI company builds a custom model through a pilot (either free or charged at a flat fee).

Once the model is ready, it goes through a transitional phase, where it starts to run as a subscription/retainer (priced on the volume handled by the model).

From there, once ready for scale, the price carries a base subscription which, beyond a certain volume, ramps up.

The subscription has to have a threshold, after which the volume the model is able to handle might be unlimited (this happens in a very advanced phase which, depending on the service, might take at least 18-24 months to develop). 
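
To make the mechanics concrete, here is a minimal Python sketch of such a tiered subscription; the base fee, included volume, and overage rate are hypothetical numbers, not quotes from any real contract:

```python
def monthly_fee(units_processed: int,
                base_fee: float = 2_000.0,     # hypothetical base subscription
                included_units: int = 10_000,  # volume covered by the base fee
                overage_rate: float = 0.05) -> float:
    """Illustrative tiered subscription: base fee plus a per-unit charge
    beyond the included volume. All numbers are made up for the example."""
    overage = max(0, units_processed - included_units)
    return base_fee + overage * overage_rate

print(monthly_fee(8_000))    # 2000.0 (under the threshold)
print(monthly_fee(50_000))   # 2000 + 40000 * 0.05 = 4000.0
```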

Pay-as-you-go

These comprise services from AI companies that provide standard libraries of machine learning models; the company that picks one pays based on consumption. 

Hybrid pricing

These comprise AI models which are standard but can be customized to a certain extent. In that case, a subscription model combined with a pay-as-you-go model might do. 

A cut of the saving

In the case of a company that uses AI models to improve campaign performance, the company might introduce the customer to these models with a free or flat-fee pilot and, once this proves successful, only charge a percentage of the savings.

This model might be more effective at reducing friction and the acquisition cost of the enterprise customer. 
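
A minimal Python sketch of the same idea, with made-up numbers: the vendor only gets paid a share of the measured reduction in spend versus the customer’s baseline.

```python
def savings_fee(baseline_cost: float, optimized_cost: float,
                cut: float = 0.20) -> float:
    """Illustrative 'cut of the savings' pricing: the vendor earns a share
    of the measured reduction in spend (all numbers are hypothetical)."""
    savings = max(0.0, baseline_cost - optimized_cost)
    return savings * cut

# E.g. a campaign that used to cost $100k now costs $80k after optimization:
print(savings_fee(100_000, 80_000))  # 4000.0 -> 20% of the $20k saved
```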

What AI can and can’t do

Based on my experience over the last three years refining enterprise services in the AI marketing space, these are the applications I’ve seen succeed: 

Humans in the loop

One critical element is that, for now, the human must be in the loop.

Whatever enterprise application you choose, it’s critical to have a team of experienced people who understand how to handle these models at scale.

When building AI models, custom models (built for the company’s specific tasks) might become more and more relevant over time.

However, it’s also critical to have the human in the loop, both at the input and output levels.

On the input side, it’s critical to have people spend time enriching the data available to the model to improve it (imagine you want a language model that describes your products: the more information you give it about those products’ details and features, the better the model).

On the output side, as the model generates things that go off track, those must be corrected and audited, which connects to the next point. 

Model auditing 

Model auditing, I argue, will become a larger business than modeling itself!

In my opinion, as those machine learning models grow in popularity, spurring the next trillion-dollar industry, there will be a core issue with them.

As those models mostly leverage deep learning, you know the input and the output, but what happens in the middle is a black box. In short, there is no simple way to audit the model to know how it got from input to output.

In that scenario, being able to devise smart ways to audit machine learning models will be extremely valuable. Indeed, if you asked me where I’d start an AI company, I’d tell you right away that I’d start by building a toolbox to audit machine learning models.

Why? Because any company using AI services will need to be able to audit these models, and there is no established way to do it now. It’s still the Wild West! 
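
As a flavor of what such a toolbox might contain, here is a minimal, hypothetical Python sketch of a black-box sensitivity check: it treats the model as an opaque predict function, perturbs one input feature at a time, and measures how much the predictions move. It is only one of many possible starting points (holdout evaluations, logged input/output reviews, and attribution methods are others):

```python
import numpy as np

def sensitivity_audit(predict, X, noise=0.05):
    """Black-box sensitivity check: perturb one feature at a time and measure
    how much predictions move. `predict` maps an array of rows to scores,
    so the model itself stays a black box."""
    baseline = np.asarray(predict(X), dtype=float)
    impact = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        X_perturbed = X.copy()
        X_perturbed[:, j] = X_perturbed[:, j] + noise * X[:, j].std()
        shifted = np.asarray(predict(X_perturbed), dtype=float)
        impact[j] = np.abs(shifted - baseline).mean()
    return impact  # higher values = features the model leans on more heavily

# Usage (hypothetical): impacts = sensitivity_audit(model.predict, X_sample)
```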

Commoditizable content is the starting point

If you want to start leveraging these machine learning models, you might want to start by looking at the part of the business which has the potential to be commoditized.

For instance, say you have an e-commerce store with thousands or millions of product descriptions. Language models can be pretty effective at generating them at scale.

In fact, the advantage here is that the language model can dynamically change the product description, also based on users’ seasonal searches. This is an application I’ve seen used very effectively.

However, it’s one thing to give the machine the ability to change a bit of text on a product page; it’s another to have the machine rewrite the whole page.

In short, you want to start from a small section which can be controlled, easily measured, iterated on, and corrected, and scale from there. 
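
A minimal Python sketch of that “small and controlled” starting point: generate only a short, bounded snippet per product and keep a human approval step before anything goes live. The call_language_model function below is a placeholder for whichever text-generation API you actually use:

```python
# Generate one short blurb per product and queue it for human review.
def call_language_model(prompt: str) -> str:
    # Placeholder standing in for a real text-generation API call.
    return f"[model draft for: {prompt.splitlines()[1]}]"

def draft_product_blurb(product: dict) -> str:
    prompt = (
        "Write one short sentence describing this product for an online store.\n"
        f"Name: {product['name']}\n"
        f"Features: {', '.join(product['features'])}"
    )
    return call_language_model(prompt)

review_queue = []
for product in [{"name": "Trail Backpack 30L", "features": ["waterproof", "ultralight"]}]:
    review_queue.append({
        "product": product["name"],
        "draft": draft_product_blurb(product),
        "approved": False,  # a human reviewer flips this before publishing
    })
print(review_queue)
```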

Scaling up models

One of the most difficult parts of using machine learning models at an enterprise level will be to scale those models reliably.

For instance, having the machine generate text for 100 product pages is completely different from generating it for a thousand, or ten thousand.

At each level of scale, the complexity the machine handles grows quickly, and the chance that a few product descriptions go way off becomes a real possibility. 
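
One way to keep that risk in check, sketched below in Python with hypothetical thresholds: run cheap automated guardrails on every generated description and route a random sample to human reviewers for spot-checking.

```python
import random

def basic_checks(text: str, max_chars: int = 400) -> list:
    """Cheap automated guardrails to run on every generated description."""
    issues = []
    if not text.strip():
        issues.append("empty output")
    if len(text) > max_chars:
        issues.append("too long")
    for banned in ("guaranteed results", "100% cure"):  # hypothetical claim blocklist
        if banned in text.lower():
            issues.append(f"banned claim: {banned}")
    return issues

def qa_sample(product_ids: list, rate: float = 0.05) -> list:
    """Pick a random slice of generated pages for human spot-checking."""
    k = max(1, int(len(product_ids) * rate))
    return random.sample(product_ids, k)

# Usage: run basic_checks on every output, then send qa_sample(ids) to reviewers.
```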

Campaign optimizations

Another interesting application of deep learning models is in the realm of paid campaign optimizations, where the machine can work in two ways.

First, the machine takes unstructured data and makes it into structured data. Imagine if you’re spending millions on Facebook ad campaigns.

Those are mostly handled by a performance manager. However organized that person might be, when the budget gets very large it can become very complex to understand which campaigns are performing well.

That’s because those campaigns might lack proper labeling. In short, the trivial but time-consuming task of labeling these campaigns (like organizing them into clusters that make sense) becomes very hard.

Second, handling a large number of campaigns might also slow down the experimentation process. Indeed, successful campaigns need to be continuously tweaked, changed, and re-tested over time to keep the ROI on these campaigns stable.

Deep learning, with neural nets, might be very good at that: labeling, adjusting, testing, and iterating these campaigns at scale. 
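
As a toy illustration of the “unstructured to structured” step, here is a Python sketch (assuming scikit-learn is installed) that clusters made-up campaign names so a performance manager can review and label the groups; a real setup would also use creative assets and performance metrics, not just names.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Turn messy, unlabeled campaign names into rough clusters for review.
campaign_names = [
    "US_prospecting_lookalike_2pct_video",
    "US_prospecting_lookalike_5pct_static",
    "EU_retargeting_cart_abandoners_7d",
    "EU_retargeting_viewers_30d",
]

features = TfidfVectorizer(token_pattern=r"[a-zA-Z]+").fit_transform(campaign_names)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

for name, label in zip(campaign_names, labels):
    print(label, name)  # prospecting vs. retargeting campaigns roughly separate
```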

Read Next: History of OpenAI, AI Business Models, AI Economy.

Connected Business Model Analyses

AI Paradigm

current-AI-paradigm

Pre-Training

pre-training

Large Language Models

large-language-models-llms
Large language models (LLMs) are AI tools that can read, summarize, and translate text. This enables them to predict words and craft sentences that reflect how humans write and speak.

Generative Models

generative-models

Prompt Engineering

prompt-engineering
Prompt engineering is a natural language processing (NLP) concept that involves discovering inputs that yield desirable or useful results. Like most processes, the quality of the inputs determines the quality of the outputs in prompt engineering. Designing effective prompts increases the likelihood that the model will return a response that is both favorable and contextual. Developed by OpenAI, the CLIP (Contrastive Language-Image Pre-training) model, trained on over 400 million image-caption pairs, is an example of a model that uses text prompts to classify images.

Layers of AI

ai-business-ecosystem

OpenAI Business Model

how-does-openai-make-money
OpenAI has built the foundational layer of the AI industry. With large generative models like GPT-3 and DALL-E, OpenAI offers API access to businesses that want to develop applications on top of its foundational models, while being able to plug these models into their products and customize them with proprietary data and additional AI features. On the other hand, OpenAI also released ChatGPT, which developed around a freemium model. Microsoft also commercializes OpenAI’s products through its commercial partnership.

OpenAI/Microsoft

openai-microsoft
OpenAI and Microsoft partnered up from a commercial standpoint. The history of the partnership started in 2016 and consolidated in 2019, with Microsoft investing a billion dollars into the partnership. It’s now taking a leap forward, with Microsoft in talks to put $10 billion into this partnership. Microsoft, through OpenAI, is developing its Azure AI Supercomputer while enhancing its Azure Enterprise Platform and integrating OpenAI’s models into its business and consumer products (GitHub, Office, Bing).

Stability AI Business Model

how-does-stability-ai-make-money
Stability AI is the entity behind Stable Diffusion. Stability AI makes money from its AI products and from providing AI consulting services to businesses. Stability AI monetizes Stable Diffusion via DreamStudio’s APIs, while also releasing it open-source for anyone to download and use. Stability AI also makes money via enterprise services, where its core development team offers enterprise customers the chance to service, scale, and customize Stable Diffusion or other large generative models to their needs.

Stability AI Ecosystem

stability-ai-ecosystem

Connected Concepts

DevOps

devops-engineering
DevOps refers to a series of practices used to automate software development processes. It is a conjugation of the terms “development” and “operations” to emphasize how functions integrate across IT teams. DevOps strategies promote seamless building, testing, and deployment of products. It aims to bridge the gap between development and operations teams to streamline development altogether.

DevSecOps

devsecops
DevSecOps is a set of disciplines combining development, security, and operations. It is a philosophy that helps software development businesses deliver innovative products quickly without sacrificing security. This allows potential security issues to be identified during the development process rather than after the product has been released, in line with the emergence of continuous software development practices.

Continuous Intelligence

continuous-intelligence-business-model
Business intelligence models have transitioned to continuous intelligence, where dynamic technology infrastructure is coupled with continuous deployment and delivery. In short, the software offered in the cloud integrates with the company’s data, leveraging AI/ML to provide answers in real time to current issues the organization might be experiencing.

Continuous Integration

continuous-integrationcontinuous-deployment
Continuous Integration/Continuous Deployment (CI/CD) introduces automation into the stages of app development to frequently deliver to customers. CI/CD introduces continuous automation and monitoring throughout the app lifecycle, from testing to delivery and then deployment.

MLOps

mlops
Machine Learning Ops (MLOps) describes a suite of best practices that successfully help a business run artificial intelligence. It consists of the skills, workflows, and processes to create, run, and maintain machine learning models to help various operational processes within organizations.

RevOps

revops
RevOps – short for Revenue Operations – is a framework that aims to maximize the revenue potential of an organization. RevOps seeks to align the revenue-generating departments (such as sales, marketing, and customer success) by giving them access to the same data and tools. With shared information, each then understands their role in the sales funnel and can work collaboratively to increase revenue.

AIOps

aiops
AIOps is the application of artificial intelligence to IT operations. It has become particularly useful for modern IT management in hybridized, distributed, and dynamic environments. AIOps has become a key operational component of modern digital-based organizations, built around software and algorithms.

Ad-Ops

ad-ops
Ad Ops – also known as Digital Ad Operations – refers to systems and processes that support digital advertisements’ delivery and management. The concept describes any process that helps a marketing team manage, run, or optimize ad campaigns, making them an integral part of the business operations.

