
AI Supply Chains: The Consumer Is The New Farmer

An AI supply chain starts with the sourcing of data, which is produced by consumers. As this data gets stored on hardware, it goes through a first refinement process via software. Then it’s further refined, and repackaged by algorithms, and stored in data centers, which work as the fulfillment centers.

Data supply chains

Google’s cloud is an interesting example of how information flips the supply chain upside down.

As pointed out in vertical integration, the physical supply chain gets flipped upside down when we look at it from a data/information standpoint.

A classic supply chain moves from upstream to downstream, where the raw material is transformed into products, moved through logistics and distributed to final customers. A data supply chain moves in the opposite direction. The raw data is “sourced” from the customer/user. As it moves downstream, it gets processed and refined by proprietary algorithms and stored in data centers.

To truly appreciate those differences, let’s start with one of the most interesting physical supply chains: coffee.

From coffee beans to data farms

The coffee supply chain is among the most interesting, as it’s quite complex, going through a cycle of growing beans, harvesting, drying, packing, bulking, blending, roasting, labeling, packaging, distributing, and selling.

Throughout this process, many players are involved globally at each stage. In the growing and harvesting of coffee beans, for instance, countries like Brazil and Colombia play a leading role.

Yet, as the bean is refined, it passes through several parts of the chain, and only toward the end come roasting, labeling, packaging, and distribution.

Most of the economic value of this supply chain is skewed toward the end. As we get closer to roasting, labeling, distributing, and retailing, most of the economic output gets captured.

In other words, of the overall price customers pay in the shop for their nice double espresso, most of the revenue goes toward covering the expenses of running the shop (rent, staff, taxes) and toward profits.

To gain a bit of context, the Financial Times estimated that, of a £2.50 cup of coffee, about 25 pence goes to the shop as profit, only 10 pence goes to the overall coffee chain, and roughly a penny goes to the farmer.
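To make the skew concrete, here is a minimal sketch of that split, using the approximate figures from the FT estimate above (the stage labels are illustrative simplifications of the full chain):

```python
# Approximate split of a £2.50 cup of coffee, per the FT estimate cited above.
# Stage labels are illustrative; the real chain has many more players.
price = 2.50
shares = {
    "shop profit": 0.25,
    "coffee chain (growing to roasting)": 0.10,
    "farmer": 0.01,
}

for stage, amount in shares.items():
    print(f"{stage}: £{amount:.2f} ({amount / price:.1%} of the retail price)")
```

Run as-is, this shows the farmer capturing well under 1% of the retail price, which is the point the analogy below builds on.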

If we apply this analogy to the world of AI supply chains, the farmer is no longer on a plantation in South America, but anywhere in the world, tapping a finger on a 4.5-inch smartphone screen.

Raw materials are sourced by consumers

In an AI supply chain, it all starts with consumers. They are the growers of the raw material (raw data) that serves as the basis for the whole supply chain.

It’s worth pointing out that, just as in the coffee supply chain the farmers who grow the beans capture the least of the economic value, in an AI supply chain consumers are like farmers: they too gain the least economic value from the overall supply chain.

Hardware devices become the harvesting facilities

The raw data gets gathered, harvested, and collected through physical devices, the objects most proximate to the consumer.

As the raw data becomes available, hardware becomes the harvesting facility in the AI supply chain.

Software and operating systems become the harvesting machines

Software, operating systems, and everything else that sits between the physical device and the company’s algorithms become the harvesting machines, ready to sort the data that will then go through the industrial machinery of the AI supply chain.

Algorithms are the industrial machinery for data

As this data, partly filtered on the software side, goes through a process of industrial refinement, algorithms play a key role in refining, processing, and packaging it for several purposes.

In that sense, the data moves in two directions. On the one hand, it will move toward consumers to improve the services they get for free. On the other hand, it will move into the proprietary technology stack of the company, ingrained in its monetization machinery to generate profits.

Before it can move in those two directions, though, it needs to be stored within the company’s data centers.
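The two-way flow described above can be sketched in a few lines. Everything here is hypothetical and purely for intuition; the function names, the "refinement" step, and the example queries are not any company's actual pipeline or API:

```python
# Illustrative sketch of the two directions refined data moves in.
# All names and steps are hypothetical, for intuition only.

def refine(raw_events):
    """Stand-in for the algorithmic refinement stage: clean and filter raw data."""
    return [e.lower().strip() for e in raw_events if e.strip()]

def to_free_services(refined):
    # Direction 1: refined data flows back to consumers as improved free services.
    return {"personalized_results": refined}

def to_monetization(refined):
    # Direction 2: the same refined data feeds the monetization machinery
    # (e.g., deduplicated into targeting segments).
    return {"targeting_segments": sorted(set(refined))}

raw = ["  Coffee Shops Near Me ", "best espresso", "", "best espresso"]
refined = refine(raw)
print(to_free_services(refined))
print(to_monetization(refined))
```

The design point the sketch captures is that one refinement stage feeds both outputs: the consumer-facing service and the monetization side consume the same refined data.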

Data centers as fulfillment facilities

As the data goes through the data centers, it gets stored and moves in many directions: back to consumers in the form of free services, and toward the monetization machinery, where the data, processed, refined, and continuously reprocessed, becomes the core service the company offers on the market.

As Google highlights: “Our data centers keep all of Google’s products and services up and running around the clock and around the world. Whenever you access Gmail, edit your documents, or search for information on Google, you’re using one of our data centers and have the power of a supercomputer at your fingertips.”

Key takeaways

  • The consumer, as the farmer, sources the raw data and gets back only a fraction of the overall economic value.
  • Most of the economic value goes toward data centers, power sourcing, organizational costs, and profits.
  • Algorithms work in two directions: by refining data to offer free services to consumers, and by creating the conditions for the monetization machinery to work at its best.
  • Hardware, as the most proximate thing to the consumer, becomes the harvesting facility.
  • Data centers become the fulfillment facilities, moving refined and repackaged data in two directions: toward consumers in the form of free services, and toward businesses in the form of advertising or premium services.

Is this a permanent design for AI supply chains? Not necessarily. Yet it has become the predominant design for the new dominant media companies (Google, Facebook).

Read next: 

Connected Business Phenomena

Vertical vs. Horizontal Integration

Horizontal integration refers to the process of increasing market shares or expanding by integrating at the same level of the supply chain, and within the same industry. Vertical integration happens when a company takes control of more parts of the supply chain, thus covering more parts of it.


Backward Chaining

Backward chaining, also called backward integration, describes a process where a company expands to fulfill roles previously held by other businesses further up the supply chain. It is a form of vertical integration where a company owns or controls its suppliers, distributors, or retail locations.

Decoupling

According to the book, Unlocking The Value Chain, Harvard professor Thales Teixeira identified three waves of disruption (unbundling, disintermediation, and decoupling). Decoupling is the third wave (2006-still ongoing) where companies break apart the customer value chain to deliver part of the value, without bearing the costs to sustain the whole value chain.

Entry Strategies

When entering the market, as a startup you can use different approaches. Some of them can be based on the product, distribution, or value. A product approach takes existing alternatives and it offers only the most valuable part of that product. A distribution approach cuts out intermediaries from the market. A value approach offers only the most valuable part of the experience.

Disintermediation

Disintermediation is the process by which intermediaries are removed from the supply chain. By cutting out the middlemen, the market becomes overall more accessible and transparent to final customers. In theory, therefore, the supply chain gets more efficient and can produce the products customers actually want.

Reintermediation

Reintermediation is the process of reintroducing an intermediary that had previously been cut out of the supply chain, or of creating a new intermediary that once didn’t exist. Usually, as a market is redefined, old players get cut out, and new players within the supply chain are born as a result.

Scientific Management

Scientific Management Theory was created by Frederick Winslow Taylor in 1911 as a means of encouraging industrial companies to switch to mass production. With a background in mechanical engineering, he applied engineering principles to workplace productivity on the factory floor. Scientific Management Theory seeks to find the most efficient way of performing a job in the workplace.

Poka-Yoke

Poka-yoke is a Japanese quality control technique developed by former Toyota engineer Shigeo Shingo. Translated as “mistake-proofing”, poka-yoke aims to prevent defects in the manufacturing process that are the result of human error. Poka-yoke is a lean manufacturing technique that ensures that the right conditions exist before a step in the process is executed. This makes it a preventative form of quality control, since error conditions are detected and rectified before defects occur.

Gemba Walk

A Gemba Walk is a fundamental component of lean management. It describes the personal observation of work to learn more about it. Gemba is a Japanese word that loosely translates as “the real place”, or in business, “the place where value is created”. The Gemba Walk as a concept was created by Taiichi Ohno, the father of the Toyota Production System of lean manufacturing. Ohno wanted to encourage management executives to leave their offices and see where the real work happened. This, he hoped, would build relationships between employees with vastly different skillsets and build trust.

Dual Track Agile

Product discovery is a critical part of agile methodologies, as its aim is to ensure that products customers love are built. Product discovery involves learning through a raft of methods, including design thinking, lean start-up, and A/B testing to name a few. Dual Track Agile is an agile methodology containing two separate tracks: the “discovery” track and the “delivery” track.

Scaled Agile

Scaled Agile Lean Development (ScALeD) helps businesses discover a balanced approach to agile transition and scaling questions. The ScALeD approach helps businesses successfully respond to change. Inspired by a combination of lean and agile values, ScALeD is practitioner-based and can be completed through various agile frameworks and practices.

Kanban Framework

Kanban is a lean manufacturing framework first developed by Toyota in the late 1940s. The Kanban framework is a means of visualizing work as it moves through a process, identifying potential bottlenecks. It supports just-in-time (JIT) manufacturing to optimize engineering processes, speed up product manufacturing, and improve the go-to-market strategy.

Toyota Production System

The Toyota Production System (TPS) is an early form of lean manufacturing created by auto-manufacturer Toyota. Created by the Toyota Motor Corporation in the 1940s and 50s, the Toyota Production System seeks to manufacture the vehicles customers order as quickly and efficiently as possible.

Six Sigma

Six Sigma is a data-driven approach and methodology for eliminating errors or defects in a product, service, or process. Six Sigma was developed by Motorola as a management approach based on quality fundamentals in the early 1980s. A decade later, it was popularized by General Electric who estimated that the methodology saved them $12 billion in the first five years of operation.