Federated Learning

Federated learning is a decentralized form of machine learning in which a shared model is trained on data that remains distributed across multiple edge devices.

Understanding federated learning

Traditional machine learning techniques require training data from edge devices to be aggregated and centralized in a data center or a single machine. A machine learning algorithm is then trained on this data, and the resulting model runs on a cloud server where it can be accessed by various applications.

However, this traditional approach raises privacy concerns. Since tech giants such as Amazon, Microsoft, and Google offer cloud-based AI solutions, sensitive user data must be sent to their servers, where the models are trained.

Federated learning is one way to remedy this issue. Born at the intersection of on-device AI and edge computing, the approach trains a centralized machine learning model on decentralized data.

How does federated learning work?

To understand how the process works, consider a smartphone. Federated learning enables smartphones to collaboratively learn a shared prediction model without the training data ever leaving each device. In other words, machine learning can take place without the need to store the data in the cloud.

Note that federated learning goes beyond the local models that already make predictions on smartphones, such as those built with the Mobile Vision API, because it enables model training to occur on the device as well.

When a smartphone downloads the current model, it improves the model with data from the phone and summarizes that improvement as a small update. Importantly, only the update is sent to the cloud, and it is encrypted and then averaged with updates from other users to improve the shared model.
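The round described above can be sketched in a few lines of Python. This is a minimal, illustrative toy (plain lists as model weights, invented function names, two simulated devices, and no encryption step), not an implementation from any federated learning library:

```python
# Toy sketch of federated averaging: each simulated device trains
# locally on its private data and shares only a weight update.

def local_update(global_weights, local_data, lr=0.1):
    """Simulate on-device training: take a few SGD steps on local data
    and return only the resulting weight delta, never the data itself."""
    weights = list(global_weights)
    for x, y in local_data:
        pred = sum(w * xi for w, xi in zip(weights, x))
        error = pred - y
        weights = [w - lr * error * xi for w, xi in zip(weights, x)]
    # Only this delta leaves the device.
    return [w - g for w, g in zip(weights, global_weights)]

def federated_average(global_weights, updates):
    """Server side: average per-device updates and apply them."""
    n = len(updates)
    avg = [sum(u[i] for u in updates) / n for i in range(len(global_weights))]
    return [g + d for g, d in zip(global_weights, avg)]

# Two devices with private samples of the target function y = 2*x1 + 1*x2.
device_a = [([1.0, 0.0], 2.0), ([0.0, 1.0], 1.0)]
device_b = [([1.0, 1.0], 3.0), ([2.0, 0.0], 4.0)]

model = [0.0, 0.0]
for _ in range(50):  # communication rounds
    updates = [local_update(model, d) for d in (device_a, device_b)]
    model = federated_average(model, updates)
```

After the loop, the shared model approaches the weights (2, 1) that fit both devices' data, even though the server only ever saw averaged deltas.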

The three types of federated learning

There are three main types of federated learning:

  1. Horizontal federated learning – where participants hold datasets with the same features but different samples, so the central model is trained on structurally similar datasets.
  2. Vertical federated learning – where datasets cover the same samples but have complementary features. For example, book and movie reviews from the same users can be combined to predict their music interests.
  3. Federated transfer learning – where a pre-trained model that performs one task is trained on a different dataset to perform another task. For example, banks could train an AI model to detect fraud and then repurpose it for other tasks.
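The horizontal/vertical distinction above can be made concrete with a small Python sketch. All record and field names below are invented for illustration:

```python
# Horizontal: two hospitals hold the SAME features (age, blood pressure)
# for DIFFERENT patients, so rows are partitioned across parties.
hospital_a = [{"age": 34, "bp": 120}, {"age": 51, "bp": 140}]
hospital_b = [{"age": 45, "bp": 130}]

# Vertical: a bank and a retailer hold DIFFERENT features for the
# SAME customers, so columns are partitioned across parties.
bank = {"alice": {"income": 52000}, "bob": {"income": 48000}}
retailer = {"alice": {"purchases": 14}, "bob": {"purchases": 3}}

# Horizontal FL trains one model over the union of rows; vertical FL
# conceptually joins on shared IDs to widen the feature set.
shared_ids = bank.keys() & retailer.keys()
joined = {uid: {**bank[uid], **retailer[uid]} for uid in shared_ids}
```

In a real vertical setup the join and training happen via privacy-preserving protocols rather than an in-memory merge; the dictionary join here only illustrates which party holds which part of the data.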

Benefits of federated learning

Since models are trained on the device, applications can continue to function even when the device has no internet access. Users who are on metered connections will also appreciate the ability of federated learning to save them bandwidth.

What’s more, in many cases, on-device inference is far more energy-efficient than constantly sending data to the cloud. Since training data remains on the device, it can also be used to train models to deliver a personalized experience. 

More detail is provided on this in the next section.

Federated learning and Gboard

Google is currently testing federated learning in Gboard on Android – otherwise known as the Google Keyboard. When Gboard shows a user a suggested query, the smartphone stores information on the current context and whether the query was clicked on.

Federated learning then processes a user’s search history and behavior on-device to deliver improvements the next time Gboard displays suggestions.

Key takeaways:

  • Federated learning is a decentralized form of machine learning in which a shared model is trained on data that remains distributed across multiple edge devices.
  • When a device downloads the current model, it improves the model with local data and summarizes that improvement as an update. Only the update is sent to the cloud, and it is encrypted and averaged with other users’ updates to improve the shared model.
  • Federated learning has several benefits. Since models are trained on the device, applications can continue to function even when the device has no internet access. There are also improvements in bandwidth usage and the ability to deliver personalized experiences on devices.

Connected AI Concepts

AGI

Generalized AI consists of devices or systems that can handle all sorts of tasks on their own. This line of research eventually led to the development of machine learning. As a subset of AI, Machine Learning (ML) uses computer algorithms to create programs that automate actions. Without explicitly programmed actions, systems can learn from data and improve the overall experience. ML explores large sets of data to find common patterns and formulate analytical models through learning.

Deep Learning vs. Machine Learning

Machine learning is a subset of artificial intelligence where algorithms parse data, learn from experience, and make better decisions in the future. Deep learning is a subset of machine learning where numerous algorithms are structured into layers to create artificial neural networks (ANNs). These networks can solve complex problems and allow the machine to train itself to perform a task.

DevOps

DevOps refers to a series of practices used to automate software development processes. It is a combination of the terms “development” and “operations” that emphasizes how functions integrate across IT teams. DevOps strategies promote seamless building, testing, and deployment of products. They aim to bridge the gap between development and operations teams to streamline development altogether.

AIOps

AIOps is the application of artificial intelligence to IT operations. It has become particularly useful for modern IT management in hybridized, distributed, and dynamic environments. AIOps has become a key operational component of modern digital-based organizations, built around software and algorithms.

Machine Learning Ops

Machine Learning Ops (MLOps) describes a suite of best practices that help a business run artificial intelligence successfully. It consists of the skills, workflows, and processes to create, run, and maintain machine learning models that support various operational processes within organizations.

OpenAI Organizational Structure

OpenAI is an artificial intelligence research laboratory that transitioned into a for-profit organization in 2019. The corporate structure is organized around two entities: OpenAI, Inc., a single-member Delaware LLC controlled by the OpenAI non-profit, and OpenAI LP, a capped for-profit organization. OpenAI LP is governed by the board of OpenAI, Inc. (the foundation), which acts as a General Partner. At the same time, Limited Partners comprise employees of the LP, some of the board members, and other investors like Reid Hoffman’s charitable foundation, Khosla Ventures, and Microsoft, the leading investor in the LP.

OpenAI Business Model

OpenAI has built the foundational layer of the AI industry. With large generative models like GPT-3 and DALL-E, OpenAI offers API access to businesses that want to develop applications on top of its foundational models while being able to plug these models into their products and customize them with proprietary data and additional AI features. On the other hand, OpenAI also released ChatGPT, which developed around a freemium model. Microsoft also commercializes OpenAI’s products through its commercial partnership.

OpenAI/Microsoft

OpenAI and Microsoft partnered up from a commercial standpoint. The history of the partnership started in 2016 and consolidated in 2019, with Microsoft investing a billion dollars into the partnership. It’s now taking a leap forward, with Microsoft in talks to put $10 billion into this partnership. Microsoft, through OpenAI, is developing its Azure AI Supercomputer while enhancing its Azure Enterprise Platform and integrating OpenAI’s models into its business and consumer products (GitHub, Office, Bing).

Stability AI Business Model

Stability AI is the entity behind Stable Diffusion. Stability makes money from its AI products and from providing AI consulting services to businesses. Stability AI monetizes Stable Diffusion via DreamStudio’s APIs, while also releasing it open source for anyone to download and use. Stability AI also makes money via enterprise services, where its core development team offers enterprise customers the chance to serve, scale, and customize Stable Diffusion or other large generative models to their needs.
