Who is Jakob Uszkoreit?

Jakob Uszkoreit is a machine learning (ML) and natural language processing (NLP) researcher and computer scientist with a particular focus on language translation. He is also a co-author of the seminal paper on the transformer architecture, “Attention Is All You Need.”

Education and early career

Uszkoreit earned his Master’s degree in computer science and mathematics from Technische Universität Berlin in 2007. To support himself as a student, he worked as a freelance software developer for a production company. 

Uszkoreit interned at Google Research in 2006 and 2007. According to his LinkedIn profile, he worked on distributed “clustering algorithms for applications in machine translation, language modeling and other natural language understanding systems.”

Before joining Google full-time, he also worked at Acrolinx as a software engineer for around a year. Acrolinx is a provider of enterprise content governance for human- and AI-generated content.

Google Brain

Uszkoreit joined Google’s Berlin office in March 2008 and worked on early versions of Google Translate.

He was motivated to forgo his Ph.D. and return to Google because the company was one of the few places where he could work on machine learning projects that were intellectually stimulating, challenging, and exciting.

Later, he managed the team responsible for the natural language query understanding system that underpins Google Assistant and several other products.

In subsequent years, Uszkoreit led a team in Google Machine Intelligence that worked on large-scale deep learning for natural language understanding. This work also fed into Google Assistant and related products.

Uszkoreit spent over 13 years at Google and co-authored several influential papers on self-attention and transformer models for music generation and image recognition, among many other topics.

When he left the company in July 2021, he became part of the so-called “brain drain” of AI talent resigning from big tech firms to pursue their own ventures.

Inceptive

That same month, Uszkoreit co-founded the biotech company Inceptive with computational biochemist and Stanford associate professor Rhiju Das. 

In essence, Inceptive incorporates AI into RNA biology to develop “biological software”: molecules designed to execute complex functions in a biological system. This software can be used to develop novel and broadly accessible therapeutics, as well as biotechnologies once believed impossible.

On his reasons for starting the company, Uszkoreit explained in an interview with Andreessen Horowitz that “For quite a while now, biology struck me as such a problem where it doesn’t seem inconceivable that there are bounds to how far we can go in terms of, say, drug development and direct design with traditional biology as the backbone…”

Uszkoreit then noted that deep learning at scale, in particular, was well suited to this context: it would let the company observe life at sufficient scale and with enough fidelity, yet with much less hand-crafted specificity, while staying focused on the problem that needs to be solved.

Key takeaways

  • Jakob Uszkoreit is a deep learning researcher and computer scientist with a particular focus on language translation. He is also a co-author of the seminal paper on the transformer architecture, “Attention Is All You Need.”
  • Uszkoreit joined Google in March 2008 and worked on early versions of Google Translate. He was motivated to forgo his Ph.D. and return to Google because the company was one of the few places where one could work on exciting but challenging machine learning projects.
  • Uszkoreit worked at Google for over 13 years before co-founding Inceptive with computational biochemist Rhiju Das. Inceptive develops software that can execute complex functions in a biological system and facilitate the creation of new medicines.

Timeline

  • Jakob Uszkoreit’s Background: Jakob Uszkoreit is a computer scientist and machine learning (ML) researcher with a specialization in natural language processing (NLP) and language translation. He co-authored the influential paper on transformer architecture titled “Attention Is All You Need.”
  • Education and Early Career: Uszkoreit earned his Master’s degree in computer science and mathematics from Technische Universität Berlin in 2007. He interned at Google Research and worked as a software engineer at Acrolinx.
  • Google Brain: In March 2008, Uszkoreit joined Google’s Berlin office, where he contributed to early versions of Google Translate. Over 13 years at Google, he worked on the natural language query understanding system behind Google Assistant and on large-scale deep learning for various NLP applications.
  • Brain Drain and Departure from Google: In July 2021, Uszkoreit became part of the “brain drain” of AI talent leaving big tech firms to pursue their own interests. He left Google to explore new opportunities.
  • Co-Founder of Inceptive: In the same month, Uszkoreit co-founded Inceptive, a biotech company that incorporates AI into RNA biology. The company aims to develop software that can execute complex functions in biological systems, leading to novel therapeutics and biotechnologies.
  • Vision for Inceptive: Uszkoreit’s goal with Inceptive is to use deep learning at scale to address complex biological problems, such as drug development and direct design. The company’s approach focuses on solving critical problems in the field of biology using AI technology.
  • Contribution to AI Research: Throughout his career, Uszkoreit has contributed to several influential papers on self-attention and transformers, advancing the fields of NLP and deep learning.

Read Next: History of OpenAI, Who Owns OpenAI, AI Business Models, AI Economy.

Connected Business Model Analyses

AI Paradigm

Pre-Training

Large Language Models

Large language models (LLMs) are AI tools that can read, summarize, and translate text. This enables them to predict words and craft sentences that reflect how humans write and speak.
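
As a toy illustration of that word-prediction step (a minimal sketch with made-up candidate words and scores, not any production LLM), the core move is to turn the model’s raw scores for candidate next words into probabilities and pick the most likely one:

```python
# Toy sketch of an LLM's core step: converting raw model scores (logits)
# for candidate next words into probabilities, then picking the best one.
# The context, candidate words, and numbers are invented for illustration.
import math

# Hypothetical logits for the context "The cat sat on the".
logits = {"mat": 4.1, "roof": 2.3, "moon": 0.2}

def softmax(scores):
    """Convert raw scores into a probability distribution."""
    z = sum(math.exp(v) for v in scores.values())
    return {word: math.exp(v) / z for word, v in scores.items()}

probs = softmax(logits)
print(max(probs, key=probs.get))  # -> "mat", the most likely next word
```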

Generative Models

Prompt Engineering

Prompt engineering is a natural language processing (NLP) concept that involves discovering inputs that yield desirable or useful results. As in most processes, the quality of the inputs determines the quality of the outputs. Designing effective prompts increases the likelihood that the model will return a response that is both favorable and contextual. OpenAI’s CLIP (Contrastive Language-Image Pre-training) model, trained on over 400 million image-caption pairs, is an example of a model that uses natural-language prompts to classify images.
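
As a minimal sketch of the prompt-template idea behind CLIP-style zero-shot classification (the labels, templates, and the `score` stub below are illustrative assumptions, with the stub standing in for a real image-text model):

```python
# Sketch of zero-shot image classification with prompt templates, in the
# style popularized by CLIP. score() is a stand-in for a real image-text
# similarity model; the labels and templates are illustrative only.

LABELS = ["dog", "cat", "bicycle"]
TEMPLATES = [
    "a photo of a {}",         # canonical CLIP-style template
    "a blurry photo of a {}",  # template variants can improve robustness
]

def score(image, text):
    """Stand-in for a real model's image-text similarity score."""
    return float(len(text) % 7)  # dummy value so the sketch runs end to end

def classify(image):
    # Average each label's similarity across all prompt templates,
    # then return the label with the highest mean score.
    return max(
        LABELS,
        key=lambda label: sum(score(image, t.format(label))
                              for t in TEMPLATES) / len(TEMPLATES),
    )

print(classify(image=None))
```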

OpenAI Organizational Structure

OpenAI is an artificial intelligence research laboratory that transitioned into a for-profit organization in 2019. The corporate structure is organized around two entities: OpenAI, Inc., a single-member Delaware LLC controlled by the OpenAI non-profit, and OpenAI LP, a capped for-profit organization. OpenAI LP is governed by the board of OpenAI, Inc. (the foundation), which acts as its general partner, while the limited partners comprise employees of the LP, some board members, and outside investors such as Reid Hoffman’s charitable foundation, Khosla Ventures, and Microsoft, the leading investor in the LP.

OpenAI Business Model

OpenAI has built the foundational layer of the AI industry. With large generative models like GPT-3 and DALL-E, OpenAI offers API access to businesses that want to develop applications on top of its foundational models, plug those models into their products, and customize them with proprietary data and additional AI features. OpenAI also released ChatGPT, which developed around a freemium model. Microsoft also commercializes OpenAI’s products through its commercial partnership.

OpenAI/Microsoft

OpenAI and Microsoft are partners from a commercial standpoint. The partnership began in 2016 and was consolidated in 2019, when Microsoft invested a billion dollars in OpenAI. It is now taking a leap forward, with Microsoft in talks to put $10 billion into the partnership. Through OpenAI, Microsoft is developing its Azure AI supercomputer while enhancing its Azure enterprise platform and integrating OpenAI’s models into its business and consumer products (GitHub, Office, Bing).

Stability AI Business Model

Stability AI is the entity behind Stable Diffusion. Stability AI makes money from its AI products and from providing AI consulting services to businesses. It monetizes Stable Diffusion via DreamStudio’s APIs, while also releasing the model open-source for anyone to download and use. Stability AI also makes money via enterprise services, where its core development team offers enterprise customers the chance to service, scale, and customize Stable Diffusion or other large generative models to their needs.

Stability AI Ecosystem
