Who is Llion Jones?

Llion Jones is a Welsh-born software engineer and artificial intelligence researcher who lives in Tokyo. Jones was one of the co-authors of the seminal paper Attention Is All You Need, which proposed a simple new network architecture known as the transformer.

Education and early career

Jones showed proficiency in mathematics, computing, chemistry, and physics from an early age and later went on to study at the University of Birmingham. 

He earned a Master’s in Advanced Computer Science in 2009 but found it difficult to find work, spending around six months attending unsuccessful job interviews.

At some point, Jones sent his resume to Google for a software engineer position the company had advertised in London. He successfully navigated two phone interviews but ultimately declined a job offer as he had just started in another role.

This role was at Delcam, a Birmingham-based CAD/CAM software provider for the manufacturing industry.

YouTube

Jones’s first encounter with Google could have been his only one, but 18 months later another recruiter from the company reached out and asked if he wanted to reapply.

Jones joined YouTube as a software engineer in February 2012 and worked in this capacity until the middle of 2015. Shortly before he departed from YouTube, he took a machine learning course on Coursera in preparation for his next move.

Google Research

Jones then transferred to Google Research in 2015, where he remains to this day as a senior software engineer. He worked under the famous author, entrepreneur, and inventor Ray Kurzweil, who had joined Google as director of engineering three years earlier.

In close collaboration with Kurzweil, Jones has studied ways computers can process, interpret, and understand human language in everyday applications.

His passion and enthusiasm for his work are clearly evident. In 2018, Jones told the North Wales Chronicle that “artificial intelligence is just fascinating, from a scientific, philosophical and ethical point of view. I’m at the forefront of an effort to reverse engineer literally the most complex thing in the known universe: the human mind.”

Academic contributions

In addition to his work on the transformer architecture, Jones has been prolific in his contributions to other areas of AI and machine learning.

For example, in 2022, he co-authored the paper Helpful Neighbors: Leveraging Neighbors in Geographic Feature Pronunciation. The paper proposed a novel architecture that could guess the pronunciation of a location based on the pronunciations of nearby locations, a feature seen as especially useful for Japanese place names in Google Maps.

In a paper published three years earlier that has now been cited almost 1300 times, Jones and others argued that progress on open-domain question answering (QA) had been hindered by a lack of appropriate training data. 

The paper, titled Natural Questions: A Benchmark for Question Answering Research, presented the first publicly available dataset to pair high-quality answer annotations in documents with real user queries. The team also introduced robust new metrics for evaluating question-answering systems, along with high human upper bounds on those metrics.

Key takeaways:

  • Llion Jones is a Welsh-born software engineer and artificial intelligence researcher who lives in Tokyo. Jones was one of the co-authors of the seminal paper Attention Is All You Need, which proposed a simple new network architecture known as the transformer.
  • Jones joined YouTube as a software engineer in February 2012 and worked in this capacity until the middle of 2015. He was encouraged to apply for the role by a Google representative based on an earlier interaction.
  • Jones then transferred to Google Research in 2015 where he remains to this day as a senior software engineer. There, he has worked closely with author, entrepreneur, and inventor Ray Kurzweil and has co-authored several influential papers.

Timeline

  • Llion Jones’s Background: Llion Jones is a Welsh-born software engineer and AI researcher based in Tokyo. He showed proficiency in various subjects from an early age, including mathematics, computing, chemistry, and physics.
  • Education and Early Career: Jones studied at the University of Birmingham and earned a Master’s in Advanced Computer Science in 2009. After facing initial challenges finding work, he joined Delcam, a CAD/CAM software provider.
  • Google and YouTube: Jones first encountered Google through a recruiter, passing two phone interviews before declining a job offer. He later joined YouTube as a software engineer in 2012 and took a machine learning course on Coursera to prepare for future opportunities.
  • Google Research and Collaboration with Ray Kurzweil: In 2015, Jones transferred to Google Research, where he currently works as a senior software engineer. He collaborates closely with Ray Kurzweil, a renowned author and inventor.
  • Contributions to AI and Machine Learning: Jones co-authored the seminal paper “Attention Is All You Need,” introducing the transformer architecture. He has also made significant contributions to open-domain question answering research and other areas of AI.
  • Helpful Neighbors Paper: In 2022, Jones co-authored a paper proposing a novel architecture for guessing the pronunciation of a location based on nearby locations, useful for Japanese place names in Google Maps.
  • Natural Questions Paper: In a highly cited paper from 2019, Jones and his team presented the first publicly available data set for open-domain question answering research, along with robust metrics to evaluate question-answering systems.
  • Passion for AI and the Human Mind: Jones has expressed his passion for AI and its scientific, philosophical, and ethical implications. He sees AI as an effort to reverse engineer the complexities of the human mind.
  • Ongoing Research and Career: Jones continues his research and work at Google Research, focusing on language processing and AI applications. His contributions have had a significant impact on the field of AI and machine learning.


Connected Business Model Analyses


Large Language Models

Large language models (LLMs) are AI tools that can read, summarize, and translate text. This enables them to predict words and craft sentences that reflect how humans write and speak.


Prompt Engineering

Prompt engineering is a natural language processing (NLP) concept that involves discovering inputs that yield desirable or useful results. As with most processes, the quality of the inputs determines the quality of the outputs. Designing effective prompts increases the likelihood that the model will return a response that is both favorable and contextual. Developed by OpenAI, the CLIP (Contrastive Language-Image Pre-training) model, trained on over 400 million image-caption pairs, is an example of a model that uses text prompts to classify images.

OpenAI Organizational Structure

OpenAI is an artificial intelligence research laboratory that transitioned into a for-profit organization in 2019. The corporate structure is organized around two entities: OpenAI, Inc., which is a single-member Delaware LLC controlled by the OpenAI non-profit, and OpenAI LP, which is a capped for-profit organization. OpenAI LP is governed by the board of OpenAI, Inc. (the foundation), which acts as a General Partner. The Limited Partners comprise employees of the LP, some of the board members, and other investors like Reid Hoffman’s charitable foundation, Khosla Ventures, and Microsoft, the leading investor in the LP.

OpenAI Business Model

OpenAI has built the foundational layer of the AI industry. With large generative models like GPT-3 and DALL-E, OpenAI offers API access to businesses that want to develop applications on top of its foundational models, plug these models into their products, and customize them with proprietary data and additional AI features. OpenAI also released ChatGPT, developed around a freemium model. Microsoft also commercializes OpenAI’s products through its commercial partnership.

OpenAI/Microsoft

OpenAI and Microsoft are commercial partners. The partnership began in 2016 and was consolidated in 2019, when Microsoft invested a billion dollars in it. It is now taking a leap forward, with Microsoft in talks to put $10 billion into the partnership. Through OpenAI, Microsoft is developing its Azure AI supercomputer while enhancing its Azure enterprise platform and integrating OpenAI’s models into its business and consumer products (GitHub, Office, Bing).

Stability AI Business Model

Stability AI is the entity behind Stable Diffusion. Stability makes money from its AI products and from providing AI consulting services to businesses. Stability AI monetizes Stable Diffusion via DreamStudio’s APIs, while also releasing it open source for anyone to download and use. Stability AI also makes money via enterprise services, where its core development team offers enterprise customers the chance to service, scale, and customize Stable Diffusion or other large generative models to their needs.
