Who is Alex Smola?

Alex Smola is a prominent machine learning and artificial intelligence researcher. Smola has authored hundreds of articles on these topics and, according to Google Scholar, has been cited more than 160,000 times by his peers.

In business, Smola served as the Director of Machine Learning and Deep Learning at AWS and, more recently, co-founded the scalable AI model company Boson.ai. Before his work at AWS, Smola also held academic positions at Carnegie Mellon University, UC Berkeley, and Yahoo! Research, and was also a visiting scientist at Google Research.

Early career

Smola received his doctoral degree in computer science from the Technical University of Berlin in 1998. After university, he worked at the GMD research institute in software engineering and computer architecture.

Between 2004 and 2008, he worked as a professor at National ICT Australia (NICTA), the country’s information and communications technology research centre. There, he progressed to senior principal researcher and program leader of the centre’s Statistical Machine Learning Group.

Yahoo and Google

Smola joined Yahoo! as a Principal Research Scientist in September 2008 and remained there for just under four years. Some of his focus areas included nonparametric models, user profiling, statistical modeling, content analysis, and distributed optimization.

He then completed two separate stints as a visiting scientist at Google. The second, which concluded in January 2015, saw Smola working on distributed and large-scale data analysis. During this time, he was also a Professor at Carnegie Mellon University, where he worked on machine learning at scale.

Smola also served relatively briefly as CEO of Marianas Labs, a developer of algorithms for data processing, analysis, and machine learning services.

AWS

Smola joined AWS in the aforementioned role in August 2016 and one of his first tasks was to ensure AWS was making contact with developer communities. More specifically, he strived to involve them in the open-source deep learning framework known as MXNet (part of the Apache Incubator program).

Smola was also responsible for growing AWS’s machine learning (ML) internship program and ensured that Amazon more broadly was an active recruiter in key ML areas such as computer vision, natural language processing (NLP), systems, and core algorithms.

In his role, Smola recognized the value of academia and noted that the relationship between it and business was a two-way street: “Academic talent helps AWS excel. At the same time, we want to make sure we share ideas… AWS has lots of infrastructure, bigger machines and, of course, lots of data, allowing interns to do things with AWS that would normally be challenging within their university settings.”

Boson.ai

Smola left AWS in February 2023 to start Boson.ai. Little is known about this venture so far, with an excerpt on his LinkedIn profile stating that “We’re building something big … stay tuned. Talk to me if you want to work on scalable foundation models.”

In the past, Smola has spoken frequently about how to design efficient algorithms at scale, noting that the principles governing MXNet, for example, can be applied to various deep-learning problems. He also taught these ideas as an Adjunct Professor at the University of California, Berkeley.

Key takeaways:

  • Alex Smola is a prominent machine learning and artificial intelligence researcher. Smola has authored hundreds of articles on these topics and, according to Google Scholar, has been cited more than 160,000 times by his peers.
  • Smola joined AWS in August 2016 and was responsible for growing its machine learning (ML) internship program. He was also tasked with ensuring that Amazon was an active recruiter in key ML areas such as computer vision, natural language processing (NLP), systems, and core algorithms.
  • Smola left AWS in February 2023 to start Boson.ai. Little is known about this venture, but one can assume that it is related to his passion and expertise in developing efficient algorithms at scale.

Key Highlights

  • Background and Early Career:
    • Alex Smola is a prominent researcher in the field of machine learning and artificial intelligence.
    • He holds a doctoral degree in computer science from the University of Technology, Berlin, earned in 1998.
    • Smola worked at the GMD Institute in software engineering and computer architecture after completing his degree.
    • He served as a professor at National ICT Australia (NICTA) from 2004 to 2008.
  • Yahoo and Google:
    • Smola joined Yahoo! as a Principal Research Scientist in 2008, focusing on nonparametric models, user profiling, statistical modeling, content analysis, and distributed optimization.
    • He had two stints as a visiting scientist at Google, with his second stint concluding in January 2015.
    • During this time, he also held positions as a Professor at Carnegie Mellon University and worked on machine learning at scale.
  • AWS and Academic Engagement:
    • Smola joined Amazon Web Services (AWS) in August 2016 as the Director of Machine Learning and Deep Learning.
    • He focused on engaging developer communities and involving them in the open-source deep learning framework MXNet.
    • Smola contributed to growing AWS’s machine learning (ML) internship and recruiting efforts in key ML areas.
  • Boson.ai Venture:
    • Smola left AWS in February 2023 to co-found Boson.ai.
    • The nature of Boson.ai’s venture is not fully disclosed, but it likely aligns with his expertise in developing efficient algorithms at scale.
    • His passion for designing algorithms at scale is evident in his previous work, such as MXNet, and in his teaching as an Adjunct Professor at the University of California, Berkeley.


Connected Business Model Analyses

AI Paradigm


Pre-Training


Large Language Models

Large language models (LLMs) are AI tools that can read, summarize, and translate text. This enables them to predict words and craft sentences that reflect how humans write and speak.
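The word-prediction idea described above can be sketched in a deliberately toy form with simple bigram counts. This is only an illustration of next-word prediction, not how production LLMs actually work; the tiny corpus and the `predict_next` helper are invented for the example:

```python
from collections import Counter, defaultdict

# Toy corpus, invented for illustration.
corpus = "the model reads text the model predicts text".split()

# Count which word follows which (bigram statistics).
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict_next(word):
    """Return the word most often observed after `word`, or None."""
    counts = bigrams[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "model" follows "the" twice in the corpus
```

Real LLMs replace these raw counts with learned neural representations over vast corpora, but the task, scoring likely continuations of a text, is the same in spirit.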

Generative Models


Prompt Engineering

Prompt engineering is a natural language processing (NLP) concept that involves discovering inputs that yield desirable or useful results. Like most processes, the quality of the inputs determines the quality of the outputs in prompt engineering. Designing effective prompts increases the likelihood that the model will return a response that is both favorable and contextual. Developed by OpenAI, the CLIP (Contrastive Language-Image Pre-training) model, trained on over 400 million image-caption pairs, is an example of a model that utilizes prompts to classify images.
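The point that input quality drives output quality can be illustrated with a small sketch. `build_prompt` is a hypothetical helper invented for this example, not part of any OpenAI API; it simply shows how adding context and an output format makes a prompt more specific:

```python
def build_prompt(task: str, context: str = "", fmt: str = "") -> str:
    """Assemble a structured prompt from a task, optional context,
    and an optional output-format instruction."""
    parts = [f"Task: {task}"]
    if context:
        parts.append(f"Context: {context}")
    if fmt:
        parts.append(f"Respond as: {fmt}")
    return "\n".join(parts)

# A vague prompt versus an engineered one for the same request.
vague = build_prompt("Summarize the article.")
engineered = build_prompt(
    "Summarize the article.",
    context="Audience: business readers new to AI.",
    fmt="three bullet points",
)
print(engineered)
```

The engineered version tells the model who the output is for and what shape it should take, which is exactly the kind of input refinement prompt engineering is about.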

OpenAI Organizational Structure

OpenAI is an artificial intelligence research laboratory that transitioned into a for-profit organization in 2019. The corporate structure is organized around two entities: OpenAI, Inc., which is a single-member Delaware LLC controlled by the OpenAI non-profit, and OpenAI LP, which is a capped, for-profit organization. OpenAI LP is governed by the board of OpenAI, Inc. (the foundation), which acts as a General Partner. At the same time, Limited Partners comprise employees of the LP, some of the board members, and other investors like Reid Hoffman’s charitable foundation, Khosla Ventures, and Microsoft, the leading investor in the LP.

OpenAI Business Model

OpenAI has built the foundational layer of the AI industry. With large generative models like GPT-3 and DALL-E, OpenAI offers API access to businesses that want to develop applications on top of its foundational models while being able to plug these models into their products and customize them with proprietary data and additional AI features. On the other hand, OpenAI also released ChatGPT, developed around a freemium model. Microsoft also commercializes OpenAI’s products through its commercial partnership.

OpenAI/Microsoft

OpenAI and Microsoft are commercial partners. The partnership started in 2016 and was consolidated in 2019, when Microsoft invested a billion dollars in it. It is now taking a leap forward, with Microsoft in talks to put $10 billion into the partnership. Microsoft, through OpenAI, is developing its Azure AI Supercomputer while enhancing its Azure Enterprise Platform and integrating OpenAI’s models into its business and consumer products (GitHub, Office, Bing).

Stability AI Business Model

Stability AI is the entity behind Stable Diffusion. It makes money from its AI products and from providing AI consulting services to businesses. Stability AI monetizes Stable Diffusion via DreamStudio’s APIs, while also releasing it open-source for anyone to download and use. Stability AI also makes money via enterprise services, where its core development team offers enterprise customers the chance to service, scale, and customize Stable Diffusion or other large generative models to their needs.

Stability AI Ecosystem

