
Introducing Vellum: The Developer Platform for Building Great AI Applications

The emergence of large language models (LLMs) has revolutionized the field of artificial intelligence. However, building production-worthy LLM applications can be a daunting task for developers. That’s where Vellum comes in. Founded in 2023 by two MIT-trained engineers and a former McKinsey consultant, Vellum is a developer platform that helps companies adopt AI by taking their prototypes to production.

In this article, we’ll take a closer look at Vellum and its various features, as well as hear from the founders about how the company got started.

Who are the Founders of Vellum and How Did the Company Get Started?

Vellum was founded by Sidd Seethepalli, Noa Flaherty, and Akash Sharma. The three worked together at Dover (YC S19) for over two years, where they built production use cases of LLMs. Sidd and Noa are MIT engineers who previously worked on DataRobot’s MLOps team and Quora’s ML Platform team, respectively. Akash spent five years at McKinsey’s Silicon Valley office.

While working with GPT-3 and Cohere to build user-facing LLM apps, they found themselves building complex internal tooling to compare models, fine-tune them, measure performance, and improve quality over time. This took time away from building their user-facing product. They realized they needed MLOps for LLMs and decided to build it themselves.

In early 2023, the founders recognized the growing demand for better generative AI prompting and saw the market’s potential to rapidly scale. With the release of tools like ChatGPT, they realized the shift towards natural language inputs was making AI technology more accessible to a broader range of professionals. This shift in market demand laid the foundation for Vellum’s inception.

The Vellum team building AI solutions for developers.

What is Vellum Playground and How Can It Help Developers?

Vellum Playground is a set of tools that helps prompt engineers compare prompts, models, and even LLM providers side by side. The platform lets users curate a library of test cases to evaluate prompts against and quantitatively score prompt outputs using industry-standard ML metrics such as BLEU, METEOR, Levenshtein distance, and semantic similarity. By using Vellum Playground, developers can save time and resources by testing and fine-tuning their prompts and models more efficiently.
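To make that evaluation step concrete, here is a minimal, self-contained sketch of the kind of side-by-side scoring Playground automates. This is not the Vellum API: the prompt variants, outputs, and reference answer are made up, and the crude token-overlap score stands in for a real semantic-similarity metric.

```python
# A minimal sketch of side-by-side prompt evaluation. NOT the Vellum API;
# the test case and prompt outputs below are hypothetical.

def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,         # deletion
                            curr[j - 1] + 1,     # insertion
                            prev[j - 1] + cost)) # substitution
        prev = curr
    return prev[-1]

def token_f1(candidate: str, reference: str) -> float:
    """Crude token-overlap F1 as a stand-in for semantic similarity."""
    cand, ref = set(candidate.lower().split()), set(reference.lower().split())
    overlap = len(cand & ref)
    if not overlap:
        return 0.0
    precision, recall = overlap / len(cand), overlap / len(ref)
    return 2 * precision * recall / (precision + recall)

# Hypothetical test case: one reference answer, two competing prompt variants.
reference = "Refunds are processed within 5 business days."
outputs = {
    "prompt_v1": "Refunds are processed within 5 business days.",
    "prompt_v2": "You should get your money back in about a week.",
}

for name, output in outputs.items():
    print(name,
          "edit_distance =", levenshtein(output, reference),
          "token_f1 =", round(token_f1(output, reference), 2))
```

Running this over a whole library of test cases, rather than a single reference, is what turns gut-feel prompt tweaking into a measurable comparison.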

Vellum’s interface for testing and comparing AI models.

What is Vellum Manage and How Can It Help Developers?

Vellum Manage is a simple API interface that proxies requests to any model provider. It also offers back-testing and version control, as well as observability of all inputs and outputs. Developers can use the UI and API to submit explicit or implicit user feedback. With Vellum Manage, developers can confidently iterate on models in production, ensuring that their applications are always improving.
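The pattern Vellum Manage describes, a thin proxy that records every input, output, and piece of feedback against a prompt version, can be sketched in a few lines. The snippet below is a hypothetical illustration, not the Vellum SDK; `call_provider` and the in-memory `REQUEST_LOG` are placeholders for whichever provider and storage you actually use.

```python
import time
import uuid
from typing import Callable

# Hypothetical sketch of a proxy with observability and feedback capture.
# Not the Vellum SDK; `call_provider` stands in for any model provider.

REQUEST_LOG: dict[str, dict] = {}

def proxy_completion(call_provider: Callable[[str], str],
                     prompt_version: str,
                     user_input: str) -> tuple[str, str]:
    """Forward a request to a provider and record full observability data."""
    request_id = str(uuid.uuid4())
    started = time.time()
    output = call_provider(user_input)
    REQUEST_LOG[request_id] = {
        "prompt_version": prompt_version,
        "input": user_input,
        "output": output,
        "latency_s": round(time.time() - started, 3),
        "feedback": None,
    }
    return request_id, output

def submit_feedback(request_id: str, helpful: bool) -> None:
    """Attach explicit user feedback to a logged request."""
    REQUEST_LOG[request_id]["feedback"] = {"helpful": helpful}

# Usage with a dummy provider so the sketch runs without API keys.
fake_provider = lambda text: f"(model answer to: {text})"
rid, answer = proxy_completion(fake_provider, "support-bot-v3", "Where is my order?")
submit_feedback(rid, helpful=True)
print(answer, REQUEST_LOG[rid]["feedback"])
```

Because every logged request carries a prompt version, back-testing a new prompt against old traffic becomes a matter of replaying recorded inputs and comparing outputs.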

What is Vellum Search and How Can It Help Developers?

Vellum Search is a powerful tool that allows developers to use their proprietary data in LLM applications. The platform offers a robust API endpoint for submitting a corpus of documents, configurable chunking and semantic search strategies, and the ability to query that corpus at runtime. By using Vellum Search, developers can leverage their own data to create more powerful and accurate AI applications.
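Conceptually, the retrieval workflow looks like the sketch below: split the corpus into overlapping chunks, embed each chunk, and rank chunks against the query at runtime. The bag-of-words "embedding" and the sample corpus are deliberately simplistic placeholders, not how Vellum Search is implemented.

```python
import math
from collections import Counter

# Toy chunk-then-search sketch. The term-frequency "embedding" stands in
# for a real embedding model; the corpus and query are hypothetical.

def chunk(text: str, size: int = 40, overlap: int = 10) -> list[str]:
    """Split text into word windows of `size` words sharing `overlap` words."""
    words = text.split()
    step = size - overlap
    return [" ".join(words[i:i + size])
            for i in range(0, max(len(words) - overlap, 1), step)]

def embed(text: str) -> Counter:
    """Toy embedding: lowercase term frequencies."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

corpus = ("Our refund policy allows returns within 30 days of purchase. "
          "Shipping is free on orders over 50 dollars. "
          "Support is available by chat from 9am to 5pm on weekdays.")

chunks = chunk(corpus, size=12, overlap=4)
index = [(c, embed(c)) for c in chunks]

query = "what is the refund policy for returns?"
ranked = sorted(index, key=lambda item: cosine(embed(query), item[1]), reverse=True)
print(ranked[0][0])  # best-matching chunk, passed to the LLM as context at runtime
```

The chunk size and overlap are exactly the kind of knobs a "configurable chunking strategy" exposes: too-small chunks lose context, too-large chunks dilute the match.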

What is Vellum Optimize and How Can It Help Developers?

Vellum Optimize is a tool that allows developers to continuously fine-tune their models to improve quality and lower costs. The platform passively accumulates training data for fine-tuning proprietary models, and it lets users swap model providers or parameters under the hood without any code changes. With Vellum Optimize, developers can ensure that their models are always up to date and performing at their best.
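The two ideas, passively harvesting well-rated completions as training data and swapping providers through configuration rather than code, can be illustrated with a short hypothetical sketch. None of the names below come from Vellum's actual API.

```python
import json

# Hypothetical sketch: (1) export positively-rated request logs as a JSONL
# fine-tuning dataset, (2) route requests by config so the provider can be
# swapped with no code change. Log shape and provider names are made up.

logged_requests = [
    {"input": "Where is my order?", "output": "It ships tomorrow.", "feedback": {"helpful": True}},
    {"input": "Cancel my plan.", "output": "I cannot do that.", "feedback": {"helpful": False}},
]

def export_finetuning_data(log: list[dict], path: str) -> int:
    """Write positively-rated input/output pairs to a JSONL training file."""
    kept = 0
    with open(path, "w") as f:
        for row in log:
            if row.get("feedback", {}).get("helpful"):
                f.write(json.dumps({"prompt": row["input"], "completion": row["output"]}) + "\n")
                kept += 1
    return kept

# Provider swap driven purely by config: editing MODEL_CONFIG needs no code change.
MODEL_CONFIG = {"provider": "provider_a", "temperature": 0.2}

PROVIDERS = {
    "provider_a": lambda text, cfg: f"[provider_a t={cfg['temperature']}] {text}",
    "provider_b": lambda text, cfg: f"[provider_b t={cfg['temperature']}] {text}",
}

def complete(text: str) -> str:
    return PROVIDERS[MODEL_CONFIG["provider"]](text, MODEL_CONFIG)

print(export_finetuning_data(logged_requests, "training.jsonl"), "examples exported")
print(complete("Summarize this ticket."))
```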

AI model visualized in action, showing LLM complexity.

How Has Vellum Grown Since Its Launch?

Since its founding, Vellum has rapidly gained traction, securing 40 paying customers within its first few months of operation. Revenue is growing 25% to 30% month over month, driven by the increasing need for AI-powered solutions across multiple industries. Vellum raised $5 million in seed funding, with participation from investors including Rebel Fund, Eastlink Capital, Pioneer Fund, Y Combinator, and angel investors. This funding is allowing Vellum to expand its capabilities and support a growing number of companies integrating generative AI into their workflows.

How is Vellum Shaping the Future of Prompt Engineering?

Vellum’s founders emphasize that prompt engineering is evolving into a crucial skill in the development of AI applications. The company's tooling enables developers to refine and version their prompts for optimal performance, addressing the challenges companies face when taking prototypes to production. With generative AI continuing to grow in importance across various sectors, Vellum's platform is uniquely positioned to provide the necessary infrastructure for prompt engineering, making it possible for developers to create more accurate and reliable models.

The market for LLM-powered applications is rapidly expanding, and Vellum is at the forefront of providing the tools necessary for companies to keep pace with this growth. The company’s focus on improving the quality, consistency, and scalability of generative AI models makes it a vital player in the prompt engineering space.

Who Trusts Vellum?

Vellum's developer platform has already earned the trust of several companies working on the bleeding edge of AI. These include:

Yuma.ai: Yuma.ai is a company that builds AI solutions for the insurance industry. Their AI-powered platform helps insurers improve their underwriting process by reducing the need for manual review and providing more accurate risk assessments. By using Vellum's developer platform, Yuma.ai was able to build a more efficient and effective solution for their customers.

Pangea: Pangea is a fintech startup that provides access to financial services for underbanked populations in emerging markets. Their platform uses AI to make lending decisions and provide customers with personalized financial advice. With Vellum's platform, Pangea was able to build and deploy their AI models faster and with greater confidence.

Truewind: Truewind is a startup that uses AI to help companies reduce their carbon footprint. Their platform uses machine learning to analyze a company's energy usage and identify areas where they can reduce waste. With Vellum's developer platform, Truewind was able to build a more accurate and reliable solution for their customers.

Alphawatch: Alphawatch is a startup that provides AI-powered monitoring and diagnostics for industrial equipment. Their platform uses machine learning to predict equipment failures and optimize maintenance schedules. By using Vellum's platform, Alphawatch was able to build a more robust and scalable solution for their customers.

Why Choose Vellum?

If you're looking to build AI applications, Vellum's developer platform is the best choice for several reasons:

Easy to Use: Vellum's platform is designed to be user-friendly, with intuitive interfaces and easy-to-understand workflows. This makes it easy for developers to get started building AI applications without a steep learning curve.

Robust: Vellum's platform is built to handle large amounts of data and complex workflows. This means that you can build AI applications that can handle real-world use cases and scale as your business grows.

Flexible: Vellum's platform is designed to work with a variety of different AI models and providers. This means that you can choose the best model for your use case, whether it's a pre-trained model or one you've built yourself.

Transparent: Vellum's platform provides full observability into your AI models, so you can see exactly what's happening under the hood. This makes it easier to diagnose problems and optimize performance.

Cost-Effective: Vellum's platform is designed to help you build AI applications faster and with greater confidence. This means that you can save time and money on development and get your applications to market faster.

Conclusion

Vellum's developer platform is the best choice for companies looking to build AI applications. With its user-friendly interface, robust architecture, flexible model support, transparent observability, and cost-effectiveness, Vellum is the perfect tool for taking your AI prototypes to production. Whether you're building AI applications for finance, insurance, energy, or any other industry, Vellum's platform can help you build better, more reliable solutions faster. So if you're ready to take your AI applications to the next level, contact Vellum today and start building!