truefoundry

Backend Interns

India · Entry level
Other · Backend


Technical Tools
anthropic, aws, docker, kubernetes, langchain, openai, python, customer-support, machine-learning, rest-apis

About TrueFoundry

Every production AI system, whether it's powering customer support, writing code, analyzing financial data, or diagnosing medical conditions, needs the same foundational infrastructure: a way to route between models, a way to manage tools and integrate them securely, a way to orchestrate agents and enforce governance, and a unified compute layer to run it all.

We're TrueFoundry, and we're building it. We're looking for a Backend Intern to join our Engineering team.

Companies are moving beyond simple chatbots to production agentic systems. These systems route between OpenAI, Anthropic, Google, and self-hosted models. They integrate dozens of tools via protocols like MCP. They orchestrate multi-agent workflows where agents coordinate with other agents.

The infrastructure to support this doesn't exist yet. You can't just duct-tape together a few API calls and call it production-ready.

You need a control plane that handles:

  • Intelligent routing with observability, cost policies, and fallback logic
  • Centralized tool and MCP server management with security and lifecycle controls
  • Agent orchestration with governance and guardrails
  • A unified compute layer to run self-hosted models, custom tools, and agents
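As an illustration of the first point, fallback routing can be sketched in a few lines of Python. This is a hypothetical toy, not TrueFoundry's implementation; the provider names and the `call_provider` function are stand-ins for real model APIs:

```python
# Hypothetical sketch of routing with fallback logic across model providers.
# Provider names and call signatures are illustrative, not a real API.

class ProviderError(Exception):
    pass

def call_provider(name: str, prompt: str, healthy: set) -> str:
    # Stand-in for a real model API call; raises if the provider is "down".
    if name not in healthy:
        raise ProviderError(f"{name} unavailable")
    return f"{name}: response to {prompt!r}"

def route(prompt: str, providers: list, healthy: set) -> str:
    # Try providers in priority order, falling back to the next on failure.
    for name in providers:
        try:
            return call_provider(name, prompt, healthy)
        except ProviderError:
            continue  # fall back to the next provider in the list
    raise ProviderError("all providers failed")

# The first healthy provider in the ordered list serves the request.
answer = route("hello", ["openai", "anthropic", "self-hosted"],
               healthy={"anthropic", "self-hosted"})
```

A production control plane layers cost policies, observability, and retries on top of this basic try-in-order loop.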

We've built two products to solve this:

AI Gateway is the control plane: five composable components (Prompts, LLM Gateway, MCP Gateway, Guardrails, Agent Gateway) that handle routing, orchestration, and governance.

AI Deploy is the compute layer: a Kubernetes-based platform that abstracts ML workloads as standard software primitives, so everything runs on unified infrastructure.

We're Series A, backed by Intel Capital and Sequoia. Companies like CVS, Mastercard, Siemens, Paytm, Synopsys, and Zscaler run production AI workloads on our platform.

**What We're Looking For**
* Contributions to open-source repos are preferred.
* Experience writing concurrent and distributed programs is preferred; familiarity with AWS Lambda, Kubernetes, Docker, or Spark is a plus.
* Experience with one relational and one non-relational database is preferred.
* Prior work in the ML domain will be a big boost.

**What You’ll Do**
* Help realize the product vision: production-ready machine learning models with monitoring in moments, not months.
* Help companies deploy their machine learning models at scale across a wide range of use cases and sectors.
* Build integrations with other platforms so customers can adopt our product without changing their workflow.
* Write maintainable, scalable, performant Python code.
* Build gRPC and REST API servers.
* Work with Thrift, Protobufs, etc.

Education: Graduates from IITs are highly preferred, especially in engineering and technology.

Traits we are looking for: ownership, the ability to execute, hustle and out-of-the-box thinking, data-driven decision making, and comfort with more unknowns than knowns.


What do we do?

TrueFoundry is a cloud-native PaaS for machine learning teams to build, deploy, and ship ML/LLM applications on their own cloud or on-prem infrastructure in a faster, more scalable, and more cost-efficient way, with the right governance controls, allowing them to achieve 90% faster time to value than other teams.

TrueFoundry abstracts away the engineering required and offers GenAI accelerators (LLM PlayGround, LLM Gateway, LLM Deploy, LLM Finetune, RAG Playground, and Application Templates) that enable an organisation to speed up the rollout of its overall GenAI/LLMOps framework. Enterprises can plug and play these accelerators with their internal systems, as well as build on top of them to offer an LLMOps platform of their choice to GenAI developers. TrueFoundry is modular and completely API-driven, with native integrations with popular tools like LangChain, vector DBs, GuardRails, etc.

Location & Eligibility

Where is the job: India (on-site within the country)
Who can apply: India

Listing Details

Posted: February 20, 2026
First seen: May 6, 2026
Last seen: May 8, 2026

Posting Health

Days active: 0
Repost count: 0
Trust level: 14%
Scored at: May 6, 2026

Signal breakdown: freshness, source trust, content trust, employer trust


