
Enterprise Data Analytics & AI Developer/Architect


Overview

Enterprise-scale data, analytics, and AI leadership role.

Technical Tools
Azure · Power BI · Python · SQL · CI/CD · Data Analysis · Database Design · ETL · OAuth

Tech Partners is partnering with a client to identify an Enterprise Data Analytics and AI Developer who will lead the design and delivery of secure, scalable, and production-ready data and AI solutions across the organization. This role sits at the intersection of data engineering, analytics, AI application development, and enterprise architecture, with a strong emphasis on Microsoft Fabric and Microsoft AI Foundry.

You'll work cross-functionally with enterprise architecture, engineering, governance, and business teams to translate business outcomes into robust technical solutions. This is a hands-on, senior-level role with responsibility for strategy, solution delivery, technical leadership, and operational excellence.

The interview process includes an in-person interview in San Diego.

---

Responsibilities

  • Partner with enterprise architecture teams to define enterprise data and AI roadmaps.
  • Develop reference architectures and design patterns for Microsoft Fabric and AI Foundry.
  • Establish data service standards, semantic modeling conventions, and AI model lifecycle policies.
  • Influence enterprise security, privacy, and compliance for data and AI workloads.

Technical & Project Leadership

  • Lead end-to-end delivery from discovery and design through build, release, and production support.
  • Own technical quality gates, including design reviews, security reviews, and production readiness.
  • Drive non-functional requirements such as performance, scalability, cost optimization, and observability.
  • Coordinate integrations with third-party data sources and enterprise systems.
  • Design and implement Lakehouse and Warehouse architectures in OneLake.
  • Build ETL pipelines using Data Factory, notebooks, shortcuts, and data mirroring.
  • Develop and optimize Power BI semantic models and datasets.
  • Implement real-time and operational analytics using KQL.
  • Harden solutions with RBAC, sensitivity labels, RLS/OLS, OAuth, SAML, and governance tooling.
  • Automate CI/CD for Fabric assets using deployment pipelines.
  • Design, configure, and deploy custom copilots using Microsoft Copilot Studio.
  • Integrate copilots into Teams and SharePoint experiences.
  • Select, evaluate, and manage models using AI Foundry model catalog and control plane.
  • Build and operate single- and multi-agent solutions with Agent Service.
  • Implement RAG solutions using Azure AI Search and vector indices.
  • Configure observability, evaluations, guardrails, and data leakage prevention.
  • Design and train ML models to solve business problems.
  • Establish CI/CD using GitHub for data and AI assets.
  • Implement automated testing, monitoring, logging, and runbooks.
  • Enable cost observability and capacity right-sizing.
  • Support UAT, cutover, incident response, and production operations.

---

Requirements

  • 7+ years of experience in enterprise data engineering, analytics engineering, and/or AI application development.
  • Proven delivery of production-grade solutions in Azure environments.
  • Strong experience working across large, complex enterprise organizations.
  • Microsoft Fabric: OneLake, Data Factory, Lakehouse, Warehouse, KQL/Real-Time Intelligence, Power BI semantic models.
  • Microsoft AI Foundry: Model catalog, Agent Service, evaluations/observability, guardrails, and control plane.
  • Copilot Studio: Building and deploying custom copilots integrated into Teams/SharePoint.
  • RAG implementations using Azure AI Search and vector databases.
  • Strong SQL, Python, and KQL skills.
  • Deep understanding of data modeling (Kimball, EDW, Streaming, Lakehouse).
  • CI/CD, automated testing, IaC, monitoring, and operational readiness.
  • Data governance, privacy, and security (RBAC, sensitivity labels, RLS/OLS, DLP).
  • Strong communication skills with the ability to explain complex technical concepts to non-technical stakeholders.
  • Proven ability to lead, mentor, and influence across teams.

Additional Requirements
  • Must be able to attend an in-person interview in San Diego.
  • Valid driver's license and active personal auto insurance required.
  • Ability to pass a background check and pre-employment drug screening.

Nice to Have

  • Bachelor's degree in Computer Science, Information Systems, or related field.
  • Microsoft certifications (Azure Data Engineer, Azure AI Engineer, Fabric).
  • Experience with Purview, DLP, compliance frameworks, and regulated environments.
  • Experience integrating ERPs and other enterprise systems.

---

What We Offer

$140,000–$170,000 annually, depending on experience, skills, certifications, and location.


Listing Details

First seen
May 5, 2026
Last seen
May 9, 2026
