Cary sits inside the Research Triangle but operates on a meaningfully different rhythm than Raleigh, Durham, or Chapel Hill. The town carries one of the highest concentrations of data scientists per capita in the country, anchored by the SAS Institute world headquarters on SAS Campus Drive — the company that arguably invented the analytics industry — and the Epic Games campus that quietly hosts hundreds of ML engineers building Unreal Engine and game telemetry infrastructure. The buyer mix here also includes MetLife's regional headquarters on Westin Drive, John Deere's Intelligent Solutions Group at the new Cary office tied to Hexagon Mining acquisitions, the marketing AI work at Toyota Financial Services, and a tight cluster of biopharma data engineering teams supporting the Research Triangle Park employers ten minutes north. Wake Tech Community College's data analytics program and the spillover from NC State's Institute for Advanced Analytics in Raleigh supply the regional ML talent pipeline. ML engagements in Cary typically center on SAS-platform modernization projects for buyers transitioning off legacy SAS to Python and modern cloud platforms, game telemetry and player churn forecasting tied to Epic Games and the broader gaming employer base, insurance risk modeling at MetLife, and Research Triangle Park spillover work in biopharma and healthcare. LocalAISource matches Cary operators with ML practitioners who can ship production models on SageMaker, Azure ML, Vertex AI, or Databricks, and who understand the SAS-to-modern-stack migration patterns that define so much of this metro's predictive analytics work.
Updated May 2026
The single most distinctive Cary predictive analytics demand stream is the SAS-to-modern-stack migration. SAS Institute's headquarters has been the dominant analytics employer in the metro since 1976, and the company has trained generations of statisticians who deployed SAS-based forecasting and risk models at thousands of enterprises nationally. Those buyers are now migrating to Python, modern MLOps, and cloud-native infrastructure, and Cary-based practitioners with deep SAS fluency are uniquely positioned to lead the migration work. The engagements typically combine reverse-engineering legacy SAS code, re-implementing forecasting and credit risk models on modern stacks like Databricks or Azure ML, and validating that the new implementation matches the legacy outputs within acceptable tolerances. Practitioners shipping in this segment need genuine bilingual fluency — not just Python expertise but real SAS Macro, SAS/STAT, and SAS Enterprise Miner depth. The platform mix runs heterogeneous because the migration target depends on the buyer's broader cloud strategy. Engagement totals for a typical SAS modernization project run $100,000 to $300,000 and span sixteen to twenty-four weeks. SAS Institute itself does not typically buy these services, but many of the senior local consulting practices were built by former SAS employees and consultants. References from an actual completed migration matter more than generic ML credentials.
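The parity-validation step described above is usually the gating deliverable on a migration: the buyer signs off only when the re-implemented model reproduces the legacy SAS scores within agreed tolerances. A minimal sketch of what that check can look like, with the function name, tolerance defaults, and the shape of the report all illustrative assumptions rather than any standard interface:

```python
import numpy as np

def parity_report(legacy_scores, new_scores, abs_tol=1e-6, rel_tol=1e-4):
    """Compare legacy SAS model outputs to the re-implemented model's
    outputs and summarize any divergence beyond tolerance.

    A row passes if it is within EITHER the absolute or the relative
    tolerance -- a common way to handle both near-zero and large scores.
    """
    legacy = np.asarray(legacy_scores, dtype=float)
    new = np.asarray(new_scores, dtype=float)
    abs_diff = np.abs(new - legacy)
    # Guard the relative difference against division by zero on zero scores.
    rel_diff = abs_diff / np.maximum(np.abs(legacy), 1e-12)
    within = (abs_diff <= abs_tol) | (rel_diff <= rel_tol)
    return {
        "n": int(legacy.size),
        "n_out_of_tolerance": int((~within).sum()),
        "max_abs_diff": float(abs_diff.max()),
        "max_rel_diff": float(rel_diff.max()),
        "passed": bool(within.all()),
    }
```

In practice the two score vectors would come from scoring the same holdout extract through the legacy SAS job and the new pipeline, and the tolerances would be negotiated with the buyer's model-risk team rather than defaulted.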
Epic Games' campus on Crossroads Boulevard runs a serious in-house ML and analytics operation tied to Fortnite, Unreal Engine telemetry, and the broader live-service game infrastructure. While Epic's internal teams handle the bulk of the work, the surrounding ecosystem — game studios that license Unreal Engine, the marketing analytics partners that support Epic's player acquisition, and the live-service game studios that have located in the Triangle to be near Epic's talent pool — drives meaningful outside ML demand. The work centers on player churn modeling at multiple time horizons, lifetime value prediction, demand forecasting for event-driven game features, and increasingly LLM-augmented community moderation and content recommendation. Practitioners shipping in this segment need fluency in event-stream feature engineering at scale, the specific telemetry patterns that game data produces, and the privacy frameworks that govern player data — particularly when models touch younger players. The platform stack leans Databricks and Vertex AI for the larger studios, with smaller indie operations on lighter SageMaker or self-hosted setups. Engagement totals run $60,000 to $180,000 over ten to sixteen weeks. Practitioners with prior tours at Epic, Riot, or Activision Blizzard bring an operational fluency that generic SaaS ML consultants rarely match.
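The event-stream feature engineering mentioned above usually starts with collapsing raw telemetry into per-player recency and frequency signals before any churn model is trained. A minimal stdlib sketch under assumed inputs — the tuple layout, event names, and feature names here are hypothetical stand-ins for whatever a studio's telemetry pipeline actually emits:

```python
from collections import defaultdict
from datetime import datetime, timedelta

def churn_features(events, as_of, window_days=7):
    """Aggregate raw telemetry events into simple per-player churn features.

    events: iterable of (player_id, timestamp, event_type) tuples -- an
    assumed, simplified stand-in for a real event stream.
    Returns {player_id: {"sessions_7d": ..., "days_since_last": ...}}.
    """
    window_start = as_of - timedelta(days=window_days)
    state = defaultdict(lambda: {"sessions_7d": 0, "last_seen": None})
    for player_id, ts, event_type in events:
        s = state[player_id]
        # Frequency: session starts inside the trailing window.
        if event_type == "session_start" and ts >= window_start:
            s["sessions_7d"] += 1
        # Recency: most recent event of any type.
        if s["last_seen"] is None or ts > s["last_seen"]:
            s["last_seen"] = ts
    return {
        pid: {
            "sessions_7d": s["sessions_7d"],
            "days_since_last": (as_of - s["last_seen"]).days,
        }
        for pid, s in state.items()
    }
```

A production version would run this as a streaming or batch aggregation on Databricks or similar, at multiple window lengths, but the recency/frequency shape of the features is the same.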
MetLife's Cary regional operations on Westin Drive generate a steady stream of insurance ML demand around actuarial modernization, claims forecasting, customer churn, and increasingly fraud detection on the disability and group benefits books. The work is methodologically conservative — actuarial models still need to satisfy state insurance regulator requirements, which constrains how aggressively practitioners can push deep learning approaches into core pricing and reserving — and the documentation overhead is substantial. Practitioners shipping into MetLife need fluency in actuarial software, model validation that aligns with the National Association of Insurance Commissioners model risk frameworks, and the specific reporting requirements that insurance ML demands. Beyond MetLife, the Research Triangle Park spillover into Cary brings biopharma data engineering work tied to GSK, Eli Lilly, Biogen, and Merck operations ten minutes north on I-40. The biopharma work tends to be heavy on clinical trial forecasting, manufacturing yield prediction, and supply chain optimization for cold-chain logistics. Engagement totals across these segments run $80,000 to $220,000 over twelve to twenty weeks. The platform mix leans Azure ML for MetLife and SageMaker for the biopharma side, partly because of NIH-grant precedent.
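For the claims and clinical trial forecasting work described above, engagements often start from a transparent baseline that the buyer's validation team can fully document before anything more complex is layered on. One common choice is simple exponential smoothing; a minimal sketch, where the function name and defaults are illustrative rather than any standard library interface:

```python
def ses_forecast(series, alpha=0.3, horizon=3):
    """Simple exponential smoothing over a history of counts (e.g. monthly
    claims) -- a documentable baseline, not a production model.

    alpha weights recent observations; SES projects a flat smoothed level
    forward for every step of the horizon.
    """
    if not series:
        raise ValueError("series must be non-empty")
    level = float(series[0])
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return [level] * horizon
```

Its appeal in a regulated setting is that every number in the forecast can be reproduced by hand from the recursion, which makes the model-risk documentation straightforward; seasonality, exposure adjustments, and covariates get added only once the baseline is signed off.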