Machinify is a leading healthcare intelligence company with expertise across the payment continuum, delivering unmatched value, transparency, and efficiency to health plan clients across the country. Deployed by over 60 health plans, including many of the top 20, and representing more than 160 million lives, Machinify brings together a fully configurable, content-rich, AI-powered platform with best-in-class expertise. We're constantly reimagining what's possible in our industry, creating disruptively simple, powerfully clear ways to maximize financial outcomes and drive down healthcare costs.

Product Manager | Technical Data

About Machinify

Machinify was formed through the combination of five payment integrity industry leaders: The Rawlings Group, Apixio Payment Integrity, VARIS, Machinify's AI platform, and Performant Healthcare. Backed by New Mountain Capital and valued at approximately $5 billion, we serve over 60 health plans, including many of the top 20, representing more than 160 million lives.

We are building a unified AI-powered platform that transforms healthcare payments by combining revolutionary technology, clinical expertise, and rich data assets to deliver unmatched value, transparency, and efficiency across the payment continuum.

Position Summary

We are hiring a Senior Technical Data Product Manager to drive the data product roadmap in close partnership with data engineering and architecture leadership. Reporting to the Sr. Director of Product Management for Platform & Data, you will work alongside the VP of Data Engineering, the CTO, and technical leads to translate business needs into product requirements and coordinate execution across teams.

This role requires someone exceptional at three critical capabilities: deeply understanding a complex current state, envisioning and defining a compelling future state, and executing at extraordinary velocity to bridge the two.
The right candidate will rapidly assess the landscape, build credibility across technical teams, and ship measurable value quickly: we expect clear thinking on product direction within the first 90 days and demonstrable impact on team velocity shortly after.

You will partner with data engineering, data science, and platform engineering leadership to define and deliver data products and capabilities that enable product teams across coordination of benefits, subrogation, audit, pharmacy payment integrity, and complex claims solutions. You will serve as the connective tissue between business requirements and technical execution, coordinating the consolidation of disparate legacy systems into modern, unified infrastructure while enabling new product capabilities.

This is a hands-on technical role requiring prior experience as a data engineer, data scientist, or analytics engineer. You must be comfortable writing SQL, reviewing data architectures, and engaging substantively in technical discussions. You will work across multiple legacy platforms, each with different technologies, cultures, and tribal knowledge, which requires an exceptional ability to influence without direct authority.

Core Responsibilities

Product Strategy and Planning

- Assess current state rapidly: Work with data engineering and architecture teams to understand complex legacy landscapes: what exists, where critical information lives, and how systems actually work
- Contribute to future state vision: Partner with the VP of Data Engineering, CTO, and architecture leads to shape target data architectures, canonical models, and platform capabilities that will scale to support product teams
- Develop the product roadmap: Translate business priorities into data product requirements, working with technical leadership to sequence initiatives and balance migration work, new capabilities, and product enablement
- Support technical evaluations: Contribute product perspective to build-vs.-buy decisions, technology evaluations (lakehouse formats, real-time processing, AI-powered automation), and architectural choices
- Define and track success metrics: Establish product-level OKRs, track adoption across product teams, and communicate progress to stakeholders

Execution and Coordination

- Drive cross-functional delivery: Coordinate data initiatives from requirements through production, working across data engineering, data science, platform engineering, and product teams
- Unblock relentlessly: Identify and resolve dependencies, bottlenecks, and blockers before they slow down team velocity
- Navigate complexity: Find critical information scattered across legacy platforms, undocumented systems, and tribal knowledge; synthesize insights and create clarity
- Facilitate decisions: Build consensus across teams with competing priorities and different technical opinions
- Leverage AI extensively: Use LLMs and AI-powered tools to accelerate analysis, documentation, SQL generation, information synthesis, and decision-making
- Establish lightweight visibility: Create metrics, dashboards, and reporting that provide insight without creating overhead

Technical Collaboration and Product Enablement

- Partner with technical leadership: Work closely with data engineering, data science, and architecture leads, contributing product perspective while respecting their technical expertise and domain ownership
- Translate requirements: Convert product team needs into clear technical requirements that engineering teams can execute against
- Enable product teams: Ensure downstream product teams can successfully consume data platform capabilities through clear interfaces, documentation, and support
- Participate in technical discussions: Engage substantively in reviews of ETL pipelines, data models, distributed architectures, and platform decisions
- Bridge stakeholders: Translate complex technical concepts into business value for executives and product teams; bring business context to technical discussions

What You'll Work On

You will partner with data engineering, data science, and platform teams on initiatives such as:

- Data consolidation and unification across legacy platforms: working with engineering teams to coordinate migrations to unified infrastructure while maintaining production stability and enabling parallel product development
- Canonical data model development: collaborating with data engineering and data science leadership to define product requirements for production-ready models covering medical claims, pharmacy claims, eligibility, and other core healthcare entities
- Platform modernization: contributing product perspective to technical evaluations and roadmaps for lakehouse adoption, OLAP/OLTP separation, real-time processing capabilities, and distributed architecture patterns
- AI-powered automation: partnering with technical teams to evaluate and implement LLM-based approaches that accelerate ETL development, data transformation, and migration workflows
- Data discovery and cataloging: working with engineering to define requirements for capabilities that help teams understand what data exists, where it lives, how to access it, and what it means
- Product team enablement: ensuring downstream product teams can successfully consume data platform capabilities through clear interfaces, comprehensive documentation, and responsive support

These represent current focus areas, but the role will evolve based on business priorities, strategic direction, and the technical roadmap.

Required Qualifications

Experience Requirements

- 10+ years total professional experience
- 5+ years in product management roles
- Prior hands-on experience as a data engineer, data scientist, or analytics engineer (required)
- Proven track record shipping data products or platforms used by internal/external teams
- Experience driving execution in matrixed organizations without direct authority
- Demonstrated ability to assess complex technical landscapes and define future-state architectures

Technical Skills (Must-Have)

- Data architecture expertise: Deep understanding of data modeling, normalization/denormalization, distributed systems, batch/streaming patterns, and ETL/ELT design
- Advanced SQL proficiency: Write complex queries, optimize performance, understand CDC patterns, and validate data quality
- Cloud data infrastructure: AWS preferred (S3, Spark, RDS, DMS, Glue) or equivalent GCP/Azure experience
- Modern data stack fluency: Knowledge of data warehouses, lakehouse formats, orchestration tools (Airflow), transformation frameworks (dbt), and BI platforms
- Analytical rigor: Define metrics, analyze data, make data-driven decisions, and identify patterns across complex systems

Product Management Excellence (Must-Have)

- Strong stakeholder management across technical teams (data engineering, data science, platform) and business audiences
- Ability to translate complex technical architectures into business outcomes, and vice versa
- Experience defining product vision, building roadmaps, and measuring success
- Proven influence without direct authority, building consensus through credibility and data-driven arguments
- Excellent written and verbal communication across all organizational levels
- Agile/Scrum methodology experience

Critical Success Factors

- Current State Mastery: Exceptional ability to rapidly understand complex legacy systems, navigating five different platforms with different data models, ETL patterns, and team cultures to discover how things actually work
- Future State Vision: Can envision target architectures that will scale 10-100x beyond the current state, articulate why they matter, and define pragmatic paths to get there
- Execution Velocity: Move with extraordinary speed from analysis to decision to implementation. Bias toward shipping 80% solutions today over 95% solutions next quarter. Understand that speed is a competitive advantage.
- AI-Augmented Productivity: Active, sophisticated use of AI tools (ChatGPT, Claude, Copilot, etc.) to accelerate analysis, generate SQL, synthesize information, draft documentation, and make decisions faster than traditional approaches allow
- Technical Credibility: Data engineering and data science teams respect you because you can engage substantively in architectural discussions, understand their constraints, and spot issues before they become problems
- Cross-Boundary Navigation: Excel at finding critical information across disparate systems and tribal knowledge; build trust across teams with different cultures and priorities; serve as connective tissue in high-pressure environments
- Systems Thinking: Understand second- and third-order effects of architectural decisions across platform, products, and operations

Preferred Qualifications

- Experience with LLM/AI applications for data transformation, code generation, or workflow automation
- Python proficiency for data analysis, prototyping, or understanding engineering implementations
- Prior data platform migrations or consolidations at significant scale
- Healthcare payment integrity, payer operations, or regulated industry experience
- Hands-on experience with Snowflake, Databricks, Kafka, Fivetran, or similar modern data platforms
- Background in distributed systems, database internals, or data-intensive applications
- Fast-paced startup or high-growth company experience

Why This Role Matters

- Transformational Impact: Your work enables product teams serving 160M+ lives to ship faster and unlocks significant business value across coordination of benefits, subrogation, audit, and payment integrity
- Technical Depth: Engage substantively in cutting-edge architecture decisions (lakehouse formats, distributed systems, real-time processing, LLM-powered automation), not just coordinate meetings
- Unique Challenge: Navigate five legacy platforms with different data models, technologies, and cultures, partnering with technical leadership to shape a unified future state in a once-in-a-career data consolidation opportunity
- Execution Autonomy: We value speed over process. Fast decision-making and shipping results matter more than perfect planning
- AI-First Environment: We use Claude, LLMs, and AI-powered tools extensively throughout the organization. You're expected to leverage AI to move faster than traditional approaches
- Strategic Partnership: Work alongside the VP of Data Engineering, the CTO, and architecture leadership to shape a data platform serving dozens of product teams and supporting a $5B healthcare intelligence platform

Team and Culture

Reporting Structure: Sr. Director of Product Management, Platform & Data

Key Partnerships:

- VP of Data Engineering and CTO (primary technical partners for strategy and architecture)
- Data Engineering leadership (Eric, Andre, Sam)
- Data Science teams
- Platform Engineering teams
- Product teams across COB, Subrogation, Audit, Pharmacy PI, and Complex Claims
- Architecture and Technical Strategy leadership

You will work in close partnership with the VP of Data Engineering and the CTO, bringing the product lens to technical strategy discussions while they own engineering execution and technical architecture.
Your role is to translate business needs into product requirements, coordinate across teams, and ensure data initiatives deliver value to product organizations.

Culture:

- Fast-paced with a bias toward action over analysis paralysis
- Technically deep: we respect engineering expertise and engage substantively
- Lightweight processes that accelerate rather than burden
- Remote-first with occasional travel for strategic planning (~quarterly)
- Sophisticated use of AI tools to augment productivity is expected and encouraged

Technical Environment:

- Current: Postgres (Citus), Apache Spark, AWS (S3, DMS, RDS), Python, SQL
- Evolving: Lakehouse formats (Delta Lake/Iceberg), streaming capabilities, unified data architecture, LLM-powered automation
- Scale: Hundreds of millions of records, petabyte-range data, multi-tenant architecture, dozens of data sources

Equal Employment Opportunity at Machinify

Machinify is committed to hiring talented and qualified individuals with diverse backgrounds for all of its positions. Machinify believes that the gathering and celebration of unique backgrounds, qualities, and cultures enriches the workplace. See our Candidate Privacy Notice at: https://www.machinify.com/candidate-privacy-notice/