# Requirements

- 8+ years building data-heavy systems with **schema design, identity resolution, and event pipelines** (idempotent, incremental, testable).
- Proven **technical lead** impact: multi-quarter technical leadership, cross-team influence, dependable delivery, and risk management.
- Strong Python and SQL; comfort across cloud data platforms and containers; experience designing APIs and data contracts used by product teams.
- Data-quality and observability mindset (validation, lineage, run health) in production pipelines.
- Experience with **cloud data lakes** (Delta Lake or Apache Iceberg) on **Parquet**; BigQuery/Cloud SQL; Redis; Pub/Sub.
- Strong in Python, Rust, or TypeScript in production services; interest in shaping evaluation for enrichment/agentic steps alongside Context Mining.

Apply: https://jobs.ashbyhq.com/parable/0075c2fe-a405-4c50-b758-218c9a621cf5 (Staff Engineer & Technical Lead – Ontology; listed 2025-09-10; $200K–$350K base salary with potential bonuses)

**Role summary:** The Staff Engineer & Technical Lead – Ontology will own the ontology: defining entities, relationships, and versioned data contracts while balancing abstraction and specificity for cross-client reusability. Responsibilities include hydrating and linking data through idempotent, incremental pipelines that unify identities into stable Parable IDs and persist them in lakehouse tables; establishing gold sources and standards; productizing research by turning client-specific solutions into reusable ontology pipelines; and co-designing stable APIs and query surfaces that power client UX. The role also involves partnering upstream to specify connector mappings and entity keys, and raising the bar for testability, observability, lineage, and validation so that ontology jobs are dependable and easy to operate per tenant.
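The summary above leans on unifying identities into stable Parable IDs through idempotent, incremental pipelines. As a minimal illustration of what "stable" and "idempotent" can mean in that context (not Parable's actual implementation), the sketch below derives a deterministic ID from normalized identity keys, so re-running a pipeline over the same records cannot mint new IDs. The names here (`parable_id`, `PARABLE_NS`, the key fields) are hypothetical.

```python
import uuid
from dataclasses import dataclass

# Hypothetical fixed namespace for deterministic IDs; any constant UUID works.
PARABLE_NS = uuid.UUID("00000000-0000-0000-0000-00000000beef")

@dataclass(frozen=True)
class IdentityRecord:
    source: str               # e.g. "okta", "workday"
    email: str | None
    employee_id: str | None

def normalize(rec: IdentityRecord) -> str:
    """Build a canonical key; the normalization rules are where the real design work lives."""
    email = (rec.email or "").strip().lower()
    emp = (rec.employee_id or "").strip()
    return f"email={email}|employee_id={emp}"

def parable_id(rec: IdentityRecord) -> str:
    """Deterministic, stable ID: the same inputs always yield the same ID, so re-runs are idempotent."""
    return str(uuid.uuid5(PARABLE_NS, normalize(rec)))

# Re-processing the same person from two sources with matching keys collides on purpose.
a = IdentityRecord("okta", "Jane.Doe@acme.com ", "E123")
b = IdentityRecord("workday", "jane.doe@acme.com", "E123")
assert parable_id(a) == parable_id(b)
```

In practice identity resolution is rarely this clean (conflicting keys, late-arriving records), which is presumably why the posting stresses incremental, testable pipelines with lineage and validation.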
# Forward-Deployed Engineer

Salary: $200K - $325K
Location Type: Remote
Employment Type: Full-time

We are hiring for our very first **Forward-Deployed Engineer** at Parable.

As the technical face of Parable for enterprise integrations, you will have a major influence on the development of our product. You will guide client admins through secure setup—API key/secret generation, permissions scoping, automated audit-log exports, and connector configuration across Workday, Oracle Fusion, Microsoft 365, Okta, Salesforce, Netsuite, and more—so that clean data lands reliably in their Parable tenant. When not with clients, you will build and harden connectors/taps, improve scaffolding/CLI and docs, and help make integrations self-serve. You will partner closely with the Raw Data TPM and contribute to both internal and external UX where our CX team and our clients manage tokens and view connector health.

# About Parable

Parable's mission is to **make time matter**.

We provide CEOs of companies with 1,000+ employees with deep observability of the time spent by their teams across all strategies, projects, and processes. Our insights help teams focus on the work that matters most and drive data-driven decisions in resource allocation.

The company was founded by seasoned founders with multiple 9-figure exits under their belts, and reached $2M of ARR within 6 months of going to market. Parable raised $17 million from investors like [HOF Capital](http://hofcapital.com/) and [Story Ventures](https://www.storyventures.vc/), as well as 50+ founders and executives.

**On the technical side,** we are building an operating system for large enterprises: ingesting siloed data from across the workplace stack, shaping it into a strongly opinionated enterprise ontology, and contextualizing it to extract insights for clients. Each customer runs in a fully isolated, single-tenant GCP environment (own VPC, compute, storage, and KMS), with shared, parameterized pipelines instantiated per tenant—no bespoke schema per client.

Our platform stack includes Cloud Run Jobs/Compute Engine, Pub/Sub, Cloud Storage, Memorystore, BigQuery/Cloud SQL, and an Iceberg-based lake; we build primarily in Python/TypeScript (plus Rust).
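Since the stack centers on Pub/Sub-orchestrated jobs instantiated per tenant, here is a minimal sketch of how such a trigger might look (an assumption for illustration, not Parable's actual code) using the standard google-cloud-pubsub client; the project, topic, and attribute names are hypothetical.

```python
import json
from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

def enqueue_ingestion_job(project_id: str, tenant: str, connector: str, since: str) -> str:
    """Publish a per-tenant job message; a worker subscribed to the topic picks it up."""
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project_id, "ingestion-jobs")  # hypothetical topic name
    payload = {"tenant": tenant, "connector": connector, "since": since}
    future = publisher.publish(
        topic_path,
        data=json.dumps(payload).encode("utf-8"),
        # Message attributes allow routing/filtering without decoding the payload.
        tenant=tenant,
        connector=connector,
    )
    return future.result()  # message ID once the publish is acknowledged

# Example: backfill Okta audit logs for one tenant since a given timestamp.
# enqueue_ingestion_job("acme-tenant-project", "acme", "okta", "2025-09-01T00:00:00Z")
```

Because each tenant runs in its own GCP project, the same code can be pointed at a different project and topic purely through configuration, which is consistent with the "shared, parameterized pipelines" model described above.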
# The Raw Data team

This team's mission is to **productize landing client data**—from SaaS APIs and custom/on-prem systems—into each client's private data lake. They are building a **Connector Factory** (to reduce the time to new connectors and taps), **ingestion observability**, and **first-class API/documentation** so clients and internal teams can self-serve.

# You'll be responsible for:

- **Owning client integrations end-to-end** – Work directly with IT/SecOps to generate credentials (OAuth2, PATs, service principals), define least-privilege scopes, turn on audit exports/webhooks or scheduled jobs, and validate data flow in their isolated VPC.
- **Owning automations end-to-end** – Turn client setup learnings into repeatable, automated processes, backed by internal systems that keep clients and our CX teams up to date on the health and state of client integrations without human work.
- **Standing up production-grade ingestion** – Plan backfills vs. incremental sync; handle rate limits/pagination; design retries/idempotency using our Pub/Sub-orchestrated jobs (see the sketch after this list).
- **Building connectors & taps** – Implement new sources in Python/TypeScript (plus Rust where useful); contribute scaffolds/CLI to shrink "time-to-first-tap."
- **Instrumenting ingestion health** – Expose coverage windows, lag, error budgets, and volumes—visible to clients and internal teams.
- **Writing crisp setup docs** – Produce step-by-step guides for token admin flows and source-specific quirks; align with the client-facing App's self-service token UX.
- **Partnering across streams** – Ensure Raw Data unblocks Ontology & Context Mining by delivering the right datasets and documenting semantics for mapping into our opinionated schema.
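As referenced in the production-grade ingestion bullet, here is a minimal sketch of the pattern it describes: cursor-based incremental sync with pagination, bounded retries with backoff for rate limits, and idempotent downstream writes keyed by record ID. It assumes a generic REST source; the endpoint, parameter, and field names are hypothetical, so treat it as an illustration of the pattern rather than a Parable connector.

```python
import time
import requests

def sync_incremental(base_url: str, token: str, cursor: str | None) -> tuple[list[dict], str | None]:
    """Pull every page newer than `cursor`; return the records plus the new cursor to persist."""
    records: list[dict] = []
    page_token = None
    while True:
        params = {"updated_after": cursor, "page_token": page_token}  # hypothetical API params
        resp = _get_with_retries(f"{base_url}/v1/events", token, params)
        body = resp.json()
        records.extend(body["items"])
        page_token = body.get("next_page_token")
        if not page_token:
            break
    # Downstream writes should upsert on a stable record ID, so replaying an
    # overlapping window after a crash cannot create duplicates (idempotency).
    new_cursor = max((r["updated_at"] for r in records), default=cursor)
    return records, new_cursor

def _get_with_retries(url: str, token: str, params: dict, max_attempts: int = 5) -> requests.Response:
    """Retry on 429/5xx with exponential backoff; fail loudly on anything else."""
    for attempt in range(max_attempts):
        resp = requests.get(url, params=params, headers={"Authorization": f"Bearer {token}"}, timeout=30)
        if resp.status_code == 429 or resp.status_code >= 500:
            time.sleep(min(2 ** attempt, 60))
            continue
        resp.raise_for_status()
        return resp
    raise RuntimeError(f"gave up after {max_attempts} attempts: {url}")
```

The split between a full backfill and an incremental sync is then just a question of which cursor you start from.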
# This role is for someone who:

- **Is passionate about data infrastructure productization** – you believe that removing friction from data ingestion directly accelerates client value.
- **Balances technical depth and product leadership** – you can review APIs, write requirements, and contribute to technical docs, while also driving stakeholder alignment across Product, Engineering, Sales, and Customer Success.
- **Appreciates the details involved in solving complexity** – you enjoy figuring out why specific API scopes and permissions don't work, and troubleshooting them with high velocity.
- **Enjoys crafting documentation** – you love writing explainer content for both developers and clients.
- **Sees GenAI as an enabler** – you're excited to use GenAI to scaffold taps from an API spec.

# Requirements

- Significant experience (generally 7–10+ years) shipping **enterprise data integrations** as a solutions, forward-deployed, implementation, or software engineer.
- Mastery of **enterprise admin/security models**: OAuth2/SAML/SCIM, service principals, RBAC, audit logs, IP allow-listing/egress controls (see the sketch after this list).
- Hands-on with enterprise systems like **Workday, Oracle Fusion, Microsoft 365, Okta, Salesforce, Netsuite** and their APIs, exports, and permission models.
- Proven ability to **debug** auth/permissions, rate limits, schema mismatches, and data-quality issues in production.
- Strong proficiency in **Python or TypeScript** (bonus: Golang) and familiarity with **GCP** (Cloud Run Jobs, Pub/Sub, Cloud Storage, IAM, VPCs).
- Security-first mindset aligned with **per-tenant isolation and KMS** boundaries; comfortable engaging client security teams.
- Communication and leadership consistent with **senior/staff-level** expectations in our framework (strong execution, cross-functional influence, and client-facing clarity).
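The admin/security-model requirement is mostly about setup and scoping work in each vendor's console, but the machine-to-machine piece commonly reduces to an OAuth2 client-credentials exchange. Below is a minimal, hedged sketch of that flow with `requests`; the token URL, scope string, and API endpoint are placeholders, since every system in the list above names these differently.

```python
import requests

def fetch_access_token(token_url: str, client_id: str, client_secret: str, scope: str) -> str:
    """OAuth2 client-credentials grant: exchange an app's credentials for a bearer token."""
    resp = requests.post(
        token_url,
        data={
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
            "scope": scope,  # request only the least-privilege scopes the connector needs
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

# Placeholder values; a real tenant would pull these from a secret manager, never from code.
token = fetch_access_token(
    "https://login.example.com/oauth2/token",
    client_id="connector-app",
    client_secret="<from-secret-manager>",
    scope="audit_logs.read",
)
audit = requests.get(
    "https://api.example.com/v1/audit-logs",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
```

Real integrations differ mainly in how those credentials are issued and scoped by client admins, which is the part of the work this posting emphasizes.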
**Other nice-to-haves include:**

- Building **Iceberg/Delta-style** lakes, streaming/batch ingestion patterns, and data-pipeline observability (see the sketch at the end of this posting).
- Authoring customer-ready docs and shaping **self-serve token/admin** experiences with the App + TPM.

Apply: https://jobs.ashbyhq.com/parable/3ab70155-2a5e-466f-ae87-e9b40c1062e6 (listed 2025-09-10)
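As referenced in the first nice-to-have, data-pipeline observability here largely means making ingestion health legible: coverage windows, lag, and volumes per connector run. A minimal sketch of such a per-run health record follows; the field names are hypothetical and simply mirror the "Instrumenting ingestion health" responsibility above.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class ConnectorRunHealth:
    """One row per connector run: enough to answer 'is this tenant's data fresh and complete?'"""
    tenant: str
    connector: str
    window_start: datetime      # coverage window actually fetched in this run
    window_end: datetime
    records_ingested: int
    errors: int
    finished_at: datetime

    @property
    def lag_seconds(self) -> float:
        """How far the freshest ingested data trails the moment the run finished."""
        return (self.finished_at - self.window_end).total_seconds()

run = ConnectorRunHealth(
    tenant="acme",
    connector="okta_audit_logs",
    window_start=datetime(2025, 9, 9, 0, 0, tzinfo=timezone.utc),
    window_end=datetime(2025, 9, 10, 0, 0, tzinfo=timezone.utc),
    records_ingested=48_210,
    errors=0,
    finished_at=datetime(2025, 9, 10, 0, 7, tzinfo=timezone.utc),
)
print(run.lag_seconds, asdict(run)["connector"])
```

Surfacing a table of records like this per tenant is one concrete way to make ingestion health "visible to clients and internal teams," as the responsibilities list puts it.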