Technical Architect
CLOUDSUFI · Full-time

Required skills: Python, AWS, API, Databricks, enterprise SaaS, Fusion, GCP, Java, OAuth, Oracle, PRDs, SaaS, Salesforce, serialization

About the role
CLOUDSUFI — Website: cloudsufi.com

Must-have
- 5+ years of data engineering, with at least 2 years working on connector or integration framework development
- Deep Python expertise, including PySpark, pyarrow, and an understanding of Spark's execution model (driver vs. executor, serialization constraints, partition fan-out)
- Hands-on experience with at least one SaaS ingestion platform — Fivetran, Airbyte, Google DTS, AWS Glue connectors, or equivalent — at the connector-build level, not just configuration
- Strong understanding of OAuth 2.0 flows (auth code, PKCE, client credentials, JWT), rate-limiting strategies (token bucket, leaky bucket, per-endpoint quotas), and incremental sync patterns (cursor, watermark, CDC)
- Experience designing shared connector frameworks — reusable auth managers, rate governors, state stores — not just per-connector scripts
- Ability to author and own TDDs and PRDs that can be handed to a junior engineer with minimal back-and-forth

Nice-to-have
- Prior exposure to Databricks Asset Bundles / Declarative Automation Bundles or Lakeflow pipelines
- Experience with the Databricks Python Data Source API (DBR 15.4 LTS+) — extremely rare, so practical Spark DSv2 Java/Scala background is treated as equivalent
- GCP DTS or Cloud Data Fusion connector experience (directly transferable — this is CloudSufi's advantage in screening)
- Knowledge of the specific source systems in Raj's list, particularly Social Ads APIs (Meta, LinkedIn, X) or enterprise SaaS (Salesforce, Oracle)

Click on Apply to know more.
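To illustrate the kind of "rate governor" component the must-have list refers to, here is a minimal sketch of a token-bucket limiter. The class name and interface are hypothetical, not from any framework named in the posting; a production version would add thread safety and per-endpoint quota keys.

```python
import time


class TokenBucket:
    """Minimal token-bucket rate governor (illustrative sketch).

    Tokens refill continuously at `rate` per second, up to `capacity`;
    each request consumes one or more tokens, and bursts up to the
    capacity are allowed before requests start being rejected.
    """

    def __init__(self, rate: float, capacity: float):
        self.rate = rate            # refill rate, tokens per second
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity      # start full
        self.last = time.monotonic()

    def try_acquire(self, n: float = 1.0) -> bool:
        """Consume n tokens if available; return False instead of blocking."""
        now = time.monotonic()
        # Refill based on elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= n:
            self.tokens -= n
            return True
        return False
```

A shared connector framework would typically hold one bucket per API endpoint or per tenant, keyed by the quota the source system enforces.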
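The cursor/watermark incremental sync pattern and the "state store" component mentioned above can be sketched together. Everything here — `StateStore`, `incremental_sync`, the `updated_since` parameter — is a hypothetical illustration of the pattern, not an API from Fivetran, Airbyte, or any platform the posting names.

```python
from typing import Any, Callable


class StateStore:
    """In-memory stand-in for a durable per-stream connector state store."""

    def __init__(self) -> None:
        self._state: dict[str, Any] = {}

    def get(self, key: str, default: Any = None) -> Any:
        return self._state.get(key, default)

    def set(self, key: str, value: Any) -> None:
        self._state[key] = value


def incremental_sync(fetch_page: Callable[..., list[dict]],
                     state: StateStore, stream: str):
    """Pull only records newer than the saved cursor, then advance the watermark.

    `fetch_page` stands in for a source-API call that filters records
    server-side by an `updated_since` timestamp (hypothetical signature).
    """
    cursor = state.get(f"{stream}.cursor", "1970-01-01T00:00:00Z")
    records = fetch_page(updated_since=cursor)
    yield from records
    if records:
        # Advance the watermark to the newest record seen this run,
        # so the next sync starts where this one left off.
        state.set(f"{stream}.cursor", max(r["updated_at"] for r in records))
```

Re-running the sync against an unchanged source yields nothing, which is the defining property of a cursor-based incremental pull; CDC would replace the timestamp cursor with a log position.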