About the role
Requirements
Expertise with Python or a JVM programming language (e.g., Java, Scala)
Expertise with SQL databases (e.g., Postgres)
2+ years of experience designing, maintaining, and orchestrating ETL pipelines (e.g., Apache Spark, Apache Airflow) in cloud-based environments (e.g., GCP, AWS, or Azure)
Responsibilities
Build and maintain ETL pipelines to process and export record data to the Sayari Graph application
Develop and improve entity resolution processes
Implement logic to calculate and export risk information
Work with the product team and other development teams to collect and refine requirements
Run and maintain regular data releases
Preferred qualifications
Experience with entity resolution, graph theory, and/or distributed computing
Experience with Kubernetes
Experience working as part of an agile development team using Scrum, Kanban, or similar
About the company
Sayari provides intelligence on counterparty and supply chain risks, helping businesses and analysts make informed decisions. The platform integrates trade data from 65 countries with global corporate ownership data, allowing clients to identify regulatory, reputational, and other risks. Operating on a subscription-based model, Sayari offers features like batch supplier screening and detailed reporting. The company's goal is to ensure transparency and mitigate risk in global commerce by providing a complete view of supply chains and corporate relationships.