
Snowflake Data Engineer

Min Experience

10 years

Location

Fort Mill, SC

Job Type

full-time

About the job

This job is sourced from a job board.

About the role

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Job Location

Fort Mill, SC (hybrid)

Key Responsibilities

- Design, develop, and maintain data pipelines and ETL processes using Snowflake, AWS services, Python, and DBT.
- Collaborate with data scientists and analysts to understand data requirements and implement solutions.
- Optimize data workflows for performance, scalability, and reliability.
- Troubleshoot and resolve data-related issues in a timely manner.
- Stay updated on the latest technologies and best practices in data engineering.

Required Skills

- Snowflake, Snowpark: 10+ years of experience with the Snowflake data warehousing platform, and proficiency with Snowpark for data processing and analytics.
- DBT: Experience with DBT (Data Build Tool) for modeling data and creating data transformation pipelines is a plus.
- AWS services (Airflow): Hands-on experience with AWS services, particularly Apache Airflow, for orchestrating complex data workflows and pipelines.
- AWS services (Lambda): Proficiency in AWS Lambda for serverless computing and event-driven architecture is essential for this role.
- AWS services (Glue): Well-versed in AWS Glue for ETL (Extract, Transform, Load) processes and data integration.
- Fivetran (HVR): Working knowledge and hands-on experience with Fivetran HVR.
- Python: Strong programming skills in Python for developing data pipelines, data transformations, and automation tasks.

Qualifications

- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience in data engineering roles with a focus on Snowflake, AWS services, Python, and DBT.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
- Technical certifications are a plus.

About the company

Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fuelled by its market leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Skills

snowflake
snowpark
dbt
aws
airflow
lambda
glue
fivetran
python