Data Engineer - PySpark, SQL

Location

Pune, Maharashtra, India

Job Type

Permanent

About the job

About the role

Join us as a Data Engineer - PySpark, SQL at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. As part of a team of developers, you will deliver the technology stack, using strong analytical and problem-solving skills to understand business requirements and deliver quality solutions.

To be successful as a Data Engineer - PySpark, SQL you should have experience with:

Hands-on experience with PySpark and strong knowledge of DataFrames, RDDs and Spark SQL
Hands-on experience with PySpark performance optimisation techniques
Hands-on experience developing, testing and maintaining applications on AWS Cloud
A strong hold on the AWS data analytics technology stack (Glue, S3, Lambda, Lake Formation, Athena)
Designing and implementing scalable, efficient data transformation and storage solutions with open table formats such as Delta, Iceberg and Hudi
Using DBT (Data Build Tool) with Snowflake, Athena or Glue for ELT pipeline development
Writing advanced SQL and PL/SQL programs
Hands-on experience building reusable components using Snowflake and AWS tools and technology
At least two major project implementations
Exposure to data governance or lineage tools such as Immuta and Alation is an added advantage
Experience with orchestration tools such as Apache Airflow or Snowflake Tasks is an added advantage
Knowledge of the Ab Initio ETL tool is a plus

Some other highly valued skills include:

Ability to engage with stakeholders, elicit requirements and user stories, and translate requirements into ETL components
Ability to understand the infrastructure setup and provide solutions either individually or working with teams
Good knowledge of data marts and data warehousing concepts
Good analytical and interpersonal skills
Implementing a cloud-based enterprise data warehouse across multiple data platforms, including Snowflake and NoSQL environments, to build a data movement strategy
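
To give a flavour of the PySpark DataFrame and Spark SQL work described above, the following is a minimal, illustrative sketch and not part of the original posting. The S3 path, table name and column names (transactions, amount, txn_date) are hypothetical placeholders; the same aggregation is shown through both the DataFrame API and an equivalent Spark SQL query.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Hypothetical example: aggregate daily transaction totals.
    spark = SparkSession.builder.appName("daily-txn-totals").getOrCreate()

    # Read raw data from S3 (placeholder path) into a DataFrame.
    txns = spark.read.parquet("s3://example-bucket/raw/transactions/")

    # DataFrame API: drop non-positive amounts, then aggregate per day.
    daily_totals = (
        txns.filter(F.col("amount") > 0)
            .groupBy("txn_date")
            .agg(
                F.sum("amount").alias("total_amount"),
                F.count("*").alias("txn_count"),
            )
    )

    # The equivalent query expressed in Spark SQL over a temporary view.
    txns.createOrReplaceTempView("transactions")
    daily_totals_sql = spark.sql("""
        SELECT txn_date,
               SUM(amount) AS total_amount,
               COUNT(*)    AS txn_count
        FROM transactions
        WHERE amount > 0
        GROUP BY txn_date
    """)

    daily_totals.show()

In an AWS Glue or similar managed Spark job, the same logic would typically read from tables registered in a data catalogue and write the result to an open table format such as Delta or Iceberg.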

About the company

Barclays is built on an international scale. Our geographic reach, our wide variety of functions, businesses, roles and locations reflect the rich diversity of our worldwide customer base. All of which means we offer incredible variety, depth and breadth of experience. And the chance to learn from a globally diverse mix of colleagues, including some of the very best minds in banking, finance, technology and business. Throughout, we'll encourage you to embrace mobility, exploring every part of our operations as you build your career.

Skills

pyspark
sql
aws
glue
s3
lambda
athena
delta
iceberg
hudi
dbt
snowflake
plsql
airflow
ab-initio