AWS Data Engineer (Senior)

Min Experience

5 years

Location

Remote (India), Mumbai

Job Type

Full-time

About the role

Mactores is a trusted leader in providing modern data platform solutions to businesses. Since 2008, Mactores has been enabling businesses to accelerate their value through automation by providing end-to-end data solutions that are automated, agile, and secure. We collaborate with customers to strategize, navigate, and accelerate an ideal path forward to digital transformation via assessments, migration, or modernization.

Mactores is seeking an AWS Data Engineer (Senior) to join our team. The ideal candidate has extensive experience with PySpark and SQL and has built data pipelines using Amazon EMR or AWS Glue. The candidate must also have experience in data modeling and end-user querying with Amazon Redshift or Snowflake, Amazon Athena, and Presto, as well as pipeline orchestration experience with Apache Airflow.

What will you do?

  • Develop and maintain data pipelines using Amazon EMR or AWS Glue.
  • Create data models and support end-user querying using Amazon Redshift or Snowflake, Amazon Athena, and Presto.
  • Build and maintain the orchestration of data pipelines using Apache Airflow.
  • Collaborate with other teams to understand their data needs and help design solutions.
  • Troubleshoot and optimize data pipelines and data models.
  • Write and maintain PySpark and SQL scripts to extract, transform, and load data.
  • Document and communicate technical solutions to both technical and non-technical audiences.
  • Stay up to date with new AWS data technologies and evaluate their impact on our existing systems.

What are we looking for?

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 3+ years of experience working with PySpark and SQL.
  • 2+ years of experience building and maintaining data pipelines using Amazon EMR or AWS Glue.
  • 2+ years of experience with data modeling and end-user querying using Amazon Redshift or Snowflake, Amazon Athena, and Presto.
  • 1+ years of experience building and maintaining data pipeline orchestration using Apache Airflow.
  • Strong problem-solving and troubleshooting skills.
  • Excellent communication and collaboration skills.
  • Ability to work independently and within a team environment.

You will be preferred if you have

  • AWS Data Analytics Specialty Certification
  • Experience with Agile development methodology

Skills

Data Engineering
PySpark
AWS