Data Architect

Min Experience

10 years

Location

Bangalore

Job Type

Full-time

About the job

This job is sourced from a job board.

About the role

Tarento Technologies is a dynamic and innovative technology solutions provider, specializing in delivering cutting-edge IT services and solutions. With a strong focus on software development, data analytics, cloud computing, and enterprise applications, Tarento aims to empower businesses to thrive in the digital age. The company's team of experts combines industry knowledge with technical expertise to deliver tailored solutions that meet the unique needs of each client. At Tarento, we believe in fostering a collaborative, growth-oriented environment where creativity and continuous learning are encouraged. As we expand, we are looking for passionate individuals to join our team and help shape the future of technology-driven solutions. If you're ready to take on exciting challenges and grow professionally, Tarento Technologies offers the ideal platform to advance your career.

We are looking for technically strong Data Architects with hands-on experience in Apache Spark and Databricks, and expertise in big data processing, data warehousing, and cloud platforms such as Azure, AWS, or GCP.

Key Responsibilities

Design, develop, and optimize ETL/ELT pipelines using Databricks and Apache Spark (a minimal sketch follows after this section).
Implement data lakes, data warehouses, and lakehouses using Delta Lake.
Develop scalable, high-performance data solutions for batch and streaming data processing.
Optimize Spark jobs for performance and cost efficiency.
Implement data governance, security, and compliance best practices.
Work with CI/CD pipelines for data workflows using tools like Terraform, Git, and DevOps practices.
Collaborate with data analysts, scientists, and business teams to understand data requirements.

Required Skills & Experience

10 years of experience in data engineering.
Strong expertise in Databricks, Apache Spark (PySpark/Scala/Java), and Delta Lake.
Proficiency in SQL, Python, or Scala.
Hands-on experience with ETL/ELT pipeline development.
Experience with cloud platforms (Azure Data Factory, AWS Glue, GCP Dataflow).
Knowledge of data modeling, data lakes, and data warehousing.
Understanding of CI/CD, Git, and DevOps tools.
Strong troubleshooting and performance optimization skills.
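
For illustration only, below is a minimal sketch of the kind of batch ETL pipeline the responsibilities describe, using PySpark with Delta Lake. The storage paths, column names, and table layout are hypothetical and not part of this posting.

# Minimal, illustrative ETL sketch: read raw files with PySpark, cleanse them,
# and write the result to a Delta Lake table. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("orders-etl-sketch")
    # Delta Lake support; assumes the delta-spark package is available.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Extract: read raw CSV files from a hypothetical landing zone.
raw = spark.read.option("header", True).csv("/mnt/landing/orders/*.csv")

# Transform: deduplicate, apply types, and drop invalid rows.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
)

# Load: write to a Delta table, partitioned by order date for query pruning.
(
    clean.withColumn("order_date", F.to_date("order_ts"))
         .write.format("delta")
         .mode("overwrite")
         .partitionBy("order_date")
         .save("/mnt/lakehouse/silver/orders")
)

In practice a pipeline like this would typically run as a scheduled Databricks job, with the Delta path registered as a table for downstream SQL and analytics access.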

About the company

Tarento Technologies is a dynamic and innovative technology solutions provider, specializing in delivering cutting-edge IT services and solutions. With a strong focus on software development, data analytics, cloud computing, and enterprise applications, Tarento aims to empower businesses to thrive in the digital age. The company's team of experts combines industry knowledge with technical expertise to deliver tailored solutions that meet the unique needs of each client.

Skills

apache spark
databricks
etl
elt
delta lake
sql
python
scala
data engineering
cloud computing
data modeling
data warehousing
ci/cd
git
devops