Tesco Returnship Program: Software Development Engineering - Data

Min Experience

0 years

Location

Tesco Campus Rd, Vijayanagar, KIADB Export Promotion Industrial Area, Whitefield, Bengaluru, Karnataka 560066, India

Job Type

Full-time

About the job

About the role

Are you looking to return from a career break? The Tesco Returnship Program is an initiative designed to support professionals in re-entering the workforce after a career break by offering a 24-week paid internship, with the potential for full-time employment upon successful completion of the internship, depending on performance and feedback during the program. The program will enable you to work on projects that match your expertise, interests and abilities.

Coding & Development Practices:

1. Good understanding of distributed computing concepts.
2. Very good knowledge of PySpark and the DataFrame APIs (an illustrative sketch follows this section).
3. Understanding of Spark application performance tuning and optimisations. Should have worked on batch data processing; streaming data processing experience is good to have.
4. Good understanding of Hive and its usage with Spark, as well as of open table formats such as Iceberg.
5. Good knowledge of data integration tools such as Sqoop or ADF (Azure Data Factory).
6. Experience with orchestration tools such as Oozie or Airflow (see the sketch after this section).
7. Basic knowledge of shell scripting.
8. Proven ability to write clean code that is maintainable and extensible (design patterns, OOP), and to write unit test cases.
9. Comfortable with Git and GitHub.
10. Experience with a cloud platform such as MS Azure is an added advantage.
11. Automate everything by default; strong debugging and problem-solving capabilities.
12. Understanding of building a CI/CD pipeline.
13. Understanding of automation and applying it to security scanning, implementation and performance testing as part of the build.

Design:

1. Actively participate with the team in evaluating multiple design decisions to choose an appropriate solution based on functional and non-functional requirements.
2. Actively work with the team to understand the requirements and convert them into user stories.
3. Actively participate in design thinking with the team, considering future needs and maximising performance.
4. Ability to design reusable and scalable modules.

Problem Solving:

1. Solves each problem on its own merits.
2. Clarifies requirements and asserts them in unit tests when necessary.
3. Takes an iterative, incremental approach to solving the problem.
4. Communicates and discusses the problem effectively with the team.
5. Is able to re-create the problematic scenario and suggest an ideal solution.

Qualifications: Data Engineering; Big Data Processing; Spark; Hadoop; HBase; Hive; Data Pipelines; Python; Scala; CI/CD; Problem Solving; Leadership skills; Performance analysis & improvements.
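As a rough illustration of the kind of PySpark DataFrame batch work described above, the sketch below reads a Hive table, aggregates it with the DataFrame API and writes the result back. It is only a sketch: the table, column and job names are hypothetical and not taken from this posting.

```python
# Minimal PySpark batch-processing sketch: read a Hive table, aggregate with the
# DataFrame API, and write the result out. All table/column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-sales-rollup")   # hypothetical job name
    .enableHiveSupport()             # lets Spark read managed Hive tables
    .getOrCreate()
)

# Read a (hypothetical) Hive table of raw sales events.
sales = spark.table("retail.sales_events")

# DataFrame-API aggregation: daily revenue per store.
daily_revenue = (
    sales
    .withColumn("sale_date", F.to_date("event_ts"))
    .groupBy("store_id", "sale_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Write the result back as a partitioned table (overwrite for an idempotent batch run).
(
    daily_revenue.write
    .mode("overwrite")
    .partitionBy("sale_date")
    .saveAsTable("retail.daily_store_revenue")
)

spark.stop()
```

Similarly, a minimal Airflow orchestration sketch (assuming Airflow 2.4 or later; the DAG id, schedule and spark-submit command are illustrative, not from the posting) shows how such a batch job might be scheduled:

```python
# Minimal Airflow DAG sketch that triggers the daily Spark batch job above.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_sales_rollup",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    run_rollup = BashOperator(
        task_id="spark_submit_rollup",
        # In practice this would point at the packaged PySpark job.
        bash_command="spark-submit --master yarn daily_sales_rollup.py",
    )
```

These sketches only indicate the flavour of the work; the actual stack, tables and conventions are defined by the team.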

About the company

Tesco Bengaluru: We are a multi-disciplinary team creating a sustainable competitive advantage for Tesco by standardizing processes, delivering cost savings, enabling agility, providing cutting-edge technological solutions and empowering our colleagues to do ever more for our customers. Tesco Technology consists of people from a number of different backgrounds with a common purpose: to serve our shoppers a little better every day with our retail technological solutions. We share a common interest in harnessing innovations in technology to enhance our customers' shopping experience at Tesco stores. Whether making products, software or systems, our teams focus on various aspects, from taking strategic ownership of the architecture to delivering technological solutions such as design, testing, deployment, infrastructure, operation and security of the systems, ensuring agile, smooth and safe operations. This helps us deliver maximum business impact. Teams refine their internal processes to best fit their own needs, working to build core capabilities in applications and services. We collaborate globally across teams to build end-to-end customer-facing solutions, as well as to share knowledge, experience, tools and techniques.

Skills

PySpark
DataFrame APIs
Spark
Hive
data integration tools
orchestration tools
shell scripting
Git
GitHub
cloud platform
CI/CD pipeline