AWS DevOps Engineer (5-10 Years), Only Pune and Hyderabad

Location

Hyderabad, Telangana, India

Job Type

Full-time

About the job

This job is sourced from a job board.

About the role

Tata Consultancy Services

Website: tcs.com
Job details:

Dear Professionals,

Greetings from Tata Consultancy Services (TCS)!!!

Job Title: AWS DevOps Engineer

Experience: 5-10 Years

Location: Only Pune and Hyderabad

Mode of Work: Work from Office

Mode of Interview: Virtual


Responsibilities:

  • Create Jenkins CI pipelines that integrate Sonar/security scans and test automation scripts.
  • Work as part of the DevOps QA and AWS team focused on building CI/CD pipelines.
  • Write and maintain Jenkins pipelines.


Requirements:

  • Experience as a Data Engineer on the AWS stack, with experience in DevOps tooling.
  • AWS Solutions Architect or AWS Developer certification required.
  • Solid experience with AWS services such as CloudFormation, S3, Athena, Glue, Glue DataBrew, EMR/Spark, RDS, Redshift, DataSync, DMS, DynamoDB, Lambda, Step Functions, IAM, KMS, SM, EventBridge, EC2, SQS, SNS, Lake Formation, CloudWatch, and CloudTrail.
  • Implement high-velocity streaming and orchestration solutions using Amazon Kinesis, Amazon Managed Workflows for Apache Airflow, and Amazon Managed Streaming for Apache Kafka (preferred).
  • Solid experience building solutions on AWS data lakes/data warehouses.
  • Analyze, design, develop, and implement data ingestion pipelines in AWS.
  • Knowledge of implementing ETL/ELT for data solutions.
  • Build end-to-end data solutions (ingestion, storage, integration, processing, access) on AWS.
  • Knowledge of implementing RBAC strategies/solutions using AWS IAM and the Redshift RBAC model.
  • Build and implement CI/CD pipelines for the EDP platform using CloudFormation and Jenkins.
  • Programming experience with Python, shell scripting, and SQL.
  • Knowledge of analyzing data using SQL stored procedures.
  • Build automated data pipelines to ingest data from relational database systems, file systems, and NAS shares into AWS relational databases such as Amazon RDS, Aurora, and Redshift.
  • Build automated data pipelines to ingest data from REST APIs into the AWS data lake (S3) and relational databases such as Amazon RDS, Aurora, and Redshift.
  • Solid experience with DevOps practices.
  • Experience with modern source code management and software repository systems (Bitbucket).
  • Experience with programming languages (Python).
  • Experience with scripting languages (Shell, Groovy).
  • Experience with API deployment for tooling integration.
  • Experience using Jenkins and CloudBees (Pipeline as Code, Shared Libraries).
  • Ability to document exceptions, issues, action plans, meeting minutes, and lessons learned accurately and in a timely fashion.
  • Experience administering DevOps tools delivered as SaaS.
  • Experience using DevOps tools (SonarQube, Artifactory, etc.).
  • Experience using build tools (Maven, MSBuild, and Gradle).
  • Experience using containers (Docker).
  • Experience using the Atlassian suite (Jira, Confluence).
  • Experience with Infrastructure as Code using CloudFormation.
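To give candidates a concrete sense of the ingestion work described above (REST APIs into an S3 data lake with a partition-friendly layout), here is a minimal Python sketch. The endpoint, bucket name, and key layout are illustrative assumptions, not details from this posting; a production pipeline would add auth, pagination, retries, and monitoring.

```python
import json
import urllib.request
from datetime import datetime, timezone


def build_s3_key(source: str, run_ts: datetime) -> str:
    """Hive-style partitioned key so Athena/Glue can prune partitions by date."""
    return (
        f"raw/{source}/year={run_ts:%Y}/month={run_ts:%m}/day={run_ts:%d}/"
        f"{source}_{run_ts:%Y%m%dT%H%M%S}.json"
    )


def fetch_records(url: str) -> list:
    """Pull one page of records from a REST endpoint (no auth, for brevity)."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        return json.loads(resp.read())


def ingest(source: str, url: str, bucket: str) -> str:
    """Fetch from the API and land the raw payload in the S3 data lake."""
    import boto3  # imported here so the pure helpers above work without AWS installed

    records = fetch_records(url)
    key = build_s3_key(source, datetime.now(timezone.utc))
    boto3.client("s3").put_object(
        Bucket=bucket, Key=key, Body=json.dumps(records).encode("utf-8")
    )
    return key
```

From S3, the partitioned raw zone can then be crawled by Glue and queried with Athena, or loaded onward into RDS, Aurora, or Redshift as the requirements describe.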


If you are interested in the above opportunity, kindly share your updated resume with pullagura.gokulsaiakhil@tcs.com along with the details below (mandatory):


Name:

Contact No.:

Email ID:

Skillset:

Total experience:

Relevant experience:

Full-time highest qualification (year of completion, with percentage scored):

Current organization details (payroll company):

Current CTC:

Expected CTC:

Notice period:

Current location:

Preferred location:

Any gaps in your education or career (if yes, please specify the duration):

Will you be able to join within 30/45 days? (Yes/No)


Skills

Python
Airflow
AWS
Atlassian
Bitbucket
CI
CloudFormation
CloudWatch
Confluence
data engineer
data ingestion
data lake
data solutions
data warehouse
database
DevOps
Docker
DynamoDB
EC2
end-to-end
ETL
Gradle
Groovy
Jenkins
Jira
Kafka
Lambda
Maven
SaaS
Shell Scripting
Source Code
Spark
SQL
test automation
REST APIs