Data Engineer

Salary

₹6 - 12 LPA

Min Experience

1 year

Location

Coimbatore, Bengaluru / Bangalore

Job Type

full-time

About the job

This job is sourced from a job board.

About the role

We are looking for a Data Engineer who can build a robust database and its architecture. In this role, you will assess a wide range of requirements and apply relevant database techniques to create a sustainable data architecture before beginning implementation and developing the database from scratch.

You are all set to:

Develop, maintain, evaluate, and test big data solutions.
Take part in data engineering activities such as creating pipelines/workflows for Source-to-Target Data Mapping.

You are someone who can:

Design data solutions using Hadoop-based technologies (Hadoop, Azure HDInsight, and Cloudera-based data lakes) with Scala programming.
Liaise with and be part of our extensive GCP community, contributing to the platform's knowledge-exchange learning programme.
Showcase your GCP data engineering experience when communicating with business teams about their requirements, turning these into technical data solutions.
Build and deliver data solutions using GCP products and offerings.
Bring hands-on, deep experience with Google data products (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.).
Bring experience in Spark, Scala, Python, Java, and Kafka.
Ingest data from files, streams, and databases.
Process data with Hadoop, Scala, SQL databases, Spark, ML, and IoT technologies.
Develop programs in Scala and Python for data cleaning and processing.
Design and develop distributed, high-volume, high-velocity, multi-threaded event processing systems.
Develop efficient software code for multiple use cases built on the platform, leveraging Python and big data technologies.
Provide high operational excellence, guaranteeing high availability and platform stability.
Implement scalable solutions to meet ever-increasing data volumes, using big data/cloud technologies (PySpark, Kafka, any cloud computing platform, etc.).
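To make the Source-to-Target Data Mapping responsibility above concrete, here is a minimal, hypothetical sketch in plain Python (standing in for the Spark/Scala pipelines the role actually uses). All field names (CustID, EmailAddr, SignupTS, and their target counterparts) are invented for illustration, not taken from any real Kanini schema.

```python
from datetime import datetime, timezone

# Declarative source-to-target mapping: target column -> (source column, transform).
# The column names here are made up purely to illustrate the pattern.
MAPPING = {
    "customer_id": ("CustID", int),
    "email": ("EmailAddr", str.lower),
    "signup_date": (
        "SignupTS",
        lambda ts: datetime.fromtimestamp(int(ts), tz=timezone.utc).date().isoformat(),
    ),
}

def map_record(source_row: dict) -> dict:
    """Apply the source-to-target mapping to one ingested record."""
    return {
        target: transform(source_row[source])
        for target, (source, transform) in MAPPING.items()
    }

def run_pipeline(rows):
    """Ingest an iterable of raw rows and yield cleaned target rows,
    dropping records that fail the mapping (basic data cleaning)."""
    for row in rows:
        try:
            yield map_record(row)
        except (KeyError, ValueError):
            continue  # malformed record: drop, or route to a dead-letter sink
```

In a real Spark job the same mapping table would drive `select`/`withColumn` expressions over a DataFrame; keeping the mapping declarative is what makes the pipeline easy to extend when new source systems arrive.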

About the company

Kanini is a Digital 2.0 transformation enabler with expertise in Cloud Enablement, Data Analytics & AI, Product Engineering, Intelligent Automation, ServiceNow Solutions, Telehealth, Location Intelligence, IoT, and Field Service Management, all accelerated through flexible engagements to provide a great customer experience. We are charting this path of Digital 2.0 with innovation, agile development, flexible engagements, proven expertise, and a Global Delivery Framework.

Skills

cloud computing
gcp
cloud
scala
spark
eclipse
software development life cycle
data architecture
big data
python