UST
Website:
ust.com
Job details:
Role Description
- Design and develop a system to govern and manage batch (big data) and online (microservices) jobs
- Define technical scope and objectives through research and participation in requirements gathering and process definition
- Ingest and process data from various sources in raw, structured, semi-structured, and unstructured formats into the Big Data ecosystem
- Process real-time data feeds using the Big Data ecosystem
- Design, review, implement, and optimize data transformation processes in the Big Data ecosystem
- Test and prototype new data integration tools, techniques, and methodologies
- Participate in overall test planning for application integrations, functional areas, and projects
- Work with cross-functional teams in an Agile/Scrum environment to ensure a quality product is delivered
- 11+ years of hands-on experience with design and development of enterprise-scale applications and systems
- 8+ years of expertise in Big Data technologies in AWS EMR and the Hadoop ecosystem (Spark, Scala, PySpark, Airflow, Kafka, Delta Tables, Iceberg, HBase, Hive, HDFS, MapReduce, etc.)
- Experience designing, building, managing, and operating the infrastructure-as-a-service layer (cloud-based platforms) that supports the different platform services
- Experience with Scala
- Experience with Python
- Experience with the AWS cloud ecosystem
- Experience with MongoDB is a plus
- Strong understanding of data analytics and data visualization
- Experience with AI/machine learning (Python/Keras/TensorFlow) is a plus
- Excellent analytical and problem-solving skills
- Excellent one-on-one communication and presentation skills, specifically the ability to convey technical information in a clear and unambiguous manner
- Working knowledge of the Linux operating system, Bash, and other scripting
Qualification
- Technically focused Bachelor's degree in Computer Science, Engineering, Math, etc.
Skills
big data, scala, pyspark, airflow, delta tables