o9 Solutions, Inc.
Website: o9solutions.com
Job details:
Role: Senior Technical Consultant (Data Engineer) - RDAF/MFG
Location: Bangalore, Hybrid
About the role...
We are looking for a Senior Technical Consultant – RDAF/MFG with strong experience in data engineering, ETL pipelines, and cloud-based data platforms. The role involves designing and implementing data ingestion and transformation pipelines, collaborating with functional teams, and ensuring seamless data integration across multiple systems.
What you will do in this role:
Advanced Execution & Data Management
- Manage and execute complex project tasks related to data ingestion, transformation, validation, and publishing.
- Review and analyze customer-provided data to understand its technical and functional intent along with its dependencies.
- Work closely with functional teams to understand end-to-end data flows and integration requirements.
Data Integration & Pipeline Development
- Build and maintain data ingestion and egress pipelines capable of handling large volumes of data.
- Develop data transformation functions using technologies such as Python, PySpark, SQL, and SSIS.
- Integrate data from various sources, including Teradata, SAP ERP, SQL Server, Oracle, Sybase, ODBC connectors, and flat files, via API or batch integrations (see the illustrative sketch below).
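
To give a flavor of this work, here is a minimal, purely illustrative PySpark sketch of a batch ingestion-and-transform pipeline. The connection details, table names, columns, and paths are hypothetical placeholders, not o9 systems:

    # Illustrative only: connection details, tables, columns, and paths are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("demand-ingest").getOrCreate()

    # Ingest a batch extract from a relational source over JDBC
    # (assumes the appropriate JDBC driver is on the classpath).
    orders = (
        spark.read.format("jdbc")
        .option("url", "jdbc:sqlserver://erp-host:1433;databaseName=sales")  # hypothetical host
        .option("dbtable", "dbo.orders")
        .option("user", "etl_user")
        .option("password", "***")
        .load()
    )

    # Ingest a flat-file feed, e.g. a customer-provided CSV drop.
    shipments = spark.read.csv("/landing/shipments/*.csv", header=True, inferSchema=True)

    # Validate and transform: drop bad rows, derive a date key, aggregate.
    daily_demand = (
        orders.filter(F.col("quantity") > 0)
        .withColumn("order_date", F.to_date("order_ts"))
        .groupBy("item_id", "location_id", "order_date")
        .agg(F.sum("quantity").alias("demand_qty"))
    )

    # Publish to the analytics layer as partitioned Parquet.
    daily_demand.write.mode("overwrite").partitionBy("order_date").parquet("/curated/daily_demand")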
Production Deployment & Hypercare
- Support production deployment activities.
- Assist in issue triaging, testing, and root cause analysis.
- Resolve batch automation disruptions in a timely manner to ensure customer SLA compliance and accurate data delivery.
Technical Leadership
- Provide technical guidance to junior consultants and review their code to ensure alignment with best practices and quality standards.
- Follow and promote o9's ways of working and industry standards for efficient project execution.
What you’ll have...
- 3+ years of experience in Data Architecture, Data Engineering, or related fields.
- Strong experience in data modelling, ETL processes, and cloud-based data platforms.
- Hands-on experience with Python, PySpark, and SQL.
- Experience with workflow management tools such as Airflow, as well as SSIS (see the illustrative sketch after this list).
- Experience working with Parquet, JSON, RESTful APIs, HDFS, and query frameworks such as Hive and Presto.
- Strong SQL expertise, including query writing and working with relational databases.
- Experience with version control platforms such as GitHub or Azure DevOps.
- Familiarity with Agile development methodologies.
- Strong problem-solving skills and a proactive learning mindset.
- Excellent verbal and written communication skills.
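
As a purely illustrative example of the workflow-management experience mentioned above, here is a minimal Airflow DAG wiring an ingest, transform, and publish step. The DAG id, schedule, and task callables are hypothetical, and the sketch assumes Airflow 2.4 or later:

    # Illustrative only: DAG id, schedule, and callables are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def ingest():
        print("pull batch extracts from source systems")


    def transform():
        print("run PySpark/SQL transformations")


    def publish():
        print("publish curated tables and notify downstream consumers")


    # A simple daily pipeline: ingest -> transform -> publish.
    with DAG(
        dag_id="daily_demand_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t1 = PythonOperator(task_id="ingest", python_callable=ingest)
        t2 = PythonOperator(task_id="transform", python_callable=transform)
        t3 = PythonOperator(task_id="publish", python_callable=publish)
        t1 >> t2 >> t3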
Good to Have
- Hands-on experience with Delta Lake.
- Experience working with Supply Chain Planning applications.
- Exposure to cloud platforms such as AWS, Azure, or Google Cloud.
What we’ll do for you:
- Flat organization: With a very strong entrepreneurial culture (and no corporate politics).
- Great people and unlimited fun at work.
- Possibility to really make a difference in a scale-up environment.
- Support network: Work with a team you can learn from every day.
- Diversity: We pride ourselves on our international working environment.
- Davos Insight (Turning VUCA into value with Neuro-Symbolic AI): https://l1nk.dev/ngQoe
- Work-Life Balance: https://youtu.be/IHSZeUPATBA?feature=shared
- Feel part of a team: https://www.youtube.com/watch?v=O6B2lEJl2oI
How the process works...
- Set up your profile in Workday to track your application status (link to apply). If you have already done this, please ignore this step.
- Respond to us to confirm your interest.
- We'll contact you by video or phone call, whichever you prefer, to share the interview schedule.
- During the interview phase, you will meet the technical panel for 60-minute sessions: two rounds of technical discussion followed by a managerial round.
- We will contact you after each interview to let you know whether we'd like to progress your application.
- We will let you know if you’re the successful candidate.
Good luck!
Click on Apply to learn more.