The Hartford
Website: thehartford.com
Job details:
IND Staff Engineer - GCC097
We’re determined to make a difference and are proud to be an insurance company that goes well beyond coverages and policies. Working here means having every opportunity to achieve your goals – and to help others accomplish theirs, too. Join our team as we help shape the future.
- Data engineering experience including data solutions, SQL and NoSQL, Snowflake, ETL/ELT tools, CI/CD, big data, cloud technologies (AWS/GCP/Azure), Python/Spark, data mesh, data lake, or data fabric.
- Mastery level data engineering and architecture skills, including deep expertise in data architecture patterns, data warehouse, data integration, data lakes, data domains, data products, business intelligence, and cloud technology capabilities.
- Expertise with cloud platforms (AWS, GCP, or Azure) and containerization technologies (Docker, Kubernetes).
- Data engineering experience focused on supporting Generative AI technologies.
- Hands-on experience with Snowflake.
- Experience building data and AI pipelines that bring together structured, semi-structured, and unstructured data. This includes pre-processing with extraction, chunking, embedding, and grounding strategies, semantic modeling, and preparing the data for models and agentic solutions.
- Strong hands-on experience implementing production-ready, enterprise-grade GenAI data solutions.
- Experience with prompt engineering techniques for large language models.
- Experience in implementing Retrieval-Augmented Generation (RAG) pipelines, integrating retrieval mechanisms with language models.
- Intermediate proficiency in processing and leveraging unstructured data for GenAI applications.
- Intermediate proficiency in implementing scalable AI-driven data systems supporting agentic solutions (AWS Lambda, S3, EC2, LangChain, LangGraph, MCP, A2A).
- Strong programming skills in Python and familiarity with deep learning frameworks such as PyTorch or TensorFlow.
- Experience with vector databases, graph databases, NoSQL, and document DBs, including design, implementation, and optimization (e.g., AWS OpenSearch, GCP Vertex AI, Neo4j, Spanner Graph, Neptune, MongoDB, DynamoDB).
- Mastery in implementing data governance practices, including data quality, lineage, and data catalog capture, applied holistically, strategically, and dynamically on a large-scale data platform.
- Strong written and verbal communication skills and ability to explain technical concepts to various stakeholders.
- Expert-level collaboration skills across teams, including decision making, conflict resolution, and relationship building.
- Expertise in mentoring and developing junior AI or data engineers.
- Familiarity with evolving industry design patterns for AI.
- Strong planning, organization, and execution skills.
- Ability to provide thought leadership to dynamic and collaborative teams, demonstrating excellent interpersonal skills and time management capabilities.
- Ability to understand and align deliverables to the departmental and organization strategies and objectives.
- Ability to lead successfully in a lean, agile, and fast-paced organization, leveraging Scaled Agile principles and ways of working.
- Ability to translate complex technical topics into business solutions and strategies, as well as turn business requirements into a technical solution.
Click Apply to learn more.