About the role
Responsibilities:
Design, build, and optimize scalable backend systems for large data processing.
Work with Python and FastAPI to build services and applications.
Develop systems to manage large-scale datasets, including web scraping and data wrangling.
Use PostgreSQL for managing relational data, and GCP/AWS for cloud storage and compute.
Collaborate with data scientists and ML engineers to process and analyze data.
Skills Required:
Strong knowledge of Python and FastAPI.
Experience with PostgreSQL and large-scale databases.
Knowledge of GCP or AWS cloud platforms.
Familiarity with Kubernetes and container orchestration.
Experience building large-scale TTS (text-to-speech) datasets or similar data systems.
Preferred Qualifications:
Experience with web scraping and data wrangling.
Knowledge of machine learning frameworks.