Parimatch Tech is an international product company that develops and provides tech and marketing solutions for the Gaming & Entertainment industry. Headquartered in Cyprus and operating globally, the company has R&D centers located in four countries, the largest of which is in Ukraine.
Hi-tech solutions and innovative approaches drive Parimatch Tech forward in the gaming market and remain the basis of its success and development.
DataAPI is the key product of the Data Platform team: it gives every other part of the PM platform access to fresh, relevant data for personalized content, communications, and user journeys. Any platform component that uses customer data in production to improve the user experience needs a reliable real-time data source, and that source is DataAPI.
We invite those who are fired up to:
- Lead the architecture design and development process of the Data API product (including data modeling) to meet product expectations and technical requirements (peak loads of up to 5-10K RPS);
- Support integrations of external (data contributors) and internal (data providers) services with DataAPI and keep them running reliably in production;
- Develop, conceive, and improve Data API product features;
- Run regular decomposition and planning sessions with the team;
- Drive technical excellence and product evolution, as DataAPI is a key part of the system;
- Manage the technical backlog and technical debt;
- Set mid-term technical objectives and team goals;
- Maintain clear, transparent docs-as-code documentation;
- Mentor the team.
Essential professional experience:
- 5+ years of experience as a Python / Data Engineer;
- Extensive knowledge of software design best practices and design patterns;
- Hands-on experience designing and implementing RESTful APIs (aiohttp, Flask, FastAPI);
- Hands-on experience with job scheduling and task queues;
- Hands-on experience with AWS S3, Athena, Aurora, Redshift;
- Hands-on experience with Linux, Docker, Kubernetes;
- Experience with unit testing, integration testing, and performance profiling;
- Knowledge of SQL performance tuning, partitioning, and indexing;
- Experience building high-load applications;
- Hands-on experience with ETL and data warehousing tasks;
- Solid understanding of Git flow best practices;
- Exceptional problem-solving, technical, and data analysis skills;
- Strong Computer Science fundamentals.
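As a small illustration of the task-queue experience listed above, here is a sketch using only Python's standard asyncio primitives; the worker count, integer jobs, and doubling "processing" step are arbitrary placeholders.

```python
import asyncio

async def worker(name: str, queue: "asyncio.Queue", results: list) -> None:
    # Pull jobs until a None sentinel signals shutdown.
    while True:
        job = await queue.get()
        if job is None:
            queue.task_done()
            break
        results.append((name, job * 2))  # stand-in for real processing
        queue.task_done()

async def run_jobs(jobs, n_workers: int = 3) -> list:
    queue: asyncio.Queue = asyncio.Queue()
    results: list = []
    workers = [
        asyncio.create_task(worker(f"w{i}", queue, results))
        for i in range(n_workers)
    ]
    for job in jobs:
        queue.put_nowait(job)
    for _ in range(n_workers):  # one sentinel per worker
        queue.put_nowait(None)
    await queue.join()          # wait until every item is marked done
    await asyncio.gather(*workers)
    return results
```

Calling `asyncio.run(run_jobs([1, 2, 3, 4]))` returns each job doubled and tagged with the worker that handled it; a production system would typically use a broker-backed queue (e.g. Celery or an SQS consumer) instead of an in-process one.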
Desirable skills and personal features:
Hands-on experience with the following technologies:
- Apache Airflow;
- NoSQL databases (Elasticsearch, Redis, MongoDB);
- Kafka, Kafka Connect;
- IaC: Terraform, Ansible;
- Data visualisation tools (Tableau, Power BI, Superset, Grafana, Kibana, etc.);
- Golang.