Job Description
This is a remote position, and we are hiring candidates from across the country. AgileEngine is one of the Inc. 5000 fastest-growing companies in the US and a top-3 ranked dev shop according to Clutch. We create award-winning custom software solutions that help companies across 15+ industries change the lives of millions. If you like a challenging environment where you're working with the best and are encouraged to learn and experiment every day, there's no better place - guaranteed! :)
Must Haves
BSc in Computer Science from a top university, or equivalent;
5+ years of experience in data engineering and data pipeline development in high-volume production environments;
2+ years of experience with Java;
2+ years of experience with monitoring systems (Prometheus, Grafana, Zabbix, Datadog);
Ability to design, develop, and maintain end-to-end ETL workflows, including data ingestion and transformation logic across multiple data sources;
Experience with data-engineering cloud technologies such as Apache Airflow, Kubernetes (K8s), ClickHouse, Snowflake, Redis, and other caching technologies;
Experience with relational and non-relational databases; proficiency in SQL and query optimization;
Experience designing infrastructure to maintain high-availability SLAs;
Experience monitoring and managing production environments;
Upper-intermediate English level.
The Benefits of Joining Us
Professional Growth: Accelerate your professional journey with mentorship, TechTalks, and personalized growth roadmaps.
Competitive Compensation: We match your ever-growing skills, talent, and contributions with competitive USD-based compensation and budgets for education, fitness, and team activities.
A Selection of Exciting Projects: Join projects with modern solutions development and top-tier clients that include Fortune 500 enterprises and leading product brands.
Flextime: Tailor your schedule for an optimal work-life balance by choosing between working from home and going to the office - whatever makes you the happiest and most productive.