— Strong knowledge of general Big Data concepts;
— Strong knowledge of Spark (Core, SQL, Streaming);
— Strong knowledge of Java, Scala, or Python;
— Strong knowledge of Kafka;
— Strong knowledge of the Hadoop ecosystem (HDFS, MapReduce, YARN, Hive);
— Strong knowledge of a NoSQL database (any popular one);
— Experience with AWS.
— Experienced colleagues who are ready to share knowledge;
— The ability to switch projects, try yourself in different roles;
— More than 150 workplaces for advanced training;
— Study and practice of English: courses and communication with colleagues and clients from different countries;
— Support for speakers who give presentations at conferences and technology community meetups.
— The ability to focus on your work: no bureaucracy or micromanagement, and convenient corporate services;
— No dress code, a friendly atmosphere, and care for specialists' comfort;
— A flexible schedule and the option to work remotely;
— The ability to work from any of our development centers.
Our client is a New York City-based FinTech startup building a trading and risk software platform for the investment banking industry — both for in-house use as an investment bank and for sale as a platform-as-a-service.
DataArt has become the client's main technology partner. We have a steady flow of work from the client and plan to grow DataArt's team over the course of the year. The project's processes are built on Agile.
The Big Data specialist will work on building a system for composing complex queries against a structured data warehouse supporting the operations of quant traders.
Core technologies: Spark, Python, Java, AWS, cloud architecture, Apache Airflow, Hadoop, Drill.