Strong knowledge of Big Data general concepts
Proficiency in Java, Scala, or Python
Knowledge and hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, YARN, Hive)
Experience with AWS
Strong knowledge of a NoSQL database (any popular option)
Strong knowledge of Spark (Core, SQL, Streaming)
Solid understanding of Kafka
Knowledge of Hortonworks Data Platform (or Cloudera)
Experience with public clouds (Azure, Google Cloud, etc.)
Experience troubleshooting production systems
• Professional Development:
— Experienced colleagues who are ready to share knowledge;
— The ability to switch projects and technology stacks, and to try yourself in different roles;
— Over 150 courses for workplace-based training;
— Study and practice of English: courses and communication with colleagues and clients from different countries;
— Support for speakers who present at conferences and technology community meetups.
• The ability to focus on your work: a lack of bureaucracy and micromanagement, and convenient corporate services;
• No dress code, a friendly atmosphere, and concern for the comfort of specialists;
• Flexible schedule and the ability to work remotely;
• The ability to work in any of our development centers.
Our client is a New York City-based FinTech company that is building a software platform for the investment banking industry and a line of products to serve the needs of financial companies.
The Big Data specialist will work on building a system for running complex queries against a structured data warehouse in support of quantitative traders' operations.
The specialist we are looking for will join the distributed team of
Note that we hire specialists not for a specific project, but into one of DataArt’s companies. If a project ends, or you become uncomfortable on it, you can discuss a transition to another project.