SoftServe is a global digital solutions company with Ukrainian roots. We are a team of thinkers, doers, dedicated good people who like what they do and do it well. For us, that means a lot.
January 9, 2020

Big Data Software Engineer (Java+GCP), (ID 50680) (vacancy inactive)

Lviv, Dnipro

WE ARE
Transforming the way thousands of global organizations do business by developing the most innovative technologies and processes in Big Data, Internet of Things (IoT), Data Science and experience design.
Our mix of the brightest, most inquiring minds, technical experts, and strategic thinkers allows us to form collaborative relationships with our clients so that we can truly understand their needs and create the experience they demand. The SoftServe Big Data & Analytics team provides high-value technology consulting services, delivers state-of-the-art Big Data solutions to end clients, and has a rich portfolio with numerous success stories across business domains. Our Big Data competency is recognized by Amazon Web Services and Cloudera.
Together with the Software Engineering Institute, the team where this vacancy is open invented the Smart Decisions game to promote learning about the architecture design process based on the Attribute-Driven Design (ADD) method, intended to facilitate the design of complex architectures. Please read more at smartdecisionsgame.com

YOU ARE
A person demonstrating
• Extensive knowledge in at least one of the following programming languages: Java, Scala
• Data processing and computation frameworks: Spark, Spark Streaming, Flink, Kafka Streams
• Experience in stream processing, batch processing and data integration from multiple data sources
• Good understanding of distributed computing, pipe-and-filter, Lambda and Kappa Architecture principles
• Skills with cloud platforms: GCP fundamentals (IAM, VPC, Cloud Storage, Cloud Functions, Cloud Data Fusion, GKE, Data Catalog, Composer), computation services (Dataflow, Dataproc), streaming (Pub/Sub, Kafka), data storage (BigQuery, Cloud SQL, Cloud Spanner, Firestore, Bigtable)
• Knowledge of RDBMS & NoSQL: PostgreSQL, MySQL, Cassandra, HBase, Elasticsearch, Redis, MongoDB, Impala, etc.
• Understanding of migration approaches: Lift & Shift, Replatform, Re-Architect
• Experience with implementation of Enterprise Data Lake and Data Governance
• Data Modeling skills (DWH, Data Marts)
• Proficiency in continuous integration and delivery
• Hadoop stack understanding: YARN, HDFS, Hive, etc.
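The Lambda Architecture principle named in the requirements above can be sketched minimally in plain Java: a batch view precomputed over historical data is merged with a speed-layer view of recent data at query time. This is an illustrative sketch only; the class and method names are hypothetical and use no framework, whereas real pipelines would build the views with tools such as Spark or Flink.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative Lambda Architecture sketch (hypothetical names, no framework).
public class LambdaSketch {
    // Batch layer: event counts computed over the full historical dataset.
    static Map<String, Long> batchView(Iterable<String> history) {
        Map<String, Long> counts = new HashMap<>();
        for (String event : history) counts.merge(event, 1L, Long::sum);
        return counts;
    }

    // Speed layer: the same aggregation over a small window of recent events
    // that the batch layer has not yet processed.
    static Map<String, Long> speedView(Iterable<String> recent) {
        return batchView(recent);
    }

    // Serving layer: answer a query by merging both views.
    static long query(Map<String, Long> batch, Map<String, Long> speed, String key) {
        return batch.getOrDefault(key, 0L) + speed.getOrDefault(key, 0L);
    }

    public static void main(String[] args) {
        Map<String, Long> batch = batchView(List.of("click", "click", "view"));
        Map<String, Long> speed = speedView(List.of("click"));
        System.out.println(query(batch, speed, "click")); // 3
    }
}
```

A Kappa Architecture, by contrast, would drop the separate batch layer and recompute everything by replaying the single event stream.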

YOU WANT TO WORK WITH
• Automating development processes relying on configuration management and continuous integration/delivery best practices
• Involvement in consulting projects, such as technology assessment and evaluation
• Delivering discovery reports or technical solution vision, etc.
• R&D and rapid prototyping
• Involvement in long-term projects with the possibility to learn new technologies and assimilate best practices from experts
• New client engagement and pre-sales support
• A sustainable platform for Big Data solutions

TOGETHER WE WILL
• Work in a team of architects certified by the Software Engineering Institute at Carnegie Mellon University
• Participate in international events
• Earn certifications in cutting-edge technologies
• Implement modern Big Data solutions using our Lambda Architecture accelerator and proven architecture design methodologies
• Have access to strong educational and mentorship programs
• Communicate with world-leading companies from our client portfolio
• Work as a consultant on different projects with a flexible schedule
