We are seeking a Senior Data Engineer with a strong desire and drive to contribute to the next generation of our innovative product. As a Senior Data Engineer at Metadata, you will join a globally distributed team spanning 13 countries, make deep technical contributions and decisions, manage your own priorities without supervision, participate in innovative tech design, understand how your work fits in with related projects and components, and help the team course-correct when necessary.
It is a rare opportunity to boost your career growth, learn a lot, meet talented people, and work with cutting-edge technologies to transform B2B Marketing.
The learning curve is steep and challenging, and the day-to-day life of a Senior Data Engineer requires sharp focus, exceptional attention to detail, an analytical approach, and continuous learning.
If this sounds like the right opportunity for you, we want to hear from you as soon as possible!
- Design and implement workflows for the ingestion and transformation of new data sources to be used across the company.
- Generate and distribute reports based on existing data and newly ingested external data sources.
- Evolve the current Data Engineering infrastructure to make it scalable and ready for projected future data volumes.
- Maintain and redesign the current data lake to empower the analyst and ML teams.
- Maintain and deploy ML models, set up the necessary infrastructure to run those models, and distribute the predictions to the appropriate place.
- 5+ years of experience as a Software Engineer.
- Proficient in Python and SQL.
- Experience with Apache Spark, Python libraries for parallelization, building REST API clients, and data manipulation libraries such as Pandas.
- Knowledge of MySQL and relational databases.
- Understanding of database normalization, denormalization, and star and snowflake schemas.
- Experience with search engines such as Elasticsearch or Solr.
- Experience with key/value databases like Redis.
- Experience with streams and message brokers such as RabbitMQ and Kafka.
- Good knowledge of data formats such as JSON, Parquet, Delta, Avro, and Protocol Buffers.
- Comfortable with Linux (basic shell scripting), GitLab, and Jira, with a good grasp of DevOps and CI/CD concepts.
- Proven experience taking a feature request from tech design through end-to-end implementation and code review.
- Familiarity with cloud technologies such as Azure, AWS, or DigitalOcean.
- Excellent understanding of design patterns, data structures, and algorithms.
- Strong logical and analytical skills.
- Experience working in Agile environments and with remote teams.
Experience with Node.js and building web services with Python Flask is a bonus.
- Flexible remote work
- Competitive salary based on your level
- Flexible PTO and holidays
- Friendly, supportive team
- Business trips to the USA and other locations (optional)