January 11, 2022

Python developer (vacancy inactive)

Dnipro, remote

Required skills:

  • 5+ years of experience with modern Python (modern frameworks)
  • Experience with modern database architecture and management techniques (PostgreSQL, MongoDB, BigQuery)
  • Strong experience with Docker and containerization
  • Used to working in a fast-paced, agile environment
  • Networking skills (TCP/UDP, IP, proxy, VPN, etc.)
  • Knowledge of authorization and authentication (JWT, OAuth)
  • Version control and continuous integration tools (GitHub CI, Jenkins, Travis CI, etc.)
  • Experience building services that leverage cloud-based infrastructure (GCP, AWS, etc.)
  • Good understanding of Linux

Will be a plus:

  • Experience with modern JavaScript & one JavaScript framework (Vue.js, React.js, Angular, etc.)
  • Experience with Gradle

We offer:

  • Become a known Airbyte contributor with us!
  • Work in a highly efficient team
  • Automated development environment: use it and contribute to it!
  • Professional growth
  • Airbyte is outperforming its community OKRs: subscriptions, new users, contributors. Airbyte is the fastest-growing Slack community
  • Develop highly reliable technical solutions that are exposed to other platform contributors and aim to become a benchmark in connector development
  • Leadership and mentorship program

Duties:

  • Building connectors for data sources and destinations, which is a critical part of what will make Airbyte successful. Some of the things you’ll focus on building are connectors like:
  • Python work will be mostly focused on APIs and frameworks for building APIs (REST, SOAP, GraphQL, etc.)
  • Building abstractions and frameworks to increase leverage in developing connectors
  • Building & maintaining test frameworks to ensure high quality & performance
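The connector pattern these duties describe can be sketched in a few lines. The class and method names below are hypothetical and for illustration only (real Airbyte connectors are built with Airbyte's own tooling); the sketch shows the core idea of pulling records from a source and emitting them as a stream of JSON lines:

```python
import json


class RestApiSource:
    """Minimal source-connector sketch (hypothetical interface): pull records
    from an API via an injected fetch function and emit them as JSON lines,
    mimicking the record-per-line protocol connectors typically use."""

    def __init__(self, fetch):
        # `fetch(stream)` returns a list of dict records; it is injected so the
        # HTTP layer (auth, retries, pagination) stays swappable and testable.
        self.fetch = fetch

    def read(self, stream):
        # Wrap each source record with its stream name and serialize it.
        for record in self.fetch(stream):
            yield json.dumps({"stream": stream, "data": record})


# Usage with a stubbed API in place of a real HTTP client:
fake_api = lambda stream: [{"id": 1}, {"id": 2}]
source = RestApiSource(fake_api)
lines = list(source.read("users"))
```

Injecting the fetch function is what makes "building & maintaining test frameworks" tractable: the same connector logic runs against stubs in tests and a real HTTP client in production.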

Our goals:

  • Work with us to make Airbyte the number-one ETL platform in the world. Right now we are in the top 3
  • We will achieve it by building a large number of connectors. Today Airbyte has 70; the goal is to have 200 connectors by the end of the year.


    About the project
    Open-source data integration platform that gives your infrastructure superpowers to move data seamlessly. The main responsibility is building connectors for data sources and destinations, which is a critical part of what will make the project successful.

    Data sources and destinations like:
    Relational databases: OracleDB, MSSQL, MySQL, and others
    Data Lakes: S3, GCS, Azure Blob Storage, Databricks
    Warehouses: BigQuery, Snowflake, ClickHouse
    Queues: Kafka, Google Pubsub, Amazon Kinesis, Amazon SQS

    Working knowledge of English required

    Proactive communication: if requirements are not clear or you need support from the team, you speak up quickly so no time is wasted.

    Built for extensibility:
    Adapt an existing connector to your needs or build a new one with ease.

    Optionally normalized schemas: Entirely customizable; start from raw data or from suggested normalized schemas.

    Full-grade scheduler: Automate your replications with the frequency you need.

    Real-time monitoring: We log all errors in full detail to help you understand what went wrong.

    Incremental updates: Automated replications are based on incremental updates to reduce your data transfer costs.
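The cursor-based incremental sync described above can be sketched as follows. This is illustrative only, not Airbyte's actual implementation; the `updated_at` cursor field and the state shape are assumptions for the example:

```python
def incremental_sync(records, state, cursor_field="updated_at"):
    """Return only records newer than the saved cursor, plus the new state.

    `state` holds the highest cursor value seen so far; on the next run only
    records past that point are transferred, reducing data-transfer costs.
    """
    last_seen = state.get(cursor_field)
    new_records = [
        r for r in records
        if last_seen is None or r[cursor_field] > last_seen
    ]
    if new_records:
        # Advance the cursor to the newest record we just synced.
        state = {cursor_field: max(r[cursor_field] for r in new_records)}
    return new_records, state


# First run syncs everything; the second run transfers only the newer record.
batch1 = [{"id": 1, "updated_at": "2022-01-01"},
          {"id": 2, "updated_at": "2022-01-05"}]
synced, state = incremental_sync(batch1, {})
batch2 = batch1 + [{"id": 3, "updated_at": "2022-01-10"}]
delta, state = incremental_sync(batch2, state)
```

ISO-8601 date strings compare correctly lexicographically, which is why plain `>` works as the cursor comparison here.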

    Manual full refresh: Sometimes, you need to re-sync all your data to start again.

    Debugging autonomy: Modify and debug pipelines as you see fit, without waiting.