PandaDoc empowers more than 27,000 growing organizations to thrive by taking the work out of document workflows. PandaDoc provides an all-in-one document workflow automation platform that helps fast-scaling teams accelerate their ability to create, manage, and sign digital documents, including proposals, quotes, and contracts. For more information, please visit www.pandadoc.com.
We’re known for our work-life balance, kind co-workers, and our commitment to learning, making an impact, and having fun. Our Pandas are located all over the globe, and we stay connected with the help of technology to ensure that everyone on our team feels, well, like a team.
Pandas work best when they’re happy. Happiness doesn’t come from a ping-pong table or free snacks. We retain our talent by upholding our values of integrity and transparency, and by selling a product that changes the lives of our customers.
Check out our LinkedIn to learn more.
You will be responsible for building ETL pipelines from scratch, creating new data marts and DWH layers, and managing and maintaining the data warehouse (DWH).
- Build the product part of the data warehouse
- Be a partner in crime for product analysts in their quest to understand the vision and problems of the product organization
- Create and maintain an optimal data pipeline architecture
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
Requirements:
- Deep knowledge of the concepts and architecture of data warehouses
- Deep understanding of at least one relational database (Oracle, Postgres, Redshift, etc.)
- Hands-on experience with column-based databases (Redshift, BigQuery)
- Experience with orchestration tools (Airflow preferred)
- Hands-on experience with Python
- Fluent written and spoken English (B1+)
- Flexibility and willingness to work in an agile environment with a focus on results
Would be a plus:
- Experience building data marts/DWH for the B2B domain
- Experience with Redshift
- Experience with Airflow
What we offer:
- An honest, open culture that emphasizes feedback and promotes professional and personal development
- An opportunity to work from anywhere — our team is distributed worldwide, from Minsk to Manila, from Florida to California
- An annual personal budget for educational classes, conferences, etc. — anything to further your professional knowledge
- A competitive salary
- And much more!