Sphere Partners is a technology and management consulting firm focused on digital, data, and engineering transformation for our customers. We are a 100% virtual organization with a presence across multiple continents.
We are strategically scaling our AI, analytics, and data practice with leaders and practitioners, and to that end we are hiring top talent from across the industry.
With our unwavering commitment to excellence, we are rapidly expanding our suite of consulting services, broadening our geographic presence, and honing our industry expertise. We stand at the threshold of an exciting new chapter, poised to establish ourselves as a renowned name in technology consulting. We are now looking for a Data Architect to join our team.
Location: Anywhere, Remotely
Start Date: ASAP
Responsibilities include but are not limited to:
- Work closely with the sales team on client engagements, understanding clients' data requirements and technical challenges;
- Work closely with the head of data to build the practice strategy and define offerings; where needed, build POCs, drive innovations, write whitepapers, etc.;
- Design end-to-end data architectures;
- Create detailed data architecture proposals (including diagrams, technical specifications, and estimated project timelines);
- Collaborate with internal data engineers, analysts, and other technical teams to refine and validate data architecture designs;
- Present data architecture solutions to clients, showcasing the technical benefits and business impact;
- Provide technical guidance during client discussions and workshops;
- Manage data partnerships along with sales;
- Participate in client advisory consulting engagements of 6-8 weeks, if needed;
- Participate in innovation initiatives.
Requirements:
- 5+ years of experience as a Data Architect;
- Experience in data strategy, data architecture, data journey map creation;
- Experience articulating and applying estimation frameworks across all areas of data engineering;
- Hands-on experience with data warehouses and data lakes;
- Experience in consulting and delivery areas for data engineering;
- Experience with programming languages like Python, Java, Scala or their equivalents, used for data processing (must);
- Experience with reporting tools such as Tableau, Qlik Sense, and others (must);
- Experience with Informatica, PySpark, and Databricks (must);
- Experience with databases (SQL and NoSQL) and data manipulation languages (must);
- Experience with Microsoft tools such as Azure Data Factory, Power BI, Synapse, SQL, SSIS, SSRS, and SSAS, or Amazon tools such as Redshift, Data Pipeline, Kinesis, Glue, S3, Athena, and EMR (must);
- Strong conceptual foundation of data modeling including star schemas (must);
- Experience with Snowflake;
- Experience with GCP BigQuery, Dataflow, Pub/Sub, Dataprep, and others.
It will be a plus if you have:
- Experience creating whitepapers and blogs (speaking experience at C-level events and webinars);
- Innovation exposure: experience building data innovations in data warehousing, ingestion, data quality, data migration, data modernization, and reporting is a huge plus;
- Domain exposure: exposure to multiple industries is preferred, e.g., building data architectures for banking, insurance, healthcare, life sciences, pharma, retail, and ecommerce.