Solution Architect – Data Engineering

Responsibilities

  • Design big data solutions covering key aspects of data management: ingestion, storage, transformation, pipeline management, metadata management, and access
  • Engage in client conversations to develop a strategy for migrating on-premises data warehouses to a cloud-based modern data platform (BigQuery, Redshift, ADLS, HDInsight, S3/Hive, Snowflake)
  • Support pre-sales activities such as PoCs, RFPs, and client presentations
  • Assess existing environments and identify and resolve performance issues
  • Lead the development of reusable assets during a project and package them as accelerators for internal and customer use
  • Coach and mentor consultants in data engineering
  • Deliver data flow design documentation

Education and Skills

  • Bachelor’s degree or higher in an Engineering, IT, Math, or Science-related field
  • 10+ years of experience, including a minimum of 5 years in professional services (customer-facing) roles and 5+ years of solution architecture experience in the Hadoop/Spark/big data ecosystem
  • Experience with Databricks is a strong plus
  • Hands-on experience with one or more of Spark SQL, Scala, Python, HiveQL, etc.
  • Hands-on experience with Airflow, NiFi, or another data orchestration tool
  • A good understanding of data engineering project delivery
  • Experience with APIs such as REST/JSON or XML/SOAP, and with the Java ecosystem, including debugging and monitoring tools
  • Strong verbal and written communication skills are a must
  • The ability to work effectively across internal and external organizations in different time zones
  • Certifications: Google Professional Data Engineer, AWS Big Data Specialty, Cloudera, Hortonworks (Spark), Oracle, and Teradata

Job Features

Job Category: Data Engineer
