Technical Architect
Job Description
As a Technical Architect, you will work closely with other members of the team to lead product development and testing, contribute to product improvement, and ensure technical documentation is created and maintained in accordance with regulatory requirements. The role also involves staying up to date with technology trends and maintaining expertise in agile product delivery processes and related technologies.
Responsibilities
- Develop modern data warehouse solutions using Databricks and the Azure stack.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to identify gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop data models that fulfil them. Help junior team members resolve issues and technical challenges.
- Drive technical discussions with client architects and team members. Define cloud-based, scalable product architectures in line with company standards and technology trends.
- Create achievable product implementation strategies that fit into the existing enterprise architecture and deliver business value continuously.
- Experience with Apache Kafka for streaming and event-based data.
- Experience with other open-source big data products from the Hadoop ecosystem (incl. Hive, Pig, Impala).
- Experience with open-source non-relational / NoSQL data stores (incl. MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git.
- Proficiency in RDBMSs, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
Qualifications
- Deep understanding of star- and snowflake-schema dimensional modelling.
- Strong knowledge of Data Management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL, Python, and Spark (PySpark).
- Experience with both batch and streaming ETL (desirable).
- Experience in building ETL / data warehouse transformation processes.
- In-depth hands-on implementation experience with Delta Lake and Delta tables, including managing Delta tables, Databricks cluster configuration, and cluster policies.
- Experience in handling structured and unstructured datasets.
- Strong proficiency in a programming language such as Python, Scala, or Java.
- Databricks Certified Data Engineer Associate/Professional Certification (Desirable).
- Experience working in an Agile methodology.
- Strong experience in Data Modelling.
- Bachelor’s and/or master’s degree in Computer Science or IT, or equivalent experience.
- 12+ years of overall experience architecting solutions on data platforms, including experience with Azure Data Factory.
- Ability to lead solution delivery independently while collaborating with global stakeholders, with 8+ years of experience in data warehouse / ETL development projects.