Rupeek: Business Analyst, ETL

  • 5-8 yrs
  • Not Disclosed

Job Description

Responsibilities:

  • Develop a new data warehouse (Postgres, Redshift) by creating a workflow to systemize manual user data daily
  • Build a simple UI for CRUD applications
  • Build scalable open-source ETL data pipelines using PySpark and SQL (Postgres and Redshift) to automate manual data reconciliation and derived tables
  • Schedule jobs to trigger emails with planned queries, and attend to ad hoc data queries from different teams
  • Build and maintain a large number of Tableau dashboards for daily and monthly MIS reporting
  • Troubleshoot downtime issues in the data warehouse and the scheduled ETL scripts
  • Integrate a customer's complete life-cycle data from multiple sources
  • Take over complete data maintenance of related projects in the database

Mandatory Skills:

  • 5+ years of experience in database management and analysis
  • Experience building data pipelines using big-data technologies such as Spark, Hadoop, etc.
  • Java/Flask for building a simple CRUD UI
  • Knowledge of data warehouse concepts, database design, etc., and experience working with Postgres and Redshift
  • Proficiency in advanced SQL, Python, PySpark, and any ETL tool

Good to have:

  • Apache Airflow, any email orchestration
  • Experience working with AWS services such as S3, Redshift, EMR, Lambda, SQS, etc.