Senior Manager / Deputy General Manager, Big Data Analytics

  • 12-20 yrs
  • Not Disclosed

Job Description

Job Opportunity for Big Data Analytics Delivery Manager with India's Leading Multinational Company.

Experience : 12+ years
Designation : SM/DGM
Education : BE/BTech/MSc/MCA/MTech

Job Description :

  • Must have experience of implementing a data lake on any cloud platform
  • Experience in using VMs, GCS, Dataflow, Dataproc, CloudSQL, BigTable, Cloud Composer, GKE, Docker
  • Spearhead the end-to-end partnership process with business units, from business case development and problem definition through analytics delivery and impact measurement
  • Proactively engage with business & operations to support & drive organizational priorities through Technology & Analytics services
  • Develop action plans to structure next steps and tasks, assign corresponding timelines, ensure timely execution of agreed actionables, and provide progress updates to senior management on a weekly/fortnightly basis
  • Create an exhaustive business case for new Technology & Analytics initiatives based on requirements, within the stipulated timelines
  • Communicate and negotiate effectively with internal customers to ensure successful execution with ownership
  • Envision the business KRA/KPI, analytics, and data science needs; requires ownership and active involvement in implementation and in driving some of the business KPIs, such as:
  • Solutioning of business needs and requirements; participate in and drive discussions with business for speedy, agile documentation of requirements and solutions that can be taken up by the delivery team
  • Ability to seamlessly integrate a team drawn from different internal functions (IT, Tech, Support) and external vendors
  • Identify & implement the right data warehouse, data mining, ETL, data engineering, advanced data analytics, and visualization tools, etc.
  • Prioritize tasks and manage projects

Skillset :

  • Experience in delivering large-scale data platforms involving structured & unstructured data on cloud (GCP preferred)
  • Experience in data engineering (streaming, real-time replication & batch)
  • Knowledge of architectural patterns of a data lake
  • Experience with DevOps tools for managing & tracking projects and maintaining repositories
  • Good understanding of GCP services; hands-on experience with GCP managed & serverless components (Dataflow, Dataproc, CloudSQL, BigQuery, Data Fusion, Cloud Composer)
  • Good knowledge of Python; excellent ability to write and tune SQL queries
  • Experience working with an open-source DB (PostgreSQL), a data warehouse (BigQuery), and RDBMS (Oracle, SQL Server)
  • Exposure to reporting and visualization tools (Power BI, Looker)
  • Exposure to big data workflows and analytics tools (Spark, AI Platform)
  • Knowledge of AWS services is an added advantage
  • Understanding of ETL tools (preferably Informatica)
  • Ability to work in an entrepreneurial environment and be a self-starter
  • Good team player
  • Strong bias towards action and results
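As a minimal illustration of the "writing and tuning SQL queries" skill above, the sketch below uses Python's built-in sqlite3 module purely as a stand-in (the posting names BigQuery, PostgreSQL, Oracle, and SQL Server; the table and index names here are invented for the example). It shows the classic tuning step of turning a full table scan into an index lookup:

```python
import sqlite3

# In-memory DB standing in for a real warehouse; schema is hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (user_id INTEGER, event_type TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i % 100, "purchase" if i % 3 == 0 else "view", i * 0.5) for i in range(3000)],
)

# Untuned query: the selective filter forces a full table scan.
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM events WHERE user_id = 42"
).fetchall()
print(plan)  # plan detail reports a SCAN of events

# Tuning step: an index turns the filter into an index search.
cur.execute("CREATE INDEX idx_events_user ON events (user_id)")
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM events WHERE user_id = 42"
).fetchall()
print(plan)  # plan detail now reports a SEARCH using idx_events_user

total = cur.execute(
    "SELECT SUM(amount) FROM events WHERE user_id = 42"
).fetchone()[0]
conn.close()
```

The same read-the-plan-then-index workflow applies on the engines the role actually uses, though each exposes it differently (e.g. `EXPLAIN` in PostgreSQL, the query execution details pane in BigQuery).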