Data Architect / Data Operations

Bengaluru | 8–9 years | 20–25 Lacs

Job Details

Company Overview:

The Company, headquartered in Neutraubling, Germany, plans, develops, and manufactures machines and complete lines for process technology, bottling, and packaging, as well as intralogistics and recycling. Every day, millions of bottles, cans, and containers are "processed" on the company's lines – in the alcoholic and non-alcoholic beverage industries, the dairy and liquid food industry, and the chemical, pharmaceutical, and home & personal care industries. It is quite likely that the bottle of water, cola, or juice in your hand was produced on one of the company's lines! The company's Digital Solutions India entity was created in 2023 as the Technology Competence Centre for the organization, focusing on developing software solutions for the internal organization as well as for the customers of the organization globally.


Who are we looking for?

We have a fantastic opportunity for a data enthusiast to join the company's India team as a Data Architect/Data Operations specialist. The Data Architect is responsible for controlling, structuring, and modelling data and for sharing analyses across the organization, outlining how those data assets drive business outcomes. He/she owns the data models, understands the effects of different data analysis scenarios on the global IT architecture (including cost optimization), and looks after the entry points and structure of the data. Data Operations is responsible for operating the data platform – including security and governance and the integration and clarification of new data sources – and works in close collaboration with Data Engineers and Data Analysts.


Roles and responsibilities:

  1.  Design, develop, and implement data pipeline solutions and infrastructure for scalability.
  2.  Identify and analyze business requirements from various functional/production areas and customer demands to meet the needs of the business.
  3.  Ensure operational efficiency in internal processes and standards for data quality.
  4.  Design and implement data models, data flows, data mesh, data pipelines, DDL scripts, and ETL/ELT processes using state-of-the-art technologies in a cloud environment (e.g., Databricks and other components).
  5.  Provide expertise in the technology stack – including but not limited to Databricks, AWS S3, AWS Glue, AWS Redshift, PySpark, Python, ETL tools, and SQL – and assist with troubleshooting and data modelling efforts.
  6.  Create and optimize complex SQL queries, design tables/schemas, and evaluate trade-offs between raw, aggregated, and enriched data.
  7.  Continuously enhance and harmonize data flows and the architecture of data lake and warehouse structures.
  8.  Design and build robust, high-performance, and adaptable solutions optimal for modern platforms (Big Data/Relational/NoSQL/OLAP), with the ability to integrate technical functionality.
  9.  Design pipelines to ingest and integrate structured/unstructured data from heterogeneous sources.
  10.  Guarantee strict adherence to governance rules.
  11.  Manage Unity Catalog and grant access.
  12.  Track costs and data consumption.
  13.  Monitor and manage data pipelines and compute clusters.


What is in it for you?

  1.  You are part of a brand-new organizational setup – with a clean slate and a mission to build a people-first organization.
  2.  You work beyond borders with international teams, seeking and imparting learning through shared and individual experiences and knowledge through the community.
  3.  You would be responsible for building the data roadmap of our organization's growth journey, helping to define the process strategy, organizational culture, and best practices while collaborating with onshore and offshore teams.


What are we looking for? (Experience/Qualifications/Skillsets/Must-haves)

  1.  Education – Bachelor's degree in Computer Science/Engineering or a comparable qualification, with 9+ years of relevant experience.
  2.  Technical skills – the ability to design and build robust, high-performance, and adaptable solutions optimal for modern platforms (Big Data/Relational/NoSQL/OLAP) and to integrate technical functionality.
  3.  Knowledge of designing pipelines to ingest and integrate structured/unstructured data from heterogeneous sources.
  4.  Collaboration with delivery leadership to deliver projects on time while adhering to quality standards.
