Data Modeler – Data Engineer
7th April, 2022

Mandatory Requirements:

The Data Engineer is responsible for designing and implementing data models on various data platforms. As a Data Engineer – Data Modeling, your responsibilities include:

  • Develop, test, and support future-ready data solutions for customers across industry verticals
  • Develop, test, and support end-to-end batch and near real-time data flows/pipelines
  • Demonstrate understanding of data architectures, modern data platforms, big data, ML/AI, analytics cloud platforms, data governance and information management, and associated technologies
  • Design and implement data models according to business requirements for specific use cases and/or client’s business domains
  • Demonstrate understanding of data modeling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models, and Data Vault
  • Demonstrate understanding of Data Asset concept or Data Mesh Architecture and its Domain Data Products
  • Develop and demonstrate Proof of Concepts and Working Demos
  • Lead or collaborate with other internal/external consultants in consulting, workshops, and delivery engagements
  • Mentor junior IBM consultants in the practice and delivery engagements

To ensure success in the role, you will possess the following skills:

  • Minimum of 10 years of total work experience in solution design, development, testing, or support roles in IT Consulting or other technology business units across the industry
  • Minimum of 3 years of hands-on development, testing, and administration experience with Big Data/Data Lake services in the cloud and on-premises
  • Minimum of 3 years of experience designing and implementing data models in Big Data/Data Lake or Data Warehouse environments, using modeling approaches and techniques such as Dimensional Modelling on Star or Snowflake Schemas
  • Experience in implementing data model products (IFW/BDW/FSDM) from vendors such as IBM and Teradata
  • Hands-on experience in data model design tools such as Erwin and Sparx Systems Enterprise Architect
  • Experience in applying industry best practices, design patterns, and first-of-a-kind technical solutions as developer or administrator on data and non-data specific platforms and applications
  • Experience in implementing near real-time data flows/pipelines and distributed streaming applications
  • Experience in implementing traditional ETL, data warehousing, and BI solutions
  • Hands-on development experience and working skill level in Python, Scala, SQL, shell scripting, and other programming/scripting languages
  • Hands-on experience implementing services with demanding performance SLAs, high availability, fault tolerance, automatic failover, and geographical redundancy
  • Working knowledge and hands-on experience with data services (Azure Synapse, Cosmos DB, etc.) on cloud platforms such as Azure, AWS, GCP, and IBM Cloud, and other modern data platforms
  • Solid understanding of containerisation, virtualisation, infrastructure, and networking
  • Diverse experience in software development methodologies and project delivery frameworks such as Agile Sprints, Kanban, and Waterfall
  • Experience in presenting to and influencing stakeholders and senior managers
  • Team leadership and people manager experience
  • Degree in Computer Science, Information Technology, or a related Engineering discipline
  • One or more industry-recognised technology certifications or badges