DBT Data Engineer

Position Summary: KleisTech is looking for experienced DBT Data Engineers who are passionate about data and eager to help tackle our clients’ analytics challenges. As a DBT Data Engineer, you will be responsible for building and maintaining data pipelines, developing DBT models, and ensuring the seamless integration of data for analytics purposes. This role requires a deep understanding of cloud technologies, data architecture, and ETL processes.

Key Responsibilities:

  • Documentation & Lineage: Generate comprehensive documentation around model descriptions, dependencies, SQL, sources, and tests. Create lineage graphs to provide transparency into data processes and business logic mapping.
  • External Data Integration: Work with external client data to build and optimize DBT solutions for real-time data collection and processing.
  • Pipeline Design & Development: Contribute to the design of data pipelines and develop DBT models that transform raw data into actionable insights.
  • Data Modeling: Create models, identify patterns, and ensure data pipeline architectures are robust, scalable, and secure.
  • Collaboration: Work closely with management to align data strategies with company objectives and support business needs.
  • Validation & Compliance: Develop new data validation methods and tools, ensuring compliance with data governance and security policies.
  • Data Transformation: Design and implement data models and transformations using DBT, tailored to meet specific business requirements.
  • Technical Expertise: Leverage your experience in ETL tools, cloud data warehouses (especially Snowflake), and cloud technologies (AWS, Azure, Google Cloud) to optimize data processes.

Essential Skills:

  • DBT & Snowflake Expertise: 2+ years of hands-on experience with DBT and Snowflake ETL processes, including model development, package maintenance, and documentation.
  • SQL Proficiency: Strong skills in SQL and database table design, capable of writing efficient queries for large datasets.
  • Cloud Technology Experience: Hands-on experience with AWS, Azure, or Google Cloud, including data architecture design and ETL process optimization.
  • Programming: Proficiency in Python, Spark, or Scala for data engineering tasks.
  • CI/CD & DevOps: Experience with CI/CD and DevOps practices, especially in the context of Snowflake and cloud-based data platforms.

Desirable Skills:

  • ETL Testing: Familiarity with ETL testing and data orchestration tools.
  • Pharmaceuticals Sector Experience: Previous experience working within the Pharmaceuticals sector is an advantage.
  • Certifications: Certification in DBT, Snowflake, or other related data engineering technologies is highly desirable.

Qualifications:

  • Education: Degree in Computer Science, Software Engineering, or a related field.

Qualities:

  • Problem-Solving: Demonstrates confidence, strong decision-making skills, and a logical approach to problem-solving.
  • Independence: Capable of independently tackling problems and following up with developers on related issues.
  • Team Collaboration: Able to work in a self-organized, cross-functional team and iterate based on feedback.
  • Client Interaction: Comfortable working seamlessly with clients across multiple geographies.
  • Analytical Skills: Strong analytical, presentation, reporting, documentation, and interpersonal skills.

Join KleisTech as a DBT Data Engineer. We’re hiring experienced professionals for remote roles focused on building and optimizing data pipelines using DBT and Snowflake. Apply now to advance your career in data engineering.

Job Category: IT
Job Type: Full Time
Job Location: Work From Home (WFH)
Experience: 7+ Years
Joining: Immediately or within 15-20 days

Apply for this position

Allowed Type(s): .pdf, .doc, .docx

Have Any Questions?

Phone: +91 82080 93527
Email: hr@kleistech.com
