Principal Data Engineer

Dun & Bradstreet
Full-time
Austin, Texas
Posted 3 months ago

Job Description

This role supports both the platform and data teams, developing and maintaining front-end and back-end code for a B2B audience-building platform. It involves developing, maintaining, and analyzing datasets from diverse sources to create insights for clients, power the platform, and deliver new ways of understanding markets.

Responsibilities

  • Design and implement business requirements by collaborating with stakeholders
  • Advise project leadership on technical subjects and provide input on the feasibility of product requests
  • Assist the leadership team in identifying engineering talent
  • Keep stakeholders apprised of project progress by regularly providing engineering updates
  • Take ownership of the application code and develop a complete understanding of how the application functions
  • Master the development tools and services in use
  • Develop a thorough understanding of how the application functions from a systems perspective
  • Collaborate with cross-functional teams to identify and design requirements for advanced systems
  • Architect robust systems and write highly fault-tolerant software
  • Create new insights for customers to understand their markets
  • Design and document systems that can be cleanly and easily maintained
  • Maintain data quality by writing validation tests
  • Understand a variety of unique data sources
  • Create and maintain data documentation, including processing systems and flow diagrams
  • Help maintain existing systems, including troubleshooting and resolving alerts
  • Share ideas across teams to spread awareness and use of frameworks and tooling
  • Show an ownership mindset in everything you do
  • Maintain a continuous growth mindset, learning through social experiences and relationships

Requirements

  • 10+ years of successful commercial experience across the software engineering life cycle
  • Experience designing and implementing scalable architecture with real-time capabilities
  • Experience moving large volumes of data across services and architectures
  • Experience with Google Cloud Platform services or AWS equivalent technologies
  • Extensive experience with SQL and relational databases, including optimization and design
  • Experience with analytic tools and ETL/ELT/data pipeline frameworks such as Airflow
  • Ability to write testable and efficient Python code for data processing and analysis
  • Expertise in developing secure and performant applications
  • Experience with OS-level scripting
  • Experience in AdTech, web cookies, and online advertising technologies is a plus
  • Expertise in containerized infrastructure and CI/CD systems
  • Experience with version control and Agile Project Management tools
  • Experience with object-oriented and functional programming
  • Familiarity with parallelization of applications
  • Experience with data visualization tools
  • Experience working with global remote teams
  • Extremely data-driven and detail-oriented
  • Knowledge of data transformation processes
  • Google Cloud certification a plus
  • Proficiency in Microsoft Office Suite

Benefits

  • No benefits listed