Lead Data Engineer
Description & Requirements
WHAT MAKES US A GREAT PLACE TO WORK
We are proud to be consistently recognized as one of the world’s best places to work, a champion of diversity and a model of social responsibility. We are currently the #1-ranked consulting firm on Glassdoor’s Best Places to Work list and have maintained a spot in the top four on Glassdoor’s list for the last 13 years. We believe that diversity, inclusion and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.
WHO YOU’LL WORK WITH
As a member of Bain's Advanced Analytics Group (AAG) you’ll join a talented team of diverse and inclusive analytic and engineering professionals who are dedicated to solving complex challenges for our clients. We work closely with our generalist consultants and clients to develop data-driven strategies and innovative solutions. Our collaborative and supportive work environment fosters creativity and continuous learning, enabling us to consistently deliver exceptional results.
WHERE YOU’LL FIT WITHIN THE TEAM
As a Lead Data Engineer, you will leverage your experience to implement and refine technical solutions for a wide range of industries. Working with a diverse team, you will engage in the entire engineering life cycle, focusing on designing, developing, optimizing, and deploying sophisticated data engineering solutions and infrastructure at production scale suitable for the world’s largest companies.
WHAT YOU’LL DO:
- Develop data and software solutions to address large-scale enterprise challenges for Bain's clients, serving as the data engineering expert within a cross-functional team
- Develop and maintain long-lasting products that support internal or client needs
- Collaborate closely with and influence general consulting teams to identify analytics solutions for client business problems and to execute those solutions
- Collaborate with data engineering leaders to develop and advocate for modern data engineering concepts to both technical audiences and business stakeholders
- Enable data and technology for data science, analytics, and other application use cases via data engineering
- Perform transformations at scale, including cleaning, enriching, de-duplicating, joining, and correlating structured, semi-structured, or unstructured data
- Define and implement new and innovative deployment techniques, tooling, and infrastructure automation within Bain, covering the full software development life cycle: designing, writing documentation and unit/integration tests, and conducting code reviews for data engineering solutions
- Participate in infrastructure engineering for the data ecosystem, including development, testing, deployment, and release
- Provide technical guidance to external clients and internal stakeholders in Bain
- Contribute to industry-leading innovations that translate into great impact for clients in case work
- Stay current with emerging trends and technologies in cloud computing, data analysis, and software engineering, and proactively identify opportunities to enhance the capabilities of the analytics platform
- Travel is required (30%)
ABOUT YOU:
- Proven experience in end-to-end data and software engineering within product engineering and/or professional services organizations, including project setup, test cases, dependency management, and build management
- Master’s degree in Computer Science, Engineering, or a related technical field
- Minimum of 5 years of experience
- 3+ years at the Senior or Staff level, or equivalent
Technical Skills and Knowledge:
- Working knowledge (3+ years) of Python, Scala, C/C++, Java, C#, Go, or a similar programming language
- Experience deploying serverless data pipelines through containerization and Terraform orchestration
- Experience in data ingestion using one or more modern ETL compute and orchestration frameworks (Airflow, Beam, Luigi, Spark, NiFi, or similar)
- Experience (3+ years) with SQL or NoSQL databases: PostgreSQL, SQL Server, Oracle, MySQL, Redis, MongoDB, Elasticsearch, Hive, HBase, Teradata, Cassandra, Amazon Redshift, Snowflake
- Experience with cloud platforms and services (AWS, Azure, GCP, etc.) or Kubernetes via Terraform automation, with a deep understanding of failover, high availability, and high scalability
- Experience with DevOps, CI/CD, GitHub Actions, version control, and Git workflows
- Experience scaling and optimizing schemas and performance-tuning SQL and ETL pipelines in data lake and data warehouse environments
- Strong computer science fundamentals in data structures, algorithms, automated testing, object-oriented programming, performance complexity, and the implications of computer architecture on software performance
- Experience working according to agile principles
Interpersonal Skills:
- Strong interpersonal and communication skills, including the ability to explain and discuss technicalities of solutions, algorithms and techniques with colleagues and clients from other disciplines
- Curiosity, proactivity and critical thinking
- Ability to collaborate with people at all levels and with multi-office/region teams