
Data Engineer (Remote)

  • IT, Software development, System Engineering Jobs
  • Marketing and PR, Advertising and Creative Media Jobs

Description

At Wunderman Thompson, we exist to inspire growth for ambitious brands. Part creative agency, part consultancy, and part technology company, our experts provide end-to-end capabilities at a global scale to deliver inspiration across the entire brand and customer experience.

We are 20,000 strong in 90 markets around the world; our people bring together creative storytelling, diverse perspectives, inclusive thinking, and highly specialized vertical capabilities to drive growth for our clients. We offer deep expertise across the entire customer journey, including communications, commerce, consultancy, CRM, CX, data, production, and technology.

 

 

Who we are looking for:

Wunderman Thompson is seeking a smart, inquisitive, self-driven Data Engineer to join our growing data engineering function within Wunderman Thompson Data. You’ll be responsible for building scalable data science solutions that can be used across our data business.

As a Data Engineer you will be responsible for building a new enterprise-level Analytical Mart for use by all analytic functions across the group. We have a previously implemented solution built in Oracle and SAS; your job will be to bring this into the 21st century by building a cloud-based solution using Python and Snowflake. There is also scope for architectural input for applicants who have, or want to gain, experience in this area. We are hiring a small team of two people to deliver this build, after which you will continue to work in collaboration with the Data Science teams to deploy scalable data engineering pipelines developed in both SQL (Snowflake) and PySpark (AWS).
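To give a flavour of this kind of pipeline work, here is a minimal, hypothetical PySpark sketch; the bucket paths, table, and column names are purely illustrative and are not part of the actual brief:

    # Minimal PySpark sketch: aggregate raw transactions into an analytical feature table.
    # All paths and column names below are illustrative assumptions.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("analytical-mart-sketch").getOrCreate()

    # Read raw data from cloud storage (hypothetical S3 location).
    raw = spark.read.parquet("s3://example-bucket/raw/transactions/")

    # Simple feature engineering: one row per customer for analytical consumption.
    features = (
        raw.groupBy("customer_id")
           .agg(
               F.count("*").alias("txn_count"),
               F.sum("amount").alias("total_spend"),
               F.max("txn_date").alias("last_txn_date"),
           )
    )

    # Persist the feature table; in practice this could land in Snowflake via the
    # Spark-Snowflake connector instead of Parquet on S3.
    features.write.mode("overwrite").parquet("s3://example-bucket/marts/customer_features/")

    spark.stop()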

If you are genuinely excited about data, how to structure it, and how to make warehousing and data creation dynamic, this is the opportunity for you.

 

 

What you’ll do:

  • Build data engineering solutions | Build new enterprise-level products, including our analytical mart for use across Wunderman Thompson. This includes data wrangling and feature engineering, as well as formulating the features required for analytical consumption.
  • Create | Build and implement a flexible data mart solution in Python that enables automated build cycles and flexible runs to create data marts on the fly; a rough sketch of this idea follows this list.
  • Collaborate | You will be an active participant in the Wunderman Thompson data science community, where best practices are shared, innovations are hatched, and cross-vertical collaboration happens with the product, data science, and delivery teams.
  • Cutting-edge technology | Cloud-based engineering to manage and deploy data pipelines end to end, from problem formulation and raw data through to implementation and optimization.
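As a rough illustration of what a flexible, config-driven mart build might look like, here is a hypothetical sketch using the Snowflake Python connector; it is not the team's actual design, and the mart names, SQL, and credentials are placeholders:

    # Hypothetical config-driven mart builder using the Snowflake Python connector.
    # Mart names, SQL, and connection details are placeholders for illustration only.
    import snowflake.connector

    MART_DEFINITIONS = {
        # mart name -> SQL statement that (re)materialises it
        "customer_summary": """
            CREATE OR REPLACE TABLE analytics.customer_summary AS
            SELECT customer_id,
                   COUNT(*)    AS txn_count,
                   SUM(amount) AS total_spend
            FROM raw.transactions
            GROUP BY customer_id
        """,
    }

    def build_marts(conn, definitions):
        """Run each mart definition so marts can be rebuilt on demand."""
        with conn.cursor() as cur:
            for name, sql in definitions.items():
                print(f"Building mart: {name}")
                cur.execute(sql)

    if __name__ == "__main__":
        conn = snowflake.connector.connect(
            account="my_account",      # placeholder credentials
            user="my_user",
            password="my_password",
            warehouse="analytics_wh",
        )
        try:
            build_marts(conn, MART_DEFINITIONS)
        finally:
            conn.close()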

Responsibilities

Who you are:

  • Curious | With an inquisitive mindset, you embrace the unknown and see it as an opportunity to explore and innovate.
  • Ambitious | Willing to take calculated risks, stretch yourself and your team to do new things vs. plugging into existing solutions.
  • Passionate | You take great pride in your work. You approach our own and our clients’ business challenges with enthusiasm and a commitment to getting it right. You love working in health. You see data science as a way of expressing creativity.
  • Humble | You wear any hat that needs to be worn, and you know you do not know everything. You want to learn from others.

 

 

What you’ll need:

  • Deep data science and data engineering skill set with grounding in practical marketing applications oriented towards content, customer insights and customer experience in a digital marketing-heavy environment.
  • Minimum of 2 years of experience deploying enterprise-level data engineering solutions.
  • Strong experience with structured data warehouse design and management. Prior knowledge of Oracle, schema design, and Snowflake.
  • Production-level PL/SQL scripting and bash scripting.
  • Production-level experience with Python and Scala/Spark.
  • Proven track record building cloud-based infrastructure; AWS required.
  • Some experience with Kafka, Storm, Hadoop, Hive, Presto, MySQL, Redshift, Cassandra, DynamoDB, Elasticsearch.

Hard Skills

  • Coding and Programming (Python, C#, Java, PHP, etc.)
  • Customer relationship management (CRM)
  • Data Analytics
  • Data entry
  • Digital marketing campaigns

Soft Skills

  • Communication
  • Strategic thinker
  • Competitive
  • Enthusiastic
  • Attention to detail