Come grow with us.

Data Engineer (II)

FloatMe

Data Science
Remote
Posted on Tuesday, February 6, 2024

About the Role

Come shape the future of FloatMe! As our Data Engineer (II), you will help define the experiences we build to guide the 150M Americans living paycheck to paycheck on their path to financial resilience. In this role, you’ll continue to improve the foundation for a trusted and performant data platform that enables the entire company to make rapid data-driven decisions and ship scalable data-driven features.

What You’ll Do

  • Maintain and improve our data pipelines and warehouse to ensure our internal stakeholders have timely access to accurate data. Our current stack includes Python, SQL, dbt, ThoughtSpot, Snowflake, and more.

  • Own our integrations with multiple data providers, such as Twilio, Iterable, and the App Store, via their APIs.

  • Write, configure, deploy, and maintain the tools needed to deliver accurate, clean data rapidly.

  • Partner with Product Analytics to understand our current use cases and, in tandem, create, maintain, and debug data models and reports.

  • Build and maintain consistent data models that act as the source of truth for all metrics within the company.

  • Build and share a deep understanding of our chosen languages, frameworks, and other areas of focus. Continuously push yourself and the team to learn, grow, and implement best practices.

  • Demonstrate ownership of your work, including understanding requirements, crafting solutions, and building robust testing plans to ensure the highest data quality.

Who You Are

  • Thoughtful. You guide teams using data and insights and consistently deliver the best solutions.

  • Driven by ownership. You are maniacal about improving and maintaining our data pipelines to ensure we are able to rapidly pull insights for all stakeholders.

  • Product’s best friend. Your insights ensure teams both understand and bring value to their users.

Requirements

  • 5+ years of data engineering work experience

  • Hands-on experience with Amazon Kinesis Data Streams

  • Extensive experience with dbt jobs and optimization

  • Experience managing multiple data sets via APIs

  • Experience using Terraform

  • Experience writing in Python

  • Must have ETL experience with AWS AppFlow, DynamoDB, Postgres, and Snowflake

  • Exceptional analytical, organizational, interpersonal, and communication (both oral and written) skills

  • Self-motivated, driven, resourceful, and able to get things done

Nice to Have

  • Previous BI reporting experience

Benefits

  • Health insurance

  • Dental & Vision

  • Long-term disability

  • 401(k) with standard & Roth options

  • Team outings (lunches, happy hours, games, and more)

  • Opportunities for growth and professional development

  • Unlimited PTO

  • Sick Leave & Personal Days