
Senior Data Engineer, DP Team (Remote, International/Non-U.S.)

PulsePoint
United States
Remote
Full-Time

Summary

Responsibilities

  • Design, build, and maintain reliable and scalable enterprise-level distributed transactional data processing systems.
  • Optimize jobs to utilize resources efficiently.
  • Monitor and provide transparency into data quality across systems.
  • Increase accessibility and effectiveness of data through collaboration with analysts and data scientists.
  • Provide mentorship and guidance to junior team members.

Requirements

  • 5+ years of data engineering experience.
  • Fluency in Python and SQL.
  • Experience in Scala/Java is a plus.
  • Proficiency in Linux.
  • Strong understanding of RDBMS and query optimization.
  • Knowledge of distributed production systems (e.g., Hadoop); cloud migration experience (AWS/GCP/Azure) is a plus.

Work-Life Balance Benefits

  • Flexible working hours as long as you can work until 12pm/1pm EST.
  • Fully remote work options.


Full Details of Job Post

A bit about us:
PulsePoint is a leading healthcare ad technology company that uses real-world data in real time to optimize campaign performance and revolutionize health decision-making. Leveraging proprietary datasets and methodology, PulsePoint targets healthcare professionals and patients with an unprecedented level of accuracy—delivering unparalleled results to the clients we serve. The company is now a part of Internet Brands, a KKR portfolio company and owner of WebMD Health Corp.
Sr. Data Engineer
The PulsePoint Data Engineering team plays a key role in a technology company that's experiencing exponential growth. Our data pipeline processes over 80 billion impressions a day (>20 TB of data, 200 TB uncompressed). This data is used to generate reports, update budgets, and drive our optimization engines. We do all this while running against tight SLAs, providing stats and reports as close to real time as possible.
The most exciting part about working at PulsePoint is the enormous potential for personal and professional growth. We are always seeking new and better tools to help us meet challenges such as adopting proven open-source technologies to make our data infrastructure more nimble, scalable and robust. Some of the cutting-edge technologies we have recently implemented are Kafka, Spark Streaming, Presto, Airflow, and Kubernetes.
What you'll be doing:
  • Design, build, and maintain reliable and scalable enterprise-level distributed transactional data processing systems to scale the existing business and support new business initiatives
  • Optimize jobs to utilize Kafka, Hadoop, Presto, Spark, and Kubernetes resources in the most efficient way
  • Monitor and provide transparency into data quality across systems (accuracy, consistency, completeness, etc.)
  • Increase accessibility and effectiveness of data (work with analysts, data scientists, and developers to build/deploy tools and datasets that fit their use cases)
  • Collaborate within a small team with diverse technology backgrounds
  • Provide mentorship and guidance to junior team members
Team Responsibilities:
  • Ingest, validate, and process internal and third-party data
  • Create, maintain, and monitor data flows in Python, Spark, Hive, SQL, and Presto for consistency, accuracy, and lag time
  • Maintain and enhance the framework for jobs (primarily aggregate jobs in Spark and Hive)
  • Create different consumers for data in Kafka using Spark Streaming for near-real-time aggregation (a minimal sketch follows this list)
  • Tools evaluation
  • Backups, retention, high availability, and capacity planning
  • Review and approve database DDL, Hive framework jobs, and Spark Streaming jobs to make sure they meet our standards
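
For illustration, here is a minimal sketch of the kind of Spark Structured Streaming consumer described above: it reads impression events from Kafka and maintains near-real-time windowed counts. This is not PulsePoint's actual code; the broker address, topic name, event schema, and console sink are hypothetical placeholders (a production job would land aggregates in Hive or another Kafka topic).

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

# Hypothetical event schema; the real impression payload is not described in the post.
schema = StructType([
    StructField("campaign_id", StringType()),
    StructField("event_time", TimestampType()),
])

spark = SparkSession.builder.appName("impression-aggregator").getOrCreate()

# Read the raw Kafka stream (placeholder broker and topic names).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "kafka:9092")
    .option("subscribe", "impressions")
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# 5-minute tumbling-window counts per campaign, tolerating 10 minutes of event lag.
counts = (
    events.withWatermark("event_time", "10 minutes")
    .groupBy(F.window("event_time", "5 minutes"), "campaign_id")
    .count()
)

# Console sink is a stand-in; a real job would write aggregates to Hive or Kafka.
query = counts.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
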
Technologies We Use:
  • Python - primary repo language
  • Airflow/Luigi - job scheduling (a minimal DAG sketch follows this list)
  • Docker - packaged container images with all dependencies
  • Graphite - monitoring of data flows
  • Hive - SQL data warehouse layer for data in HDFS
  • Kafka - distributed commit log storage
  • Kubernetes - distributed cluster resource manager
  • Presto/Trino - fast parallel data warehouse and data federation layer
  • Spark Streaming - near-real-time aggregation
  • SQL Server - reliable OLTP RDBMS
  • Apache Iceberg - open table format
  • GCP - BigQuery for performance, Looker for dashboards
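
To show how the scheduling layer fits together, below is a minimal Airflow DAG sketch (assuming Airflow 2.4+) that launches a daily Spark aggregate job. The DAG id, script path, and retry settings are illustrative assumptions, not the team's actual configuration.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily aggregation DAG; names and paths are placeholders.
with DAG(
    dag_id="daily_impression_aggregates",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ parameter; older 2.x versions use schedule_interval
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    spark_aggregate = BashOperator(
        task_id="spark_aggregate",
        # spark-submit of an assumed aggregation script, parameterized by the run date
        bash_command="spark-submit /opt/jobs/aggregate_impressions.py {{ ds }}",
    )
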

Requirements

  • 5+ years of data engineering experience
  • Fluency in Python and SQL
  • Experience in Scala/Java is a plus (Polyglot programmer preferred!)
  • Proficiency in Linux
  • Strong understanding of RDBMS and query optimization
  • Passion for engineering and computer science around data
  • East Coast U.S. hours 9am-6pm EST preferred, but we can be flexible as long as you can work until 12pm/1pm EST; you can work fully remotely
  • Knowledge of and exposure to distributed production systems (e.g., Hadoop)
  • Knowledge of and exposure to cloud migration (AWS/GCP/Azure) is a plus
Selection Process:
1) Recruiter Screen (30 mins)
2) Hiring Manager Interview (45 mins)
3) Take-home tech challenge
4) SQL & Python Interview (60 mins)
5) Team Interviews (60 mins + 3 x 45 mins) + SVP of Engineering (15 mins)
6) WebMD Sr. Director, DBA (30 mins)
WebMD and its affiliates are an Equal Opportunity/Affirmative Action employer and do not discriminate on the basis of race, ancestry, color, religion, sex, gender, age, marital status, sexual orientation, gender identity, national origin, medical condition, disability, veteran status, or any other basis protected by law.

Apply Now

👉 Please mention that you found this job on CalmJobs, thanks!

Similar Jobs

  • Fannie Mae - Washington, District of Columbia, United States - Full-time
  • Experian - United States - Full-time
  • McDonald's Corporation - Chicago, Illinois, United States - Full-time
  • McAfee - Madrid, Madrid, Spain - Full-time
  • Visa - Reading, Reading, United Kingdom - Full-time
  • Tether Operations Limited - Tokyo, Tōkyō, Japan - Remote, Full-time
  • Prime Therapeutics - Mountain Home, Idaho, United States - Full-time
  • Radian - Denver, Colorado, United States - Full-time