PRODUCT DATA ANALYST

Lantern
Full-time
Remote
Posted 5 months ago

Job Description

Lantern is seeking a Product Data Analyst to support product initiatives within its third-party administrator (TPA) and provider network solutions. This role requires a strong product mindset, solid technical skills, and expertise in building analytics dashboards, enabling data pipelines, and creating experimentation frameworks. The analyst will transform data sets into actionable insights, working closely with product managers and technical leads.

Responsibilities

  • Support roadmap initiatives across Product neighborhoods
  • Centralize fragmented data and integrate external data sources
  • Analyze data and deliver actionable insights
  • Build and monitor key pipelines and champion automation tooling
  • Analyze and drive insights from member and provider experience journeys
  • Champion and execute A/B testing on key initiatives
  • Build dashboards and reports
  • Document requirements and translate them into system specifications
  • Execute and coordinate requirements and change management processes
  • Design, prepare and execute unit tests
  • Participate in cross-functional teams
  • Develop innovative team solutions
  • Integrate technical expertise and business understanding
  • Consult with team members and other organizations on complex issues
  • Special projects as requested

Requirements

  • 5+ years in data analytics, preferably in product or healthcare environments
  • Expertise in BI tools (e.g., Tableau, Power BI), SQL, and data pipeline monitoring
  • Experience with healthcare standards and data formats (e.g., 837, 270/271) and provider data sources
  • Experience working with data lakes, data fabric architectures, and tools such as Databricks
  • Strong understanding of experimentation, funnel analysis, and engagement metrics
  • Excellent communication and collaboration skills
  • Bachelor’s degree in related field preferred
  • 5+ years of proven experience in data analysis using SQL, Python, R, and Excel
  • 4+ years of experience with Big Data technologies, including Databricks, PySpark, Python, Azure Storage, Microsoft Power BI, and Tableau
  • 4+ years of performing data engineering/ETL operations, preferably with Azure Data Factory
  • 4+ years of data warehouse experience, preferably with the Snowflake data platform
  • 4+ years of experience with SQL, including writing complex logic as part of ETL and using SQL effectively for complex data analysis and discovery
  • Strong organizational skills and attention to detail
  • Experience with CMD shell and PowerShell
  • Experience with large-scale, complex data environments
  • Self-motivated and able to meet deadlines
  • Intense desire to learn
  • Ability to express complex technical concepts effectively, both verbally and in writing
  • Ability to multi-task in a fast-paced, changing environment
  • Ability to maintain confidentiality

Benefits

  • No benefits