Data Engineer

EyeCare Partners (ECP) is dedicated to being the provider of choice for vision care patients. With a network of more than 550 full-scope medical optometry and ophthalmology practices across 18 states, we are the largest vertically integrated medical vision services provider in the US and continue to grow. Founded in 2015 and headquartered in St. Louis, Missouri, ECP offers patients end-to-end services covering medical optometry, ophthalmology and sub-specialties, and vision correction products. Our service-oriented team provides an integrated network of services to cover the entire lifecycle of a patient's eye care needs, allowing our doctors and their teams to do what they do best – care for our patients. For more information visit eyecare-partners.com.

ECP is backed by Partners Group, one of the largest private markets investment managers in the world.

Data Engineer – Information Technology

Position Summary:  We are seeking an experienced professional to serve as the Data Engineer on our Data & Analytics team, which is deploying modern data platforms and analytics tools. The Data Engineer will be responsible and accountable for expanding and optimizing our data and data pipeline architecture, as well as optimizing data collection. This role will support our software engineers, data architects, data analysts and data scientists on various enterprise data initiatives and will ensure SLA-based data delivery. The ideal candidate will be excited by the opportunity to design and build our company’s data architecture to support our next generation of data and analytics solutions.

Essential Responsibilities:

  • Design, build and maintain data pipelines from various source systems into Snowflake
  • Analyze data elements from various systems, along with data flows, dependencies and relationships, and assist in designing conceptual, logical and physical data models
  • Design, build and maintain complex data sets designed to meet various business needs in the areas of reporting, advanced analytics and ad-hoc analysis
  • Coordinate the build and maintenance of data pipelines by third party service providers
  • Enable and execute data migrations across systems (e.g. SQL Server to Snowflake or other cloud data platforms)
  • Develop and implement scripts for data hub maintenance, monitoring and performance tuning
  • Work with data and business analysts to deploy and support a robust data quality platform
  • Work with data and business analysts to deploy and support a robust data cataloging strategy
  • Work with various business and technical stakeholders and assist with data-related technical needs and issues
  • Work with data and analytics teams and drive greater value from our data and analytics investments
  • Work closely with cross-functional teams to understand and transform business requirements into scalable and manageable solutions
  • Present solutions and options to leadership, project teams and other stakeholders adapting style to both technical and non-technical audiences
  • Ensure teams adhere to documented design and development patterns and standards
  • Proactively monitor and resolve ongoing production issues
  • Work closely with various technical teams to ensure consistency, quality of solutions and knowledge sharing across the enterprise
  • Educate organization on available and emerging tool sets
  • Ensure adherence to the approach of self-service data solutions and enable other teams with analytics solutions delivery via ‘Data as a Service’ model

Requirements:

  • Bachelor's degree in Computer Science, Information Systems or equivalent, plus 3 years of related experience
  • 3+ years of hands-on experience in the design, development and implementation of data solutions
  • Advanced SQL knowledge with strong query-writing and stored-procedure skills
  • Experience with Snowflake development and support
  • Experience with object-oriented and functional scripting languages: Python, Java, Scala, etc.
  • Experience with AWS cloud services: EC2, EMR, RDS, DMS
  • Experience with relational databases such as SQL Server and object-relational databases such as PostgreSQL
  • Experience with stream-processing systems: Storm, Spark Streaming, etc.
  • Experience with data analysis, ETL, and workflow automation
  • Experience working with multiple ETL/ELT tools and cloud-based data hubs
  • Demonstrated problem-solving skills
  • Demonstrated ability to think and work with a proactive mindset
  • Self-motivated, with a passion for working in a fast-paced environment