Everly Health is the leading digital health company at the forefront of the $300 billion virtual diagnostics-driven care industry. Our mission is to improve the lives of millions with a fully integrated digital care platform for consumers and businesses. We continue to innovate in the space by delivering more care to more people on a seamless diagnostics-driven platform.
Everlywell, the consumer-initiated at-home laboratory testing brand within Everly Health, has helped over 1 million people manage their health and wellness with easy at-home tests, physician-reviewed results and actionable intelligence delivered digitally in days. You know your body best, and we believe information about the way your body works should be fully visible and available to you however and whenever you need it.
At Everlywell our goal is simple: make at-home testing affordable and available to everyone. Everlywell has seen unprecedented growth over the last year, and as we continue to grow rapidly, it is imperative to design an effective and scalable data infrastructure to support our current and future business needs. In this role you will report directly to the Director of Data Architecture. If you have a passion for data and cloud technologies and a drive to design effective data pipelines, look no further. We are looking for a passionate and skilled data engineer to help us build out our data infrastructure and architecture. As a Data Engineer at Everlywell, you will ensure that the data platform infrastructure and architecture support the evolving requirements of the Data Engineering and Data Analytics teams as well as other parts of the business. You will work closely with Data Analytics to develop a strategy for our long-term data platform architecture, identify gaps in our data processes, and drive improvements while mentoring and coaching other team members.
What You'll Do:
Be an essential part of designing and building our new data architecture and platform
Explore and evaluate new technologies as appropriate and make recommendations where necessary
Develop, test, and maintain existing architecture
Stay on top of changing technologies and evolve our data pipeline accordingly
Identify gaps in current data processes and drive improvements
Recommend ways to improve the reliability, efficiency, and quality of the data platform, and optimize for performance, scalability, and cost
Work with ETL tools to provide the data needed for data-based decisions in a timely manner
Help build a reliable and scalable data pipeline
Collaborate with the Data Analytics team to build the correct datasets for consumption by various visualization tools
Work on data models for various aspects of our data pipeline

Who You Are:
Programming experience and a demonstrated interest in statistical analysis and business intelligence
Critical thinking and strong problem-solving abilities
5+ years of experience with SQL, data warehouse development, and ETL
Hands-on experience with at least one cloud-based data warehouse tool
Scripting skills in shell, Python, or Ruby
Experience with standard warehousing concepts such as data marts and dimensional modeling
Excellent communication skills, both verbal and written
Experience with at least one data modeling tool
Hands-on experience managing and performance-tuning PostgreSQL
Hands-on experience with orchestration tools like Airflow
Experience with ETL tools like Stitch, Pentaho, etc.
Experience with data warehouse schema design and architecture
Experience with big data solutions such as Snowflake or Redshift
Experience managing Amazon RDS is a definite plus
Experience with NoSQL databases