This position requires a Bachelor's degree in Computer Science or a related technical field, and 5+ years of relevant work experience.
- 5+ years of experience with ETL, data modeling, and data architecture.
- Expert-level skills in writing and optimizing SQL.
- Experience with big data technologies such as Hive and Spark.
- Proficiency in at least one scripting language, such as Python, Ruby, or Bash.
- Experience operating very large data warehouses or data lakes.
- Solid communication skills and the ability to work well in a team.
- A passion for technology. We are looking for someone keen to apply their existing skills while trying new approaches.