Olo is looking for a Data Engineer to help analyze, define and implement our enterprise data platform and the processes that build it. You will join the Data Infrastructure team as one of the first key hires, and will work closely with the Data Architect and other stakeholders to build, test and operationalize many of the software components in this space.
What You’ll Be Doing
- Build and maintain a data catalog and dictionary, sourced from our current and future data assets.
- Work with the Data Architect to implement new components within the data platform according to design specifications.
- Evaluate technology and help the Data Architect measure its effectiveness and fit.
- Write and maintain technical documentation.
- Work closely with product teams on initiatives such as A/B testing, web page analytics, user event tracking, and similar focus areas.
- Work with our security and privacy teams to ensure we have a strong data security and privacy posture. This includes regulatory compliance (CCPA, GDPR, SOC), data security at rest and in transit, defense in depth, zero trust, and more.
- Enforce our data retention policies, minimizing liability and cost while maximizing the effective data lifecycle.
- Champion the practice of data democratization, enabling our internal teams to access the right data and get the answers they need.
- Reimagine the way we source, process, contextualize and model our data.
What We’ll Expect From You
- Demonstrated success in the design, development and evolution of modern data pipelines, business intelligence, advanced analytics and reporting applications
- Fluency in data modeling and warehousing
- Experience in data governance (MDM, data cataloging, compliance)
- Experience working with large data sets (hundreds of terabytes) and volumes (millions or billions of transactions per day)
- Experience across the full data pipeline, from extraction to grooming, modeling, loading and dashboarding/BI
- Strong understanding of data structures, encodings and storage formats and the tradeoffs between the various options
- Hands-on experience with modern data platforms (examples: Hadoop/HDFS, Redshift, Snowflake, Hive, Kafka, Spark, HBase)
- Experience with Postgres and SQL Server
- Strong grasp of zero-trust patterns and related security design principles
- Experience with DevOps processes including CI/CD and Infrastructure as Code principles
- A passion for staying abreast of industry trends and bleeding-edge developments in the data engineering space
- Strong focus on data quality, durability and resource/cost optimization
- Excellent communication skills
- Team-centered mindset and the ability to work with remote, distributed teams
- Willingness to roll up your sleeves, work hard and be scrappy!
Nice to Have
- A passion for AI and ML
- Experience with Looker and the LookML language
- Experience with Python, R or other data processing/mining languages
- Prior exposure to A/B testing, web page and user event analytics