Job type: Full-time
Hiring from: USA Only
- Build and maintain a data catalog and dictionary, sourced from our current and future data assets.
- Work with the Data Architect to implement new components within the data platform according to design specifications.
- Evaluate technology and help the Data Architect measure its effectiveness and fit.
- Write and maintain technical documentation.
- Work closely with product teams on initiatives such as A/B testing, web page analytics, user event tracking, and similar focus areas.
- Work with our security and privacy teams to ensure we have a strong data security and privacy posture. This includes regulatory compliance (CCPA, GDPR, SOC), security at rest and on the wire, defense in depth, zero trust, and more.
- Enforce our data retention policies, minimizing liability and cost while maximizing the effective data lifecycle.
- Champion the practice of data democratization, enabling our internal teams to access the right data and get the answers they need.
- Reimagine the way we source, process, contextualize and model our data.
- Demonstrated success in the design, development and evolution of modern data pipelines, business intelligence, advanced analytics and reporting applications
- Fluency in data modeling and warehousing
- Past experience in data governance (MDM, Data Cataloging, compliance)
- Experience working with large data sets (hundreds of terabytes) and high transaction volumes (millions to billions of transactions per day)
- Experience across the full data pipeline, from extraction to grooming, modeling, loading, and dashboarding/BI
- Strong understanding of data structures, encodings and storage formats and the tradeoffs between the various options
- Hands-on experience with modern data platforms (examples: Hadoop/HDFS, Redshift, Snowflake, Hive, Kafka, Spark, HBase)
- Experience with Postgres and SQL Server
- Strong grasp of zero-trust patterns and related security design principles
- Experience with DevOps processes including CI/CD and Infrastructure as Code principles
- A passion for staying abreast of industry trends and bleeding-edge tech developments in the data engineering space
- Strong focus on data quality, durability and resource/cost optimization
- Excellent communication skills
- Team-centered mindset and ability to work with remotely distributed teams
- Willingness to roll up your sleeves, work hard and be scrappy!
- A passion for AI and ML
- Experience with Looker and the LookML language
- Experience in Python, R or other batch data processing/mining languages
- Prior exposure to A/B testing, web page and user event analytics
- Our families come first. We know they make us who we are and they are who we live and work for every day.
- Olo is our extended family. We’re in this together, fighting for one another. We’re happy to be here. We will not let one another down.
- We learn from and fight through setbacks. We recognize and help one another with direct feedback.
- We care about you. We offer 20 days of paid time off, fully paid health, dental, and vision care premiums, stock options, and a generous parental leave plan.
- We value diversity. At Olo, we know a diverse and inclusive team not only makes our products better, but our workplace better. Many groups are consistently underrepresented across the tech sector and we are fully committed to doing our part to move the needle.
- Learn more about our culture, values, and mission: https://www.olo.com/images/culture.jpg