About Loom:
Loom is on a mission to empower everyone at work to communicate more effectively, wherever they are. We are already trusted by over 4M users across 90k+ companies. Our customers are global and use Loom at work at world-class companies including HubSpot, Square, Uber, GrubHub, and LinkedIn. Founded in 2016, Loom has raised $73 million from top-tier investors including Sequoia Capital, Kleiner Perkins, the Slack Fund, and the founders of Instagram, Figma, and Front.

The Role:
You will collaborate with the data team, the engineering team, and stakeholders across the company to build the foundation of our analytics infrastructure. You will take ownership of the transformation layer, from raw data to the models that drive all critical analysis and dashboards across the organization. We are looking for a SQL wizard trained in the arts of performant, well-documented query writing; someone who puts data quality first and who can translate business problems into models that drive actionable insight.
What You'll Do:
- Help build foundational analytical components that power insightful automated dashboards and data visualizations to track key business metrics.
- Partner with product managers, engineers, marketers, designers, and business operations to translate business insights into infrastructure that drives business results.
- Drive table design, transformation logic and efficient query development to support the growing needs of the data analytics organization.
- Automate analyses and data pipelines while building scalable data infrastructure.
- Develop testing and monitoring across the transformation layer to ensure data quality from raw sources through all downstream models.
- Build out documentation that supports code maintainability and, ultimately, a Data Dictionary that makes data accessible to the whole company.
What We're Looking For:
- 2-4 years of experience in a data science or analytics role.
- Proficiency in SQL and database table design: able to write structured, efficient queries on large data sets.
- Strong communication skills to work with stakeholders to translate business needs and ideas into tractable work items.
- Experience with data pipeline and transformation tools such as Airflow or dbt.
- Experience working on the command line and with git workflows.
- Proficiency in R and/or Python is a nice-to-have.