Data Engineer

At Tresta


About Tresta

Since 1990, Tresta has been providing communication services to businesses throughout North America. Today, our service offerings help thousands of entrepreneurs and SMBs run their businesses from anywhere on any device, with app-based solutions for business calling, texting and call management. To learn more about our services, visit


About the Position

Tresta is looking for an experienced Data Engineer to join our technology team. In this position, you will be responsible for managing all of our reporting data, from its collection and aggregation to the design and build-out of the data structures used in a variety of ways throughout the organization. Our data warehouse provides information to stakeholders in almost every department through a variety of formats and tools (including Tableau, scheduled reports, and custom dashboards) and is relied upon by executives to inform their decisions, so accuracy and efficiency are crucial to this role. Our Data Engineer will also need to become a subject matter expert on how our services work so that the appropriate data can be provided to team members when requested.

This is an important position on a small team, so we're looking for the kind of person who will take strong ownership of their role and use their experience to deliver great results in the following areas:

  1. Understanding and implementing data warehousing principles, tools, and technologies
  2. Organizing, collecting, and standardizing data that helps generate insights and addresses reporting needs
  3. Using expertise, judgment, and precedent to translate business requirements and decompose data flows/relationships into concrete data pipelines
  4. Maintaining and improving existing ETL pipelines and reporting models



In more specific terms, you'll be expected to:

  • Develop scalable ETL pipelines in Airflow using SQL, Python, Bash, and other technologies
  • Work with various APIs to integrate third-party apps with internal data models
  • Design dimensional models that adhere to the Kimball methodology
  • Maintain data sources for Tableau and other reporting solutions
  • Develop and help implement cloud-first DevOps tooling (Docker, Git, Kubernetes, continuous integration and deployment) as it relates to the data infrastructure
  • Create and maintain documentation for each step in the data lifecycle
  • Understand and communicate data lineage to foster trust and optimize reporting



What You'll Need

  • Ability to independently develop robust solutions for complex problems
  • Strong attention to detail
  • A love of learning and personal growth
  • Strong understanding of data warehousing and data modeling principles and best practices
  • Strong proficiency in writing SQL and optimizing queries
  • Some experience with Python is preferred
  • Experience with Tableau is a big plus



What We Offer

  • Competitive salary + bonus structure
  • Profit sharing program
  • 401(k) + match
  • Great health, dental and vision plans
  • Life, disability and supplemental insurance plans
  • Generous PTO allowance
  • Freedom to work remotely