ID 1684 – Data Engineer

Remote

We are building independent, strictly technical teams, where professional growth isn't about dealing with spreadsheets but about designing and developing the next generation of applications using cutting-edge technologies.
We don't want to keep you away from the code. We would like you to be part of a new community of senior developers working on agile, high-impact projects for our leading customers in the US and UK, so you can grow while continuing to work on what you like most.

What you will get the chance to do:

  • Build data ingestion, propagation, and transformation code in DBT, with reusability and data modeling in mind;
  • Validate (and sometimes propose) data models to support analysis;
  • Practice disciplined git branching with regular commits and pushes;
  • Keep documents and JIRA tasks updated with comments and caveats;
  • Write JIRA tickets for improvement opportunities (or for places where you took a shortcut and need to clean up a TODO in a future sprint);
  • Write legible code, using descriptive variable names and comments so that any of your peers can pick up your code without significant knowledge transfer;
  • Write data quality tests, develop code that will pass those tests, and test the code yourself;
  • Perform code reviews and be comfortable with your code being reviewed by others;
  • Adhere to the design standards that exist, and apply new design standards as they are produced;
  • Perpetually look for areas of improvement with a focus on data quality, integrity and governance;
  • Perform root cause analyses on bugs and on long-running queries and stages in the data pipelines;
  • Perform data mapping activities to describe source data, target data, and the high-level or detailed transformations that need to occur.

What will help you succeed:

  • Collaborative with a can-do attitude;
  • Solid understanding of Snowflake and all of its unique features;
  • Solid understanding of DBT;
  • Solid understanding of ADF (Azure Data Factory);
  • Solid understanding of data quality testing;
  • Solid understanding of analytics platform patterns;
  • Experience with various data modeling patterns (e.g., ODS, Kimball, Inmon, 3NF, OLTP vs. OLAP, SCDs);
  • Advanced SQL skills;
  • Able to work independently within the scope of the role, with a good understanding of when to escalate;
  • Ready and willing to share what you’ve produced;
  • Takes pride in the work done, is willing to accept technical criticism of it, and is willing to explain and defend the choices made (e.g., by describing the scenarios considered);
  • Eager to learn new technologies and new patterns;
  • Problem-solving aptitude.

Education:

BSc/BA in Computer Science, Engineering, or a relevant field would be an asset.

Experience:

  • 2 years of experience with the Snowflake cloud platform;
  • 2 years of experience working in agile sprints;
  • Experience with Looker would be an asset.

Benefits:

  • Automatic salary adjustment.
  • English classes with native teachers.
  • Home office.
  • Within your team, you will be able to propose the benefits you consider valuable, and they will be evaluated for implementation.
  • Family group health insurance: Osde or Swiss Medical.

Apply
