Our company is a results-driven global consulting firm that specializes in helping businesses successfully address their most complex and critical challenges.
What you’ll do
We work in smaller and more senior teams that bring deep industry and functional knowledge to our clients.
In this role, you will have the chance to create ELT/ETL workflows, scripts, visualizations, and statistical models while taking responsibility for the design, build, test, execution, and support of data-related activities including, but not limited to, integration, profiling, cleansing, and visualization. The ideal candidate will have a detailed understanding of the underlying data and data structures of multiple systems, enabling in-depth analysis of existing and potential insights.
- Collect data from a wide variety of enterprise sources, including Redshift, MSSQL, and Synapse databases, Excel workbooks, and plain or formatted text files.
- Implement automated data ingestion and transformation processes.
- Enrich client data through external and public sources.
- Build effective, reliable, and robust processes that govern the data ingestion flow.
- Design database models, efficient table structures, and advanced schemas that enforce data quality and consistency standards.
- Apply modern modeling approaches, business intelligence patterns, and data management techniques.
- Apply rock-solid SQL skills to work with extensive volumes of data (billions of rows) at various aggregation levels.
- Review and analyze legacy code/scripts to understand data processing logic and business rules.
- Apply statistical learning toolsets to build predictive models that enrich, expand, and allow a deeper understanding of data analyses and solutions.
- Use interactive data visualization tools, such as Tableau and Power BI, to present results in a compelling manner to internal or client teams.
- Tell a convincing story to C-level executives using visual charts and dashboards.
- Present complicated technical findings to a non-technical audience.
- Collaborate with clients and stakeholders, with the ability to build relationships, influence, and provide advice to key decision-makers.
- Prepare client-ready documents and compelling presentation materials.
What you’ll need
- Bachelor’s degree with a concentration in Computer Science, Engineering, or another quantitative field.
- More than three years of experience in roles with similar characteristics.
- Comfort with big data architectures, both on cloud platforms such as AWS and Azure and on-premises.
- Creative, innovative problem solving grounded in simple ideas.
- Motivated to discover and learn new analytical techniques and software tools to improve the quality of our work.
- Strong verbal and written communication skills in English and Spanish. Proficiency in other languages is a plus.
- Authorization to work in Argentina and to travel internationally without restrictions or the need for visa sponsorship.
- Ability and willingness to work long hours and travel to meet client demands, if necessary.
- Desire to actively engage in geographically dispersed teams.
- Ability to work full time in an office and remote environment; physically able to sit/stand at a computer and work in front of a computer screen for significant portions of the workday.
What you’ll get
- Salary paid on the last business day of the month
- OSDE 310 health coverage for you and your immediate family at no additional cost
- Three weeks of vacation (15 business days)
- Corporate phone line + iPhone
- In the office: snacks and drinks, Nespresso coffee
- Remote-work equipment: ergonomic chair, additional monitor
- Eligibility for a discretionary annual performance bonus of approximately 20% to 30% of annual compensation