We are a young and growing company with operations in Medellín and Bogotá, focused on creating technological solutions in synergy with our customers and our team, so that these solutions add value to their organizations and business processes.
At Softgic we work with the coolest, with those who build, with those who love what they do, with those who bring 100% attitude, because that's our #Cooltura. Join our purpose of making life easier with technology and be part of our team as a
Data Engineer.
Compensation:
USD 20 - 28/hour.
Location:
Remote (anywhere).
Mission of Softgic:
At Softgic S.A.S. we work for the digital and cognitive transformation of our clients. Aware that quality is an essential factor for us, we incorporate the following principles into our policy:
Deliver quality products and services.
Achieve the satisfaction of our internal and external clients.
Encourage our team to value training as a path to professional and personal growth through development plans.
Comply with the applicable legal and regulatory requirements.
Promote continuous improvement of the quality management system.
What makes you a strong candidate:
You are proficient in Azure Data Lake, Azure SQL, ELT (extract, load, transform), and Python.
English - Native or fully fluent.
Spanish - Native or fully fluent.
Responsibilities and more:
Design, develop, and maintain scalable data architectures using SQL Server, Azure SQL Database, and Snowflake on Azure.
Implement and manage data pipelines using Azure Data Factory, supporting ETL and ELT processes.
Work with SQL Change Data Capture (CDC) along with Debezium to enable real-time and incremental data processing.
Work with streaming technologies such as Kafka and Azure Event Hubs to deliver near-real-time analytics and reporting.
Manage Azure Data Lake to store and process structured and unstructured data efficiently.
Design and optimize Data Vault and Star Schema models for data warehousing solutions.
Develop and maintain ETL/ELT workflows using Python and SQL-based tools.
Leverage Databricks for big data processing, machine learning, and advanced analytics.
Ensure data quality, governance, and security across multiple data environments.
Build and maintain analytical reports using Sigma.
Collaborate with business stakeholders and data analysts to ensure data solutions align with business needs.
Monitor and troubleshoot data pipelines to ensure reliability, accuracy, and efficiency.
Support disaster recovery planning and high-availability data strategies.
Stay up to date with emerging data engineering technologies and best practices.
Requirements
Abilities:
5-7 years of experience as a data architect or senior-level data engineer.
Expertise in SQL Server (SSMS, T-SQL, SSIS, SSRS, SSAS) and Azure SQL Database.
Strong experience in data modeling, including Data Vault and Star Schema methodologies.
Proficiency in ETL/ELT development and data pipeline management.
Hands-on experience with Snowflake on Azure and Databricks for big data processing.
Experience working with streaming technologies (e.g., Kafka, Flink, Event Hubs).
Strong analytical and problem-solving skills with a focus on data integrity and scalability.
Knowledge of Python for data transformation, automation, and analytics is a bonus.
Requirements:
Ability to sit or stand for extended periods of time as required.
Ability to work in a fast-paced, deadline-driven environment with minimal supervision.
Benefits
We're certified as a Great Place to Work.
Opportunities for advancement and growth.
Paid time off.
Formal education and certifications support.
Benefits with partner companies.
Referral program.
Flexible working hours.
Required profile
Experience
Level of experience: Senior (5-10 years)
Spoken language(s):
English
Spanish