This is a remote position.
At Softgic we work with the coolest: those who build, those who love what they do, those who bring 100% attitude, because that's our #Cooltura. Join our purpose of making life easier with technology and be part of our team as a
Data Engineer.
Compensation:
USD 20 - 25/hour.
Location:
Remote (for residents of Mexico, Guatemala, Colombia, Peru, Chile, Argentina, Paraguay, Brazil, Honduras, Jamaica, the Dominican Republic, Belize, Spain, the United States, Canada, Kenya, South Africa, India, and the Philippines).
Mission of Softgic:
At Softgic S.A.S. we work for the digital and cognitive transformation of our clients. Because quality is essential to us, we incorporate the following principles into our policy:
- Deliver quality products and services.
- Achieve the satisfaction of our internal and external clients.
- Encourage our team to pursue training for professional and personal growth through development plans.
- Comply with the applicable legal and regulatory requirements.
- Promote continuous improvement of the quality management system.
What makes you a strong candidate:
- You have 3+ years of experience developing data frameworks and pipelines.
- You are proficient in Scala.
- You have beginner-level proficiency in Apache Spark.
- English - Native or fully fluent.
Responsibilities and more:
This vacancy is 100% remote for residents of: Colombia, Guatemala, Mexico, Peru, Chile, Belize, United States, Canada, Spain, Dominican Republic, Jamaica, Honduras, Brazil, Paraguay, Argentina, South Africa, Kenya, India, Philippines.
We are seeking a Data Engineer to help transform our data infrastructure, migrating from relational databases to a modern big data architecture. You will play a key role in defining event-driven data feeds, improving automation, and enhancing observability, alerting, and performance.
We’ve built a strong data engineering team to date, but we have a lot of work ahead of us, including:
- Migrating from relational databases to a streaming and big data architecture, including a complete overhaul of our data feeds.
- Defining streaming event data feeds required for real-time analytics and reporting.
- Leveling up our platform, including enhancing our automation, test coverage, observability, alerting, and performance.
Responsibilities:
- Build our next generation data warehouse.
- Build our event stream platform.
- Translate user requirements for reporting and analysis into actionable deliverables.
- Enhance the automation, operation, and expansion of our real-time and batch data environments.
- Manage numerous projects in an ever-changing work environment.
- Extract, transform, and load complex data into the data warehouse using cutting-edge technologies.
- Build processes that ensure top-notch security, performance, reliability, and accuracy.
- Provide mentorship and collaborate with fellow team members.