5+ years of experience in backend engineering and data infrastructure management.
Proficiency in Python, Google Cloud Platform (GCP), and PostgreSQL.
Experience with CI/CD processes, Docker, and Kubernetes.
Strong communication skills and the ability to work remotely.
Key responsibilities:
Implement and manage ETL processes using Python and optimize workflows within GCP.
Develop APIs using FastAPI for data access and integration.
Manage and query large datasets with Cloud SQL and PostgreSQL.
Integrate new data sources and implement caching solutions using Redis or Memcached.
Workforce Professional Recruitment ~ Worldwide (11–50 employees)
About Workforce Professional Recruitment ~ Worldwide
Workforce Human Resources is a boutique consulting firm that aims to deliver a qualified, efficient service and provides company-specific solutions by accurately analyzing each company's needs.
We operate with the awareness that the most sensitive point in the recruitment process is effective communication, and we manage the process accurately by thoroughly analyzing our business partners.
The key to our performance is that we correctly analyze the expectations of both our business partners and our candidates.
Guided by the principles of good communication, trust, and continuity, we match our business partners with candidates who will contribute to them, and we aim to give those candidates the job opportunity they have dreamed of.
Time is the biggest cost. Accurate staffing to save your company time.
We're looking for an experienced Backend Engineer to build, manage, and maintain a robust data infrastructure for collecting vast amounts of financial data.
What you’ll do:
Implement and manage ETL processes using Python (Pandas); an illustrative sketch follows this list.
Optimize workflows within GCP (Composer, Cloud Run, …).
Work with Cloud SQL and PostgreSQL to manage and query large datasets.
Integrate new data sources and map them to existing datasets.
Develop APIs using FastAPI to enable data access and integration; see the FastAPI sketch after this list.
Implement caching solutions using Redis or Memcached; see the caching sketch after this list.
Manage version control and CI/CD pipelines with GitLab.
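To give a flavor of the ETL work above, here is a minimal sketch using pandas and SQLAlchemy. The CSV source, column names ("ticker", "date"), and the "prices" table are hypothetical placeholders, not details from this role.

```python
# Minimal ETL sketch (illustrative only): extract a CSV, apply light
# transformations, and load into Cloud SQL / PostgreSQL.
# File, column, and table names are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine


def run_etl(source_csv: str, dsn: str) -> int:
    # Extract: read the raw file into a DataFrame.
    df = pd.read_csv(source_csv, parse_dates=["date"])

    # Transform: normalize column names and drop duplicate rows.
    df.columns = [c.strip().lower() for c in df.columns]
    df = df.drop_duplicates(subset=["ticker", "date"])

    # Load: append into PostgreSQL via SQLAlchemy.
    engine = create_engine(dsn)  # e.g. "postgresql+psycopg2://user:pass@host/db"
    df.to_sql("prices", engine, if_exists="append", index=False)
    return len(df)
```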
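For the FastAPI responsibility, a minimal read-only endpoint sketch follows. The route, fields, and in-memory stand-in for the database are assumptions, since the actual API surface is not described in this post.

```python
# Minimal FastAPI sketch (illustrative only): one read endpoint backed by an
# in-memory stand-in for a Cloud SQL / PostgreSQL query; names are hypothetical.
from fastapi import FastAPI, HTTPException

app = FastAPI(title="Data access API (sketch)")

# Stand-in for a database query result.
_PRICES = {"AAPL": {"ticker": "AAPL", "close": 190.12}}


@app.get("/prices/{ticker}")
def get_price(ticker: str) -> dict:
    row = _PRICES.get(ticker.upper())
    if row is None:
        raise HTTPException(status_code=404, detail="ticker not found")
    return row
```

Assuming the file is saved as main.py, it can be served with `uvicorn main:app` and queried with `GET /prices/AAPL`.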
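For the caching responsibility, a small cache-aside sketch with redis-py is shown below; the key format, TTL, and loader callback are assumptions for illustration only.

```python
# Cache-aside sketch with redis-py (illustrative only): check Redis first,
# fall back to a loader (e.g. a database query), then cache the result.
import json

import redis

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)


def get_with_cache(key: str, loader, ttl_seconds: int = 300):
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)  # cache hit
    value = loader()  # cache miss: run the expensive query
    cache.set(key, json.dumps(value), ex=ttl_seconds)
    return value
```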
Requirements
5+ years of experience in the following tech stack:
Python
Google Cloud Platform (GCP)
CI/CD
PostgreSQL
Apache Airflow
Docker
Kubernetes
ETL
GitLab
Excellent Communication Skills
Valid reference contacts
Benefits
Equity stake: Participate in the success of our platform.
Remote first: Work from anywhere, anytime.
Interesting work: Collaborate in developing cutting-edge financial technology that processes millions of data points, leverages data science and natural language processing (NLP) to extract valuable insights, and keeps an eye on financial markets 24/7.
Required profile
Spoken language(s): English