
Data Scientist

Remote: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Statistics, Mathematics, Engineering, or a related field
  • 3+ years of experience in data science, machine learning, or advanced analytics roles
  • Proficiency in Python, strong SQL knowledge, and familiarity with big data technologies like Apache Spark and Kafka
  • Upper-Intermediate level of English

Key responsibilities:

  • Develop and validate predictive/statistical models using Python and deep learning frameworks.
  • Collaborate with Data Engineers to integrate models with data pipelines for real-time and batch processing.
  • Track model experiments and manage the model lifecycle using MLOps tools.
  • Perform exploratory data analysis and partner with business stakeholders to translate findings into actionable insights.

Intellectsoft, Computer Software / SaaS SME, https://www.intellectsoft.net/
51 - 200 Employees

Job description

Intellectsoft is a software development company delivering innovative solutions since 2007. We operate across North America, Latin America, the Nordic region, the UK, and Europe. We specialize in industries like Fintech, Healthcare, EdTech, Construction, Hospitality, and more, partnering with startups, mid-sized businesses, and Fortune 500 companies to drive growth and scalability. Our clients include Jaguar Motors, Universal Pictures, Harley-Davidson, Qualcomm, and the London Stock Exchange. Together, our team delivers solutions that make a difference. Learn more at www.intellectsoft.net.

You'll be part of a dynamic team developing a cutting-edge Analytical Platform for one of the largest resorts and casino companies in Southeast Asia. This mission-critical application is designed to empower client stakeholders with deep insights into customer behavior, enabling data-driven decision-making and strategic business development.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Statistics, Mathematics, Engineering, or a related field
  • ML/Analytics Experience: 3+ years in data science, machine learning, or advanced analytics roles, building models and delivering data-driven insights
  • Programming Skills: Proficiency in Python (NumPy, pandas, scikit-learn), strong SQL knowledge, and familiarity with PySpark or other distributed computing frameworks
  • Big Data & Streaming: Experience with Apache Spark, Kafka, or similar technologies for large-scale data processing and real-time analytics
  • MLOps & Model Deployment: Familiarity with MLflow, containerization (Docker), and orchestrators (Airflow, Kubernetes) for automated model lifecycle management
  • Data Warehousing / Lakehouse: Understanding of data lake architectures (Delta Lake, Iceberg, Hudi), partitioning, ACID transactions, and schema evolution
  • Analytical Mindset: Strong understanding of statistical methods, hypothesis testing, data visualization, and feature engineering best practices
  • Upper-Intermediate level of English

Nice to have skills:

  • Cloud Services: AWS (EMR, S3, Glue, SageMaker), Azure, or GCP for analytics and ML at scale
  • Deep Learning: Hands-on experience with PyTorch, TensorFlow, or advanced NLP frameworks (Hugging Face)
  • Data Catalog & Governance: Experience with DataHub, Apache Atlas, Collibra, or similar metadata management solutions
  • Real-time Analytics: Exposure to Apache Flink, Spark Structured Streaming, or AWS Kinesis for event-driven pipelines
  • Domain Knowledge: Prior exposure to gaming, loyalty/rewards analytics is a plus

Responsibilities:

Model Development & Experimentation

  • Design, prototype, and validate predictive/statistical models using Python (NumPy, pandas, scikit-learn), PySpark, MLlib, or deep learning frameworks (PyTorch/TensorFlow)
  • Continuously iterate on feature engineering strategies with data from streaming (Kafka) and batch (NiFi + Airflow) pipelines
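To give a sense of the prototyping loop described above, here is a minimal sketch using the NumPy/scikit-learn stack the role lists. The data and feature semantics are entirely synthetic stand-ins, not anything from the actual platform.

```python
# Minimal model-prototyping sketch (synthetic data; assumes NumPy and
# scikit-learn are available, as the role requires).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 500
# Hypothetical behavioral features, e.g. session length, spend, visit frequency.
X = rng.normal(size=(n, 3))
# Synthetic binary target loosely tied to the first two features, with noise.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

# Scale features, fit a gradient-boosted classifier, and validate with 5-fold CV.
model = make_pipeline(StandardScaler(), GradientBoostingClassifier(random_state=0))
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"mean CV AUC: {scores.mean():.3f}")
```

In practice the same loop runs against features drawn from the streaming and batch pipelines rather than a synthetic array.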

Data Pipeline Integration

  • Collaborate with Data Engineers to ensure models consume and output data seamlessly into/from Delta Lake / Iceberg (in S3 or HDFS) and can handle both real-time and batch feeds
  • Contribute to the design of scalable, fault-tolerant data flows that handle high-volume, low-latency data from operational gaming systems

MLOps & Model Lifecycle Management

  • Track model experiments using MLflow (or similar) for reproducibility, versioning, and performance monitoring
  • Work with Airflow (or similar orchestrator) to schedule training, re-training, and automated model evaluation
  • Deploy and monitor models via Models as a Service solutions (e.g., FastAPI, TorchServe, SageMaker) ensuring robust logging, alerting, and rollback procedures
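The experiment-tracking duties above center on recording parameters and metrics per run so results stay reproducible and comparable. As a toy illustration of those semantics, here is a standard-library-only stand-in; a real project would use MLflow's API (`mlflow.start_run`, `mlflow.log_metric`), and all names below are hypothetical.

```python
# Toy stand-in for experiment tracking: each run records its parameters
# and metrics so the best configuration can be recovered later.
from dataclasses import dataclass, field


@dataclass
class Run:
    run_id: str
    params: dict
    metrics: dict = field(default_factory=dict)


class Tracker:
    """Keeps every run, so experiments remain reproducible and comparable."""

    def __init__(self):
        self.runs = []

    def start_run(self, params):
        run = Run(run_id=f"run-{len(self.runs) + 1}", params=params)
        self.runs.append(run)
        return run

    def log_metric(self, run, name, value):
        run.metrics[name] = value

    def best_run(self, metric):
        return max(self.runs, key=lambda r: r.metrics.get(metric, float("-inf")))


tracker = Tracker()
for lr in (0.01, 0.1):
    run = tracker.start_run({"learning_rate": lr})
    # Stand-in for training; imagine the validation AUC coming from the model.
    tracker.log_metric(run, "val_auc", 0.90 if lr == 0.1 else 0.85)

print(tracker.best_run("val_auc").params)  # -> {'learning_rate': 0.1}
```

MLflow adds what this toy omits: persistent storage, artifact logging, a model registry, and UI-based comparison across runs.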

Advanced Analytics & Domain Insight

  • Perform exploratory data analysis and statistical tests to uncover insights for dynamic pricing, customer segmentation, fraud detection, or game optimization
  • Partner with business stakeholders (marketing, operations, finance) to translate analytical findings into tangible action items or product features
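The statistical-testing side of the EDA work above can be sketched with a simple two-sample permutation test, here comparing mean spend between two hypothetical customer segments. The data is synthetic and the segment labels are invented for illustration; only the standard library is used.

```python
# Two-sample permutation test on mean spend (synthetic data).
import random

random.seed(7)
segment_a = [random.gauss(100, 20) for _ in range(200)]  # e.g. one customer segment
segment_b = [random.gauss(115, 20) for _ in range(200)]  # e.g. another segment


def mean(xs):
    return sum(xs) / len(xs)


observed = abs(mean(segment_a) - mean(segment_b))

# Under the null hypothesis the labels are exchangeable: shuffle the pooled
# sample and count how often a difference at least as large arises by chance.
pooled = segment_a + segment_b
n_a = len(segment_a)
n_perm = 2000
count = 0
for _ in range(n_perm):
    random.shuffle(pooled)
    diff = abs(mean(pooled[:n_a]) - mean(pooled[n_a:]))
    if diff >= observed:
        count += 1

p_value = count / n_perm
print(f"observed diff: {observed:.2f}, p-value: {p_value:.4f}")
```

A small p-value here would justify treating the segments differently (e.g. in pricing or segmentation), which is the kind of finding translated into action items with stakeholders.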

Data Governance & Quality

  • Adhere to data lineage and cataloging practices using tools like DataHub, Apache Atlas, or AWS Glue to ensure compliance, traceability, and data quality
  • Collaborate with governance teams to define and enforce security, privacy, and compliance measures (e.g., handling PII, region-specific regulatory requirements)

Continuous Improvement & Innovation

  • Stay up-to-date on emerging ML trends (e.g., reinforcement learning, large language models, real-time analytics)
  • Evaluate open-source and cloud-native ML solutions (e.g., Kubeflow, SageMaker) to optimize cost, performance, and operational overhead

Benefits

  • 35 paid absence days per year for work-life balance
  • Up to 15 unused absence days can be added to your income after 12 months of cooperation
  • Health insurance
  • Depreciation coverage when using a personal laptop for project needs
  • Udemy courses of your choice
  • English courses with a native speaker
  • Regular soft-skills training
  • Excellence Centers meetups
  • Online/offline team-building events

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Industry: Computer Software / SaaS
Spoken language(s): English

Other Skills

  • Analytical Thinking
  • Collaboration
  • Communication
