
Associate Solutions Architect

Benefits: extra holidays, extra parental leave
Remote: Full Remote
Experience: Expert & Leadership (>10 years)

Offer summary

Qualifications:

10+ years of development experience; understanding of data engineering principles; proficiency in Python and SQL; Bachelor's degree in Computer Science or a related field.

Key responsibilities:

  • Collaborate with stakeholders for requirements analysis
  • Design architectures for scalable data models
  • Develop and implement data processing logic
  • Evaluate performance of data engineering models

CFRA Research, Financial Services SME, http://www.cfraresearch.com
51 - 200 Employees

Job description

Department: Product

Location: CFRA India

Description

The Associate Solutions Architect will be responsible for providing technical leadership, architecture, governance, and oversight of CFRA's next generation of data science and data processing software, built on a modern cloud-native technology stack with Python on AWS cloud infrastructure. This is a rare opportunity to make a big impact on both the team and the organization by helping design and develop the application frameworks that will serve as the foundation for all future development at CFRA.

The ideal candidate has a passion for solving business problems with technology and can effectively communicate business and technical needs to stakeholders. We are looking for candidates who value collaboration with colleagues and want to have an immediate, tangible impact at a leading global independent financial insights and information company.


Key Responsibilities


  • Requirement Analysis: Collaborate with stakeholders to understand business requirements and data sources and define the architecture and design of data engineering models to meet these requirements.

  • Architecture Design: Design scalable, reliable, and efficient data engineering models, including algorithms, data pipelines, and data processing systems, to support business requirements and quantitative analysis.

  • Technology Selection: Evaluate candidate technologies, frameworks, and tools through proofs of concept (POCs) and recommend those best suited for building and managing data engineering models, considering factors such as performance, scalability, and cost-effectiveness.

  • Data Processing: Develop and implement data processing logic, including data cleansing, transformation, and aggregation, using technologies such as AWS Glue, AWS Batch, and AWS Lambda.

  • Quantitative Analysis: Collaborate with data scientists and analysts to develop algorithms and models for quantitative analysis, using techniques such as regression analysis, clustering, and predictive modeling.

  • Model Evaluation: Evaluate the performance of data engineering models using metrics and validation techniques, and iterate on models to improve their accuracy and effectiveness.

  • Data Visualization: Create visualizations of data and model outputs to communicate insights and findings to stakeholders.

  • Monitoring and Logging: Implement monitoring and logging solutions for data engineering models using tools like AWS CloudWatch to ensure model health and performance.

  • Security and Compliance: Ensure data engineering model architecture complies with security best practices and regulatory requirements, implementing encryption, access controls, and data masking as needed.

  • Documentation: Create and maintain documentation for data engineering model architecture, design, and implementation, including diagrams, data flow descriptions, and operational procedures.

  • Collaboration: Collaborate with cross-functional teams, including data engineers, data scientists, and business analysts, to understand their requirements and integrate data engineering models into their workflows.

  • Continuous Improvement: Stay current with the latest trends, tools, and technologies in data engineering and quantitative analysis, and continuously improve data engineering model processes and methodologies.




Desired Skills and Experience


  • 10+ years of development experience on enterprise applications

  • Data Engineering: Understanding of data engineering principles and practices, including data ingestion, processing, transformation, and storage, using tools and technologies such as AWS Glue, AWS Batch, and AWS Lambda.

  • Quantitative Analysis: Proficiency in quantitative analysis techniques, including statistical modeling, machine learning, and data mining, with experience in implementing algorithms for regression analysis, clustering, classification, and predictive modeling.

  • Programming Languages: Proficiency in programming languages commonly used in data engineering and quantitative analysis, such as Python, R, Java, or Scala, as well as experience with SQL for data querying and manipulation.

  • Big Data Technologies: Familiarity with big data technologies and platforms, such as Hadoop, Apache Kafka, Apache Hive, or AWS EMR, for processing and analyzing large volumes of data.

  • Cloud Computing: Proficiency in using AWS cloud services for data pipeline architecture and implementation.

  • Data Integration: Understanding of data integration techniques and tools for integrating data from various sources, including batch and real-time data integration, and experience with ETL (Extract, Transform, Load) processes.

  • Database Systems: Knowledge of database systems, including relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra), and experience in designing and managing database schemas.

  • Problem-solving Skills: Excellent problem-solving skills, with the ability to analyze complex data engineering and quantitative analysis problems, identify solutions, and implement them effectively.

  • Communication and Collaboration: Strong communication and collaboration skills, with the ability to effectively communicate technical concepts to non-technical stakeholders and work effectively in a team environment.

  • Learning Agility: A commitment to continuous learning and staying updated with the latest trends, tools, and technologies in data engineering, quantitative analysis, and machine learning.

  • Bachelor's Degree: A bachelor's degree in Computer Science, Software Engineering, or a related field is preferred, although equivalent experience and certifications are also valuable. Financial domain knowledge is a plus.




Benefits


  • 21 days of Annual Vacation

  • 8 sick days

  • 6 casual days

  • 1 paid Volunteer Day

  • Medical, Accidental & Term Life Insurance

  • Telehealth, OPD

  • Competitive pay

  • Annual Performance Bonus

Required profile

Experience

Level of experience: Expert & Leadership (>10 years)
Industry: Financial Services
Spoken language(s): English

Other Skills

  • Problem Solving
  • Learning Agility
  • Collaboration
  • Communication
