At PadSplit, we’re disrupting the affordable housing industry by creating safe, attractive, and respectable co-living environments.
Welcome to PadSplit, the nation's largest co-living marketplace and the leading provider of innovative housing solutions. We're here to tackle the affordable housing crisis, one room at a time. 💪
At PadSplit, we're dedicated to empowering individuals and investors by bridging the gap between supply and demand in the rental market. Our unique co-living model revolutionizes how people live, fostering a strong sense of community and enabling financial freedom.
As a PadSplit Member, you gain access to a supportive community of like-minded individuals, along with fully furnished rooms and amenities within shared living spaces. It's a vibrant and affordable living experience like no other!
For PadSplit Hosts, we provide a reliable stream of residents who have been meticulously vetted to meet our strict safety and community engagement standards. By joining PadSplit, you not only unlock a new revenue stream but also make a positive impact on your community.
Join us at PadSplit and be part of the solution to the housing shortage. Together, we can create better living opportunities for all. Contact us today to learn more!
PadSplit is hiring a Data Engineer to build and maintain scalable data infrastructure that drives analytics, reporting, and decision-making across the organization. This role is critical to optimizing data pipelines, ensuring data reliability, and enabling cross-functional teams to unlock valuable insights in a remote, high-growth environment.
The Person We Are Looking For:
PadSplit is looking for a highly skilled Data Engineer with expertise in building and maintaining scalable data infrastructure using tools like PostgreSQL, AWS, Snowflake, and dbt. The ideal candidate is a collaborative problem-solver who is eager to optimize data pipelines, enhance query performance, and drive reliable, data-informed decision-making across the organization.
Here’s What You’ll Do Day-to-Day:
Design, build, and optimize scalable ETL/ELT pipelines to facilitate seamless data ingestion and transformation processes.
Develop and maintain data models to enable self-service analytics and reporting across the organization.
Optimize database performance in PostgreSQL, ensuring efficient data storage, retrieval, and query execution.
Implement and enhance search capabilities using NoSQL technologies like ElasticSearch or Solr to improve data discovery.
Collaborate with data analysts to create insightful dashboards that support data-driven decision-making.
Ensure data quality, governance, and security by adhering to best practices in cloud-based data environments.
Monitor and troubleshoot issues within data pipelines, focusing on optimizing efficiency and reliability.
Work closely with software engineers and product teams to integrate data solutions into operational workflows and product development.
Here’s What You’ll Need to Be Successful:
5+ years of experience in data engineering or a similar role, with a proven track record of designing scalable data solutions.
Expertise in PostgreSQL, including database management, query optimization, and performance tuning.
Hands-on experience with AWS cloud services such as S3, Lambda, Glue, Redshift, and IAM.
Proficiency in data warehousing technologies like Snowflake, Redshift, or BigQuery for cloud-based data storage and analysis.
Strong skills in data transformation, modeling, and building efficient ETL/ELT pipelines.
Experience with data visualization tools like Mode, Looker, Tableau, or Hex to support analytics and reporting.
Knowledge of ElasticSearch or Solr for implementing search indexing and query capabilities.
Proficiency in SQL and Python, with experience in automation, scripting, and workflow orchestration (e.g., Airflow).
Understanding of CI/CD pipelines, infrastructure-as-code principles, and cloud-based deployment practices.
Strong analytical and problem-solving abilities, with a passion for leveraging data-driven insights to inform decisions.
Nice-to-Have: Experience with streaming data solutions like Kafka or Kinesis, knowledge of machine learning pipelines, and familiarity with data privacy regulations such as GDPR or CCPA.
The Interview Process:
Your application will be reviewed for possible next steps by the Hiring Manager.
If you meet eligibility requirements, the next step would be a video screen with a member of the PeopleOps team for about thirty (30) minutes.
If warranted, the next step would be a video interview with our CTO for forty-five (45) minutes.
If warranted, the next step would be a video panel interview with key stakeholders at PadSplit for one and a half (1.5) hours.
The panel interview includes a technical assessment in which you will showcase your engineering skills to the panel for discussion.
If warranted, then we move to offer!
Compensation, Benefits, and Perks:
Fully remote position - we swear!
Competitive compensation package including an equity incentive plan
National medical, dental, and vision healthcare plans
Company provided life insurance policy
Optional accident insurance, FSA, and DCFSA benefits
Unlimited paid time off (PTO) policy with eleven (11) company-observed holidays
401(k) plan
Twelve (12) weeks of paid time off for both birth and non-birth parents
The opportunity to do what you love at a company that is at the forefront of solving the affordable housing crisis
Required profile
Experience
Level of experience: Senior (5-10 years)
Industry: Real Estate Management & Development
Spoken language(s): English