Remote Data Engineering jobs. Data pipelines, ETL, data architecture and big data.
Checkr is expanding its innovation hub in Santiago to drive the accuracy and intelligence of its background-check engine at global scale. This team works closely with the US offices to optimize the screening engine, detect fraud, and evolve the platform with GenAI models. The selected candidate will join a strategic effort to balance speed, cost, and accuracy, impacting millions of candidates and improving the experience of customers and partners. The role involves leading optimization initiatives, designing analytics strategies, and developing predictive models on a high-performance technology stack.
This job is original from Get on Board.
Please attach an updated CV in English when applying.
At Checkr, we believe a hybrid work environment strengthens collaboration, drives innovation, and fosters connection. Our main offices are in Denver, CO; San Francisco, CA; and Santiago, Chile.
Equal employment opportunity at Checkr
Checkr is committed to hiring qualified, talented people from diverse backgrounds for all of its technical, non-technical, and leadership roles. Checkr believes that bringing together and celebrating unique backgrounds, qualities, and cultures enriches the workplace.
At Artefact LatAm, we are a leading consultancy focused on accelerating the adoption of data and artificial intelligence to generate positive impact. The Senior Data Scientist is a highly experienced data analysis professional with deep knowledge of statistics, programming, and machine learning. Their main role is to use these skills to extract meaningful insights and drive strategic, data-informed decisions within the organization.
Beyond developing advanced analytical models, the Senior Data Scientist plays an important role on the team assigned to each client, contributing technical expertise to concrete decisions that move the project forward. Their experience supports everything from conceptualization through implementation, ensuring the delivery of practical, detail-oriented solutions that meet the client's needs.
...and more!
The hybrid arrangement we offer, based in Santiago Centro, combines the flexibility of remote work with in-person collaboration, enabling better balance and a more dynamic way of working.
Coderslab.io is looking to hire a Big Data & Reporting Lead to lead the organization’s data architecture and analytics strategy.
This role will be responsible for designing, governing, and optimizing the enterprise data architecture, ensuring proper structuring, integration, automation, and consumption of data for reporting, advanced analytics, and decision-making.
The position has a strong focus on data architecture, analytical modeling for MicroStrategy, process automation using n8n, and optimization of ETL/ELT data pipelines.
About the client and the project: the company delivers innovative technology solutions and provides opportunities for continuous learning under the guidance of experienced professionals and cutting-edge technologies. The goal is to deliver value in key business processes and improve operational efficiency through SAP.
Data Architecture
Design and govern the data architecture for Big Data and BI platforms.
Define analytical data models for reporting and analytics.
Design data lakes, data warehouses, and data marts aligned with business needs.
Establish data governance, quality, and lineage standards.
Ensure platform scalability, availability, and reliability.
Modeling and Reporting in MicroStrategy
Design and optimize the semantic layer and metadata in MicroStrategy.
Define analytical models and Star Schema structures.
Lead the development of dossiers, operational reports, and analytical cubes.
Optimize queries, performance, and execution times.
Define caching, aggregation, and pre-calculation strategies.
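As a rough illustration of the Star Schema and aggregation work described above (the table names and data here are hypothetical, not part of the posting), a fact table joined to its dimensions and rolled up looks like this, using SQLite for brevity. This is the query shape a semantic layer such as MicroStrategy's generates for reports and cubes:

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE fact_sales (
        product_id INTEGER REFERENCES dim_product,
        date_id INTEGER REFERENCES dim_date,
        amount REAL
    );
    INSERT INTO dim_product VALUES (1, 'hardware'), (2, 'software');
    INSERT INTO dim_date VALUES (10, 2023), (11, 2024);
    INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 11, 50.0), (2, 11, 200.0);
""")

# Aggregate facts by dimension attributes: the basic report/cube query.
rows = conn.execute("""
    SELECT d.year, p.category, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_date d ON d.date_id = f.date_id
    GROUP BY d.year, p.category
    ORDER BY d.year, p.category
""").fetchall()
print(rows)  # [(2023, 'hardware', 100.0), (2024, 'hardware', 50.0), (2024, 'software', 200.0)]
```

Pre-calculation strategies amount to materializing results like `rows` ahead of time so dashboards read the aggregate instead of re-scanning the fact table.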
Automation of Analytical Processes (n8n)
Design data and reporting automation workflows using n8n.
Integrate sources such as APIs, databases, cloud services, and BI tools.
Automate data extraction, report generation, dashboard distribution, and alerts.
Design orchestration pipelines for analytical processes.
Data Processing Optimization
Design and optimize scalable ETL/ELT processes.
Optimize queries, data pipelines, and incremental loads.
Reduce latency and resource consumption in reporting.
Implement efficient data ingestion strategies.
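One common incremental-load strategy the bullets above allude to is a high-water mark: copy only rows newer than the maximum key already present in the target, so each run touches new data only. A minimal sketch (the schema and data are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_events (id INTEGER PRIMARY KEY, payload TEXT);
    CREATE TABLE target_events (id INTEGER PRIMARY KEY, payload TEXT);
    INSERT INTO source_events VALUES (1, 'a'), (2, 'b'), (3, 'c');
""")

def incremental_load(conn):
    """Copy only rows newer than the target's high-water mark."""
    (watermark,) = conn.execute(
        "SELECT COALESCE(MAX(id), 0) FROM target_events").fetchone()
    conn.execute(
        "INSERT INTO target_events SELECT id, payload FROM source_events WHERE id > ?",
        (watermark,))
    return conn.execute("SELECT COUNT(*) FROM target_events").fetchone()[0]

print(incremental_load(conn))  # 3  (initial full load)
conn.execute("INSERT INTO source_events VALUES (4, 'd')")
print(incremental_load(conn))  # 4  (only the new row is copied)
```

The same pattern applies with timestamps or change-data-capture offsets in place of a surrogate key; it is what reduces latency and resource consumption relative to full reloads.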
Technical Leadership and Management
Lead Data Engineering, BI, and Analytics teams.
Track data architecture and reporting projects.
Define the data platform evolution roadmap.
Establish KPIs for reporting performance, data quality, and analytics adoption.
Align business needs with the data architecture.
Experience leading data architecture or analytics platforms.
Experience in analytical data modeling (Star Schema, Data Modeling).
Experience working with Big Data or Data Warehousing platforms.
Experience with MicroStrategy for modeling and reporting.
Experience designing ETL / ELT processes and data pipelines.
Advanced SQL knowledge.
Experience with Python for data processing or automation.
Experience designing scalable data architectures.
Technologies
Big Data & Data Platforms
Spark
Hadoop
Databricks
Snowflake / BigQuery / Redshift
Kafka
Business Intelligence
MicroStrategy
Power BI (nice to have)
Tableau (nice to have)
Automation & Orchestration
n8n
Airflow
REST APIs
Webhooks
Databases
SQL Server
PostgreSQL
Oracle
NoSQL
Data Engineering
Python
Advanced SQL
ETL / ELT pipelines
Experience with workflow automation using n8n.
Experience with orchestration tools such as Airflow.
Experience with Power BI or Tableau.
Knowledge of event-driven or streaming architectures (Kafka).
Experience in data governance, data quality, and data cataloging.
Engagement type: service agreement (prestación de servicios).
Coderslab.io is a company dedicated to transforming and growing businesses through innovative technology solutions. You will join a growing organization with more than 3,000 employees worldwide and offices across Latin America and the United States. You will work on diverse teams that bring together some of the best tech talent for challenging, high-impact projects, alongside experienced professionals, with the opportunity to learn and grow with cutting-edge technologies.
Role Purpose
We are looking for a Data Engineer to design, develop, and support robust, secure, and scalable data storage and processing solutions. This role focuses on data quality, performance, and integration, working closely with technical and business teams to enable data-driven decision making.
Remote | Contractor | High English proficiency
At Improving South America we are looking for a Senior Data Engineer to design and operate highly available data solutions at global scale, working with batch and streaming pipelines that process large volumes of information. The role requires experience building robust pipelines, working with Kafka, PySpark, and data warehouses on AWS, plus strong command of SQL and data modeling.
Role responsibilities:
At TIMining, we turn operational information from mining sites into actionable value through our control and monitoring platforms. This role joins the data team, contributing to the design, development, and operation of ETL pipelines that integrate diverse sources into TIMining's databases and products. You will be part of a project focused on operational continuity, algorithm calibration, and the automation of internal processes to streamline workflows for both the client and the team.
Degree in Data Science Engineering, Civil Engineering, or a related computing field. A minimum of 2 years of experience in similar roles and demonstrable experience implementing ETL pipelines are required. We value advanced proficiency in Python and SQL, hands-on experience deploying applications and working with containers, and experience orchestrating data with tools such as Apache Airflow or Prefect. Also required: proficiency with version control (Git) and collaborative workflows, querying APIs, and advanced database work. Knowledge of Google Suite and Office. Analytical skills, proactivity, and the ability to work both autonomously and as part of a team. Languages: native Spanish; English desirable (upper-intermediate).
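Orchestrators such as Airflow or Prefect, mentioned above, at their core run tasks in dependency order. As a tool-free sketch of that idea (the task names and data are invented for illustration), Python's standard library can topologically sort a tiny extract-transform-load graph:

```python
from graphlib import TopologicalSorter

# Hypothetical ETL steps; in a real pipeline these would be Airflow or
# Prefect tasks -- here a stdlib topological sort stands in for the scheduler.
results = {}

def extract():   results["raw"] = [3, 1, 2]          # pull raw records
def transform(): results["clean"] = sorted(results["raw"])  # clean/sort them
def load():      results["loaded"] = len(results["clean"])  # write downstream

tasks = {"extract": extract, "transform": transform, "load": load}
deps = {"transform": {"extract"}, "load": {"transform"}}  # node -> predecessors

# static_order() yields each task only after all of its dependencies.
for name in TopologicalSorter(deps).static_order():
    tasks[name]()

print(results["loaded"])  # 3
```

Real orchestrators add what this sketch omits: scheduling, retries, backfills, and observability, which is why they are listed as requirements rather than plain Python scripts.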
We are looking for candidates with experience in technology projects and knowledge of the open-pit mining industry, as well as experience with cloud architectures (AWS, Azure, or GCP) and Infrastructure as Code (Terraform, CloudFormation).
Experience in:
- Implementing technology projects.
- Knowledge of the mining industry and its operations.
- Familiarity with agile methodologies, and experience with Infrastructure as Code tools.
- Desirable: knowledge of monitoring solutions and large-scale production data environments.
We offer an environment focused on innovation in the mining industry, with opportunities for professional development and multidisciplinary teamwork. If you fit the profile, we invite you to join TIMining and contribute to the digital transformation of mining operations.
At Artefact LatAm, we are a leading consultancy focused on accelerating the adoption of data and artificial intelligence to generate positive impact. The Senior Data Engineer will lead the development of Big Data projects with clients, designing and executing data architectures that bridge business strategy and technology under the data governance principles established by each client. They will also be responsible for designing, maintaining, and implementing both transactional and analytical data storage structures. The role involves working with large volumes of data from diverse sources, processing them in Big Data environments, and translating the results into sound technical designs and consistent data. They are also expected to review consolidated data integration and describe how interoperability enables multiple systems to communicate with one another.
...and more!
At WiTi we are leading a strategic project to migrate a legacy analytics ecosystem to a modern cloud architecture on AWS. The goal is to standardize, optimize performance, and scale the operation, porting non-standard SQL logic to standard SQL for Amazon Redshift. This effort involves automation to accelerate the migration and reduce errors, plus close interaction with data, BI, and IT teams to ensure enterprise-grade traceability, reproducibility, and data governance.
You will be part of a multidisciplinary team that designs and executes the migration end to end, establishing conversion rules, pipelines, quality controls, and reusable coding guidelines. The project offers cross-cutting visibility into ETL/ELT and data governance best practices in a scalable cloud environment.
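A conversion-rule pipeline for porting non-standard SQL to Redshift-compatible SQL often starts as a table of pattern rewrites applied automatically, which is where the automation and reusable rules mentioned above come in. The rules and query below are hypothetical examples, not the project's actual mappings:

```python
import re

# Hypothetical rewrite rules: legacy-dialect constructs on the left,
# Redshift-compatible equivalents on the right.
RULES = [
    (re.compile(r"\bNVL\s*\(", re.IGNORECASE), "COALESCE("),
    (re.compile(r"\bSYSDATE\b", re.IGNORECASE), "GETDATE()"),
    (re.compile(r"\bSUBSTR\s*\(", re.IGNORECASE), "SUBSTRING("),
]

def to_redshift(sql: str) -> str:
    """Apply each rewrite rule in order; a real migration would also log
    every hit so the conversion stays traceable and reproducible."""
    for pattern, replacement in RULES:
        sql = pattern.sub(replacement, sql)
    return sql

legacy = "SELECT NVL(total, 0), SUBSTR(name, 1, 3) FROM orders WHERE ts < SYSDATE"
print(to_redshift(legacy))
# SELECT COALESCE(total, 0), SUBSTRING(name, 1, 3) FROM orders WHERE ts < GETDATE()
```

Simple textual rules cover only part of a real dialect migration; constructs that change semantics (hierarchical queries, dialect-specific date math) still need human review, which is why the posting pairs automation with quality controls.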
At WiTi we foster a culture of learning and collaboration, focused on high-impact digital and data projects. Benefits include:
You will contribute to building and maintaining data solutions that support analytics, reporting, and operational decision-making across the organization.
Working closely with data engineers and other technical roles, you will support the platforms that let teams turn data into relevant insights.
In this role, you will focus on managing data platforms and their overall performance. You will collaborate with cross-functional teams to understand data requirements, improve existing systems, and deliver solutions that address business needs.
This is an excellent opportunity to keep developing your data engineering skills while helping drive data-informed decisions at scale.
Assetplan is a leading residential rental company with a presence in Chile and Peru, managing more than 40,000 properties and operating more than 90 multifamily buildings. The data team plays a key role in optimizing and steering internal processes through data analysis and visualization solutions, supporting strategic decision-making across the company. This role focuses on designing, developing, and optimizing ETL processes, creating value through reliable, well-governed data.
NeuralWorks is a high-growth company founded 4 years ago. We are working flat out on things that will get people talking.
We are a team that combines creativity, curiosity, and a passion for doing things well. We dare to explore frontiers others don't reach: a Monte Carlo-based predictive model, a convolutional network for face detection, a Bluetooth position sensor, the recreation of an acoustic space using finite impulse response.
These are just some of the challenges where we learn, explore, and complement each other as a team to achieve the unexpected.
We work on our own projects and support corporations through partnerships where, side by side, we combine knowledge with creativity, imagining, designing, and building digital products that captivate and create impact.
The Data & Analytics team works on projects that combine massive data volumes with AI, such as detecting and predicting failures before they occur, optimizing pricing, personalizing the customer experience, optimizing fuel usage, and detecting faces and objects with computer vision.
Within a multidisciplinary team of Data Scientists, Translators, DevOps engineers, and Data Architects, your role will be key to building and providing the systems and infrastructure that enable these services: the foundations on which impact-generating models are built, with services that must scale, with very high availability and fault tolerance; in other words, services that just work. You will also keep an eye on the systems' capacity and performance indicators.
Whatever project you work on, we expect a strong spirit of collaboration, a passion for innovation and code, and an automation-first mindset over manual processes.
As a Data Engineer, your work will consist of:
At NeuralWorks, diversity matters! We firmly believe in creating an inclusive, diverse, and equitable work environment. We recognize and celebrate diversity in all its forms and are committed to offering equal opportunities to all candidates.
"Men apply for a job when they meet 60% of the qualifications, but women apply only if they meet 100% of them." D. Gaucher, J. Friesen, and A. C. Kay, Journal of Personality and Social Psychology, 2011.
We encourage you to apply even if you don't meet every requirement.
WiTi connects tech talent with high-impact projects across Latin America. Our team focuses on systems integration, custom software, and innovative mobile development, with an emphasis on solving complex problems through innovative solutions.
We are looking for a Data Engineer to join a strategic project at one of the country's largest automotive distribution groups, with nationwide operations and a data infrastructure in the midst of transformation and modernization.
You will be responsible for designing, implementing, and documenting processes for loading, transforming, and migrating large volumes of data in an AWS environment. You will work in an enterprise context where quality, traceability, and reproducibility of results are fundamental, collaborating with technical and business teams to ensure data is reliable, scalable, and maintainable.
At WiTi we promote a collaborative environment where a culture of learning is fundamental. Our benefits include:
WiTi connects tech talent with high-impact projects across Latin America. Our team focuses on systems integration, custom software, and innovative mobile development, with an emphasis on solving complex problems through innovative solutions.
We are looking for a Data Engineer to join a strategic project at one of the country's leading automotive groups, with a nationwide presence in the sale of light and commercial vehicles and a data infrastructure in the midst of modernization and scaling.
You will be responsible for designing, implementing, and documenting processes for loading, transforming, and migrating large volumes of data in an AWS environment. You will work in an enterprise context where quality, traceability, and reproducibility of results are fundamental, collaborating with technical and business teams to ensure data is reliable, scalable, and maintainable.
At WiTi we promote a collaborative environment where a culture of learning is fundamental. Our benefits include:
Coderslab.io is a leading global technology solutions company with more than 3,000 employees worldwide, including offices in Latin America and the United States. You will join diverse, high-performing teams working on challenging automation and digital transformation projects. You will collaborate with experienced professionals and work with cutting-edge technologies to drive decision-making and operational efficiency at the corporate level.
Design, develop, and maintain data engineering solutions on AWS.
Implement components and processes using AWS Lambda, Amazon S3, Amazon API Gateway, and Amazon RDS.
Design and maintain infrastructure as code with AWS CloudFormation.
Manage automated deployments and CI/CD pipelines using GitHub Actions integrated with AWS.
Ensure good practices for versioning, testing, observability, and continuous deployment.
Monitor, optimize, and resolve incidents in data components deployed to production environments.
Collaborate with architecture, development, and business teams to translate functional requirements into technical solutions.
Solid experience with AWS Lambda, Amazon S3, AWS CloudFormation, Amazon API Gateway, and Amazon RDS.
Knowledge of integrating and automating deployments to AWS with GitHub Actions.
Experience applying CI/CD practices and infrastructure as code (IaC).
Knowledge of security, permissions, and operational best practices on AWS.
Ability to develop and integrate APIs and data components in the cloud.
Minimum 3 years of experience in data engineering, cloud development, or equivalent roles.
Demonstrable experience working in production AWS environments.
Professional degree in Computer Engineering, Computer Science, or a related field.
Desirable certifications
Remote, full-time
You will own a mature, well-documented data ecosystem with more than 300 DAGs in production, and will take on a strategic migration from Airflow 2.x to Airflow 3.x on Kubernetes, along with rebuilding legacy flows that have scalability and maintenance problems.
Requirements
Nice to have
100% remote (depending on the organization). When project needs require it, the preferred option is working in Santiago on a 3x2 hybrid schedule; in edge cases, fully remote. We aim to maintain a working arrangement that allows the focus and continuity needed to deliver the area's objectives with quality, as soon as possible.
At TCIT, we are leaders in cloud software development with more than 9 years of experience. We work on projects that digitally transform organizations, from agricultural management and online auction systems to solutions for courts and certification monitoring for mining. We take part in international initiatives, collaborating with technology partners in Canada and other markets. Our team drives quality, sustainable solutions with a focus on social impact. We are looking to grow our team with talented people who want to develop and leave their mark on high-impact cloud projects.
We are looking for a Data Engineer with strong Python skills and demonstrable experience working with cloud solutions. The ideal candidate combines technical skills with communication and teamwork to deliver high-performance data solutions.
Technical requirements:
Soft skills:
Experience with cloud data management tools (BigQuery, Snowflake, Redshift, Dataflow, Dataproc).
Knowledge of security and compliance in data environments; experience on projects with social impact or sector regulations.
Ability to write technical documentation in Spanish and English and to mentor teammates.
Hybrid work arrangement.
Our offices are located in the Las Condes district, near the Manquehue metro station.
At 42Labs we don't just build technology: we build solutions where the technical and the human go hand in hand. We work on initiatives that transform businesses across verticals (finance, logistics, and education), creating data platforms that enable better decisions, automate processes, and support reliable analytics. As a Data Engineer focused on Databricks, you will join a team that designs and maintains robust, scalable, quality-oriented pipelines, ensuring data arrives on time, with integrity and traceability. Our goal is a data platform that supports real use cases, from ingestion and processing to modeling and consumption, promoting good practices, collaboration, and continuous improvement within a transparent culture without rigid hierarchies.
Coderslab.io is a company dedicated to transforming and growing businesses through innovative technology solutions. You will join a growing organization with more than 3,000 employees worldwide and offices across Latin America and the United States. You will work on diverse teams that bring together some of the best tech talent for challenging, high-impact projects, alongside experienced professionals, with the opportunity to learn and grow with cutting-edge technologies.
Role objective:
Analysis, design, development, and maintenance of data processing systems in Big Data projects. The professional will build pipelines on cloud and data lake platforms to deliver data models into production, also supporting architecture, platform design, ETL/ELT process development, serverless data engineering, and analytical modeling.
Contract type: fixed-term
At Proyectum Chile, we drive excellence in Project Management through consulting, training, and specialized outsourcing services. We are an international organization present in 12 Latin American countries, sharing knowledge, methodologies, and high-value assets. We are also the leading PMI Authorized Training Partner (ATP) in the region, leading the transformation in project management and agility.
We are looking for a Data Engineer to join a service in the data platform domain, participating in the development of modern solutions in cloud environments, focused on generating value from data. They will be responsible for producing technology assets and data products, translating business requirements into relevant information.
Main duties:
Education:
Required qualifications:
Desirable qualifications:
At CyD Tecnología we are an innovative technology company focused on developing custom web platforms that turn complex processes into simple, efficient solutions. Our team designs and delivers web and mobile applications that automate, integrate, and digitize critical operations, helping companies reduce costs, improve control, and make data-driven decisions in real time.
The Data Engineer will be responsible for designing, developing, and maintaining data solutions for building Power BI dashboards, ensuring the availability, quality, and consistency of information for decision-making.
They will integrate different data sources, transform information, and build the modeling needed to support management reports. They will also participate in process optimization and the continuous improvement of the data models used by the business.
Main duties include:
A degree in Computer Engineering or a related field is required, along with experience developing BI solutions and handling data.
Required qualifications:
The role follows a 4x3 shift schedule on site in Chile's Region II (Antofagasta). Remote work is not available.
Also valued:
The following knowledge or experience will be considered a plus:
The work is part of a FONDEF project titled "¿Cómo progreso en mi aprendizaje?: Sistema inteligente para fortalecer la autorregulación del aprendizaje en línea en estudiantes de educación superior" ("How am I progressing in my learning?: An intelligent system to strengthen self-regulated online learning in higher education students").
This initiative continues a previous project focused on developing predictive and explanatory models of self-regulated learning through learning analytics. In this new stage, the focus is on designing and implementing solutions that actively strengthen students' self-regulation in digital environments, understood as the ability to plan, monitor, and evaluate their own learning process.
The work involves intensive use of educational data and the development of evidence-based tools to improve the learning experience and outcomes in higher education.
Responsible for designing, completing, and optimizing the data model that supports the learning metrics of the LMS platforms administered by the EOL Office, implemented on Open edX.
The goal of the position is to integrate multiple data sources (event logs, relational and non-relational databases) to structure a consistent system of events, learning actions, and per-user and per-course metrics, and to expose this information through APIs for consumption by interfaces and external systems.
The candidate is expected to quickly understand existing data architectures, work on systems under development, and complete both the modeling and the data exposure layer.
Analytical Data Modeling:
- Design, complete, and maintain the analytical data model (events, metrics, dimensions).
- Define analytics-oriented data structures (fact tables, dimensions, relationships).
- Ensure data consistency, traceability, and quality.
- Document models and metric definitions.
Data Source Integration:
- Integrate data from platform event logs, relational databases (SQL), and non-relational databases (MongoDB).
- Design data transformation processes (ETL/ELT).
- Resolve data integration, duplication, and quality issues.
Event Analysis and Structuring:
- Interpret and structure user interaction events.
- Model learning actions and behavior within the platform.
- Define key metrics from events (engagement, progress, usage, etc.).
Data API Development:
- Design and develop APIs to expose metrics and analytical data.
- Implement efficient endpoints for consumption by interfaces and external systems.
- Ensure good design practices (performance, versioning, consistency).
Optimization and Analytical Support:
- Optimize queries and structures for efficient analysis.
- Support the generation of reports and visualizations.
- Collaborate with technical and functional teams.
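As a small sketch of the event-to-metrics modeling described above (the event records are invented, loosely shaped like Open edX tracking-log entries), raw interaction events can be rolled up into per-user, per-course activity counts:

```python
from collections import defaultdict

# Hypothetical event records; a real pipeline would parse these out of
# the platform's tracking logs before aggregation.
events = [
    {"user": "u1", "course": "c1", "event": "play_video"},
    {"user": "u1", "course": "c1", "event": "problem_check"},
    {"user": "u2", "course": "c1", "event": "play_video"},
    {"user": "u1", "course": "c2", "event": "play_video"},
]

def metrics_by_user_course(events):
    """Roll raw events up into per-(user, course) activity counts."""
    counts = defaultdict(int)
    for e in events:
        counts[(e["user"], e["course"])] += 1
    return dict(counts)

print(metrics_by_user_course(events))
# {('u1', 'c1'): 2, ('u2', 'c1'): 1, ('u1', 'c2'): 1}
```

In the fact/dimension terms used above, each event row is a fact and (user, course) are dimension keys; richer metrics such as engagement or progress apply the same grouping with different measures, and the resulting aggregates are what the data APIs would expose.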
Relational databases (PostgreSQL/MySQL): Advanced
Non-relational databases (MongoDB): Advanced
Advanced SQL (complex queries, optimization): Advanced
Analytical data modeling (events, facts, dimensions): Advanced
Python (data processing and backend development): Advanced
Use of ORMs in Python: Intermediate
Handling migrations: Intermediate
REST API development: Advanced
Backend frameworks (FastAPI and/or Django): Advanced
ETL/ELT processes: Intermediate
Event-based systems / user tracking: Intermediate
Git: Intermediate
Linux / Docker: Intermediate
At Datasur, we are leaders in commercial intelligence based on foreign trade data. Our platform processes millions of import and export records from more than 70 countries, and we are ready to scale higher.
We are looking for a Process Engineer with at least one year of experience for a project to automate the data production workflow. The role focuses on mapping, analyzing, documenting, and improving processes, driving the transition from manual operations to standardized, traceable, and scalable models.
The role requires a process-oriented IT perspective: someone able to map flows end to end, detect gaps, define controls, and translate business needs into clear functional requirements. The work spans the entire data lifecycle (ingestion, standardization, quality, monitoring, orchestration, and analytical loading), identifying risks and automation opportunities.
Revel Street LLC helps corporate event planners discover and reach private dining venues through an extensive, dependable database. We use LLMs extensively to gather and enrich venue data, streamline the event planning workflow, and reduce the time and effort required to source options for events such as private dining, cocktail receptions, and conferences. We are looking for an experienced Data Engineer to help us improve data quality, fix existing data issues, and ingest more data from APIs and LLM-based sources to complement our current datasets. Our current stack includes React, TanStack, Cloudflare, Django, and Dagster, and we expect you to design solutions that are scalable, testable, and grounded in core engineering fundamentals.
You’ll proactively turn ambiguous requirements into well-structured engineering plans. You’ll communicate trade-offs and risks early, and you’ll verify outcomes through hands-on testing. You’ll bring a “build, measure, improve” mindset to performance, reliability, and user experience.
Equifax is much more than a credit reporting company; it is a leading global data, analytics, and technology company with a presence in 24 countries. In Chile, it has operated since 1979, delivering critical cybersecurity, identity, and risk solutions to more than 14,000 companies.
The Technology Hub (SDC): What makes this opportunity unique is that Chile is home to the Santiago Development Center (SDC). This center leads Equifax's digital transformation worldwide, concentrating close to 60% of its global technology development.
Culture and Vision: Equifax promotes an environment of collaboration and technical excellence, where local talent takes on the challenge of building solutions with global impact. Its vision is clear: use data and technology to empower financial decision-making around the world.
What will you do day to day?
Technical
Personal
Indefinite-term contract from day one with 23people. Project duration: 6 months, with possible extension.
Some of our benefits
- Knowledge of the software configuration management process
- Administration of Windows Server operating systems (various versions)
- Installations on IIS: web services and Windows services
- Basic knowledge of version control tools such as GIT, TFS, and SVN
- Knowledge of SharePoint and Confluence
- Basic knowledge of operating systems: Linux, Windows Server
- Intermediate knowledge of databases: SQL, Oracle, DB2
- Basic knowledge of Visual Studio
- Installation of SQL ETLs
- Experience deploying web, Windows, client-server, and NodeJs applications
- Proficiency with the SoapUi tool
- Knowledge of the Power Center tool
- Knowledge of the GoAnyWhere tool
About OpenLoop
OpenLoop was co-founded by CEO, Dr. Jon Lensing, and COO, Christian Williams, with the vision to bring healing anywhere. Our telehealth support solutions are thoughtfully designed to streamline and simplify go-to-market care delivery for companies offering meaningful virtual support to patients across an expansive array of specialties, in all 50 states.
Our Company Culture
We have a relatively flat organizational structure here at OpenLoop. Everyone is encouraged to bring ideas to the table and make things happen. This fits in well with our core values of Autonomy, Competence and Belonging, as we want everyone to feel empowered and supported to do their best work.
We're seeking a QA Automation Engineer to join our Data Engineering team and take ownership of quality assurance across our data pipelines and infrastructure. This role will be instrumental in building and maintaining automated test suites that ensure the reliability and accuracy of our healthcare data systems. You'll work closely with a small, focused team of data engineers to establish testing strategies, prioritize coverage for critical data paths, and maintain quality standards as we scale.
• Quality Ownership: Own and maintain the automated test suite that runs in our CI pipeline, including integration tests, data quality checks, and smoke tests for our data infrastructure.
• Strategic Collaboration: Partner closely with data engineers to understand pipeline architecture, identify critical data paths, and develop comprehensive testing strategies that prioritize business-critical data points.
• Test Development: Write and maintain automated tests for data pipelines using Python and TypeScript, ensuring coverage across batch and event-driven workflows.
• Data Validation: Implement data quality checks including row counts, schema validation, key-column validation, idempotency testing, and duplicate handling across ETL processes.
• CI/CD Integration: Build and maintain testing frameworks that integrate seamlessly with our CI/CD pipelines using GitHub Actions, AWS CodePipeline, and CodeArtifact.
• Documentation & Standards: Document test cases, testing strategies, and coverage metrics to establish repeatable quality standards across the data team.
• Continuous Improvement: Identify testing gaps and systematically expand coverage toward end-to-end testing of critical data pipelines.
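The validation categories listed above (row counts, schema checks, idempotency, duplicate handling) can be sketched as plain Python assertions of the kind a pytest suite would collect. Everything here is a hypothetical example: the schema, the order-style rows, and the `upsert` helper are invented for illustration, not OpenLoop's actual test suite.

```python
# Minimal data-quality checks of the kind a pipeline test suite runs.
# Schema, rows, and helpers are hypothetical examples.

EXPECTED_SCHEMA = {"order_id": int, "customer_id": int, "amount": float}

def check_schema(rows, schema=EXPECTED_SCHEMA):
    """Every row must carry exactly the expected columns with the expected types."""
    return all(
        set(row) == set(schema)
        and all(isinstance(row[col], typ) for col, typ in schema.items())
        for row in rows
    )

def check_row_count(rows, expected):
    """Loaded row count must match the source extract."""
    return len(rows) == expected

def find_duplicate_keys(rows, key="order_id"):
    """Return key values that appear more than once (should be empty)."""
    seen, dupes = set(), set()
    for row in rows:
        if row[key] in seen:
            dupes.add(row[key])
        seen.add(row[key])
    return dupes

def upsert(table, rows, key="order_id"):
    """Idempotent load: re-running the same batch must not change the table."""
    by_key = {row[key]: row for row in table}
    for row in rows:
        by_key[row[key]] = row
    return list(by_key.values())

batch = [
    {"order_id": 1, "customer_id": 10, "amount": 25.0},
    {"order_id": 2, "customer_id": 11, "amount": 40.5},
]
table = upsert([], batch)
assert check_schema(table)
assert check_row_count(table, expected=2)
assert not find_duplicate_keys(table)
assert upsert(table, batch) == table  # idempotency: second run is a no-op
```

In a real suite these would be parametrized over critical tables and wired into the CI pipeline so a failing check blocks the deploy.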
• 3 years of experience in QA automation or software testing, with a focus on data pipelines or backend systems.
• 3 years of hands-on experience with Python and TypeScript for test automation.
• Strong experience with CI/CD pipelines (GitHub Actions, AWS CodePipeline, CodeArtifact).
• Hands-on experience working with data lakes and ETL processes on AWS (familiarity with services like S3, Glue, Athena, Lambda, Step Functions, SQS, EventBridge).
• Experience with testing frameworks for Python (pytest, unittest) and TypeScript/JavaScript (Jest, Mocha).
• Understanding of data structures, data modeling concepts, and data lineage.
• Experience testing in a multi-tenant SaaS environment.
• English (C1/C2) fluency.
ISTQB Certification
CodersLab is a company dedicated to developing solutions in the IT industry. We are currently focused on expanding our teams globally to position our products in more Latin American countries, which is why we are looking for a Web Developer to join our team.
You will be part of a challenging and ambitious team eager to innovate in the market, where your ideas and contributions will be highly valuable to the business.
Apply now for this amazing challenge!
Between 2 and 3 years of experience
Contract type: fee-based invoicing (recibo por honorarios)
Work mode: hybrid (3 days in the office)
Role summary
We are looking for a recently graduated engineer with a strong technical foundation and a desire to learn "in the real world" to join a team building inspection solutions for critical assets using robotics, perception (vision/3D), data platforms, and digital twins. Your focus will be connecting the physical world with the digital one: capturing data from robots and sensors, processing it (images, point clouds, telemetry), exposing it through a platform (APIs/dashboards), and turning it into a digital twin useful for operations and maintenance.
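As a rough illustration of that capture, process, and expose loop, the sketch below takes raw 3D points, computes a simple summary (an axis-aligned bounding box), and shapes it as a JSON payload for a platform API. The asset name, payload fields, and summary chosen are invented for the example.

```python
import json

# Toy version of the "capture -> process -> expose" loop: raw sensor
# points in, a processed summary out, shaped for an API or dashboard.

def bounding_box(points):
    """Axis-aligned bounding box of a list of (x, y, z) points."""
    xs, ys, zs = zip(*points)
    return {"min": [min(xs), min(ys), min(zs)],
            "max": [max(xs), max(ys), max(zs)]}

def to_payload(asset_id, points):
    """Shape processed sensor data as a JSON-ready platform payload."""
    return json.dumps({"asset": asset_id,
                       "points": len(points),
                       "bbox": bounding_box(points)})

cloud = [(0.0, 0.0, 0.0), (1.0, 2.0, 0.5), (0.5, 1.0, 3.0)]
payload = json.loads(to_payload("pipe-01", cloud))
assert payload["points"] == 3
assert payload["bbox"]["max"] == [1.0, 2.0, 3.0]
```

A real pipeline would replace the in-memory list with streamed telemetry and richer geometry processing, but the shape of the work is the same.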
Ideal profile
If you are passionate about robotics, software, machine learning, innovation, and development, with a focus on creating new products and services, this is the ideal place to learn and grow professionally. You will be researching, developing, and implementing solutions that combine software and hardware to solve challenging problems with high impact on industry and the environment.
We are a multidisciplinary team with a pleasant, relaxed work environment and a set of unique, well-developed, well-tested services. We are enthusiastic about continuing to develop and deploy innovative services and solutions.
We were finalists for the Avonni National Innovation Award 2019. We also received the Optimus Pipe 2018 award for the best contribution to the tailings transport industry.
Improving South America is a leading IT services company that seeks to positively transform the perception of IT professionals through technology consulting, software development, and agile training. We are an organization with a culture that encourages teamwork, excellence, and fun, inspiring our team to build lasting relationships while delivering cutting-edge technology solutions. Our mission is aligned with the Conscious Capitalism movement, promoting an exceptional work environment that drives personal and professional growth within an open, optimistic, and collaborative atmosphere.
WiTi is a company that supports large retail holdings in implementing omnichannel solutions based on data and audiences. This role joins a retail media unit focused on segmentation and audience products for multiple countries in the region. You will work with business, data, and engineering teams to drive products that connect brands with millions of customers through data-driven advertising platforms, with a focus on scalability and performance.
We are looking for a Senior Full Stack Developer with proven experience in end-to-end application development. The role requires command of Python on the backend, ideally in production environments, and solid experience building interfaces in ReactJS. Candidates should know the retail world and have experience with e-commerce or retail media projects. We value experience with FastAPI (and/or Django), knowledge of Node.js (NestJS) as part of the backend stack, and experience with microfrontends or modular frontend architectures. We expect experience with ETL processes and Big Data platforms, work with monorepos (NX) and CI/CD pipelines, and familiarity with Docker and Kubernetes in production environments. A basic level of English is needed for reading technical documentation.
• Experience with FastAPI and good REST API design practices.
• Knowledge of Node.js (NestJS) as part of the backend.
• Experience with modular frontend architectures (microfrontends).
• Experience with Big Data and ETL pipelines.
• Work with NX monorepos and CI/CD environments.
• Docker and Kubernetes in production.
• Basic English for reading technical documentation.
At WiTi, we offer a fully remote work environment with flexibility and autonomy. We foster a collaborative atmosphere and a culture of continuous learning. Highlighted benefits:
If you are passionate about technology and want to join our team, we would love to meet you.
Breezy HR is a remote-first hiring platform tailored for small and mid-sized businesses. We are expanding our SaaS product with LLM-enabled workflows and a backend-first focus to deliver fast, reliable experiences for both candidates and hiring managers. You’ll contribute to core features, improve data pipelines, and integrate managed AI capabilities (AWS Bedrock) to power smarter recruiting processes. This role sits at the intersection of product engineering and AI-enabled automation, driving end-to-end delivery from design to production.
We’re seeking a senior backend engineer with 7+ years of web application experience and a strong track record shipping scalable, API-driven systems. You’ve built and operated production services in Node.js, including microservices, REST APIs, and asynchronous workflows. You’re comfortable working with data stores like MongoDB and Redis (schema design, indexing, caching, performance). This role requires that you’ve shipped at least one production LLM workflow end-to-end using AWS Bedrock (not a prototype), with reliability and cost/latency in mind. You communicate clearly in English (B2+ required, C1 preferred), document decisions, work autonomously with a bias toward action, and bring strong product ownership, turning ambiguous goals into shipped outcomes. You must be located in Colombia for payroll/compliance.
Deeper AWS infrastructure experience (e.g., Terraform/CDK/CloudFormation, networking, CI/CD, and production observability patterns). Frontend experience with modern frameworks like React, Angular, Vue, or Svelte to help ship end-to-end product changes.
Remote-first environment with flexible collaboration across time zones, a startup-paced team culture, and the opportunity to shape AI-enabled features in a growing SaaS product. Competitive salary in COP, exposure to cutting-edge LLM-driven workflows, and a collaborative, low-ego team. You’ll work with a distributed engineering and product squad focused on fast, reliable delivery.
At Artefact LatAm, we are a leading consultancy focused on accelerating the adoption of data and artificial intelligence to generate positive impact.
As Tech Manager, you will lead the technical vision and strategic execution of advanced Data Engineering, BI, and AI solutions, ensuring scalable, high-impact architectures. You will be the catalyst for complex digital transformations, managing multidisciplinary teams and acting as the critical bridge between clients' business objectives and technological innovation. Your approach will integrate delivery excellence, data governance, and talent development, consolidating global standards that position the company as a technical benchmark in the market.
Data and Technology Capabilities: Design, implement, and scale robust solutions (predictive models, AI-driven segmentation, and real-time BI), guaranteeing technical excellence, scalability, and reliability.
Transformation Leadership: Act as technical lead on data and AI initiatives, guiding teams through complex transformations under engineering best practices and solid architecture.
Strategy and Architecture: Define the technical vision for data platforms and BI ecosystems, aligning infrastructure, cloud, governance, and security decisions with business objectives.
Project Excellence: Own end-to-end execution, quality, and performance. Anticipate technical risks and manage dependencies to ensure on-time, in-scope delivery.
Team and Client Management: Lead and advise multidisciplinary teams, fostering an engineering culture. Act as the primary technical contact for clients, translating business needs into scalable solutions.
Continuous Innovation: Evaluate new data and AI technologies and tools, driving experimentation and proof-of-concept validation.
Company and Project Context
BNamericas is the leading Latin American business intelligence platform with 28 years of experience delivering news, project updates, and data on people and companies across strategic sectors such as Electric Power, Infrastructure, Mining & Metals, Oil & Gas, and ICT. We empower clients to access high-value information to make informed business decisions. The Engineering Lead will play a pivotal role in shaping a growing information platform used across industries and geographies, driving architecture, data workflows, and product evolution.
As part of a dynamic, multicultural team, you will drive high-performance software, data, and cloud initiatives, ensuring scalability, reliability, and security while fostering a culture of engineering excellence. This role combines hands-on development with strategic leadership to deliver a modular, scalable platform and to integrate cutting-edge AI-enabled capabilities where appropriate.
What you’ll bring
• Proven experience in a senior or lead engineering role, ideally within SaaS or data/information platforms.
• Strong hands-on development skills in JavaScript, Node.js, and PostgreSQL, with a track record of scalable system design.
• Solid understanding of DevOps, cloud infrastructure (AWS), and security best practices.
• Experience with data architecture, including data warehousing and transformation pipelines.
• Experience integrating third-party platforms (e.g., Appian) and working with internal data pipelines.
• Familiarity with web scraping technologies, automation, and management of external vendors.
• Exposure to or interest in AI-driven solutions (e.g., agent-based AI) is a strong plus.
• Fluent English is required; Spanish and/or Portuguese are a strong plus.
• Strong communication skills and the ability to collaborate with both technical and non-technical stakeholders.
• A strategic mindset with the ability to balance hands-on delivery and broader technical direction.
• An entrepreneurial attitude focused on quality, ownership, and impact.
Why you’ll love this role
You will shape and advance a growing information platform used across industries and geographies. This is a high-impact position with significant ownership, offering the chance to influence technical direction, data strategy, and product evolution while helping to build a culture of engineering excellence. You’ll work with a collaborative, diverse team in a dynamic market, and you'll have the opportunity to leave a lasting imprint on our platform and product roadmap.
At BNamericas, we foster an inclusive, diverse, creative, and highly collaborative work environment. Our team is dynamic, committed, and always willing to support one another, creating a positive and motivating workplace.
We offer a range of benefits, including referral bonuses for bringing in new talent, early finishes on special occasions such as national holidays and Christmas, opportunities for continuous learning and professional development, and a casual dress code that encourages authenticity and comfort at work.
We invite you to be part of a company that values diversity and work-life balance, and that promotes an empowered, goal-oriented, and passionate way of working. Join us!
In this role, you’ll work on a variety of client projects to find cost-effective, high-quality, pragmatic solutions to complex problems. Responsibilities will include:
We're looking for a Senior Frontend Developer who excels at building pixel-perfect websites using modern frontend frameworks. You'll collaborate with our team to build elegant, performant, and visually stunning web experiences. Your work will span a diverse range of client projects, from immersive brand websites to complex web applications, all requiring a keen eye for detail and technical excellence.
The person we’re looking for is happy, relaxed, and easy to get along with. They’re flexible on anything except compromises that would lower their usually outstanding work quality. They work “smart”, carefully managing their workflow and intelligently staggering features that have dependencies; they prefer deep work but are OK coming up to the surface now and then for top-level, strategic conversations.
We believe people with backgrounds or interests in design, art, music, food, or fashion tend to have a well-rounded sense of design and quality, so a variety of hobbies or side projects is a big nice-to-have!
Quick tip: Kindly submit a complete and thoughtful application, including relevant links that help verify your work experience and identity. Applications with missing or insufficient information will not move forward in the review process.
Our team carefully reviews every complete submission, and we truly appreciate the time and effort you put into applying.
Krunchbox is a retail analytics platform used by global consumer brands to transform point-of-sale data into actionable insights. Our platform ingests and processes large volumes of retail data from retailers across North America and Australia, helping brands understand sales performance, optimize inventory, and make smarter supply chain decisions. We are modernizing our platform architecture and rebuilding core components with a focus on scalability, performance, and AI-driven insights. The Senior Backend Engineer will help design and build the next generation of our data platform, collaborating with engineering, product, and data teams to deliver scalable backend services, data ingestion pipelines, and robust cloud infrastructure. This role is ideal for those who enjoy data-intensive systems, large-scale processing, and AI-enabled workflows in a fast-growing SaaS environment.
Design and build scalable backend services using Python and FastAPI. Develop and maintain data ingestion and processing pipelines that power analytics across global brands. Build and maintain API services that drive the Krunchbox platform. Improve system performance, reliability, and scalability. Implement and maintain cloud infrastructure and DevOps pipelines. Collaborate with product, engineering, and data teams to deliver new capabilities. Participate in architecture and platform design decisions. Write clean, well-tested, maintainable code. Contribute to engineering best practices and documentation.
We are looking for a senior backend engineer with 5+ years of Python experience and a strong background in building API services and data-intensive systems. You should be proficient with FastAPI or similar async frameworks, design RESTful APIs following best practices, and have hands-on experience with cloud platforms (AWS, Azure, GCP) and CI/CD pipelines. You will work on scalable data pipelines, large datasets, and AI-enabled enhancements, contributing to a modern, AI-native engineering culture. Collaboration across engineering, product, and data teams is essential, as is a proactive approach to performance, reliability, and documentation. Familiarity with analytics databases (ClickHouse, Snowflake, BigQuery, Redshift) and data orchestration tools (Airflow, Dagster, Prefect) is highly desirable. We value ownership, fast iteration, and a passion for solving complex engineering problems in a high-growth SaaS environment.
Nice-to-have skills include full-stack experience (React, TypeScript), experience with analytics databases (ClickHouse, Snowflake, BigQuery, Redshift), data pipeline tooling (ETL/ELT), and AI/ML infrastructure familiarity. Prior experience in SaaS startups or high-growth tech companies, and a track record of owning systems from design through deployment, are also beneficial. Comfort with AI-assisted development tools (e.g., Claude Code) to accelerate coding, debugging, and architecture exploration is a plus.
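As a loose sketch of the kind of batched, async ingestion step such data pipelines involve: pull raw records, normalize them to a canonical schema, and hand them off in fixed-size batches for downstream loading. The source records, field names, and batch size here are all illustrative, not Krunchbox's actual code.

```python
import asyncio

# Hypothetical batched, async ingestion step: raw point-of-sale-style
# records are normalized and grouped into batches ready for loading.

async def fetch_records(source):
    """Stand-in for an API or file reader; yields raw records."""
    for record in source:
        await asyncio.sleep(0)  # simulate I/O
        yield record

def normalize(record):
    """Map a raw record to the canonical schema (illustrative fields)."""
    return {"sku": record["sku"].upper(), "units": int(record["units"])}

async def ingest(source, batch_size=2):
    """Collect normalized records into fixed-size batches."""
    batches, batch = [], []
    async for record in fetch_records(source):
        batch.append(normalize(record))
        if len(batch) == batch_size:
            batches.append(batch)
            batch = []
    if batch:
        batches.append(batch)  # flush the partial final batch
    return batches

raw = [{"sku": "a1", "units": "3"}, {"sku": "b2", "units": "1"}, {"sku": "c3", "units": "7"}]
batches = asyncio.run(ingest(raw))
assert batches == [
    [{"sku": "A1", "units": 3}, {"sku": "B2", "units": 1}],
    [{"sku": "C3", "units": 7}],
]
```

In production the same shape would sit behind an orchestrator (e.g. Airflow or Dagster, both named in the posting) with retries and dead-letter handling around each batch.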
ARKHO is a consultancy specializing in information technology, offering expert IT services in application modernization, data analytics, advanced analytics, and cloud migration. Our work eases and accelerates cloud adoption across multiple industries.
We stand out as an Amazon Web Services Advanced Partner with a strategic focus on building solutions using cloud technology. We are obsessed with achieving the objectives we set, and we place special emphasis on the people who make up ARKHO (our Archers), recognizing our team as a vital component in achieving results.
Sound exciting? We look forward to meeting you!
The company promotes an exceptional work culture based on teamwork, excellence, and fun, with a focus on personal growth and shared rewards. Upon joining, the candidate will become part of a community that prioritizes open communication and solid long-term working relationships, backed by a structure for professional development and continuous learning.
We are looking for a Software Architect with experience in Microsoft Azure and data platforms to lead the design of scalable, high-impact solutions.
This role is key to defining the technology architecture, establishing standards, and supporting teams in building robust, secure, and maintainable systems.
About CloudWalk:
We are not just another fintech unicorn. We are a pack of dreamers, makers, and tech enthusiasts building the future of payments. With millions of happy customers and a hunger for innovation, we're now expanding our neural network - literally and metaphorically.
About the Role:
You will join our reporting team, focused on building automation and reporting solutions that scale across all of CloudWalk's products. This is not just about data pipelines: you'll also contribute to the creation of a reporting app, including its infrastructure and a web-based interface. AI will be at the center of everything we do, and you'll be applying it at every step of development.
We're looking for someone with strong critical thinking about data, the grit to overcome challenges, and endless curiosity about technology. You will be at the intersection of compliance, product, and engineering, helping us reimagine how reporting and automation can become smarter, faster, and globally scalable.
Join us at CloudWalk, where we're not just engineering solutions; we're building a smarter, AI-driven future for payments, together.
By applying for this position, your data will be processed as per CloudWalk's Privacy Policy that you can read here in Portuguese and here in English.
You will be the technical and mission expert for Sayari's most strategic government partners. You will embed directly with government analysts, operators, and data scientists to solve their hardest mission-enabling, intelligence and/or law enforcement problems. Your primary objective is to ensure that Sayari is deeply integrated into our clients' workflows, becoming an indispensable tool for missions ranging from sanctions evasion and counter-threat finance to securing critical supply chains. This is software engineering on the front lines, placing you at the critical juncture between our technology, our government clients, and their high-stakes missions.
This role is a blend of a software engineer, a data analyst, and a mission consultant. You will be architecting data pipelines or writing production code one day and brief
At H1, we believe access to the best healthcare information is a basic human right. Our mission is to provide a platform that can optimally inform every doctor interaction globally. This promotes health equity and builds needed trust in healthcare systems. To accomplish this, our teams harness the power of data and AI technology to unlock groundbreaking medical insights and convert those insights into actions that result in optimal patient outcomes and accelerate an equitable and inclusive drug development lifecycle. Visit h1.co to learn more about us.
The Finance team plays a crucial role in creating that future. Our role is to serve as a liaison between H1's commercial and technical teams, overseeing financial reporting, analysis, forecasting, and planning, as well as resource prioritization and business management. With a deep understanding of the business levers underlying the operations of our Infrastructure team, this team is responsible for helping the business drive toward the clear and effective decisions that are critical to the success of the company.
WHAT YOU'LL DO AT H1
As a Finance Analyst, you'll be part of a highly visible team that partners with leaders and departments across the company. You'll support the finance team with quarterly and annual forecasting, expense budgeting, key metrics reporting and analysis, close processes, and variance analysis, while also driving various automation and simplification projects.
- Assist with the preparation of annual budgets and financial forecasts to ensure alignment with the company's strategic goals and key initiatives
- Support the finance team in reporting and analyzing key metrics such as annual recurring revenue (ARR) and churn
- Provide actionable insights on revenue and collection trends, customer retention and profitability, and other key performance drivers
- Assist with the implementation of variable compensation plans for teams across the organization
- Track and calculate monthly, quarterly, and annual sales commissions in accordance with approved compensation plans
- Support monthly financial presentations for both the executive team and board of director meetings
- Implement scalable processes through automation and process improvement to help strengthen the finance foundation
- Perform ad-hoc analysis on critical business needs
ABOUT YOU
You're a data-driven analytical finance professional with experience in FP&A or strategic finance at high-growth enterprise B2B SaaS, healthcare, or marketplace companies. You know how to thrive in a fast-paced and frequently changing environment.
REQUIREMENTS
- 3+ years of experience in a Finance department
- Bachelor's degree in Finance, Accounting, or a related field (MBA is a plus)
- Experience in B2B SaaS financial modeling is a plus
- Advanced skills in Microsoft Excel and PowerPoint (Google Sheets and Slides experience is a plus)
- Excellent communication skills with the ability to interact directly with people at all levels of the organization
- Ability to meet deadlines while working in a fast-paced environment
- Advanced system skills and the ability to learn new systems quickly.
- Strong attention to detail and ability to effectively prioritize tasks
COMPENSATION
This role pays $75,000 to $88,000 per year, based on experience, in addition to stock options.
Anticipated role close date: 01/10/2026
H1 OFFERS
- Full suite of health insurance options, in addition to generous paid time off
- Pre-planned company-wide wellness holidays
- Retirement options
- Health & charitable donation stipends
- Impactful Business Resource Groups
- Flexible work hours & the opportunity to work from anywhere
- The opportunity to work with leading biotech and life sciences companies in an innovative industry with a mission to improve healthcare around the globe
H1 is proud to be an equal opportunity employer that celebrates diversity and is committed to creating an inclusive workplace with equal opportunity for all applicants and teammates. Our goal is to recruit the most talented people from a diverse candidate pool regardless of race, color, ancestry, national origin, religion, disability, sex (including pregnancy), age, gender, gender identity, sexual orientation, marital status, veteran status, or any other characteristic protected by law.
H1 is committed to working with and providing access and reasonable accommodation to applicants with mental and/or physical disabilities. If you require an accommodation, please reach out to your recruiter once you've begun the interview process. All requests for accommodations are treated discreetly and confidentially, as practical and permitted by law.
In the healthcare sector, the Health Insurance Portability and Accountability Act of 1996 (HIPAA) requires that all insurance payers exchange transactions such as claims, eligibility checks, prior authorizations, and remittances using a standardized EDI format called X12 HIPAA. A small group of legacy clearinghouses process the majority of these transactions, offering consolidated connectivity to carriers and providers.
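As a rough illustration of that standardized format: X12 is a flat stream of delimited segments. The toy parser below assumes the common `~` segment terminator and `*` element separator (in real interchanges the separators are declared in the ISA header), and the sample fragment is invented for the example rather than taken from a real transaction.

```python
# X12 messages are flat streams of segments; in the common case segments
# end with "~" and elements are separated by "*". "270" is the X12
# transaction set for eligibility inquiries; the fragment is a toy.
sample = "ST*270*0001~BHT*0022*13*REF123~SE*3*0001~"

def parse_segments(edi, seg_term="~", elem_sep="*"):
    """Split raw X12 into (segment_id, [elements]) pairs."""
    segments = []
    for raw in edi.split(seg_term):
        if not raw:
            continue  # trailing terminator leaves an empty chunk
        parts = raw.split(elem_sep)
        segments.append((parts[0], parts[1:]))
    return segments

parsed = parse_segments(sample)
assert parsed[0] == ("ST", ["270", "0001"])
assert [seg_id for seg_id, _ in parsed] == ["ST", "BHT", "SE"]
```

Real HIPAA transactions add envelope segments (ISA/GS), loops, and qualifier-driven semantics on top of this flat structure, which is exactly the complexity an API layer over EDI hides.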
Stedi is the world's only programmable healthcare clearinghouse. By offering modern API interfaces alongside traditional real-time and batch EDI processes, we enable both healthcare technology businesses and established players to exchange mission-critical transactions. Our clearinghouse product and customer-first approach have set us apart. Stedi was ranked as Ramp's #3 fastest-growing SaaS vendor.
Stedi has lightning in a bottle: engineers and designers shipping products week in and week out; a lean business team supporting the company's infrastructure; passion for automation and eliminating toil; $92 million in funding from top investors like Stripe, Addition, USV, Bloomberg Beta, First Round Capital, and more. To learn more about how we work, watch our founder Zack's interview with First Round Capital.
We're hiring a full-stack data and analytics engineer to build and own the data foundation that will power our daily GTM operations: revenue analytics, product usage telemetry, CRM data quality, attribution, funnel performance, and forecasting.
This is not a typical business analyst position. You will architect the pipelines, models, and automations that ensure our GTM teams have reliable, real-time insights into how customers discover, adopt, and expand with Stedi and our products. You will work closely with Sales, GTM Ops, Product, and Finance, executing data and analytics engineering workstreams, and conducting hands-on analysis to build the source-of-truth data for our GTM operations.
Build and maintain GTM data pipelines: Own ingestion, transformation, and syncing of CRM data (HubSpot), product-usage telemetry, billing data, and third-party enrichment data in Redshift to support GTM analytics workstreams.
Develop core GTM & revenue data models: Improve operational efficiency through standardization of datasets for Sales, GTM Ops, Finance, and the executive team, while establishing common metric definitions across revenue, customer segments, and more.
Ship dashboards, alerts, and decision-making tools: Improve telemetry into business performance by building dashboards to track things like sales funnel performance and pipeline quality. Better inform GTM leadership through automation of weekly/monthly reporting and establishing a revenue forecast.
Investigate trends and build models to support sales: Accelerate sales effectiveness through implementation of alerting for critical events (e.g. pipeline drops, usage contractions, stuck deals, missed lifecycle transitions), conducting key analyses (e.g. pipeline velocity, win rates, segmentation performance), and development of GTM models (e.g. ICP scoring, account prioritization, churn risk).
Own the GTM analytics roadmap: Work with GTM leadership to maintain a backlog of GTM analytics engineering work. Proactively identify the next set of capabilities the GTM org needs (forecasting, routing logic, new usage signals, etc).
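As a rough illustration of the funnel analyses named above (win rates, pipeline velocity), here is a sketch in Python. The deal fields, stage names, and figures are hypothetical, not Stedi's actual HubSpot schema:

```python
# Hypothetical deal records; field names are illustrative only.
from datetime import date

deals = [
    {"stage": "closed_won",  "amount": 40_000, "opened": date(2024, 1, 5),  "closed": date(2024, 2, 9)},
    {"stage": "closed_lost", "amount": 25_000, "opened": date(2024, 1, 12), "closed": date(2024, 3, 1)},
    {"stage": "closed_won",  "amount": 60_000, "opened": date(2024, 2, 1),  "closed": date(2024, 2, 22)},
]

closed = [d for d in deals if d["stage"] in ("closed_won", "closed_lost")]
won = [d for d in closed if d["stage"] == "closed_won"]

win_rate = len(won) / len(closed)  # fraction of closed deals that were won
avg_cycle_days = sum((d["closed"] - d["opened"]).days for d in won) / len(won)
velocity = sum(d["amount"] for d in won) / avg_cycle_days  # won revenue per cycle day

print(f"win rate: {win_rate:.0%}, avg cycle: {avg_cycle_days:.0f} days")
```

In practice these definitions would live in standardized data models (e.g. dbt) so every team computes the same numbers, per the responsibilities above.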
You have exceptional analytical skills: You've made a career of working with data to improve products and overall business operations. You know the tools, best practices, and playbooks necessary to stand up a high-performing and organized analytics function at the company.
You know the tech stack: You write efficient SQL queries to analyze large datasets and can work with complex schemas. You're an expert with data visualization tools like Tableau, QuickSight, or Power BI, and you're familiar with cloud environments (AWS, Azure, GCP).
You create and execute your own work: You notice patterns others miss and dig deep to understand root causes. You've identified data issues or operational inefficiencies that led to meaningful improvements.
You do what it takes to get the job done: You are resourceful, self-motivating, self-disciplined, and don't wait to be told what to do. You put in the hours.
You move quickly: We move quickly as an organization. Matching our pace means responding with urgency (both externally to payers and internally to stakeholders), communicating what you are working on, and proactively asking for help or feedback when you need it.
You are a "bottom feeder": You thrive on the details. No task is too small when it comes to finding success, generating revenue, and improving our costs.
The annual compensation range for this role is $180,000-$230,000. For roles with a variable component, the range provided is the role's On Target Earnings ("OTE") range, which means that the range is inclusive of the sales commissions or bonus target and annual base salary. This range may be inclusive of multiple experience levels at Stedi and will be narrowed during the interview process based on a number of factors, including the candidate's experience, location, and qualifications. Please reach out to your recruiter with any questions.
We've been made aware of individuals impersonating the Stedi recruiting team. Please note:
All official communication about roles at Stedi will only come from an @stedi.com email address.
If you're unsure whether a message is legitimate or have any concerns, feel free to contact us directly at careers@stedi.com.
We appreciate your attention to this and your interest in joining Stedi.
At Stedi, we're looking for people who are deeply curious and aligned to our ways of working. You're encouraged to apply even if your experience doesn't perfectly match the job description.
What we're building and why we're building it.
Every month, millions of people use Fetch, earning rewards for buying brands they love, and a whole lot more. Whether shopping in the grocery aisle, grabbing a bite at the drive-through or playing a favorite mobile game, Fetch empowers consumers to live rewarded throughout their day. To date, we've delivered more than $1 billion in rewards and earned more than 5 million five-star reviews from happy users.
It's not just our users who believe in Fetch: with investments from SoftBank, Univision, and Hamilton Lane, and partnerships ranging from challenger brands to Fortune 500 companies, Fetch is reshaping how brands and consumers connect in the marketplace. When you work at Fetch, you play a vital role in a platform that drives brand loyalty and creates lifelong consumers with the power of Fetch points. User and partner success are at the heart of everything we do, and we extend that same commitment to our employees.
At Fetch, we value curiosity, adaptability, and the confidence to explore new tools, especially AI, to drive smarter, faster work. You don't need to be an expert, but you should be ready to learn quickly and think critically. We welcome learners who move fast, challenge the status quo, and shape what's next, with us. Ranked as one of America's Best Startup Employers by Forbes for two years in a row, Fetch fosters a people-first culture rooted in trust, accountability, and innovation. We encourage our employees to challenge ideas, think bigger, and always bring the fun to Fetch.
Fetch is an equal employment opportunity employer.
About the Role:
We are seeking a Machine Learning Software Engineer to join Fetch's Scan, Match & Catalog team. This role sits at the intersection of applied machine learning, data engineering, and production systems, with a focus on improving receipt understanding, product matching, and catalog enrichment at scale. You w
We're an AI-first global tech company with 25+ years of engineering leadership, 2,000+ team members, and 500+ active projects powering Fortune 500 clients, including HBO, Microsoft, Google, and Starbucks.
From AI platforms to digital transformation, we partner with enterprise leaders to build what's next.
What powers it all? Our people are ambitious, collaborative, and constantly evolving.
A U.S.-based education services provider offering online and campus-based post-secondary education, primarily serving military personnel, veterans, and public service communities. The organization delivers degree and certificate programs across disciplines such as nursing, health sciences, business, IT, and liberal arts. In addition to its headquarters in West Virginia, the customer operates facilities and partner institutions across the United States. The primary product areas to work with are learning management systems, student enrollment, and academic operations on web and mobile platforms.
- Exposure to Power BI (optional)
- Experience with Salesforce data integration
- Background in higher education / ed-tech domains
Intermediate/Upper-Intermediate
Who We Are:
Alpaca is a US-headquartered self-clearing broker-dealer and brokerage infrastructure for stocks, ETFs, options, crypto, fixed income, 24/5 trading, and more. Our recent Series C funding round brought our total investment to over $170 million, fueling our ambitious vision.
Amongst our subsidiaries, Alpaca is a licensed financial services company, serving hundreds of financial institutions across 40 countries with our institutional-grade APIs. This includes broker-dealers, investment advisors, wealth managers, hedge funds, and crypto exchanges, totalling over 6 million brokerage accounts.
Our global team is a diverse group of experienced engineers, traders, and brokerage professionals who are working to achieve our mission of opening financial services to everyone on the planet. We're deeply committed to open-source contributions and fostering a vibrant community, continuously enhancing our award-winning, developer-friendly API and the robust infrastructure behind it.
Alpaca is proudly backed by top-tier global investors, including Portage Ventures, Spark Capital, Tribe Capital, Social Leverage, Horizons Ventures, Unbound, SBI Group, Derayah Financial, Elefund, and Y Combinator.
Our Team Members:
We're a dynamic team of 230+ globally distributed members who thrive working from our favorite places around the world, with teammates spanning the USA, Canada, Japan, Hungary, Nigeria, Brazil, the UK, and beyond!
We're searching for passionate individuals eager to contribute to Alpaca's rapid growth. If you align with our core values of Stay Curious, Have Empathy, and Be Accountable, and are ready to make a significant impact, we encourage you to apply.
About the Role:
We are seeking an Analytics Engineer to own and execute the vision for our data transformation layer. You will be at the heart of our data platform, which processes hundreds of millions of events daily from a wide array of sources, including transactional databases, API logs, CRMs, payment systems, and marketing platforms.
You will join our 100% remote team and work closely with Data Engineers (who manage data ingestion) and Data Scientists and Business Users (who consume your data models). Your primary responsibility will be to use dbt and Trino on our GCP-based, open-source data infrastructure to build robust, scalable data models. These models are critical for stakeholders across the company, from finance and operations to the executive team, and are delivered via BI tools, reports, and reverse ETL systems.
What You'll Do:
Gauntlet leads the field in quantitative research and optimization of DeFi economics. We manage market risk, optimize growth, and ensure economic safety for protocols facilitating most spot trading, borrowing, and lending activity across all of DeFi, protecting and optimizing the largest protocols and networks in the industry. We build institutional-grade vaults for decentralized finance, delivering risk-adjusted onchain yields for capital at scale. Designed by the most vigilant, quantitative minds in crypto and informed by years of research.
As of November 2025, Gauntlet manages over $2B in vault TVL, and optimizes risk and incentives covering over $42 billion in customer TVL. We continually publish cutting-edge research that informs our risk models, alerts, and analysis, and is among the most cited institutions, including academic institutions, in terms of peer-reviewed papers addressing DeFi as a subject. We're a Series B company with around 75 employees, operating remote-first with a home base in New York City.
As a company, we build institutional-grade vaults that deliver risk-adjusted DeFi yields at scale, powered by automated risk models and off-chain intelligence. Gauntlet curates strategies across Morpho, Drift, Symbiotic, Aera and more, with >$2B in vault TVL and a growing suite of Prime, Core and Frontier vaults.
Our mission is to drive adoption and understanding of the financial systems of the future. We operate with a trader's discipline and a risk manager's skepticism: size carefully, stress routinely, unwind decisively. The label equals the package equals the contents. No surprises, just predictable, reliable vaults.
Join our derivatives trading team and work on the key infrastructure that powers our product offering as well as trading systems. Work with a team with decades of experience in tech and finance to build the backbone of our high-performance derivatives trading strategies. You'll work close to trading, own critical infrastructure end-to-end, and ship systems that manage real capital in live crypto markets.
Please note at this time our hiring is reserved for potential employees who are able to work within the contiguous United States and Canada. Should you need alternative accommodations, please note that in your application.
The national pay range for this role is $165,000 - $205,000 plus additional On Target Earnings potential by level and equity in the company. Our salary ranges are based on paying competitively for a company of our size and industry, and are one part of many compensation, benefits and other reward opportunities we provide. Individual pay rate decisions are based on a number of factors, including qualifications for the role, experience level, skill set, and balancing internal equity relative to peers at the company.
#LI-Remote
About StarCompliance
StarCompliance is on a mission to make compliance simple and easy. Trusted globally by enterprise financial institutions, the user-friendly STAR platform empowers organizations to achieve regulatory compliance while safeguarding their integrity and business reputations. Through a customizable, 360-degree view of employee activity, the STAR software enables firms to automate the detection and resolution of potential areas of conflict while streamlining daily workflows and increasing efficiency.
Role
StarCompliance is looking for a senior, hands-on Data Operations & Migration Specialist to oversee our data feed operations and client data migration capabilities. This role combines technical leadership with day-to-day delivery, acting as a player coach who sets direction, unblocks issues, and still gets hands-on when it matters.
You will own the operational health of broker and client data feeds, lead complex data migration initiatives during client onboarding, and provide mentorship and technical guidance to engineers and analysts across both functions. Deep domain knowledge in financial services data, particularly regulated trading, transaction, or reference data, is critical.
This role sits within the Enterprise Data function and works closely with R&D, Client Support Services, Professional Services, and Relationship Management to ensure client data is secure, accurate, compliant, and delivered on time.
StarCompliance Background Checks
All positions require pre-employment screening because employees may have access to highly sensitive and confidential information involving finance and compliance; candidates must be trustworthy and have a heightened sensitivity to protecting confidential financial and professional information. To be eligible for employment with StarCompliance, candidates must undergo a rigorous background investigation, with checks including, but not limited to, criminal record history, consumer credit, employment history, qualifications, and education.
Equal Opportunity Employer Statement
We prohibit discrimination and harassment of any kind based on race, sex, religion, sexual orientation, national origin, disability, genetic information, pregnancy, gender identity or expression, marital/civil union/domestic partnership status, veteran status or any other protected characteristic as outlined by country, state, or local laws.
This policy applies to all employment practices within our organisation, including hiring, recruiting, promotion, termination, layoff, recall, leave of absence, compensation, benefits, training, and apprenticeship. StarCompliance makes hiring decisions based solely on qualifications, merit, and business needs at the time. For more information, please request a copy of our Equal Opportunities Policy.
Offshore CFO (Multifamily Real Estate) - Job Description
Overview
We are hiring a CFO to lead the finance and accounting function for a U.S.-based multifamily owner/operator. This role owns financial statements, monthly close, cash management, budgeting/forecasting, reporting, and controls across multiple properties and entities. The right candidate is tech-forward and excited to modernize finance through automation, AI, and API-driven integrations.
Key Responsibilities
⢠Monthly close & financial statements: Own timely, accurate close and delivery of P&L, balance sheet, and cash flow
with supporting schedules.
⢠Reconciliations & controls: Ensure complete bank/GL reconciliations, AR/AP tie-outs, accruals, prepaids, CIP/fixed
assets, intercompany, and documented processes.
⢠Management reporting: Produce property/portfolio reporting including budget vs. actual, variance explanations, and
key operating KPIs.
⢠Cash management: Maintain daily cash visibility and a rolling 13-week cash forecast; manage payment cadence,
approvals, reserves, and liquidity planning.
⢠Budgeting & forecasting: Lead annual budgets and reforecasts (revenue, payroll, utilities, repairs, insurance, taxes,
CapEx).
⢠CapEx / renovation tracking: Track project budgets, spend, and ROI support (CIP and unit-level economics as
applicable).
⢠Lender / compliance support: Manage covenant reporting, lender deliverables, and coordination with CPAs/tax/audit
teams.
⢠Section 8 / Housing Authority & municipal compliance: Support affordable housing reporting and compliance (as
applicable), including coordination with Housing Authorities/cities, audits, and required documentation.
⢠Team leadership: Lead and develop offshore accounting staff (AP/AR/accountants); set SOPs, close calendar, and
review standards.
⢠Tech/automation leadership: Implement and optimize workflows using AI tools, automation, and API connections
across property management, accounting, reporting, and data pipelines.
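The rolling 13-week cash forecast named under cash management can be sketched as follows; the opening balance and weekly figures below are illustrative only, not tied to any actual portfolio:

```python
# Minimal sketch: project ending cash for each of the next 13 weeks
# from an opening balance and expected weekly inflows/outflows.

def thirteen_week_forecast(opening_cash, weekly_inflows, weekly_outflows):
    """Return the projected ending cash balance for each of 13 weeks."""
    assert len(weekly_inflows) == len(weekly_outflows) == 13
    balances, cash = [], opening_cash
    for inflow, outflow in zip(weekly_inflows, weekly_outflows):
        cash += inflow - outflow
        balances.append(cash)
    return balances

balances = thirteen_week_forecast(
    opening_cash=250_000,
    weekly_inflows=[80_000] * 13,                 # e.g. rent collections
    weekly_outflows=[70_000] * 12 + [150_000],    # week 13 includes an insurance payment
)
print(balances[0], balances[-1])  # week-1 and week-13 ending cash
```

In practice this "roll" means re-running the forecast every week with actuals replacing the elapsed week and a new week 13 appended, which is what keeps liquidity planning current.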
Requirements (Must-Have)
⢠Minimum 8+ years of experience as a CFO (or senior finance leader) in real estate; multifamily strongly preferred.
⢠Expert in financial statements, close management, reconciliations, cash forecasting, and internal controls.
⢠Strong ability to deliver decision-ready reporting (budget vs. actual, variance analysis, KPIs).
⢠Bilingual proficiency: fluent professional English and Spanish (written and spoken).
⢠Property management software experience; ResMan preferred.
⢠Expense management software experience with Brex or Ramp; Brex preferred.
⢠Experience working with Section 8 programs, Housing Authorities, and municipal/city requirements (as applicable),
including compliance reporting and audit support.
⢠Strong understanding of real estate legal entities and structures (LLCs/LPs/SPVs), intercompany accounting, and
entity-level reporting.
⢠Tech-forward mindset: comfortable implementing automation/AI and working with APIs/integrations (no coding
required, but must be fluent with modern tools).
⢠Advanced Excel/Google Sheets skills; comfortable building standardized reporting templates and dashboards.
⢠Ability to work offshore with consistent overlap with U.S. business hours and days (ET/CT preferred).
Preferred
⢠Multi-entity consolidation, lender compliance/covenants, and renovation-heavy portfolios.
⢠Experience with BI/reporting tools (Power BI/Tableau) and modern AP/bill pay tools.
Working Model
⢠Remote / Offshore (LATAM preferred for timezone overlap)
⢠Reports to Ownership/CEO/Managing Partner; partners closely with Operations and Asset Management
Revinate is one of the largest and most innovative providers of direct revenue-generating solutions in the hospitality industry. Revinate's mission is to deliver hoteliers scalable direct revenue and profits from data-driven solutions that cultivate deeper relationships with guests. Revinate's Direct Booking Platform helps capture, convert and retain guests with strategies and services that maximize direct booking revenue. This combination maximizes the lifetime value of each guest through personalized and targeted campaigns across the guest journey. Revinate Marketing has won 1st place for Hotel CRM & Email Marketing in the HotelTechAwards five years in a row!
About Us
Revinate is an innovative hospitality tech company that is revolutionizing how customers manage their operations and enhance the guest experience. Our solutions leverage advanced technology, data analytics, and automation to improve efficiency and drive customer happiness in the hospitality industry.
The Opportunity
We are seeking an experienced and visionary Director, Data Engineering to lead our Data Platform initiatives. In this critical role, you will be responsible for defining the strategy, architecture, and execution of our end-to-end data ecosystem, encompassing data ingestion pipelines, operational data stores, our evolving data lakehouse, and robust data APIs. You will build and lead a high-performing team of data engineers, fostering a culture of innovation, collaboration, and operational excellence. This role requires not only deep technical expertise but also a strong understanding of how data can drive business value, including leveraging data science and machine learning to optimize our operations.
Key Responsibilities
Strategic Leadership: Define and execute the long-term vision and roadmap for our data platform, aligning with overall business objectives and technology strategy.
Team Leadership & Development: Recruit, mentor, and lead a talented team of data engineers, fostering their growth and ensuring best practices in data engineering.
Data Pipelines: Oversee the design, development, and maintenance of scalable and reliable real-time data ingestion pipelines, ensuring data quality, accuracy, and timely delivery.
Operational Data Stores: Lead the architecture and management of our operational data stores, optimizing for performance, reliability, and accessibility to support critical business applications.
Data Lakehouse Development: Drive the strategic evolution and implementation of our data lakehouse, enabling unified data access, advanced analytics, and machine learning initiatives.
Data API Development: Champion the design and development of secure, performant, and well-documented data APIs to facilitate data consumption across various applications and user groups.
Data Governance & Quality: Enforce data governance policies, standards, and procedures to ensure data integrity, security, privacy, and compliance.
Operational Efficiency through Data Science/ML: Collaborate closely with data science and analytics teams to identify opportunities where data science and machine learning can be applied to optimize internal operations, automate processes, and improve efficiency within the data platform itself (e.g., predictive maintenance for pipelines, intelligent resource allocation).
Performance & Scalability: Ensure the data platform is highly performant, scalable, and resilient, capable of handling growing data volumes and complex analytical workloads.
Technology Evaluation: Evaluate and recommend new data technologies, tools, and platforms to enhance our data capabilities and stay ahead of industry trends.
Cross-Functional Collaboration: Partner effectively with engineering, product, analytics, data science, and business teams to understand data requirements and deliver impactful solutions.
Monitoring & Support: Establish robust monitoring, alerting, and on-call support processes for all data systems, ensuring high availability and rapid issue resolution.
Interview Process
We're excited you're considering a career with Revinate! Our goal is to ensure this is the right opportunity for you, while also determining if you're the right fit for our team. The interview process for this role is designed to be a two-way street, where you'll get to know us just as we get to know you.
- Recruiter Screen - 30 min
- Technical Interview - 60 min
- Cross Functional Interview - 30 min
- Final Interview - 30 min
Revinate values the flexibility of a remote workforce and the benefits of localized hiring. We focus on specific cities to foster local communities and enhance team cohesion, allowing employees to collaborate, attend local events, and build a strong sense of community and company culture.
Candidates must be located in the city listed in the job application. Thank you!
Revinate is not open to third party solicitation or resumes for our posted FTE positions. Resumes received from third party agencies that are unsolicited will be considered complimentary.
Important Security Alert
We have been made aware of fraudulent activities involving individuals impersonating our HR team and offering fake job opportunities. Please be vigilant and ensure your safety by verifying all job offers.
For Authentic Opportunities: Only refer to our official careers page on our company website. Your security is our priority. If you encounter any suspicious activity, please report it immediately. Stay safe and secure! You can confirm or inquire with any questions by reaching out to recruiting@revinate.com
AI and Hiring
Please note that interviews at Revinate will be recorded using brighthire.ai. As we continue to build more structure into our interview processes (the best way to eliminate unconscious bias!), we are encouraging our interviewers to focus more on our candidates and the conversation than on taking notes. Instead, we can rely on brighthire.ai to do the note-taking for us. If you're uncomfortable with recording your interview, please let us know and we'll opt you out.
Excited?! Want to learn more? Apply Now!
Our Core Values:
One Revinate - United & Strong, on a single mission together
Built on Trust - It's the foundation of everything we do
Expect Amazing - We think, dream & deliver big
Customer Love - When the customer wins, we win
Make it Simpler - Apply it to everything we do
Hungerness - Feel it, follow it, be relentless about our success
Grounded in Gratitude - We're glad to be here & make the most of every day
Revinate Inc. provides Equal Employment Opportunity to all employees and applicants for employment without regard to race, color, religion, gender identity or expression, sex, sexual orientation, national origin, age, disability, genetic information, marital status, amnesty, or status as a covered veteran in accordance with applicable federal, state and local laws. Revinate complies with applicable state and local laws governing non-discrimination in employment in every location in which the company has facilities.
If you are in need of accommodation or special assistance to navigate our website or to complete your application, please send an e-mail with your request to recruiting@revinate.com.
By submitting your application you acknowledge that you have read Revinate's Privacy Policy (https://www.revinate.com/privacy/)
About the Company:
Netomi is the leading agentic AI platform for enterprise customer experience. We work with the largest global brands like Delta Airlines, MetLife, MGM, United, and others to enable agentic automation at scale across the entire customer journey. Our no-code platform delivers the fastest time to market, lowest total cost of ownership, and simple, scalable management of AI agents for any CX use case. Backed by WndrCo, Y Combinator, and Index Ventures, we help enterprises drive efficiency, lower costs, and deliver higher quality customer experiences.
Want to be part of the AI revolution and transform how the world's largest global brands do business? Join us!
Job description
We are looking for a Software Development Intern to help us with coding, fixing, executing, and versioning existing code for applications. If you're passionate about solving fundamental real-time problems and eager to explore, learn, and work on technologies beyond your usual scope, Netomi is the perfect place for you.
Netomi is an equal opportunity employer committed to diversity in the workplace. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, disability, veteran status, and other protected characteristics.
At Lalamove, we believe in the power of community. Millions of drivers and customers use our technology every day to connect with one another and move things that matter. Delivery is what we do best and we ensure it is always fast and simple. Since 2013, we have tackled the logistics industry head on to find the most innovative solutions for the world's delivery needs. We are full steam ahead to make Lalamove synonymous with delivery and on a mission to impact as many local communities as we can. We have massively scaled our efforts across Asia and now have our sights on taking our best-in-class technology to the rest of the world. And we are looking for talented professionals to join us in this journey!
As a Senior Data Engineer at Lalamove, you will be joining the global Data team as a key member of our expanding technology team in our new market. Due to the importance of user privacy and our commitment to compliance laws, we need an additional engineer to support our operations in the expanding market, while collaborating closely with our global engineering team.
To all candidates- Lalamove respects your privacy and is committed to protecting your personal data.
This Notice will inform you how we will use your personal data, explain your privacy rights and the protection you have by the law when you apply to join us. Please take time to read and understand this Notice. Candidate Privacy Notice: https://www.lalamove.com/en-hk/candidate-privacy-notice
Customer data is the fuel that drives all modern businesses. From product analytics, to marketing, to support, to advertising, advanced data analysis in the warehouse, and even sales: customer data is the raw material for each function at a modern business.
For highly regulated businesses in healthcare, it's always been a challenge to harness that customer data and get it to the marketing and analytics tools that require it while following patient privacy laws... until now.
Something as simple as running ads to get more users is straightforward for an e-commerce or software company. But common web analytics and advertising tools collect sensitive user identifiers and healthcare information automatically. Those same tools are not HIPAA compliant.
We provide a layer of data governance to make current web analytics tools HIPAA-compliant. For analytics, our customers can continue getting the insights they need to improve the patient experience. For marketing, Freshpaint safeguards health information while helping our customers promote access to care through popular advertising platforms like Facebook, Google, and others.
In short, we help healthcare marketers promote access to care and safeguard patient privacy at the same time. This is an important, complex problem in a massive market (healthcare is 20% of the US GDP).
Our customers manage their customer data with:
Privacy Platform. We help healthcare providers automate their website's and app's HIPAA compliance and safeguard patient data. This is our core product today.
Future additional product lines! Our core product provides a platform that we're building marketing applications on top of.
We're fully remote. If you strongly value in-person work, Freshpaint is likely not the best fit for you. Even though we don't care where you're located, we only hire within the US. Many of our team members are concentrated in metro areas like SF or NYC. To balance out our remoteness, we gather the team twice per year for offsites. We're backed by leading investors including Y Combinator, Intel Capital, and angel investors like the Head of Data from Slack, the Head of Data at LinkedIn, and more.
Freshpaint was founded by web analytics veterans who realized how hard it was for highly regulated companies to collect and use customer data in a compliant way. We started as part of Y Combinator's S19 cohort and have been focused on enabling healthcare companies to collect, safeguard, and activate patient data ever since.
In 2022 the government issued updated guidance around HIPAA, effectively making our software a requirement for healthcare companies. As a result, we're one of the fastest-growing software companies on earth right now.
Our team has deep analytics and growth experience, with all of us coming from high-growth companies like Heap, Pendo, Iterable, Quantum Metric, and Retool. If you value lots of freedom and ownership in your work, interfacing with customers, and working on a product with high customer impact, then Freshpaint is your home.
At Freshpaint, we believe that strong Engineering teams are built of individuals who
Solve problems, not tickets: jump into unfamiliar territory and learn what's needed to move the team forward
Think like owners: focus on delivering measurable business impact rather than completing tasks
Elevate others: actively mentor, unblock, and celebrate teammates, knowing the team's wins are your wins
We are looking for a Senior Software Engineer - Data to join one of our product-oriented teams. As Freshpaint has grown, our products have become more sophisticated and increasingly leverage multiple sources of data. We're seeking a Software Engineer with competencies in data and data engineering to help us shape the next generation of Freshpaint products. We believe there's a big opportunity ahead, and this person will contribute to the team's success by building new products and by influencing how we incorporate data into our product offerings.
Use your expertise to build Software Products that rely on data
Deliver business outcomes by either directly owning, or guiding others to build reliable and scalable products
Mentor engineers and analysts on best practices for data quality, reliability, testing, monitoring, and documentation
Partner closely with analytics, product, and engineering teams to identify data requirements and translate them into robust, scalable solutions
Join customer calls (both internal teams and external users) to hear firsthand what problems they're solving and what features actually move the needle
Design and refine data models that underpin product functionality while implementing monitoring systems to ensure reliability and performance
Collaborate with our Data Guild to define the organization's data strategy, influencing decisions on tooling, architecture, and engineering standards
Solve problems side-by-side with team members through a combination of pairing and solo work
If this sounds like you, we would love to chat!
5+ years of experience building products, in software engineering, data engineering, or a closely related role
Strong customer orientation, with a focus on details that drive product impact and customer value
Proven experience building and maintaining production-grade data pipelines
Proficiency in application development
Proficiency in SQL and at least one data engineering language (e.g., Python, Scala, or Java)
Hands-on experience with large-scale data warehouses, regardless of specific tooling
Experience with data visualization and the ability to tell clear, compelling stories with data
Hands-on experience with modern data warehouses and data modeling best practices
Experience working with cloud-based data platforms (AWS, GCP, or Azure)
Familiarity with orchestration tools, version control, and CI/CD best practices
Ability to work independently, make sound architectural decisions, and thrive in ambiguous environments
Strong communication skills and comfort collaborating with both technical and non-technical partners
Experience being an early data engineer at a company
Experience with Golang, TypeScript, or dbt (Data Build Tool)
Experience with tools like Snowflake, Looker, or Fivetran
Experience with analytics engineering or BI tooling
Prior experience helping scale a data platform as the company grows
Build the foundation for what's next. You'll architect the data systems and strategy that power Freshpaint's future, shaping how the company scales for years to come
See your impact everywhere. Your work will touch every team and product at Freshpaint, giving you visibility into how engineering decisions drive real business outcomes
Code one day, strategize the next. You'll split your time between writing code and making architectural decisions that set technical direction. Perfect if you want to keep your hands on the keyboard while influencing the big picture
At the start of the call, we will briefly go through a few standard verification steps to ensure we're speaking to the right person. This helps protect both candidates and our team against AI misuse. If at any point we get the sense we aren't speaking with the right candidate, we reserve the right to end the call early.
Recruiter Screen
Hiring Manager Call
Virtual Onsite with Technical Pairings
CEO Interview
Offer!
We take care of our team. Here's a peek at what you get when you join:
Competitive pay + generous equity (10-year exercise window)
Fully remote (U.S. only) with a $150/month coworking stipend
Half-day Fridays, every Friday
Unlimited PTO, with a required 2-week minimum
Top-tier health, dental & vision (100% covered for you, 80% for dependents)
2 "Treat Yourself" days a year: $100 and a day off, just because
Generous parental leave
Epic offsites twice a year (past trips: Greece, Jackson Hole, Cabo, wine country + more)
And more: check out our careers page for the full list.
Who we are
Zus is a shared health data platform designed to accelerate healthcare data interoperability by providing easy-to-use patient data via API, embedded components, and direct EHR integrations. Founded in 2021 by Jonathan Bush, co-founder and former CEO of athenahealth, Zus partners with HIEs and other data networks to aggregate patient clinical history and then translates that history into user-friendly information at the point of care. Zus's mission is to catalyze healthcare's greatest inventors by maximizing the value of patient insights - so that they can build up, not around.
What we're looking for
We're looking for an experienced Software Engineer to join the "Costco" team at Zus, which builds services for managing our rapidly growing bulk data offerings while adhering to complex healthcare access control requirements.
The ideal candidate will be excited to take on the challenge of processing, storing, and delivering the entire health records of millions of patients, adopting tools to handle growing scale, and ensuring high data quality and freshness. You are creative and innovative, and you love running experiments to explore ways to evolve and develop our platform as we scale.
As part of the core Zus platform, the Costco team has needed to rapidly innovate to stay ahead of data volumes that grow 10x per year and a growing base of data-savvy customers using data to improve patient care. They are also contending with an evolving regulatory landscape in data privacy and security.
On the Costco team, you will work with microservices in Go, streaming data pipelines in AWS, and state-of-the-art data technologies including Apache Iceberg, Apache Spark, Snowflake, and dbt. Expect to learn a lot and be put on mission-critical projects with direct customer impact.
We will offer you...
• Competitive compensation that reflects the value you bring to the team: a combination of cash and equity
• Robust benefits that include health insurance, wellness benefits, a 401k with a match, and unlimited PTO
• The opportunity to work alongside a passionate team that is determined to help change the world (and have fun doing it)
Please Note: Research shows that candidates from underrepresented backgrounds often don't apply unless they meet 100% of the job criteria. While we have worked to consolidate the minimum qualifications for each role, we aren't looking for someone who checks each box on a page; we're looking for active learners and people who care about disrupting the current healthcare system with their unique experiences.
We do not conduct interviews by text nor will we send you a job offer unless you've interviewed with multiple people, including the Director of People & Talent, over video interviews. Job scams do exist so please be careful with your personal information.
Office Hours is an on-demand expert network that connects leading organizations with trusted experts across various knowledge domains. Experts earn income by sharing their knowledge through advisory work, projects, and AI model training. Our platform handles the complexities behind the scenes (screening, compliance, scheduling, and payments) so knowledge sharing stays focused on meaningful insights and real impact.
We're a profitable, hyper-growth company, quickly expanding our expert network and launching new offices and new products. We are headquartered in San Francisco, with offices in Brooklyn and Bangalore. Our customers include the fastest-growing digital health companies, technology companies, institutional investment firms, consulting firms, and AI labs. We are backed by top marketplace investors and operators from companies like DoorDash, Airbnb, and Affirm.
Human knowledge is the world's most valuable asset. And yet, despite being more interconnected than ever, most knowledge still remains stuck in our heads, inaccessible and underutilized. Our vision is to make human knowledge easily accessible and infinitely scalable by building tools for the new knowledge economy.
At first glance, Office Hours looks simple: search, match, connect, and pay. Under the hood, the system is anything but.
We're building and evolving a deeply interconnected platform spanning search, discovery, recommendations, data pipelines, logistics, payments, compliance, and performance. The entire stack has been built in-house, from expert profiles and discovery experiences to workflow automation and an underlying knowledge graph that ties everything together.
We're looking for a Staff Full Stack Software Engineer who enjoys working across the stack, takes ownership of complex problems, and cares deeply about building thoughtful, high-quality product experiences. This is a hands-on role with real influence over product direction, technical architecture, and how we ship software.
Own the design, implementation, and rollout of meaningful user-facing features, from problem definition through production
Partner closely with design, product, and client-facing teams to translate real user needs into shipped solutions
Architect, build, and evolve scalable, reliable systems across the front end, back end, and infrastructure
Set a high bar for code quality through clear implementations, thoughtful tradeoffs, and active participation in reviews and technical discussions
Explore and integrate modern tools, including AI-powered workflows, and share learnings that improve how the team builds and ships
8+ years of professional software engineering experience, with meaningful time spent working across the stack
A track record of shipping high-quality, user-facing products in production environments
Strong product intuition and the ability to translate ambiguous user or business problems into technical solutions
Comfort operating in fast-moving environments where priorities evolve and ownership matters
A bias toward action, paired with sound judgment and attention to detail
Back end: Node.js, TypeScript, MongoDB & Postgres, OpenSearch, Temporal
Front end: React, Next.js, Tailwind, shadcn
Infrastructure: AWS, Kubernetes, Docker, Datadog, Sentry
Workflow: GitHub, Slack, Notion, Figma, Linear, PostHog, Metabase
Competitive salary and equity
Medical, dental, and vision coverage
401(k)
Monthly wellness and fitness stipend
Paid time off policy, along with company holidays
Annual company off-sites (Tahoe, Mendocino, Mexico City, San Diego, Park City)
Parent-friendly policies, remote flexibility, and paid family leave
Pay Transparency Notice
Full-time offers include base salary, equity, and benefits.
Pay range: $225,000-$250,000, based on seniority and relevant experience
*This role can be 100% remote, but we do have offices in San Francisco and NYC.
Don't meet every single requirement? Studies have shown that some candidates, especially underrepresented groups such as women and people of color, are less likely to apply to jobs unless they meet every single qualification. At Office Hours we believe in building a diverse and inclusive workplace, so if you're excited about this role but don't meet every qualification in the job description, we still encourage you to apply. You could still be the right candidate for this or other roles at Office Hours!
Flock Safety is the leading safety technology platform, helping communities thrive by taking a proactive approach to crime prevention and security. Our hardware and software suite connects cities, law enforcement, businesses, schools, and neighborhoods in a nationwide public-private safety network. Trusted by over 5,000 communities, 4,500 law enforcement agencies, and 1,000 businesses, Flock delivers real-time intelligence while prioritizing privacy and responsible innovation.
Weâre a high-performance, low-ego team driven by urgency, collaboration, and bold thinking. Working at Flock means tackling big challenges, moving fast, and continuously improving. Itâs intense but deeply rewarding for those who want to make an impact.
With nearly $700M in venture funding and a $7.5B valuation, weâre scaling intentionally and seeking top talent to help build the impossible. If you value teamwork, ownership, and solving tough problems, Flock could be the place for you.
We're hiring a Senior Software Engineer to build Night Shift, a conversational AI assistant that helps investigators surface critical evidence and close cases faster. You'll design and implement the conversational interface, build the orchestration backend that manages LLM interactions and tool calling, and develop integration pipelines connecting our AI to Flock's existing data platform and APIs. This is a ground-floor opportunity where product thinking matters as much as technical execution: you'll shape chat experiences with complex context management, partner with platform teams to design new APIs or leverage existing ones, and solve the reliability challenges of deploying AI in high-stakes investigative workflows. You'll collaborate closely with ML engineers on prompt engineering and agentic workflows while maintaining a strong point of view on what makes a great user experience. If you've built LLM-powered products and thrive at the intersection of customer impact and technical depth, this role is for you.
Love for coding and continuous learning, especially in the rapidly evolving LLM space
Resourceful problem-solver mindset: excel in ambiguous situations and take initiative to define product direction
Strong TypeScript / Node / Express skills for web services and API design (REST, SSE, WebSockets for streaming)
Modern web framework expertise (React / TypeScript preferred), particularly for conversational UI and chat interfaces
Hands-on LLM experience: OpenAI/Anthropic/Gemini APIs, prompt engineering, streaming responses, and conversation context management
Familiarity with agentic patterns: function calling, tool use (MCP), and orchestrating multi-step workflows
API integration skills: consume existing APIs or design new ones to ground AI in investigative data
Database confidence: PostgreSQL and sophisticated SQL for data retrieval
Cloud infrastructure basics: Docker, Kubernetes (Helm), AWS services (S3, SQS, API Gateway)
Product-minded: translate user feedback into technical requirements and make pragmatic tradeoffs
Bonus points for: LLM evaluation tools (LangSmith, Langfuse), vector search/RAG, microservices architecture, or Terraform
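The agentic patterns named in the list above (function calling and tool use orchestrated in a loop) can be sketched roughly as follows. This is an illustrative stand-in, not Flock's actual system: the tool registry, the `lookup_plate` tool, and the model interface are all hypothetical, and a real implementation would call a provider API (OpenAI, Anthropic, etc.) and validate tool arguments against a schema.

```python
import json

# Hypothetical tool registry: tool name -> callable. A real system would
# validate arguments against a JSON schema before dispatching.
TOOLS = {
    "lookup_plate": lambda plate: {"plate": plate, "sightings": 3},
}

def run_agent_loop(model, messages, max_steps=5):
    """Call the model; whenever it requests a tool, execute the tool,
    append the result to the conversation, and loop until the model
    returns a final answer (or the step budget is exhausted)."""
    for _ in range(max_steps):
        reply = model(messages)  # stand-in for an LLM API call
        if reply.get("tool_call") is None:
            return reply["content"]  # final answer, no more tools needed
        name = reply["tool_call"]["name"]
        args = reply["tool_call"]["arguments"]
        result = TOOLS[name](**args)
        # Feed the tool result back so the model can ground its answer.
        messages.append({"role": "tool", "name": name,
                         "content": json.dumps(result)})
    raise RuntimeError("agent did not produce a final answer")

# Usage with a fake model: request a tool once, then answer.
def fake_model(messages):
    if not any(m.get("role") == "tool" for m in messages):
        return {"tool_call": {"name": "lookup_plate",
                              "arguments": {"plate": "ABC123"}}}
    return {"tool_call": None, "content": "3 sightings found"}
```

The key design point the role alludes to is that the loop, not the model, owns reliability: step budgets, argument validation, and error handling around each tool call are what make this safe in high-stakes workflows.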
The First 30 Days
Onboard and Integrate:
Familiarize yourself with Flock's mission, investigative workflows, and how customers use our platform today
Pair with engineers across Cloud Software and ML teams to understand existing APIs, data models, and system architecture
Build relationships with key stakeholders to understand their capabilities and constraints. Meet with members of:
Machine Learning (agentic systems, model serving)
Data Engineering (investigative datasets, pipelines)
Platform teams (APIs, infrastructure)
Product and Design (customer needs, UX direction)
Ship Early and Learn:
Complete a first-day push to production
Pick up initial sprint tickets: bug fixes, small UX improvements, or API integrations
Participate in customer feedback sessions to understand investigator workflows and pain points
The First 60 Days
Build the Foundation:
Deliver core conversational UI components and establish patterns for chat interfaces
Implement backend orchestration for LLM interactions and tool calling
Stand up observability for the AI system (logging, tracing, basic metrics)
Work with ML team to integrate agentic workflows and refine prompt strategies
Demonstrate Velocity:
Own end-to-end features that connect UI, backend orchestration, and data integrations
Collaborate with Product to rapidly iterate based on early user testing
Propose technical improvements to chat quality, performance, or reliability
90 Days & Beyond
Drive Product Impact:
Lead development of a core Night Shift capability that demonstrably improves investigator efficiency
Represent the team in cross-functional initiatives, balancing zero-to-one experimentation with engineering best practices
Establish patterns for testing and quality in an evolving AI product
Shape the Direction:
Influence product roadmap through technical insights and customer feedback
Mentor team members on LLM integration patterns or full-stack best practices
Own a domain area (e.g., conversation management, data grounding, streaming architecture)
We want our interview process to be a true reflection of our culture: transparent and collaborative. Throughout the interview process, your recruiter will guide you through the next steps and ensure you feel prepared every step of the way. To check out our interview stages and how you should prepare visit experiences on our careers page.
In this role, youâll receive a starting salary of $170,000-$185,000 as well as stock options. Base salary is determined by job-related experience, education/training, as well as market indicators. Your recruiter will discuss this in-depth with you during our first chat.
Flexible PTO: We seriously mean it, plus 11 company holidays.
Fully paid health benefits plan for employees, including Medical, Dental, and Vision, and an HSA match.
Family Leave: All employees receive 12 weeks of 100% paid parental leave. Birthing parents are eligible for an additional 6-8 weeks of physical recovery time.
Fertility & Family Benefits: We have partnered with Maven, a complete digital health benefit for starting and raising a family. Flock will provide a $50,000 lifetime maximum benefit related to eligible adoption, surrogacy, or fertility expenses.
Spring Health: Spring Health offers a variety of mental health benefits, including therapy, coaching, medication management, and digital tools, all tailored to each individual's needs.
Caregiver Support: We have partnered with Cariloop to provide our employees with caregiver support.
Carta Tax Advisor: Employees receive 1:1 sessions with Equity Tax Advisors who can address individual grants, model tax scenarios, and answer general questions.
ERGs: We want all employees to thrive and feel like they belong at Flock. We offer three ERGs today: Women of Flock, Flock Proud, and Melanin Motion. If you are interested in talking to a representative from one of these, please let your recruiter know.
WFH Stipend: $150 per month to cover the costs of working from home.
Productivity Stipend: $300 per year to use on Audible, Calm, Masterclass, Duolingo, Grammarly, and so much more.
Home Office Stipend: A one-time $750 to help you create your dream office.
If an offer is extended and accepted, this position requires the ability to obtain and maintain Criminal Justice Information Services (CJIS) certification as a condition of employment. Applicants must meet all FBI CJIS Security Policy requirements, including a fingerprint-based background check.
Flock is an equal opportunity employer. We celebrate diverse backgrounds and thoughts and welcome everyone to apply for employment with us. We are committed to fostering an environment that is inclusive, transparent, and collaborative. Mutual respect is central to how Flock operates, and we believe the best solutions come from diverse perspectives, experiences, and skills. We embrace our differences and know that we are stronger working together.
If you need assistance or an accommodation due to a disability, please email us at recruiting@flocksafety.com. This information will be treated as confidential and used only to determine an appropriate accommodation for the interview process.
At Flock Safety, we compensate our employees fairly for their work. Base salary is determined by job-related experience, education/training, as well as market indicators. The range above is representative of base salary only and does not include equity, sales bonus plans (when applicable) and benefits. This range may be modified in the future. This job posting may span more than one career level.
At ARB Interactive, creativity, tech, and play collide. Founded in 2022, we've grown to nearly 200 team members and were named one of LinkedIn's "2025 Top 50 Startups in the United States"! We move fast, think big, and love bold ideas that push boundaries (and buttons). From new rewards to fresh game mechanics, every challenge is a chance to innovate and have fun doing it. Our culture is collaborative, curious, and full of laughter, because great ideas grow best between coffee, code, and a few epic high-fives.
We're looking for a Senior Data Engineer to help shape and expand the foundation of our modern data stack. This is a hands-on role for someone who's excited to build and improve robust, scalable pipelines and collaborate cross-functionally to turn raw data into business-critical insights.
As a senior member of the team, you'll play a key role in technical decision-making, partnering closely with analytics, engineering, product, and other talented and collaborative teammates to help ensure our systems scale with the business. If you're passionate about solving complex, real-world data challenges that move the needle in a high-growth environment, this role offers the perfect blend of technical challenge and meaningful impact.
This is a great opportunity for someone who thrives on hands-on execution but also enjoys mentoring others, guiding architectural decisions, and helping shape the future of the data function.
Design, build, and maintain scalable, efficient ETL/ELT pipelines
Model clean, trusted datasets to support analytics, experimentation, and reporting
Optimize our data infrastructure for performance, cost, governance, and maintainability
Partner with data analysts and product teams to improve data accessibility and accuracy
Enable self-service analytics by designing intuitive data models and comprehensive documentation
Implement robust data quality frameworks, monitoring, alerting, and observability to ensure data reliability
Collaborate with product and engineering on instrumentation of new product features and events
Mentor junior team members, contribute to code reviews, and share best practices
Influence the long-term direction of our data architecture and tooling
Take on team leadership or people management responsibilities as the team scales
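The data-quality work described in the responsibilities above often starts with simple gates that run before a dataset is published downstream. As a rough illustration only (ARB's actual tooling is not specified in the posting; in practice this kind of check is frequently expressed as dbt tests or a framework like Great Expectations), a minimal check might look like:

```python
def check_dataset(rows, required_fields, unique_key):
    """Minimal data-quality gate: flag missing required fields and
    duplicate keys in a batch of rows, returning a list of error
    messages (empty list means the batch passes)."""
    errors = []
    seen = set()
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                errors.append(f"row {i}: missing {field}")
        key = row.get(unique_key)
        if key in seen:
            errors.append(f"row {i}: duplicate {unique_key}={key}")
        seen.add(key)
    return errors

# Hypothetical event rows: the second row is both missing its event
# name and a duplicate of the first row's key.
sample = [{"id": 1, "event": "spin"}, {"id": 1, "event": ""}]
issues = check_dataset(sample, required_fields=["event"], unique_key="id")
```

Wiring checks like this into orchestration (e.g., failing an Airflow task when `issues` is non-empty) is what turns a one-off script into the monitoring and alerting the role calls for.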
5+ years of experience in data engineering or related roles
Strong SQL and Python skills, with a focus on readable and efficient code
Deep understanding of data warehousing concepts and data modeling best practices
Hands-on experience with tools in the modern data stack (e.g., dbt, Airflow, Snowflake, BigQuery, Redshift)
Strong communication and collaboration skills; able to work cross-functionally with analysts, PMs, and engineers
A bias toward action and ownership; you thrive in fast-paced, high-autonomy environments
Experience in gaming, entertainment, or high-volume consumer applications
Familiarity with event tracking platforms (e.g., Segment, Amplitude)
Experience hiring or onboarding engineers in a high-growth environment
Diversity Commitment: We are focused on building a diverse and inclusive team. We welcome people of all backgrounds, experiences, abilities, and perspectives and are an equal opportunity employer. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
Important Security Notice: Our recruitment team will only contact candidates through official channels using @arbinteractive.com email addresses and via our recruiting platform, Ashby. If you find a position on a third party careers page (LinkedIn, Indeed, etc.), the job posting will redirect you to our careers page (https://jobs.ashbyhq.com/arb-interactive) to begin your application. We will never request payment, banking information, or personal identification details during the application process.
If you're ever uncertain about the legitimacy of communication claiming to be from our company, please forward it to recruiting@arbinteractive.com for verification before responding or clicking any links.
We believe communication belongs to everyone. We exist to democratize phone service. TextNow is evolving the way the world connects, and that's because we're made up of people with curious minds who bring an optimistic yet critical lens into the work we do. We're the largest provider of free phone service in the nation. And we're just getting started.
Joinâ¯us in our mission to break down barriers to communication and free the flow of conversation for people everywhere.
TextNow is looking for an experienced Data Engineer with hands-on experience designing and developing data platforms. You will own the design, development, and maintenance of TextNow's data platform, enabling us to make effective data-informed decisions. You will be part of cross-functional efforts to build scalable and reliable frameworks that support all of TextNow's business and data products. In this role, you can interact with different functional areas within the business and influence decision-making in a fast-growing mobile communications start-up.
More about TextNow...
Our Values:
· Customer Obsessed (We strive to have a deep understanding of our customers)
· Do Right By Our People (We treat each other with fairness, respect, and integrity)
· Accept the Challenge (We adopt a "Yes, We Can" mindset to achieve ambitious goals)
· Act Like an Owner (We treat this company like it's our own... because it is!)
· Give a Damn! (We are deeply committed and passionate about our work and achieving results)
Benefits, Culture, & More:
· Strong work-life blend
· Flexible work arrangements (wfh, remote, or access to one of our office spaces)
· Employee Stock Options
· Unlimited vacation
· Competitive pay and benefits
· Parental leave
· Benefits for both physical and mental well being (wellness credit and L&D credit)
· We travel a few times a year for various team events, company wide off-sites, and more
Diversity and Inclusion:
At TextNow, our mission is built around inclusion and offering a service for EVERYONE, in an industry that traditionally only caters to the few who have the means to afford it. We believe that diversity of thought and inclusion of others promotes a greater feeling of belonging and higher levels of engagement. We know that if we work together, we can do amazing things, and that our differences are what make our product and company great.
TextNow Candidate Policy
By submitting an application to TextNow, you agree to the collection, use, and disclosure of your personal information in accordance with the TextNow Candidate Policy.
We believe communication belongs to everyone. We exist to democratize phone service. TextNow is evolving the way the world connects and that's because we're made up of people with curious minds who bring an optimistic, yet critical lens into the work we do. We're the largest provider of free phone service in the nation. And we're just getting started.
Join us in our mission to break down barriers to communication and free the flow of conversation for people everywhere.
TextNow is looking for a motivated Senior Data Analyst to join our Analytics & Insights team. You'll drive data-informed decision-making across the organization by translating business problems into analytical solutions, designing insightful dashboards, and uncovering trends that shape strategic actions.
This role is perfect for someone with strong analytical skills, deep business acumen, and a passion for using data to tell stories that inspire action.
What You'll Do
Analyze complex datasets to identify actionable insights, trends, and opportunities
Develop and maintain dashboards, reports, and data visualizations using tools like Looker, Tableau, Power BI, or Redash
Conduct ad hoc analyses to support product, marketing, and operations initiatives
Partner with data engineering teams to ensure data quality, integrity, and availability
Develop and maintain KPI frameworks and performance measurement systems
Assist in building scalable data models and automation pipelines
Collaborate cross-functionally with Product, Finance, Marketing, and Operations teams to define analytical needs
Translate business questions into data requirements and present insights and recommendations to senior leadership
Mentor junior analysts and foster a culture of data-driven decision-making
Define and standardize analytical best practices across the organization
You'll Be a Great Fit If You Have
Bachelor's degree in Data Science, Statistics, Mathematics, Economics, Computer Science, or a related field (Master's preferred)
5+ years of experience in data analytics or business intelligence
Proficiency in SQL and at least one programming language (e.g., Python or R)
Experience with modern BI tools (Looker, Tableau, Power BI, Mode, or Redash)
Strong understanding of A/B testing, statistical analysis, and data modeling
Experience working with large-scale datasets and cloud-based environments (e.g., Snowflake, Eppo)
Excellent communication and storytelling skills with data
Attention to detail, analytical rigor, and curiosity for continuous improvement
Preferred Skills
Experience in telecommunications, SaaS, or consumer app environments
Familiarity with machine learning concepts and predictive analytics
Understanding of ETL processes and data warehousing fundamentals
Experience collaborating with product teams on experimentation and growth analytics
Estimated Base Salary Range by Location:
Canada (CAD): $103,700 – $140,300
US – National (USD): $114,800 – $155,300
Final compensation will be determined based on a number of factors, including skills, experience, location, and on-the-job performance. We're committed to paying competitively to hire and retain high-caliber talent. We recognize that exceptional talent may fall outside of these ranges; we encourage all qualified candidates to apply even if their compensation expectations are outside of the listed range.
More about TextNow...
Our Values:
· Customer Obsessed (We strive to have a deep understanding of our customers)
· Do Right By Our People (We treat each other with fairness, respect, and integrity)
· Accept the Challenge (We adopt a "Yes, We Can" mindset to achieve ambitious goals)
· Act Like an Owner (We treat this company like it's our own... because it is!)
· Give a Damn! (We are deeply committed and passionate about our work and achieving results)
Benefits, Culture, & More:
· Strong work life blend
· Flexible work arrangements (wfh, remote, or access to one of our office spaces)
· Employee Stock Options
· Unlimited vacation
· Competitive pay and benefits
· Parental leave
· Benefits for both physical and mental well-being (wellness credit and L&D credit)
· We travel a few times a year for various team events, company wide off-sites, and more
Diversity and Inclusion:
At TextNow, our mission is built around inclusion and offering a service for EVERYONE, in an industry that traditionally only caters to the few who have the means to afford it. We believe that diversity of thought and inclusion of others promotes a greater feeling of belonging and higher levels of engagement. We know that if we work together, we can do amazing things, and that our differences are what make our product and company great.
TextNow Candidate Policy
By submitting an application to TextNow, you agree to the collection, use, and disclosure of your personal information in accordance with the TextNow Candidate Policy
The Data team at Loop is on a mission to empower merchants with transformative data products that drive success beyond returns. By building tools that merchants love and fostering a robust data culture, the team enables smarter decision-making across the board. Whether creating insights to guide merchants' strategies or strengthening internal data-driven processes, the Data team is integral to shaping Loop's future and unlocking new opportunities for our merchants and teams alike.
As a Data Engineer at Loop, you'll have the chance to significantly impact our ability to solve merchant problems and fulfill merchant needs. You'll be an integral member of the team, owning all aspects of data availability, quality, and ease of use of our data platforms. Your success in this role will depend on a healthy blend of creativity and structure, with a continuous focus on delivering value to the business.
At Loop, we're intentional about the way we work so that we can do our best work. We call this our Blended Working Environment. We work from our HQ in Columbus, OH, or one of our Hub or Secluded locations, and are distributed throughout the United States, select Canadian provinces, and the United Kingdom. For this position, we're looking for someone to join us in a location where we already have an established Hub or HQ.
Our data stack: Snowflake, Fivetran, dbt, GoodData, Secoda
Loop Story
Commerce should feel effortless. Every product adored, every order perfect, every customer loyal for life. But reality is messier: operations get tangled, margins grow thin, and trust is fragile. That's where Loop steps in. We create confidence where commerce fails.
We started by fixing returns and exchanges. Today, we're building a connected commerce operations suite, powering everything from order tracking to fraud prevention, with hundreds of innovations in between. Grounded in data and insight, our platform helps merchants make smarter decisions with every transaction. Over 5,000 of the world's most loved brands trust Loop to turn cost centers into growth engines. Our mission is simple: protect margins, delight customers, and help merchants build businesses that last.
Life at Loop is rooted in our core values. We balance high empathy with high standards, knowing that work is better when we can show up authentically and resilience is built by facing challenges head-on. We expect you'll grow quickly, learning skills that last far beyond your time here. Loop is a formative chapter in your career: a chance to shape the future of commerce and to leave better than when you arrived.
Learn more about us here: https://loopreturns.com/careers.
You can review our privacy notice here.
At Spotify, we're building the revenue platform that drives how revenue and taxes are processed across the company, enabling reliable, scalable financial operations across every market, product line, and partner. Our systems are essential to Spotify's ability to earn, track, and report revenue and taxes, supporting everything from subscriptions and advertising to creator payouts.
As engineers on this team, we design and maintain the backend and data platform capabilities that power millions of transactions each day with precision. We build services that handle tax calculations, produce compliant financial records, and support regulatory requirements across global markets, all while staying agile to keep up with Spotify's evolving business models. We equip Finance teams with flexible, configurable tools that govern how revenue and taxes are applied across products, enabling rapid adjustments without needing deep technical expertise. Our modular, process-oriented components simplify the development, maintenance, and scaling of the critical Order to Cash enterprise process that underpins Spotify's financial operations.
Spotify is an equal opportunity employer. You are welcome at Spotify for who you are, no matter where you come from, what you look like, or what's playing in your headphones. Our platform is for everyone, and so is our workplace. The more voices we have represented and amplified in our business, the more we will all thrive, contribute, and be forward-thinking! So bring us your personal experience, your perspectives, and your background. It's in our differences that we will find the power to keep revolutionizing the way the world listens.
At Spotify, we are passionate about inclusivity and making sure our entire recruitment process is accessible to everyone. We have ways to request reasonable accommodations during the interview process and help assist in what you need. If you need accommodations at any stage of the application or interview process, please let us know - we're here to support you in any way we can.
Spotify transformed music listening forever when we launched in 2008. Our mission is to unlock the potential of human creativity by giving a million creative artists the opportunity to live off their art and billions of fans the chance to enjoy and be passionate about these creators. Everything we do is driven by our love for music and podcasting. Today, we are the world's most popular audio streaming subscription service.
Who We Are and What We are Doing:
Ethena Labs is actively building and deploying a suite of groundbreaking digital dollar products aiming to upgrade money into the internet era.
Our flagship product, USDe, is a synthetic dollar backed by digital assets, and takes the novel approach of using a delta-neutral hedged basis strategy to maintain its peg. This product scaled from zero to $15b in 18 months.
Expanding on this, iUSDe is designed specifically for traditional financial institutions, incorporating necessary compliance features to enable them to access the crypto-native rewards our protocol generates, in an institutional-friendly manner.
Ethena has also developed USDtb: a fiat-backed, GENIUS-compliant stablecoin built in partnership with BlackRock, which has scaled to ~$2b.
These products are also offered in a whitelabel stablecoin offering where any application, chain, wallet or exchange can launch their own stablecoin on Ethena's back-end infrastructure.
Through these offerings, Ethena Labs is not just creating new financial products; we are building the foundational infrastructure for a more open, efficient, and interconnected global financial system.
Open job offerings will be focused on two new major product lines coming to market in the next few months.
Join us!!
The Senior Data Engineer is a critical role reporting directly to the CTO. The primary mission is to rapidly deliver a reliable, production-ready market data platform that serves as the single source of truth for trading, risk, and business intelligence.
You'll immediately own the entire data platform from inception and deliver working historical and real-time Tardis pipelines in the first 60 days. Beyond the initial MVP, the role requires iteratively evolving the platform into a best-in-class, cloud-native, observable, and self-service system. You will work hand in hand with the CTO and trading team to scope and deliver to business needs. The Senior Data Engineer will also serve as the go-to data expert for the firm and will be responsible for mentoring future junior data engineers or analysts.
Why Ethena Labs?
You'd be joining a group that has firmly established itself as one of the most successful crypto-native companies of all time, a group with a mission to revolutionise decentralised finance and its position in global finance.
Work alongside a passionate and innovative team that values collaboration and creativity.
Enjoy a flexible, remote-friendly work environment with established opportunities for personal growth and learning.
If you subscribe to the mission of separating the dollar from the state, then we want to hear from you!
We look forward to receiving your application and will be in touch after having a chance to review.
In the meantime, here are some links to more information about Ethena Labs to help you check us out:
Sayari is a risk intelligence provider that equips the public and private sectors with immediate visibility into complex commercial relationships by delivering the largest commercially available collection of corporate and trade data from over 250 jurisdictions worldwide. Sayari's solutions enable risk resilience, mission-critical investigations, and better economic decisions.
Headquartered in Washington, D.C., its solutions are trusted by Fortune 500 companies, financial institutions, and government agencies, and are used globally by thousands of users in over 35 countries. Funded by world-class investors, with a strategic $228 million investment by TPG Inc. (NASDAQ: TPG) in 2024, Sayari has been recognized by the Inc. 5000 and the Deloitte Technology Fast 500 as one of the fastest growing private companies in the United States and was featured as one of Inc.'s "Best Workplaces" for 2025.
POSITION DESCRIPTION
Sayari is looking for an Entry-Level Data Engineer to join our Data team located in Washington, DC. The Data team is an integral part of our Engineering division and works closely with our Software & Product teams, as well as other key stakeholders across the business.
JOB RESPONSIBILITIES:
SKILLS & EXPERIENCE
Please mention the word **HARMLESS** and tag RMTU3LjI0NS4yNDcuMTE4 when applying to show you read the job post completely (#RMTU3LjI0NS4yNDcuMTE4). This is a beta feature to avoid spam applicants. Companies can search these words to find applicants that read this and see they're human.
Arbol is a global climate risk coverage platform and FinTech company offering full-service solutions for any business looking to analyze and mitigate exposure to climate risk. Arbol's products offer parametric coverage, which pays out based on objective data triggers rather than subjective assessment of loss. Arbol's key differentiator versus traditional InsurTech or climate analytics platforms is the complete ecosystem it has built to address climate risk. This ecosystem includes a massive climate data infrastructure, scalable product development, automated, instant pricing using an artificial intelligence underwriter, blockchain-powered operational efficiencies, and non-traditional risk capacity bringing capital from non-insurance sources. By combining all these factors, Arbol brings scale, transparency, and efficiency to parametric coverage.
In this role, you will research, develop, and apply machine learning tools to model and price climate and weather risk. You will work with diverse weather and geospatial datasets covering a suite of phenomena, from traditional weather-station readings of temperature and precipitation, to radar measurements of hail stone sizes, to satellite indices of vegetation content. You will learn how to use our existing catalog of pricing and modeling tools, engage in their improvement and maintenance, and develop new methodologies. We are open to a range of experience levels for this position.
About the Team
The analytics team is responsible for making sense of the terabytes of data Arbol has at its disposal. It forms the connective tissue between more client-facing teams, such as sales, and back-end roles like data engineering. You'll be joining a small team of data scientists and researchers and will have a unique opportunity to impact many levels of the firm. This is an ideal position for someone interested in building machine learning systems while taking a deep dive into the insurance industry.
Essential Job Functions & Physical Requirements
Ability to sit for extended periods of time while working at a computer, with or without reasonable accommodation
Ability to use a computer, keyboard, mouse, and standard office equipment (e.g., phone, printer, scanner)
Ability to view a computer screen for prolonged periods, with or without reasonable accommodation
Ability to communicate effectively in person, by phone, and via email
Ability to occasionally stand, walk, bend, and reach within an office environment
Ability to lift and/or move up to 10–15 pounds occasionally (e.g., office supplies, files), with or without reasonable accommodation
Ability to perform repetitive motions, such as typing or data entry
Ability to maintain focus and attention while performing detailed tasks
Interested, but you don't meet every qualification? Please apply!
Arbol values the perspectives and experience of candidates with non-traditional backgrounds and we encourage you to apply even if you do not meet every requirement.
Accessibility
Arbol is committed to accessibility and inclusivity in the hiring process. As part of this commitment, we strive to provide reasonable accommodations for persons with disabilities to enable them to access the hiring process. If you require an accommodation to apply or interview, please contact hr@arbol.io
Benefits
Arbol is proud to offer its full-time employees competitive compensation and equity in a high-growth startup. Our health benefits include comprehensive health, dental, and vision coverage, and an optional flexible spending account (FSA) to support your health. We offer a 401(k) match to support your future, and flexible PTO for you to relax and recharge.
Equal Opportunity Employer
Arbol is an Equal Opportunity Employer and does not discriminate on the basis of race, color, religion, sex (including pregnancy, sexual orientation, or gender identity), national origin, age, disability, veteran status, or any other legally protected status.
Arbol participates in the E-Verify program to confirm employment eligibility.
Data Engineering Intern
At RefinedScience, our mission is to advance care by bringing together the best science, data, and minds: disease by disease, patient by patient, cell by cell, to discover pathways to life beyond disease.
WHAT WE ARE LOOKING FOR
We are seeking a motivated Data Engineering Intern to join our team. This internship is open to undergraduate and graduate students who are interested in building data infrastructure that supports advanced analytics, data science, and AI-driven insights in healthcare and life sciences.
You will work closely with data scientists, bioinformaticians, and engineers to help design, build, and improve data pipelines and platforms that power RefinedScience's research and analytics initiatives.
KEY ACTIVITIES
MUST HAVES
At Satelligence we're looking for a Jr. Data Engineer to join our team.
We are looking for a Junior Data Engineer:
Employment type: 32–40h/week
Location: Utrecht, NL (hybrid)
Experience: Junior–Medior level
Salary: €48,000 – €60,000 gross/year (including 8% holiday allowance, based on 40h/week)
About the job
As a Data Engineer, your main responsibility is building out the capabilities of our (geo)data query engine. You'll be part of the data engineering team, which develops and maintains our satellite data processing engine, our geospatial storage and query engine, and a set of internal tools used mainly by our OPS team. Our tech stack is Python, Django, and PostGIS, deployed on Google Cloud services like GKE and Cloud Functions. This role reports to the Engineering Lead.
What will you do?
You'll be instrumental in empowering our product teams to develop and deploy features that help our clients reach their sustainability targets. You'll ensure the reliability, scalability, and performance of our cloud-based data platform, enabling us to deliver critical environmental intelligence through our API. Your work will directly contribute to:
Building and maintaining scalable infrastructure on GCP using infrastructure-as-code tools like Terraform
Optimizing data pipelines for processing and storing massive datasets (ETL, OLAP)
Developing and managing APIs for efficient data dissemination
Implementing data engineering best practices for data quality, security, and performance
Collaborating closely with product teams to understand their needs and provide technical guidance
Contributing to the design and implementation of data storage solutions using databases like PostgreSQL
Monitoring and troubleshooting platform performance and ensuring high availability
About you
You are an experienced Python developer
You are experienced with RDBMS, especially PostgreSQL
You are familiar with Django
You prefer a well organized codebase over getting your pull requests merged fast
Nice to have
You are experienced with Infrastructure as Code tools such as Terraform
You have experience with Google Cloud (Cloud SQL, Cloud Composer, Kubernetes)
You worked with PostGIS before or bring other experience with geospatial data
What we offer you:
Office centrally located in Utrecht city (with direct access via bus 8 or a 20-minute walk from Utrecht Central Station)
27 holidays (based on full-time employment)
Solid pension scheme with employer contribution
NS Business Card for employees commuting from outside Utrecht
Laptop and necessary IT equipment provided
Additional income protection in case of long-term illness or disability, complementing the statutory coverage
Daily lunch, fruits, and Aroma Club coffee at the office
Not the main reason to join, but definitely a fun one: Annual Team Week, after-summer drinks with friends and family, and a festive Christmas celebration.
Meet Satelligence!
Satelligence is the market leader in remote sensing technology for sustainable sourcing, with the mission to halt deforestation. We provide traders, manufacturers, and agribusinesses such as Mondelez, Bunge, Cargill, Unilever, and Rabobank with critical sustainability insights, empowering them to minimize their global environmental footprint and track their progress against climate objectives, ensuring a sustainable supply chain. We were founded in 2016 and currently employ 40+ people, working in Utrecht and several locations in Asia, Africa, and South America.
Apply for the job
Do you want to join our team as our new Junior Data Engineer? Then we'd love to hear from you!
Please mention the word **FAIR** and tag RMTU3LjI0NS4yNDcuMTE4 when applying to show you read the job post completely (#RMTU3LjI0NS4yNDcuMTE4). This is a beta feature to avoid spam applicants. Companies can search these words to find applicants that read this and see they're human.
Job Title: Sr Software Engineer
Department: Product Engineering
Position Description:
The Sr Software Engineer will be working with other engineers, architects, and product managers to develop software on our philanthropic solutions software platform. This person must be self-motivated and results-oriented with strong programming skills across modern enterprise software architectures. The Sr Software Engineer is expected to work well in an agile development environment to mentor and develop those around them and build superior products.
Duties & Responsibilities:
Our Mission
At Big Health, our mission is to help millions back to good mental health by providing fully digital, non-drug options for the most common mental health conditions. Our FDA-cleared digital therapeutics, SleepioRx for insomnia and DaylightRx for anxiety, guide patients through first-line recommended, evidence-based cognitive and behavioral therapy anytime, anywhere. Our digital program, Spark Direct, helps to reduce the impact of persistent depressive symptoms.
In pursuit of our mission, we've pioneered the first at-scale digital therapeutic business model in partnership with some of the most prominent global healthcare organizations, including leading Fortune 500 healthcare companies and Scotland's NHS. Through product innovation, robust clinical evaluation, and a commitment to equity at scale, we are designing the next generation of medicine and the future of mental health care.
Our Vision
Over the next 5-10 years, we believe digital therapeutics will transform the delivery of healthcare worldwide by providing access to safe and effective evidence-based treatments. Big Health is positioned to take the lead in this transformation.
Big Health is a remote-first company, and this role can be based anywhere in the US.
Join Us
We're seeking a Product Data Analyst contractor to drive data-informed product decisions by improving our data democratization, analyzing data, generating insights, and generating reports. You'll partner closely with product, growth, enrollment marketing, and client implementation teams to understand user behavior, measure product performance, and identify opportunities for growth and improvement.
We at Big Health are on a mission to bring millions back to good mental health; in order to do so, we need to reflect the diversity of those we intend to serve. We're an equal opportunity employer dedicated to building a culturally and experientially diverse team that leads with empathy and respect. Additionally, we will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of the San Francisco Fair Chance Ordinance.
Big Health participates in E-Verify for all new hires in the United States.
Restaurant365 is a SaaS company disrupting the restaurant industry! Our cloud-based platform provides a unique, centralized solution for accounting and back-office operations for restaurants. Restaurant365's culture is focused on empowering team members to produce top-notch results while elevating their skills. We're constantly evolving and improving to make sure we are and always will be "Best in Class" ... and we want that for you too!
Restaurant365 is seeking a Data Analyst to join our Enterprise Data Analytics team. This role supports business teams across the organization by helping turn data into insights that inform day-to-day decisions and longer-term planning.
As a Data Analyst, you will partner with stakeholders to understand business questions, support reporting needs, and help maintain dashboards and KPIs. You'll work within established data models and governance practices while continuing to build your technical and business analysis skills. This role is ideal for someone who enjoys working with data, learning the business, and growing into a strong analytics partner over time.
DYN365, Inc d/b/a Restaurant365 is an equal opportunity employer.
We are one of the largest private banks in Brazil, according to the Central Bank's ranking. And we are very proud to say that, for the second consecutive year, we were recognized as the best financial institution to work for in Brazil, according to the 2025 GPTW ranking! We also received the Diversity seal in the Women category, reinforcing our commitment to equity.
Our culture is lived for real: being simple, honest, collaborative, and courageous. We value relationships, innovation, and a light, increasingly collaborative environment, with intentionality in advancing diversity and inclusion.
We are constantly evolving, and we build successful #partnerships to deliver on our purpose of making the financial lives of people and companies easier.
Sound like you? Then come work with us!
Diversity and inclusion
BV works intentionally to accelerate equity and representation in the financial market, respecting and supporting diversity in all its plurality and intersectionality, ensuring positive social transformation.
That is why we invite Black people, women, professionals with disabilities, the LGBTQIA+ community, and people of any age to get to know us a little better and apply for this position.
OUR ORIGIN STORY
In 2011, SkySlope started as an idea born at the kitchen table of our CEO, with just him and two others. Headquartered in Sacramento, California, we have since grown out of our previous 3 offices, and many of our close to 150 employees are spread all across the United States. Those 150 employees support close to 300,000 users across 5,000 offices nationwide, and now in Canada as well. That includes 8 of the 15 largest Real Estate brokerages in the nation.
But, despite being happy with what we've achieved, we know that as industry leaders in our space there's a lot of work left to be done. All of the growth and success that has happened is a result of us obsessing over building cutting-edge software that makes the Real Estate world a better place. We know this only happens by hiring people who don't just come up with out-of-the-box ideas, but who actually see those ideas through and bring them to life. As we've grown, we've been fortunate enough to hire plenty of people who possess that quality, and we realize it's equally important to hire people who can pair that skill with empathy, collaboration, and a keen sense of urgency. If you're looking to join a company where you can have real impact and surround yourself with an incredible team of people, then look no further.
SKYSLOPE'S CORE VALUES
These are the principles that helped us get to where we are and they are the principles that will guide us to where we want to go in the future. You can apply them to your professional life, your personal life, to any business and any situation. In no specific hierarchy, our core values are:
Awareness | Execution | Obsession | Ownership | Humility | Radical Candor | Urgency | Greatness | Inches | Fun
Learn more about our core values from our CEO, Tyler Smith here!
About the role: We are looking for a Data Analyst III to join our team and to help elevate the way we leverage data across the organization. While this role includes traditional data retrieval and reporting, we're looking for someone who goes beyond fulfilling requests: someone who proactively identifies trends, surfaces insights, and brings forward recommendations that help teams make better decisions before they even know to ask. Experience or curiosity around AI-assisted analytics is a plus, but this is first and foremost a strong data analyst role.
Medical Insurance – Company pays a flat dollar amount towards the premium
There are 3 plan options
Our Medical Insurance plans are provided through United Healthcare
The United Healthcare HMO is only offered to California residents
Eligibility begins 1st of the month following date of hire
Per Paycheck (24 pay periods a year)
Employee costs per tier are as follows:
UHC HDHP/HSA
Employee Only $58.92
UHC PPO
Employee Only $104.10
UHC HMO (CA residents only)
Employee Only $84.56
Dental Insurance – Company pays 75% of the monthly premium on the Base Plan only
This PPO plan is administered through Principal
Eligibility begins 1st of the month following date of hire
Principal Dental Base Plan
Employee Only $4.19
Principal Dental Buy-Up Plan
Employee Only $6.65
Vision Insurance – Company pays 100% of the monthly premium
This plan is administered through Principal (VSP choice network)
Eligibility begins 1st of the month following date of hire
Basic Life and AD&D Insurance (with additional Voluntary Plans available) – Company-paid plan with a guarantee issue amount of $25,000.
Plan is administered through Principal
Eligibility begins 1st of the month following date of hire
Pricing varies for additional coverage, based upon age, coverage and dependent classification
Voluntary Short & Long Term Disability Insurance Plans – Optional plans to help protect your financial well-being.
Plan is administered through Principal
Eligibility begins 1st of the month following date of hire
Pricing varies, based upon age
Voluntary Accident Insurance – Optional plans available to purchase that pay you a cash benefit to help with your expenses if you or a covered family member is injured due to an accident.
Employee Only $4.39
Voluntary Hospital Indemnity – Optional plans available to purchase that pay you a cash benefit to help with your expenses if you or a covered family member is admitted to the hospital.
Employee Only $6.85
Voluntary Critical Illness – Optional plans available to purchase to help with your expenses if you or a covered family member is diagnosed with a covered critical illness.
Pricing varies, based upon age
Flexible Spending Account – A tax savings account you put money into that you use to pay for certain out-of-pocket health care and dependent care costs.
Plan is administered through Discovery Benefits
Eligibility begins 1st of the month following date of hire, if you sign up by the 25th of the month
Health Savings Account (HSA) – A tax savings account for employees enrolled in a High Deductible Health Plan. You can put money into this account to pay for certain out-of-pocket health care costs.
Plan is administered through Discovery Benefits
Eligibility begins 1st of the month following date of hire, if you sign up by the 25th of the month
Must be enrolled in the UHC HDHP/HSA medical plan with SkySlope to be eligible
SkySlope contributes $300 to an individual HSA and $600 to a family HSA
401(k) Plan – Company will match $0.50 on each $1.00 contributed, up to the first 6% of eligible earnings
Plan is administered through Principal
Eligibility begins first pay date after 90 days of employment
Auto-enrollment after eligibility at 3% of gross annual earnings
Defer between 1% and 40% of eligible contribution
Employee Stock Purchase Plan – Company match equal to 33.3333% of dollars contributed to the plan, based upon the average purchase price for the quarter.
Plan administered through Fidelity
Eligibility begins first pay date after 90 days of employment
May contribute after-tax dollars from 3% to 15% of base earnings
Paid Time Off (PTO) – Company provides 120 hours (the equivalent of 15 days) of PTO for new hires
PTO accrual begins after 90 days of employment
16 Paid Holidays
11 observed, 5 floating (used for personal holidays)
List of observed holidays published annually
Eligibility begins on your first day of employment
Bereavement Leave â Company will provide you with the following off to grieve the loss of a loved one.
5 paid days of leave for an immediate family member (spouse, child, parent, or grandparent).
1 paid day of leave for a close non-family member.
Discounts through Fidelity - Purchasing discounts for wireless, car rentals, hotels, and more.
Pet Insurance through Nationwide - 50% and 70% reimbursement plans available, with wellness options. SkySlope contributes $20 a month per pet, up to 2 pets, toward the cost of the plan
Paid Parental Leave - All full-time regular employees are eligible for SkySlope's Paid Parental Leave program, which provides employees with up to six (6) weeks of pay following the birth or placement of a new child. Paid Parental Leave must be taken within the first 6 months of the birth or placement of a new child. Employees will be paid at their regular rate of pay based upon their normal work schedule, up to a maximum of forty (40) hours per week.
Dayforce Wallet - All full-time regular employees will have access to sign up for Dayforce Wallet. Dayforce Wallet is a program provided by our payroll provider that allows employees to access their pay on-demand as soon as it is earned, without waiting for their standard payday.
Waldorf University discounts and perks - 10% off tuition for employees and their families, free textbooks, and scholarship opportunities available
Child Literacy Assistance Program discount - Discounted annual membership to Luminous Minds, an online resource center created to help with child literacy struggles. $85 for a 1-year membership as a SkySlope employee.
$1,000 Employee Referral bonuses - SkySlope will give every referrer $1,000 (post-tax) after a referee passes their 90-day mark.
In addition to the above, you also receive other perks, such as our Annual Employee Appreciation Day and additional internal company events.
SkySlope is an Equal Opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, disability, protected veteran status, national origin, sexual orientation, gender identity or expression (including transgender status), genetic information, or any other characteristic protected by applicable law.
We sincerely thank you for reviewing our open positions and hope you'll take the time to submit a concise and thoughtful application.
Still thinking about applying? Waiting to hear back from us? Check out our social media in the meantime!
SkySlope | Facebook | Instagram | YouTube | LinkedIn | Twitter
Your privacy is important to us. Learn more about what data is collected and how we use it here.
Join Our Team
Oowlish, one of Latin America's rapidly expanding software development companies, is seeking experienced technology professionals to enhance our diverse and vibrant team.
As a valued member of Oowlish, you will collaborate with premier clients from the United States and Europe, contributing to pioneering digital solutions. Our commitment to creating a nurturing work environment is recognized by our certification as a Great Place to Work, where you will have opportunities for professional development, growth, and a chance to make a significant international impact.
We offer the convenience of remote work, allowing you to craft a work-life balance that suits your personal and professional needs. We're looking for candidates who are passionate about technology, proficient in English, and excited to engage in remote collaboration for a worldwide presence.
About the Role:
We are seeking a hands-on Big Data Engineer to support and enhance an AWS-based data platform, focusing on pipeline reliability, scalable processing, and performance optimization. This role requires strong Python expertise, deep familiarity with AWS data services, and the ability to maintain production-grade data workflows.
You will work on event-driven pipelines, contribute to CI/CD improvements, and collaborate on platform reliability initiatives. This role is ideal for someone who enjoys building and maintaining data infrastructure, optimizing large-scale data processing systems, and working in cloud-native environments.
This is a 6-month engagement, aligned to ET time zone.
Benefits & Perks:
Home office;
Competitive compensation based on experience;
Career plans to allow for extensive growth in the company;
International Projects;
Oowlish English Program (Technical and Conversational);
Oowlish Fitness with Total Pass;
Games and Competitions;
You can also apply here:
Website: https://www.oowlish.com/work-with-us/
LinkedIn: https://www.linkedin.com/company/oowlish/jobs/
Instagram: https://www.instagram.com/oowlishtechnology/
Join Our Team
About the Role:
We are seeking a Senior Data Engineer with strong expertise in enterprise data modeling and AWS-based data platforms to support a mature and evolving data ecosystem. This role requires hands-on experience working with large-scale data environments, optimizing data models, and maintaining event-driven pipelines in a cloud-native architecture.
You will work across data modeling, pipeline development, API data support, and infrastructure collaboration. This position is ideal for someone comfortable operating in enterprise environments, maintaining production-grade systems, and improving performance and scalability across a modern AWS data stack.
This is a 6-month engagement with ET time zone alignment required.
Remote Data Engineering jobs. Data pipelines, ETL, data architecture and big data. At RemoteJobs.lat, we connect professionals from Latin America with companies that offer 100% remote work. All of our listings allow you to work from any city, with payment in US dollars or another international currency.
$4,000 - $11,000 USD/month
100% Remote LATAM
Estimated ranges in USD/month for remote contracts with international companies. Ranges vary by company, complementary stack, and client location.
| Level | Years of experience | Range USD/month |
|---|---|---|
| Junior | 0-2 | $4,000 - $5,750 |
| Mid-level | 2-4 | $5,400 - $7,850 |
| Senior | 4-7 | $7,500 - $9,950 |
| Lead/Staff | 7+ | $9,250 - $11,000 |
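As an illustrative helper, the table above can be encoded as a simple lookup (the level names and ranges come from the table; the function itself is hypothetical, and the overlapping year boundaries are resolved here as half-open intervals):

```python
# Maps years of experience to the estimated USD/month range from the
# table above. Ranges are the site's estimates, not offers. Boundary
# years (2, 4, 7) are assigned to the higher level (half-open intervals).
SALARY_RANGES = [
    ("Junior", 0, 2, (4_000, 5_750)),
    ("Mid-level", 2, 4, (5_400, 7_850)),
    ("Senior", 4, 7, (7_500, 9_950)),
    ("Lead/Staff", 7, float("inf"), (9_250, 11_000)),
]

def estimate_range(years: float) -> tuple[str, tuple[int, int]]:
    """Return (level, (low, high)) for a given number of years."""
    for level, lo, hi, usd_range in SALARY_RANGES:
        if lo <= years < hi:
            return level, usd_range
    raise ValueError("years of experience must be non-negative")

print(estimate_range(5))   # ('Senior', (7500, 9950))
print(estimate_range(10))  # ('Lead/Staff', (9250, 11000))
```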
Some companies that have historically hired Data Engineering profiles to work 100% remotely from Latin America: