All roles at JumpCloud are remote unless otherwise specified in the job description.

About JumpCloud

JumpCloud® delivers a unified open directory platform that makes it easy to securely manage identities, devices, and access across your organization. With JumpCloud®, IT teams and MSPs enable users to work securely from anywhere and manage their Windows, Apple, Linux, and Android devices from a single platform. JumpCloud® is IT Simplified.

About the role

We're looking for a Senior Data Engineer to join JumpCloud's Data Enablement team. Data Enablement's vision is for data to drive JumpCloud and our customers. The team's current mission is to put foundational technology and processes in place to uplevel the data capabilities of our product and our data warehouse/lakehouse. We are introducing an event-based architecture, developing and refining a data model that supports JumpCloud's growth strategy, and modernizing our data warehouse.

A successful data engineer will exhibit an entrepreneurial spirit and enjoy tackling data engineering problems that most others cannot solve, as well as shaping the future capabilities of JumpCloud's data engineering, performance reporting, and data governance. Come be part of an exciting new team where you will work on challenging projects and rich data sets and develop valuable skills. This role involves frequent engagement with analytics partners, data/platform engineering, and product engineering to mature our data model, pipelines, and data practices. The role reports to the Senior Manager of Data. This is a senior-level position.
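To give a flavor of the event-based pattern the role centers on, here is a minimal sketch of a producer publishing schema-tagged events and a consumer routing them by type. This is purely illustrative, not JumpCloud's actual architecture; the event type `device.enrolled` and the in-process queue are assumptions, and in practice a broker such as Kafka would replace the queue:

```python
import json
import queue

# In-process stand-in for a message broker topic (e.g., a Kafka topic).
events = queue.Queue()

def produce(event_type, payload):
    """Publish a schema-tagged event as JSON, as a producer service might."""
    events.put(json.dumps({"type": event_type, "payload": payload}))

def consume_all():
    """Drain the topic and route each event to a handler by its type tag."""
    handled = []
    while not events.empty():
        event = json.loads(events.get())
        if event["type"] == "device.enrolled":
            handled.append(f"indexed device {event['payload']['device_id']}")
    return handled

produce("device.enrolled", {"device_id": "d-42", "os": "linux"})
print(consume_all())  # ['indexed device d-42']
```

Tagging every event with a type (and, in a real system, a versioned schema) is what lets downstream pipelines and the warehouse evolve independently of producers.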
What you'll be doing:

As part of the Data Enablement team, and of the engineering organization as a whole here at JumpCloud, you will be responsible for providing critical data infrastructure and systems for multiple areas of the business, including business analysis, product development, engineering, finance, sales, and executive strategy. On a day-to-day basis, as a senior-level data engineer, you may be asked to:

- Interface with stakeholders to define needs and develop strategies for providing data
- Integrate technologies such as Airflow, Python, and Kafka
- Plan, build, and maintain data pipelines from internal and external data sources
- Implement data observability and monitoring in the pipeline and in the warehouse
- Work with appropriate teams to ensure data security and data compliance
- Guide data analysts to ensure clean delivery of data

You will work with other senior-level engineers and architects with the goal of achieving top-level proficiency in core data engineering skills and business functions.

You have:

- Extensive hands-on experience building scalable data solutions with complex, fast-moving data sets
- The ability to lead the technology on small to large projects from start to finish
- Strong experience with cloud data warehouse and data lake architectures and implementations
- Proven proficiency in data modeling and database design, with an emphasis on designing optimized self-service data solutions
- Experience with both batch and streaming data pipelines and ELT processes
- The ability to work and communicate effectively with other engineers and with both technical and non-technical business stakeholders
- The ability to quickly integrate new technologies and industry best practices into your skill set
- Expert-level SQL skills
- Proficiency with the Python programming tools and ecosystem, applying strong software engineering techniques
- Experience with some of these: message brokers, data sync/mirroring tools, stream and batch processors, data orchestrators and workflow engines

Nice to haves:

- Python 3 and Go
- Software development following general software engineering principles
- SQL for data transformation and analysis, with optimization and tuning in mind
- Snowflake data warehouses (or equivalent)
- Dremio data lakehouses
- Apache Airflow
- Apache Kafka (and its supporting tools)
- Experience building stream and batch processing big data systems
- Experience building observable data systems
- Experience with basic data modeling and data architecture
- Experience with cloud data storage techniques
- Familiarity with data storage formats such as JSON, Avro, Protobuf, Parquet, and Iceberg
- The ability to work effectively both independently and as part of the data engineering team as a whole
- Experience with data governance, including data contracts and schema management
- Experience with data security standards, including RBAC and sensitive data handling

#LI-MS1

Where you'll be working/Location:

JumpCloud is committed to being Remote First, meaning that you are able to work remotely within the country noted in the job description. This role is remote in the country of India. You must be located in and authorized to work in India to be considered for this role.

Language:

JumpCloud® has teams in 15+ countries around the world and conducts our internal business in English. The interview and any additional screening process will take place primarily in English.
To be considered for a role at JumpCloud®, you will be required to speak and write in English fluently. Any additional language requirements will be included in the details of the job description.

Why JumpCloud?

If you thrive working in a fast, SaaS-based environment and you are passionate about solving challenging technical problems, we look forward to hearing from you!

JumpCloud® is an incredible place to share and grow your expertise! You'll work with amazing talent across each department who are passionate about our mission. We're out-of-the-box thinkers, so your unique ideas and approaches for conceiving a product and/or feature will be welcome. You'll have a voice in the organization as you work with a seasoned executive team, a supportive board, and a proven market that our customers are excited about.

One of JumpCloud®'s three core values is to "Build Connections." "To us that means creating human connection with each other regardless of our backgrounds, orientations, geographies, religions, languages, gender, race, etc. We care deeply about the people that we work with and want to see everyone succeed." - Rajat Bhargava, CEO

Please submit your résumé and a brief explanation about yourself and why you would be a good fit for JumpCloud®. Please note JumpCloud® is not accepting third-party résumés at this time.

JumpCloud® is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status.

Scam Notice:

Please be aware that there are individuals and organizations that may attempt to scam job seekers by offering fraudulent employment opportunities in the name of JumpCloud. These scams may involve fake job postings, unsolicited emails, or messages claiming to be from our recruiters or hiring managers.
Please note that JumpCloud will never ask for any personal account information, such as credit card details or bank account numbers, during the recruitment process. Additionally, JumpCloud will never send you a check for any equipment prior to employment. All communication related to interviews and offers from our recruiters and hiring managers will come from official company email addresses (@jumpcloud.com) and will never ask for any payment or fee to be paid, or purchases to be made, by the job seeker. If you are contacted by anyone claiming to represent JumpCloud and you are unsure of their authenticity, please do not provide any personal/financial information, and contact us immediately at recruiting@jumpcloud.com with the subject line "Scam Notice".

#LI-Remote #BI-Remote