Full-Time

Hadoop Platform Engineer

Photon

Global AI and digital solutions provider

Compensation Overview

$50k - $175k/yr

+ Variable Pay/Incentives

Dallas, TX, USA

In Person

Category
DevOps & Infrastructure
Required Skills
Bash
Kubernetes
Python
Apache Spark
SQL
Apache Kafka
Docker
Hadoop
YARN
Linux/Unix
Requirements
  • Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience)
  • Strong experience in designing, implementing, and administering Hadoop clusters in a production environment
  • Proficiency in Hadoop ecosystem components such as HDFS, YARN, MapReduce, Hive, Spark, and HBase
  • Experience with cluster management tools like Apache Ambari or Cloudera Manager
  • Solid understanding of Linux/Unix systems and networking concepts
  • Strong scripting skills (e.g., Bash, Python) for automation and troubleshooting
  • Knowledge of database concepts and SQL
  • Experience with data ingestion tools like Apache Kafka or Apache NiFi
  • Familiarity with data warehouse concepts and technologies
  • Understanding of security principles and experience implementing security measures in Hadoop clusters
  • Strong problem-solving and troubleshooting skills, with the ability to analyze and resolve complex issues
  • Excellent communication and collaboration skills to work effectively with cross-functional teams
  • Relevant certifications such as Cloudera Certified Administrator for Apache Hadoop (CCAH) or Hortonworks Certified Administrator (HCA) are a plus
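The requirements above call for strong scripting skills (Bash, Python) for automation and troubleshooting. As a minimal illustration of what that looks like in practice, the sketch below parses `hdfs dfsadmin -report` output to flag DataNodes running low on disk. The sample report text, hostnames, and the 80% threshold are assumptions made for the example, not details from this posting.

```python
"""Sketch: flag DataNodes whose disk usage exceeds a threshold, from the
text output of `hdfs dfsadmin -report`. Sample data and threshold are
illustrative assumptions."""

import re

SAMPLE_REPORT = """\
Name: 10.0.0.11:9866 (dn1.example.com)
DFS Used%: 62.10%

Name: 10.0.0.12:9866 (dn2.example.com)
DFS Used%: 91.45%

Name: 10.0.0.13:9866 (dn3.example.com)
DFS Used%: 78.00%
"""

def nodes_over_threshold(report: str, threshold: float = 80.0) -> list[str]:
    """Return hostnames whose 'DFS Used%' exceeds the threshold."""
    overloaded = []
    current_host = None
    for line in report.splitlines():
        name_match = re.match(r"Name: \S+ \((\S+)\)", line)
        if name_match:
            current_host = name_match.group(1)
        used_match = re.match(r"DFS Used%: ([\d.]+)%", line)
        if used_match and current_host and float(used_match.group(1)) > threshold:
            overloaded.append(current_host)
    return overloaded

if __name__ == "__main__":
    # In production the report text would come from something like:
    #   subprocess.run(["hdfs", "dfsadmin", "-report"], capture_output=True)
    print(nodes_over_threshold(SAMPLE_REPORT))  # only dn2 exceeds 80%
```

In a real cluster this kind of check would typically feed an alerting system rather than print to stdout.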
Responsibilities
  • Design, implement, and maintain Hadoop clusters at large scale.
  • Collaborate with data engineers and data scientists to understand data requirements and optimize data pipelines.
  • Administer and monitor Hadoop clusters to ensure high availability, reliability, and performance.
  • Troubleshoot and resolve issues related to Hadoop infrastructure, data ingestion, data processing, and data storage.
  • Implement and manage security measures within Hadoop clusters, including authentication, authorization, and encryption.
  • Collaborate with cross-functional teams to define and implement backup and disaster recovery strategies for Hadoop clusters.
  • Optimize Hadoop performance through fine-tuning configurations, capacity planning, and implementing performance monitoring and tuning techniques.
  • Work with DevOps teams to automate Hadoop infrastructure provisioning, deployment, and management processes.
  • Stay up to date with the latest developments in the Hadoop ecosystem. Recommend and implement new technologies and tools that enhance the platform.
  • Document Hadoop infrastructure configurations, processes, and best practices.
  • User Interface Design: design interfaces for self-service Hadoop tools such as cluster management interfaces or job scheduling dashboards.
  • Role-Based Access Control: control access to Hadoop clusters for self-service tasks.
  • Cluster Configuration Templates: maintain consistent configurations across Hadoop clusters.
  • Resource Management: optimize resource utilization within Hadoop clusters, enabling dynamic management by users.
  • Self-Service Provisioning: enable users to provision and manage nodes independently.
  • Monitoring and Alerts: monitor health and performance of Hadoop clusters and provide insights to users.
  • Automated Scaling: automatically adjust cluster size based on workload demands.
  • Job Scheduling and Prioritization: manage data processing jobs efficiently.
  • Self-Service Data Ingestion: enable users to ingest data into Hadoop clusters independently.
  • Query Optimization and Tuning Assistance: provide tools or guidance to optimize and tune user queries.
  • Documentation and Training: create resources to help users understand self-service features.
  • Data Access Control: control access to data stored within Hadoop clusters for governance.
  • Backup and Restore Functionality: provide backup and restore operations for data in Hadoop clusters.
  • Containerization and Orchestration: deploy and manage applications within Hadoop clusters using containers.
  • User Feedback Mechanism: collect and act on user feedback for self-service features.
  • Cost Monitoring and Optimization: monitor and optimize costs associated with Hadoop usage.
  • Compliance and Auditing: ensure compliance and auditing of user activities within the Hadoop ecosystem.
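The Resource Management and Automated Scaling responsibilities above amount to turning cluster metrics into scaling decisions. As a hedged sketch, the example below derives a scale-up/scale-down/hold decision from memory utilization; the metric names mirror the YARN ResourceManager REST API's `/ws/v1/cluster/metrics` response, while the sample values and the 85%/40% thresholds are assumptions for illustration.

```python
"""Sketch: decide whether to add or remove worker nodes based on YARN
cluster memory utilization. Field names follow the ResourceManager's
/ws/v1/cluster/metrics response; values and thresholds are illustrative."""

SAMPLE_METRICS = {
    "clusterMetrics": {
        "totalMB": 1_048_576,     # total cluster memory
        "allocatedMB": 943_718,   # memory currently allocated to containers
        "activeNodes": 64,
    }
}

def scaling_decision(metrics: dict,
                     scale_up_pct: float = 85.0,
                     scale_down_pct: float = 40.0) -> str:
    """Return 'scale-up', 'scale-down', or 'hold' from memory utilization."""
    cm = metrics["clusterMetrics"]
    used_pct = 100.0 * cm["allocatedMB"] / cm["totalMB"]
    if used_pct > scale_up_pct:
        return "scale-up"
    if used_pct < scale_down_pct:
        return "scale-down"
    return "hold"

if __name__ == "__main__":
    # In production the metrics would come from an HTTP GET against the
    # ResourceManager, e.g. http://<rm-host>:8088/ws/v1/cluster/metrics
    print(scaling_decision(SAMPLE_METRICS))  # ~90% used -> "scale-up"
```

A production autoscaler would also debounce decisions (e.g., require sustained utilization over several polling intervals) to avoid thrashing the cluster.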
Desired Qualifications
  • Problem-Solving and Analytical Thinking
  • Collaboration and Teamwork
  • Adaptability and Continuous Learning
  • Performance Monitoring and Tuning
  • Security Best Practices
  • Capacity Planning
  • Automation and Scripting
  • Monitoring and Observability
  • Networking Skills
  • On-premise experience with Hadoop including config, performance, tuning
  • Proficiency in scripting, Linux system administration, networking, and troubleshooting skills
  • Experience with Hadoop ecosystem components and cluster management tools
  • Hands-on and strong understanding of Hadoop architecture
  • Experience with self-service capabilities across large clusters
  • Experience with Hadoop ecosystem components such as HDFS, YARN, MapReduce & cluster management tools like Ambari or Cloudera Manager
  • Proficiency in scripting languages (Bash, Python) for automation

Photon helps large enterprises accelerate AI adoption and digital growth. It delivers AI management, digital innovation, product design thinking, and engineering to implement and run AI solutions, scale products and experiences, and improve operations. By serving thousands of employees across many countries and working with a sizable portion of the Fortune 100, Photon combines global delivery with a broad skill set to handle billions of daily touchpoints. Its goal is to keep clients agile and future-ready by expanding AI capabilities and digital initiatives across industries.

Company Size

N/A

Company Stage

N/A

Total Funding

N/A

Headquarters

London, United Kingdom

Founded

N/A

Simplify Jobs

Simplify's Take

What believers are saying

  • Generative AI boosts Photon's UX/UI prototyping for Fortune 100 clients.
  • Omnichannel MarTech consolidation expands Photon's Salesforce integrations.
  • AI personalization aligns with Photon's data-driven handling of 1 billion daily interactions.

What critics are saying

  • Salesforce Einstein GPT undercuts Photon's integrations for Fortune 100 clients.
  • Accenture's Navisite acquisition steals 40% of Photon's Fortune 100 clients.
  • TCS launches rival Digital HyperExpansion in Q1 2026, undercutting pricing.

What makes Photon unique

  • Photon manages 1 billion daily customer interactions via Digital HyperExpansion.
  • Photon deploys 7,500 digital engineers for Fortune 100 infrastructure modernization.
  • Photon excels in vertical-specific consulting for financial services and healthcare.


Benefits

Health Insurance

Dental Insurance

Vision Insurance

401(k) Retirement Plan

Paid Vacation

Paid Holidays

Performance Bonus

Company News

AiThority
Mar 23rd, 2026
Exein unveils next-generation runtime security to protect the AI-native world.

  • Photon blocks cyberattacks before execution across physical AI and IoT, autonomous AI agents, and cloud and edge infrastructure
  • Kernel-level prevention sets a new standard beyond traditional user-space detection
  • Builds on Exein's position as the world's largest runtime security provider, protecting over two billion devices

Exein, the global leader in runtime cybersecurity, unveiled Photon, a preemptive breakthrough solution that blocks cyberattacks at the point of execution. Designed for the AI-native world, where digital and physical systems are now inseparable, Photon marks a fundamental shift in how critical infrastructure protects itself. Unlike traditional cybersecurity solutions that detect threats after compromise, typically operating in user space and relying on a cloud network, Exein's Photon operates directly inside the kernel, preventing malicious execution paths before they can run. By blocking attacks before the point of execution, the technology dramatically reduces latency and eliminates entire classes of threats before damage occurs. If malicious instructions cannot execute, the attack itself cannot take place.

This advancement establishes a new category of runtime security designed for systems that cannot be disconnected: physical AI and IoT environments, autonomous AI agents, and local hybrid cloud and edge infrastructure. In these environments, from industrial robotics and critical infrastructure to AI-driven platforms, downtime is not an option, and protection must be precise and granular, blocking malicious threats without shutting down the entire process. The announcement at the RSA Conference (RSAC) comes as cyber threats increasingly target physical systems.
Last month, the Munich Security Report 2026 warned that cyber operations are now engineered to cause real-world disruption, accelerating regulatory intervention after voluntary measures failed to address systemic vulnerabilities. At the same time, the speed of attacks is accelerating dramatically: recent threat intelligence shows average attacker "breakout times" fell to just 29 minutes in 2025, 65% faster than the previous year, driven in part by AI-assisted automation.

Protecting the digital and physical in the AI era

Artificial intelligence is already capable of identifying vulnerabilities in software and infrastructure. In the near future, these models will not only detect weaknesses but exploit them autonomously, launching attacks at machine speed. As the scale and sophistication of these attacks grow, traditional runtime security systems that rely on detection alone will no longer be sufficient.

Photon introduces a new model of preemptive runtime security designed for this AI-driven environment. Rather than detecting attacks after they begin, it prevents malicious execution paths from running in the first place, blocking threats in real time before they can impact the system. Unlike conventional security tools that operate in user space alongside the applications they protect, Photon operates directly within the kernel, the core of the operating system. By enforcing protection at this foundational layer, rather than merely detecting and stopping attacks, it prevents them from executing in the first place, all in real time. This marks a major milestone as physical and digital systems converge, positioning Photon as a new reference architecture for securing physical AI, agentic AI, and cloud and hybrid infrastructure.
Gianni Cuozzo, Founder and CEO of Exein, said: "In a future where the world is infinitely connected, with humanoid robots walking among us, local LLMs powering intelligent edges, autonomous drones reshaping mobility, and billions of new autonomous systems bridging the digital and physical realms, preemptive runtime security represents the new generation of protection, built into the very DNA of every device from the ground up.

"Exein was born to make this vision a reality: transforming every connected device into a fortress of security, forging the largest decentralised immune system for digital life: cross-vendor, cross-platform, and cross-system. We stand as the first line of defence between the boundless digital world and the physical one we live in, empowering manufacturers to build inherently safe innovations, already safeguarding over 2 billion devices worldwide."