Kafka Platform Architect
Job ID: 111845
Location: Tulsa, Oklahoma [Remote]
Category: Infrastructure
Employment Type: Contract
Date Added: 01/23/2026
Role Summary
This role involves designing, implementing, and governing a scalable, secure enterprise event streaming platform built on Confluent Kafka. The platform architect will define architecture patterns, establish operational guardrails, and create integration strategies that enable real-time data movement across cloud environments. This senior-level position leads the platform's technical direction and is accountable for its resilience, performance, and compliance.
Responsibilities
- Define and evolve enterprise Kafka architecture, including cluster sizing, partitioning, replication, and high availability strategies.
- Establish best practices for producers, consumers, and stream processing applications using Confluent components such as Kafka Connect, Schema Registry, and ksqlDB.
- Design multi-region disaster recovery and failover strategies aligned with business continuity needs.
- Implement naming conventions, topic lifecycle policies, and schema evolution standards to ensure consistent platform governance.
- Define security guardrails covering authentication (SASL), encryption in transit (TLS), RBAC, and ACLs, and leverage Confluent Control Center for platform security management.
- Set up performance monitoring and alerting for throughput, latency, consumer lag, and broker health using integrated observability tools.
- Optimize Kafka performance at scale through configuration tuning to enhance reliability and cost efficiency.
- Architect connectors and integrations for core banking systems, cloud data lakes, and analytics platforms.
- Develop reusable patterns and onboarding documentation to enable self-service provisioning for application teams (see the provisioning sketch after this list).
- Collaborate with InfoSec, Cloud Engineering, and Data Governance teams to align platform capabilities with enterprise standards.
- Evaluate emerging Kafka features and event streaming trends; drive automation initiatives for provisioning, scaling, and patching using Infrastructure-as-Code tools like Terraform and Ansible.
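For orientation, the following is a minimal sketch of the kind of self-service topic provisioning and naming-convention guardrail described above, assuming the confluent-kafka Python client. The naming pattern, lifecycle defaults, broker address, and topic name are illustrative placeholders rather than standards defined by this role.

```python
# Sketch: enforce a naming convention and lifecycle defaults when provisioning topics.
import re

from confluent_kafka.admin import AdminClient, NewTopic

# Hypothetical convention: <domain>.<dataset>.<event-type>, lowercase, dot-separated.
TOPIC_NAME_PATTERN = re.compile(r"^[a-z0-9-]+\.[a-z0-9-]+\.[a-z0-9-]+$")

# Illustrative lifecycle defaults applied to every provisioned topic.
DEFAULT_TOPIC_CONFIG = {
    "cleanup.policy": "delete",
    "retention.ms": str(7 * 24 * 60 * 60 * 1000),  # 7 days
    "min.insync.replicas": "2",
}

def provision_topic(admin: AdminClient, name: str, partitions: int, replication: int) -> None:
    """Validate the requested name against the convention, then create the topic."""
    if not TOPIC_NAME_PATTERN.match(name):
        raise ValueError(f"Topic name '{name}' violates the naming convention")

    topic = NewTopic(
        name,
        num_partitions=partitions,
        replication_factor=replication,
        config=DEFAULT_TOPIC_CONFIG,
    )
    # create_topics() is asynchronous; wait on the returned futures.
    for topic_name, future in admin.create_topics([topic]).items():
        future.result()  # raises KafkaException on failure
        print(f"Provisioned topic {topic_name}")

if __name__ == "__main__":
    client = AdminClient({"bootstrap.servers": "broker-1.example.internal:9092"})
    provision_topic(client, "payments.transactions.created", partitions=12, replication=3)
```

In practice a guardrail like this would typically sit behind a self-service workflow (for example, a pull request pipeline or provisioning API) rather than be run ad hoc, so application teams never need direct administrative access to the cluster.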
Qualifications
- Minimum of 8 years of experience in enterprise data architecture or platform engineering roles.
- At least 4 years of hands-on experience with Apache Kafka and Confluent Platform in production environments.
- Strong understanding of event-driven architecture, schema management (Avro/Protobuf), and stream processing.
- Expertise in Kafka security mechanisms (TLS, SASL, RBAC), performance tuning, and observability (see the client configuration sketch after this list).
- Proficiency with cloud-native deployments on AWS, Azure, or GCP and Kubernetes-based Kafka operations.
- Familiarity with hybrid or multi-cloud Kafka deployment architectures.
- Knowledge of financial services compliance and governance requirements is a plus.
- Experience with automation frameworks and CI/CD pipelines related to Kafka infrastructure.
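As a small illustration of the client-side security configuration referenced above, the following is a sketch assuming a SASL/SCRAM listener over TLS with the confluent-kafka Python client; the broker address, SASL mechanism, service account, credentials, and certificate path are placeholder assumptions, not values prescribed by the platform.

```python
# Sketch: producer configuration for a TLS-encrypted, SASL-authenticated cluster.
from confluent_kafka import Producer

secure_config = {
    "bootstrap.servers": "broker-1.example.internal:9093",
    "security.protocol": "SASL_SSL",       # TLS encryption in transit + SASL authentication
    "sasl.mechanism": "SCRAM-SHA-512",     # placeholder; PLAIN or OAUTHBEARER are also common
    "sasl.username": "svc-payments-producer",
    "sasl.password": "<retrieved-from-secrets-manager>",
    "ssl.ca.location": "/etc/kafka/certs/ca.pem",
}

producer = Producer(secure_config)
producer.produce("payments.transactions.created", key=b"txn-123", value=b"{}")
producer.flush()
```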
Publishing Pay Range: $65.00 – $69.62 hourly
This is a fully remote role and can be performed from an approved location.
