Design and build low-latency, real-time data platforms powering live analytics, event processing, and streaming applications at APPIT Software in New York.
New York, USA
Full-time
Data Engineering
Responsibilities
Architect real-time data platforms using Kafka, Flink, and cloud-native streaming services
Design event-driven architectures that handle millions of events per second with sub-second latency
Build and maintain stream processing applications for real-time analytics, alerting, and enrichment
Implement exactly-once semantics, state management, and checkpointing in streaming pipelines
Develop real-time data serving layers using Redis, Elasticsearch, or Apache Druid for low-latency queries
Define SLAs, monitor pipeline health, and build automated alerting for real-time data systems
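The exactly-once semantics, state management, and checkpointing mentioned above can be sketched in a few lines. This is a hedged illustration, not production code: the class and field names are invented, and a real pipeline would rely on Kafka transactions or Flink checkpoints rather than hand-rolled logic. The core idea is committing state and input offset together, so redeliveries after a failure become no-ops.

```python
class CheckpointedCounter:
    """Counts event values, committing state and offset together so a
    replay after failure never double-counts (exactly-once effect on
    top of at-least-once delivery)."""

    def __init__(self):
        self.state = {"count": 0}   # running aggregate
        self.committed_offset = -1  # last offset folded into state

    def process(self, offset, event):
        # Skip anything at or below the checkpoint: redeliveries of
        # already-processed records are harmless no-ops.
        if offset <= self.committed_offset:
            return False
        self.state["count"] += event["value"]
        # Commit state and offset in one step; a real system would
        # make this atomic, e.g. via a transactional sink.
        self.committed_offset = offset
        return True


counter = CheckpointedCounter()
for off, ev in [(0, {"value": 2}), (1, {"value": 3})]:
    counter.process(off, ev)

# Simulate an at-least-once redelivery of offset 1 after a restart:
counter.process(1, {"value": 3})
print(counter.state["count"])  # stays 5: the duplicate was skipped
```

Frameworks like Flink generalize this pattern with distributed snapshots, but the invariant is the same: state and input position advance together or not at all.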
Requirements
6-9 years of data engineering experience, including at least 3 years building real-time data systems
Deep expertise with Apache Kafka, including Kafka Streams, Connect, and cluster operations
Hands-on experience with stream processing frameworks (Apache Flink, Spark Structured Streaming, or ksqlDB)
Strong programming skills in Java, Scala, or Python for building streaming applications
Experience with real-time serving databases (Redis, Elasticsearch, Druid, or ClickHouse)
Understanding of distributed systems concepts including consistency, partitioning, and fault tolerance
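The partitioning concept in the last requirement is worth making concrete. The sketch below mimics the spirit of Kafka's default partitioner, which hashes the record key to pick a partition (Kafka itself uses murmur2; a stable CRC32 stands in here purely for illustration). Routing by key is what preserves per-key ordering and lets each consumer keep purely local state.

```python
import zlib

def partition_for(key: bytes, num_partitions: int) -> int:
    # zlib.crc32 is deterministic across runs and processes,
    # unlike Python's built-in hash(), so the same key always
    # lands on the same partition.
    return zlib.crc32(key) % num_partitions

# All events for "user-1" go to one partition, so a stream
# processor can hold that user's state locally.
keys = [b"user-1", b"user-2", b"user-1"]
print([partition_for(k, 6) for k in keys])
```

The trade-off candidates should understand: more partitions raise parallelism, but a hot key still serializes onto a single partition, and changing the partition count reshuffles key-to-partition assignments.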
Nice to Have
Experience with cloud-managed streaming services (AWS Kinesis, Azure Event Hubs, or Confluent Cloud)
Knowledge of CQRS and event sourcing patterns
Familiarity with Kubernetes for deploying streaming workloads
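For candidates unfamiliar with the event sourcing pattern listed above, here is a minimal hedged sketch (event names and helpers are invented for illustration): the write side appends immutable events to a log, and a CQRS-style read model derives current state by folding over that history rather than storing it directly.

```python
events = []  # append-only log: the write-side source of truth

def record(event_type, data):
    # Write side: facts are appended, never updated in place.
    events.append({"type": event_type, "data": data})

def balance_view(log):
    # Read side: fold the full event history into current state.
    balance = 0
    for e in log:
        if e["type"] == "deposited":
            balance += e["data"]["amount"]
        elif e["type"] == "withdrawn":
            balance -= e["data"]["amount"]
    return balance

record("deposited", {"amount": 100})
record("withdrawn", {"amount": 30})
print(balance_view(events))  # 70 -- state is derived, never stored
```

In a streaming context the "log" is typically a Kafka topic, and read models are materialized continuously by a stream processor instead of being recomputed on demand.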