Senior Data Engineer

Tel Aviv · Full-time · Senior

About The Position

About Us

We are shaping the future of mobility, enabling municipalities to take full control of their roads with AI-driven solutions for smart cities. Our platform lets users capitalize on the enormous amounts of data coming from various transportation modes, including connected and autonomous vehicles, to improve traffic safety and proactively manage the city's roads.


Job Description

We are looking for an experienced Data Engineer to design and develop our data pipeline architecture, which collects, processes, streams, and analyzes a wide variety of data sets. Building this infrastructure requires optimal, scalable transformation and loading of data from dozens of different data sources. You will mentor Data Engineers, serve as the team's technical focal point, and define standards and development methodologies for the team.


Responsibilities

  • Design, develop, and operate highly scalable, high-performance, low-cost, and accurate data processing pipelines
  • Own the full data development life cycle
  • Take ownership of data sets
  • Implement and develop data integrity processes, data validations, and tests
  • Keep up to date with big data technologies, evaluate and make decisions around the use of new or existing software products


Requirements

  • 3+ years of experience as a Data Engineer
  • 3+ years of experience with advanced programming (preferably Python or a JVM-based language)
  • Experience building processes that support data cleansing, validation, and transformation
  • A positive, can-do attitude and the confidence to lead the data efforts of Waycare’s evolving data infrastructure
  • Strong communication and collaboration skills
  • BA/B.Sc. in Computer Science (or equivalent), or a veteran of a technological army unit
  • Experience with data quality frameworks and error management
  • Ability to maintain application stability and data integrity by monitoring key metrics and improving the code base accordingly
  • Knowledge of SQL


Advantages

  • Experience with big data frameworks such as Kafka, Kinesis, Spark, Flink, and Snowflake, and with data warehousing
  • Experience with data pipeline and workflow management tools such as Airflow
  • Experience designing data structures and databases
  • Experience with CI/CD and DevOps
  • Experience with AWS environments and Kubernetes

Apply for this position
