Experience
Junior, Mid-level, Senior
Engagement
Full-time
Application deadline
5 days left

About the company

Established in 2002, Bravo Systems d.o.o. is a premium Internet software development company located in Banja Luka, Bosnia and Herzegovina, and Belgrade, Serbia.

Bravo Systems d.o.o. is an independent strategic development partner of Propel Media, a US-based, performance-driven digital media company that brings together online video, display, and mobile advertising technology and solutions for advertisers and publishers.

Job description

Our team is working on building and maintaining large-scale data infrastructure using state-of-the-art big data technologies to power our online advertising and e-commerce platforms. We are looking for an enthusiastic individual who would take part in creating and maintaining various parts of our data pipelines: collecting and integrating data from different sources, transforming it as it arrives in our system, organizing it into datasets suitable for analytical use cases, and storing it in our OLAP database system.

KEY RESPONSIBILITIES

Our team combines two data processing architectures: real-time stream processing on top of the Apache Kafka event streaming platform, together with batch ETL processing on top of the Apache Spark and Apache Hadoop ecosystem. Data from both of these streams ends up in our Apache Druid OLAP database, organized into datasets optimized for analytical use cases.
In addition to creating and maintaining data pipelines, you will also have the opportunity to work on conceptual data modeling, where your main responsibility would be to translate the original product requests into data models that let us answer business questions expressed in the form of KPIs.
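As an illustration (not part of the posting itself), a typical transformation step in such a pipeline maps a raw event into a flat, analytics-ready row that an OLAP store like Druid can roll up. The event schema, field names, and KPI columns below are hypothetical, just a minimal sketch of the idea:

```python
from datetime import datetime, timezone

def transform_impression(event: dict) -> dict:
    """Map a raw ad-impression event (hypothetical schema) into an
    analytics-ready row: truncate the timestamp to the hour and keep
    only the dimensions and measures that KPI queries need."""
    ts = datetime.fromtimestamp(event["ts_ms"] / 1000, tz=timezone.utc)
    return {
        # time dimension, truncated to the hour for rollup
        "hour": ts.replace(minute=0, second=0, microsecond=0).isoformat(),
        # query dimensions
        "campaign_id": event["campaign_id"],
        "country": event.get("geo", {}).get("country", "unknown"),
        # additive measures that aggregate cleanly in an OLAP store
        "impressions": 1,
        "revenue_usd": event.get("price_usd", 0.0),
    }

row = transform_impression({
    "ts_ms": 1700000000000,
    "campaign_id": 42,
    "geo": {"country": "BA"},
    "price_usd": 0.002,
})
```

The same function shape works whether it is invoked per message from a Kafka consumer or applied to batches in a Spark job, which is one way the streaming and batch paths can share modeling logic.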

Qualifications

REQUIRED SKILLS:


  • Strong knowledge of core big data principles
  • Experience with event streaming/message queuing platforms (preferably Apache Kafka)
  • Experience with ETL processing (preferably Spark/Hadoop ecosystem)
  • Experience with any OLAP database (e.g. Apache Druid, Apache Pinot, ClickHouse)
  • Experience with Java or Python programming languages
  • Experience with SQL
  • Experience with data modeling, preferably with designing event schemas
  • Basic knowledge in the field of analytics
  • Basic knowledge of analytical event tracking platforms (e.g. Google Analytics)

Additional information

NICE-TO-HAVE SKILLS:

  • Data visualization platforms (e.g. Tableau, Apache Superset)
  • DevOps skills
  • Data orchestration platforms (e.g. Apache Airflow, Airbyte)
  • Data quality assurance skills

WHAT DO WE OFFER?

  • A place where you can continuously grow and advance your career and knowledge
  • Recognition and promotion for results and performance, supported by planned career development
  • Great financial packages and bonuses for business achievements
  • An environment where you’re highly valued and appreciated
  • Promotion of healthy work-life balance
  • Fun, casual, and collaborative team environment
  • Celebration of your personal and family milestones
  • Occasional travel to our partner’s office (USA)
  • Additional benefits such as sports, gaming, team building, and team and company events
