Experience
Junior, Mid-level, Senior
Engagement
Full-time
Application deadline
3 days left

About the company

HTEC Group is a global consulting, software engineering, and digital product development company that empowers the world's best high-tech companies, disruptive startups, and global enterprises with innovative product design and sophisticated engineering services.

HTEC Group was founded in 2008 in Belgrade, Serbia, and today has its global headquarters in San Francisco. The company has consultancy, innovation, and product design offices in Silicon Valley, New York, and London, with its technological heart spread across development centers in Central and Southeast Europe. Overall, HTEC employs more than 2,000 highly skilled professionals across 29 locations in 12 countries.

HTEC combines Silicon Valley-based design thinking with the best engineering talent to support global clients with complete digital product development, from strategy and conceptualization to digital product design and agile engineering at scale. The company possesses vast expertise across a multitude of domains, including Healthcare, Retail, Transportation and Smart Mobility, Logistics, FinTech, Green Energy, Media, and Deep Technology.

Job description

From FinTech and Retail to AI, Health Tech, and Transportation, virtually every domain and industry stands to benefit from Data Science. The possibilities this field offers are almost limitless, and HTEC is on the lookout for Data Engineers eager to tap into that potential. Some of our success stories in this domain include a maritime intelligence platform, an AI-powered expert search engine, a data platform built on Lakehouse architecture, and many more. If you would like to become part of our team, check out the description below and apply.

Key responsibilities:

  • Designing and implementing systems that depend on diverse data sources
  • Designing and implementing data processing pipelines and ETL processes 
  • Designing and implementing fault-tolerant workflows 
  • Scheduling and planning of job executions 
  • Understanding and advocating the importance of high data accuracy throughout the system 
  • Spreading the culture of maintaining high data quality to support building data-driven products 
  • Making informed decisions about storage systems when designing and implementing data engineering/warehousing solutions 

Qualifications

Required qualifications: 

  • In-depth knowledge of at least one big data processing framework (preferably Spark) 
  • Knowledge of ETL principles 
  • Experience with SQL and concepts of Data Warehousing 
  • Experience with at least one of the following: Scala, Java, or Python
  • Experience with cloud computing and serverless paradigms, building data processing pipelines, and complex workflows
  • Experience with streaming technologies  
  • Knowledge of Unix-like operating systems 
  • English language proficiency

Additional information

Nice to have:

  • Strong knowledge of relational and non-relational databases
  • Experience with workflow scheduling and/or specific job scheduling tools 
  • Experience with CQRS and event sourcing approaches 
  • Experience with distributed storage and distributed computing
  • Experience with virtualization and containerized applications (Docker, Kubernetes) 
  • A desire to build valuable data assets and help business decision-makers 
