Successfully occupied
Rate: 100,000 - 130,000 CZK / month
Form of cooperation: Full-time / 50% remote
Sector: Banking
Location: Prague
Start date: 01.05.2026 - 01.07.2026
Technology
  • MS Azure
  • Apache Kafka
  • Apache Spark
  • Python
  • Azure Data Lake
  • Azure Data Factory
  • Terraform
  • Google Cloud Platform (GCP)
  • GitHub
  • Apache Flink
  • Databricks
  • Apache Airflow
Languages
  • English: active, B2/C1/C2

Offer description

  • Building a modern data platform on Databricks as part of the cloud transformation of a global financial institution
  • Platform management and automation (Terraform, CI/CD, Unity Catalog) with an emphasis on scalability, security, and high availability
  • Support for advanced analytics and AI use-cases across the organization, including the implementation of the Data Mesh concept
  • Management and configuration of Databricks workspaces using Terraform, CLI, and SDK
  • Management of environment settings (clusters, libraries, compute policies, access permissions)
  • Definition and maintenance of catalogs, schemas, and tables using Unity Catalog
  • Ensuring platform security, scalability, and high availability
  • Support for the implementation of the Data Mesh concept across domains
  • Technical support and consulting for data analysts, scientists, and business users
  • Leading onboarding, training, and enablement activities for platform adoption
  • Creation and management of technical documentation, participation in agile development and sprint planning
  • Collaboration is hybrid, with 2-3 days per week onsite in Prague
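As a rough illustration of the Terraform-based duties listed above (Unity Catalog objects and compute policies), a minimal sketch using the Databricks Terraform provider; all resource names and values are hypothetical, not part of the offer:

```hcl
# Illustrative sketch only - names and values are made up for this example.
terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}

# A Unity Catalog catalog and schema, as in "definition and maintenance
# of catalogs, schemas, and tables using Unity Catalog"
resource "databricks_catalog" "analytics" {
  name    = "analytics"
  comment = "Managed via Terraform"
}

resource "databricks_schema" "sales" {
  catalog_name = databricks_catalog.analytics.name
  name         = "sales"
}

# A cluster policy capping autoscaling, as in "management of environment
# settings (clusters, libraries, compute policies, access permissions)"
resource "databricks_cluster_policy" "small_only" {
  name       = "small-clusters-only"
  definition = jsonencode({
    "autoscale.max_workers" : { "type" : "range", "maxValue" : 4 }
  })
}
```

In practice such modules would be applied per workspace through CI/CD, which is how the role combines Terraform with the pipeline tooling mentioned in the requirements.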

Requirements

  • Minimum of 5 years of experience in developing and managing cloud big data platforms (Azure / GCP)
  • Minimum of a bachelor's degree in computer science or a related field
  • Active knowledge of English at a minimum level of B2 (daily communication purely in English)
  • Advanced experience: 
    • Excellent knowledge of Databricks, Delta Lake, and Spark environments
    • Experience with tools like Apache Airflow, Azure Data Factory, or Apache Beam
    • Advanced knowledge of Terraform, Python, and CI/CD tools (e.g., GitHub Actions)
    • Excellent understanding of data security, management, monitoring, and governance
    • Excellent communication and coordination skills within an international team
  • Advantageous:
    • Experience with Kafka, Flink, or similar frameworks
    • Previous work in Agile methodology
    • Experience in financial markets
Are you interested in this offer?
Recommend an IT specialist: Do you know anyone who could use this project? Recommend them and get a reward!
Hire an IT specialist: Do you need a similar IT freelancer for your project? Hire a specialist.
New to the world of IT freelancing?

Freedom, flexibility, greater control over finances and career.

32,285 Titans that have joined us

746 Clients that have joined us

699,462 Successfully supplied man-days