Qualysoft · 21d ago
Senior Databricks DWH Engineer - Banking
Bucharest · Full-time · Senior
Other · Data Engineering · Engineer
About Qualysoft
· 25 years of experience in software engineering, established in Vienna, Austria
· Active in Romania since 2007, with office in central Bucharest (Bd. Iancu de Hunedoara 54B)
· Delivering End-to-End IT Consulting Services - From Team Augmentation and Dedicated Teams to Custom Software Development
· We deliver scalable enterprise systems, intelligent automation frameworks, and digital transformation platforms
· Cross-industry experience sustaining global players in BFSI (Banking, Financial Services, and Insurance), Telecom, Retail & E-commerce, Energy and Utilities, Automotive, Manufacturing, Logistics, and High Tech
· Global Presence: Switzerland, Germany, Austria, Sweden, Hungary, Slovakia, Serbia, Romania, and Indonesia
· International team of 500+ software engineers
· Strategic partnerships: Microsoft Cloud Certified Partner, Tricentis Solutions Partner in Test Automation and Test Management, Creatio Exclusive Partner, Doxee Implementation Partner
· Powered by cutting-edge technologies: AI, Data & Analytics, Cloud, DevOps, IoT, and Test Automation.
· Project beneficiaries ranging from large-scale enterprises to startups
· Stable growth and revenue increase year over year, a resilient organisation in volatile IT market conditions
· Quality-first mindset, culture of innovation, and long-term client partnerships
· Global and local reach – trusted by key industry players in Europe and the US
Responsibilities
• Advanced Design & Implementation: Designing and implementing robust, scalable, high-performance ETL/ELT data pipelines using PySpark/Scala and Databricks SQL on the Databricks platform.
• Delta Lake: Expertise in implementing and optimizing the Medallion architecture (Bronze, Silver, Gold)
using Delta Lake to ensure data quality, consistency, and historical tracking.
• Lakehouse Platform: Efficient implementation of the Lakehouse architecture on Databricks, combining
best practices from DWH and Data Lake environments.
• Performance Optimization: Optimizing Databricks clusters, Spark operations, and Delta tables (e.g., Z-ordering, compaction, query tuning) to reduce latency and compute costs.
• Streaming: Designing and implementing real-time/near–real-time data processing solutions using Spark
Structured Streaming and Delta Live Tables (DLT).
• Unity Catalog: Implementation and administration of Unity Catalog for centralized data governance, fine-grained security (row- and column-level security), and data lineage.
• Data Quality: Defining and implementing data quality standards and rules (e.g., using DLT or Great
Expectations) to maintain data integrity.
• Orchestration: Developing and managing complex workflows using Databricks Workflows (Jobs) or external
tools (e.g., Azure Data Factory, Airflow) to automate pipelines.
• DevOps/CI/CD: Integrating Databricks pipelines into CI/CD processes using tools such as Git, Databricks
Repos, and Bundles.
• Collaboration: Working closely with Data Scientists, Analysts, and Architects to understand business
requirements and deliver optimal technical solutions.
• Mentorship: Providing technical guidance to junior developers and promoting best practices.
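The Medallion and performance-tuning duties above can be sketched in Databricks SQL. Table and column names here (silver.transactions, bronze.transactions_raw, txn_id, _ingested_at) are hypothetical placeholders, not part of the posting:

```sql
-- Minimal Bronze -> Silver upsert sketch (hypothetical table/column names).
-- MERGE gives ACID upserts into the Silver Delta table.
MERGE INTO silver.transactions AS tgt
USING (
  SELECT txn_id, customer_id, amount, txn_ts
  FROM bronze.transactions_raw
  WHERE _ingested_at >= current_date() - INTERVAL 1 DAY
) AS src
ON tgt.txn_id = src.txn_id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;

-- Co-locate frequently filtered columns to reduce scan latency and compute cost.
OPTIMIZE silver.transactions ZORDER BY (customer_id, txn_ts);
```

The Z-ordering step is the kind of compaction/tuning the Performance Optimization bullet refers to; which columns to Z-order by depends on the dominant query predicates.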
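The streaming and data-quality bullets combine naturally in Delta Live Tables. A minimal DLT SQL sketch, assuming hypothetical pipeline tables bronze_payments and silver_payments:

```sql
-- Sketch of a DLT streaming table with an inline quality expectation
-- (hypothetical table names; runs inside a DLT pipeline, not a plain warehouse).
CREATE OR REFRESH STREAMING TABLE silver_payments (
  CONSTRAINT valid_amount EXPECT (amount > 0) ON VIOLATION DROP ROW
)
AS SELECT txn_id, amount, txn_ts
   FROM STREAM(bronze_payments);
```

`ON VIOLATION DROP ROW` silently filters bad records while DLT records the violation counts; `FAIL UPDATE` would instead halt the pipeline, which may suit stricter banking controls.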
Requirements
• Professional Experience: A minimum of 5 years of experience in Data Engineering, including at least 3 years
working with Databricks and large-scale Spark.
• Databricks Platform: Proven, expert-level experience with the full Databricks ecosystem (Workspace,
Cluster Management, Notebooks, Databricks SQL).
• Apache Spark: Deep knowledge of Spark architecture (RDD, DataFrames, Spark SQL) and advanced
optimization techniques.
• Delta Lake: Expertise in implementing and administering Delta Lake (ACID properties, Time Travel, Merge,
Optimize, Vacuum).
• Programming Languages: Advanced/expert-level proficiency in Python (with PySpark) and/or Scala (with
Spark).
• SQL: Advanced/expert skills in SQL and Data Modeling (Dimensional, 3NF, Data Vault).
• Cloud: Strong experience with a major Cloud platform (AWS, Azure, or GCP), particularly with storage
services (S3, ADLS Gen2, GCS) and networking.
• Unity Catalog: Hands-on experience with implementing and administering Unity Catalog.
• Lakeflow: Experience with Delta Live Tables (DLT) and Databricks Workflows.
• ML/AI Fundamentals: Understanding of basic MLOps concepts and experience with MLflow to support
integration with Data Science teams.
• DevOps: Experience with Terraform or equivalent tools for Infrastructure as Code (IaC).
• Certifications: Databricks certifications (e.g., Databricks Certified Data Engineer Professional) are a strong
advantage.
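The row- and column-level security mentioned under Unity Catalog is typically done with row filters. A sketch with hypothetical catalog, schema, and group names:

```sql
-- Hypothetical names throughout; a Unity Catalog row-filter sketch.
-- The filter function returns TRUE only for rows the caller may see.
CREATE OR REPLACE FUNCTION main.security.branch_filter(branch STRING)
RETURNS BOOLEAN
RETURN is_account_group_member('bank_analysts_' || branch);

-- Attach the filter so queries on the table are transparently restricted.
ALTER TABLE main.silver.transactions
SET ROW FILTER main.security.branch_filter ON (branch);
```

Column-level masking works analogously via `ALTER TABLE ... ALTER COLUMN ... SET MASK`.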
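The DevOps/CI/CD requirement (Git, Repos, Bundles) usually means describing jobs as code in a Databricks Asset Bundle. A minimal databricks.yml sketch; the bundle name, job, notebook path, and schedule are illustrative assumptions:

```yaml
# Minimal Databricks Asset Bundle sketch (hypothetical names/paths).
bundle:
  name: dwh_pipelines

resources:
  jobs:
    nightly_silver_load:
      name: nightly-silver-load
      tasks:
        - task_key: upsert_silver
          notebook_task:
            notebook_path: ../notebooks/upsert_silver.py
      schedule:
        quartz_cron_expression: "0 0 2 * * ?"
        timezone_id: "Europe/Bucharest"

targets:
  dev:
    default: true
```

Deployed from CI with `databricks bundle deploy -t dev`, which versions the job definition alongside the pipeline code in Git.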
Location & Eligibility
Where is the job
Bucharest
Hybrid — some on-site time required
Who can apply
Same as job location
Listed under
Worldwide
Listing Details
- Posted: April 6, 2026
- First seen: April 6, 2026
- Last seen: April 27, 2026
Posting Health
- Days active: 21
- Repost count: 0
- Trust Level: 33%
- Scored at: April 27, 2026
Signal breakdown
freshness · source trust · content trust · employer trust
External application · ~5 min on Qualysoft's site
Please let Qualysoft know you found this job on Jobera.
