We are seeking an experienced Data Fabric Developer to join our dynamic team. The ideal candidate will have a strong background in data architecture, data integration, and data management with Fabric technology. You will be responsible for designing, developing, and implementing Data Fabric solutions that enable seamless data access and integration across the organization. Your main priority will be migrating the current solution to Fabric and then developing and enhancing it in line with the client’s business needs and industry best practices.
Responsibilities:
- Design and implement Data Fabric architectures that facilitate the integration of diverse data sources and formats.
- Develop and maintain data pipelines, ETL processes, and data workflows to ensure data quality and accessibility.
- Collaborate with data engineers, data scientists, and business stakeholders to understand data requirements and deliver effective solutions.
- Implement data governance and security best practices to protect sensitive information and ensure compliance with relevant regulations.
- Optimize data storage and retrieval processes to enhance performance and scalability.
- Monitor and troubleshoot data integration processes and address any issues in a timely manner.
- Stay up-to-date with industry trends and emerging technologies related to data fabric and data integration.
Requirements:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 3 years of experience in data engineering, data integration, or a similar role.
- Hands-on experience with Microsoft Fabric.
- Strong understanding of data fabric concepts and technologies, including data virtualization, data lakes, and data mesh architectures.
- Strong experience with PySpark, SQL.
- Experience with data integration tools (e.g., Talend, Informatica, Apache NiFi) and cloud platforms (preferably Azure; also AWS or Google Cloud), including notebooks and Azure Data Factory.
- Knowledge of database technologies, both SQL and NoSQL (e.g., MongoDB, Cassandra).
- Familiarity with data modeling, data governance, and data quality frameworks.
- Excellent analytical and problem-solving skills, with strong attention to detail.
- Strong communication skills and the ability to work collaboratively in a team environment both with technical and business stakeholders.
Nice to have:
- DP-600 and/or DP-700 certification(s).
- Experience with big data technologies.
- Familiarity with machine learning concepts and tools.
- Certifications in data management or cloud technologies.
We can offer:
- Projects for such clients as PayPal, Wargaming, Xerox, Philips, adidas, and Toyota
- Competitive compensation that depends on your qualifications and skills
- Career development system with clear skill qualifications
- Flexible working hours aligned to your schedule
- Reimbursement of medical costs
- Online English courses
- Corporate parties and events for employees and their children
- Gym membership compensation and corporate sport competitions (including esports)
- 5 days of paid sick leave per year with no obligation to submit a sick-leave certificate