
Overview: salary statistics for the position "Big Data Architect" in Poland



Average monthly salary: PLN 17,000

Average salary level over the last 12 months for "Big Data Architect" in Poland

Currency: PLN / USD. Year: 2024
The bar chart shows changes in the average salary level for the position "Big Data Architect" in Poland.

Distribution of "Big Data Architect" vacancies across regions of Poland

Currency: PLN
As the chart shows, the largest number of "Big Data Architect" vacancies in Poland is in Małopolskie, with Mazowieckie in second place and Śląskie in third.

Ranking of Polish regions by salary level for "Big Data Architect"


Similar vacancies ranked by salary level in Poland


Recommended job offers

Internships in IT - UI, BSA, BigData
Grid Dynamics Poland Sp. z o.o., Gdańsk, pomorskie
Technologies we use:
- Operating system: macOS

About the project:
Are you looking for a great opportunity to expand your skills and knowledge? Want to join a multinational company with interesting projects and learn from our professionals? If you are ready to start your career in IT, just show us your potential, and we will give you the experience!

Your responsibilities:
- Learning under the supervision of an experienced programmer
- Spending 30 hours per week for 6 months on developing yourself in your chosen field
- Working closely with your mentor on daily tasks

Our requirements:
- Interest in expanding knowledge in the chosen IT area
- Basic knowledge of programming
- A communicative level of English
- Willingness to work with us full-time after the internship

Development opportunities we offer:
- Conferences in Poland
- Industry-specific e-learning platforms
- Intracompany training
- Mentoring
- Substantive support from technological leaders
- Technical knowledge exchange within the company
- Time for development of your ideas

What we offer:
- 6 months of training (paid internship) in one of the following fields: UI, BSA, BigData
- 30 h per week of hands-on technical experience (flexible schedule)
- Learning under the supervision of mentors
- Experience in commercial projects
- Working with leaders in their industries

Benefits:
- Remote work opportunities and flexible working time
- Fruits, coffee / tea, snacks, leisure zone
- Integration events
- Corporate gym
- No dress code
- Video games at work
- Employee referral program
- Opportunity to obtain permits and licenses

Recruitment stages:
1. HR Interview
2. Technical Interview

Grid Dynamics Poland
Grid Dynamics is the engineering services company known for transformative, mission-critical cloud solutions for retail, finance and technology sectors. We have architected some of the busiest e-commerce services on the Internet and have never had an outage during the peak season.
Founded in 2006 and headquartered in San Ramon, California with offices throughout the US and Eastern Europe, we focus on big data analytics, scalable omnichannel services, DevOps, and cloud enablement.
SAP Data Developer Analytics
Hays, remote, Łódź
For our Client, we are looking for a SAP Data Developer. Contract of employment.

Your task: As a Data Developer, you translate business and analysis requirements into technical specifications as well as analytical data models and implement them in our data analytics platforms (SAP BW/4HANA, BigQuery and Qlik Sense).

Your contribution: You are the key to providing comprehensive, competent and responsible advice and support to our specialist departments in designing their analytics solutions using innovative web technologies. You will also push forward the digital transformation into a data-driven company.

Your environment: You will work in an international and agile environment and help build and develop the growing Analytics team.

Your freedom: With your smart ideas and creative approaches to solutions, you can support the specialist departments and ensure optimized working conditions.

Your tech stack: A modern SAP system landscape including BW/4HANA, cloud technologies including BigQuery, and the Qlik Sense Analytics Platform.

Ideally, you also bring the following with you:
- Your education: A successfully completed degree in a STEM field, or an apprenticeship with high IT content, forms the basis of your profile.
- Your experience: At least 3-4 years of experience in designing, developing and implementing BI solutions. Additionally, you are already familiar with analytics tools (preferably Qlik).
- Your skills: An analytically oriented mindset sets you apart and you enjoy solving analytical problems. Good SQL knowledge at application level and ideally first experience with Python would be great. You speak English at least at B2 level.

What you need to do now: If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. Hays Poland sp. z o.o. is an employment agency registered in a registry kept by the Marshal of the Mazowieckie Voivodeship under number 361.
Senior Data Platform Engineer (Azure)
Cyclad Sp. z o.o., Bydgoszcz, kujawsko-pomorskie
Project scope:
- Build, test, deploy and automate data components as part of the NNDAP product
- Work with technologies such as Azure, Terraform, Terratest (Go), Azure Data Factory, Databricks
- Understand the needs of different customers and translate them into modular solutions
- Design solutions to optimize the data platform architecture together with other team members
- Teamwork, pair programming, and active participation in the inner-source NNDAP community
- Support the team in data and artificial intelligence projects; assist junior engineers

Requirements:
- More than 5 years of experience in relevant software, data and/or platform engineering
- Min. 3 years of commercial experience with at least one Azure cloud service (e.g. ADF, Data Lake, Delta Lake, Databricks, Key Vault, BigQuery, Data Pipeline, etc.)
- Experience with Data as Code: version control, small and regular approvals, unit testing, CI/CD, packaging, branching, containerization, etc.
- Programming skills in Python/Poetry/PySpark, SQL or Go
- Experience in open-source/inner-source projects; experience or interest in Domain-Driven Design is welcome

We offer:
- A unique opportunity to join an international team and lead innovation projects
- Private medical care with dental care (covering 70% of costs) + rehabilitation package; family package option possible
- Multisport card (also for an accompanying person)
- Life insurance
Senior Big Data Engineer
Hays, Kraków
Your new company: For our client, a company that provides SaaS products related to sales, customer support, and other customer communications, we are looking for a Senior Big Data Engineer.

Your new role: Enterprise Data & Analytics (EDA) is looking for an experienced Senior Data Engineer to join a growing analytics engineering team. Our client is a globally distributed, remote-first team. You'll work in a collaborative Agile environment using the latest engineering best practices, with involvement in all aspects of the software development lifecycle. You will craft and develop curated data products, applying standard architectural and data modelling practices. You will primarily be developing Data Warehouse solutions in BigQuery/Snowflake using technologies such as dbt, Airflow and Terraform.

What you get to do every single day:
- Collaborate with team members and business partners to collect business requirements, define successful analytics outcomes and design data models
- Use and mentor on best engineering practices such as version control, CI/CD, code review and pair programming
- Design, build, and maintain ELT pipelines in the Enterprise Data Warehouse to ensure reliable business reporting
- Design and build ELT-based data models using SQL and dbt
- Build analytics solutions that provide practical insights into customer 360, finance, product, sales and other key business domains
- Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery
- Work with data and analytics experts to strive for greater functionality in data systems

What you'll need to succeed:
- 5+ years of data/analytics engineering experience building and maintaining data pipelines and ETL processes in big data environments
- Production experience working with dbt and designing and implementing Data Warehouse solutions
- Good knowledge of modern as well as classic data modelling (Kimball, Inmon, etc.)
- Intermediate experience with any of the programming languages Python, Go, Java or Scala (we use primarily Python)
- Advanced working SQL knowledge and experience with cloud columnar databases (Google BigQuery, Amazon Redshift, Snowflake), query authoring (SQL), as well as working familiarity with a variety of databases
- Integration with third-party SaaS application APIs such as Salesforce, Zuora, etc.
- Familiarity with processes supporting data transformation, data structures, metadata, dependency and workload management
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement

Preferred qualifications:
- Extensive experience with BigQuery or similar cloud warehouses (Snowflake, Redshift)
- Demonstrated experience in one or many business domains
- 3+ completed projects with dbt
- Expert knowledge of Python

What you need to do now: If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. Hays Poland sp. z o.o. is an employment agency registered in a registry kept by the Marshal of the Mazowieckie Voivodeship under number 361.
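The Kimball-style dimensional modelling and ELT data models this listing asks for can be illustrated with a minimal star-schema sketch. This is a toy under stated assumptions: table and column names are hypothetical, and stdlib sqlite3 stands in for the BigQuery/Snowflake plus dbt stack named above.

```python
import sqlite3

# Minimal star schema: one dimension table and one fact table, queried
# the way a curated dbt model on a cloud warehouse typically would be.
# All table and column names here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name TEXT,
        region        TEXT
    );
    CREATE TABLE fact_orders (
        order_key    INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        order_amount REAL
    );
    INSERT INTO dim_customer VALUES (1, 'Acme', 'EMEA'), (2, 'Globex', 'NA');
    INSERT INTO fact_orders VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")

# A typical curated "data product": revenue per region, facts joined to dims.
rows = conn.execute("""
    SELECT d.region, SUM(f.order_amount) AS revenue
    FROM fact_orders f
    JOIN dim_customer d USING (customer_key)
    GROUP BY d.region
    ORDER BY d.region
""").fetchall()
print(rows)  # → [('EMEA', 150.0), ('NA', 75.0)]
```

In a real dbt project, each of these tables would be its own SQL model with tests and documentation, and the join above would live in a downstream mart model.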
Senior Data Engineer (Azure)
GetInData | Part of Xebia, Warsaw, Puławska /
What you will do

About the role:
A Data Engineer's role involves the design, construction, and upkeep of the data architecture, tools, and procedures that enable an organization to collect, store, manipulate, and analyze large volumes of data. The position involves building data platforms on top of commonly provided infrastructure and establishing a streamlined path for the Analytics Engineers who rely on the system.

Responsibilities:
- Working together with Platform Engineers to assess and choose the most suitable technologies and tools for the project
- Developing and committing new functionalities and open-source tools
- Executing intricate data-intake procedures
- Implementing and enacting policies in line with the company's strategic plans regarding the technologies used, work organization, etc.
- Ensuring compliance with industry standards and regulations on security and data privacy in the data processing layer
- Conducting training and knowledge sharing

What we offer:
- Salary: 160-200 PLN net + VAT/h B2B (depending on knowledge and experience)
- 100% remote work and flexible working hours
- Possibility to work from the office located in the heart of Warsaw
- Opportunity to learn and develop with the best Big Data experts
- International projects
- Possibility of conducting workshops and training
- Certifications
- Co-financed sports card and health care
- All equipment needed for work

What we expect:
- Proficiency in a programming language such as Python, Scala or Java
- Knowledge of lakehouse platforms: Databricks
- Experience working with messaging systems: Kafka
- Familiarity with version control systems, particularly Git
- Experience as a programmer and knowledge of software engineering principles, practices, and solutions
- Extensive experience in Microsoft Azure
- Knowledge of at least one orchestration and scheduling tool, for example Airflow, Azure Data Factory, Prefect, Dagster
- Familiarity with DevOps practices and tools, including Docker, Terraform, CI/CD, Azure DevOps

How we work:
- Equipment: laptop, additional monitor; Windows, Linux or OS X

Benefits and amenities:
- Cold and hot beverages, fruits, snacks
- Integration events
- Chill room
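The orchestration and scheduling tools listed above (Airflow, Azure Data Factory, Prefect, Dagster) all share one core idea: a pipeline is a directed acyclic graph of tasks executed in dependency order. A minimal, dependency-free sketch of that idea using only the Python standard library; the task names are hypothetical, not from any particular tool:

```python
from graphlib import TopologicalSorter

# A hypothetical pipeline: each task maps to the set of upstream tasks
# it depends on, the way an Airflow or Dagster DAG declares dependencies.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

# The scheduler's job, reduced to its essence: run tasks in an order
# where every task starts only after its dependencies have finished.
run_order = list(TopologicalSorter(dag).static_order())
print(run_order)  # → ['extract', 'transform', 'quality_check', 'load']
```

Real orchestrators add retries, scheduling, backfills and parallel execution of independent branches on top of exactly this topological ordering.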
Data Engineer | Cloud&Engineering
Deloitte, Gdańsk, Katowice, Kraków, Łódź, Poznań, Warszawa, ...
Who we are looking for:
For our fast-growing Cloud Engineering practice in Central Europe we are looking for a Data Engineer whose primary goals are conducting R&D tasks, designing data models, preparing solution architecture, and implementing data models and data flow pipelines (batch & streaming) in the Azure cloud, as part of larger transformation programs for a large client from the insurance industry. We are looking for an enthusiastic person with hands-on experience in data projects in the Azure cloud space. The project scope is to implement a modern data platform based on the Data Mesh concept. The whole implementation is in Azure, with streaming data processing in Kafka and Azure Databricks.

Qualifications:
- At least 3-5 years of experience in a similar position
- At least 3-5 years of experience in on-premise BI/Big Data solutions, or 3 years of experience working in the Azure cloud environment dealing with data analytics
- Knowledge of SQL (any dialect: T-SQL, PL/SQL, ...)
- Knowledge of Scala or Python
- Advanced understanding of most of the listed concepts from the field of relational databases, data warehouses, analytical databases (OLTP/OLAP), Big Data, and data processing and transformation (ETL/ELT)
- Experience with some Azure services, such as Azure SQL Database, Azure Databricks, Azure Synapse Analytics, Azure Data Factory, Azure Data Lake, Azure Stream Analytics, Azure Analysis Services, Power BI
- Problem-solving skills and analytical thinking
- Knowledge of English in speaking and writing
- Willingness to develop in the field of public cloud

Moreover, the following will be an advantage:
- Knowledge of German in speaking and writing
- Cloud certifications in the data area in Azure: Azure Fundamentals, Azure Data Engineer Associate
- Experience with data streaming (Databricks Structured Streaming, Kafka)
- Understanding of the Data Mesh concept

Your future role:
Here's a sample of what you can experience:
- Proposing solution architecture based on cloud solutions
- Developing data-related solutions (i.e. Big Data, Modern Data Warehouse, Data Warehouse Modernization, Data Analytics) based on the cloud environment (Azure, GCP, AWS)
- Streaming data processing
- Developing solutions in selected specializations in the area of data transformation (ETL/ELT), advanced analytics (OLAP) and reporting
- Developing Data Governance solutions (Data Quality, Master Data Management, Metadata Management, etc.)
- Supporting the process of implementing changes into the existing software
- Participating in the planning and task-estimation process
- Cooperating and communicating with an external client with respect to analysis and performance of indicated tasks
- Desire for intensive development in the field of cloud technologies, with a focus on data analytics

What we offer:
- Working from the office or home office, whichever you prefer
- Flexible working hours
- Permanent employment or contract
- Medical and health insurance
- Multisport and other lifestyle benefits
- Language courses
- A development path to fit your needs
- Friendly coworkers & team spirit
- A wide variety of projects, multiple geographies and clients, and work for well-known brands
- Exposure to trailblazing business and technology projects
- A place in the first line of a digital transformation
- Everyday opportunities to influence how and where we do our business

Selection process:
The recruitment process is easy; all you have to do is send us your CV, then go through an HR interview (30 min), a technical interview (60 min) and a final interview (45 min).

About Deloitte:
Deloitte is a variety of people, experiences, industries and services we deliver in 150 countries of the world. It is an intellectual challenge, a good starting point for your career, and an excellent opportunity for continuous development and gaining valuable life experiences. All you have to do is take the first step: press the apply button and send us your CV, go through all the stages of the recruitment process and sign a contract with us. Deloitte is simply your best choice.

About the team:
Our Cloud Engineering teams design and deliver interesting cloud projects for clients in Poland and abroad in the areas of cloud development, DevOps, integration, migration, data management, infrastructure and others. We help our clients strategize, design, implement and migrate solutions using modern cloud technologies. Cloud Engineering (deloitte.com)
Cloud Solution Architect Leader
PwC, Wrocław, dolnośląskie
PwC is a powerful network of over 250,000 people across 158 countries, all committed to delivering quality in Assurance, Tax, Advisory & Technology services. Match your curiosity with continuous opportunities to learn, grow and make an impact. Join PwC and be a game changer.

A career in Information Technology, within Internal Firm Services, will provide you with the opportunity to support our core business functions by deploying applications that enable our people to work more efficiently and deliver the highest levels of service to our clients. Our Information Technology Generalist - Practice Support team focuses on managing the design and implementation of technology infrastructure within PwC, developing and enhancing internal applications, and providing technology tools that help create a competitive advantage for PwC to drive strategic business growth.

We are looking for: Cloud Solution Architect Leader

Your future role:
- Stakeholder engagement: collaborate with internal and external stakeholders to align cloud and container strategies with business objectives
- Develop and enforce comprehensive cloud policies, standards, and guidelines
- Articulate technical concepts to diverse audiences and drive adoption of cloud best practices across the organization
- Troubleshoot and optimize complex cloud infrastructures for performance and cost efficiency
- Design and implement robust cloud networking and security architectures to support a multi-tenant developer platform, ensuring compliance with industry standards and organizational policies
- Design and implement scalable, secure, and reliable cloud solutions on Azure, Amazon Web Services (AWS), and Google Cloud Platform (GCP)
- Design and manage cloud infrastructure, including virtual machines, containers, and serverless computing
- Create and manage complex networking and security configurations, including firewalls, load balancers, and VPCs
- Apply Infrastructure as Code (IaC) principles, cloud-native architectures, and best practices for implementing self-service cloud platforms
- Architecture alignment: map long-term business requirements to architecture frameworks such as TOGAF, SABSA, Zachman, etc., ensuring alignment with organizational goals
- Container expertise: lead the development and implementation of strategies for containerized environments, ensuring robust protection for container orchestration tools and services

Apply, if you have:
- 10 years in progressive professional roles involving information security and/or IT management
- A Bachelor's or Master's degree in Information Technology, Cybersecurity, or a related field
- Professional certifications such as CISSP, CISM, CCSP, AWS/GCP/Azure Certified Security or equivalent
- In-depth expertise in cloud, container, and infrastructure
- Proficiency in designing and managing data pipelines on GCP, using services like BigQuery, Cloud Dataflow, GKE and Cloud Pub/Sub
- Expertise in AWS services and tools essential for building and managing IDPs, such as AWS CloudFormation, AWS CDK (Cloud Development Kit), AWS Service Catalog, Control Tower, Lambda, CodePipeline, Step Functions, SQS/SNS, S3, etc.
- A strong understanding of Azure core services, including Compute (VMs, Containers), Storage (Blobs, Disks), Networking (Virtual Networks, Load Balancers), and Identity (Azure AD)
- Intimate knowledge and/or a proven record of success in architecture as applied in the support of and integration with key business and strategic priorities, including:
  - contributing new intellectual capital through deep specialisation in a subject matter area or technical domain within architecture
  - translating pillar strategy by leading or managing others and performing work with significant independence
  - influencing both internally and externally through building and leading a large team or complex project, or multiple teams or projects, within architecture
  - managing efforts within the architecture space
  - building and maintaining large-scale, multi-part programs while supervising teams to execute against the overall strategy
  - collaborating with multiple stakeholders across functional and technical skill sets

By joining us you gain:
- Work flexibility: hybrid working model, flexible start of the day, workation, sabbatical leave
- Development and upskilling: our full support during the onboarding process, mentoring from experienced colleagues, training sessions, workshops, certification co-financed by PwC and conversations with a native speaker
- A wide medical and wellbeing program: medical care package (incl. dental care, freedom of treatment, physiotherapy), coaching, mindfulness, psychological support, education through dedicated webinars and workshops, financial and legal counseling
- The possibility to create your individual benefits package (e.g. lunch pass, insurance packages, concierge, veterinary package for a pet, massages) and access to a cafeteria: vouchers, discounts on IT equipment and car purchase
- 3 paid hours for volunteering per month
- An additional paid birthday day off
- And when you start enjoying PwC as much as we do, you may recommend your friend to work with us

Recruitment process:
In the first step of the recruitment process, you will have the opportunity to talk to our Recruiter on a short HR screening call. During the next stages, you will have the opportunity to meet other people from the team, including your future Manager - check us out on LinkedIn and see what we have to say! With any queries please contact [email protected] with the job title in the subject.
Cloud Data Architect
Michael Page Poland, Mazowieckie
- Designing data architecture
- Experience with Azure, AWS, dbt and Databricks

About our client:
Our Client is a company specializing in Data Science & AI solutions.

Job description:
- Developing a cloud data strategy
- Designing data architecture in Microsoft Azure
- Leading data integration and ETL efforts
- Using Azure, dbt and Databricks for various analytics solutions
- Maintaining data security and compliance standards in cloud environments
- Optimizing solutions for cost efficiency
- Creating comprehensive documentation of data architecture and solutions

Candidate profile:
- 10+ years of professional experience in the data field
- 2+ years of experience as an architect or senior consultant, with an emphasis on AWS solutions
- A degree in computer science or a related field
- Experience with Azure and Databricks
- Knowledge of Python, PySpark and SQL
- Solid knowledge of cloud data services, data warehouses, Big Data technologies and ETL processes
- Understanding of data architecture, application design, systems engineering and integration
- Openness to working with many different technologies
- Familiarity with data visualization tools (e.g. Power BI, Tableau) is welcome
- Very good command of English (spoken and written): C1 level

We offer:
- The opportunity to work in an environment using the latest technologies on the market
- Hybrid (Warsaw) or fully remote work
- Flexibility in working hours and preferred form of contract
- Work in an international environment
- Unlimited access to the Udemy learning platform
- Certification programs and talent development programs
Cloud Data Architect
P&P Solutions Sp. z o.o., Warszawa
About the project:
Join the PnP Solutions team on a project for our Client. We are currently looking for a Cloud Data Architect who will play a key role in a project involving cloud data processing.

Your responsibilities:
- Supporting the client in making key strategic technology decisions in the data area
- R&D work analyzing the functionality and usefulness of new technologies and tools for the client's business solutions
- Building PoCs in the data area to present the results of the R&D work
- Designing and building the entire data processing platform, taking into account all of its parts and its links to other solutions (BI and ML) within the cloud ecosystem
- End-to-end management of the Big Data solution delivery process: from business requirements analysis, through technical architecture design, testing and deployment, to data flow documentation
- Optimizing end-to-end data storage and analytics solutions and systems
- Coordinating the work of the engineers involved in building the solution

Our requirements:
- Min. 5 years of experience in IT, including at least 3.5 years working with data in the cloud
- Very good knowledge of the Data services in Azure, in particular Azure Data Factory, Azure Data Lake, Azure SQL Server, Azure Synapse Analytics and Power BI
- Knowledge of the Databricks platform, with particular emphasis on the latest features such as Delta Live Tables and Unity Catalog
- Knowledge of tools and mechanisms such as CI/CD, Azure DevOps and Git
- The ability to create and modify data models and to choose appropriate models for the problem at hand (including DWH, Data Marts, Data Lake, Delta Lake, Data Lakehouse)
- Knowledge of the Lambda and Kappa architectures and the ability to build data analytics solutions
- Understanding and practical knowledge of mechanisms for secure data storage and processing in the cloud
- Advanced English, minimum C1
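The Lambda architecture named in the requirements pairs a batch layer (complete views recomputed periodically over all historical data) with a speed layer (incremental views over events that arrived since the last batch run); the serving layer merges both at query time. A schematic, stdlib-only sketch of that merge; the event names and view shapes are hypothetical:

```python
from collections import Counter

# Batch layer: a precomputed view over all historical events,
# rebuilt from scratch on each batch run.
historical_events = ["click", "view", "click", "purchase"]
batch_view = Counter(historical_events)

# Speed layer: incrementally folds in events that arrived after
# the last batch run, trading completeness for low latency.
recent_events = ["click", "view"]
realtime_view = Counter(recent_events)

# Serving layer: a query merges the batch and real-time views.
def query(event_type: str) -> int:
    return batch_view[event_type] + realtime_view[event_type]

print(query("click"))  # → 3 (2 historical + 1 recent)
```

A Kappa architecture, by contrast, drops the batch layer entirely and treats everything as one replayable stream; which of the two fits depends on reprocessing needs and latency budgets.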