
Overview: salary statistics for the position "Enterprise Data Architect" in Kraków

Unfortunately, there are no statistics for this query. Try changing the position or region.

Recommended job offers

Senior Big Data Engineer
Hays, Kraków, małopolskie (ref. no. 1184757)
Your new company
For our client, a company that provides SaaS products related to sales, customer support, and other customer communications, we are looking for a Senior Big Data Engineer.

Your new role
Enterprise Data & Analytics (EDA) is looking for an experienced Senior Data Engineer to join a growing analytics engineering team. Our client is a globally distributed, remote-first team. You'll work in a collaborative Agile environment using the latest engineering best practices, with involvement in all aspects of the software development lifecycle. You will craft and develop curated data products, applying standard architectural and data modelling practices. You will primarily be developing Data Warehouse solutions in BigQuery/Snowflake using technologies such as dbt, Airflow, and Terraform.

What you get to do every single day:
- Collaborate with team members and business partners to collect business requirements, define successful analytics outcomes, and design data models
- Use and mentor on best engineering practices such as version control, CI/CD, code review, and pair programming
- Design, build, and maintain ELT pipelines in the Enterprise Data Warehouse to ensure reliable business reporting
- Design and build ELT-based data models using SQL and dbt
- Build analytics solutions that provide practical insights into customer 360, finance, product, sales, and other key business domains
- Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery
- Work with data and analytics experts to strive for greater functionality in data systems

What you'll need to succeed:
- 5+ years of data/analytics engineering experience building and maintaining data pipelines and ETL processes in big data environments
- Experience designing and implementing Data Warehouse solutions
- Good knowledge of modern as well as classic data modelling (Kimball, Inmon, etc.)
- Intermediate experience with any of the programming languages Python, Go, Java, or Scala (we use primarily Python)
- Advanced working SQL knowledge and experience with cloud columnar databases (Google BigQuery, Amazon Redshift, Snowflake), query authoring, and working familiarity with a variety of databases
- Integration with third-party SaaS application APIs such as Salesforce, Zuora, etc.
- Familiarity with processes supporting data transformation, data structures, metadata, dependency, and workload management
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement

Preferred qualifications:
- Extensive experience with BigQuery or similar cloud warehouses (Snowflake, Redshift)
- Demonstrated experience in one or many business domains
- Completed projects with dbt
- Expert knowledge of Python

What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. Hays Poland sp. z o.o. is an employment agency registered in a registry kept by the Marshal of the Mazowieckie Voivodeship under number 361.
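The role above centres on ELT-style, Kimball-flavoured data modelling: raw data is loaded into the warehouse first and then transformed in place into a star schema. A minimal sketch of that pattern, using only the standard-library sqlite3 module as a stand-in for BigQuery/Snowflake and dbt; all table and column names here are invented for illustration, not taken from the posting:

```python
import sqlite3

# Illustrative ELT sketch: staging rows are loaded as-is, then transformed
# inside the warehouse into a small star schema (one dimension, one fact).
# In the role described this would be dbt models in BigQuery/Snowflake;
# sqlite3 stands in so the example is self-contained. All names are invented.

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# "Extract/Load": raw staging table, unmodelled.
cur.execute("CREATE TABLE stg_orders (order_id, customer_name, country, amount)")
cur.executemany(
    "INSERT INTO stg_orders VALUES (?, ?, ?, ?)",
    [(1, "Acme", "PL", 100.0), (2, "Acme", "PL", 50.0), (3, "Beta", "DE", 75.0)],
)

# "Transform": conformed customer dimension with a surrogate key...
cur.execute("""
    CREATE TABLE dim_customer AS
    SELECT ROW_NUMBER() OVER (ORDER BY customer_name) AS customer_key,
           customer_name, country
    FROM (SELECT DISTINCT customer_name, country FROM stg_orders)
""")
# ...and a fact table keyed to it (star schema).
cur.execute("""
    CREATE TABLE fct_orders AS
    SELECT s.order_id, d.customer_key, s.amount
    FROM stg_orders s
    JOIN dim_customer d USING (customer_name)
""")

# Curated reporting query over the modelled tables.
report = cur.execute("""
    SELECT d.customer_name, SUM(f.amount) AS revenue
    FROM fct_orders f JOIN dim_customer d USING (customer_key)
    GROUP BY d.customer_name ORDER BY revenue DESC
""").fetchall()
print(report)  # [('Acme', 150.0), ('Beta', 75.0)]
```

In a real dbt project each `CREATE TABLE ... AS SELECT` would be its own versioned SQL model, with dbt managing the dependency order between staging, dimension, and fact models.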
Lead Data Designer
Gi Group, Kraków, małopolskie
Responsibilities:
- Managing and mentoring more junior design resources
- Working with the Data Solutions Design manager to identify key areas of focus, resolve key issues, and liaise with Solutions and Enterprise Architects as needed
- Working with systems and business SMEs to gather requirements
- Working with the Solutions Architect, source-system SMEs, and business SMEs to understand data structures and usage, including data refresh cadence, any cleansing/transformation logic, volume of history retention, etc.
- Gathering business requirements, enhancing them with technical details, and creating the Source-to-Target mapping documentation used by Data Engineers for implementation
- Documenting data quality rules to be incorporated as part of the build process, with the expected behaviour of the product for the various outputs of those rules
- Working closely with Solution Architects, the Data Architect, and Data Engineers to define data harmonisation rules and the best modelling approach to satisfy business needs
- Working with the PO to create user stories with an agreed definition of done

Requirements:
- Ideally around 4-5 years of previous data experience, including data change programmes or data platform build initiatives in relevant industries
- Experience in leading small data design teams
- Demonstrable experience and expertise in handling large, complex data sets
- SQL expertise and experience with other modern data tools
- Expertise in Kimball and other data modelling techniques
- A results-oriented mindset with an eye for detail and a passion for data

Benefits:
- Private medical healthcare at LUXMED (including dental care) for you and your family
- Medicover sports card (Fit&More package)
- Life insurance financed by the employer
- 30-minute lunch break included in the 8-hour working day
- Work in a highly professional and stimulating atmosphere
- Training & Buddy programme that will allow you to quickly adapt to your new role
- Wellbeing programme for employees
- Co-financing of monthly public transport tickets in Kraków
- Comfortable working environment in the office and the possibility of home office
- Language courses, accounting courses, access to LinkedIn Learning, and the possibility of co-financed studies and certification
- Employee referral programme
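Two of the design artefacts named above, Source-to-Target mapping documentation and documented data quality rules, can be made machine-readable so that Data Engineers consume them directly. A minimal sketch of that idea; every system name, field name, and rule here is invented for illustration:

```python
# Hypothetical sketch of a source-to-target mapping with attached data
# quality rules -- the kind of design artefact the role above hands to
# Data Engineers. All system, field, and rule names are invented.

MAPPING = [
    # source field -> target field, plus a transform and a DQ rule
    {"source": "crm.cust_nm",  "target": "dim_customer.name",
     "transform": str.strip,   "dq_rule": lambda v: bool(v)},
    {"source": "crm.cntry_cd", "target": "dim_customer.country",
     "transform": str.upper,   "dq_rule": lambda v: len(v) == 2},
]

def apply_mapping(source_row: dict) -> tuple[dict, list[str]]:
    """Apply each mapping entry; collect DQ violations instead of failing."""
    target_row, violations = {}, []
    for m in MAPPING:
        value = m["transform"](source_row[m["source"]])
        if not m["dq_rule"](value):
            violations.append(f"{m['target']}: rule failed for {value!r}")
        target_row[m["target"]] = value
    return target_row, violations

row, issues = apply_mapping({"crm.cust_nm": "  Acme  ", "crm.cntry_cd": "pl"})
print(row)     # {'dim_customer.name': 'Acme', 'dim_customer.country': 'PL'}
print(issues)  # [] -- both rules pass
```

Keeping the mapping as data rather than prose means the same document drives both the build and the data quality checks, which is one way to satisfy the "expected behaviour for the various outputs of those rules" requirement.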
Spark Developer with Scala
L.M. GROUP POLAND Sp. z o.o., Kraków, małopolskie
Tasks:
- Fully utilize the potential of the Spark cluster
- Cleanse, transform, and analyze vast amounts of raw data from various systems using Spark to deliver ready-to-use data
- Continuously introduce integrations and improvements
- Develop architecture, standards, and guidelines for a globally deployed software solution
- Support product management by steering the product roadmap
- Collaborate with the development team lead, business analysts, and testers to deliver IT solutions
- Ensure the delivery of a robust and sustainable solution that meets business requirements while considering user experience and ensuring compliance with organizational guidelines, principles, standards, and processes

Requirements:
- Experience working on the Azure cloud platform
- Ability to use and improve Shell and Perl scripts
- Experience with Apache Spark 2.x: the RDD API, the Spark SQL DataFrame API, and the Streaming API
- Scala
- Query tuning and performance optimization in Spark
- Integration with SQL databases: Oracle, Postgres, and/or MySQL
- Experience working with NoSQL and Databricks
- Designing, planning, and delivering solutions in an enterprise-scale environment
- Experience in information technology within the financial services sector, preferably in Governance, Risk, and Compliance (GRC)

Our client offers:
- A solid, flexible benefits package that can be tailored to individual needs, including a Multisport card, shop vouchers, and much more (MyBenefit cafeteria)
- Premium medical insurance for employees and their family members (Luxmed)
- Life and disability insurance for employees and their family members (Generali)
- A profitable Voluntary Pension Fund
- Benefits from the social fund: holiday bonuses, daycare subsidies, etc.
- Integration and cultural events for employees
- An awards and recognition program for outstanding employees
- Referral bonuses for recommending employees
- Relocation assistance: accommodation, travel, and other expenses covered
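The cleanse → transform → analyze flow described in the tasks above can be sketched in plain Python; in the role itself this logic would run on a Spark cluster via the DataFrame or RDD API, so the comments note the rough Spark equivalent of each step. All record fields are invented for the example:

```python
from functools import reduce

# Plain-Python sketch of the cleanse -> transform -> analyze pipeline listed
# above. In the actual role this would be the Spark DataFrame (or RDD) API on
# a cluster; here ordinary lists stand in for distributed data, and all
# record fields are invented.

raw = [
    {"system": "A", "amount": "100.5", "status": "ok"},
    {"system": "B", "amount": None,    "status": "ok"},     # dirty record
    {"system": "A", "amount": "20.0",  "status": "error"},  # filtered out
    {"system": "B", "amount": "7.5",   "status": "ok"},
]

# Cleanse: drop records with missing amounts or a non-ok status
# (in Spark: df.filter(...)).
clean = [r for r in raw if r["amount"] is not None and r["status"] == "ok"]

# Transform: parse amounts into floats
# (in Spark: df.withColumn with a cast).
typed = [{**r, "amount": float(r["amount"])} for r in clean]

# Analyze: total amount per source system
# (in Spark: df.groupBy("system").agg(sum("amount"))).
def add_to_group(acc, r):
    acc[r["system"]] = acc.get(r["system"], 0.0) + r["amount"]
    return acc

totals = reduce(add_to_group, typed, {})
print(totals)  # {'A': 100.5, 'B': 7.5}
```

The Spark versions of these steps are declarative and lazily evaluated, which is what makes the query tuning mentioned in the requirements (partitioning, shuffle avoidance, predicate pushdown) matter at scale.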
Spark Developer with Scala
L.M. Group Poland, Kraków, małopolskie
We are an international recruitment agency founded in 1987 in Israel, present in Poland since 2014. We specialize in recruiting for permanent and temporary positions. Our headquarters are located in Poznań, and we also have branches in Warsaw, Gdańsk, and Wrocław. Currently, we are looking for an Apache Spark and Scala specialist with advanced English language skills for one of our globally operating clients.
Reference number: JAOK
Place of work: Kraków