5 Jobs currently available

All jobs

  • Senior Full Stack (ReactJS+NodeJS) Developer

    Location Gdansk

    Godel is on the constant lookout for a talented Full Stack (ReactJS+NodeJS) Developer to join one of our teams engaged in the development of business applications for major British companies. We always have new and exciting projects with opportunities to improve your skills in a wide spectrum of technologies related to web development. If you are an enthusiastic and ambitious developer, willing to take an active part in shaping a product, this job is for you! Responsibilities:

    • Working closely with product owners and business analysis team to deliver the best possible customer experience that balances commercial need and technical constraints within the domain
    • Producing estimates on future development to support planning
    • Developing the user interface and the underlying backend layer
    • Ensuring development adheres to industry standards, maintaining code quality, testing and employing the best practices and software design patterns
    • Supporting and maintaining the products owned by the team
    • Producing documentation to support other team members
    If you are interested, you can also participate in the following activities: 
    • Talent Management
    • Talent Acquisition
    • Education
    • Innovations
    • Onboarding
    • Consultancy
    Ideally you have:
    • Experience with the latest versions of React and Node.js
    • Understanding of working in an Agile environment
    • Preferably 3+ years of solid development experience in a commercial environment
    • Spoken English at B1 level or higher
    Nice to have:
    • Understanding of CI/CD cycle, readiness to maintain pipeline configuration
    • Knowledge of various quality control tools for e2e and performance testing
    • Unit Testing (depending on the project: Jasmine/Mocha/Jest/React Testing Library)
    • E2E Testing (depending on the project Webdriver.io/Cypress)
    • Advanced git knowledge
    • Client side performance testing
    • Being self-motivated and able to work on your own initiative as well as part of a team
    • Excellent communication and analytical skills for day-to-day communication with native English speakers

  • DevOps Engineer

    Location Gdansk

    Godel is on the constant lookout for talented DevOps/Platform Engineers to join one of our teams engaged in the development of business applications for major British companies. We always have new and exciting projects with opportunities to improve your skills in a wide spectrum of technologies related to web development. If you are an enthusiastic and ambitious engineer, willing to take an active part in shaping the future of platform engineering, this job is for you! These are just some of the technologies that you might work with:

    • Amazon Web Services, Microsoft Azure, Google Cloud 
    • Terraform, Ansible 
    • Azure DevOps, GitLab, Jenkins, GitHub
    • Linux, Windows 
    • Docker and Kubernetes 
    • Python, Bash, PowerShell scripting 
    Responsibilities: 
    • Design, build and operate cloud environments for continuously evolving business solutions 
    • Automate provisioning, configuration and maintenance of infrastructure with code 
    • Design, automate and integrate CI/CD processes for continuous delivery 
    • Perform architecture and technical reviews of the infrastructure platform and evaluate compliance with security and quality standards
    • Design and implement architecture and migration approaches to the cloud  
    • Integrate and extend existing solutions with cloud services 
    If you are interested, you can also participate in the following activities: 
    • Talent Management 
    • Talent Acquisition 
    • Education 
    • Innovations 
    • Onboarding 
    • Consultancy 
    Ideally you have: 
    • 2+ years of experience with one or more of the following cloud providers: AWS, MS Azure, Google Cloud 
    • Solid experience with Terraform, Ansible and/or other automation tools 
    • Proficiency working in Linux and Windows environments 
    • Experience with CI/CD tools and delivery process automation 
    • Experience containerizing services and running them at scale
    • Experience setting up monitoring, alerting and logging solutions in production
    • Software development or system/network administration background 
    • Written and spoken English: intermediate or higher 
    Nice to have: 
    • AWS/Azure/Google cloud architect or developer certification 
    • Experience setting up and maintaining scalable operations processes in production
    • Technical and process audit experience 
    • Technical leadership experience 
    • Mentorship and people development experience 
     

  • Middle / Senior Data Engineer (Azure)

    Location Gdansk

    Godel is on the constant lookout for talented Data Engineers to join one of our teams engaged in the development of business applications for major British companies. We always have new and exciting projects with opportunities to improve your skills in a wide spectrum of technologies related to business intelligence. If you are an enthusiastic and ambitious developer, willing to take an active part in shaping a product, this job is for you! These are just some of the technologies that you might work with at Godel Data Division:

    • Data Integration: Azure Data Factory (with SSIS) 
    • Data Processing: Azure Databricks 
    • Database & Data Warehouse:  Azure SQL Database, Azure Synapse, SQL Server  
    • Storage: Azure Blob, Azure Data Lake Storage 
    You will be responsible for: 
    • Creating functional design specifications and Azure reference architectures, and assisting with other project deliverables as needed.
    • Designing and developing Platform as a Service (PaaS) solutions using different Azure services.
    • Creating a data factory, orchestrating data processing activities in a data-driven workflow, monitoring and managing the data factory, and moving, transforming and analyzing data.
    • Designing complex enterprise data solutions that utilize Azure Data Factory, and creating migration plans to move legacy SSIS packages into Azure Data Factory.
    • Building conceptual and logical data models.
    • Designing, building and scaling data pipelines across a variety of source systems and streams (internal, third-party and cloud-based), distributed/elastic environments, and downstream applications and/or self-service solutions.
    • Developing and documenting mechanisms for deployment, monitoring and maintenance.
    • Identifying performance bottlenecks and accessing external data sources.
    • Implementing security requirements.
    • Monitoring and managing the deployed solutions.
    Ideally you have: 
    • Extensive knowledge of data architecture principles (e.g., data lake, data warehousing, etc.). 
    • Strong knowledge of relational databases, as well as skills in SQL query design and development. 
    • Experience working with data in Python.
    • Conceptual understanding of cloud architectures, service tiers and hybrid deployment models. 
    • Ability to independently troubleshoot and performance tune large scale enterprise systems. 
    • Expertise with ETL/ELT design patterns, and DataMart structures (star, snowflake schemas, etc.). 
    • Experience with Microsoft Cloud Data Platform: Azure Data Lake, Azure Blob Storage, Azure Data Factory, Azure Cosmos DB, Azure Databricks, Synapse Analytics. 
    • Experience in Azure Data Factory (ADF) creating multiple pipelines and activities using Azure for full and incremental data loads into Azure Data Lake Store and Azure SQL DW. 
    • Hands-on experience with data migration methodologies and processes to transform on-premises data to cloud using tools like Azure Data Factory, Data Migration Service, SSIS, etc. 
    • Ability to communicate clearly and be a team player.
    • Intermediate level of English. 
    Nice to have: 
    • Hands-on programming experience in additional technologies such as Python, Java, Scala, C#, Spark, Databricks or PowerShell is a plus.
    • Experience working with developer tools such as Azure DevOps and GitLab.
    • Power BI reporting tool experience. 
    • Understanding of big data tools such as Hadoop. 
    • Experience managing Microsoft Azure environments with VMs, VNets, subnets, NSGs and resource groups.
    • Azure certification is desirable.

  • Lead Data Engineer (AWS)

    Location Gdansk

    Godel is on the constant lookout for a talented Lead Data Engineer to join one of our teams engaged in the development of business applications for major British companies. We always have new and exciting projects with opportunities to improve your skills in a wide spectrum of technologies related to web development. If you are an enthusiastic and ambitious developer, willing to take an active part in shaping a product, this job is for you! These are just some of the technologies that you might work with at Godel Data Division:

    • Data Integration: Apache Airflow (MWAA)
    • Data Processing: Apache Kafka (MSK), Apache Spark (EMR), Amazon Kinesis, AWS Batch
    • Database & Data Warehouse: Amazon DynamoDB, Amazon Redshift, Amazon Athena
    • Storage: Amazon S3, AWS Glue
    Responsibilities:
    • Eliciting technical requirements
    • Providing technical leadership for the customer
    • Consulting and collaborating with team members
    • Designing and developing a data platform (data lake) based on AWS cloud services
    • Developing batch & stream data processing solutions, ETL processes and automated workflows
    • Testing data pipelines and data quality using modern approaches
    • Implementing the industry’s best practices for collecting, transforming and accessing data
    • Reviewing code
    • Communicating with the customer and working directly with the customer’s team
    • Working within an agile team
    If you are interested, you can also participate in the following activities:
    • Talent Management
    • Talent Acquisition
    • Education
    • Innovations
    • Onboarding
    • Consultancy
    Ideally you have:
    • Strong knowledge of AWS Cloud platform and desire to work with the following services: Amazon S3, EC2, ECR, AWS Batch, Amazon DynamoDB, Amazon Athena, Amazon Redshift
    • Deep experience working with data pipelines: ETL/ELT, Data Quality and Data Management, preferably with technologies such as Apache Airflow, Apache Kafka and Apache Spark
    • Good knowledge of at least one of the following programming languages: Python, Java
    • Excellent communication and negotiation skills
    Nice to have:
    • Exposure to operating business-critical data products
    • CI/CD: Git, Terraform, Jenkins, test-driven approach (TDD/BDD)
    • Metrics and Notifications: Prometheus, Grafana
    • Enthusiasm for agile and lean development

  • Middle / Senior Data Engineer (AWS)

    Location Gdansk

    Godel is on the constant lookout for a talented Data Engineer to join one of our teams engaged in the development of business applications for major British companies. We always have new and exciting projects with opportunities to improve your skills in a wide spectrum of technologies related to Big Data development. If you are an enthusiastic and ambitious developer, willing to take an active part in shaping a product, this job is for you! You will be responsible for:

    • Design and development of data platform based on Big Data cloud services from AWS
    • Writing batch & stream data processing solutions, ETL processes, automated workflows
    • Testing data pipelines and data quality using modern approaches
    • Implementing the industry’s best practices for collecting, transforming and accessing data
    • Communicating with the customer and working directly with the customer’s team
    • Working within an agile team
    Ideally you have:
    • Understanding of Cloud platforms such as AWS and desire to work with the following services: Amazon S3, EC2, ECR, AWS Batch, Amazon DynamoDB, Amazon Athena, Amazon Redshift
    • Experience working with data pipelines: ETL/ELT, Data Quality and Data Management, preferably with technologies such as Apache Airflow, Apache Kafka and Apache Spark
    • Knowledge of at least one of the following programming languages: Python, Java
    • Intermediate level of English
    Nice to have:
    • Exposure to operating business-critical data products
    • CI/CD: Git, Terraform, Jenkins, Test driven approach (TDD/BDD)
    • Metrics and Notifications: Prometheus, Grafana
    • Knowledge of Scala is advantageous
    • Enthusiasm for agile and lean development

