About Incubeta

Incubeta is an international team of experts in marketing, technology, data and creative, who came together because we want businesses to make the most of the opportunities the digital landscape provides. We operate in 22 locations with an incredible team of 900 around the world.

Main objective of the role

Your challenge will be to design, implement and manage scalable, efficient data solutions in cloud environments. You will also build and maintain optimized data architectures, ensuring the availability, security and efficiency of cloud-based data storage and processing systems.

How you will apply your experience and knowledge

  • Design, develop and maintain cloud data architectures using services from platforms such as AWS, Google Cloud Platform or Microsoft Azure.
  • Create and manage data pipelines for extraction, transformation and loading (ETL) from various sources to cloud storage systems.
  • Collaborate with development teams to identify data requirements and ensure effective integration of data flows into applications.
  • Implement security and compliance solutions to protect sensitive data stored and processed in the cloud.
  • Optimize the performance of data solutions by monitoring and tuning ETL, query and cloud storage processes.
  • Collaborate with analytics teams and data scientists to provide access to required data and ensure the availability of relevant data sets.
  • Stay current with the latest trends and advancements in cloud technologies and data analytics to propose continuous improvements to existing solutions.
  • Document processes, procedures and data architectures to facilitate understanding and collaboration among team members.

What skills you need

  • Strong knowledge of cloud architectures, with experience in platforms such as AWS, GCP or Azure.
  • One of the following three GCP certifications:
    • Professional Data Engineer
    • Professional Machine Learning Engineer
    • Professional Cloud Architect
  • Experience in designing and developing ETL pipelines for data processing.
  • Knowledge of programming languages such as Python, Java or Scala, as well as query and data processing tools such as SQL and Apache Spark.
  • Proficiency in English.
  • Familiarity with security and compliance practices in sensitive-data environments.
  • Ability to collaborate effectively with cross-functional teams and communicate clearly and concisely.
  • Analytical and problem-solving skills to address technical challenges and optimize the performance of data solutions.
  • Ability to stay current in a constantly evolving technology environment.