

Req ID:  17484

Madrid Goya (ES02), Spain

Department:  Center Of Excellence (50007082)
Job Family:  Information Technology

SICPA - Together with you we achieve excellence!



For our offices in Madrid (Spain), we are looking for:







This role leads the design and maintenance of Big Data applications, customizing them at the regional level, deploying them in customer environments, creating complex data flows, and providing technical support and mentoring for software developers and data analysts. The role interacts with internal customers such as business analysts, project managers, and coordinators, who gather requirements to define specifications and translate business needs into the company's data architecture, supporting the next generation of products and data initiatives.




  • Develop Big Data solutions to support customer strategies related to risk profiling and law enforcement.
  • Responsible for the integration and analysis of data from multiple sources. Manage development activities including technical analysis, effort estimations, definition of scope and objectives, and coordination with other technical teams.
  • Ensure compliance with customer requirements and alignment with business strategy.
  • Ensure compliance with policies related to data security and country-specific regulatory compliance procedures.
  • Create and maintain application test environments. Develop test plans and verify test results.
  • Identify new technical challenges and propose solutions, enhancements, and design improvements.
  • Assist in the installation, configuration, optimization, support, and maintenance of Big Data applications.
  • Mentor and train other development and technical staff on the solution.




  • Bachelor's degree in a relevant field (technical or business).
  • Five (5) years of experience developing and implementing Big Data solutions, with a proven track record in building and supporting data pipeline architectures and optimizing complex data flows.
  • Demonstrated expertise in Apache Spark batch jobs (Java or Scala).
  • Skilled in Hadoop HDFS, Apache Parquet, and Apache Kafka.
  • Experience with the following technologies a plus: Tableau, Docker, OpenShift.
  • Ability to prioritize and work independently, with an agile mindset and strong problem-solving skills.
  • Proven team player with strong interpersonal skills, possessing a demonstrated ability to handle multiple projects with varying priorities.
  • Availability to travel internationally.
  • Excellent communication and English language skills.
JOIN US!
  • Our success comes from our highly skilled and talented employees.
  • Respectful entrepreneurship and a long-term vision are key to success.
  • Our people contribute to a more secure world.
  • Diversity at all levels of an organisation is a strength.

We offer an exciting and challenging role, with great potential for personal development within a unique organization in a fascinating industry.



Apply now »