Senior Software Engineer - Data Science & AI
Hi. We are ASOS!
ASOS is the UK’s number one fashion and beauty destination, expanding globally at a rapid pace, and our award-winning Tech teams sit at the heart of our business. We deliver the technical innovations and pioneering solutions that keep our 20-something market engaged, the cloud-based architecture that supports our global reach, and the agile engineering methods that deliver value fast. We’re extremely ambitious and thrive on the individuality of our amazing employees.

We are building ASOS's AI platform – a unique big data infrastructure that will ingest and serve massive amounts of data to power our predictive data products, leveraging the latest machine learning and AI technologies.

We are looking for outstanding cloud and big data folks to join us early on, help build the new platform and grow our team of polyglot engineers and data scientists. If you are passionate about technology and architecture, love hands-on development and want to build highly scalable, highly available systems – you should work with us! You'll have the unique opportunity to work alongside data scientists to build innovative data products that impact business performance at all levels.
What you'll be doing
- Evaluate, select and deploy massive data processing infrastructure
- Design and build cloud-scale services and APIs
- Handle all aspects of the development lifecycle – design, development, build, deployment, monitoring and operations
- Research and experiment with emerging technologies and industry trends with a view to bringing business value through early adoption
- Work in an agile, cross-functional team composed of engineers and machine learning scientists, taking responsibility for delivering predictive data products

Our current stack consists of applications written in Python and Scala and deployed to Azure using Azure DevOps and Ansible. Our microservices are packaged in containers and deployed to Kubernetes. Our big data pipelines run on Spark and PySpark and are orchestrated using ADF. We use Git and VSTS for source control and Jira for backlog management.
Key Skills and Experience
Essential
- Core programming knowledge, proficient in Python and Shell scripting
- Broad knowledge of software delivery lifecycle and CI/CD toolset
- Quality Assurance and Testing frameworks
- Working knowledge of the tools and practices available and appropriate to each phase of software delivery, such as TDD, BDD, integration testing, performance testing, etc.
- CI, Continuous Delivery, Build Automation
- Relational and non-relational database technologies
- Agile frameworks like Scrum or Kanban
- Fully-automated provisioning and application deployment pipelines
- Exposure to CI/CD tools like Azure DevOps, Ansible, TeamCity, Jenkins, Octopus
- Infrastructure automation using a scripting language like Bash or PowerShell
- Configuration management
- Experience in delivering services that are highly available, low-latency and scalable
- Exposure to a microservices architecture
- Cloud-based development and delivery platforms (ideally Microsoft Azure)
- Container technologies like Docker, Kubernetes

Nice to Have
- Understanding of Big Data core concepts and technologies
- Experience in using Apache Spark
- Knowledge of advanced analytics and insights techniques (e.g. predictive analytics, machine learning, segmentation)
- Knowledge of deep learning frameworks like TensorFlow, Keras
- Knowledge of machine learning libraries like MLlib, scikit-learn
- Experience in retail and/or e-commerce
Want to apply? Fill in the details below to apply through FATJ
Your name *
Current Position *
Email Address *
Number *
Covering Note
Link to your CV / Linkedin Profile *