I am Nathan Bijnens, a Sr Solution Architect Manager, Data & AI, with an extensive background as a software developer and Solution Architect and a passion for business, cloud and data. I guide my team, coach and mentor my colleagues, and set out the strategy and go-to-market for our strategic, commercial and public-sector customers on data and cloud. Responsible for nn M€ in revenue and double-digit growth across Analytics, Apps+Data and AI/ML. I drive customer initiatives, leveraging Microsoft AI & Analytics on Azure to solve the biggest and most complex data challenges faced by Microsoft's enterprise customers.
In a previous role I was the Lead Cloud Solution Architect for the European Union institutions and NATO at Microsoft, helping the European Union and NATO become data-driven governments.
With a successful track record of building innovative systems in highly political and complex environments, I not only help envision what data can do for a business; I analyze, give recommendations, design architectures and inspire teams. I build on my background and experience in data analytics and in building Big Data systems, especially real-time and messaging-based ones.
I enjoy working with clients and partners, from giving advice and discussing the business and technological value of Big Data and AI to imagining solutions. Working in an entrepreneurial, data-driven, innovative environment gives me energy.
I am a passionate speaker and evangelist on AI, Big Data, IoT and Cloud.
June 2021 - ...
July 2019 - May 2021
February 2016 - June 2019
As a CSA I drive high-priority customer initiatives, leveraging Cortana Intelligence on Azure to solve the biggest and most complex data and IoT challenges faced by Microsoft's enterprise customers. It is a technical, customer-facing role, accountable for the end-to-end customer deployment and usage experience for cloud data and IoT services.
As a Big Data Engineer, I worked on our Data Processing and Analytics stack, creating a Lambda and Kappa Architecture. Working with Apache Spark, Spark Streaming, Storm, Cassandra and Kafka. We mostly used Scala and bits of Java. We deployed using Chef on Mesos in the cloud.
We created a Spark as a Service exploratory environment for Data Scientists in the Cloud, based around Mesos, Docker, iPython (now Jupyter) and Spark Notebooks.
As a Big Data and DevOps Engineer, I was responsible for the scalability, quality and operational intelligence of a new platform. I introduced a Continuous Integration pipeline using Jenkins and added unit tests. The platform is built using Chef and aimed to be cloud independent (running on Amazon AWS, SoftLayer, ...).
Project Manager and Lead Developer of ABBAMelda, a ticket and maintenance management system originally developed by Siemens. ABBAMelda consists of a Java EE backend and an Informix database, with a PHP/jQuery frontend. Under my lead, we created an enhanced tablet intranet site, improved the bulk-upload capabilities, added REST services, introduced unit testing and switched to Git. I was responsible for coordinating with different teams within the Flemish Government, as well as with the contractor (Macq).
Defining and implementing the architecture for a social media analytics startup. I designed and implemented a Lambda Architecture (in Java) on top of Storm and Hadoop, using Redis and Voldemort, as well as Thrift.
Developing Oracle database views for integration of Greencat and Crystal Reports.
Discussions, presentations and conversations with partners about potential business ideas and architectures. Evangelizing Big Data in the Belgian market.
I developed our credit management web application in PHP, managed a small group of developers, took the lead on everything technical and coordinated with the directors, partners and clients.
I virtualized and automated the whole iController server setup using Puppet. I created and extended several open-source Puppet modules.
Setting up the Hadoop infrastructure, analyzing data with Hadoop Pig and Hive, and creating dashboards using Symfony2 and HBase.
Within the core Netlog framework, I worked on Event collection, Unit Testing and the migration to Git. I also benchmarked various caching and NoSQL technologies (Memcached, Redis, Membase).
As part of the launch of Twoo, I was responsible for the design and development of the Request Routing and the security and access control models. I also advised on Design Patterns and Best Practices.
For the full details, see nathan.gs/cv.
In possession of a driver's license.
Many presentations to clients and partners, typically on: