I am Nathan Bijnens, a developer with a passion for great code, the web and Big Data. I am interested in programming and system administration, especially where they meet, from scaling platforms to designing the architecture of new and existing products and everything in between.
I am focused on data analysis and building Big Data applications using Hadoop, together with Pig, Hive and Cascading. I follow the rise of real-time Big Data closely, actively developing applications on top of Storm and designing Lambda-like architectures. The infrastructure side interests me as well, and I am learning more about Business Intelligence and visualizing Big Data. I advise on Big Data strategies and evangelise Big Data to clients and at conferences.
I have extensive experience with PHP, Java and related technologies such as MySQL, NoSQL, memcached, nginx and many more. I strongly believe in unit tests and design patterns to write precise, easily maintainable code that works.
I am a passionate Linux system engineer and a follower of the DevOps movement, using Puppet and Ganglia to automate and monitor deployments.
I enjoy working with clients and partners, from giving advice and discussing the business and technological value of Big Data to requirements analysis.
I am inquisitive: I love learning new things and improving what I already know. I am very passionate about what I do, and I have strong analytical skills.
As a Big Data and DevOps Engineer, I am responsible for the scalability, quality and Operational Intelligence of a new platform. I introduced a Continuous Integration platform using Jenkins and added unit tests. The platform is built using Chef and aims to be cloud independent (it runs on Amazon AWS, Softlayer, ...).
Project Manager and Lead Developer of ABBAMelda, a ticket and maintenance management system originally developed by Siemens. ABBAMelda consists of a Java EE backend and an Informix database, with a PHP/jQuery frontend. Under my lead, we created an enhanced tablet intranet site, improved the bulk upload capabilities, added REST services, introduced unit testing and switched to git. I was responsible for coordinating with different teams within the Flemish Government, as well as with the contractor (Macq).
Defining and implementing the architecture for Octopin, a Pinterest social media analytics startup. I designed and implemented a Lambda Architecture (in Java) on top of Storm and Hadoop, using Redis, Voldemort, Cascading and Thrift.
Defining and implementing the architecture for hshmrk, a data visualization startup. The application backend is written as a Jersey (Java) REST service, using ElasticSearch as storage. The frontend is an AngularJS and D3 web application. This approach allows us to scale easily.
Developing Oracle database views for integration of Greencat and Crystal Reports.
I co-develop and am the current lead on the IHarvest project, a distributed HTTP fetcher and parser on top of Storm, written in Java; the results are stored on HDFS for more extensive querying using Hadoop.
I co-develop our internal Semantic Analysis Engine on top of Storm and ElasticSearch. The web interface is built around a Java/Jersey backend and an AngularJS frontend.
Responsible for the contact with Microsoft.
Creating a Drupal-based website, hosted on Windows Azure. I touched all aspects of creating this website, from designing and implementing to copywriting.
Designing the DataCrunchers business cards and a company flag.
I developed our credit management web application in PHP, managed a small group of developers, took the lead on everything technical and coordinated with the directors, partners and clients.
I virtualized and automated the whole iController server setup using Puppet. I created and extended several open-source Puppet modules.
Creating a small web application to organize and input subscriptions into Octopus.
Setting up the Hadoop infrastructure, analyzing data with Pig and creating a dashboard using Symfony2, HBase and Thrift.
Creating a new PHP framework (no existing frameworks were allowed) as the base for the new dating site Twoo. I advised on design patterns and best practices, and developed the security and ACL platform as well.
I introduced an open-source event framework and presented it to my co-developers. I introduced new application logging functionality for logging to Hadoop. I evangelized unit tests, including an initial implementation and presentations. I evaluated Redis, Memcached and Membase (now Couchbase).
I follow the DevOps movement. I use both Puppet and Chef to automate, and Ganglia to monitor, critical infrastructure. I have open sourced and contributed to several Puppet modules.
In this context I also took my first steps with Ruby.
Developing a Continuous Integration strategy with related tools such as Jenkins.
I follow and try out Cloud-related techniques and technologies with great interest, in all their forms: IaaS, PaaS, MaaS, … I have used, in testing or in production, Amazon S3, Amazon EC2, Amazon MapReduce, Amazon IAM, Google BigQuery (private beta tester), the Windows Azure Platform and Hadoop on Azure (private beta tester), and Softlayer.
In possession of a driver's license.