Doing DevOps in New Zealand

I was offered the opportunity to travel to New Zealand back in May to interview with three companies for a position as part of the LookSee campaign – which I wrote about in the blog post “Long Time No Blog – I Won a Trip to New Zealand”. That trip set in motion what would eventually lead to my moving to the other side of the planet. Continue reading Doing DevOps in New Zealand

Lex Bots as Code – The Rise of the Machines

In this post I explore the basics of creating a Lex bot as code. Amazon Lex is a service that allows you to create a bot that responds to certain chat scripts. You essentially define the various ways people will talk to the bot, then define AWS Lambda functions to actually perform the requested action. The Lambda function responds with the result of that action, continues on in the script to gather more information, or merges into a new conversation altogether.
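
To make the fulfillment side of that flow concrete, here is a minimal sketch of a Lambda handler in the shape Lex (V1) uses for fulfillment hooks. The `GetStockSentiment` intent and `Ticker` slot are hypothetical names for illustration, not from the post:

```python
def lambda_handler(event, context):
    """Lex fulfillment hook: Lex passes the matched intent and its slot values."""
    intent = event["currentIntent"]["name"]
    slots = event["currentIntent"]["slots"]

    if intent == "GetStockSentiment":
        # Pull the slot Lex filled from the user's utterance.
        ticker = slots.get("Ticker") or "unknown"
        message = f"Sentiment for {ticker} looks neutral today."
    else:
        message = "Sorry, I don't know how to handle that yet."

    # "Close" ends the conversation; other dialogAction types (ElicitSlot,
    # ConfirmIntent) are how the bot keeps gathering information instead.
    return {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": "Fulfilled",
            "message": {"contentType": "PlainText", "content": message},
        }
    }
```

Returning `ElicitSlot` instead of `Close` is how you would keep the script going to collect more input, per the merge-into-a-new-conversation idea above.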

Continue reading Lex Bots as Code – The Rise of the Machines

Long Time No Blog: I Won a Trip to New Zealand

So I started looking at positions in countries other than the United States to get a bit more breadth to my life experience. Ultimately I decided that the European markets, southern Australia, and New Zealand were likely the only options with agreeable climates, stable governments, burgeoning tech job markets, and, of course, English as the spoken language.

Continue reading Long Time No Blog: I Won a Trip to New Zealand

Part 12 – First Week Results using Tableau

Our stock sentiment analysis system has been working like a charm for a little over three days now. There was some confusion with the Yahoo News API, which caused a temporary outage. But the data has moved around enough that we can now start applying some data science to the collected data and see whether we can make some educated guesses about what we expect the stock price to do in the upcoming weeks.

Continue reading Part 12 – First Week Results using Tableau

Part 11: Replacing Spreadsheet with a MariaDB Sidecar Microservice

The Ticker News Sentiment Analysis System (TNSAS, we’ll call it) has gone through quite a transformation lately. Not only have we taken advantage of our containerization to scale the tone analysis service independently from the content scraper service, but we have also taken advantage of Redis-based caching to reduce the total number of calls sent to IBM Watson to an absolute minimum, as well as providing a log of all scraped documents and their related analysis.
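
The caching idea above can be sketched as a small wrapper that keys on a hash of the document and only calls Watson on a miss. This is a sketch, not the post’s actual code: `analyze_tone` stands in for the Watson call, and `cache` is anything with redis-py-style `get`/`setex` methods:

```python
import hashlib
import json


def cached_tone(document, cache, analyze_tone, ttl=86400):
    """Return the tone analysis for `document`, hitting the analyzer
    only when the result is not already cached."""
    # Key on a content hash so identical scraped documents share one entry.
    key = "tone:" + hashlib.sha256(document.encode("utf-8")).hexdigest()

    hit = cache.get(key)
    if hit is not None:
        return json.loads(hit)

    result = analyze_tone(document)
    # setex stores the value with a time-to-live, so stale analyses expire.
    cache.setex(key, ttl, json.dumps(result))
    return result
```

Because every result is written back with a key derived from the document, the cache doubles as the log of scraped documents and their analyses mentioned above.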

We currently report this data via the console and copy and paste it into a spreadsheet for tracking over time. This is not ideal and requires significant manual data collection over an extended period of time to get data at a resolution small enough to enable something approaching a real-time system. We need a solution that can stand up and collect data consistently and then report it per ticker.
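
A minimal sketch of what that per-ticker store might look like: a single sentiment log table plus an aggregate query. The table and column names here are illustrative, and sqlite3 (in memory) stands in for MariaDB purely so the sketch is self-contained; the real sidecar would issue equivalent SQL against MariaDB:

```python
import sqlite3

# Illustrative schema for the sentiment log the sidecar would own.
SCHEMA = """
CREATE TABLE IF NOT EXISTS sentiment_log (
    id         INTEGER PRIMARY KEY,
    ticker     TEXT NOT NULL,
    headline   TEXT NOT NULL,
    tone       TEXT NOT NULL,
    score      REAL NOT NULL,
    scraped_at TEXT NOT NULL DEFAULT (datetime('now'))
)
"""


def record_sentiment(conn, ticker, headline, tone, score):
    """Append one analyzed document to the log (what the spreadsheet held)."""
    conn.execute(
        "INSERT INTO sentiment_log (ticker, headline, tone, score) "
        "VALUES (?, ?, ?, ?)",
        (ticker, headline, tone, score),
    )


def average_score(conn, ticker):
    """Report sentiment per ticker, replacing the manual spreadsheet math."""
    row = conn.execute(
        "SELECT AVG(score) FROM sentiment_log WHERE ticker = ?", (ticker,)
    ).fetchone()
    return row[0]
```

With the log in a database instead of a spreadsheet, resolution becomes a query-time concern rather than a manual collection chore.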

Continue reading Part 11: Replacing Spreadsheet with a MariaDB Sidecar Microservice