It's been an exciting few weeks! I've passed my first 'professional'-level cloud certification exam (AWS DevOps Engineer-Pro), and I'm deep into review for the AWS Solutions Architect-Pro, Advanced Networking-Specialty, and Database-Specialty exams as well. I'll take the Architect-Pro exam a little under two weeks from now, and the other two exams within the next two months. I'm feeling strong on that front, but there's definitely more review to do.
Day-to-day data engineering has been great too; at Ibotta, I'm taking the lead on optimizing our CCPA/PII provisioning and deletion processes by leveraging Bloom filters. I hope we can share that progress via a blog post or conference presentation in the near future.
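To give a rough sense of why Bloom filters help here, below is a minimal, self-contained sketch (this is an illustration only, not Ibotta's actual implementation; the class, sizes, and user IDs are all hypothetical). The key property: a Bloom filter can say "definitely not present" with no false negatives, so a per-partition filter of user IDs lets a deletion job skip scanning partitions that can't possibly contain the target user.

```python
import hashlib

class BloomFilter:
    """Toy Bloom filter: k hash positions over a fixed-size bit array."""

    def __init__(self, size_bits=8192, num_hashes=4):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item):
        # Derive k positions by salting SHA-256 with the hash index.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def might_contain(self, item):
        # False => item is definitely absent (no false negatives).
        # True  => item is possibly present (false positives are possible).
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

# Hypothetical usage: one filter per data-lake partition. A deletion job
# only scans partitions whose filter says the user *might* be present.
partition_filter = BloomFilter()
for user_id in ["user-001", "user-007", "user-042"]:
    partition_filter.add(user_id)
```

With realistic sizing (bits per element and hash count tuned to the partition's cardinality), the false-positive rate can be driven very low, so the vast majority of irrelevant partitions get skipped without ever being read.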
Figure 1. DevOps Professional certification from AWS! More on the way soon.
An exciting development this week: I was able to take and pass the AWS Security-Specialty exam, which is the first in a series of Specialty- and Professional-level certifications I'll be taking over the next year.
I'm looking forward to that learning and prep! Next up is probably the new-ish Database-Specialty or perhaps the DevOps Engineer-Professional exam.
Figure 1. My first AWS Specialty-level certification - definitely a level up in challenge from the Associate-level certifications!
Hey there! It's been a while! I'll keep trying to make blog updates a 'regular thing', and every time I fall off the wagon, I'll just keep getting back on.
Between my last post and today, I've been occupied with a lot of different tasks both at work (at Ibotta, doing data engineering) and away from work (personal development goals and training). At work, I've been diving deeper into Hive and Presto optimization, management, and consulting (mostly within our data lake but also a bit with our data warehouse).
Outside of work, I've been proceeding along several different learning tracks:
The latter pursuit has been both humbling and incredibly rewarding. I really enjoy coding puzzles/challenges, and the Contest environment keeps me on my toes, improves my problem-solving ability under pressure, and helps cultivate and retain a genuine empathy for tech interviewees who are seeing a coding problem for the first time.
I'll keep plugging away at these contests, but I'll likely have to think critically about my approach if I want to start breaking into and beyond the 90th percentile of contest participants. Nowadays I'm typically pretty fast at cleanly solving 2-3 out of 4 of the contest problems. To break through, I'll need to make a concerted effort to learn and build facility with the common algorithms that come up in each contest's 4th and final problem (usually designated 'hard'). I can sometimes solve these, but many have approaches that are really tough to figure out quickly and effectively.
Below is a chart of where I've landed (percentile-wise) in the recent Weekly and Biweekly LeetCode Contests. Wish me luck in continuing to strive for better performances (and having fun while doing so)!
Figure 1. Weekly and Biweekly LeetCode percentiles during Spring 2020. I've gotten a lot better at quickly solving most Contests' first through third problems ('easy' through 'medium' difficulty), but I'm going to have to focus and work hard to break through my '4th problem plateau' if I want to start consistently getting into the 90th percentile and above!
A lot has happened in the past few weeks!
First of all, I've finished up my 'Associate round' of certification with the 'big three' cloud providers (AWS, Azure, and Google Cloud Platform) by completing the AWS SysOps Administrator-Associate exam yesterday (Fig. 1). I've really enjoyed having this structured learning to complement the hands-on education I get every day at Ibotta!
Figure 1. My shiny new SysOps Admin-Associate certification for AWS wraps up my Associate-level cloud tour; I'll start preparing for Specialty and Professional exams in the coming months.
Next: my first open source pull request (Fig. 2) got merged not too long ago. As part of my 2019 goals, I wanted to at least submit a small PR to an open source repository that was important and interesting to me, and Terraform fit those criteria quite well.
Figure 2. It's not incredibly groundbreaking, but I, for one, was proud to complete my first open source PR. The research into this (and also my work with Terraform at Ibotta) has inspired me to learn more about Golang so I can make more substantial contributions.
Although that PR was quite brief in scope, I learned a lot about research and communication regarding open source software, and also was inspired to start learning much more about Golang (so I can contribute more substantial work to Hashicorp's repos in the future, for example). I'll post more on that soon!
Hi there! I wanted to give a quick 'solstice' update to show off my shiny new Azure and GCP credentials! I've been learning about both those cloud platforms at an associate level, and building some preliminary architectures in both spaces.
It's been fun! Preparation for each of these exams was pretty challenging because I was new to each platform and also because there are still relatively few review materials available for these (compared with AWS, at least). I'll round out my associate-level exams with the AWS SysOps Administrator exam in January, then switch gears to start actually building more stuff. Looking forward to it!
Figure 1. Azure Administrator-Associate badge earned in November 2019.
Figure 2. Google Cloud Platform Associate Cloud Engineer badge earned in December 2019.
A few weeks after coming back from HashiConf '19 in Seattle, I received notice that my Terraform Associate certification (Figure 1) was waiting for me! HashiCorp had been generous in allowing its conference attendees early access to the new Associate-level certifications in both Terraform and Vault, and I took the former while I was at the conference. I found the exam well-written, comprehensive, and fairly labeled as associate-level in difficulty. Preparing for and taking the exam was a good opportunity to learn more about Terraform (particularly its enterprise-level offerings/services) than I'm exposed to on a daily basis at Ibotta. I really enjoy infrastructure-as-code, and hope to broaden and deepen my knowledge in this area in the months and years to come.
Figure 1. Terraform Associate badge/certification earned as a beta tester during HashiConf '19.
After returning from the conference, I also had an end-of-September appointment to expand my AWS knowledge base with the Certified Developer-Associate exam. This involved learning more about AWS-native CI/CD tools (at Ibotta, we work a little outside AWS for our CI/CD processes in particular, even though most of our infrastructure is in AWS) and developer-oriented services (especially Elastic Beanstalk, Lambda, and X-Ray).
Figure 2. AWS Certified Developer-Associate badge earned in late September 2019.
Right now, I'm preparing for both Azure and Google Cloud Platform certifications before the end of the year. More on that soon!
Finally, and along the lines of the ongoing HashiCorp theme, I've put together a basic multicloud demo that's freely forkable and cloneable. I created this project to illustrate simple multicloud principles using both a) the new free tier of Terraform Cloud (to deploy HTTP servers across AWS, Azure, and Google Cloud Platform) and b) a local deployment of HashiCorp Consul for cross-cloud service discovery and health checks. Feel free to browse and explore the code at your leisure and let me know your comments and criticism!
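For a taste of how the Consul side of a setup like that works, here's a small sketch that builds the JSON body Consul's agent expects at its `PUT /v1/agent/service/register` endpoint, including an HTTP health check. The service name, addresses, and ports below are hypothetical (the demo repo itself is the source of truth), and this sketch only constructs the payload rather than sending it to a running agent:

```python
import json

def consul_registration(name, address, port):
    """Build the JSON body for Consul's PUT /v1/agent/service/register endpoint."""
    return {
        "Name": name,                      # same logical service name across clouds
        "ID": f"{name}-{address}",         # unique ID per instance
        "Address": address,
        "Port": port,
        "Check": {
            # Consul polls this URL to mark the instance healthy/unhealthy.
            "HTTP": f"http://{address}:{port}/health",
            "Interval": "10s",
            "Timeout": "2s",
        },
    }

# Hypothetical: one registration per cloud-hosted HTTP server.
aws_server = consul_registration("demo-web", "10.0.1.10", 8080)
print(json.dumps(aws_server, indent=2))
```

Because each instance registers under the same logical service name, clients can discover healthy instances across all three clouds with a single DNS or HTTP query against Consul, which is what makes the cross-cloud service discovery piece work.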
This week was a busy one, as I spent half of it attending HashiConf in Seattle, WA, followed by a late-night flight back to Denver for some data engineering work at Ibotta on Thursday and Friday.
I'm still trying to digest everything I learned on the trip to Seattle, and I'll post more on that between now and when I present a condensed version of my experience to other Ibotta coworkers in late October.
One of my favorite talks was by Tracy Holmes at HashiCorp; I've been looking for a toehold to step up and start making contributions in the open source world, and the outstanding backlog of open issues within the Terraform OSS repository (for provisioning and configuring infrastructure as code) could be a great place to start. Tracy's talk on becoming a contributor really cut through the intimidation I'd felt when thinking about how to begin.
Figure 1. A view from the crowd near the beginning of HashiConf 2019. The kickoff and other keynote presentations were here in the Regency Ballroom at the new Hyatt Regency hotel in downtown Seattle.
In parallel with presenting some cloud-provider-agnostic findings to Ibotta in October, I'll also be adding a few more certifications to my learning path before this year's holiday season. I've signed up for Associate-level exams in both Microsoft Azure and Google Cloud Platform prior to 2020, and I'm aiming to round out the series of three Amazon Web Services Associate-level exams by then too.
It'll be an intense time, and I'm looking forward to engaging fully with it and sharing more soon!
A start on the AWS certification road; web/mobile development sandbox environment under construction
Hello again! It's been a while since I've posted, and I've been quite busy over the past year immersing myself in many different aspects of enterprise-level software development at Ibotta. We had some exciting news recently upon receiving a coveted 'unicorn' $1B valuation! And although that news was great, it's much more of a milestone than an end goal; we've got lots of ambitious projects in our pipeline and continue to work on scaling our tech stack efficiently and effectively.
During my time at Ibotta, I've spent some time working on backend APIs with our client-facing Transaction Data squad, and most recently transitioned over to a Data Engineering role. My current responsibilities have me right in the middle of a cross-functional effort to switch Ibotta's backend over to a more granular (and more real-time) event-based architecture. We're all looking forward to the possibilities that affords, including lofty new goals for 2020 and beyond. More on that soon!
While working with a variety of folks from Ibotta's Platform, DevTools, Architecture, and now Data Engineering squads, I've developed a continuously growing interest in cloud computing architecture and design. As a confirmation of that interest, I've started making my way through Amazon Web Services' (AWS's) suite of certification exams, beginning with the Solutions Architect-Associate test. I've already got the Developer-Associate exam scheduled for late September, and will be taking the SysOps-Associate exam in December.
Figure 1. Passed the AWS SAA exam back in July! Now I'm looking toward the Developer-Associate exam in September and the SysOps-Associate exam in December.
If you've visited my sandbox environment recently (bradleypmartinsandbox.com), you may have noticed that my super-bare-bones Node/Mongo test website has been taken down for the time being and a placeholder 'under construction' page has been put up in its place (Figure 2).
Over the next year, I'd like to start putting up some more sophisticated portfolio samples that showcase my development in the tech space. A one-to-one EC2-hosted Node/Mongo app was cool to build and maintain for a while, but I think it'd be fun and invigorating to build a collection of resources showcasing a blossoming basket of skills, including serverless apps, Ruby/Rails webpages, and possibly even a mobile app or two!
I'll have more info over the coming weeks and months.
Figure 2. I've taken down my perfunctory sandbox 'switching station' app and the basic Node/Mongo web app it linked to (at bradleypmartinsandbox.com). Look for more news on replacement portfolio samples soon!
Hi all! For the past 4 months or so (that time has really flown by!) I've been spending most of my time and effort trying to become a better and more productive software engineer in my new role at Ibotta in Denver, CO. It's been an incredibly enriching and enjoyable experience, but it's also been difficult to carve out time to continue pursuing goals related to buttoning up applied math work I did in graduate school and the subsequent postdoc.
Still, as I've started to get my 'sea legs' on this new career path, I wanted to begin consciously making an effort to reboot those pursuits (as well as pushes toward other personal development goals). Toward that end, I've made some preliminary/perfunctory changes to titles and references of functions and drivers in the 2D wave equation examples shared in my public GitHub math repository (https://github.com/bradleypmartin/MathGraduateResearchAndCourseWork).
Although today's commit is certainly a low-effort change, I'm hoping that taking that small step (along with blogging about it here) can help set me on a path to complete more in-depth revision and clarification as I'd set out to do months ago.
Thanks for your patience and look forward to more info soon!
After committing a lot of web development and data science tutorial projects to GitHub lately, my collection of repositories was getting a bit bloated! So this morning, I collected a lot of the smaller repos into larger collections by subject area.
The only mildly unfortunate thing about this aggregation was that it nullified a lot of contribution/commit logs from the past few months; as a result, I have far fewer shades of green populating my contribution activity tileset. It's okay, though; there'll be plenty of work to get "back in the green" soon enough. :)
Also, while I've been quite busy with some private projects for interviews over the past few weeks, I've also begun a React/Redux tutorial as planned. We'll see what other obligations pop up in the next week or two. If I'm free, I'd like to focus on blasting through that and then move on to a Cassandra-backed API project, possibly with a Java driver.
Figure 1. Although some of the code within these aggregated folders still needs cleanup, I feel a lot better about the high-level structure of my GitHub repos.