How My IT and Development Background Fuels My Approach to Problem Solving

This is very much a chicken-or-the-egg situation, but there is a strong correlation between the way I approach problems in life and the way I approach technical problems and troubleshooting.

Let me preface this with a lesson that took me a while to learn back in elementary and junior high: everybody knows that I do not do hardware. I just don't fuck with it. Even if it's just a bad hard drive? Nope. One of the attributes that led me to fall in love with software is that it is resilient. You can sit there and try to fix software or a bug in your code all day, every day, and you usually can't break something that will actually cost money, versus tearing up a motherboard and not having the money to replace it, along with an angry family for tearing up the family computer.

While writing code, most of your time gets spent on mundane lines just setting up the application: connect to the database server, validate the user, grab the user preferences, and so on. Necessary, but not particularly difficult after you've been doing it for several years. Then you start writing a process you have never written before, or writing one in a different way. It never works on the first attempt. Ever. More often than not, most of your code is correct; there's usually just something small you overlooked. You forgot to increment a variable in a loop. You forgot to end a statement with a semicolon. You forgot to include a file. Whatever the problem, you stop the program, go back, and start glancing over the code you wrote. Sometimes the bug is immediately apparent, so you correct the issue, test it, and move on. Other times the problem isn't immediately apparent, so you start debugging. Anyone with debugging experience knows that what you don't want to do is go back and completely rewrite everything; if you do, you then have to figure out which change fixed the actual problem. Instead, you go back to your code, study it, change one thing, and test. If it works, great! If it doesn't, undo the change you made and change something else. Repeat until you get the desired result.

Isn’t this how life works?

The longer you spend doing anything, the more mundane code you have memorized: code that you've written a thousand times before and have confidence in. Then, when we encounter a new problem, we don't come back and throw everything away, given that most of it is irrelevant to the current problem. We come back, change one small detail, and try again until we get the result we want. And before we know it, this problem joins the pile of code we know works, and we're off debugging the next one.

Uber and Lyft: A Perspective From a Person with Cerebral Palsy

I was nineteen. It was the summer of my first year in college. I had a summer job working on campus at California State University, Bakersfield at the faculty help desk preparing new computers with the necessary software and configurations before they were dispatched to the appropriate offices on campus. 

My parents lived two towns over from Bakersfield in a tiny town called Maricopa. In between Maricopa and Bakersfield there was a slightly larger town, Taft. My dad had to pass through Taft on his way to work every day at 5:30 am, so he would drop me off at the bus stop every morning, where I'd patiently wait for forty minutes to an hour while listening to my iPod. I didn't really mind the wait; it gave me time to think. Except, on occasion, the bus would come by, say it was full, and I'd have to wait for the next one, two hours later.

Getting home was the same thing. I would get on the bus, ride home for an hour and wait for my dad to get off work, drive through Taft and pick me up on his way home. I usually only had to wait thirty minutes to an hour, until one day. I had a particularly bad day at work; it was hot, I was hungry and all I wanted to do was go home, eat dinner and go to sleep. An hour went by. Nothing. An hour turned to two. Right before it got to be three hours, I saw my dad's truck turn the corner. As soon as I hopped in, he apologized and said he'd gotten caught up in a meeting. I immediately started bawling. I wasn't mad or upset with him. In that moment I realized: I'll never be able to drive.

Fast forward to my mid-twenties. I had my own small IT business and had already outgrown what the inner-city bus system could provide in a timely manner, so I put someone on retainer for $800 per month to get me from job site to job site every day as well as to run personal errands. Between her and the occasional taxi, my inner-city transportation needs were met. My sister kept bugging me to try out this new service called Uber. Once I did, it literally changed my life.

All of a sudden, I got to experience what all of my peers experienced: the ability to "get up and go". From work to going out to running errands, all without needing assistance from a loved one or waiting hours and hours on public transportation. Usually whenever I request an Uber or a Lyft from my apartment in Bakersfield, someone is outside of my complex within three to five minutes. As an added bonus, at the end of the month I add up the amount I've spent across the two ride shares, and it typically comes to only a fraction of what the average person pays for a vehicle once you add up a car payment, car insurance, gas and maintenance.

Linux: Transferring Your Windows 10 Entitlement License to a Virtual Machine



The very first thing that I did when I got my new ThinkPad a few months ago was, of course, format the hard drive and put Ubuntu on it without hesitation.

However, I am now working on a project that requires me to run Windows for Visual Studio. Since I refuse to run Windows as my host, I went ahead and spun up a Windows 10 virtual machine in VirtualBox, but I still had to activate it. Like most machines bought within the last few years, the ThinkPad came with a Windows 10 license as an "entitlement".

Transferring my entitlement license from the ThinkPad was fairly easy. In your Linux terminal, simply type sudo cat /sys/firmware/acpi/tables/MSDM and it will display your Windows 10 license key in plain text (at the tail end of the table's otherwise binary output) that you can then plug into your virtual machine so it can activate.
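
If you'd rather pull out just the key, here's a quick sketch in Python. It assumes the product key occupies the last 29 bytes of the MSDM table, which is the usual layout, and it needs to be run with sudo (or with permission to read that file).

#!/usr/bin/python
# Sketch: read the MSDM ACPI table and print the embedded Windows
# product key, assuming it sits in the last 29 bytes of the table.
with open('/sys/firmware/acpi/tables/MSDM', 'rb') as f:
    table = f.read()

# Five groups of five characters separated by dashes = 29 bytes.
print(table[-29:].decode('ascii'))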

Upgrading My Personal Digital Security




With all of the data breaches happening, such as Equifax and the update on the Yahoo breach from a while back, it's been on my project to-do list to go in and work on my own personal security.

I used to be really good at protecting myself online, but you know how it goes: you get busy and/or lazy, start using the same password everywhere, disable two-factor authentication because it's a pain, and so on.

LastPass

So where did I start? The very first step I took was to dust off my LastPass account, go through all of my online services and replace all of my passwords with randomly generated passwords (up to 100 characters where accepted) from the service.
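
Conceptually, the generator isn't doing anything more exotic than the little sketch below. LastPass obviously has its own implementation; the 100-character length here is just the ceiling I used where sites allowed it.

#!/usr/bin/python
# Sketch: generate a long random password from the OS's cryptographic
# randomness source, similar in spirit to a password manager's generator.
import string
from random import SystemRandom

rng = SystemRandom()
alphabet = string.ascii_letters + string.digits + string.punctuation

# 100 characters, or whatever the site will accept.
password = ''.join(rng.choice(alphabet) for _ in range(100))
print(password)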

Then it was time to secure LastPass itself. Since it now contains the passwords for all of my accounts, the last thing I want is someone gaining access to it. For that, I ordered myself a new YubiKey. For those of you that don't know, a YubiKey is a small USB device that looks like a USB flash drive, but every time you hit the button on it, it emits a 44-character, one-time password that the service then checks against the Yubico servers to verify that the code is authentic. As a consequence, in order to get into my LastPass account, you need my username, my password and physical access to my YubiKey.
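
On the service side, verifying one of those one-time passwords only takes a few lines. Here's a minimal sketch using the third-party yubico-client Python library; the client ID and API key are placeholders you'd request from Yubico, and the exact return/exception behavior is per that library's docs.

#!/usr/bin/python
# Sketch: validate a YubiKey OTP against the Yubico validation servers
# using the yubico-client library (pip install yubico-client).
from yubico_client import Yubico

client = Yubico('12345', 'my-yubico-api-key')  # placeholder credentials
otp = raw_input("Touch your YubiKey: ")

# verify() returns True for a valid, unused OTP; invalid or replayed
# codes raise an exception instead.
if client.verify(otp):
    print("OTP accepted")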

Two Factor Authentication

The second thing I did was turn on two-factor authentication wherever possible.

What I found was that a lot of services only offer two-factor authentication via SMS, which isn't exactly the best way to implement it, since it's been proven that text messages can be sniffed out of the air and read. However, it's definitely better than no two-factor at all.

Where it was offered, I turned on two-factor authentication via YubiKey. Very few services offer it, but it was great to see that Google, Facebook and Dropbox all do.

Amazon Web Services

Most of my servers and databases are hosted with Amazon Web Services. I was fairly surprised that they don't support YubiKeys for multi-factor authentication. So instead, I ordered one of the devices that they recommend from Gemalto called a SafeNet Display Card. It's a credit-card-sized device that generates a six-digit PIN when you activate it. Once again, in order to gain access to my AWS account, you now need my username, my password and access to the display card.

Backup Codes

When you turn on two-factor authentication, most services give you a list of "backup codes" that you can use to override the second-factor device just in case it gets destroyed or lost.

What you're SUPPOSED to do is print the codes out and store them in a safe place. But since I absolutely LOATHE paper, I stored them all in a text file, put them on a flash drive and made arrangements with my best friend (who lives in an entirely different county) to physically store the flash drive in a fire- and waterproof safe in her apartment. This way the codes are entirely offline and are protected against natural disasters.

Local Security

This is the section I particularly went crazy with, mostly because I wanted to ensure that if my laptop ever got stolen, it would be completely unusable to the thief.

I use Ubuntu, which offers two different ways to encrypt your data.

Full Disk Encryption

When you install most distributions of Linux, they give you the option to encrypt the entire hard drive, so when you boot up the machine, before you even get to the username and password prompt, you need to enter a password to decrypt the drive. With the YubiKey, you can program the second "slot" to store a static password of up to 38 characters. So that's what I did, using that static password as the full-disk decryption key, mostly because I'm lazy and didn't want to enter multiple passwords every time I turn on or reboot my computer.

Home Directory Encryption

Most Linux distributions offer to encrypt just your home directory where people store the majority of their documents.

This is usually the easiest way to do it just because it uses the password for your local account as the decryption key.

I went ahead and turned that on as well so the files in my home directory are double encrypted, once by the full disk encryption and once by the home directory encryption.

External Hard Drive Encryption

I always have a four terabyte external hard drive connected to my laptop for all my big files.

In Linux, you can format a drive to be fully encrypted with LUKS (the Linux Unified Key Setup), requiring you to enter a password when you connect the drive in order for the data to be decrypted. Yep, turned that on.

BIOS

The very first thing I changed in the BIOS was the boot order, from CD/DVD, then USB device, then hard disk, to hard disk first, so someone can't boot from a live DVD or flash drive.

Then I disabled the boot menu option so someone can’t change the boot order without going into the BIOS.

Of course I then put a password in the BIOS so someone can’t change anything on it without the password.

Lastly, not every BIOS has this option, but my ThinkPad does: a tamper-detection mechanism. Whenever a hardware change is detected, you must confirm the change in the BIOS for the machine to boot, which of course requires the BIOS password. This means that even if the thief is smart enough to take out the hard drive and put a different one in, a password is still required, leaving the laptop completely unusable.

Conclusion

Securing your data is absolutely a pain in the ass. However, we live in a time where it is now a must. Would you rather be slightly inconvenienced now or wait until your identity is compromised?


Why I Switched from Digital Ocean and WordPress to Squarespace: A Good Life Lesson




So midway through this summer, I decided that I needed to have a permanent home on the web, especially somewhere people could go to find out about my professional life and not just read my political views or details about my personal life.

I've run blogs for myself ever since my teenage years, but you know how that typically goes: it's cool for a week and then you forget about it. I've always used WordPress for myself just because I liked having control over the server. So that's what I did: I went on Digital Ocean, created myself a CentOS droplet, popped WordPress on it and was good to go!

Not really.

As I started to add more and more content, it began crashing more and more frequently. As it turned out, it was a problem uniquely tied to Digital Ocean and one of the WordPress plugins I was using. However, the last time it went down, I had just sent my resume out to several potential employers, with my website listed. I had already brought the site back up once that day, so by the second time I was pretty much done. After all, I'm a tech guy! I didn't want potential employers trying to visit my site and finding it down! I could just imagine: "We're not going to hire him! He can't even run a website!"

So I immediately grabbed my debit card, went over to Squarespace, signed up for an account, paid for the year and within a few hours had everything transferred and back up and running. To be fair, I've had a lot of experience with Squarespace in the past. It's been my go-to web platform for my IT clients for years, mostly because I could set everything up initially, spend a few hours showing the client around, and then they could make changes to their site themselves without having to call me.

But my point is this: save your energy for the most important work. Hell, I have web servers, application servers, database servers, caching servers, VoIP servers, Active Directory servers and more that I've been managing and running 24/7 for years with very little downtime. So sure, I could have gone over to AWS, spun up a few small instances, placed them behind a load balancer, blah, blah, blah, and STILL be the one who manages it and fixes it when it has problems.

Sometimes it's not worth the time or the energy just to be able to say you did it all yourself. I'm not sure if it's because I'm getting older or what, but I'm starting to pick my battles more carefully.


My Ridiculous Source Code Backup System




One of the most frustrating things in the world as a developer is losing code, so there are very few things I take more care with than source code backup.

Of course I use version control, via Subversion on a service called Beanstalk. Yes, I know Git is better; I just grew up on Subversion.

First off, when I check out my repositories on my local machine, I have the repository directory nested in my Dropbox Pro folder. I have Extended History turned on in my Dropbox which means it keeps each and every version of each file for a year versus the standard thirty days.

Now comes the post-commit process. On every single commit, the following occurs:

  • Gets checked into Subversion (with commit message required)
  • Automatically deploys on development server
  • Then on my local machine:
    • Tars the entire project and places it in a temp directory
    • Copies the project to my Google Drive
    • Copies the project to Dropbox
    • Copies the project to my 4 terabyte external hard drive
    • Copies the project to an Amazon S3 bucket in California
    • Copies the project to an Amazon S3 bucket in Virginia
    • Copies the project to an Amazon S3 bucket in Tokyo that has a lifecycle rule applied to automatically retire all files in that bucket to Amazon Glacier after 24 hours
  • Removes the tar archive from the temp directory

I realize that it's a tiny bit excessive, but I'm not lying when I say I can't stand losing code.
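
For the curious, here's a rough sketch of what the local half of that post-commit process looks like in Python. The paths, bucket names and project name are all placeholders, and the Subversion check-in and development-server deploy happen outside of this script.

#!/usr/bin/python
# Sketch of the local post-commit backup steps: tar the project, copy
# the archive to the sync folders and the external drive, then push it
# to S3 buckets in three regions. All paths and bucket names are placeholders.
import os
import shutil
import tarfile

import boto3  # pip install boto3

PROJECT_DIR = '/home/me/Dropbox/repos/my-project'
ARCHIVE = '/tmp/my-project.tar.gz'

# Tar the entire project and place it in a temp directory.
with tarfile.open(ARCHIVE, 'w:gz') as tar:
    tar.add(PROJECT_DIR, arcname='my-project')

# Copy the archive to Google Drive, Dropbox and the 4 TB external drive.
for dest in ('/home/me/GoogleDrive/backups',
             '/home/me/Dropbox/backups',
             '/media/me/external-4tb/backups'):
    shutil.copy(ARCHIVE, dest)

# Upload to S3 in California, Virginia and Tokyo. The Tokyo bucket has a
# lifecycle rule that retires objects to Glacier after 24 hours.
for region, bucket in (('us-west-1', 'backups-california'),
                       ('us-east-1', 'backups-virginia'),
                       ('ap-northeast-1', 'backups-tokyo')):
    boto3.client('s3', region_name=region).upload_file(
        ARCHIVE, bucket, 'my-project.tar.gz')

# Remove the tar archive from the temp directory.
os.remove(ARCHIVE)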


Why I ditched MySQL and put Bob on DynamoDB

Over the past few years, I have all but given up on using MySQL whenever I need a database, partly because I don't like having to worry about how many queries per second I can run before the database server buckles under the load, and partly because I have never liked the Oracle licensing arrangements.

When I first started working on Bob years ago, I meant for it to run off of a single Raspberry Pi 3, which worked well for a while back when all Bob was doing was sending me a text message every eight hours and notifying everyone if I didn't respond. During that time, the Raspberry Pi was serving as both the web server (Apache) and the database server (MySQL), which worked great at the time. However, as I started adding more and more functionality to Bob, such as location tracking and social media checks, the MySQL service on the Raspberry Pi would crash. Even worse, it would crash silently, so I could go a few days without noticing it was down. Not exactly what you want from a program that is supposed to be monitoring your life 24/7.

I eventually worked around the issue by lightening the load, cutting how much data it stored and how often the scripts queried the data, but it was a half-assed fix.

So last month, when I decided to seriously work on Bob again, the very first decision I made was to ditch MySQL, and overhaul the backend to run exclusively on Amazon’s DynamoDB.

Why DynamoDB?

First of all, I've always been a huge fan of Amazon Web Services. Secondly, it's a completely managed solution: you create the tables and add the data, and Amazon manages the rest.

When you create your tables, you specify how many reads and writes per second each table needs to perform, and Amazon automatically spreads your data across however many servers are needed to support the specified throughput (we'll come back to this).

By default, all tables run on solid-state drives, making it incredibly fast.
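
To make that concrete, here's roughly what creating a table with provisioned throughput looks like using boto3. The table name, key schema and capacity numbers are placeholders, not Bob's actual schema.

#!/usr/bin/python
# Sketch: create a DynamoDB table with provisioned read/write throughput.
import boto3

dynamodb = boto3.client('dynamodb', region_name='us-west-1')

dynamodb.create_table(
    TableName='bob_checkins',
    AttributeDefinitions=[
        {'AttributeName': 'device_id', 'AttributeType': 'S'},
        {'AttributeName': 'timestamp', 'AttributeType': 'N'},
    ],
    KeySchema=[
        {'AttributeName': 'device_id', 'KeyType': 'HASH'},
        {'AttributeName': 'timestamp', 'KeyType': 'RANGE'},
    ],
    # Reads and writes per second this table should support.
    ProvisionedThroughput={'ReadCapacityUnits': 5, 'WriteCapacityUnits': 5},
)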

No Licensing Fees

Although it's not open source, there are no licensing fees to use DynamoDB; you only pay for the capacity that you provision per hour. For instance, if you know that your application will be heavily used during business hours on weekdays, you can provision more throughput for those hours and only get charged for those hours. Which brings me to my favorite feature of DynamoDB: auto scaling.

Auto Scaling

As I mentioned before, when you set up your tables, you get to specify how many reads and writes per second you want each table to handle, but the truly beautiful part is that it's completely dynamic, meaning you can adjust it throughout the day.

With old database models, you would typically have to think of your maximum expected capacity and run at that maximum capacity 24/7. With DynamoDB, you can specify a minimum and maximum read and write capacity and it will automatically scale up or scale back down based on usage.

For example, I have all of my tables set with a minimum read and write capacity of 5 per second and a maximum of 10,000, with a rule that if at any time 50% of my capacity is being used, the capacity doubles, up to the 10,000 ceiling.
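
For reference, here's a rough sketch of wiring up that kind of rule with boto3 and Application Auto Scaling. DynamoDB's auto scaling actually tracks a target utilization percentage rather than an explicit "double it" rule, so the 50% figure becomes the target value; the table name is a placeholder.

#!/usr/bin/python
# Sketch: auto scale a table's read capacity between 5 and 10000 units,
# targeting roughly 50% utilization of the provisioned capacity.
import boto3

autoscaling = boto3.client('application-autoscaling', region_name='us-west-1')

autoscaling.register_scalable_target(
    ServiceNamespace='dynamodb',
    ResourceId='table/bob_checkins',
    ScalableDimension='dynamodb:table:ReadCapacityUnits',
    MinCapacity=5,
    MaxCapacity=10000,
)

autoscaling.put_scaling_policy(
    PolicyName='bob-checkins-read-scaling',
    ServiceNamespace='dynamodb',
    ResourceId='table/bob_checkins',
    ScalableDimension='dynamodb:table:ReadCapacityUnits',
    PolicyType='TargetTrackingScaling',
    TargetTrackingScalingPolicyConfiguration={
        # Scale out or in to keep consumed capacity around 50% of provisioned.
        'TargetValue': 50.0,
        'PredefinedMetricSpecification': {
            'PredefinedMetricType': 'DynamoDBReadCapacityUtilization',
        },
    },
)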

What does this mean for Bob?

The more data we can collect, the more accurate the algorithms can be.

Let me give you one example: on my personal account, I have my computers reporting my usage to Bob based on mouse movement. When MySQL was powering the backend, I had to build in a sleep mechanism, where the computer would report detected mouse movement to Bob and then put the reporter to sleep for sixty seconds, because otherwise it would try to report to Bob multiple times per second and eventually overwhelm the database server. Now we can collect data down to the millisecond instead of once a minute.

When you think of everything that's either putting data into Bob or taking data out, everything from computer check-ins to motion sensor data to scripts that run every minute, on the minute, 24/7, you start to see why MySQL was getting so overwhelmed.

So with the database bottleneck almost completely removed, I look forward to throwing as much data as possible into Bob!

Day One Script for Ubuntu

Since I have been developing a lot more lately, I haven’t been on my iPad as much. So when I pulled up Day One today, I was kinda sad to see that my last entry was over a month ago.

I know Day One has a macOS app, but my preferred development environment is Ubuntu running on a Lenovo ThinkPad. Since I took the day off from working on Bob today, I decided to write a short Python script to write journal entries directly from my Ubuntu terminal.

It was actually pretty easy. I just created a recipe in IFTTT that receives a Webhooks request and turns the information I pass to it into a journal entry. The whole thing took probably forty-five minutes. I still have to add things like image support, but that will be another night.

I’m just happy I can quickly add text entries at the moment! 

#!/usr/bin/python
# Prompts for a journal entry in the terminal and sends it to an IFTTT
# Webhooks trigger, which turns it into a Day One entry.

import requests

# Ask for the entry title.
print "Entry title:"
title = raw_input()

print "Content:"

contents = []

# Read the entry body line by line until EOF (Ctrl-D).
while True:
    try:
        line = raw_input("")
    except EOFError:
        break
    contents.append(line)

text = '\n'.join(contents)

# Echo the entry back for confirmation before sending it.
print "Entry: \n\nTitle: "+title+"\n\nContent: "+text+"\n\nSave entry? "

# Day One (via IFTTT) wants HTML-style line breaks rather than raw newlines.
text = text.replace('\n', '<br /><br />')

save = raw_input()

if (save == "Y" or save == "y"):
    # Fire the Webhooks trigger; value1 becomes the title, value2 the body.
    r = requests.post("https://maker.ifttt.com/trigger/journal_entry/with/key/xxxxx", data={'value1': title, 'value2': text})

Upgrading your Raspberry Pi from Jessie to Stretch

The Debian team released the latest version of their operating system, Debian 9.0 "Stretch", back in June of this year, so it took a few months for the Raspberry Pi variant, Raspbian, to be adapted for Stretch, but as of last month it is finally out!
Usually with a major version upgrade I like to just do a clean install of the OS, but I was feeling kind of lazy, so I decided to upgrade in place.

Note: Always be sure that your Raspberry Pi is backed up before you do a major upgrade such as this.

To upgrade, first make sure that your current version of Raspbian Jessie is up to date by typing, either in the terminal or over SSH:

  • sudo apt-get update
  • sudo apt-get upgrade

Once Jessie is up to date, you'll need to point your package sources at the new Stretch repository. To do that, type: sudo sed -i 's/jessie/stretch/g' /etc/apt/sources.list

Now, refresh your repository mirrors again by typing: sudo apt-get update

It's now time to perform the upgrade. Go ahead and type sudo apt-get upgrade again, and once that completes, it's worth running sudo apt-get dist-upgrade as well, since a plain upgrade won't install packages whose dependencies have changed. It will prompt you several times during the upgrade about package conflicts and whatnot; in most cases the default option should be fine.

It should take about 15-20 minutes for the upgrade to complete depending on your internet connection. Once it’s done, go ahead and reboot your Pi by typing: sudo shutdown -r now

That’s it! You have now successfully upgraded your Pi to Raspbian Stretch!