Debugging “Temporary failure in name resolution” in Docker containers

Earlier I was encountering DNS issues with Docker containers orchestrated by DDEV: some hosts failed to resolve, but only inside the containers.

To fix it, I set the following option in the `/etc/default/docker` file:

DOCKER_OPTS="--dns 8.8.8.8 --dns 8.8.4.4"

This overrides the DNS servers Docker hands to its containers. Afterwards, restart the Docker service:

sudo service docker restart

This solved the problem for me. I found the solution on Stack Overflow.
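Worth noting: on setups where Docker runs as a systemd service, `/etc/default/docker` may be ignored entirely. In that case the equivalent place for the setting (assuming a default daemon configuration) would be `/etc/docker/daemon.json`:

```json
{
  "dns": ["8.8.8.8", "8.8.4.4"]
}
```

followed by the same restart, e.g. `sudo systemctl restart docker`.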

GitHub Outage

Today I may have killed GitHub. Not really, but it’s a funny coincidence.

I was working on something, and after committing my changes and pushing, I noticed that the dev branch was ahead of my feature branch and there were now some merge conflicts.

I quickly rebased my feature branch, but when I tried to push the changes, all of a sudden the push was rejected with an “Internal Server Error”.

“Weird”, I thought. I had gotten pushes rejected before, but never with this ominous “Internal Server Error” message. I figured I had messed something up and GitHub couldn’t process my changes.

So I poked around and tried some other things, even reverting the other changes to force my rebase through, but nothing worked. I googled around and couldn’t quite find an answer until I took to everybody’s least favorite social media platform, Twitter, saw that “GitHub” was trending and found this tweet. It wasn’t just me: GitHub was having a partial outage.

What a weird coincidence that this happened just as I wanted to push something. Today was also the day I moved a project with a full CI/CD pipeline to GitHub Actions for the first time. The stars just aligned, I guess.

Goodbye, LastPass

(this is not a sponsored article, I wish it was)

I have just officially ditched LastPass for Bitwarden. As a software developer I always advocate the use of password managers to everybody, and I think with Bitwarden I have found the perfect one.

I originally started with Dashlane in 2016 and then switched to LastPass around 2019, mainly because LastPass allowed me to sync my passwords across devices for free. However, around a year ago LastPass updated their pricing model to make users pay for device sync as well. At first I caved and simply paid for the subscription. I was pretty much happy with LastPass, so I thought: well, whatever.

Today my LastPass subscription expired and I did not renew it. A friend had told me about Bitwarden a while ago, but I never took the time to look at it, until I finally had a reason to.

Bitwarden immediately makes a strong case with the sheer number of clients available: browser extensions, desktop clients, mobile apps, a command line interface, a web app. Bitwarden is pretty much available anywhere with an internet connection. I downloaded the Windows client, the Chrome extension and the iOS app for my setup.

Bitwarden also has an amazing import tool. After exporting a CSV file from LastPass, I could simply upload it to Bitwarden and select LastPass as the source format. Bitwarden accepted the file perfectly and the import went through without a hiccup. I remember my move from Dashlane to LastPass being very complicated, since there was no standardized format.

The iOS app also seems very good: it’s compatible with iOS password autofill, lets me unlock with Face ID and syncs for free.

So far this seems like the perfect password manager, and the best thing of all: it does not cost me a cent. Bitwarden is also fully open-source, which surprised me.

I created a CLI tool to easily upload files to Google Drive

If you know me, you might know that I have a big sense of preservation. I’m a big fan of collecting video games for that reason, but it also applies to something else. I follow a lot of Twitch streamers and love watching their content, but one thing I really dislike about Twitch compared to YouTube is that stream archives (“VODs”) are only stored for 60 days (for non-partners even less). This is a huge issue for someone like me who likes to go back and re-watch older livestreams from time to time.

This is an issue I am currently trying to solve for myself, and today I built a small tool that will probably help me a ton in the greater scheme of things.

My solution so far has been a small VPS I set up to download Twitch VODs and render them together with the chat, using an open-source tool called TwitchDownloader. The developer of that tool kindly provided me with their own approach to rendering chat and video together, so I didn’t have to figure that out myself.

However, I recently ran into a new problem with this little makeshift solution: disk space. So far I had just downloaded the videos to the VPS and left them there, because moving files this big around is a hassle. Now that the disk is reaching its limit, I was forced to come up with a better solution.

I have thought about setting up my own personal storage servers for this, but doing that properly will take a while, so I chose Google Drive for storage. I am on a Google Workspace (formerly G Suite) plan, which lets me store a ton of files without worrying about running out of disk space. Additionally, Google Drive processes video files, so I can even watch these archived livestreams directly in its UI, without having to download them to my computer first.

Which is why I created my google-drive-upload-cli: a simple Kotlin-based CLI tool to upload large files to a Google Drive account. I had previously tried multiple existing CLI tools for Google Drive but ran into issues with almost all of them, and since Google doesn’t offer one themselves, I had to do it myself. The only alternative would have been downloading all of these files to my personal computer and then uploading them through the Google Drive UI, which would have made my computer’s internet connection basically unusable for days.

I chose Kotlin since it’s a language I’m familiar with; it has pretty much replaced Java for me entirely and lets me start a project quickly and get straight to programming. I had previously looked into how easily I could build something like this and found this Stack Overflow answer. It seems I wasn’t the only one looking for a quick and easy solution to this problem. All I had to do was wrap all of that into a solution that refreshes OAuth tokens, and I was almost done. For resumable uploads of bigger files, I borrowed an existing Java class that somebody had created a while back.
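To give an idea of what the resumable upload part involves: Drive’s resumable protocol sends the file in chunks, each request carrying a `Content-Range` header, and every chunk except the last should be a multiple of 256 KiB. The snippet below is only a sketch of that arithmetic; the names (`chunkRanges`, `CHUNK_SIZE`) are mine for illustration and not taken from the actual tool:

```kotlin
// Drive's resumable protocol wants each chunk (except the last) to be a
// multiple of 256 KiB; 8 MiB is a common choice.
const val CHUNK_SIZE = 8 * 1024 * 1024L

/** Returns the Content-Range header values for uploading [totalBytes] in chunks. */
fun chunkRanges(totalBytes: Long, chunkSize: Long = CHUNK_SIZE): List<String> {
    require(totalBytes > 0 && chunkSize > 0)
    val ranges = mutableListOf<String>()
    var start = 0L
    while (start < totalBytes) {
        val end = minOf(start + chunkSize, totalBytes) - 1  // inclusive end offset
        ranges += "bytes $start-$end/$totalBytes"
        start = end + 1
    }
    return ranges
}

fun main() {
    // A 20 MiB file uploaded in 8 MiB chunks takes three requests:
    chunkRanges(20 * 1024 * 1024L).forEach(::println)
    // → bytes 0-8388607/20971520
    //   bytes 8388608-16777215/20971520
    //   bytes 16777216-20971519/20971520
}
```

In the real upload loop, each of these ranges accompanies a PUT of the corresponding byte slice to the session URI that Drive returns; a 308 response means “keep going”, a 200 or 201 means the file is complete.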

Now I have added my little CLI to the existing scripts on my VPS, and my solution is almost fully automatic. In the future I would like to upgrade it even more: for a while I’ve had the idea of a web tool where I can just dump any link and it will automatically download and store the video for me. Maybe I’ll get to that one day. For now, I can safely store my Twitch videos and not worry about them being lost.

It wasn’t a big project, but it was fun to work on something open-source again; I haven’t done that in a while. I don’t know if I will keep working on this tool specifically, since it was just made as a quick solution to a simple problem. Maybe I’ll add something like support for uploading entire folders, who knows.

You can check out the source code on GitHub.