Since I set up a public jumphost for my homelab/network, I’ve been looking for an easy way to manage my SSH tunnels. After trying a couple of different managers, I’ve chosen to use Secure Pipes.
Did you know that Dell has a bootable Linux ISO with firmware upgrades for their servers available? Neither did I, but luckily I found it today when I needed it at a customer site.
I’m using Slack to alert and log a few things in my environment, and one of the things I use it for is to alert me if someone logs on via SSH to my public-facing jumphost.
For a good walkthrough on how to set up such a host, check out Tunnel all your remote connections through ssh with a linux jumpbox by Luca Dell’Oca.
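One common way to wire up such an alert (a sketch, not necessarily how my setup does it) is to have sshd run a small script on every login via /etc/ssh/sshrc and post to a Slack incoming webhook with curl. The webhook URL below is a placeholder:

```shell
#!/bin/sh
# Sketch of an SSH login alert via a Slack incoming webhook. Place in
# /etc/ssh/sshrc, which sshd runs for each login session.
# The webhook URL is a placeholder -- substitute your own.
WEBHOOK_URL="https://hooks.slack.com/services/XXX/YYY/ZZZ"

# sshd sets SSH_CONNECTION to "client_ip client_port server_ip server_port".
CLIENT_IP=$(echo "${SSH_CONNECTION:-unknown 0 0 0}" | awk '{print $1}')

# Build a simple JSON message for the webhook.
MSG=$(printf '{"text":"SSH login: %s from %s on %s"}' \
  "$USER" "$CLIENT_IP" "$(hostname)")

# Only post once the placeholder URL has been replaced.
if [ "$WEBHOOK_URL" != "https://hooks.slack.com/services/XXX/YYY/ZZZ" ]; then
  curl -s -X POST -H 'Content-Type: application/json' \
    --data "$MSG" "$WEBHOOK_URL" >/dev/null
fi
```

Since sshrc runs for every session, this catches interactive logins and tunnels alike.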
Just like Lior Kamrat I’ve set up my own private Slack for messaging and alerting from various services running both in my lab and some external facing services. It’s only been running a few days, but so far it works brilliantly and helps me keep track.
VMworld Europe is just a couple of weeks away now, and I can’t wait to spend a week in sunny Barcelona. Last year my trip was cancelled at the last minute, but that will not be the case this year.
As usual, I’m looking forward to a bunch of sessions and general announcements, but for me the real value of attending VMworld is in the networking with other people. Sessions and keynotes can be reviewed later; interacting with others cannot.
As we all know by now, PernixData was gobbled up by Nutanix a while back, and since then there has been nothing but silence on the future of the FVP and Architect products. Now it seems it’s over. The acquisition triggered a number of PernixData employees moving elsewhere, and now it’s the products’ time to move on as well.
As a part of my Homelab project, I’ve created a proper bash script to provide dynamic DNS updates for external resources, via CloudFlare. More details on the reasoning behind it can be found in Using CloudFlare for Dynamic DNS, but since that was posted I’ve fleshed the script out quite a bit more.
In my previous post, I tried to lay out the foundation and reasoning behind requiring a Dynamic DNS Service, and here is how I solved it using CloudFlare.
First of all, I moved my chosen domain name to CloudFlare, and made sure everything resolved correctly with static records. Once that was working, I started playing around with the CloudFlare API, using Cocoa Rest Client. I’m no developer (as is probably very apparent by the script below), nor an API wizard of any kind, but it was fairly easy to figure out how to craft a request that lists my DNS zone.
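The core of such an update can be sketched with curl against the CloudFlare v4 REST API. The zone ID, record ID, hostname, and token below are placeholders (both IDs can be looked up with a GET against the same endpoints), and newer API tokens use an Authorization: Bearer header, while older scripts used X-Auth-Email/X-Auth-Key:

```shell
#!/bin/sh
# Sketch of a dynamic DNS update against the CloudFlare v4 API.
# Zone ID, record ID, hostname, and token are placeholders -- replace them.
CF_API="https://api.cloudflare.com/client/v4"
ZONE_ID="your-zone-id"
RECORD_ID="your-record-id"
RECORD_NAME="lab.example.com"
AUTH_TOKEN="your-api-token"

# Look up the current external IP (any "what is my IP" service will do).
WAN_IP=$(curl -s https://ipv4.icanhazip.com)

# Build the JSON body for the A record update.
PAYLOAD=$(printf '{"type":"A","name":"%s","content":"%s","ttl":120}' \
  "$RECORD_NAME" "$WAN_IP")

# PUT the new record content; skipped while the placeholders are in place.
if [ "$AUTH_TOKEN" != "your-api-token" ]; then
  curl -s -X PUT "$CF_API/zones/$ZONE_ID/dns_records/$RECORD_ID" \
    -H "Authorization: Bearer $AUTH_TOKEN" \
    -H "Content-Type: application/json" \
    --data "$PAYLOAD"
fi
```

Run from cron, this keeps the A record tracking your external IP; a short TTL keeps stale answers from lingering.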
While working on my new Homelab setup, I’ve been investigating ways to provide hostname based access to several web services located in my DMZ zone. Since my provider doesn’t provide static IP addresses, I also need an external Dynamic DNS service, to provide said hostname mappings through the reverse proxy on the inside.
There are loads of Dynamic DNS services available; most of them let you use some sort of predefined domain name scheme and point it at your external IP, but I wanted to use a domain name that I own and control. Since I use CloudFlare to provide DNS services (amongst other things) for this very site, it was a natural choice to see if they could fit the bill for my lab needs as well. Turns out, not only can they provide the services I need for free, they also allow me to play around and have fun at the same time!
Way back in 2013, I published Preserve your Veeam B&R Backups Jobs when Moving vCenter, outlining how to “cheat” (by using a CNAME alias) to preserve your Veeam Backup & Replication jobs if you replace your VMware vCenter.
Naturally, when there is a new vCenter instance, all the Virtual Machine Managed Object References (MoRefs) change, which makes Veeam Backup & Replication start a new backup/replication chain, since all VMs are treated as new ones. Not ideal by any means, but at least you wouldn’t have to recreate all your jobs.
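The alias trick can be illustrated with a BIND-style zone fragment, using hypothetical names: register a stable alias in Veeam instead of the real vCenter hostname, and when vCenter is replaced, only the CNAME target changes:

```
; Hypothetical zone fragment -- veeam-vc is the stable name registered in
; Veeam B&R; repoint it when the vCenter server is replaced.
vcenter01   IN A      192.0.2.10   ; old vCenter
vcenter02   IN A      192.0.2.11   ; new vCenter
veeam-vc    IN CNAME  vcenter02    ; was: vcenter01
```

Veeam keeps talking to the same name throughout, so the jobs survive the move even though the MoRef issue remains.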
Christian Mohn works as Chief Technologist SDDC for Proact in Norway.
See his About page for more details, or find him on Twitter.