Monthly Archives: February 2021

Installing and Configuring containerd as a Kubernetes Container Runtime

In this post, I’m going to show you how to install containerd as the container runtime in a Kubernetes cluster. I will also cover setting the cgroup driver for containerd to systemd, which is the preferred cgroup driver for Kubernetes. In Kubernetes 1.20, Docker support (the dockershim) was deprecated, with removal planned after 1.22. containerd is a CRI-compatible container runtime and is one of the supported runtime options you have in this post-Docker Kubernetes world. I do want to call out that images built with Docker will continue to run in containerd.

Configure required modules

First, load two modules in the current running environment and configure them to load on boot.

sudo modprobe overlay
sudo modprobe br_netfilter

cat <<EOF | sudo tee /etc/modules-load.d/containerd.conf
overlay
br_netfilter
EOF
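To confirm both modules actually loaded before moving on, you can look for them in the kernel's loaded-module list. This is a quick sketch of mine, not part of the official steps; the helper name `module_loaded` is hypothetical, and it reads `/proc/modules` by default.

```shell
# Sketch: check whether a kernel module appears in the loaded-modules list.
# module_loaded is a hypothetical helper name; /proc/modules is the default source.
module_loaded() {
  local mod="$1" modules_file="${2:-/proc/modules}"
  grep -q "^${mod}[[:space:]]" "$modules_file"
}

# Example:
#   module_loaded overlay      && echo "overlay loaded"
#   module_loaded br_netfilter && echo "br_netfilter loaded"
```

Note that a module built directly into the kernel won't show up in `/proc/modules`, so treat a miss here as a prompt to investigate, not a hard failure.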

Configure required sysctl to persist across system reboots

cat <<EOF | sudo tee /etc/sysctl.d/99-kubernetes-cri.conf
net.bridge.bridge-nf-call-iptables  = 1
net.ipv4.ip_forward                 = 1
net.bridge.bridge-nf-call-ip6tables = 1
EOF

Apply the sysctl parameters to the current running environment without a reboot

sudo sysctl --system
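As a quick sanity check before relying on these settings surviving a reboot, you can verify the drop-in file contains all three keys set to 1. This is a sketch of mine, not from the Kubernetes or containerd docs; `check_cri_sysctls` is a hypothetical helper name.

```shell
# Sketch: verify each required key is present and set to 1 in a sysctl
# drop-in file. check_cri_sysctls is a hypothetical helper name.
check_cri_sysctls() {
  local conf="$1" key
  for key in net.bridge.bridge-nf-call-iptables \
             net.ipv4.ip_forward \
             net.bridge.bridge-nf-call-ip6tables; do
    grep -Eq "^${key}[[:space:]]*=[[:space:]]*1" "$conf" || { echo "missing: $key"; return 1; }
  done
  echo "all CRI sysctls present"
}

# Example:
#   check_cri_sysctls /etc/sysctl.d/99-kubernetes-cri.conf
```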

Install containerd packages

sudo apt-get update 
sudo apt-get install -y containerd

Create a containerd configuration file

sudo mkdir -p /etc/containerd
sudo containerd config default | sudo tee /etc/containerd/config.toml

Set the cgroup driver for runc to systemd

Set the cgroup driver for runc to systemd, which is required for the kubelet.
For more information on this config file, see the containerd configuration documentation.

Find the end of this section in /etc/containerd/config.toml

        [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc]
        ...

Around line 86, add these two lines (indentation matters):

          [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
            SystemdCgroup = true
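If you're making this change across many hosts, the same edit can be scripted with sed rather than a text editor. This is a sketch of mine, not a step from the containerd docs: `enable_systemd_cgroup` is a hypothetical helper name, GNU sed is assumed (for `\n` in the replacement), and you should test it against your containerd version's default config (and take a backup) before trusting it.

```shell
# Sketch: append the SystemdCgroup options block directly beneath the runc
# runtime section. enable_systemd_cgroup is a hypothetical helper name.
# Assumes GNU sed; running it twice will insert the block twice.
enable_systemd_cgroup() {
  local config="$1"
  sed -i 's/\[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc\]/&\n          [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]\n            SystemdCgroup = true/' "$config"
}

# Example (run as root for /etc/containerd/config.toml):
#   cp /etc/containerd/config.toml /etc/containerd/config.toml.bak
#   enable_systemd_cgroup /etc/containerd/config.toml
```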

Restart containerd with the new configuration

sudo systemctl restart containerd

And that’s it, from here you can install and configure Kubernetes on top of this container runtime. In an upcoming post, I will bootstrap a cluster using containerd as the container runtime.

Published Azure Arc-Enabled Data Services Revealed

I’m super proud to announce that Ben E. Weissman and I have published Azure Arc-Enabled Data Services Revealed, available now from Apress and your favorite online booksellers! Buy the book now…or keep reading below if you need to be more convinced :)

A couple of notes about the book. First, I really enjoyed getting to work with this bleeding-edge tech and collaborating with the SQL Server Engineering Team at Microsoft on this. I want to call out the support from our technical reviewer and the Program Manager for Azure Arc-enabled Data Services, Travis Wright. Thanks for your help and support. Be sure to read the foreword from Travis…it tells the story of why and how: from getting SQL Server onto Linux, into containers, into Kubernetes, to Big Data Clusters, and now Arc-enabled Data Services. Awesome stuff. I also want to call out my co-author and friend, Ben: you are an awesome writer, thank you for including me in this adventure!

About the Book

Get introduced to Azure Arc-enabled data services and the powerful capabilities they provide to deploy and manage local, on-premises, and hybrid cloud data resources using the same centralized management and tooling you get from the Azure cloud. This book shows how you can deploy and manage databases running on SQL Server and Postgres in your corporate data center as if they were part of the Azure platform. You will learn how to benefit from the centralized management that Azure provides, the automated rollout of patches and updates, and more.

This book is the perfect choice for anyone looking for a hybrid or multi-vendor cloud strategy for their data estate. The authors walk you through the possibilities and requirements to get services such as Azure SQL Managed Instance and PostgreSQL Hyperscale deployed outside of Azure, so the services are accessible to companies that cannot move to the cloud or do not want to use the Microsoft cloud exclusively. The technology described in this book will be especially useful to those required to keep sensitive services, such as medical databases, away from the public cloud, but who still want to benefit from the Azure cloud and the centralized management and tooling that it supports.

What You Will Learn

  • Understand the core concepts of Kubernetes
  • Understand the fundamentals and architecture of Azure Arc-enabled data services
  • Build a multi-cloud strategy based on Azure data services
  • Deploy Azure Arc-enabled data services on premises or in any cloud
  • Deploy Azure Arc-enabled SQL Managed Instance on premises or in any cloud
  • Deploy Azure Arc-enabled PostgreSQL Hyperscale on premises or in any cloud
  • Manage Azure Arc-enabled data services running outside of Azure
  • Monitor Azure Arc-enabled data services running outside of Azure through the Azure portal

Who This Book Is For

Database administrators and architects who want to manage on-premises or hybrid cloud data resources from the Microsoft Azure cloud. Especially for those wishing to take advantage of cloud technologies while keeping sensitive data on premises and under physical control.


Getting SQL Agent Jobs and Job Steps Configuration

Recently I needed to take a look at all of the SQL Server Agent Jobs and their Job Steps for a customer. Specifically, I needed to review all of the Jobs and Job Steps for Ola Hallengren’s Maintenance Solution and look at the Backup, Index Maintenance, and Integrity Jobs to ensure they’re configured properly, and also account for any customizations and one-offs in the Job definitions. This customer has dozens of SQL Server instances and, well, I wasn’t about to click through everything in SSMS…and writing this in TSQL would have been a good candidate for a Ph.D. dissertation. So let’s check out how I solved this problem using dbatools.

Enter dbatools…

In my first attempt at doing this, I tried getting all the Jobs using Get-DbaAgentJob and exporting the Jobs to TSQL using Export-DbaScript. This did give me the code for all of the Jobs I was interested in. But that left me trying to decipher SQL Agent Job and Schedule syntax and encodings, and I got all twisted up in the TSQL-ness of that. I needed this to be more readable.

So I thought…there has to be a better way…and there is! So I wrote the following. This code gets each SQL Agent Job and prints the Job’s Name, NextRunDate, whether it has a Schedule, and Operator information, and then for each JobStep it prints the Step’s Name, Subsystem, and finally the Command. Using this I can quickly get a feel for the configurations across the environment.

Get a listing of all SQL Instances

    $Servers = Get-DbaRegisteredServer

Get all of the SQL Agent Jobs across all SQL Instances

    $jobs = Get-DbaAgentJob -SqlInstance $Servers.Name

Filter that list down to the SQL Agent Jobs that are in the Database Maintenance category

    $MaintenanceJobs = $jobs | Where-Object { $_.Category -eq 'Database Maintenance' } 

For each SQL Agent Job, print the Job’s Name, NextRunDate, whether it has a Schedule, and Operator information; then for each JobStep print its Name, Agent Subsystem, and finally the Command.

    $JobsAndSteps = foreach ($MaintenanceJob in $MaintenanceJobs) {
        foreach ($JobStep in $MaintenanceJob.JobSteps) {
            [PSCustomObject]@{
                SqlInstance     = $MaintenanceJob.SqlInstance
                Name            = $MaintenanceJob.Name
                NextRunDate     = $MaintenanceJob.NextRunDate
                HasSchedule     = $MaintenanceJob.HasSchedule
                OperatorToEmail = $MaintenanceJob.OperatorToEmail
                JobStepName     = $JobStep.Name
                SubSystem       = $JobStep.SubSystem
                Command         = $JobStep.Command
            }
        }
    }

Here’s some sample output using Format-Table. From there I can quickly scan and analyze all the Jobs on all of the Instances in an environment.

$JobsAndSteps | Format-Table

SqlInstance     Name                                    NextRunDate           HasSchedule OperatorToEmail JobStepName                                           SubSystem Command
-----------     ----                                    -----------           ----------- --------------- -----------                                           --------- -------
PRODSQL1        DatabaseBackup - USER_DATABASES - FULL  2/3/2021 1:00:00 AM          True DbaTeam         DatabaseBackup - USER_DATABASES - FULL - Backup         CmdExec sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "EXECUTE [dbo].[DatabaseBackup] @Databases = 'USER_DATABASES', @Directory = N'T:\Backup', @Ba...
PRODSQL1        DatabaseBackup - USER_DATABASES - FULL  2/3/2021 1:00:00 AM          True DbaTeam         DatabaseBackup - USER_DATABASES - FULL - Sync           CmdExec ROBOCOPY SOME STUFF
PRODSQL1        DatabaseBackup - USER_DATABASES - FULL  2/3/2021 1:00:00 AM          True DbaTeam         DatabaseBackup - USER_DATABASES - FULL - Cleanup     PowerShell RUN SOME POWERSHELL TO DO COOL STUFF
PRODSQL2        DatabaseBackup - USER_DATABASES - FULL  2/3/2021 1:00:00 AM          True DbaTeam         DatabaseBackup - USER_DATABASES - FULL - Backup         CmdExec sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "EXECUTE [dbo].[DatabaseBackup] @Databases = 'USER_DATABASES', @Directory = N'T:\Backup', @Ba...
PRODSQL2        DatabaseBackup - USER_DATABASES - FULL  2/3/2021 1:00:00 AM          True DbaTeam         DatabaseBackup - USER_DATABASES - FULL - Sync           CmdExec ROBOCOPY SOME STUFF
PRODSQL2        DatabaseBackup - USER_DATABASES - FULL  2/3/2021 1:00:00 AM          True DbaTeam         DatabaseBackup - USER_DATABASES - FULL - Cleanup     PowerShell RUN SOME POWERSHELL TO DO COOL STUFF

You can also export that output to CSV and pull it into Excel for analysis

$JobsAndSteps | Export-Csv -NoTypeInformation -Path JobSteps.csv