Lua plugin

Created: 2024-08-07 Last Modified: 2024-08-23

This document was translated by ChatGPT

# 1. About Lua Plugin

In some users' Kubernetes environments, special configurations or security requirements can prevent the standardized way of extracting workload types and workload names from working as expected; users may also want to derive workload types and names with their own logic. DeepFlow therefore lets users extract workload types and names by adding custom Lua plugins: at fixed points it calls a Lua function to obtain the user-defined workload type and name, which makes K8s resource integration more flexible and universally applicable.

# 2. Lua Plugin Writing Example

-- Fixed syntax to import the JSON parsing package
package.path = package.path..";/bin/?.lua"
local dkjson = require("dkjson")

-- Fixed function name and parameters
function GetWorkloadTypeAndName(metaJsonStr)
    -- Example of the metadata JSON
    -- Note: the JSON string passed in is the part after the colon below, i.e. the value of the "metadata" field
    -- "metadata": {
    --    "annotations": {
    --        "checksum/config": "",
    --        "cni.projectcalico.org/containerID": "",
    --        "cni.projectcalico.org/podIP": "",
    --        "cni.projectcalico.org/podIPs": ""
    --    },
    --    "creationTimestamp": "",
    --    "labels": {
    --        "app": "",
    --        "component": "",
    --        "controller-revision-hash": "",
    --        "pod-template-generation": ""
    --    },
    --    "name": "",
    --    "namespace": "",
    --    "ownerReferences": [
    --        {
    --            "apiVersion": "",
    --            "blockOwnerDeletion": true,
    --            "controller": true,
    --            "kind": "",
    --            "name": "",
    --            "uid": ""
    --        }
    --    ],
    --    "uid": ""
    -- }
    -- Fixed syntax to convert the incoming JSON string containing pod metadata into a Lua table
    local metaData = dkjson.decode(metaJsonStr,1,nil)
    local workloadType = ""
    local workloadName = ""
    -- Note: for flexibility, the metadata JSON string of every Pod is passed to the Lua script,
    -- so you must filter out the Pods that do not require customization
    -- and return two empty strings for them
    if condition then -- condition stands for the criterion you use to filter out such Pods
        return "", "" -- return two empty strings for the filtered-out Pods
    end
    -- Return workloadType and workloadName through custom analysis of metaData
    return workloadType, workloadName
end
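As a concrete illustration of the filter placeholder above, the sketch below only handles Pods in a single namespace and returns two empty strings for everything else. Such a filter is also useful when several plugins are loaded at the same time, so that each plugin only acts on its own Pods. The namespace value demo-ns is purely illustrative:

package.path = package.path..";/bin/?.lua"
local dkjson = require("dkjson")

function GetWorkloadTypeAndName(metaJsonStr)
    local metaData = dkjson.decode(metaJsonStr, 1, nil) or {}
    -- Only handle Pods in one namespace; "demo-ns" is a hypothetical value
    if tostring(metaData["namespace"] or "") ~= "demo-ns" then
        return "", "" -- not a Pod this plugin is responsible for
    end
    -- ... derive workloadType and workloadName for the remaining Pods here ...
    return "", ""
end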

# 3. Upload Plugin

Lua plugins support runtime loading. After you upload a plugin with the deepflow-ctl tool in the DeepFlow runtime environment, it is loaded automatically. Execute the following command in that environment:

# Replace /home/tom/hello.lua with the path to your Lua plugin and hello with the name you want to give the plugin
deepflow-ctl plugin create --type lua --image /home/tom/hello.lua --name hello --user server

DeepFlow supports loading multiple Lua plugins at the same time. If you want different plugins to act on different Pods, make sure each plugin's filter rules are written accordingly. You can view the names of the plugins you have loaded with the following command:

deepflow-ctl plugin list

You can delete a specific plugin by its name with the following command:

deepflow-ctl plugin delete <name>

# 4. Example

For instance, if the metadata of a Pod in the current K8s environment is as follows:

"metadata": {
        "annotations": {},
        "creationTimestamp": "2024-08-07T02:08:14Z",
        "labels": {},
        "name": "",
        "namespace": "",
        "ownerReferences": [
            {
                "apiVersion": "",
                "blockOwnerDeletion": true,
                "controller": true,
                "kind": "OpenGaussCluster",
                "name": "ogtest",
                "uid": ""
            }
        ],
        "resourceVersion": "",
        "uid": ""
    }

The standardized method cannot extract the workload type and workload name here because the kind in ownerReferences is OpenGaussCluster, which is not a type supported by DeepFlow. The currently supported workload types are Deployment / StatefulSet / DaemonSet / CloneSet. You can write the following Lua script to map the workload type to one that DeepFlow supports:

package.path = package.path..";/bin/?.lua"
local dkjson = require("dkjson")

function GetWorkloadTypeAndName(metaJsonStr)
    local metaData = dkjson.decode(metaJsonStr,1,nil)
    local ownerReferencesData = metaData["ownerReferences"] or {}
    local metaTable = ownerReferencesData[1] or {}
    local workloadType = ""
    local workloadName = ""
    -- Get workloadType and ensure it is of string type
    workloadType = tostring(metaTable["kind"] or "")
    -- If we only want to process workloadType = "OpenGaussCluster", we can filter out other Pod metadata here
    if workloadType ~= "OpenGaussCluster" then
        -- Directly return empty strings for filtered out ones
        return "", ""
    else
        -- Process special Pods to make the returned workload type a supported type
        workloadType = "StatefulSet"
    end
    -- Get workloadName and ensure it is of string type
    -- Here, the Pod has ownerReferences data, and the name in ownerReferences is the workloadName
    -- If the Pod does not have ownerReferences data, you can calculate the workloadName based on the pod name
    workloadName = tostring(metaTable["name"] or "")
    return workloadType, workloadName
end
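Before uploading, the script can be exercised with a standalone Lua interpreter. The following is a minimal local test sketch, assuming dkjson is installed locally (for example via LuaRocks) and the script above is saved as opengauss.lua; all file names and metadata values here are illustrative only:

-- test_opengauss.lua: minimal local check of the plugin above (file names are hypothetical)
-- Assumes dkjson is installed locally, e.g. `luarocks install dkjson`
dofile("opengauss.lua") -- defines the global GetWorkloadTypeAndName

local sampleMeta = [[
{
    "name": "ogtest-0",
    "namespace": "default",
    "ownerReferences": [
        { "kind": "OpenGaussCluster", "name": "ogtest", "controller": true }
    ]
}
]]

-- Expected output: StatefulSet    ogtest
print(GetWorkloadTypeAndName(sampleMeta))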

After uploading this plugin, you can extract the corresponding workload type and workload name for this Pod.
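Uploading this example works the same way as in section 3; the file path and plugin name below are only placeholders:

# Replace the path and name with your own; the values below are placeholders
deepflow-ctl plugin create --type lua --image /home/tom/opengauss.lua --name opengauss --user server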