Pages

Saturday, April 9, 2022

Log Insight - event

Here is a Log Insight event in JSON format:
 {  
   "events":  
     [  
       {  
         "fields":  
           [  
             {  
               "name":"id",  
               "content":"20fd5502013f0e2b7d577d765fa4bd14a595a61810c120d96fa869bbdd1dda8f"  
             },  
             {  
               "name":"container",  
               "content":"/vigilant_goldberg"  
             },  
             {  
               "name":"tag",  
               "content":"docker"  
             }  
           ],  
         "text": "text log message",  
         "timestamp":1649536690000  
       }  
     ]  
 }  
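An event in this shape can also be posted directly to the Log Insight ingestion API (CFAPI) over HTTPS on port 9543. A minimal sketch, assuming a reachable Log Insight host; the hostname and agent id below are placeholders, and the actual POST is commented out so it only runs in a suitable environment:

```shell
# Hypothetical Log Insight host and agent id - replace with your own values.
LI_HOST="loginsight.example.com"
AGENT_ID="00000000-0000-0000-0000-000000000000"
# Log Insight expects the timestamp in milliseconds since the epoch.
TS=$(( $(date +%s) * 1000 ))
PAYLOAD="{\"events\":[{\"fields\":[{\"name\":\"tag\",\"content\":\"docker\"}],\"text\":\"test log message\",\"timestamp\":${TS}}]}"
# Post the event (uncomment on a host that can reach Log Insight):
# curl -k -X POST "https://${LI_HOST}:9543/api/v1/events/ingest/${AGENT_ID}" \
#      -H "Content-Type: application/json" -d "${PAYLOAD}"
echo "${PAYLOAD}"
```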

Monday, April 4, 2022

Photon OS - install fluentd agent for LogInsight

First of all, enable ICMP (ping) to Photon OS
Also allow HTTP connections on port 9323, where Docker exposes its Prometheus metrics endpoint.

iptables -A INPUT -p ICMP -j ACCEPT
iptables -A OUTPUT -p ICMP -j ACCEPT
iptables -A INPUT -p tcp --dport 9323 -m conntrack --ctstate NEW,ESTABLISHED -j ACCEPT
iptables -A OUTPUT -p tcp --sport 9323 -m conntrack --ctstate ESTABLISHED -j ACCEPT
iptables-save > /etc/systemd/scripts/ip4save

We can continue with Fluentd agent installation.

Installation of Fluentd agent in Photon OS

# this will install Fluentd agent along with Ruby package manager (aka gem) used for other Ruby package installations

tdnf install rubygem-fluentd

# this will install wget to Photon OS to download some other required software components

tdnf install wget

# this will download the VMware fluent-plugin-vmware-loginsight output plugin used to forward logs to VMware Log Insight

wget https://github.com/vmware/fluent-plugin-vmware-loginsight/releases/download/v1.0.0/fluent-plugin-vmware-loginsight-1.0.0.gem

# This will install VMware fluent-plugin-vmware-loginsight

gem install fluent-plugin-vmware-loginsight-1.0.0.gem

# This will install Docker fluent-plugin-docker

gem install fluent-plugin-docker

The Fluentd gem does not come with /etc/init.d/ scripts. You should use Process Management tools such as:

  • daemontools
  • runit
  • supervisord
  • upstart
  • systemd
Let's use systemd to manage Fluentd as a Linux service.
See: https://medium.com/@benmorel/creating-a-linux-service-with-systemd-611b5c8b91d6

Fluentd is located at /usr/lib/ruby/gems/2.7.0/bin/fluentd 
Let's create a Linux service (fluentd) with systemd.

vi /etc/systemd/system/fluentd.service with following content
[Unit]
Description=Fluentd service
After=network.target
StartLimitIntervalSec=0

[Service]
Type=simple
Restart=always
RestartSec=5
User=root
ExecStart=/usr/lib/ruby/gems/2.7.0/bin/fluentd

[Install]
WantedBy=multi-user.target

Now we can use the standard systemd (systemctl) procedures to work with the service.

systemctl enable fluentd
systemctl start fluentd
systemctl status fluentd

Configuration of Fluentd agent in Photon OS

Setup Fluentd configuration directory

/usr/lib/ruby/gems/2.7.0/gems/fluentd-1.11.3/bin/fluentd --setup /etc/fluent

Navigate to Fluentd configuration file (i.e. at /etc/fluent/fluent.conf).

Create a test config file manually at /etc/fluent/test_docker.conf

## built-in TCP input
## $ echo <json> | fluent-cat <tag>
<source>
  @type forward
  @id forward_input
</source> 
 
<match docker>
  @type stdout
  @id stdout_output
</match>

# run fluentd with test config

/usr/lib/ruby/gems/2.7.0/bin/fluentd -c /etc/fluent/test_docker.conf

# We can test logging with the following command

docker run -it --log-driver=fluentd --log-opt tag="docker" alpine ash

and you will see the log events on standard output.
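As the comment in the test config hints, you can also inject a test event without Docker by using fluent-cat, which ships with fluentd and sends a record to the forward input. A sketch; the actual send is commented out so it only runs where fluentd is listening:

```shell
# A sample JSON record, tagged "docker" so it matches the <match docker> block.
MSG='{"message":"hello from fluent-cat"}'
# Uncomment on a host where the fluentd forward input (default port 24224) is running:
# echo "${MSG}" | fluent-cat docker
echo "${MSG}"
```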

Default log driver and log options can be configured in docker configuration file /etc/docker/daemon.json

{
  "log-driver": "fluentd",
  "log-opts": {
    "tag": "docker",
    "mode": "non-blocking"
  },
  "metrics-addr" : "127.0.0.1:9323",
  "experimental" : true
}
The metrics-addr setting exposes Docker's built-in Prometheus metrics endpoint.

Restart Docker to activate the new configuration

systemctl restart docker
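After the restart, Docker's Prometheus metrics endpoint can be checked locally. A minimal sketch; the curl is commented out so it only runs on a host with the daemon.json above applied:

```shell
# Metrics endpoint as configured via "metrics-addr" in /etc/docker/daemon.json.
METRICS_URL="http://127.0.0.1:9323/metrics"
# On a configured host, this prints the Docker engine metrics:
# curl -s "${METRICS_URL}" | head
echo "${METRICS_URL}"
```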

Now you can run docker without --log parameters and still use fluentd log routing.

docker run -it alpine ash

The fluent-plugin-docker filter can be used to detect quoted JSON log messages and convert them into real JSON.

gem install fluent-plugin-docker
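To illustrate what this filter does conceptually: the fluentd log driver delivers a container's output as an escaped string in the log field, and the filter parses any embedded JSON into real fields. A sketch of the before/after record shape; the record contents are made up for illustration, and the "after" form is written out by hand rather than produced by the plugin:

```shell
# What the fluentd docker log driver emits: the container's JSON line
# arrives as an escaped string inside the "log" field.
BEFORE='{"log":"{\"level\":\"info\",\"msg\":\"started\"}","container_name":"/vigilant_goldberg"}'
# What the filter is described as producing: the quoted JSON parsed
# into real JSON (shown here by hand for comparison).
AFTER='{"log":{"level":"info","msg":"started"},"container_name":"/vigilant_goldberg"}'
echo "${BEFORE}"
echo "${AFTER}"
```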

The Fluentd configuration file is located at /etc/fluent/fluent.conf; below is an example configuration:

<source>  
  @type forward  
  @id forward_input  
</source>  
<filter docker>
  @type docker
</filter>
# Match everything else  
<match **>  
  @type copy  
  <store>  
   @type vmware_loginsight  
   @id out_vmw_li  
   scheme https  
   #ssl_verify true  
   ssl_verify false  
   # Loginsight host: One may use IP address or cname  
   host syslog.home.uw.cz  
   port 9543  
   #agent_id XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX  
   # Keys from log event whose values should be added as log message/text to  
   # Loginsight. Note these key/value pairs won't be added as metadata/fields  
   log_text_keys ["log","msg","message","source"]  
   # Use this flag if you want to enable http debug logs  
   http_conn_debug true  
   #http_conn_debug false  
  </store>  
  # copy plugin supports sending/copying logs to multiple plugins  
  # One may choose to send them to multiple LIs  
  # Or one may want send a copy to stdout for debugging  
  # Please note, if you use stdout along with LI, catch the logger's log to make  
  # sure they're not cyclic  
  #<store>  
  # @type stdout  
  #</store>  
</match>  

TODO: I still have to find a way to merge multiline log messages into a single event.
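One possible approach, not validated here, is the community fluent-plugin-concat filter, which merges consecutive lines into one event based on a start-of-record pattern. A sketch, assuming log lines start with an ISO-style date; the regexp must be adapted to the actual log format:

```
# gem install fluent-plugin-concat
<filter docker>
  @type concat
  key log
  # Lines that do NOT match this regexp are appended to the previous event.
  multiline_start_regexp /^\d{4}-\d{2}-\d{2}/
</filter>
```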

If we want to send logs to two log servers, we can do so by defining two <store> sections inside the copy output.

Let's install Fluentd plugin for Grafana Loki

gem install fluent-plugin-grafana-loki

and add additional <store>...</store> into /etc/fluent/fluent.conf

Here is the additional <store> snippet for loki ...

  <store>  
   @type loki  
   url "https://logs-prod-eu-west-0.grafana.net"  
   username "This is the loki user name"  
   password "For Grafana Cloud ... here should be the API key"  
   flush_interval 10s  
   flush_at_shutdown true  
   buffer_chunk_limit 1m  
   tenant dpasek  
   extra_labels {"worker":"fluentd"}  
   <label>  
    fluentd  
   </label>  
  </store>  

For more info about these topics, read the following articles ... 

Docker Logging (with runbook how to test it)
https://www.fluentd.org/guides/recipes/docker-logging



Configure Docker logging drivers
https://docs.docker.com/config/containers/logging/configure/

fluent-plugin-vmware-loginsight
https://github.com/vmware/fluent-plugin-vmware-loginsight

How to produce Prometheus metrics out of Logs using Fluentd

https://www.youtube.com/watch?v=fiqnLA2Qr98



Sunday, April 3, 2022

Understanding VMware Validated Solutions for VMware Cloud Foundation

From: Gary Blake <gblake@vmware.com>

Date: Wednesday, March 2, 2022 at 9:17 AM
To: VMware Validated Solutions <validated-solutions@vmware.com>
Subject: Understanding VMware Validated Solutions for VMware Cloud Foundation

 

For the last 5+ years VMware has been developing and maintaining VMware Validated Designs. Over that time we have seen significant changes across the VMware portfolio, and in order to remain relevant the team needed to take a step back, assess where we were, and formulate a plan for how we could evolve the value traditionally offered by VMware Validated Designs. VMware Validated Solutions is this evolution, but before I explain more it's important to understand the background.

 

Background

When the VMware Validated Design initiative was started all those years ago, it was obvious to the team involved that VMware had a great portfolio of products delivering many capabilities, but that assembling them into a single-stack solution was a massive challenge. The main intent of VMware Validated Design was to:

  • Ensure that our customers can be successful
  • Drive transparency across the interoperability for the underlying bill of materials
  • Deliver consistent and repeatable architecture at enterprise scale
  • Perform consistent validation across the solution for initial deployment and lifecycle management
  • Deliver operational efficiency

 

It’s worth pointing out that VMware Validated Designs were not and have never been a product per se: all the content ever produced has been freely available to all VMware customers on docs.vmware.com. When VMware released VMware Cloud Foundation, it was no surprise that the engineering teams working on both it and VMware Validated Design were merged into a new business unit; the crossover was clear, and it made perfect sense. Since that point the teams have spent numerous hours collaborating to align the architectures. Examples include VMware Cloud Foundation 3.0, where the Bring-Your-Own-Network (BYON) capabilities we introduced aligned with the stance we had taken with VMware Validated Design; the introduction of VMware Cloud Builder and its later consolidation into a single appliance; and more recently VMware Cloud Foundation 4.0 and VMware Validated Design 6.0, where we achieved true architecture alignment, mainly due to significant changes in the underlying vSphere and NSX-T Data Center components, and where the VMware Validated Design for the first time included VMware Cloud Foundation as a first-class citizen within the stack.

The Evolution

Since the VMware Validated Design 6.0 release, we have continued to ship a new version in line with each new version of VMware Cloud Foundation. Along the way, however, we have been hearing from our customers that some confusion has crept in: for example, information in the VMware Validated Design conflicting with the VMware Cloud Foundation documentation or with a VMware Cloud Foundation specific blog post, along with duplication of content. It’s for this reason that around VMworld 2020 I was asked to join a working group to investigate and identify a future strategy.

 

The outcome of the working group was to focus on two distinct work streams. The first is focused on VMware Cloud Foundation itself, which we refer to as ‘The Platform’; the second is focused on delivering capabilities on top of VMware Cloud Foundation, which we now refer to as ‘Solutions’. This is where the term VMware Validated Solutions comes in. The primary focus is to develop, validate, and maintain bite-size solutions that offer incremental value to our customers’ businesses, applying the same methodology, process, and procedures used when we developed the VMware Validated Design.

 

As they say, a picture paints a thousand words; the figure below illustrates how ‘Solutions’ are layered on top of VMware Cloud Foundation.

 

[Figure: ‘Solutions’ layered on top of VMware Cloud Foundation]

 

Introducing VMware Validated Solutions

In sync with the release of VMware Cloud Foundation 4.3, we first introduced VMware Validated Solutions. The team has worked hard behind the scenes developing, building, and validating a number of ‘Solutions’. It’s not just been about the solutions themselves, either: we wanted to make the content as discoverable as possible, which is why we chose the Tech Zone platform as the central landing page for all VMware Validated Solutions. There you will find a tile for each solution released.

 

[Figure: VMware Validated Solutions landing page on Tech Zone]

 

 

Clicking the View Resource Page link will take you to the focus page for that solution, where you will find tiles with links to the content for that solution, which will include:

 

  • Design Objectives
  • Detailed Design
  • Design Decisions
  • Planning and Preparation
  • Implementation
  • Operational Guidance
  • Solution Interoperability

 

Over time, as new content is developed, it will be made available in the same interface, so you should consider the landing page the single source of truth for all VMware Validated Solutions. For more details on what each ‘Solution’ provides, check out the content; for the record, the following VMware Validated Solutions are available at this time:

 

 

The plan is to add additional ‘Solutions’ over time; these may be developed within our own team or by another group within VMware (which we refer to as a 2nd Party). Look out for future What’s New update emails.

Infrastructure as Code

In addition to these new bite-size ‘Solutions’, we also introduced the concept of ‘Infrastructure as Code’. This entails developing and providing automation to accelerate the implementation steps, helping customers install and configure in a robust and repeatable way and enabling them to realize the business benefits faster. This was exploratory for the team initially, but since the initial launch we have continued to add more coverage; in fact, 5 of the 7 solutions now have 100% coverage for implementation automation.

 

This automation is primarily delivered through a PowerShell module we developed called PowerValidatedSolutions, which can be installed by a customer directly from the Microsoft PowerShell Gallery and used to perform various configuration procedures. Each function is purpose-built to support the procedure being performed within the respective ‘Solution’, and where possible we use the SDDC Manager inventory to gather the details we need, saving the user from having to define input values. As part of the cmdlets we have also developed pre- and post-validation checks to ensure that if something fails, it fails gracefully and we provide a clear reason. The source code is available as open source and can be downloaded from the PowerValidatedSolutions GitHub repository, and of course contributions are welcome. It’s also worth pointing out that the PowerShell module can be utilised not just for ‘Solutions’ but for other scenarios too, because to develop the procedure-specific functions we had to develop many sub-functions to work with VMware product APIs.

 

On the Terraform front, this is today only used within the Private Cloud Automation for VMware Cloud Foundation solution but offers the same approach; the Terraform files can be downloaded from the Private Cloud Automation GitHub repository.

Time To Deploy

Last but by no means least, each ‘Solution’ also comes with a ‘Time to Deploy’ value, intended to provide an estimate of how long each solution might take to implement. The key point here is the actual implementation: this does not include the time to understand the solution design, prepare your environment, or perform the data capture for hostnames, IP addresses, etc. It’s also worth calling out that these times are based on someone from within the VMware team performing the tasks, who is more than likely already familiar with the products or the solution.

VMware Blogs

Feb 22, 2022: VMware Validated Solutions – February 2022 Update

Jan 25, 2022: VMware Validated Solutions – January 2022 Update

Dec 01, 2021: VMware Validated Solutions – November 2021 Update

Oct 07, 2021: VMware Validated Solutions – October 2021 Update

Oct 07, 2021: Site Protection & Disaster Recovery for VMware Cloud Foundation Validated Solution

Oct 07, 2021: Planning & Preparation for Site Protection and Disaster Recovery with VMware Cloud Foundation

Sep 02, 2021: Introducing the Developer Ready Infrastructure VMware Validated Solution

Sep 01, 2021: Advanced Load Balancing for VMware Cloud Foundation Solution

Aug 24, 2021: Deliver Value-Enhancing Solutions with VMware Validated Solutions

What Next?

  • Go check them out for yourself, visit https://core.vmware.com/vmware-validated-solutions for more details.
  • Already utilising Validated Solutions, or planning to soon? Drop us an email and let us know which ‘Solutions’ and for which customer.
  • Have feedback on existing content, or ideas and suggestions for improvements or future ‘Solutions’? Drop us a note.

Email:              validated-solutions@vmware.com

Slack:               #validated-solutions