Hugo is a static site generator written in Go. It is conceptually similar to Jekyll, albeit with far more speed and flexibility. Hugo also supports generating output formats other than HTML, which allows users to pipe content directly into an Elasticsearch cluster.
In this guide, we will use this feature to have Hugo generate the exact payload format expected by the _bulk endpoint of Elasticsearch.
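For context, the _bulk endpoint expects newline-delimited JSON: each document occupies two lines, an action line naming the target index and document ID, followed by the document source itself. The values below are illustrative only, not output from a real site:

```json
{"index":{"_index":"hugo","_type":"doc","_id":"abc123"}}
{"title":"My First Post","tags":["hugo","search"]}
```

Hugo's output-format support lets us emit exactly this shape from a template.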
First Steps
To follow this guide, you will need a few things in place:
- Make Sure You Have Hugo Installed. This guide assumes you already have Hugo installed and configured on your system. Visit the Hugo Documentation to get started.
- Spin Up a Bonsai Elasticsearch Cluster. This guide will use a Bonsai cluster as the Elasticsearch backend.
- Create an Index on the Cluster. In this example, we're going to push data into an index called hugo. This index needs to be created before any data can be stored in it. The index can be created either through the Interactive Console, or with a tool like curl:
# Use the URL for your cluster. A Bonsai URL looks something like this:
curl -XPUT https://user123:pass456@my-awesome-cluster-1234.us-east-1.bonsai.io/hugo
Configure Hugo to Output to Bonsai Elasticsearch
Hugo's configuration settings live in a file called config.toml by default. This file may also have a .json or .yaml/.yml extension. Add the following snippet based on your config file format:
TOML:
[outputs]
home = ["HTML", "RSS", "Bonsai"]

[outputFormats.Bonsai]
baseName = "bonsai"
isPlainText = true
mediaType = "application/json"
notAlternative = true

[params.bonsai]
vars = ["title", "summary", "date", "publishdate", "expirydate", "permalink"]
params = ["categories", "tags"]
JSON:
{
  "outputs": {
    "home": ["HTML", "RSS", "Bonsai"]
  },
  "outputFormats": {
    "Bonsai": {
      "baseName": "bonsai",
      "isPlainText": true,
      "mediaType": "application/json",
      "notAlternative": true
    }
  },
  "params": {
    "bonsai": {
      "vars": ["title", "summary", "date", "publishdate", "expirydate", "permalink"],
      "params": ["categories", "tags"]
    }
  }
}
YAML:
outputs:
  home:
    - HTML
    - RSS
    - Bonsai
outputFormats:
  Bonsai:
    baseName: bonsai
    isPlainText: true
    mediaType: application/json
    notAlternative: true
params:
  bonsai:
    vars:
      - title
      - summary
      - date
      - publishdate
      - expirydate
      - permalink
    params:
      - categories
      - tags
This snippet registers a new output format called "Bonsai" for the home page, and specifies which page variables and custom params it should include.
Creating the JSON template
Hugo needs to have a template for rendering data in a way that Elasticsearch will understand. To do this, we will define a JSON template that conforms to the Elasticsearch Bulk API.
Create a template called layouts/_default/list.bonsai.json and give it the following content:
{{/* Generates a valid Elasticsearch _bulk index payload */}}
{{- $section := $.Site.GetPage "section" .Section }}
{{- range .Site.AllPages -}}
  {{- if or (and (.IsDescendant $section) (and (not .Draft) (not .Params.private))) $section.IsHome -}}
    {{/* action / metadata */}}
    {{ (dict "index" (dict "_index" "hugo" "_type" "doc" "_id" .UniqueID)) | jsonify }}
    {{/* document source */}}
    {{ (dict "objectID" .UniqueID "date" .Date.UTC.Unix "description" .Description "dir" .Dir "expirydate" .ExpiryDate.UTC.Unix "fuzzywordcount" .FuzzyWordCount "keywords" .Keywords "kind" .Kind "lang" .Lang "lastmod" .Lastmod.UTC.Unix "permalink" .Permalink "publishdate" .PublishDate "readingtime" .ReadingTime "relpermalink" .RelPermalink "summary" .Summary "title" .Title "type" .Type "url" .URL "weight" .Weight "wordcount" .WordCount "section" .Section "tags" .Params.Tags "categories" .Params.Categories "authors" .Params.Authors) | jsonify }}
  {{- end -}}
{{- end }}
When the site is generated, this will create a file called public/bonsai.json, with the content stored in a format that can be pushed directly into Elasticsearch using the Bulk API.
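Before pushing, it can be worth sanity-checking the generated file locally. The following Python sketch is not part of Hugo; it simply encodes the _bulk format's rule that documents come in pairs of lines, an action line followed by a source line, each of which must be valid JSON (the file path and the specific checks are our assumptions):

```python
import json

def validate_bulk_payload(path):
    """Check that a file is valid _bulk NDJSON: alternating
    action-metadata lines and document lines, each parseable JSON.
    Returns the number of documents found."""
    with open(path) as f:
        lines = [ln for ln in f.read().splitlines() if ln.strip()]
    if len(lines) % 2 != 0:
        raise ValueError("expected an even number of lines (action + document pairs)")
    for i in range(0, len(lines), 2):
        action = json.loads(lines[i])          # action/metadata line
        if "index" not in action:
            raise ValueError(f"line {i + 1}: missing 'index' action")
        json.loads(lines[i + 1])               # document line must also parse
    return len(lines) // 2

# Example: validate_bulk_payload("public/bonsai.json")
```

An odd line count or a JSON parse error here usually means the template emitted a stray or malformed line, which the _bulk endpoint would reject.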
Push the Data Into Elasticsearch
To get the site's data into Elasticsearch, render the site by running hugo on the command line. Then send the generated file to your Bonsai cluster with curl:
curl -H "Content-Type: application/x-ndjson" -XPOST "https://user123:pass456@my-awesome-cluster-1234.us-east-1.bonsai.io/_bulk" --data-binary @public/bonsai.json
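Note that the _bulk endpoint can return HTTP 200 even when individual documents fail, so it pays to inspect the response body rather than just the status code. A minimal sketch of such a check (the helper name is ours; the response shape follows the Bulk API, which reports per-item status under "items"):

```python
import json

def failed_items(bulk_response_body):
    """Collect (document id, error reason) pairs for any documents
    that failed to index in an Elasticsearch _bulk response."""
    resp = json.loads(bulk_response_body)
    if not resp.get("errors"):
        return []  # the top-level "errors" flag says every item succeeded
    failures = []
    for item in resp.get("items", []):
        result = item.get("index", {})
        if "error" in result:
            failures.append((result.get("_id"), result["error"].get("reason")))
    return failures
```

Save the curl output to a file and run this against it; an empty list means every document was accepted.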
You should now be able to see your data in the Elasticsearch cluster:
$ curl -XGET "https://user123:pass456@my-awesome-cluster-1234.us-east-1.bonsai.io/_search"
{"took":1,"timed_out":false,"_shards":{"total":2,"successful":2,"failed":0},"hits":{"total":1,"max_score":1.0,"hits":[{"_index":"hugo","_type":"doc","_id":...