HCL 2.0 replaces HCL 1.0

This is the first step in bringing HCL 2 into the main HCL repository.
Subsequent commits will prune and reorganize this in preparation for
an initial tagged HCL 2 release.
Martin Atkins 2019-09-09 15:35:18 -07:00
commit af14e80638
527 changed files with 61314 additions and 10793 deletions


@@ -1,21 +0,0 @@
### HCL Template
```hcl
# Place your HCL configuration file here
```
### Expected behavior
What should have happened?
### Actual behavior
What actually happened?
### Steps to reproduce
1.
2.
3.
### References
Are there any other GitHub issues (open or closed) that should
be linked here? For example:
- GH-1234
- ...

9
.gitignore vendored

@@ -1,9 +0,0 @@
y.output
# ignore intellij files
.idea
*.iml
*.ipr
*.iws
*.test

12
.travis.sh Executable file

@@ -0,0 +1,12 @@
#!/bin/bash
set -e
echo "" > coverage.txt
for d in $(go list ./... | grep -v vendor); do
go test -coverprofile=profile.out -covermode=atomic $d
if [ -f profile.out ]; then
cat profile.out >> coverage.txt
rm profile.out
fi
done


@@ -1,13 +1,17 @@
sudo: false
language: go
go:
- 1.x
- tip
- 1.11.x
- 1.12.x
branches:
only:
- master
env:
- GO111MODULE=on
script: make test
before_install:
- go get -v ./...
script:
- ./.travis.sh
after_success:
- bash <(curl -s https://codecov.io/bash)


@@ -351,4 +351,3 @@ Exhibit B - “Incompatible With Secondary Licenses” Notice
This Source Code Form is “Incompatible
With Secondary Licenses”, as defined by
the Mozilla Public License, v. 2.0.


@@ -1,18 +0,0 @@
TEST?=./...

default: test

fmt: generate
	go fmt ./...

test: generate
	go get -t ./...
	go test $(TEST) $(TESTARGS)

generate:
	go generate ./...

updatedeps:
	go get -u golang.org/x/tools/cmd/stringer

.PHONY: default generate test updatedeps

241
README.md

@@ -1,125 +1,172 @@
# HCL
[![GoDoc](https://godoc.org/github.com/hashicorp/hcl?status.png)](https://godoc.org/github.com/hashicorp/hcl) [![Build Status](https://travis-ci.org/hashicorp/hcl.svg?branch=master)](https://travis-ci.org/hashicorp/hcl)
HCL is a toolkit for creating structured configuration languages that are
both human- and machine-friendly, for use with command-line tools.
Although intended to be generally useful, it is primarily targeted
towards devops tools, servers, etc.
HCL (HashiCorp Configuration Language) is a configuration language built
by HashiCorp. The goal of HCL is to build a structured configuration language
that is both human and machine friendly for use with command-line tools, but
specifically targeted towards DevOps tools, servers, etc.
HCL has both a _native syntax_, intended to be pleasant to read and write for
humans, and a JSON-based variant that is easier for machines to generate
and parse.
HCL is also fully JSON compatible. That is, JSON can be used as completely
valid input to a system expecting HCL. This helps make systems
interoperable with other systems.
The HCL native syntax is inspired by [libucl](https://github.com/vstakhov/libucl),
[nginx configuration](http://nginx.org/en/docs/beginners_guide.html#conf_structure),
and others.
HCL is heavily inspired by
[libucl](https://github.com/vstakhov/libucl),
nginx configuration, and other similar formats.
It includes an expression syntax that allows basic inline computation and,
with support from the calling application, use of variables and functions
for more dynamic configuration languages.
HCL provides a set of constructs that can be used by a calling application to
construct a configuration language. The application defines which attribute
names and nested block types are expected, and HCL parses the configuration
file, verifies that it conforms to the expected structure, and returns
high-level objects that the application can use for further processing.
## Experimental HCL2
This repository contains the experimental version 2 of HCL. This new version
combines the initial iteration of HCL with the interpolation language HIL
to produce a single configuration language that supports arbitrary expressions.
At this time the HCL2 syntax and the Go API are still evolving.
Backward-compatibility is not guaranteed and so any application using this
library should use vendoring.
The new implementation has a completely new parser and Go API, with no direct
migration path. Although the syntax is similar, the implementation takes some
very different approaches to improve on some "rough edges" that existed with
the original implementation and to allow for more robust error handling.
Once this new implementation reaches stability, its package paths will be
changed to reflect that it is the _current_ HCL implementation. At that time,
the original implementation will be archived.
## Why?
A common question when viewing HCL is: why not JSON, YAML, etc.?
Newcomers to HCL often ask: why not JSON, YAML, etc.?
Prior to HCL, the tools we built at [HashiCorp](http://www.hashicorp.com)
used a variety of configuration languages from full programming languages
such as Ruby to complete data structure languages such as JSON. What we
learned is that some people wanted human-friendly configuration languages
and some people wanted machine-friendly languages.
Whereas JSON and YAML are formats for serializing data structures, HCL is
a syntax and API specifically designed for building structured configuration
formats.
JSON fits a nice balance in this, but is fairly verbose and most
importantly doesn't support comments. With YAML, we found that beginners
had a really hard time determining what the actual structure was, and
ended up guessing more often than not whether to use a hyphen, colon, etc.
in order to represent some configuration key.
HCL attempts to strike a compromise between generic serialization formats
such as JSON and configuration formats built around full programming languages
such as Ruby. HCL syntax is designed to be easily read and written by humans,
and allows _declarative_ logic to permit its use in more complex applications.
Full programming languages such as Ruby enable complex behavior that
a configuration language shouldn't usually allow, and also force
people to learn at least some Ruby.
HCL is intended as a base syntax for configuration formats built
around key-value pairs and hierarchical blocks whose structure is well-defined
by the calling application, and this definition of the configuration structure
allows for better error messages and more convenient definition within the
calling application.
Because of this, we decided to create our own configuration language
that is JSON-compatible. Our configuration language (HCL) is designed
to be written and modified by humans. The API for HCL allows JSON
as an input so that it is also machine-friendly (machines can generate
JSON instead of trying to generate HCL).
It can't be denied that JSON is very convenient as a _lingua franca_
for interoperability between different pieces of software. Because of this,
HCL defines a common configuration model that can be parsed from either its
native syntax or from a well-defined equivalent JSON structure. This allows
configuration to be provided as a mixture of human-authored configuration
files in the native syntax and machine-generated files in JSON.
Our goal with HCL is not to alienate other configuration languages.
It is instead to provide HCL as a specialized language for our tools,
and JSON as the interoperability layer.
## Information Model and Syntax
## Syntax
HCL is built around two primary concepts: _attributes_ and _blocks_. In
native syntax, a configuration file for a hypothetical application might look
something like this:
For a complete grammar, please see the parser itself. A high-level overview
of the syntax and grammar is listed here.
```hcl
io_mode = "async"
* Single line comments start with `#` or `//`
service "http" "web_proxy" {
listen_addr = "127.0.0.1:8080"
process "main" {
command = ["/usr/local/bin/awesome-app", "server"]
}
* Multi-line comments are wrapped in `/*` and `*/`. Nested block comments
are not allowed. A multi-line comment (also known as a block comment)
terminates at the first `*/` found.
* Values are assigned with the syntax `key = value` (whitespace doesn't
matter). The value can be any primitive: a string, number, boolean,
object, or list.
* Strings are double-quoted and can contain any UTF-8 characters.
Example: `"Hello, World"`
* Multi-line strings start with `<<EOF` at the end of a line, and end
with `EOF` on its own line ([here documents](https://en.wikipedia.org/wiki/Here_document)).
Any text may be used in place of `EOF`. Example:
```
<<FOO
hello
world
FOO
```
* Numbers are assumed to be base 10. If you prefix a number with 0x,
it is treated as hexadecimal. If it is prefixed with 0, it is
treated as octal. Numbers can be in scientific notation: "1e10".
* Boolean values: `true`, `false`
* Arrays can be made by wrapping values in `[]`. Example:
`["foo", "bar", 42]`. Arrays can contain primitives,
other arrays, and objects. As an alternative, lists
of objects can be created with repeated blocks, using
this structure:
```hcl
service {
  key = "value"
}
service {
  key = "value"
}
```
Objects and nested objects are created using the structure shown below:
```
variable "ami" {
description = "the AMI to use"
process "mgmt" {
command = ["/usr/local/bin/awesome-app", "mgmt"]
}
}
```
This would be equivalent to the following JSON:
``` json
The JSON equivalent of this configuration is the following:
```json
{
"variable": {
"ami": {
"description": "the AMI to use"
"io_mode": "async",
"service": {
"http": {
"web_proxy": {
"listen_addr": "127.0.0.1:8080",
"process": {
"main": {
"command": ["/usr/local/bin/awesome-app", "server"]
},
"mgmt": {
"command": ["/usr/local/bin/awesome-app", "mgmt"]
},
}
}
}
}
}
```
## Thanks
Regardless of which syntax is used, the API within the calling application
is the same. It can either work directly with the low-level attributes and
blocks, for more advanced use-cases, or it can use one of the _decoder_
packages to declaratively extract into either Go structs or dynamic value
structures.
Thanks to:
Attribute values can be expressions as well as just literal values:
* [@vstakhov](https://github.com/vstakhov) - The original libucl parser
and syntax that HCL was based on.
```hcl
# Arithmetic with literals and application-provided variables
sum = 1 + addend
* [@fatih](https://github.com/fatih) - The rewritten HCL parser
in pure Go (no goyacc) and support for a printer.
# String interpolation and templates
message = "Hello, ${name}!"
# Application-provided functions
shouty_message = upper(message)
```
Although JSON syntax doesn't permit direct use of expressions, the interpolation
syntax allows use of arbitrary expressions within JSON strings:
```json
{
"sum": "${1 + addend}",
"message": "Hello, ${name}!",
"shouty_message": "${upper(message)}"
}
```
For more information, see the detailed specifications:
* [Syntax-agnostic Information Model](hcl/spec.md)
* [HCL Native Syntax](hcl/hclsyntax/spec.md)
* [JSON Representation](hcl/json/spec.md)
## Acknowledgements
HCL was heavily inspired by [libucl](https://github.com/vstakhov/libucl),
by [Vsevolod Stakhov](https://github.com/vstakhov).
HCL and HIL originate in [HashiCorp Terraform](https://terraform.io/),
with the original parsers for each written by
[Mitchell Hashimoto](https://github.com/mitchellh).
The original HCL parser was ported to pure Go (from yacc) by
[Fatih Arslan](https://github.com/fatih). The structure-related portions of
the new native syntax parser build on that work.
The original HIL parser was ported to pure Go (from yacc) by
[Martin Atkins](https://github.com/apparentlymart). The expression-related
portions of the new native syntax parser build on that work.
HCL2, which merged the original HCL and HIL languages into this single new
language, builds on design and prototyping work by
[Martin Atkins](https://github.com/apparentlymart) in
[zcl](https://github.com/zclconf/go-zcl).


@@ -1,19 +0,0 @@
version: "build-{branch}-{build}"
image: Visual Studio 2015
clone_folder: c:\gopath\src\github.com\hashicorp\hcl
environment:
GOPATH: c:\gopath
init:
- git config --global core.autocrlf false
install:
- cmd: >-
echo %Path%
go version
go env
go get -t ./...
build_script:
- cmd: go test -v ./...

100
cmd/hcldec/README.md Normal file

@@ -0,0 +1,100 @@
# hcldec
`hcldec` is a command line tool that transforms HCL input into JSON output
using a decoding specification given by the user.
This tool is intended as a "glue" tool, with use-cases like the following:
* Define an HCL-based configuration format for a third-party tool that takes
JSON as input, and then translate the HCL configuration into JSON before
running the tool. (See [the `npm-package` example](examples/npm-package).)
* Use HCL from languages where an HCL parser/decoder is not yet available.
At the time of writing, that's any language other than Go.
* In particular, define an HCL-based configuration format for a shell script
and then use `jq` to load the result into environment variables for
further processing. (See [the `sh-config-file` example](examples/sh-config-file).)
## Installation
If you have a working Go development environment, you can install this tool
with `go get` in the usual way:
```
$ go get -u github.com/hashicorp/hcl2/cmd/hcldec
```
This will install `hcldec` in `$GOPATH/bin`, which usually places it into
your shell `PATH` so you can then run it as `hcldec`.
## Usage
```
usage: hcldec --spec=<spec-file> [options] [hcl-file ...]
  -o, --out string         write to the given file, instead of stdout
  -s, --spec string        path to spec file (required)
  -V, --vars json-or-file  provide variables to the given configuration file(s)
  -v, --version            show the version number and immediately exit
```
The most important step in using `hcldec` is to write the specification that
defines how to interpret the given configuration files and translate them
into JSON. The following is a simple specification that creates a JSON
object from two top-level attributes in the input configuration:
```hcl
object {
  attr "name" {
    type = string
    required = true
  }
  attr "is_member" {
    type = bool
  }
}
```
Specification files are conventionally kept in files with a `.hcldec`
extension. We'll call this one `example.hcldec`.
With the above specification, the following input file `example.conf` is
valid:
```hcl
name = "Raul"
```
The spec and the input file can then be provided to `hcldec` to extract a
JSON representation:
```
$ hcldec --spec=example.hcldec example.conf
{"name": "Raul"}
```
The specification defines both how to map the input into a JSON data structure
and what input is valid. The `required = true` specified for the `name`
attribute allows `hcldec` to detect and raise an error when an attribute of
that name is not provided:
```
$ hcldec --spec=example.hcldec typo.conf
Error: Unsupported attribute
on typo.conf line 1:
1: namme = "Juan"
An attribute named "namme" is not expected here. Did you mean "name"?
Error: Missing required attribute
on typo.conf line 2:
The attribute "name" is required, but no definition was found.
```
## Further Reading
For more details on the `.hcldec` specification file format, see
[the spec file documentation](spec-format.md).

101
cmd/hcldec/diags_json.go Normal file

@@ -0,0 +1,101 @@
package main
import (
"encoding/json"
"io"
"github.com/hashicorp/hcl2/hcl"
)
type jsonDiagWriter struct {
w io.Writer
diags hcl.Diagnostics
}
var _ hcl.DiagnosticWriter = &jsonDiagWriter{}
func (wr *jsonDiagWriter) WriteDiagnostic(diag *hcl.Diagnostic) error {
wr.diags = append(wr.diags, diag)
return nil
}
func (wr *jsonDiagWriter) WriteDiagnostics(diags hcl.Diagnostics) error {
wr.diags = append(wr.diags, diags...)
return nil
}
func (wr *jsonDiagWriter) Flush() error {
if len(wr.diags) == 0 {
return nil
}
type PosJSON struct {
Line int `json:"line"`
Column int `json:"column"`
Byte int `json:"byte"`
}
type RangeJSON struct {
Filename string `json:"filename"`
Start PosJSON `json:"start"`
End PosJSON `json:"end"`
}
type DiagnosticJSON struct {
Severity string `json:"severity"`
Summary string `json:"summary"`
Detail string `json:"detail,omitempty"`
Subject *RangeJSON `json:"subject,omitempty"`
}
type DiagnosticsJSON struct {
Diagnostics []DiagnosticJSON `json:"diagnostics"`
}
diagsJSON := make([]DiagnosticJSON, 0, len(wr.diags))
for _, diag := range wr.diags {
var diagJSON DiagnosticJSON
switch diag.Severity {
case hcl.DiagError:
diagJSON.Severity = "error"
case hcl.DiagWarning:
diagJSON.Severity = "warning"
default:
diagJSON.Severity = "(unknown)" // should never happen
}
diagJSON.Summary = diag.Summary
diagJSON.Detail = diag.Detail
if diag.Subject != nil {
diagJSON.Subject = &RangeJSON{}
sJSON := diagJSON.Subject
rng := diag.Subject
sJSON.Filename = rng.Filename
sJSON.Start.Line = rng.Start.Line
sJSON.Start.Column = rng.Start.Column
sJSON.Start.Byte = rng.Start.Byte
sJSON.End.Line = rng.End.Line
sJSON.End.Column = rng.End.Column
sJSON.End.Byte = rng.End.Byte
}
diagsJSON = append(diagsJSON, diagJSON)
}
src, err := json.MarshalIndent(DiagnosticsJSON{diagsJSON}, "", " ")
if err != nil {
return err
}
_, err = wr.w.Write(src)
wr.w.Write([]byte{'\n'})
return err
}
type flusher interface {
Flush() error
}
func flush(maybeFlusher interface{}) error {
if f, ok := maybeFlusher.(flusher); ok {
return f.Flush()
}
return nil
}


@@ -0,0 +1,14 @@
name = "hello-world"
version = "v0.0.1"
author {
  name = "Иван Петрович Сидоров"
}
contributor {
  name = "Juan Pérez"
}
dependencies = {
  left-pad = "1.2.0"
}


@@ -0,0 +1,136 @@
object {
attr "name" {
type = string
required = true
}
attr "version" {
type = string
required = true
}
attr "description" {
type = string
}
attr "keywords" {
type = list(string)
}
attr "homepage" {
# "homepage_url" in input file is translated to "homepage" in output
name = "homepage_url"
}
block "bugs" {
object {
attr "url" {
type = string
}
attr "email" {
type = string
}
}
}
attr "license" {
type = string
}
block "author" {
object {
attr "name" {
type = string
}
attr "email" {
type = string
}
attr "url" {
type = string
}
}
}
block_list "contributors" {
block_type = "contributor"
object {
attr "name" {
type = string
}
attr "email" {
type = string
}
attr "url" {
type = string
}
}
}
attr "files" {
type = list(string)
}
attr "main" {
type = string
}
attr "bin" {
type = map(string)
}
attr "man" {
type = list(string)
}
attr "directories" {
type = map(string)
}
block "repository" {
object {
attr "type" {
type = string
required = true
}
attr "url" {
type = string
required = true
}
}
}
attr "scripts" {
type = map(string)
}
attr "config" {
type = map(string)
}
attr "dependencies" {
type = map(string)
}
attr "devDependencies" {
name = "dev_dependencies"
type = map(string)
}
attr "peerDependencies" {
name = "peer_dependencies"
type = map(string)
}
attr "bundledDependencies" {
name = "bundled_dependencies"
type = map(string)
}
attr "optionalDependencies" {
name = "optional_dependencies"
type = map(string)
}
attr "engines" {
type = map(string)
}
attr "os" {
type = list(string)
}
attr "cpu" {
type = list(string)
}
attr "prefer_global" {
type = bool
}
default "private" {
attr {
name = "private"
type = bool
}
literal {
value = false
}
}
attr "publishConfig" {
type = map(any)
}
}


@@ -0,0 +1,10 @@
name = "Juan"
friend {
name = "John"
}
friend {
name = "Yann"
}
friend {
name = "Ermintrude"
}


@@ -0,0 +1,26 @@
#!/bin/bash
set -euo pipefail
# All paths from this point on are relative to the directory containing this
# script, for simplicity's sake.
cd "$( dirname "${BASH_SOURCE[0]}" )"
# Read the config file using hcldec and then use jq to extract values in a
# shell-friendly form. jq will ensure that the values are properly quoted and
# escaped for consumption by the shell.
CONFIG_VARS="$(hcldec --spec=spec.hcldec example.conf | jq -r '@sh "NAME=\(.name) GREETING=\(.greeting) FRIENDS=(\(.friends))"')"
# Capture the pipeline's exit status before the [ test below overwrites $?.
hcldec_status=$?
if [ $hcldec_status != 0 ]; then
  # If hcldec or jq failed then it has already printed out some error messages
  # and so we can bail out.
  exit $hcldec_status
fi
# Import our settings into our environment
eval "$CONFIG_VARS"
# ...and now, some contrived usage of the settings we loaded:
echo "$GREETING $NAME!"
for name in "${FRIENDS[@]}"; do
  echo "$GREETING $name, too!"
done
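Assuming the `example.conf` and `spec.hcldec` shown alongside this example (a `name` of "Juan", three `friend` blocks, and the default greeting of "Hello" supplied by the spec), running the script should print something like:
```
Hello Juan!
Hello John, too!
Hello Yann, too!
Hello Ermintrude, too!
```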


@@ -0,0 +1,23 @@
object {
  attr "name" {
    type = string
    required = true
  }
  default "greeting" {
    attr {
      name = "greeting"
      type = string
    }
    literal {
      value = "Hello"
    }
  }
  block_list "friends" {
    block_type = "friend"
    attr {
      name = "name"
      type = string
      required = true
    }
  }
}

369
cmd/hcldec/main.go Normal file

@@ -0,0 +1,369 @@
package main
import (
"encoding/json"
"fmt"
"io/ioutil"
"os"
"strings"
"github.com/hashicorp/hcl2/hcl"
"github.com/hashicorp/hcl2/hcldec"
"github.com/hashicorp/hcl2/hclparse"
flag "github.com/spf13/pflag"
"github.com/zclconf/go-cty/cty"
"github.com/zclconf/go-cty/cty/function"
ctyjson "github.com/zclconf/go-cty/cty/json"
"golang.org/x/crypto/ssh/terminal"
)
const versionStr = "0.0.1-dev"
// vars is populated from --vars arguments on the command line, via a flag
// registration in init() below.
var vars = &varSpecs{}
var (
specFile = flag.StringP("spec", "s", "", "path to spec file (required)")
outputFile = flag.StringP("out", "o", "", "write to the given file, instead of stdout")
diagsFormat = flag.StringP("diags", "", "", "format any returned diagnostics in the given format; currently only \"json\" is accepted")
showVarRefs = flag.BoolP("var-refs", "", false, "rather than decoding input, produce a JSON description of the variables referenced by it")
withType = flag.BoolP("with-type", "", false, "include an additional object level at the top describing the HCL-oriented type of the result value")
showVersion = flag.BoolP("version", "v", false, "show the version number and immediately exit")
)
var parser = hclparse.NewParser()
var diagWr hcl.DiagnosticWriter // initialized in init
func init() {
flag.VarP(vars, "vars", "V", "provide variables to the given configuration file(s)")
}
func main() {
flag.Usage = usage
flag.Parse()
if *showVersion {
fmt.Println(versionStr)
os.Exit(0)
}
args := flag.Args()
switch *diagsFormat {
case "":
color := terminal.IsTerminal(int(os.Stderr.Fd()))
w, _, err := terminal.GetSize(int(os.Stdout.Fd()))
if err != nil {
w = 80
}
diagWr = hcl.NewDiagnosticTextWriter(os.Stderr, parser.Files(), uint(w), color)
case "json":
diagWr = &jsonDiagWriter{w: os.Stderr}
default:
fmt.Fprintf(os.Stderr, "Invalid diagnostics format %q: only \"json\" is supported.\n", *diagsFormat)
os.Exit(2)
}
err := realmain(args)
if err != nil {
fmt.Fprintf(os.Stderr, "Error: %s\n\n", err.Error())
os.Exit(1)
}
}
func realmain(args []string) error {
if *specFile == "" {
return fmt.Errorf("the --spec=... argument is required")
}
var diags hcl.Diagnostics
specContent, specDiags := loadSpecFile(*specFile)
diags = append(diags, specDiags...)
if specDiags.HasErrors() {
diagWr.WriteDiagnostics(diags)
flush(diagWr)
os.Exit(2)
}
spec := specContent.RootSpec
ctx := &hcl.EvalContext{
Variables: map[string]cty.Value{},
Functions: map[string]function.Function{},
}
for name, val := range specContent.Variables {
ctx.Variables[name] = val
}
for name, f := range specContent.Functions {
ctx.Functions[name] = f
}
if len(*vars) != 0 {
for i, varsSpec := range *vars {
var vals map[string]cty.Value
var valsDiags hcl.Diagnostics
if strings.HasPrefix(strings.TrimSpace(varsSpec), "{") {
// literal JSON object on the command line
vals, valsDiags = parseVarsArg(varsSpec, i)
} else {
// path to a file containing either HCL or JSON (by file extension)
vals, valsDiags = parseVarsFile(varsSpec)
}
diags = append(diags, valsDiags...)
for k, v := range vals {
ctx.Variables[k] = v
}
}
}
// If we have empty context elements then we'll nil them out so that
// we'll produce e.g. "variables are not allowed" errors instead of
// "variable not found" errors.
if len(ctx.Variables) == 0 {
ctx.Variables = nil
}
if len(ctx.Functions) == 0 {
ctx.Functions = nil
}
if ctx.Variables == nil && ctx.Functions == nil {
ctx = nil
}
var bodies []hcl.Body
if len(args) == 0 {
src, err := ioutil.ReadAll(os.Stdin)
if err != nil {
return fmt.Errorf("failed to read stdin: %s", err)
}
f, fDiags := parser.ParseHCL(src, "<stdin>")
diags = append(diags, fDiags...)
if !fDiags.HasErrors() {
bodies = append(bodies, f.Body)
}
} else {
for _, filename := range args {
var f *hcl.File
var fDiags hcl.Diagnostics
if strings.HasSuffix(filename, ".json") {
f, fDiags = parser.ParseJSONFile(filename)
} else {
f, fDiags = parser.ParseHCLFile(filename)
}
diags = append(diags, fDiags...)
if !fDiags.HasErrors() {
bodies = append(bodies, f.Body)
}
}
}
if diags.HasErrors() {
diagWr.WriteDiagnostics(diags)
flush(diagWr)
os.Exit(2)
}
var body hcl.Body
switch len(bodies) {
case 0:
// should never happen, but... okay?
body = hcl.EmptyBody()
case 1:
body = bodies[0]
default:
body = hcl.MergeBodies(bodies)
}
if *showVarRefs {
vars := hcldec.Variables(body, spec)
return showVarRefsJSON(vars, ctx)
}
val, decDiags := hcldec.Decode(body, spec, ctx)
diags = append(diags, decDiags...)
if diags.HasErrors() {
diagWr.WriteDiagnostics(diags)
flush(diagWr)
os.Exit(2)
}
wantType := val.Type()
if *withType {
// We'll instead ask to encode as dynamic, which will make the
// marshaler include type information.
wantType = cty.DynamicPseudoType
}
out, err := ctyjson.Marshal(val, wantType)
if err != nil {
return err
}
// hcldec will include explicit nulls where an ObjectSpec has a spec
// that refers to a missing item, but that'll probably be annoying for
// a consumer of our output to deal with so we'll just strip those
// out and reduce to only the non-null values.
out = stripJSONNullProperties(out)
target := os.Stdout
if *outputFile != "" {
target, err = os.OpenFile(*outputFile, os.O_TRUNC|os.O_CREATE|os.O_WRONLY, os.ModePerm)
if err != nil {
return fmt.Errorf("can't open %s for writing: %s", *outputFile, err)
}
}
fmt.Fprintf(target, "%s\n", out)
return nil
}
func usage() {
fmt.Fprintf(os.Stderr, "usage: hcldec --spec=<spec-file> [options] [hcl-file ...]\n")
flag.PrintDefaults()
os.Exit(2)
}
func showVarRefsJSON(vars []hcl.Traversal, ctx *hcl.EvalContext) error {
type PosJSON struct {
Line int `json:"line"`
Column int `json:"column"`
Byte int `json:"byte"`
}
type RangeJSON struct {
Filename string `json:"filename"`
Start PosJSON `json:"start"`
End PosJSON `json:"end"`
}
type StepJSON struct {
Kind string `json:"kind"`
Name string `json:"name,omitempty"`
Key json.RawMessage `json:"key,omitempty"`
Range RangeJSON `json:"range"`
}
type TraversalJSON struct {
RootName string `json:"root_name"`
Value json.RawMessage `json:"value,omitempty"`
Steps []StepJSON `json:"steps"`
Range RangeJSON `json:"range"`
}
ret := make([]TraversalJSON, 0, len(vars))
for _, traversal := range vars {
tJSON := TraversalJSON{
Steps: make([]StepJSON, 0, len(traversal)),
}
for _, step := range traversal {
var sJSON StepJSON
rng := step.SourceRange()
sJSON.Range.Filename = rng.Filename
sJSON.Range.Start.Line = rng.Start.Line
sJSON.Range.Start.Column = rng.Start.Column
sJSON.Range.Start.Byte = rng.Start.Byte
sJSON.Range.End.Line = rng.End.Line
sJSON.Range.End.Column = rng.End.Column
sJSON.Range.End.Byte = rng.End.Byte
switch ts := step.(type) {
case hcl.TraverseRoot:
sJSON.Kind = "root"
sJSON.Name = ts.Name
tJSON.RootName = ts.Name
case hcl.TraverseAttr:
sJSON.Kind = "attr"
sJSON.Name = ts.Name
case hcl.TraverseIndex:
sJSON.Kind = "index"
src, err := ctyjson.Marshal(ts.Key, ts.Key.Type())
if err == nil {
sJSON.Key = json.RawMessage(src)
}
default:
// Should never get here, since the above should be exhaustive
// for all possible traversal step types.
sJSON.Kind = "(unknown)"
}
tJSON.Steps = append(tJSON.Steps, sJSON)
}
// Best effort, we'll try to include the current known value of this
// traversal, if any.
val, diags := traversal.TraverseAbs(ctx)
if !diags.HasErrors() {
enc, err := ctyjson.Marshal(val, val.Type())
if err == nil {
tJSON.Value = json.RawMessage(enc)
}
}
rng := traversal.SourceRange()
tJSON.Range.Filename = rng.Filename
tJSON.Range.Start.Line = rng.Start.Line
tJSON.Range.Start.Column = rng.Start.Column
tJSON.Range.Start.Byte = rng.Start.Byte
tJSON.Range.End.Line = rng.End.Line
tJSON.Range.End.Column = rng.End.Column
tJSON.Range.End.Byte = rng.End.Byte
ret = append(ret, tJSON)
}
out, err := json.MarshalIndent(ret, "", " ")
if err != nil {
return fmt.Errorf("failed to marshal variable references as JSON: %s", err)
}
target := os.Stdout
if *outputFile != "" {
target, err = os.OpenFile(*outputFile, os.O_TRUNC|os.O_CREATE|os.O_WRONLY, os.ModePerm)
if err != nil {
return fmt.Errorf("can't open %s for writing: %s", *outputFile, err)
}
}
fmt.Fprintf(target, "%s\n", out)
return nil
}
func stripJSONNullProperties(src []byte) []byte {
var v interface{}
err := json.Unmarshal(src, &v)
if err != nil {
// We expect valid JSON
panic(err)
}
v = stripNullMapElements(v)
new, err := json.Marshal(v)
if err != nil {
panic(err)
}
return new
}
func stripNullMapElements(v interface{}) interface{} {
switch tv := v.(type) {
case map[string]interface{}:
for k, ev := range tv {
if ev == nil {
delete(tv, k)
} else {
tv[k] = stripNullMapElements(ev)
}
}
return v
case []interface{}:
for i, ev := range tv {
tv[i] = stripNullMapElements(ev)
}
return v
default:
return v
}
}

487
cmd/hcldec/spec-format.md Normal file

@@ -0,0 +1,487 @@
# `hcldec` spec format
The `hcldec` spec format instructs [`hcldec`](README.md) on how to validate
one or more configuration files given in the HCL syntax and how to translate
the result into JSON format.
The spec format is itself built from HCL syntax, with each HCL block serving
as a _spec_ whose block type and contents together describe a single mapping
action and, in most cases, a validation constraint. Each spec block produces
one JSON value.
A spec _file_ must have a single top-level spec block that describes the
top-level JSON value `hcldec` will return, and that spec block may have other
nested spec blocks (depending on its type) that produce nested structures and
additional validation constraints.
The most common usage of `hcldec` is to produce a JSON object whose properties
are derived from the top-level content of the input file. In this case, the
root of the given spec file will have an `object` spec block whose contents
describe how each of the object's properties are to be populated using
nested spec blocks.
Each spec is evaluated in the context of an HCL _body_, which is the HCL
terminology for one level of nesting in a configuration file. The top-level
objects in a file all belong to the root body of that file, and then each
nested block has its own body containing the elements within that block.
Some spec types select a new body as the context for their nested specs,
allowing nested HCL structures to be decoded.
## Spec Block Types
The following sections describe the different block types that can be used to
define specs within a spec file.
### `object` spec blocks
The `object` spec type is the most commonly used at the root of a spec file.
Its result is a JSON object whose properties are set based on any nested
spec blocks:
```hcl
object {
  attr "name" {
    type = string
  }
  block "address" {
    object {
      attr "street" {
        type = string
      }
      # ...
    }
  }
}
```
Nested spec blocks inside `object` must always have an extra block label
`"name"`, `"address"` and `"street"` in the above example) that specifies
the name of the property that should be created in the JSON object result.
This label also acts as a default name selector for the nested spec, allowing
the `attr` blocks in the above example to omit the usually-required `name`
argument in cases where the HCL input name and JSON output name are the same.
An `object` spec block creates no validation constraints, but it passes on
any validation constraints created by the nested specs.
### `array` spec blocks
The `array` spec type produces a JSON array whose elements are set based on
any nested spec blocks:
```hcl
array {
  attr {
    name = "first_element"
    type = string
  }
  attr {
    name = "second_element"
    type = string
  }
}
```
An `array` spec block creates no validation constraints, but it passes on
any validation constraints created by the nested specs.
### `attr` spec blocks
The `attr` spec type reads the value of an attribute in the current body
and returns that value as its result. It also creates validation constraints
for the given attribute name and its value.
```hcl
attr {
  name = "document_root"
  type = string
  required = true
}
```
`attr` spec blocks accept the following arguments:
* `name` (required) - The attribute name to expect within the HCL input file.
This may be omitted when a default name selector is created by a parent
`object` spec, if the input attribute name should match the output JSON
object property name.
* `type` (optional) - A [type expression](#type-expressions) that the given
attribute value must conform to. If this argument is set, `hcldec` will
automatically convert the given input value to this type or produce an
error if that is not possible.
* `required` (optional) - If set to `true`, `hcldec` will produce an error
if a value is not provided for the source attribute.
`attr` is a leaf spec type, so no nested spec blocks are permitted.
### `block` spec blocks
The `block` spec type applies one nested spec block to the contents of a
block within the current body and returns the result of that spec. It also
creates validation constraints for the given block type name.
```hcl
block {
  block_type = "logging"
  object {
    attr "level" {
      type = string
    }
    attr "file" {
      type = string
    }
  }
}
```
`block` spec blocks accept the following arguments:
* `block_type` (required) - The block type name to expect within the HCL
input file. This may be omitted when a default name selector is created
by a parent `object` spec, if the input block type name should match the
output JSON object property name.
* `required` (optional) - If set to `true`, `hcldec` will produce an error
if a block of the specified type is not present in the current body.
`block` creates a validation constraint that there must be zero or one blocks
of the given type name, or exactly one if `required` is set.
`block` expects a single nested spec block, which is applied to the body of
the block of the given type when it is present.
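As an illustration (the file contents and values here are hypothetical, not part of the spec format itself), input matching the spec above could look like:

```hcl
logging {
  level = "debug"
  file  = "/var/log/app.log"
}
```

The result is a JSON object with `level` and `file` properties, or `null` if the `logging` block is omitted and `required` is not set.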
### `block_list` spec blocks
The `block_list` spec type is similar to `block`, but it accepts zero or
more blocks of a specified type rather than requiring zero or one. The
result is a JSON array with one entry per block of the given type.
```hcl
block_list {
  block_type = "log_file"
  object {
    attr "level" {
      type = string
    }
    attr "filename" {
      type = string
      required = true
    }
  }
}
```
`block_list` spec blocks accept the following arguments:
* `block_type` (required) - The block type name to expect within the HCL
input file. This may be omitted when a default name selector is created
by a parent `object` spec, if the input block type name should match the
output JSON object property name.
* `min_items` (optional) - If set to a number greater than zero, `hcldec` will
produce an error if fewer than the given number of blocks are present.
* `max_items` (optional) - If set to a number greater than zero, `hcldec` will
produce an error if more than the given number of blocks are present. This
attribute must be greater than or equal to `min_items` if both are set.
`block_list` creates a validation constraint on the number of blocks of the given
type that must be present.
`block_list` expects a single nested spec block, which is applied to the body of
each matching block to produce the resulting list items.
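For illustration, a hypothetical input containing two `log_file` blocks:

```hcl
log_file {
  level    = "info"
  filename = "/var/log/app.log"
}

log_file {
  level    = "debug"
  filename = "/var/log/app-debug.log"
}
```

would decode to a two-element JSON array with one object per block.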
### `block_set` spec blocks
The `block_set` spec type behaves the same as `block_list` except that
the result is in no specific order and any duplicate items are removed.
```hcl
block_set {
  block_type = "log_file"
  object {
    attr "level" {
      type = string
    }
    attr "filename" {
      type = string
      required = true
    }
  }
}
```
The contents of `block_set` are the same as for `block_list`.
### `block_map` spec blocks
The `block_map` spec type is similar to `block`, but it accepts zero or
more blocks of a specified type rather than requiring zero or one. The
result is a JSON object, or possibly multiple nested JSON objects, whose
properties are derived from the labels set on each matching block.
```hcl
block_map {
  block_type = "log_file"
  labels = ["filename"]
  object {
    attr "level" {
      type = string
      required = true
    }
  }
}
```
`block_map` spec blocks accept the following arguments:
* `block_type` (required) - The block type name to expect within the HCL
input file. This may be omitted when a default name selector is created
by a parent `object` spec, if the input block type name should match the
output JSON object property name.
* `labels` (required) - A list of user-oriented block label names. Each entry
in this list creates one level of object within the output value, and
requires one additional block header label on any child block of this type.
Block header labels are the quoted strings that appear after the block type
name but before the opening `{`.
`block_map` creates a validation constraint on the number of labels that blocks
of the given type must have.
`block_map` expects a single nested spec block, which is applied to the body of
each matching block to produce the resulting map items.
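For illustration, a hypothetical input using one label per block:

```hcl
log_file "app.log" {
  level = "info"
}

log_file "audit.log" {
  level = "warn"
}
```

would decode to a JSON object keyed by the `filename` label, roughly `{"app.log": {"level": "info"}, "audit.log": {"level": "warn"}}`.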
### `block_attrs` spec blocks
The `block_attrs` spec type is similar to an `attr` spec block of a map type,
but it produces a map from the attributes of a block rather than from an
attribute's expression.
```hcl
block_attrs {
  block_type = "variables"
  element_type = string
  required = false
}
```
This allows a map with user-defined keys to be produced within block syntax,
but due to the constraints of that syntax it also means that the user will
be unable to dynamically generate either individual key names using key
expressions or the entire map value using a `for` expression.
`block_attrs` spec blocks accept the following arguments:
* `block_type` (required) - The block type name to expect within the HCL
input file. This may be omitted when a default name selector is created
by a parent `object` spec, if the input block type name should match the
output JSON object property name.
* `element_type` (required) - The value type to require for each of the
attributes within a matched block. The resulting value will be a JSON
object whose property values are of this type.
* `required` (optional) - If `true`, an error will be produced if a block
of the given type is not present. If `false` -- the default -- an absent
block will be indicated by producing `null`.
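For illustration, a hypothetical input block matching the spec above:

```hcl
variables {
  region      = "us-east-1"
  environment = "production"
}
```

decodes to a JSON object whose keys `region` and `environment` are chosen by the user, each with a string value.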
### `literal` spec blocks
The `literal` spec type returns a given literal value, and creates no
validation constraints. It is most commonly used with the `default` spec
type to create a fallback value, but can also be used e.g. to fill out
required properties in an `object` spec that do not correspond to any
construct in the input configuration.
```hcl
literal {
  value = "hello world"
}
```
`literal` spec blocks accept the following argument:
* `value` (required) - The value to return. This attribute may be an expression
that uses [functions](#spec-definition-functions).
`literal` is a leaf spec type, so no nested spec blocks are permitted.
### `default` spec blocks
The `default` spec type evaluates a sequence of nested specs in turn and
returns the result of the first one that produces a non-null value.
It creates no validation constraints of its own, but passes on the validation
constraints from its first nested block.
```hcl
default {
  attr {
    name = "private"
    type = bool
  }
  literal {
    value = false
  }
}
```
A `default` spec block must have at least one nested spec block, and should
generally have at least two since otherwise the `default` wrapper is a no-op.
The second and any subsequent spec blocks are _fallback_ specs. These exhibit
their usual behavior but are not able to impose validation constraints on the
current body since they are not evaluated unless all prior specs produce
`null` as their result.
### `transform` spec blocks
The `transform` spec type evaluates one nested spec and then evaluates a given
expression with that nested spec result to produce a final value.
It creates no validation constraints of its own, but passes on the validation
constraints from its nested block.
```hcl
transform {
  attr {
    name = "size_in_mb"
    type = number
  }
  # Convert result to a size in bytes
  result = nested * 1024 * 1024
}
```
`transform` spec blocks accept the following argument:
* `result` (required) - The expression to evaluate on the result of the nested
spec. The variable `nested` is defined when evaluating this expression, with
the result value of the nested spec.
The `result` expression may use [functions](#spec-definition-functions).
## Predefined Variables
`hcldec` accepts values for variables to expose into the input file's
expression scope as CLI options, and this is the most common way to pass
values since it allows them to be dynamically populated by the calling
application.
However, it's also possible to pre-define variables with constant values
within a spec file, using the top-level `variables` block type:
```hcl
variables {
  name = "Stephen"
}
```
Variables defined via the `hcldec` command line will override predefined
variables of the same name, so this mechanism can also be used to
provide defaults for variables that are overridden only in certain contexts.
## Custom Functions
The spec can make arbitrary HCL functions available in the input file's
expression scope, and thus allow simple computation within the input file,
in addition to HCL's built-in operators.
Custom functions are defined in the spec file with the top-level `function`
block type:
```
function "add_one" {
params = [n]
result = n + 1
}
```
Functions behave in a similar way to the `transform` spec type in that the
given `result` attribute expression is evaluated with additional variables
defined with the same names as the defined `params`.
The [spec definition functions](#spec-definition-functions) can be used within
custom function expressions, allowing them to be optionally exposed into the
input file:
```
function "upper" {
params = [str]
result = upper(str)
}
function "min" {
params = []
variadic_param = nums
result = min(nums...)
}
```
Custom functions defined in the spec cannot be called from the spec itself.
## Spec Definition Functions
Certain expressions within a specification may use the following functions.
The documentation for each spec type above specifies where functions may
be used.
* `abs(number)` returns the absolute (positive) value of the given number.
* `coalesce(vals...)` returns the first non-null value given.
* `concat(lists...)` concatenates together all of the given lists to produce a new list.
* `hasindex(val, idx)` returns true if the expression `val[idx]` could succeed.
* `int(number)` returns the integer portion of the given number, rounding towards zero.
* `jsondecode(str)` interprets the given string as JSON and returns the resulting data structure.
* `jsonencode(val)` returns a JSON-serialized version of the given value.
* `length(collection)` returns the number of elements in the given collection (list, set, map, object, or tuple).
* `lower(string)` returns the given string with all uppercase letters converted to lowercase.
* `max(numbers...)` returns the greatest of the given numbers.
* `min(numbers...)` returns the smallest of the given numbers.
* `reverse(string)` returns the given string with all of the characters in reverse order.
* `strlen(string)` returns the number of characters in the given string.
* `substr(string, offset, length)` returns the requested substring of the given string.
* `upper(string)` returns the given string with all lowercase letters converted to uppercase.
Note that these expressions are valid in the context of the _spec_ file, not
the _input_. Functions can be exposed into the input file using
[Custom Functions](#custom-functions) within the spec, which may in turn
refer to these spec definition functions.
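For example, a hypothetical `transform` spec could call one of these functions in its `result` expression:

```hcl
transform {
  attr {
    name = "environment"
    type = string
  }

  # Normalize the configured value using a spec definition function
  result = lower(nested)
}
```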
## Type Expressions
Type expressions are used to describe the expected type of an attribute, as
an additional validation constraint.
A type expression uses primitive type names and compound type constructors.
A type constructor builds a new type based on one or more type expression
arguments.
The following type names and type constructors are supported:
* `any` is a wildcard that accepts a value of any type. (In HCL terms, this
is the _dynamic pseudo-type_.)
* `string` is a Unicode string.
* `number` is an arbitrary-precision floating point number.
* `bool` is a boolean value (`true` or `false`)
* `list(element_type)` constructs a list type with the given element type
* `set(element_type)` constructs a set type with the given element type
* `map(element_type)` constructs a map type with the given element type
* `object({name1 = element_type, name2 = element_type, ...})` constructs
an object type with the given attribute types.
* `tuple([element_type, element_type, ...])` constructs a tuple type with
the given element types. This can be used, for example, to require an
array with a particular number of elements, or with elements of different
types.
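For example, hypothetical `attr` specs combining these constructors might look like:

```hcl
attr "allowed_cidrs" {
  type = list(string)
}

attr "listener" {
  type = object({
    port     = number
    protocol = string
  })
}
```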
The above types are as defined by
[the HCL syntax-agnostic information model](../../hcl/spec.md). After
validation, values are lowered to JSON's type system, which is a subset
of the HCL type system.
`null` is a valid value of any type, and not a type itself.

645
cmd/hcldec/spec.go Normal file

@@ -0,0 +1,645 @@
package main
import (
"fmt"
"github.com/hashicorp/hcl2/ext/userfunc"
"github.com/hashicorp/hcl2/gohcl"
"github.com/hashicorp/hcl2/hcl"
"github.com/hashicorp/hcl2/hcldec"
"github.com/zclconf/go-cty/cty"
"github.com/zclconf/go-cty/cty/function"
)
type specFileContent struct {
Variables map[string]cty.Value
Functions map[string]function.Function
RootSpec hcldec.Spec
}
var specCtx = &hcl.EvalContext{
Functions: specFuncs,
}
func loadSpecFile(filename string) (specFileContent, hcl.Diagnostics) {
file, diags := parser.ParseHCLFile(filename)
if diags.HasErrors() {
return specFileContent{RootSpec: errSpec}, diags
}
vars, funcs, specBody, declDiags := decodeSpecDecls(file.Body)
diags = append(diags, declDiags...)
spec, specDiags := decodeSpecRoot(specBody)
diags = append(diags, specDiags...)
return specFileContent{
Variables: vars,
Functions: funcs,
RootSpec: spec,
}, diags
}
func decodeSpecDecls(body hcl.Body) (map[string]cty.Value, map[string]function.Function, hcl.Body, hcl.Diagnostics) {
funcs, body, diags := userfunc.DecodeUserFunctions(body, "function", func() *hcl.EvalContext {
return specCtx
})
content, body, moreDiags := body.PartialContent(&hcl.BodySchema{
Blocks: []hcl.BlockHeaderSchema{
{
Type: "variables",
},
},
})
diags = append(diags, moreDiags...)
vars := make(map[string]cty.Value)
for _, block := range content.Blocks {
// We only have one block type in our schema, so we can assume all
// blocks are of that type.
attrs, moreDiags := block.Body.JustAttributes()
diags = append(diags, moreDiags...)
for name, attr := range attrs {
val, moreDiags := attr.Expr.Value(specCtx)
diags = append(diags, moreDiags...)
vars[name] = val
}
}
return vars, funcs, body, diags
}
func decodeSpecRoot(body hcl.Body) (hcldec.Spec, hcl.Diagnostics) {
content, diags := body.Content(specSchemaUnlabelled)
if len(content.Blocks) == 0 {
if diags.HasErrors() {
// If we already have errors then they probably explain
// why we have no blocks, so we'll skip our additional
// error message added below.
return errSpec, diags
}
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Missing spec block",
Detail: "A spec file must have exactly one root block specifying how to map to a JSON value.",
Subject: body.MissingItemRange().Ptr(),
})
return errSpec, diags
}
if len(content.Blocks) > 1 {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Extraneous spec block",
Detail: "A spec file must have exactly one root block specifying how to map to a JSON value.",
Subject: &content.Blocks[1].DefRange,
})
return errSpec, diags
}
spec, specDiags := decodeSpecBlock(content.Blocks[0])
diags = append(diags, specDiags...)
return spec, diags
}
func decodeSpecBlock(block *hcl.Block) (hcldec.Spec, hcl.Diagnostics) {
var impliedName string
if len(block.Labels) > 0 {
impliedName = block.Labels[0]
}
switch block.Type {
case "object":
return decodeObjectSpec(block.Body)
case "array":
return decodeArraySpec(block.Body)
case "attr":
return decodeAttrSpec(block.Body, impliedName)
case "block":
return decodeBlockSpec(block.Body, impliedName)
case "block_list":
return decodeBlockListSpec(block.Body, impliedName)
case "block_set":
return decodeBlockSetSpec(block.Body, impliedName)
case "block_map":
return decodeBlockMapSpec(block.Body, impliedName)
case "block_attrs":
return decodeBlockAttrsSpec(block.Body, impliedName)
case "default":
return decodeDefaultSpec(block.Body)
case "transform":
return decodeTransformSpec(block.Body)
case "literal":
return decodeLiteralSpec(block.Body)
default:
// Should never happen, because the above cases should be exhaustive
// for our schema.
var diags hcl.Diagnostics
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Invalid spec block",
Detail: fmt.Sprintf("Blocks of type %q are not expected here.", block.Type),
Subject: &block.TypeRange,
})
return errSpec, diags
}
}
func decodeObjectSpec(body hcl.Body) (hcldec.Spec, hcl.Diagnostics) {
content, diags := body.Content(specSchemaLabelled)
spec := make(hcldec.ObjectSpec)
for _, block := range content.Blocks {
propSpec, propDiags := decodeSpecBlock(block)
diags = append(diags, propDiags...)
spec[block.Labels[0]] = propSpec
}
return spec, diags
}
func decodeArraySpec(body hcl.Body) (hcldec.Spec, hcl.Diagnostics) {
content, diags := body.Content(specSchemaUnlabelled)
spec := make(hcldec.TupleSpec, 0, len(content.Blocks))
for _, block := range content.Blocks {
elemSpec, elemDiags := decodeSpecBlock(block)
diags = append(diags, elemDiags...)
spec = append(spec, elemSpec)
}
return spec, diags
}
func decodeAttrSpec(body hcl.Body, impliedName string) (hcldec.Spec, hcl.Diagnostics) {
type content struct {
Name *string `hcl:"name"`
Type hcl.Expression `hcl:"type"`
Required *bool `hcl:"required"`
}
var args content
diags := gohcl.DecodeBody(body, nil, &args)
if diags.HasErrors() {
return errSpec, diags
}
spec := &hcldec.AttrSpec{
Name: impliedName,
}
if args.Required != nil {
spec.Required = *args.Required
}
if args.Name != nil {
spec.Name = *args.Name
}
var typeDiags hcl.Diagnostics
spec.Type, typeDiags = evalTypeExpr(args.Type)
diags = append(diags, typeDiags...)
if spec.Name == "" {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Missing name in attribute spec",
Detail: "The name attribute is required, to specify the attribute name that is expected in an input HCL file.",
Subject: body.MissingItemRange().Ptr(),
})
return errSpec, diags
}
return spec, diags
}
func decodeBlockSpec(body hcl.Body, impliedName string) (hcldec.Spec, hcl.Diagnostics) {
type content struct {
TypeName *string `hcl:"block_type"`
Required *bool `hcl:"required"`
Nested hcl.Body `hcl:",remain"`
}
var args content
diags := gohcl.DecodeBody(body, nil, &args)
if diags.HasErrors() {
return errSpec, diags
}
spec := &hcldec.BlockSpec{
TypeName: impliedName,
}
if args.Required != nil {
spec.Required = *args.Required
}
if args.TypeName != nil {
spec.TypeName = *args.TypeName
}
nested, nestedDiags := decodeBlockNestedSpec(args.Nested)
diags = append(diags, nestedDiags...)
spec.Nested = nested
return spec, diags
}
func decodeBlockListSpec(body hcl.Body, impliedName string) (hcldec.Spec, hcl.Diagnostics) {
type content struct {
TypeName *string `hcl:"block_type"`
MinItems *int `hcl:"min_items"`
MaxItems *int `hcl:"max_items"`
Nested hcl.Body `hcl:",remain"`
}
var args content
diags := gohcl.DecodeBody(body, nil, &args)
if diags.HasErrors() {
return errSpec, diags
}
spec := &hcldec.BlockListSpec{
TypeName: impliedName,
}
if args.MinItems != nil {
spec.MinItems = *args.MinItems
}
if args.MaxItems != nil {
spec.MaxItems = *args.MaxItems
}
if args.TypeName != nil {
spec.TypeName = *args.TypeName
}
nested, nestedDiags := decodeBlockNestedSpec(args.Nested)
diags = append(diags, nestedDiags...)
spec.Nested = nested
if spec.TypeName == "" {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Missing block_type in block_list spec",
Detail: "The block_type attribute is required, to specify the block type name that is expected in an input HCL file.",
Subject: body.MissingItemRange().Ptr(),
})
return errSpec, diags
}
return spec, diags
}
func decodeBlockSetSpec(body hcl.Body, impliedName string) (hcldec.Spec, hcl.Diagnostics) {
type content struct {
TypeName *string `hcl:"block_type"`
MinItems *int `hcl:"min_items"`
MaxItems *int `hcl:"max_items"`
Nested hcl.Body `hcl:",remain"`
}
var args content
diags := gohcl.DecodeBody(body, nil, &args)
if diags.HasErrors() {
return errSpec, diags
}
spec := &hcldec.BlockSetSpec{
TypeName: impliedName,
}
if args.MinItems != nil {
spec.MinItems = *args.MinItems
}
if args.MaxItems != nil {
spec.MaxItems = *args.MaxItems
}
if args.TypeName != nil {
spec.TypeName = *args.TypeName
}
nested, nestedDiags := decodeBlockNestedSpec(args.Nested)
diags = append(diags, nestedDiags...)
spec.Nested = nested
if spec.TypeName == "" {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Missing block_type in block_set spec",
Detail: "The block_type attribute is required, to specify the block type name that is expected in an input HCL file.",
Subject: body.MissingItemRange().Ptr(),
})
return errSpec, diags
}
return spec, diags
}
func decodeBlockMapSpec(body hcl.Body, impliedName string) (hcldec.Spec, hcl.Diagnostics) {
type content struct {
TypeName *string `hcl:"block_type"`
Labels []string `hcl:"labels"`
Nested hcl.Body `hcl:",remain"`
}
var args content
diags := gohcl.DecodeBody(body, nil, &args)
if diags.HasErrors() {
return errSpec, diags
}
spec := &hcldec.BlockMapSpec{
TypeName: impliedName,
}
if args.TypeName != nil {
spec.TypeName = *args.TypeName
}
spec.LabelNames = args.Labels
nested, nestedDiags := decodeBlockNestedSpec(args.Nested)
diags = append(diags, nestedDiags...)
spec.Nested = nested
if spec.TypeName == "" {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Missing block_type in block_map spec",
Detail: "The block_type attribute is required, to specify the block type name that is expected in an input HCL file.",
Subject: body.MissingItemRange().Ptr(),
})
return errSpec, diags
}
if len(spec.LabelNames) < 1 {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Invalid block label name list",
Detail: "A block_map must have at least one label specified.",
Subject: body.MissingItemRange().Ptr(),
})
return errSpec, diags
}
if hcldec.ImpliedType(spec).HasDynamicTypes() {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Invalid block_map spec",
Detail: "A block_map spec may not contain attributes with type 'any'.",
Subject: body.MissingItemRange().Ptr(),
})
}
return spec, diags
}
func decodeBlockNestedSpec(body hcl.Body) (hcldec.Spec, hcl.Diagnostics) {
content, diags := body.Content(specSchemaUnlabelled)
if len(content.Blocks) == 0 {
if diags.HasErrors() {
// If we already have errors then they probably explain
// why we have no blocks, so we'll skip our additional
// error message added below.
return errSpec, diags
}
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Missing spec block",
Detail: "A block spec must have exactly one child spec specifying how to decode block contents.",
Subject: body.MissingItemRange().Ptr(),
})
return errSpec, diags
}
if len(content.Blocks) > 1 {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Extraneous spec block",
Detail: "A block spec must have exactly one child spec specifying how to decode block contents.",
Subject: &content.Blocks[1].DefRange,
})
return errSpec, diags
}
spec, specDiags := decodeSpecBlock(content.Blocks[0])
diags = append(diags, specDiags...)
return spec, diags
}
func decodeBlockAttrsSpec(body hcl.Body, impliedName string) (hcldec.Spec, hcl.Diagnostics) {
type content struct {
TypeName *string `hcl:"block_type"`
ElementType hcl.Expression `hcl:"element_type"`
Required *bool `hcl:"required"`
}
var args content
diags := gohcl.DecodeBody(body, nil, &args)
if diags.HasErrors() {
return errSpec, diags
}
spec := &hcldec.BlockAttrsSpec{
TypeName: impliedName,
}
if args.Required != nil {
spec.Required = *args.Required
}
if args.TypeName != nil {
spec.TypeName = *args.TypeName
}
var typeDiags hcl.Diagnostics
spec.ElementType, typeDiags = evalTypeExpr(args.ElementType)
diags = append(diags, typeDiags...)
if spec.TypeName == "" {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Missing block_type in block_attrs spec",
Detail: "The block_type attribute is required, to specify the block type name that is expected in an input HCL file.",
Subject: body.MissingItemRange().Ptr(),
})
return errSpec, diags
}
return spec, diags
}
func decodeLiteralSpec(body hcl.Body) (hcldec.Spec, hcl.Diagnostics) {
type content struct {
Value cty.Value `hcl:"value"`
}
var args content
diags := gohcl.DecodeBody(body, specCtx, &args)
if diags.HasErrors() {
return errSpec, diags
}
return &hcldec.LiteralSpec{
Value: args.Value,
}, diags
}
func decodeDefaultSpec(body hcl.Body) (hcldec.Spec, hcl.Diagnostics) {
content, diags := body.Content(specSchemaUnlabelled)
if len(content.Blocks) == 0 {
if diags.HasErrors() {
// If we already have errors then they probably explain
// why we have no blocks, so we'll skip our additional
// error message added below.
return errSpec, diags
}
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Missing spec block",
Detail: "A default block must have at least one nested spec, each specifying a possible outcome.",
Subject: body.MissingItemRange().Ptr(),
})
return errSpec, diags
}
if len(content.Blocks) == 1 && !diags.HasErrors() {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagWarning,
Summary: "Useless default block",
Detail: "A default block with only one spec is equivalent to using that spec alone.",
Subject: &content.Blocks[0].DefRange,
})
}
var spec hcldec.Spec
for _, block := range content.Blocks {
candidateSpec, candidateDiags := decodeSpecBlock(block)
diags = append(diags, candidateDiags...)
if candidateDiags.HasErrors() {
continue
}
if spec == nil {
spec = candidateSpec
} else {
spec = &hcldec.DefaultSpec{
Primary: spec,
Default: candidateSpec,
}
}
}
return spec, diags
}
func decodeTransformSpec(body hcl.Body) (hcldec.Spec, hcl.Diagnostics) {
type content struct {
Result hcl.Expression `hcl:"result"`
Nested hcl.Body `hcl:",remain"`
}
var args content
diags := gohcl.DecodeBody(body, nil, &args)
if diags.HasErrors() {
return errSpec, diags
}
spec := &hcldec.TransformExprSpec{
Expr: args.Result,
VarName: "nested",
TransformCtx: specCtx,
}
nestedContent, nestedDiags := args.Nested.Content(specSchemaUnlabelled)
diags = append(diags, nestedDiags...)
if len(nestedContent.Blocks) != 1 {
if nestedDiags.HasErrors() {
// If we already have errors then they probably explain
// why we have the wrong number of blocks, so we'll skip our
// additional error message added below.
return errSpec, diags
}
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Invalid transform spec",
Detail: "A transform spec block must have exactly one nested spec block.",
Subject: body.MissingItemRange().Ptr(),
})
return errSpec, diags
}
nestedSpec, nestedDiags := decodeSpecBlock(nestedContent.Blocks[0])
diags = append(diags, nestedDiags...)
spec.Wrapped = nestedSpec
return spec, diags
}
var errSpec = &hcldec.LiteralSpec{
Value: cty.NullVal(cty.DynamicPseudoType),
}
var specBlockTypes = []string{
"object",
"array",
"literal",
"attr",
"block",
"block_list",
"block_map",
"block_set",
"default",
"transform",
}
var specSchemaUnlabelled *hcl.BodySchema
var specSchemaLabelled *hcl.BodySchema
var specSchemaLabelledLabels = []string{"key"}
func init() {
specSchemaLabelled = &hcl.BodySchema{
Blocks: make([]hcl.BlockHeaderSchema, 0, len(specBlockTypes)),
}
specSchemaUnlabelled = &hcl.BodySchema{
Blocks: make([]hcl.BlockHeaderSchema, 0, len(specBlockTypes)),
}
for _, name := range specBlockTypes {
specSchemaLabelled.Blocks = append(
specSchemaLabelled.Blocks,
hcl.BlockHeaderSchema{
Type: name,
LabelNames: specSchemaLabelledLabels,
},
)
specSchemaUnlabelled.Blocks = append(
specSchemaUnlabelled.Blocks,
hcl.BlockHeaderSchema{
Type: name,
},
)
}
}

24
cmd/hcldec/spec_funcs.go Normal file
View File

@ -0,0 +1,24 @@
package main
import (
"github.com/zclconf/go-cty/cty/function"
"github.com/zclconf/go-cty/cty/function/stdlib"
)
var specFuncs = map[string]function.Function{
"abs": stdlib.AbsoluteFunc,
"coalesce": stdlib.CoalesceFunc,
"concat": stdlib.ConcatFunc,
"hasindex": stdlib.HasIndexFunc,
"int": stdlib.IntFunc,
"jsondecode": stdlib.JSONDecodeFunc,
"jsonencode": stdlib.JSONEncodeFunc,
"length": stdlib.LengthFunc,
"lower": stdlib.LowerFunc,
"max": stdlib.MaxFunc,
"min": stdlib.MinFunc,
"reverse": stdlib.ReverseFunc,
"strlen": stdlib.StrlenFunc,
"substr": stdlib.SubstrFunc,
"upper": stdlib.UpperFunc,
}
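
The table above enumerates the cty stdlib functions that spec expressions may call. As a minimal sketch of how such a table is typically wired up (the real `specCtx` is defined elsewhere in this change, so its exact shape here is an assumption):

```go
package main

import (
	"github.com/hashicorp/hcl2/hcl"
)

// buildSpecCtx is a hypothetical helper showing how specFuncs could be
// exposed to expressions in a spec file via an hcl.EvalContext.
func buildSpecCtx() *hcl.EvalContext {
	return &hcl.EvalContext{
		Functions: specFuncs,
	}
}
```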

129
cmd/hcldec/type_expr.go Normal file
View File

@ -0,0 +1,129 @@
package main
import (
"fmt"
"reflect"
"github.com/hashicorp/hcl2/hcl"
"github.com/zclconf/go-cty/cty"
"github.com/zclconf/go-cty/cty/function"
)
var typeType = cty.Capsule("type", reflect.TypeOf(cty.NilType))
var typeEvalCtx = &hcl.EvalContext{
Variables: map[string]cty.Value{
"string": wrapTypeType(cty.String),
"bool": wrapTypeType(cty.Bool),
"number": wrapTypeType(cty.Number),
"any": wrapTypeType(cty.DynamicPseudoType),
},
Functions: map[string]function.Function{
"list": function.New(&function.Spec{
Params: []function.Parameter{
{
Name: "element_type",
Type: typeType,
},
},
Type: function.StaticReturnType(typeType),
Impl: func(args []cty.Value, retType cty.Type) (cty.Value, error) {
ety := unwrapTypeType(args[0])
ty := cty.List(ety)
return wrapTypeType(ty), nil
},
}),
"set": function.New(&function.Spec{
Params: []function.Parameter{
{
Name: "element_type",
Type: typeType,
},
},
Type: function.StaticReturnType(typeType),
Impl: func(args []cty.Value, retType cty.Type) (cty.Value, error) {
ety := unwrapTypeType(args[0])
ty := cty.Set(ety)
return wrapTypeType(ty), nil
},
}),
"map": function.New(&function.Spec{
Params: []function.Parameter{
{
Name: "element_type",
Type: typeType,
},
},
Type: function.StaticReturnType(typeType),
Impl: func(args []cty.Value, retType cty.Type) (cty.Value, error) {
ety := unwrapTypeType(args[0])
ty := cty.Map(ety)
return wrapTypeType(ty), nil
},
}),
"tuple": function.New(&function.Spec{
Params: []function.Parameter{
{
Name: "element_types",
Type: cty.List(typeType),
},
},
Type: function.StaticReturnType(typeType),
Impl: func(args []cty.Value, retType cty.Type) (cty.Value, error) {
etysVal := args[0]
etys := make([]cty.Type, 0, etysVal.LengthInt())
for it := etysVal.ElementIterator(); it.Next(); {
_, wrapEty := it.Element()
etys = append(etys, unwrapTypeType(wrapEty))
}
ty := cty.Tuple(etys)
return wrapTypeType(ty), nil
},
}),
"object": function.New(&function.Spec{
Params: []function.Parameter{
{
Name: "attribute_types",
Type: cty.Map(typeType),
},
},
Type: function.StaticReturnType(typeType),
Impl: func(args []cty.Value, retType cty.Type) (cty.Value, error) {
atysVal := args[0]
atys := make(map[string]cty.Type)
for it := atysVal.ElementIterator(); it.Next(); {
nameVal, wrapAty := it.Element()
name := nameVal.AsString()
atys[name] = unwrapTypeType(wrapAty)
}
ty := cty.Object(atys)
return wrapTypeType(ty), nil
},
}),
},
}
func evalTypeExpr(expr hcl.Expression) (cty.Type, hcl.Diagnostics) {
result, diags := expr.Value(typeEvalCtx)
if result.IsNull() {
return cty.DynamicPseudoType, diags
}
if !result.Type().Equals(typeType) {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Invalid type expression",
Detail: fmt.Sprintf("A type is required, not %s.", result.Type().FriendlyName()),
})
return cty.DynamicPseudoType, diags
}
return unwrapTypeType(result), diags
}
func wrapTypeType(ty cty.Type) cty.Value {
return cty.CapsuleVal(typeType, &ty)
}
func unwrapTypeType(val cty.Value) cty.Type {
return *(val.EncapsulatedValue().(*cty.Type))
}
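
For illustration, here is a hypothetical caller of `evalTypeExpr` that parses an attribute containing a type expression such as `list(string)` and reports the resulting type; the input source and the attribute name `t` are invented for the example:

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl2/hclparse"
)

// exampleTypeExpr parses a single attribute and resolves its value as a type
// expression using the evaluation context defined above.
func exampleTypeExpr() {
	f, diags := hclparse.NewParser().ParseHCL([]byte(`t = list(string)`), "<example>")
	if diags.HasErrors() {
		panic(diags.Error())
	}
	attrs, moreDiags := f.Body.JustAttributes()
	if moreDiags.HasErrors() {
		panic(moreDiags.Error())
	}
	ty, typeDiags := evalTypeExpr(attrs["t"].Expr)
	if typeDiags.HasErrors() {
		panic(typeDiags.Error())
	}
	fmt.Println(ty.FriendlyName()) // "list of string"
}
```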

74
cmd/hcldec/vars.go Normal file
View File

@ -0,0 +1,74 @@
package main
import (
"fmt"
"strings"
"github.com/hashicorp/hcl2/hcl"
"github.com/zclconf/go-cty/cty"
)
func parseVarsArg(src string, argIdx int) (map[string]cty.Value, hcl.Diagnostics) {
fakeFn := fmt.Sprintf("<vars argument %d>", argIdx)
f, diags := parser.ParseJSON([]byte(src), fakeFn)
if f == nil {
return nil, diags
}
vals, valsDiags := parseVarsBody(f.Body)
diags = append(diags, valsDiags...)
return vals, diags
}
func parseVarsFile(filename string) (map[string]cty.Value, hcl.Diagnostics) {
var f *hcl.File
var diags hcl.Diagnostics
if strings.HasSuffix(filename, ".json") {
f, diags = parser.ParseJSONFile(filename)
} else {
f, diags = parser.ParseHCLFile(filename)
}
if f == nil {
return nil, diags
}
vals, valsDiags := parseVarsBody(f.Body)
diags = append(diags, valsDiags...)
return vals, diags
}
func parseVarsBody(body hcl.Body) (map[string]cty.Value, hcl.Diagnostics) {
attrs, diags := body.JustAttributes()
if attrs == nil {
return nil, diags
}
vals := make(map[string]cty.Value, len(attrs))
for name, attr := range attrs {
val, valDiags := attr.Expr.Value(nil)
diags = append(diags, valDiags...)
vals[name] = val
}
return vals, diags
}
// varSpecs is an implementation of pflag.Value that accumulates a list of
// raw values, ignoring any quoting. This is similar to pflag.StringSlice
// but does not complain if there are literal quotes inside the value, which
// is important for us to accept JSON literals here.
type varSpecs []string
func (vs *varSpecs) String() string {
return strings.Join([]string(*vs), ", ")
}
func (vs *varSpecs) Set(new string) error {
*vs = append(*vs, new)
return nil
}
func (vs *varSpecs) Type() string {
return "json-or-file"
}
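
The three methods above make `varSpecs` satisfy the `pflag.Value` interface. A hedged sketch of how it might be registered as a repeatable flag, assuming the `github.com/spf13/pflag` package (the real flag wiring lives in hcldec's `main.go`, which is outside this excerpt):

```go
package main

import (
	"github.com/spf13/pflag"
)

// registerVarsFlag binds a varSpecs value to a repeatable --vars flag so that
// each occurrence may carry either inline JSON or a filename.
func registerVarsFlag(fs *pflag.FlagSet, vars *varSpecs) {
	fs.Var(vars, "vars", "variables as a JSON object, or a path to a .json/.hcl file")
}
```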

148
cmd/hclfmt/main.go Normal file
View File

@ -0,0 +1,148 @@
package main
import (
"bytes"
"errors"
"flag"
"fmt"
"io/ioutil"
"os"
"strings"
"github.com/hashicorp/hcl2/hcl"
"github.com/hashicorp/hcl2/hclparse"
"github.com/hashicorp/hcl2/hclwrite"
"golang.org/x/crypto/ssh/terminal"
)
const versionStr = "0.0.1-dev"
var (
check = flag.Bool("check", false, "perform a syntax check on the given files and produce diagnostics")
reqNoChange = flag.Bool("require-no-change", false, "return a non-zero status if any files are changed during formatting")
overwrite = flag.Bool("w", false, "overwrite source files instead of writing to stdout")
showVersion = flag.Bool("version", false, "show the version number and immediately exit")
)
var parser = hclparse.NewParser()
var diagWr hcl.DiagnosticWriter // initialized in init
var checkErrs = false
var changed []string
func init() {
color := terminal.IsTerminal(int(os.Stderr.Fd()))
w, _, err := terminal.GetSize(int(os.Stdout.Fd()))
if err != nil {
w = 80
}
diagWr = hcl.NewDiagnosticTextWriter(os.Stderr, parser.Files(), uint(w), color)
}
func main() {
err := realmain()
if err != nil {
fmt.Fprintln(os.Stderr, err.Error())
os.Exit(1)
}
}
func realmain() error {
flag.Usage = usage
flag.Parse()
if *showVersion {
fmt.Println(versionStr)
return nil
}
err := processFiles()
if err != nil {
return err
}
if checkErrs {
return errors.New("one or more files contained errors")
}
if *reqNoChange {
if len(changed) != 0 {
return fmt.Errorf("file(s) were changed: %s", strings.Join(changed, ", "))
}
}
return nil
}
func processFiles() error {
if flag.NArg() == 0 {
if *overwrite {
return errors.New("error: cannot use -w without source filenames")
}
return processFile("<stdin>", os.Stdin)
}
for i := 0; i < flag.NArg(); i++ {
path := flag.Arg(i)
switch dir, err := os.Stat(path); {
case err != nil:
return err
case dir.IsDir():
// This tool can't walk a whole directory because it doesn't
// know what file naming schemes will be used by different
// HCL-embedding applications, so it'll leave that sort of
// functionality for apps themselves to implement.
return fmt.Errorf("can't format directory %s", path)
default:
if err := processFile(path, nil); err != nil {
return err
}
}
}
return nil
}
func processFile(fn string, in *os.File) error {
var err error
if in == nil {
in, err = os.Open(fn)
if err != nil {
return fmt.Errorf("failed to open %s: %s", fn, err)
}
}
inSrc, err := ioutil.ReadAll(in)
if err != nil {
return fmt.Errorf("failed to read %s: %s", fn, err)
}
if *check {
_, diags := parser.ParseHCL(inSrc, fn)
diagWr.WriteDiagnostics(diags)
if diags.HasErrors() {
checkErrs = true
return nil
}
}
outSrc := hclwrite.Format(inSrc)
if !bytes.Equal(inSrc, outSrc) {
changed = append(changed, fn)
}
if *overwrite {
return ioutil.WriteFile(fn, outSrc, 0644)
}
_, err = os.Stdout.Write(outSrc)
return err
}
func usage() {
fmt.Fprintf(os.Stderr, "usage: hclfmt [flags] [path ...]\n")
flag.PrintDefaults()
os.Exit(2)
}
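
The core of `hclfmt` is the `hclwrite.Format` call in `processFile`. A minimal standalone sketch of that primitive (the input text is illustrative):

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl2/hclwrite"
)

// exampleFormat shows the formatting primitive in isolation: it rewrites
// source bytes into canonical HCL spacing without changing their meaning.
func exampleFormat() {
	src := []byte("a=1\nb   =   \"two\"\n")
	fmt.Printf("%s", hclwrite.Format(src))
}
```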

View File

@ -0,0 +1,4 @@
# `hclspecsuite`
`hclspecsuite` is the test harness for
[the HCL specification test suite](../../specsuite/README.md).
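Per the usage message in `main.go`, it is invoked as
`hclspecsuite <tests-dir> <hcldec-file>`: the first argument is a directory of
`.t` test files and the second is the path to an `hcldec` executable to exercise.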

View File

@ -0,0 +1,108 @@
package main
import (
"encoding/json"
"fmt"
"github.com/hashicorp/hcl2/hcl"
)
func decodeJSONDiagnostics(src []byte) hcl.Diagnostics {
type PosJSON struct {
Line int `json:"line"`
Column int `json:"column"`
Byte int `json:"byte"`
}
type RangeJSON struct {
Filename string `json:"filename"`
Start PosJSON `json:"start"`
End PosJSON `json:"end"`
}
type DiagnosticJSON struct {
Severity string `json:"severity"`
Summary string `json:"summary"`
Detail string `json:"detail,omitempty"`
Subject *RangeJSON `json:"subject,omitempty"`
}
type DiagnosticsJSON struct {
Diagnostics []DiagnosticJSON `json:"diagnostics"`
}
var raw DiagnosticsJSON
var diags hcl.Diagnostics
err := json.Unmarshal(src, &raw)
if err != nil {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Failed to parse hcldec diagnostics result",
Detail: fmt.Sprintf("Sub-program hcldec produced invalid diagnostics: %s.", err),
})
return diags
}
if len(raw.Diagnostics) == 0 {
return nil
}
diags = make(hcl.Diagnostics, 0, len(raw.Diagnostics))
for _, rawDiag := range raw.Diagnostics {
var severity hcl.DiagnosticSeverity
switch rawDiag.Severity {
case "error":
severity = hcl.DiagError
case "warning":
severity = hcl.DiagWarning
default:
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Failed to parse hcldec diagnostics result",
Detail: fmt.Sprintf("Diagnostic has unsupported severity %q.", rawDiag.Severity),
})
continue
}
diag := &hcl.Diagnostic{
Severity: severity,
Summary: rawDiag.Summary,
Detail: rawDiag.Detail,
}
if rawDiag.Subject != nil {
rawRange := rawDiag.Subject
diag.Subject = &hcl.Range{
Filename: rawRange.Filename,
Start: hcl.Pos{
Line: rawRange.Start.Line,
Column: rawRange.Start.Column,
Byte: rawRange.Start.Byte,
},
End: hcl.Pos{
Line: rawRange.End.Line,
Column: rawRange.End.Column,
Byte: rawRange.End.Byte,
},
}
}
diags = append(diags, diag)
}
return diags
}
func severityString(severity hcl.DiagnosticSeverity) string {
switch severity {
case hcl.DiagError:
return "error"
case hcl.DiagWarning:
return "warning"
default:
return "unsupported-severity"
}
}
func rangeString(rng hcl.Range) string {
return fmt.Sprintf(
"from line %d column %d byte %d to line %d column %d byte %d",
rng.Start.Line, rng.Start.Column, rng.Start.Byte,
rng.End.Line, rng.End.Column, rng.End.Byte,
)
}
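
To make the expected wire format concrete, here is a hypothetical payload matching the JSON tags above, as it might arrive on hcldec's stderr when run with `--diags=json` (the summary text is invented for the example):

```go
package main

import "fmt"

// exampleDecodeDiags feeds a hand-written diagnostics document through
// decodeJSONDiagnostics and prints what was recovered.
func exampleDecodeDiags() {
	src := []byte(`{
	  "diagnostics": [
	    {
	      "severity": "error",
	      "summary": "Unsuitable value type",
	      "subject": {
	        "filename": "example.hcl",
	        "start": {"line": 1, "column": 1, "byte": 0},
	        "end": {"line": 1, "column": 2, "byte": 1}
	      }
	    }
	  ]
	}`)
	diags := decodeJSONDiagnostics(src)
	fmt.Printf("%d diagnostic(s); first: %s\n", len(diags), diags[0].Summary)
}
```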

8
cmd/hclspecsuite/log.go Normal file
View File

@ -0,0 +1,8 @@
package main
import (
"github.com/hashicorp/hcl2/hcl"
)
type LogBeginCallback func(testName string, testFile *TestFile)
type LogProblemsCallback func(testName string, testFile *TestFile, diags hcl.Diagnostics)

71
cmd/hclspecsuite/main.go Normal file
View File

@ -0,0 +1,71 @@
package main
import (
"fmt"
"os"
"os/exec"
"golang.org/x/crypto/ssh/terminal"
"github.com/hashicorp/hcl2/hcl"
"github.com/hashicorp/hcl2/hclparse"
)
func main() {
os.Exit(realMain(os.Args[1:]))
}
func realMain(args []string) int {
if len(args) != 2 {
fmt.Fprintf(os.Stderr, "Usage: hclspecsuite <tests-dir> <hcldec-file>\n")
return 2
}
testsDir := args[0]
hcldecPath := args[1]
hcldecPath, err := exec.LookPath(hcldecPath)
if err != nil {
fmt.Fprintf(os.Stderr, "%s\n", err)
return 2
}
parser := hclparse.NewParser()
color := terminal.IsTerminal(int(os.Stderr.Fd()))
w, _, err := terminal.GetSize(int(os.Stdout.Fd()))
if err != nil {
w = 80
}
diagWr := hcl.NewDiagnosticTextWriter(os.Stderr, parser.Files(), uint(w), color)
var diagCount int
runner := &Runner{
parser: parser,
hcldecPath: hcldecPath,
baseDir: testsDir,
logBegin: func(name string, file *TestFile) {
fmt.Printf("- %s\n", name)
},
logProblems: func(name string, file *TestFile, diags hcl.Diagnostics) {
if len(diags) != 0 {
os.Stderr.WriteString("\n")
diagWr.WriteDiagnostics(diags)
diagCount += len(diags)
}
fmt.Printf("- %s\n", name)
},
}
diags := runner.Run()
if len(diags) != 0 {
os.Stderr.WriteString("\n\n\n== Test harness problems:\n\n")
diagWr.WriteDiagnostics(diags)
diagCount += len(diags)
}
if diagCount > 0 {
return 2
}
return 0
}

521
cmd/hclspecsuite/runner.go Normal file
View File

@ -0,0 +1,521 @@
package main
import (
"bytes"
"encoding/json"
"fmt"
"io/ioutil"
"os"
"os/exec"
"path/filepath"
"sort"
"strings"
"github.com/zclconf/go-cty/cty"
"github.com/zclconf/go-cty/cty/convert"
ctyjson "github.com/zclconf/go-cty/cty/json"
"github.com/hashicorp/hcl2/ext/typeexpr"
"github.com/hashicorp/hcl2/hcl"
"github.com/hashicorp/hcl2/hclparse"
)
type Runner struct {
parser *hclparse.Parser
hcldecPath string
baseDir string
logBegin LogBeginCallback
logProblems LogProblemsCallback
}
func (r *Runner) Run() hcl.Diagnostics {
return r.runDir(r.baseDir)
}
func (r *Runner) runDir(dir string) hcl.Diagnostics {
var diags hcl.Diagnostics
infos, err := ioutil.ReadDir(dir)
if err != nil {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Failed to read test directory",
Detail: fmt.Sprintf("The directory %q could not be opened: %s.", dir, err),
})
return diags
}
var tests []string
var subDirs []string
for _, info := range infos {
name := info.Name()
if strings.HasPrefix(name, ".") {
continue
}
if info.IsDir() {
subDirs = append(subDirs, name)
}
if strings.HasSuffix(name, ".t") {
tests = append(tests, name)
}
}
sort.Strings(tests)
sort.Strings(subDirs)
for _, filename := range tests {
filename = filepath.Join(dir, filename)
testDiags := r.runTest(filename)
diags = append(diags, testDiags...)
}
for _, dirName := range subDirs {
dir := filepath.Join(dir, dirName)
dirDiags := r.runDir(dir)
diags = append(diags, dirDiags...)
}
return diags
}
func (r *Runner) runTest(filename string) hcl.Diagnostics {
prettyName := r.prettyTestName(filename)
tf, diags := r.LoadTestFile(filename)
if diags.HasErrors() {
// We'll still log, so it's clearer which test the diagnostics belong to.
if r.logBegin != nil {
r.logBegin(prettyName, nil)
}
if r.logProblems != nil {
r.logProblems(prettyName, nil, diags)
return nil // don't duplicate the diagnostics we already reported
}
return diags
}
if r.logBegin != nil {
r.logBegin(prettyName, tf)
}
basePath := filename[:len(filename)-2]
specFilename := basePath + ".hcldec"
nativeFilename := basePath + ".hcl"
jsonFilename := basePath + ".hcl.json"
// We'll add the source code of the spec file to our own parser, even
// though it'll actually be parsed by the hcldec child process, since that
// way we can produce nice diagnostic messages if hcldec fails to process
// the spec file.
src, err := ioutil.ReadFile(specFilename)
if err == nil {
r.parser.AddFile(specFilename, &hcl.File{
Bytes: src,
})
}
if _, err := os.Stat(specFilename); err != nil {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Missing .hcldec file",
Detail: fmt.Sprintf("No specification file for test %s: %s.", prettyName, err),
})
return diags
}
if _, err := os.Stat(nativeFilename); err == nil {
moreDiags := r.runTestInput(specFilename, nativeFilename, tf)
diags = append(diags, moreDiags...)
}
if _, err := os.Stat(jsonFilename); err == nil {
moreDiags := r.runTestInput(specFilename, jsonFilename, tf)
diags = append(diags, moreDiags...)
}
if r.logProblems != nil {
r.logProblems(prettyName, nil, diags)
return nil // don't duplicate the diagnostics we already reported
}
return diags
}
func (r *Runner) runTestInput(specFilename, inputFilename string, tf *TestFile) hcl.Diagnostics {
// We'll add the source code of the input file to our own parser, even
// though it'll actually be parsed by the hcldec child process, since that
// way we can produce nice diagnostic messages if hcldec fails to process
// the input file.
src, err := ioutil.ReadFile(inputFilename)
if err == nil {
r.parser.AddFile(inputFilename, &hcl.File{
Bytes: src,
})
}
var diags hcl.Diagnostics
if tf.ChecksTraversals {
gotTraversals, moreDiags := r.hcldecVariables(specFilename, inputFilename)
diags = append(diags, moreDiags...)
if !moreDiags.HasErrors() {
expected := tf.ExpectedTraversals
for _, got := range gotTraversals {
e := findTraversalSpec(got, expected)
rng := got.SourceRange()
if e == nil {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Unexpected traversal",
Detail: "Detected traversal that is not indicated as expected in the test file.",
Subject: &rng,
})
} else {
moreDiags := checkTraversalsMatch(got, inputFilename, e)
diags = append(diags, moreDiags...)
}
}
// Look for any traversals that didn't show up at all.
for _, e := range expected {
if t := findTraversalForSpec(e, gotTraversals); t == nil {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Missing expected traversal",
Detail: "This expected traversal was not detected.",
Subject: e.Traversal.SourceRange().Ptr(),
})
}
}
}
}
val, transformDiags := r.hcldecTransform(specFilename, inputFilename)
if len(tf.ExpectedDiags) == 0 {
diags = append(diags, transformDiags...)
if transformDiags.HasErrors() {
// If hcldec failed then there's no point in continuing.
return diags
}
if errs := val.Type().TestConformance(tf.ResultType); len(errs) > 0 {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Incorrect result type",
Detail: fmt.Sprintf(
"Input file %s produced %s, but was expecting %s.",
inputFilename, typeexpr.TypeString(val.Type()), typeexpr.TypeString(tf.ResultType),
),
})
}
if tf.Result != cty.NilVal {
cmpVal, err := convert.Convert(tf.Result, tf.ResultType)
if err != nil {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Incorrect type for result value",
Detail: fmt.Sprintf(
"Result does not conform to the given result type: %s.", err,
),
Subject: &tf.ResultRange,
})
} else {
if !val.RawEquals(cmpVal) {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Incorrect result value",
Detail: fmt.Sprintf(
"Input file %s produced %#v, but was expecting %#v.",
inputFilename, val, tf.Result,
),
})
}
}
}
} else {
// We're expecting diagnostics, and so we'll need to correlate the
// severities and source ranges of our actual diagnostics against
// what we were expecting.
type DiagnosticEntry struct {
Severity hcl.DiagnosticSeverity
Range hcl.Range
}
got := make(map[DiagnosticEntry]*hcl.Diagnostic)
want := make(map[DiagnosticEntry]hcl.Range)
for _, diag := range transformDiags {
if diag.Subject == nil {
// Sourceless diagnostics can never be expected, so we'll just
// pass these through as-is and assume they are hcldec
// operational errors.
diags = append(diags, diag)
continue
}
if diag.Subject.Filename != inputFilename {
// If the problem is for something other than the input file
// then it can't be expected.
diags = append(diags, diag)
continue
}
entry := DiagnosticEntry{
Severity: diag.Severity,
Range: *diag.Subject,
}
got[entry] = diag
}
for _, e := range tf.ExpectedDiags {
e.Range.Filename = inputFilename // assumed here, since we don't allow any other filename to be expected
entry := DiagnosticEntry{
Severity: e.Severity,
Range: e.Range,
}
want[entry] = e.DeclRange
}
for gotEntry, diag := range got {
if _, wanted := want[gotEntry]; !wanted {
// Pass through the diagnostic itself so the user can see what happened
diags = append(diags, diag)
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Unexpected diagnostic",
Detail: fmt.Sprintf(
"No %s diagnostic was expected %s. The unexpected diagnostic was shown above.",
severityString(gotEntry.Severity), rangeString(gotEntry.Range),
),
Subject: gotEntry.Range.Ptr(),
})
}
}
for wantEntry, declRange := range want {
if _, gotted := got[wantEntry]; !gotted {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Missing expected diagnostic",
Detail: fmt.Sprintf(
"No %s diagnostic was generated %s.",
severityString(wantEntry.Severity), rangeString(wantEntry.Range),
),
Subject: declRange.Ptr(),
})
}
}
}
return diags
}
func (r *Runner) hcldecTransform(specFile, inputFile string) (cty.Value, hcl.Diagnostics) {
var diags hcl.Diagnostics
var outBuffer bytes.Buffer
var errBuffer bytes.Buffer
cmd := &exec.Cmd{
Path: r.hcldecPath,
Args: []string{
r.hcldecPath,
"--spec=" + specFile,
"--diags=json",
"--with-type",
inputFile,
},
Stdout: &outBuffer,
Stderr: &errBuffer,
}
err := cmd.Run()
if err != nil {
if _, isExit := err.(*exec.ExitError); !isExit {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Failed to run hcldec",
Detail: fmt.Sprintf("Sub-program hcldec failed to start: %s.", err),
})
return cty.DynamicVal, diags
}
// If we exited unsuccessfully then we'll expect diagnostics on stderr
moreDiags := decodeJSONDiagnostics(errBuffer.Bytes())
diags = append(diags, moreDiags...)
return cty.DynamicVal, diags
} else {
// Otherwise, we expect a JSON result value on stdout. Since we used
// --with-type above, we can decode as DynamicPseudoType to recover
// exactly the type that was saved, without the usual JSON lossiness.
val, err := ctyjson.Unmarshal(outBuffer.Bytes(), cty.DynamicPseudoType)
if err != nil {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Failed to parse hcldec result",
Detail: fmt.Sprintf("Sub-program hcldec produced an invalid result: %s.", err),
})
return cty.DynamicVal, diags
}
return val, diags
}
}
func (r *Runner) hcldecVariables(specFile, inputFile string) ([]hcl.Traversal, hcl.Diagnostics) {
var diags hcl.Diagnostics
var outBuffer bytes.Buffer
var errBuffer bytes.Buffer
cmd := &exec.Cmd{
Path: r.hcldecPath,
Args: []string{
r.hcldecPath,
"--spec=" + specFile,
"--diags=json",
"--var-refs",
inputFile,
},
Stdout: &outBuffer,
Stderr: &errBuffer,
}
err := cmd.Run()
if err != nil {
if _, isExit := err.(*exec.ExitError); !isExit {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Failed to run hcldec",
Detail: fmt.Sprintf("Sub-program hcldec (evaluating input) failed to start: %s.", err),
})
return nil, diags
}
// If we exited unsuccessfully then we'll expect diagnostics on stderr
moreDiags := decodeJSONDiagnostics(errBuffer.Bytes())
diags = append(diags, moreDiags...)
return nil, diags
} else {
// Otherwise, we expect a JSON description of the traversals on stdout.
type PosJSON struct {
Line int `json:"line"`
Column int `json:"column"`
Byte int `json:"byte"`
}
type RangeJSON struct {
Filename string `json:"filename"`
Start PosJSON `json:"start"`
End PosJSON `json:"end"`
}
type StepJSON struct {
Kind string `json:"kind"`
Name string `json:"name,omitempty"`
Key json.RawMessage `json:"key,omitempty"`
Range RangeJSON `json:"range"`
}
type TraversalJSON struct {
Steps []StepJSON `json:"steps"`
}
var raw []TraversalJSON
err := json.Unmarshal(outBuffer.Bytes(), &raw)
if err != nil {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Failed to parse hcldec result",
Detail: fmt.Sprintf("Sub-program hcldec (with --var-refs) produced an invalid result: %s.", err),
})
return nil, diags
}
var ret []hcl.Traversal
if len(raw) == 0 {
return ret, diags
}
ret = make([]hcl.Traversal, 0, len(raw))
for _, rawT := range raw {
traversal := make(hcl.Traversal, 0, len(rawT.Steps))
for _, rawS := range rawT.Steps {
rng := hcl.Range{
Filename: rawS.Range.Filename,
Start: hcl.Pos{
Line: rawS.Range.Start.Line,
Column: rawS.Range.Start.Column,
Byte: rawS.Range.Start.Byte,
},
End: hcl.Pos{
Line: rawS.Range.End.Line,
Column: rawS.Range.End.Column,
Byte: rawS.Range.End.Byte,
},
}
switch rawS.Kind {
case "root":
traversal = append(traversal, hcl.TraverseRoot{
Name: rawS.Name,
SrcRange: rng,
})
case "attr":
traversal = append(traversal, hcl.TraverseAttr{
Name: rawS.Name,
SrcRange: rng,
})
case "index":
ty, err := ctyjson.ImpliedType([]byte(rawS.Key))
if err != nil {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Failed to parse hcldec result",
Detail: fmt.Sprintf("Sub-program hcldec (with --var-refs) produced an invalid result: traversal step has invalid index key %s.", rawS.Key),
})
return nil, diags
}
keyVal, err := ctyjson.Unmarshal([]byte(rawS.Key), ty)
if err != nil {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Failed to parse hcldec result",
Detail: fmt.Sprintf("Sub-program hcldec (with --var-refs) produced a result with an invalid index key %s: %s.", rawS.Key, err),
})
return nil, diags
}
traversal = append(traversal, hcl.TraverseIndex{
Key: keyVal,
SrcRange: rng,
})
default:
// Should never happen since the above cases are exhaustive,
// but we'll catch it gracefully since this is coming from
// a possibly-buggy hcldec implementation that we're testing.
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Failed to parse hcldec result",
Detail: fmt.Sprintf("Sub-program hcldec (with --var-refs) produced an invalid result: traversal step of unsupported kind %q.", rawS.Kind),
})
return nil, diags
}
}
ret = append(ret, traversal)
}
return ret, diags
}
}
func (r *Runner) prettyDirName(dir string) string {
rel, err := filepath.Rel(r.baseDir, dir)
if err != nil {
return filepath.ToSlash(dir)
}
return filepath.ToSlash(rel)
}
func (r *Runner) prettyTestName(filename string) string {
dir := filepath.Dir(filename)
dirName := r.prettyDirName(dir)
filename = filepath.Base(filename)
testName := filename[:len(filename)-2]
if dirName == "." {
return testName
}
return fmt.Sprintf("%s/%s", dirName, testName)
}
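
Putting the file-name conventions in `runDir` and `runTest` together, a test directory is expected to look roughly like this (the names are illustrative):

```
tests/
  basic.t           # test expectations (result, result_type, traversals, diagnostics)
  basic.hcldec      # spec passed to hcldec for this test
  basic.hcl         # native-syntax input, run if present
  basic.hcl.json    # JSON-syntax input, run if present
  nested/           # subdirectories are walked recursively
    ...
```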

View File

@ -0,0 +1,350 @@
package main
import (
"fmt"
"github.com/zclconf/go-cty/cty"
"github.com/zclconf/go-cty/cty/convert"
"github.com/hashicorp/hcl2/ext/typeexpr"
"github.com/hashicorp/hcl2/gohcl"
"github.com/hashicorp/hcl2/hcl"
)
type TestFile struct {
Result cty.Value
ResultType cty.Type
ChecksTraversals bool
ExpectedTraversals []*TestFileExpectTraversal
ExpectedDiags []*TestFileExpectDiag
ResultRange hcl.Range
ResultTypeRange hcl.Range
}
type TestFileExpectTraversal struct {
Traversal hcl.Traversal
Range hcl.Range
DeclRange hcl.Range
}
type TestFileExpectDiag struct {
Severity hcl.DiagnosticSeverity
Range hcl.Range
DeclRange hcl.Range
}
func (r *Runner) LoadTestFile(filename string) (*TestFile, hcl.Diagnostics) {
f, diags := r.parser.ParseHCLFile(filename)
if diags.HasErrors() {
return nil, diags
}
content, moreDiags := f.Body.Content(testFileSchema)
diags = append(diags, moreDiags...)
if moreDiags.HasErrors() {
return nil, diags
}
ret := &TestFile{
ResultType: cty.DynamicPseudoType,
}
if typeAttr, exists := content.Attributes["result_type"]; exists {
ty, moreDiags := typeexpr.TypeConstraint(typeAttr.Expr)
diags = append(diags, moreDiags...)
if !moreDiags.HasErrors() {
ret.ResultType = ty
}
ret.ResultTypeRange = typeAttr.Expr.Range()
}
if resultAttr, exists := content.Attributes["result"]; exists {
resultVal, moreDiags := resultAttr.Expr.Value(nil)
diags = append(diags, moreDiags...)
if !moreDiags.HasErrors() {
resultVal, err := convert.Convert(resultVal, ret.ResultType)
if err != nil {
diags = diags.Append(&hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Invalid result value",
Detail: fmt.Sprintf("The result value does not conform to the given result type: %s.", err),
Subject: resultAttr.Expr.Range().Ptr(),
})
} else {
ret.Result = resultVal
}
}
ret.ResultRange = resultAttr.Expr.Range()
}
for _, block := range content.Blocks {
switch block.Type {
case "traversals":
if ret.ChecksTraversals {
// Indicates a duplicate traversals block
diags = diags.Append(&hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Duplicate \"traversals\" block",
Detail: fmt.Sprintf("Only one traversals block is expected."),
Subject: &block.TypeRange,
})
continue
}
expectTraversals, moreDiags := r.decodeTraversalsBlock(block)
diags = append(diags, moreDiags...)
if !moreDiags.HasErrors() {
ret.ChecksTraversals = true
ret.ExpectedTraversals = expectTraversals
}
case "diagnostics":
if len(ret.ExpectedDiags) > 0 {
// Indicates a duplicate diagnostics block
diags = diags.Append(&hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Duplicate \"diagnostics\" block",
Detail: fmt.Sprintf("Only one diagnostics block is expected."),
Subject: &block.TypeRange,
})
continue
}
expectDiags, moreDiags := r.decodeDiagnosticsBlock(block)
diags = append(diags, moreDiags...)
ret.ExpectedDiags = expectDiags
default:
// Shouldn't get here, because the above cases are exhaustive for
// our test file schema.
panic(fmt.Sprintf("unsupported block type %q", block.Type))
}
}
if ret.Result != cty.NilVal && len(ret.ExpectedDiags) > 0 {
diags = diags.Append(&hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Conflicting spec expectations",
Detail: "This test spec includes expected diagnostics, so it may not also include an expected result.",
Subject: &content.Attributes["result"].Range,
})
}
return ret, diags
}
func (r *Runner) decodeTraversalsBlock(block *hcl.Block) ([]*TestFileExpectTraversal, hcl.Diagnostics) {
var diags hcl.Diagnostics
content, moreDiags := block.Body.Content(testFileTraversalsSchema)
diags = append(diags, moreDiags...)
if moreDiags.HasErrors() {
return nil, diags
}
var ret []*TestFileExpectTraversal
for _, block := range content.Blocks {
// There's only one block type in our schema, so we can assume all
// blocks are of that type.
expectTraversal, moreDiags := r.decodeTraversalExpectBlock(block)
diags = append(diags, moreDiags...)
if expectTraversal != nil {
ret = append(ret, expectTraversal)
}
}
return ret, diags
}
func (r *Runner) decodeTraversalExpectBlock(block *hcl.Block) (*TestFileExpectTraversal, hcl.Diagnostics) {
var diags hcl.Diagnostics
rng, body, moreDiags := r.decodeRangeFromBody(block.Body)
diags = append(diags, moreDiags...)
content, moreDiags := body.Content(testFileTraversalExpectSchema)
diags = append(diags, moreDiags...)
if moreDiags.HasErrors() {
return nil, diags
}
var traversal hcl.Traversal
{
refAttr := content.Attributes["ref"]
traversal, moreDiags = hcl.AbsTraversalForExpr(refAttr.Expr)
diags = append(diags, moreDiags...)
if moreDiags.HasErrors() {
return nil, diags
}
}
return &TestFileExpectTraversal{
Traversal: traversal,
Range: rng,
DeclRange: block.DefRange,
}, diags
}
func (r *Runner) decodeDiagnosticsBlock(block *hcl.Block) ([]*TestFileExpectDiag, hcl.Diagnostics) {
var diags hcl.Diagnostics
content, moreDiags := block.Body.Content(testFileDiagnosticsSchema)
diags = append(diags, moreDiags...)
if moreDiags.HasErrors() {
return nil, diags
}
if len(content.Blocks) == 0 {
diags = diags.Append(&hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Empty diagnostics block",
Detail: "If a diagnostics block is present, at least one expectation statement (\"error\" or \"warning\" block) must be included.",
Subject: &block.TypeRange,
})
return nil, diags
}
ret := make([]*TestFileExpectDiag, 0, len(content.Blocks))
for _, block := range content.Blocks {
rng, remain, moreDiags := r.decodeRangeFromBody(block.Body)
diags = append(diags, moreDiags...)
if diags.HasErrors() {
continue
}
// Should have nothing else in the block aside from the range definition.
_, moreDiags = remain.Content(&hcl.BodySchema{})
diags = append(diags, moreDiags...)
var severity hcl.DiagnosticSeverity
switch block.Type {
case "error":
severity = hcl.DiagError
case "warning":
severity = hcl.DiagWarning
default:
panic(fmt.Sprintf("unsupported block type %q", block.Type))
}
ret = append(ret, &TestFileExpectDiag{
Severity: severity,
Range: rng,
DeclRange: block.TypeRange,
})
}
return ret, diags
}
func (r *Runner) decodeRangeFromBody(body hcl.Body) (hcl.Range, hcl.Body, hcl.Diagnostics) {
type RawPos struct {
Line int `hcl:"line"`
Column int `hcl:"column"`
Byte int `hcl:"byte"`
}
type RawRange struct {
From RawPos `hcl:"from,block"`
To RawPos `hcl:"to,block"`
Remain hcl.Body `hcl:",remain"`
}
var raw RawRange
diags := gohcl.DecodeBody(body, nil, &raw)
return hcl.Range{
// We intentionally omit Filename here, because the test spec doesn't
// need to specify that explicitly: we can infer it to be the file
// path we pass to hcldec.
Start: hcl.Pos{
Line: raw.From.Line,
Column: raw.From.Column,
Byte: raw.From.Byte,
},
End: hcl.Pos{
Line: raw.To.Line,
Column: raw.To.Column,
Byte: raw.To.Byte,
},
}, raw.Remain, diags
}
var testFileSchema = &hcl.BodySchema{
Attributes: []hcl.AttributeSchema{
{
Name: "result",
},
{
Name: "result_type",
},
},
Blocks: []hcl.BlockHeaderSchema{
{
Type: "traversals",
},
{
Type: "diagnostics",
},
},
}
var testFileTraversalsSchema = &hcl.BodySchema{
Blocks: []hcl.BlockHeaderSchema{
{
Type: "expect",
},
},
}
var testFileTraversalExpectSchema = &hcl.BodySchema{
Attributes: []hcl.AttributeSchema{
{
Name: "ref",
Required: true,
},
},
Blocks: []hcl.BlockHeaderSchema{
{
Type: "range",
},
},
}
var testFileDiagnosticsSchema = &hcl.BodySchema{
Blocks: []hcl.BlockHeaderSchema{
{
Type: "error",
},
{
Type: "warning",
},
},
}
var testFileRangeSchema = &hcl.BodySchema{
Blocks: []hcl.BlockHeaderSchema{
{
Type: "from",
},
{
Type: "to",
},
},
}
var testFilePosSchema = &hcl.BodySchema{
Attributes: []hcl.AttributeSchema{
{
Name: "line",
Required: true,
},
{
Name: "column",
Required: true,
},
{
Name: "byte",
Required: true,
},
},
}

View File

@ -0,0 +1,117 @@
package main
import (
"fmt"
"reflect"
"github.com/hashicorp/hcl2/hcl"
)
func findTraversalSpec(got hcl.Traversal, candidates []*TestFileExpectTraversal) *TestFileExpectTraversal {
for _, candidate := range candidates {
if traversalsAreEquivalent(candidate.Traversal, got) {
return candidate
}
}
return nil
}
func findTraversalForSpec(want *TestFileExpectTraversal, have []hcl.Traversal) hcl.Traversal {
for _, candidate := range have {
if traversalsAreEquivalent(candidate, want.Traversal) {
return candidate
}
}
return nil
}
func traversalsAreEquivalent(a, b hcl.Traversal) bool {
if len(a) != len(b) {
return false
}
for i := range a {
aStep := a[i]
bStep := b[i]
if reflect.TypeOf(aStep) != reflect.TypeOf(bStep) {
return false
}
// We can now assume that both are of the same type.
switch ts := aStep.(type) {
case hcl.TraverseRoot:
if bStep.(hcl.TraverseRoot).Name != ts.Name {
return false
}
case hcl.TraverseAttr:
if bStep.(hcl.TraverseAttr).Name != ts.Name {
return false
}
case hcl.TraverseIndex:
if !bStep.(hcl.TraverseIndex).Key.RawEquals(ts.Key) {
return false
}
default:
return false
}
}
return true
}
// checkTraversalsMatch determines if a given traversal matches the given
// expectation, which must've been produced by an earlier call to
// findTraversalSpec for the same traversal.
func checkTraversalsMatch(got hcl.Traversal, filename string, match *TestFileExpectTraversal) hcl.Diagnostics {
var diags hcl.Diagnostics
gotRng := got.SourceRange()
wantRng := match.Range
if got, want := gotRng.Filename, filename; got != want {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Incorrect filename in detected traversal",
Detail: fmt.Sprintf(
"Filename was reported as %q, but was expecting %q.",
got, want,
),
Subject: match.Traversal.SourceRange().Ptr(),
})
return diags
}
// If we have the expected filename then we'll use that to construct the
// full "want range" here so that we can use it to point to the appropriate
// location in the remaining diagnostics.
wantRng.Filename = filename
if got, want := gotRng.Start, wantRng.Start; got != want {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Incorrect start position in detected traversal",
Detail: fmt.Sprintf(
"Start position was reported as line %d column %d byte %d, but was expecting line %d column %d byte %d.",
got.Line, got.Column, got.Byte,
want.Line, want.Column, want.Byte,
),
Subject: &wantRng,
})
}
if got, want := gotRng.End, wantRng.End; got != want {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Incorrect end position in detected traversal",
Detail: fmt.Sprintf(
"End position was reported as line %d column %d byte %d, but was expecting line %d column %d byte %d.",
got.Line, got.Column, got.Byte,
want.Line, want.Column, want.Byte,
),
Subject: &wantRng,
})
}
return diags
}

View File

@ -1,767 +0,0 @@
package hcl
import (
"errors"
"fmt"
"reflect"
"sort"
"strconv"
"strings"
"github.com/hashicorp/hcl/hcl/ast"
"github.com/hashicorp/hcl/hcl/parser"
"github.com/hashicorp/hcl/hcl/token"
)
// This is the tag to use with structures to have settings for HCL
const tagName = "hcl"
var (
// nodeType holds a reference to the type of ast.Node
nodeType reflect.Type = findNodeType()
)
// Unmarshal accepts a byte slice as input and writes the
// data to the value pointed to by v.
func Unmarshal(bs []byte, v interface{}) error {
root, err := parse(bs)
if err != nil {
return err
}
return DecodeObject(v, root)
}
// Decode reads the given input and decodes it into the structure
// given by `out`.
func Decode(out interface{}, in string) error {
obj, err := Parse(in)
if err != nil {
return err
}
return DecodeObject(out, obj)
}
// DecodeObject is a lower-level version of Decode. It decodes a
// raw Object into the given output.
func DecodeObject(out interface{}, n ast.Node) error {
val := reflect.ValueOf(out)
if val.Kind() != reflect.Ptr {
return errors.New("result must be a pointer")
}
// If we have the file, we really decode the root node
if f, ok := n.(*ast.File); ok {
n = f.Node
}
var d decoder
return d.decode("root", n, val.Elem())
}
type decoder struct {
stack []reflect.Kind
}
func (d *decoder) decode(name string, node ast.Node, result reflect.Value) error {
k := result
// If we have an interface with a valid value, we use that
// for the check.
if result.Kind() == reflect.Interface {
elem := result.Elem()
if elem.IsValid() {
k = elem
}
}
// Push current onto stack unless it is an interface.
if k.Kind() != reflect.Interface {
d.stack = append(d.stack, k.Kind())
// Schedule a pop
defer func() {
d.stack = d.stack[:len(d.stack)-1]
}()
}
switch k.Kind() {
case reflect.Bool:
return d.decodeBool(name, node, result)
case reflect.Float32, reflect.Float64:
return d.decodeFloat(name, node, result)
case reflect.Int, reflect.Int32, reflect.Int64:
return d.decodeInt(name, node, result)
case reflect.Interface:
// When we see an interface, we make our own thing
return d.decodeInterface(name, node, result)
case reflect.Map:
return d.decodeMap(name, node, result)
case reflect.Ptr:
return d.decodePtr(name, node, result)
case reflect.Slice:
return d.decodeSlice(name, node, result)
case reflect.String:
return d.decodeString(name, node, result)
case reflect.Struct:
return d.decodeStruct(name, node, result)
default:
return &parser.PosError{
Pos: node.Pos(),
Err: fmt.Errorf("%s: unknown kind to decode into: %s", name, k.Kind()),
}
}
}
func (d *decoder) decodeBool(name string, node ast.Node, result reflect.Value) error {
switch n := node.(type) {
case *ast.LiteralType:
switch n.Token.Type {
case token.BOOL, token.STRING, token.NUMBER:
var v bool
s := strings.ToLower(strings.Replace(n.Token.Text, "\"", "", -1))
switch s {
case "1", "true":
v = true
case "0", "false":
v = false
default:
return fmt.Errorf("decodeBool: Unknown value for boolean: %s", n.Token.Text)
}
result.Set(reflect.ValueOf(v))
return nil
}
}
return &parser.PosError{
Pos: node.Pos(),
Err: fmt.Errorf("%s: unknown type %T", name, node),
}
}
func (d *decoder) decodeFloat(name string, node ast.Node, result reflect.Value) error {
switch n := node.(type) {
case *ast.LiteralType:
if n.Token.Type == token.FLOAT || n.Token.Type == token.NUMBER {
v, err := strconv.ParseFloat(n.Token.Text, 64)
if err != nil {
return err
}
result.Set(reflect.ValueOf(v).Convert(result.Type()))
return nil
}
}
return &parser.PosError{
Pos: node.Pos(),
Err: fmt.Errorf("%s: unknown type %T", name, node),
}
}
func (d *decoder) decodeInt(name string, node ast.Node, result reflect.Value) error {
switch n := node.(type) {
case *ast.LiteralType:
switch n.Token.Type {
case token.NUMBER:
v, err := strconv.ParseInt(n.Token.Text, 0, 0)
if err != nil {
return err
}
if result.Kind() == reflect.Interface {
result.Set(reflect.ValueOf(int(v)))
} else {
result.SetInt(v)
}
return nil
case token.STRING:
v, err := strconv.ParseInt(n.Token.Value().(string), 0, 0)
if err != nil {
return err
}
if result.Kind() == reflect.Interface {
result.Set(reflect.ValueOf(int(v)))
} else {
result.SetInt(v)
}
return nil
}
}
return &parser.PosError{
Pos: node.Pos(),
Err: fmt.Errorf("%s: unknown type %T", name, node),
}
}
func (d *decoder) decodeInterface(name string, node ast.Node, result reflect.Value) error {
// When we see an ast.Node, we retain the value to enable deferred decoding.
// Very useful in situations where we want to preserve ast.Node information
// like Pos
if result.Type() == nodeType && result.CanSet() {
result.Set(reflect.ValueOf(node))
return nil
}
var set reflect.Value
redecode := true
// For testing types, ObjectType should just be treated as a list. We
// set this to a temporary var because we want to pass in the real node.
testNode := node
if ot, ok := node.(*ast.ObjectType); ok {
testNode = ot.List
}
switch n := testNode.(type) {
case *ast.ObjectList:
// If we're at the root or we're directly within a slice, then we
// decode objects into map[string]interface{}, otherwise we decode
// them into lists.
if len(d.stack) == 0 || d.stack[len(d.stack)-1] == reflect.Slice {
var temp map[string]interface{}
tempVal := reflect.ValueOf(temp)
result := reflect.MakeMap(
reflect.MapOf(
reflect.TypeOf(""),
tempVal.Type().Elem()))
set = result
} else {
var temp []map[string]interface{}
tempVal := reflect.ValueOf(temp)
result := reflect.MakeSlice(
reflect.SliceOf(tempVal.Type().Elem()), 0, len(n.Items))
set = result
}
case *ast.ObjectType:
// If we're at the root or we're directly within a slice, then we
// decode objects into map[string]interface{}, otherwise we decode
// them into lists.
if len(d.stack) == 0 || d.stack[len(d.stack)-1] == reflect.Slice {
var temp map[string]interface{}
tempVal := reflect.ValueOf(temp)
result := reflect.MakeMap(
reflect.MapOf(
reflect.TypeOf(""),
tempVal.Type().Elem()))
set = result
} else {
var temp []map[string]interface{}
tempVal := reflect.ValueOf(temp)
result := reflect.MakeSlice(
reflect.SliceOf(tempVal.Type().Elem()), 0, 1)
set = result
}
case *ast.ListType:
var temp []interface{}
tempVal := reflect.ValueOf(temp)
result := reflect.MakeSlice(
reflect.SliceOf(tempVal.Type().Elem()), 0, 0)
set = result
case *ast.LiteralType:
switch n.Token.Type {
case token.BOOL:
var result bool
set = reflect.Indirect(reflect.New(reflect.TypeOf(result)))
case token.FLOAT:
var result float64
set = reflect.Indirect(reflect.New(reflect.TypeOf(result)))
case token.NUMBER:
var result int
set = reflect.Indirect(reflect.New(reflect.TypeOf(result)))
case token.STRING, token.HEREDOC:
set = reflect.Indirect(reflect.New(reflect.TypeOf("")))
default:
return &parser.PosError{
Pos: node.Pos(),
Err: fmt.Errorf("%s: cannot decode into interface: %T", name, node),
}
}
default:
return fmt.Errorf(
"%s: cannot decode into interface: %T",
name, node)
}
// Set the result to what it's supposed to be, then reset
// result so we don't reflect into this method anymore.
result.Set(set)
if redecode {
// Revisit the node so that we can use the newly instantiated
// thing and populate it.
if err := d.decode(name, node, result); err != nil {
return err
}
}
return nil
}
func (d *decoder) decodeMap(name string, node ast.Node, result reflect.Value) error {
if item, ok := node.(*ast.ObjectItem); ok {
node = &ast.ObjectList{Items: []*ast.ObjectItem{item}}
}
if ot, ok := node.(*ast.ObjectType); ok {
node = ot.List
}
n, ok := node.(*ast.ObjectList)
if !ok {
return &parser.PosError{
Pos: node.Pos(),
Err: fmt.Errorf("%s: not an object type for map (%T)", name, node),
}
}
// If we have an interface, then we can address the interface,
// but not the slice itself, so get the element but set the interface
set := result
if result.Kind() == reflect.Interface {
result = result.Elem()
}
resultType := result.Type()
resultElemType := resultType.Elem()
resultKeyType := resultType.Key()
if resultKeyType.Kind() != reflect.String {
return &parser.PosError{
Pos: node.Pos(),
Err: fmt.Errorf("%s: map must have string keys", name),
}
}
// Make a map if it is nil
resultMap := result
if result.IsNil() {
resultMap = reflect.MakeMap(
reflect.MapOf(resultKeyType, resultElemType))
}
// Go through each element and decode it.
done := make(map[string]struct{})
for _, item := range n.Items {
if item.Val == nil {
continue
}
// github.com/hashicorp/terraform/issue/5740
if len(item.Keys) == 0 {
return &parser.PosError{
Pos: node.Pos(),
Err: fmt.Errorf("%s: map must have string keys", name),
}
}
// Get the key we're dealing with, which is the first item
keyStr := item.Keys[0].Token.Value().(string)
// If we've already processed this key, then ignore it
if _, ok := done[keyStr]; ok {
continue
}
// Determine the value. If we have more than one key, then we
// get the objectlist of only these keys.
itemVal := item.Val
if len(item.Keys) > 1 {
itemVal = n.Filter(keyStr)
done[keyStr] = struct{}{}
}
// Make the field name
fieldName := fmt.Sprintf("%s.%s", name, keyStr)
// Get the key/value as reflection values
key := reflect.ValueOf(keyStr)
val := reflect.Indirect(reflect.New(resultElemType))
// If we have a pre-existing value in the map, use that
oldVal := resultMap.MapIndex(key)
if oldVal.IsValid() {
val.Set(oldVal)
}
// Decode!
if err := d.decode(fieldName, itemVal, val); err != nil {
return err
}
// Set the value on the map
resultMap.SetMapIndex(key, val)
}
// Set the final map if we can
set.Set(resultMap)
return nil
}
func (d *decoder) decodePtr(name string, node ast.Node, result reflect.Value) error {
// if pointer is not nil, decode into existing value
if !result.IsNil() {
return d.decode(name, node, result.Elem())
}
// Create an element of the concrete (non pointer) type and decode
// into that. Then set the value of the pointer to this type.
resultType := result.Type()
resultElemType := resultType.Elem()
val := reflect.New(resultElemType)
if err := d.decode(name, node, reflect.Indirect(val)); err != nil {
return err
}
result.Set(val)
return nil
}
func (d *decoder) decodeSlice(name string, node ast.Node, result reflect.Value) error {
// If we have an interface, then we can address the interface,
// but not the slice itself, so get the element but set the interface
set := result
if result.Kind() == reflect.Interface {
result = result.Elem()
}
// Create the slice if it is nil
resultType := result.Type()
resultElemType := resultType.Elem()
if result.IsNil() {
resultSliceType := reflect.SliceOf(resultElemType)
result = reflect.MakeSlice(
resultSliceType, 0, 0)
}
// Figure out the items we'll be copying into the slice
var items []ast.Node
switch n := node.(type) {
case *ast.ObjectList:
items = make([]ast.Node, len(n.Items))
for i, item := range n.Items {
items[i] = item
}
case *ast.ObjectType:
items = []ast.Node{n}
case *ast.ListType:
items = n.List
default:
return &parser.PosError{
Pos: node.Pos(),
Err: fmt.Errorf("unknown slice type: %T", node),
}
}
for i, item := range items {
fieldName := fmt.Sprintf("%s[%d]", name, i)
// Decode
val := reflect.Indirect(reflect.New(resultElemType))
// if item is an object that was decoded from ambiguous JSON and
// flattened, make sure it's expanded if it needs to decode into a
// defined structure.
item := expandObject(item, val)
if err := d.decode(fieldName, item, val); err != nil {
return err
}
// Append it onto the slice
result = reflect.Append(result, val)
}
set.Set(result)
return nil
}
// expandObject detects if an ambiguous JSON object was flattened to a List which
// should be decoded into a struct, and expands the ast to properly decode.
func expandObject(node ast.Node, result reflect.Value) ast.Node {
item, ok := node.(*ast.ObjectItem)
if !ok {
return node
}
elemType := result.Type()
// our target type must be a struct
switch elemType.Kind() {
case reflect.Ptr:
switch elemType.Elem().Kind() {
case reflect.Struct:
//OK
default:
return node
}
case reflect.Struct:
//OK
default:
return node
}
// A list value will have a key and field name. If it had more fields,
// it wouldn't have been flattened.
if len(item.Keys) != 2 {
return node
}
keyToken := item.Keys[0].Token
item.Keys = item.Keys[1:]
// we need to un-flatten the ast enough to decode
newNode := &ast.ObjectItem{
Keys: []*ast.ObjectKey{
{
Token: keyToken,
},
},
Val: &ast.ObjectType{
List: &ast.ObjectList{
Items: []*ast.ObjectItem{item},
},
},
}
return newNode
}
func (d *decoder) decodeString(name string, node ast.Node, result reflect.Value) error {
switch n := node.(type) {
case *ast.LiteralType:
switch n.Token.Type {
case token.NUMBER, token.FLOAT, token.BOOL:
result.Set(reflect.ValueOf(n.Token.Text).Convert(result.Type()))
return nil
case token.STRING, token.HEREDOC:
result.Set(reflect.ValueOf(n.Token.Value()).Convert(result.Type()))
return nil
}
}
return &parser.PosError{
Pos: node.Pos(),
Err: fmt.Errorf("%s: unknown type for string %T", name, node),
}
}
func (d *decoder) decodeStruct(name string, node ast.Node, result reflect.Value) error {
var item *ast.ObjectItem
if it, ok := node.(*ast.ObjectItem); ok {
item = it
node = it.Val
}
if ot, ok := node.(*ast.ObjectType); ok {
node = ot.List
}
// Handle the special case where the object itself is a literal. Previously
// the yacc parser would always ensure top-level elements were arrays. The new
// parser does not make the same guarantees, thus we need to convert any
// top-level literal elements into a list.
if _, ok := node.(*ast.LiteralType); ok && item != nil {
node = &ast.ObjectList{Items: []*ast.ObjectItem{item}}
}
list, ok := node.(*ast.ObjectList)
if !ok {
return &parser.PosError{
Pos: node.Pos(),
Err: fmt.Errorf("%s: not an object type for struct (%T)", name, node),
}
}
// This slice will keep track of all the structs we'll be decoding.
// There can be more than one struct if there are embedded structs
// that are squashed.
structs := make([]reflect.Value, 1, 5)
structs[0] = result
// Compile the list of all the fields that we're going to be decoding
// from all the structs.
type field struct {
field reflect.StructField
val reflect.Value
}
fields := []field{}
for len(structs) > 0 {
structVal := structs[0]
structs = structs[1:]
structType := structVal.Type()
for i := 0; i < structType.NumField(); i++ {
fieldType := structType.Field(i)
tagParts := strings.Split(fieldType.Tag.Get(tagName), ",")
// Ignore fields with tag name "-"
if tagParts[0] == "-" {
continue
}
if fieldType.Anonymous {
fieldKind := fieldType.Type.Kind()
if fieldKind != reflect.Struct {
return &parser.PosError{
Pos: node.Pos(),
Err: fmt.Errorf("%s: unsupported type to struct: %s",
fieldType.Name, fieldKind),
}
}
// We have an embedded field. We "squash" the fields down
// if specified in the tag.
squash := false
for _, tag := range tagParts[1:] {
if tag == "squash" {
squash = true
break
}
}
if squash {
structs = append(
structs, result.FieldByName(fieldType.Name))
continue
}
}
// Normal struct field, store it away
fields = append(fields, field{fieldType, structVal.Field(i)})
}
}
decodedFields := make([]string, 0, len(fields))
decodedFieldsVal := make([]reflect.Value, 0)
unusedKeysVal := make([]reflect.Value, 0)
// fill unusedNodeKeys with keys from the AST
// kept as a slice because matching against Filter requires a case-insensitive (EqualFold) comparison
unusedNodeKeys := make([]string, 0)
for _, item := range list.Items {
for _, k := range item.Keys {
unusedNodeKeys = append(unusedNodeKeys, k.Token.Value().(string))
}
}
for _, f := range fields {
field, fieldValue := f.field, f.val
if !fieldValue.IsValid() {
// This should never happen
panic("field is not valid")
}
// If we can't set the field, then it is unexported or something,
// and we just continue onwards.
if !fieldValue.CanSet() {
continue
}
fieldName := field.Name
tagValue := field.Tag.Get(tagName)
tagParts := strings.SplitN(tagValue, ",", 2)
if len(tagParts) >= 2 {
switch tagParts[1] {
case "decodedFields":
decodedFieldsVal = append(decodedFieldsVal, fieldValue)
continue
case "key":
if item == nil {
return &parser.PosError{
Pos: node.Pos(),
Err: fmt.Errorf("%s: %s asked for 'key', impossible",
name, fieldName),
}
}
fieldValue.SetString(item.Keys[0].Token.Value().(string))
continue
case "unusedKeys":
unusedKeysVal = append(unusedKeysVal, fieldValue)
continue
}
}
if tagParts[0] != "" {
fieldName = tagParts[0]
}
// Determine the element we'll use to decode. If it is a single
// match (only object with the field), then we decode it exactly.
// If it is a prefix match, then we decode the matches.
filter := list.Filter(fieldName)
prefixMatches := filter.Children()
matches := filter.Elem()
if len(matches.Items) == 0 && len(prefixMatches.Items) == 0 {
continue
}
// Track the used keys
unusedNodeKeys = removeCaseFold(unusedNodeKeys, fieldName)
// Create the field name and decode. We range over the elements
// because we actually want the value.
fieldName = fmt.Sprintf("%s.%s", name, fieldName)
if len(prefixMatches.Items) > 0 {
if err := d.decode(fieldName, prefixMatches, fieldValue); err != nil {
return err
}
}
for _, match := range matches.Items {
var decodeNode ast.Node = match.Val
if ot, ok := decodeNode.(*ast.ObjectType); ok {
decodeNode = &ast.ObjectList{Items: ot.List.Items}
}
if err := d.decode(fieldName, decodeNode, fieldValue); err != nil {
return err
}
}
decodedFields = append(decodedFields, field.Name)
}
if len(decodedFieldsVal) > 0 {
// Sort it so that it is deterministic
sort.Strings(decodedFields)
for _, v := range decodedFieldsVal {
v.Set(reflect.ValueOf(decodedFields))
}
}
if len(unusedNodeKeys) > 0 {
// Like decodedFields, populate the unusedKeys field(s)
sort.Strings(unusedNodeKeys)
for _, v := range unusedKeysVal {
v.Set(reflect.ValueOf(unusedNodeKeys))
}
}
return nil
}
// findNodeType returns the type of ast.Node
func findNodeType() reflect.Type {
var nodeContainer struct {
Node ast.Node
}
value := reflect.ValueOf(nodeContainer).FieldByName("Node")
return value.Type()
}
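// removeCaseFold removes the first element of xs that equals y under
// Unicode case folding, returning the (possibly shortened) slice.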
func removeCaseFold(xs []string, y string) []string {
for i, x := range xs {
if strings.EqualFold(x, y) {
return append(xs[:i], xs[i+1:]...)
}
}
return xs
}

File diff suppressed because it is too large

9
ext/README.md Normal file
View File

@ -0,0 +1,9 @@
# HCL Extensions
This directory contains packages implementing extensions to HCL that add
features by building on the core API in the main `hcl` package.
These serve as optional language extensions for use-cases that apply only to
specific callers. Generally they make the language more expressive at the
expense of increased dynamic behavior, which may be undesirable for
applications that need to impose a more rigid structure on configuration.

184
ext/dynblock/README.md Normal file
View File

@ -0,0 +1,184 @@
# HCL Dynamic Blocks Extension
This HCL extension implements a special block type named "dynamic" that can
be used to dynamically generate blocks of other types by iterating over
collection values.
Normally the block structure in an HCL configuration file is rigid, even
though dynamic expressions can be used within attribute values. This is
convenient for most applications since it allows the overall structure of
the document to be decoded easily, but in some applications it is desirable
to allow dynamic block generation within certain portions of the configuration.
Dynamic block generation is performed using the `dynamic` block type:
```hcl
toplevel {
nested {
foo = "static block 1"
}
dynamic "nested" {
for_each = ["a", "b", "c"]
iterator = nested
content {
foo = "dynamic block ${nested.value}"
}
}
nested {
foo = "static block 2"
}
}
```
The above is interpreted as if it were written as follows:
```hcl
toplevel {
nested {
foo = "static block 1"
}
nested {
foo = "dynamic block a"
}
nested {
foo = "dynamic block b"
}
nested {
foo = "dynamic block c"
}
nested {
foo = "static block 2"
}
}
```
Since HCL block syntax is not normally exposed to the possibility of unknown
values, this extension must make some compromises when asked to iterate over
an unknown collection. If the length of the collection cannot be statically
recognized (because it is an unknown value of list, map, or set type) then
the `dynamic` construct will generate a _single_ dynamic block whose iterator
key and value are both unknown values of the dynamic pseudo-type, thus causing
any attribute values derived from iteration to appear as unknown values. There
is no explicit representation of the fact that the length of the collection may
eventually be different than one.
## Usage
Pass a body to function `Expand` to obtain a new body that will, on access
to its content, evaluate and expand any nested `dynamic` blocks.
Dynamic block processing is also automatically propagated into any nested
blocks that are returned, allowing users to nest dynamic blocks inside
one another and to nest dynamic blocks inside other static blocks.
HCL structural decoding does not normally have access to an `EvalContext`, so
any variables and functions that should be available to the `for_each`
and `labels` expressions must be passed in when calling `Expand`. Expressions
within the `content` block are evaluated separately and so can be passed a
separate `EvalContext` if desired, during normal attribute expression
evaluation.
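For illustration, here is a minimal sketch of that workflow; the `listener`
block type, the `ports` variable, and the parsed `src` are hypothetical
stand-ins for whatever the calling application defines:
```go
// Sketch only: assumes the hcl, hclsyntax, cty, and dynblock packages are
// imported, and that src contains configuration using "listener" blocks.
f, diags := hclsyntax.ParseConfig(src, "example.hcl", hcl.Pos{Line: 1, Column: 1})
if diags.HasErrors() {
	// handle parse errors
}

// Variables (and functions) made available to the for_each and labels
// expressions inside dynamic blocks.
expandCtx := &hcl.EvalContext{
	Variables: map[string]cty.Value{
		"ports": cty.ListVal([]cty.Value{cty.NumberIntVal(80), cty.NumberIntVal(443)}),
	},
}

body := dynblock.Expand(f.Body, expandCtx)

// Expansion happens lazily, at the point where content is requested.
content, diags := body.Content(&hcl.BodySchema{
	Blocks: []hcl.BlockHeaderSchema{{Type: "listener"}},
})
// ... use content.Blocks as usual ...
```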
## Detecting Variables
Some applications dynamically generate an `EvalContext` by analyzing which
variables are referenced by an expression before evaluating it.
This unfortunately requires some extra effort when this analysis is required
for the context passed to `Expand`: the HCL API requires a schema to be
provided in order to do any analysis of the blocks in a body, but the low-level
schema model provides a description of only one level of nested blocks at
a time, and thus a new schema must be provided for each additional level of
nesting.
To make this arduous process as convenient as possible, this package provides
a helper function `WalkForEachVariables`, which returns a `WalkVariablesNode`
instance that can be used to find variables directly in a given body and also
determine which nested blocks require recursive calls. Using this mechanism
requires that the caller be able to look up a schema given a nested block type.
For _simple_ formats where a specific block type name always has the same schema
regardless of context, a walk can be implemented as follows:
```go
func walkVariables(node dynblock.WalkVariablesNode, schema *hcl.BodySchema) []hcl.Traversal {
vars, children := node.Visit(schema)
for _, child := range children {
var childSchema *hcl.BodySchema
switch child.BlockTypeName {
case "a":
childSchema = &hcl.BodySchema{
Blocks: []hcl.BlockHeaderSchema{
{
Type: "b",
LabelNames: []string{"key"},
},
},
}
case "b":
childSchema = &hcl.BodySchema{
Attributes: []hcl.AttributeSchema{
{
Name: "val",
Required: true,
},
},
}
default:
// Should never happen, because the above cases should be exhaustive
// for the application's configuration format.
panic(fmt.Errorf("can't find schema for unknown block type %q", child.BlockTypeName))
}
vars = append(vars, walkVariables(child.Node, childSchema)...)
}
return vars
}
```
### Detecting Variables with `hcldec` Specifications
For applications that use the higher-level `hcldec` package to decode nested
configuration structures into `cty` values, the same specification can be used
to automatically drive the recursive variable-detection walk described above.
The helper function `ForEachVariablesHCLDec` allows an entire recursive
configuration structure to be analyzed in a single call given a `hcldec.Spec`
that describes the nested block structure. This means a `hcldec`-based
application can support dynamic blocks with only a little additional effort:
```go
func decodeBody(body hcl.Body, spec hcldec.Spec) (cty.Value, hcl.Diagnostics) {
// Determine which variables are needed to expand dynamic blocks
neededForDynamic := dynblock.ForEachVariablesHCLDec(body, spec)
// Build a suitable EvalContext and expand dynamic blocks
dynCtx := buildEvalContext(neededForDynamic)
dynBody := dynblock.Expand(body, dynCtx)
// Determine which variables are needed to fully decode the expanded body
// This will analyze expressions that came both from static blocks in the
// original body and from blocks that were dynamically added by Expand.
neededForDecode := hcldec.Variables(dynBody, spec)
// Build a suitable EvalContext and then fully decode the body as per the
// hcldec specification.
decCtx := buildEvalContext(neededForDecode)
return hcldec.Decode(dynBody, spec, decCtx)
}
func buildEvalContext(needed []hcl.Traversal) *hcl.EvalContext {
// (to be implemented by your application)
}
```
## Performance
This extension cuts quite harshly against the grain of the HCL API, and so it
uses many wrapping objects and temporary data structures to get its work done.
HCL in general is not suitable for high-performance situations or situations
sensitive to memory pressure, but that is _especially_ true of this extension.

262
ext/dynblock/expand_body.go Normal file
View File

@ -0,0 +1,262 @@
package dynblock
import (
"fmt"
"github.com/hashicorp/hcl2/hcl"
"github.com/zclconf/go-cty/cty"
)
// expandBody wraps another hcl.Body and expands any "dynamic" blocks found
// inside whenever Content or PartialContent is called.
type expandBody struct {
original hcl.Body
forEachCtx *hcl.EvalContext
iteration *iteration // non-nil if we're nested inside another "dynamic" block
// These are used with PartialContent to produce a "remaining items"
// body to return. They are nil on all bodies fresh out of the transformer.
//
// Note that this is re-implemented here rather than delegating to the
// existing support required by the underlying body because we need to
// retain access to the entire original body on subsequent decode operations
// so we can retain any "dynamic" blocks for types we didn't consume
// on the first pass.
hiddenAttrs map[string]struct{}
hiddenBlocks map[string]hcl.BlockHeaderSchema
}
func (b *expandBody) Content(schema *hcl.BodySchema) (*hcl.BodyContent, hcl.Diagnostics) {
extSchema := b.extendSchema(schema)
rawContent, diags := b.original.Content(extSchema)
blocks, blockDiags := b.expandBlocks(schema, rawContent.Blocks, false)
diags = append(diags, blockDiags...)
attrs := b.prepareAttributes(rawContent.Attributes)
content := &hcl.BodyContent{
Attributes: attrs,
Blocks: blocks,
MissingItemRange: b.original.MissingItemRange(),
}
return content, diags
}
func (b *expandBody) PartialContent(schema *hcl.BodySchema) (*hcl.BodyContent, hcl.Body, hcl.Diagnostics) {
extSchema := b.extendSchema(schema)
rawContent, _, diags := b.original.PartialContent(extSchema)
// We discard the "remain" body returned above because we're going to
// construct our own remain that also takes into account remaining
// "dynamic" blocks.
blocks, blockDiags := b.expandBlocks(schema, rawContent.Blocks, true)
diags = append(diags, blockDiags...)
attrs := b.prepareAttributes(rawContent.Attributes)
content := &hcl.BodyContent{
Attributes: attrs,
Blocks: blocks,
MissingItemRange: b.original.MissingItemRange(),
}
remain := &expandBody{
original: b.original,
forEachCtx: b.forEachCtx,
iteration: b.iteration,
hiddenAttrs: make(map[string]struct{}),
hiddenBlocks: make(map[string]hcl.BlockHeaderSchema),
}
for name := range b.hiddenAttrs {
remain.hiddenAttrs[name] = struct{}{}
}
for typeName, blockS := range b.hiddenBlocks {
remain.hiddenBlocks[typeName] = blockS
}
for _, attrS := range schema.Attributes {
remain.hiddenAttrs[attrS.Name] = struct{}{}
}
for _, blockS := range schema.Blocks {
remain.hiddenBlocks[blockS.Type] = blockS
}
return content, remain, diags
}
func (b *expandBody) extendSchema(schema *hcl.BodySchema) *hcl.BodySchema {
// We augment the requested schema to also include our special "dynamic"
// block type, since then we'll get instances of it interleaved with
// all of the literal child blocks we must also include.
extSchema := &hcl.BodySchema{
Attributes: schema.Attributes,
Blocks: make([]hcl.BlockHeaderSchema, len(schema.Blocks), len(schema.Blocks)+len(b.hiddenBlocks)+1),
}
copy(extSchema.Blocks, schema.Blocks)
extSchema.Blocks = append(extSchema.Blocks, dynamicBlockHeaderSchema)
// If we have any hiddenBlocks then we also need to register those here
// so that a call to "Content" on the underlying body won't fail.
// (We'll filter these out again once we process the result of either
// Content or PartialContent.)
for _, blockS := range b.hiddenBlocks {
extSchema.Blocks = append(extSchema.Blocks, blockS)
}
// If we have any hiddenAttrs then we also need to register these, for
// the same reason as we deal with hiddenBlocks above.
if len(b.hiddenAttrs) != 0 {
newAttrs := make([]hcl.AttributeSchema, len(schema.Attributes), len(schema.Attributes)+len(b.hiddenAttrs))
copy(newAttrs, extSchema.Attributes)
for name := range b.hiddenAttrs {
newAttrs = append(newAttrs, hcl.AttributeSchema{
Name: name,
Required: false,
})
}
extSchema.Attributes = newAttrs
}
return extSchema
}
func (b *expandBody) prepareAttributes(rawAttrs hcl.Attributes) hcl.Attributes {
if len(b.hiddenAttrs) == 0 && b.iteration == nil {
// Easy path: just pass through the attrs from the original body verbatim
return rawAttrs
}
// Otherwise we have some work to do: we must filter out any attributes
// that are hidden (since a previous PartialContent call already saw these)
// and wrap the expressions of the inner attributes so that they will
// have access to our iteration variables.
attrs := make(hcl.Attributes, len(rawAttrs))
for name, rawAttr := range rawAttrs {
if _, hidden := b.hiddenAttrs[name]; hidden {
continue
}
if b.iteration != nil {
attr := *rawAttr // shallow copy so we can mutate it
attr.Expr = exprWrap{
Expression: attr.Expr,
i: b.iteration,
}
attrs[name] = &attr
} else {
// If we have no active iteration then no wrapping is required.
attrs[name] = rawAttr
}
}
return attrs
}
func (b *expandBody) expandBlocks(schema *hcl.BodySchema, rawBlocks hcl.Blocks, partial bool) (hcl.Blocks, hcl.Diagnostics) {
var blocks hcl.Blocks
var diags hcl.Diagnostics
for _, rawBlock := range rawBlocks {
switch rawBlock.Type {
case "dynamic":
realBlockType := rawBlock.Labels[0]
if _, hidden := b.hiddenBlocks[realBlockType]; hidden {
continue
}
var blockS *hcl.BlockHeaderSchema
for _, candidate := range schema.Blocks {
if candidate.Type == realBlockType {
blockS = &candidate
break
}
}
if blockS == nil {
// Not a block type that the caller requested.
if !partial {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Unsupported block type",
Detail: fmt.Sprintf("Blocks of type %q are not expected here.", realBlockType),
Subject: &rawBlock.LabelRanges[0],
})
}
continue
}
spec, specDiags := b.decodeSpec(blockS, rawBlock)
diags = append(diags, specDiags...)
if specDiags.HasErrors() {
continue
}
if spec.forEachVal.IsKnown() {
for it := spec.forEachVal.ElementIterator(); it.Next(); {
key, value := it.Element()
i := b.iteration.MakeChild(spec.iteratorName, key, value)
block, blockDiags := spec.newBlock(i, b.forEachCtx)
diags = append(diags, blockDiags...)
if block != nil {
// Attach our new iteration context so that attributes
// and other nested blocks can refer to our iterator.
block.Body = b.expandChild(block.Body, i)
blocks = append(blocks, block)
}
}
} else {
// If our top-level iteration value isn't known then we're forced
// to compromise since HCL doesn't have any concept of an
// "unknown block". In this case then, we'll produce a single
// dynamic block with the iterator values set to DynamicVal,
// which at least makes the potential for a block visible
// in our result, even though it's not represented in a fully-accurate
// way.
i := b.iteration.MakeChild(spec.iteratorName, cty.DynamicVal, cty.DynamicVal)
block, blockDiags := spec.newBlock(i, b.forEachCtx)
diags = append(diags, blockDiags...)
if block != nil {
block.Body = b.expandChild(block.Body, i)
// We additionally force all of the leaf attribute values
// in the result to be unknown so the calling application
// can, if necessary, use that as a heuristic to detect
// when a single nested block might be standing in for
// multiple blocks yet to be expanded. This retains the
// structure of the generated body but forces all of its
// leaf attribute values to be unknown.
block.Body = unknownBody{block.Body}
blocks = append(blocks, block)
}
}
default:
if _, hidden := b.hiddenBlocks[rawBlock.Type]; !hidden {
// A static block doesn't create a new iteration context, but
// it does need to inherit _our own_ iteration context in
// case it contains expressions that refer to our inherited
// iterators, or nested "dynamic" blocks.
expandedBlock := *rawBlock // shallow copy
expandedBlock.Body = b.expandChild(rawBlock.Body, b.iteration)
blocks = append(blocks, &expandedBlock)
}
}
}
return blocks, diags
}
func (b *expandBody) expandChild(child hcl.Body, i *iteration) hcl.Body {
chiCtx := i.EvalContext(b.forEachCtx)
ret := Expand(child, chiCtx)
ret.(*expandBody).iteration = i
return ret
}
func (b *expandBody) JustAttributes() (hcl.Attributes, hcl.Diagnostics) {
// blocks aren't allowed in JustAttributes mode and this body can
// only produce blocks, so we'll just pass straight through to our
// underlying body here.
return b.original.JustAttributes()
}
func (b *expandBody) MissingItemRange() hcl.Range {
return b.original.MissingItemRange()
}

View File

@ -0,0 +1,383 @@
package dynblock
import (
"testing"
"github.com/hashicorp/hcl2/hcl"
"github.com/hashicorp/hcl2/hcldec"
"github.com/hashicorp/hcl2/hcltest"
"github.com/zclconf/go-cty/cty"
)
func TestExpand(t *testing.T) {
srcBody := hcltest.MockBody(&hcl.BodyContent{
Blocks: hcl.Blocks{
{
Type: "a",
Labels: []string{"static0"},
LabelRanges: []hcl.Range{hcl.Range{}},
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"val": hcltest.MockExprLiteral(cty.StringVal("static a 0")),
}),
}),
},
{
Type: "b",
Body: hcltest.MockBody(&hcl.BodyContent{
Blocks: hcl.Blocks{
{
Type: "c",
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"val0": hcltest.MockExprLiteral(cty.StringVal("static c 0")),
}),
}),
},
{
Type: "dynamic",
Labels: []string{"c"},
LabelRanges: []hcl.Range{hcl.Range{}},
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"for_each": hcltest.MockExprLiteral(cty.ListVal([]cty.Value{
cty.StringVal("dynamic c 0"),
cty.StringVal("dynamic c 1"),
})),
"iterator": hcltest.MockExprVariable("dyn_c"),
}),
Blocks: hcl.Blocks{
{
Type: "content",
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"val0": hcltest.MockExprTraversalSrc("dyn_c.value"),
}),
}),
},
},
}),
},
},
}),
},
{
Type: "dynamic",
Labels: []string{"a"},
LabelRanges: []hcl.Range{hcl.Range{}},
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"for_each": hcltest.MockExprLiteral(cty.ListVal([]cty.Value{
cty.StringVal("dynamic a 0"),
cty.StringVal("dynamic a 1"),
cty.StringVal("dynamic a 2"),
})),
"labels": hcltest.MockExprList([]hcl.Expression{
hcltest.MockExprTraversalSrc("a.key"),
}),
}),
Blocks: hcl.Blocks{
{
Type: "content",
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"val": hcltest.MockExprTraversalSrc("a.value"),
}),
}),
},
},
}),
},
{
Type: "dynamic",
Labels: []string{"b"},
LabelRanges: []hcl.Range{hcl.Range{}},
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"for_each": hcltest.MockExprLiteral(cty.ListVal([]cty.Value{
cty.StringVal("dynamic b 0"),
cty.StringVal("dynamic b 1"),
})),
"iterator": hcltest.MockExprVariable("dyn_b"),
}),
Blocks: hcl.Blocks{
{
Type: "content",
Body: hcltest.MockBody(&hcl.BodyContent{
Blocks: hcl.Blocks{
{
Type: "c",
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"val0": hcltest.MockExprLiteral(cty.StringVal("static c 1")),
"val1": hcltest.MockExprTraversalSrc("dyn_b.value"),
}),
}),
},
{
Type: "dynamic",
Labels: []string{"c"},
LabelRanges: []hcl.Range{hcl.Range{}},
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"for_each": hcltest.MockExprLiteral(cty.ListVal([]cty.Value{
cty.StringVal("dynamic c 2"),
cty.StringVal("dynamic c 3"),
})),
}),
Blocks: hcl.Blocks{
{
Type: "content",
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"val0": hcltest.MockExprTraversalSrc("c.value"),
"val1": hcltest.MockExprTraversalSrc("dyn_b.value"),
}),
}),
},
},
}),
},
},
}),
},
},
}),
},
{
Type: "dynamic",
Labels: []string{"b"},
LabelRanges: []hcl.Range{hcl.Range{}},
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"for_each": hcltest.MockExprLiteral(cty.MapVal(map[string]cty.Value{
"foo": cty.ListVal([]cty.Value{
cty.StringVal("dynamic c nested 0"),
cty.StringVal("dynamic c nested 1"),
}),
})),
"iterator": hcltest.MockExprVariable("dyn_b"),
}),
Blocks: hcl.Blocks{
{
Type: "content",
Body: hcltest.MockBody(&hcl.BodyContent{
Blocks: hcl.Blocks{
{
Type: "dynamic",
Labels: []string{"c"},
LabelRanges: []hcl.Range{hcl.Range{}},
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"for_each": hcltest.MockExprTraversalSrc("dyn_b.value"),
}),
Blocks: hcl.Blocks{
{
Type: "content",
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"val0": hcltest.MockExprTraversalSrc("c.value"),
"val1": hcltest.MockExprTraversalSrc("dyn_b.key"),
}),
}),
},
},
}),
},
},
}),
},
},
}),
},
{
Type: "dynamic",
Labels: []string{"b"},
LabelRanges: []hcl.Range{hcl.Range{}},
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"for_each": hcltest.MockExprLiteral(cty.UnknownVal(cty.Map(cty.String))),
"iterator": hcltest.MockExprVariable("dyn_b"),
}),
Blocks: hcl.Blocks{
{
Type: "content",
Body: hcltest.MockBody(&hcl.BodyContent{
Blocks: hcl.Blocks{
{
Type: "dynamic",
Labels: []string{"c"},
LabelRanges: []hcl.Range{hcl.Range{}},
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"for_each": hcltest.MockExprTraversalSrc("dyn_b.value"),
}),
Blocks: hcl.Blocks{
{
Type: "content",
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"val0": hcltest.MockExprTraversalSrc("c.value"),
"val1": hcltest.MockExprTraversalSrc("dyn_b.key"),
}),
}),
},
},
}),
},
},
}),
},
},
}),
},
{
Type: "a",
Labels: []string{"static1"},
LabelRanges: []hcl.Range{hcl.Range{}},
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"val": hcltest.MockExprLiteral(cty.StringVal("static a 1")),
}),
}),
},
},
})
dynBody := Expand(srcBody, nil)
var remain hcl.Body
t.Run("PartialDecode", func(t *testing.T) {
decSpec := &hcldec.BlockMapSpec{
TypeName: "a",
LabelNames: []string{"key"},
Nested: &hcldec.AttrSpec{
Name: "val",
Type: cty.String,
Required: true,
},
}
var got cty.Value
var diags hcl.Diagnostics
got, remain, diags = hcldec.PartialDecode(dynBody, decSpec, nil)
if len(diags) != 0 {
t.Errorf("unexpected diagnostics")
for _, diag := range diags {
t.Logf("- %s", diag)
}
return
}
want := cty.MapVal(map[string]cty.Value{
"static0": cty.StringVal("static a 0"),
"static1": cty.StringVal("static a 1"),
"0": cty.StringVal("dynamic a 0"),
"1": cty.StringVal("dynamic a 1"),
"2": cty.StringVal("dynamic a 2"),
})
if !got.RawEquals(want) {
t.Errorf("wrong result\ngot: %#v\nwant: %#v", got, want)
}
})
t.Run("Decode", func(t *testing.T) {
decSpec := &hcldec.BlockListSpec{
TypeName: "b",
Nested: &hcldec.BlockListSpec{
TypeName: "c",
Nested: &hcldec.ObjectSpec{
"val0": &hcldec.AttrSpec{
Name: "val0",
Type: cty.String,
},
"val1": &hcldec.AttrSpec{
Name: "val1",
Type: cty.String,
},
},
},
}
var got cty.Value
var diags hcl.Diagnostics
got, diags = hcldec.Decode(remain, decSpec, nil)
if len(diags) != 0 {
t.Errorf("unexpected diagnostics")
for _, diag := range diags {
t.Logf("- %s", diag)
}
return
}
want := cty.ListVal([]cty.Value{
cty.ListVal([]cty.Value{
cty.ObjectVal(map[string]cty.Value{
"val0": cty.StringVal("static c 0"),
"val1": cty.NullVal(cty.String),
}),
cty.ObjectVal(map[string]cty.Value{
"val0": cty.StringVal("dynamic c 0"),
"val1": cty.NullVal(cty.String),
}),
cty.ObjectVal(map[string]cty.Value{
"val0": cty.StringVal("dynamic c 1"),
"val1": cty.NullVal(cty.String),
}),
}),
cty.ListVal([]cty.Value{
cty.ObjectVal(map[string]cty.Value{
"val0": cty.StringVal("static c 1"),
"val1": cty.StringVal("dynamic b 0"),
}),
cty.ObjectVal(map[string]cty.Value{
"val0": cty.StringVal("dynamic c 2"),
"val1": cty.StringVal("dynamic b 0"),
}),
cty.ObjectVal(map[string]cty.Value{
"val0": cty.StringVal("dynamic c 3"),
"val1": cty.StringVal("dynamic b 0"),
}),
}),
cty.ListVal([]cty.Value{
cty.ObjectVal(map[string]cty.Value{
"val0": cty.StringVal("static c 1"),
"val1": cty.StringVal("dynamic b 1"),
}),
cty.ObjectVal(map[string]cty.Value{
"val0": cty.StringVal("dynamic c 2"),
"val1": cty.StringVal("dynamic b 1"),
}),
cty.ObjectVal(map[string]cty.Value{
"val0": cty.StringVal("dynamic c 3"),
"val1": cty.StringVal("dynamic b 1"),
}),
}),
cty.ListVal([]cty.Value{
cty.ObjectVal(map[string]cty.Value{
"val0": cty.StringVal("dynamic c nested 0"),
"val1": cty.StringVal("foo"),
}),
cty.ObjectVal(map[string]cty.Value{
"val0": cty.StringVal("dynamic c nested 1"),
"val1": cty.StringVal("foo"),
}),
}),
cty.ListVal([]cty.Value{
// This one comes from a dynamic block with an unknown for_each
// value, so we produce a single block object with all of the
// leaf attribute values set to unknown values.
cty.ObjectVal(map[string]cty.Value{
"val0": cty.UnknownVal(cty.String),
"val1": cty.UnknownVal(cty.String),
}),
}),
})
if !got.RawEquals(want) {
t.Errorf("wrong result\ngot: %#v\nwant: %#v", got, want)
}
})
}

215
ext/dynblock/expand_spec.go Normal file
View File

@ -0,0 +1,215 @@
package dynblock
import (
"fmt"
"github.com/hashicorp/hcl2/hcl"
"github.com/zclconf/go-cty/cty"
"github.com/zclconf/go-cty/cty/convert"
)
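// expandSpec captures the decoded arguments of a single "dynamic" block:
// the real block type it generates, the evaluated for_each value, the
// iterator name, any label expressions, and the content body used to
// produce each generated block.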
type expandSpec struct {
blockType string
blockTypeRange hcl.Range
defRange hcl.Range
forEachVal cty.Value
iteratorName string
labelExprs []hcl.Expression
contentBody hcl.Body
inherited map[string]*iteration
}
func (b *expandBody) decodeSpec(blockS *hcl.BlockHeaderSchema, rawSpec *hcl.Block) (*expandSpec, hcl.Diagnostics) {
var diags hcl.Diagnostics
var schema *hcl.BodySchema
if len(blockS.LabelNames) != 0 {
schema = dynamicBlockBodySchemaLabels
} else {
schema = dynamicBlockBodySchemaNoLabels
}
specContent, specDiags := rawSpec.Body.Content(schema)
diags = append(diags, specDiags...)
if specDiags.HasErrors() {
return nil, diags
}
//// for_each attribute
eachAttr := specContent.Attributes["for_each"]
eachVal, eachDiags := eachAttr.Expr.Value(b.forEachCtx)
diags = append(diags, eachDiags...)
if !eachVal.CanIterateElements() && eachVal.Type() != cty.DynamicPseudoType {
// We skip this error for DynamicPseudoType because that means we either
// have a null (which is checked immediately below) or an unknown
// (which is handled in the expandBody Content methods).
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Invalid dynamic for_each value",
Detail: fmt.Sprintf("Cannot use a %s value in for_each. An iterable collection is required.", eachVal.Type().FriendlyName()),
Subject: eachAttr.Expr.Range().Ptr(),
Expression: eachAttr.Expr,
EvalContext: b.forEachCtx,
})
return nil, diags
}
if eachVal.IsNull() {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Invalid dynamic for_each value",
Detail: "Cannot use a null value in for_each.",
Subject: eachAttr.Expr.Range().Ptr(),
Expression: eachAttr.Expr,
EvalContext: b.forEachCtx,
})
return nil, diags
}
//// iterator attribute
iteratorName := blockS.Type
if iteratorAttr := specContent.Attributes["iterator"]; iteratorAttr != nil {
itTraversal, itDiags := hcl.AbsTraversalForExpr(iteratorAttr.Expr)
diags = append(diags, itDiags...)
if itDiags.HasErrors() {
return nil, diags
}
if len(itTraversal) != 1 {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Invalid dynamic iterator name",
Detail: "Dynamic iterator must be a single variable name.",
Subject: itTraversal.SourceRange().Ptr(),
})
return nil, diags
}
iteratorName = itTraversal.RootName()
}
var labelExprs []hcl.Expression
if labelsAttr := specContent.Attributes["labels"]; labelsAttr != nil {
var labelDiags hcl.Diagnostics
labelExprs, labelDiags = hcl.ExprList(labelsAttr.Expr)
diags = append(diags, labelDiags...)
if labelDiags.HasErrors() {
return nil, diags
}
if len(labelExprs) > len(blockS.LabelNames) {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Extraneous dynamic block label",
Detail: fmt.Sprintf("Blocks of type %q require %d label(s).", blockS.Type, len(blockS.LabelNames)),
Subject: labelExprs[len(blockS.LabelNames)].Range().Ptr(),
})
return nil, diags
} else if len(labelExprs) < len(blockS.LabelNames) {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Insufficient dynamic block labels",
Detail: fmt.Sprintf("Blocks of type %q require %d label(s).", blockS.Type, len(blockS.LabelNames)),
Subject: labelsAttr.Expr.Range().Ptr(),
})
return nil, diags
}
}
// Since our schema requests only blocks of type "content", we can assume
// that all entries in specContent.Blocks are content blocks.
if len(specContent.Blocks) == 0 {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Missing dynamic content block",
Detail: "A dynamic block must have a nested block of type \"content\" to describe the body of each generated block.",
Subject: &specContent.MissingItemRange,
})
return nil, diags
}
if len(specContent.Blocks) > 1 {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Extraneous dynamic content block",
Detail: "Only one nested content block is allowed for each dynamic block.",
Subject: &specContent.Blocks[1].DefRange,
})
return nil, diags
}
return &expandSpec{
blockType: blockS.Type,
blockTypeRange: rawSpec.LabelRanges[0],
defRange: rawSpec.DefRange,
forEachVal: eachVal,
iteratorName: iteratorName,
labelExprs: labelExprs,
contentBody: specContent.Blocks[0].Body,
}, diags
}
func (s *expandSpec) newBlock(i *iteration, ctx *hcl.EvalContext) (*hcl.Block, hcl.Diagnostics) {
var diags hcl.Diagnostics
var labels []string
var labelRanges []hcl.Range
lCtx := i.EvalContext(ctx)
for _, labelExpr := range s.labelExprs {
labelVal, labelDiags := labelExpr.Value(lCtx)
diags = append(diags, labelDiags...)
if labelDiags.HasErrors() {
return nil, diags
}
var convErr error
labelVal, convErr = convert.Convert(labelVal, cty.String)
if convErr != nil {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Invalid dynamic block label",
Detail: fmt.Sprintf("Cannot use this value as a dynamic block label: %s.", convErr),
Subject: labelExpr.Range().Ptr(),
Expression: labelExpr,
EvalContext: lCtx,
})
return nil, diags
}
if labelVal.IsNull() {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Invalid dynamic block label",
Detail: "Cannot use a null value as a dynamic block label.",
Subject: labelExpr.Range().Ptr(),
Expression: labelExpr,
EvalContext: lCtx,
})
return nil, diags
}
if !labelVal.IsKnown() {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Invalid dynamic block label",
Detail: "This value is not yet known. Dynamic block labels must be immediately-known values.",
Subject: labelExpr.Range().Ptr(),
Expression: labelExpr,
EvalContext: lCtx,
})
return nil, diags
}
labels = append(labels, labelVal.AsString())
labelRanges = append(labelRanges, labelExpr.Range())
}
block := &hcl.Block{
Type: s.blockType,
TypeRange: s.blockTypeRange,
Labels: labels,
LabelRanges: labelRanges,
DefRange: s.defRange,
Body: s.contentBody,
}
return block, diags
}

42
ext/dynblock/expr_wrap.go Normal file
View File

@ -0,0 +1,42 @@
package dynblock
import (
"github.com/hashicorp/hcl2/hcl"
"github.com/zclconf/go-cty/cty"
)
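// exprWrap wraps an expression found inside a dynamic block's content so
// that evaluation and variable analysis take the enclosing iteration's
// iterator variables into account.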
type exprWrap struct {
hcl.Expression
i *iteration
}
func (e exprWrap) Variables() []hcl.Traversal {
raw := e.Expression.Variables()
ret := make([]hcl.Traversal, 0, len(raw))
// Filter out traversals that refer to our iterator name or any
// iterator we've inherited; we're going to provide those in
// our Value wrapper, so the caller doesn't need to know about them.
for _, traversal := range raw {
rootName := traversal.RootName()
if rootName == e.i.IteratorName {
continue
}
if _, inherited := e.i.Inherited[rootName]; inherited {
continue
}
ret = append(ret, traversal)
}
return ret
}
func (e exprWrap) Value(ctx *hcl.EvalContext) (cty.Value, hcl.Diagnostics) {
extCtx := e.i.EvalContext(ctx)
return e.Expression.Value(extCtx)
}
// UnwrapExpression returns the expression being wrapped by this instance.
// This allows the original expression to be recovered by hcl.UnwrapExpression.
func (e exprWrap) UnwrapExpression() hcl.Expression {
return e.Expression
}

66
ext/dynblock/iteration.go Normal file
View File

@ -0,0 +1,66 @@
package dynblock
import (
"github.com/hashicorp/hcl2/hcl"
"github.com/zclconf/go-cty/cty"
)
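// iteration represents one iteration of a dynamic block: the iterator name,
// the current key and value, and any iterations inherited from enclosing
// dynamic blocks.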
type iteration struct {
IteratorName string
Key cty.Value
Value cty.Value
Inherited map[string]*iteration
}
func (s *expandSpec) MakeIteration(key, value cty.Value) *iteration {
return &iteration{
IteratorName: s.iteratorName,
Key: key,
Value: value,
Inherited: s.inherited,
}
}
func (i *iteration) Object() cty.Value {
return cty.ObjectVal(map[string]cty.Value{
"key": i.Key,
"value": i.Value,
})
}
func (i *iteration) EvalContext(base *hcl.EvalContext) *hcl.EvalContext {
new := base.NewChild()
if i != nil {
new.Variables = map[string]cty.Value{}
for name, otherIt := range i.Inherited {
new.Variables[name] = otherIt.Object()
}
new.Variables[i.IteratorName] = i.Object()
}
return new
}
func (i *iteration) MakeChild(iteratorName string, key, value cty.Value) *iteration {
if i == nil {
// Create an entirely new root iteration.
return &iteration{
IteratorName: iteratorName,
Key: key,
Value: value,
}
}
inherited := map[string]*iteration{}
for name, otherIt := range i.Inherited {
inherited[name] = otherIt
}
inherited[i.IteratorName] = i
return &iteration{
IteratorName: iteratorName,
Key: key,
Value: value,
Inherited: inherited,
}
}

44
ext/dynblock/public.go Normal file
View File

@ -0,0 +1,44 @@
package dynblock
import (
"github.com/hashicorp/hcl2/hcl"
)
// Expand "dynamic" blocks in the given body, returning a new body that
// has those blocks expanded.
//
// The given EvalContext is used when evaluating "for_each" and "labels"
// attributes within dynamic blocks, allowing those expressions access to
// variables and functions beyond the iterator variable created by the
// iteration.
//
// Expand returns no diagnostics because no blocks are actually expanded
// until a call to Content or PartialContent on the returned body, which
// will then expand only the blocks selected by the schema.
//
// "dynamic" blocks are also expanded automatically within nested blocks
// in the given body, including within other dynamic blocks, thus allowing
// multi-dimensional iteration. However, it is not possible to
// dynamically generate the "dynamic" blocks themselves except through nesting.
//
// parent {
// dynamic "child" {
// for_each = child_objs
// content {
// dynamic "grandchild" {
// for_each = child.value.children
// labels = [grandchild.key]
// content {
// parent_key = child.key
// value = grandchild.value
// }
// }
// }
// }
// }
func Expand(body hcl.Body, ctx *hcl.EvalContext) hcl.Body {
return &expandBody{
original: body,
forEachCtx: ctx,
}
}

50
ext/dynblock/schema.go Normal file
View File

@ -0,0 +1,50 @@
package dynblock
import "github.com/hashicorp/hcl2/hcl"
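// dynamicBlockHeaderSchema describes the "dynamic" block type itself as it
// appears within a parent body. The two body schemas below describe the
// content of a "dynamic" block, with and without the "labels" argument.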
var dynamicBlockHeaderSchema = hcl.BlockHeaderSchema{
Type: "dynamic",
LabelNames: []string{"type"},
}
var dynamicBlockBodySchemaLabels = &hcl.BodySchema{
Attributes: []hcl.AttributeSchema{
{
Name: "for_each",
Required: true,
},
{
Name: "iterator",
Required: false,
},
{
Name: "labels",
Required: true,
},
},
Blocks: []hcl.BlockHeaderSchema{
{
Type: "content",
LabelNames: nil,
},
},
}
var dynamicBlockBodySchemaNoLabels = &hcl.BodySchema{
Attributes: []hcl.AttributeSchema{
{
Name: "for_each",
Required: true,
},
{
Name: "iterator",
Required: false,
},
},
Blocks: []hcl.BlockHeaderSchema{
{
Type: "content",
LabelNames: nil,
},
},
}

View File

@ -0,0 +1,84 @@
package dynblock
import (
"github.com/hashicorp/hcl2/hcl"
"github.com/zclconf/go-cty/cty"
)
// unknownBody is a funny body that just reports everything inside it as
// unknown. It uses a given other body as a sort of template for what attributes
// and blocks are inside -- including source location information -- but
// substitutes unknown values of unknown type for all attributes.
//
// This rather odd process is used to handle expansion of dynamic blocks whose
// for_each expression is unknown. Since a block cannot itself be unknown,
// we instead arrange for everything _inside_ the block to be unknown instead,
// to give the best possible approximation.
type unknownBody struct {
template hcl.Body
}
var _ hcl.Body = unknownBody{}
func (b unknownBody) Content(schema *hcl.BodySchema) (*hcl.BodyContent, hcl.Diagnostics) {
content, diags := b.template.Content(schema)
content = b.fixupContent(content)
// We're intentionally preserving the diagnostics reported from the
// inner body so that we can still report where the template body doesn't
// match the requested schema.
return content, diags
}
func (b unknownBody) PartialContent(schema *hcl.BodySchema) (*hcl.BodyContent, hcl.Body, hcl.Diagnostics) {
content, remain, diags := b.template.PartialContent(schema)
content = b.fixupContent(content)
remain = unknownBody{remain} // remaining content must also be wrapped
// We're intentionally preserving the diagnostics reported from the
// inner body so that we can still report where the template body doesn't
// match the requested schema.
return content, remain, diags
}
func (b unknownBody) JustAttributes() (hcl.Attributes, hcl.Diagnostics) {
attrs, diags := b.template.JustAttributes()
attrs = b.fixupAttrs(attrs)
// We're intentionally preserving the diagnostics reported from the
// inner body so that we can still report where the template body doesn't
// match the requested schema.
return attrs, diags
}
func (b unknownBody) MissingItemRange() hcl.Range {
return b.template.MissingItemRange()
}
func (b unknownBody) fixupContent(got *hcl.BodyContent) *hcl.BodyContent {
ret := &hcl.BodyContent{}
ret.Attributes = b.fixupAttrs(got.Attributes)
if len(got.Blocks) > 0 {
ret.Blocks = make(hcl.Blocks, 0, len(got.Blocks))
for _, gotBlock := range got.Blocks {
new := *gotBlock // shallow copy
new.Body = unknownBody{gotBlock.Body} // nested content must also be marked unknown
ret.Blocks = append(ret.Blocks, &new)
}
}
return ret
}
func (b unknownBody) fixupAttrs(got hcl.Attributes) hcl.Attributes {
if len(got) == 0 {
return nil
}
ret := make(hcl.Attributes, len(got))
for name, gotAttr := range got {
new := *gotAttr // shallow copy
new.Expr = hcl.StaticExpr(cty.DynamicVal, gotAttr.Expr.Range())
ret[name] = &new
}
return ret
}

209
ext/dynblock/variables.go Normal file
View File

@ -0,0 +1,209 @@
package dynblock
import (
"github.com/hashicorp/hcl2/hcl"
"github.com/zclconf/go-cty/cty"
)
// WalkVariables begins the recursive process of walking all expressions and
// nested blocks in the given body and its child bodies while taking into
// account any "dynamic" blocks.
//
// This function requires that the caller walk through the nested block
// structure in the given body level-by-level so that an appropriate schema
// can be provided at each level to inform further processing. This workflow
// is thus easiest to use for calling applications that have some higher-level
// schema representation available with which to drive this multi-step
// process. If your application uses the hcldec package, you may be able to
// use VariablesHCLDec instead for a more automatic approach.
func WalkVariables(body hcl.Body) WalkVariablesNode {
return WalkVariablesNode{
body: body,
includeContent: true,
}
}
// WalkExpandVariables is like WalkVariables but it includes only the variables
// required for successful block expansion, ignoring any variables referenced
// inside block contents. The result is the minimal set of all variables
// required for a call to Expand, excluding variables that would only be
// needed to subsequently call Content or PartialContent on the expanded
// body.
func WalkExpandVariables(body hcl.Body) WalkVariablesNode {
return WalkVariablesNode{
body: body,
}
}
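// WalkVariablesNode represents one body to visit during a variables walk,
// along with the iteration context inherited from any enclosing "dynamic"
// blocks.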
type WalkVariablesNode struct {
body hcl.Body
it *iteration
includeContent bool
}
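// WalkVariablesChild pairs a nested block type name with the node that must
// be visited (with an appropriate schema) to continue the walk into that
// block's body.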
type WalkVariablesChild struct {
BlockTypeName string
Node WalkVariablesNode
}
// Body returns the HCL Body associated with the child node, in case the caller
// wants to do some sort of inspection of it in order to decide what schema
// to pass to Visit.
//
// Most implementations should just fetch a fixed schema based on the
// BlockTypeName field and not access this. Deciding on a schema dynamically
// based on the body is a strange thing to do and generally necessary only if
// your caller is already doing other bizarre things with HCL bodies.
func (c WalkVariablesChild) Body() hcl.Body {
return c.Node.body
}
// Visit returns the variable traversals required for any "dynamic" blocks
// directly in the body associated with this node, and also returns any child
// nodes that must be visited in order to continue the walk.
//
// Each child node has its associated block type name given in its BlockTypeName
// field, which the calling application should use to determine the appropriate
// schema for the content of each child node and pass it to the child node's
// own Visit method to continue the walk recursively.
func (n WalkVariablesNode) Visit(schema *hcl.BodySchema) (vars []hcl.Traversal, children []WalkVariablesChild) {
extSchema := n.extendSchema(schema)
container, _, _ := n.body.PartialContent(extSchema)
if container == nil {
return vars, children
}
children = make([]WalkVariablesChild, 0, len(container.Blocks))
if n.includeContent {
for _, attr := range container.Attributes {
for _, traversal := range attr.Expr.Variables() {
var ours, inherited bool
if n.it != nil {
ours = traversal.RootName() == n.it.IteratorName
_, inherited = n.it.Inherited[traversal.RootName()]
}
if !(ours || inherited) {
vars = append(vars, traversal)
}
}
}
}
for _, block := range container.Blocks {
switch block.Type {
case "dynamic":
blockTypeName := block.Labels[0]
inner, _, _ := block.Body.PartialContent(variableDetectionInnerSchema)
if inner == nil {
continue
}
iteratorName := blockTypeName
if attr, exists := inner.Attributes["iterator"]; exists {
iterTraversal, _ := hcl.AbsTraversalForExpr(attr.Expr)
if len(iterTraversal) == 0 {
// Ignore this invalid dynamic block, since it'll produce
// an error if someone tries to extract content from it
// later anyway.
continue
}
iteratorName = iterTraversal.RootName()
}
blockIt := n.it.MakeChild(iteratorName, cty.DynamicVal, cty.DynamicVal)
if attr, exists := inner.Attributes["for_each"]; exists {
// Filter out iterator names inherited from parent blocks
for _, traversal := range attr.Expr.Variables() {
if _, inherited := blockIt.Inherited[traversal.RootName()]; !inherited {
vars = append(vars, traversal)
}
}
}
if attr, exists := inner.Attributes["labels"]; exists {
// Filter out both our own iterator name _and_ those inherited
// from parent blocks, since we provide _both_ of these to the
// label expressions.
for _, traversal := range attr.Expr.Variables() {
ours := traversal.RootName() == iteratorName
_, inherited := blockIt.Inherited[traversal.RootName()]
if !(ours || inherited) {
vars = append(vars, traversal)
}
}
}
for _, contentBlock := range inner.Blocks {
// We only request "content" blocks in our schema, so we know
// any blocks we find here will be content blocks. We require
// exactly one content block for actual expansion, but we'll
// be more liberal here so that callers can still collect
// variables from erroneous "dynamic" blocks.
children = append(children, WalkVariablesChild{
BlockTypeName: blockTypeName,
Node: WalkVariablesNode{
body: contentBlock.Body,
it: blockIt,
includeContent: n.includeContent,
},
})
}
default:
children = append(children, WalkVariablesChild{
BlockTypeName: block.Type,
Node: WalkVariablesNode{
body: block.Body,
it: n.it,
includeContent: n.includeContent,
},
})
}
}
return vars, children
}
func (n WalkVariablesNode) extendSchema(schema *hcl.BodySchema) *hcl.BodySchema {
// We augment the requested schema to also include our special "dynamic"
// block type, since then we'll get instances of it interleaved with
// all of the literal child blocks we must also include.
extSchema := &hcl.BodySchema{
Attributes: schema.Attributes,
Blocks: make([]hcl.BlockHeaderSchema, len(schema.Blocks), len(schema.Blocks)+1),
}
copy(extSchema.Blocks, schema.Blocks)
extSchema.Blocks = append(extSchema.Blocks, dynamicBlockHeaderSchema)
return extSchema
}
// This is a more relaxed schema than what's in schema.go, since we
// want to maximize the number of variables we can find even if there
// are erroneous blocks.
var variableDetectionInnerSchema = &hcl.BodySchema{
Attributes: []hcl.AttributeSchema{
{
Name: "for_each",
Required: false,
},
{
Name: "labels",
Required: false,
},
{
Name: "iterator",
Required: false,
},
},
Blocks: []hcl.BlockHeaderSchema{
{
Type: "content",
},
},
}

View File

@ -0,0 +1,43 @@
package dynblock
import (
"github.com/hashicorp/hcl2/hcl"
"github.com/hashicorp/hcl2/hcldec"
)
// VariablesHCLDec is a wrapper around WalkVariables that uses the given hcldec
// specification to automatically drive the recursive walk through nested
// blocks in the given body.
//
// This is a drop-in replacement for hcldec.Variables which is able to treat
// blocks of type "dynamic" in the same special way that dynblock.Expand would,
// exposing both the variables referenced in the "for_each" and "labels"
// arguments and variables used in the nested "content" block.
func VariablesHCLDec(body hcl.Body, spec hcldec.Spec) []hcl.Traversal {
rootNode := WalkVariables(body)
return walkVariablesWithHCLDec(rootNode, spec)
}
// ExpandVariablesHCLDec is like VariablesHCLDec but it includes only the
// minimal set of variables required to call Expand, ignoring variables that
// are referenced only inside normal block contents. See WalkExpandVariables
// for more information.
func ExpandVariablesHCLDec(body hcl.Body, spec hcldec.Spec) []hcl.Traversal {
rootNode := WalkExpandVariables(body)
return walkVariablesWithHCLDec(rootNode, spec)
}
func walkVariablesWithHCLDec(node WalkVariablesNode, spec hcldec.Spec) []hcl.Traversal {
vars, children := node.Visit(hcldec.ImpliedSchema(spec))
if len(children) > 0 {
childSpecs := hcldec.ChildBlockTypes(spec)
for _, child := range children {
if childSpec, exists := childSpecs[child.BlockTypeName]; exists {
vars = append(vars, walkVariablesWithHCLDec(child.Node, childSpec)...)
}
}
}
return vars
}

View File

@ -0,0 +1,155 @@
package dynblock
import (
"reflect"
"testing"
"github.com/hashicorp/hcl2/hcldec"
"github.com/zclconf/go-cty/cty"
"github.com/davecgh/go-spew/spew"
"github.com/hashicorp/hcl2/hcl"
"github.com/hashicorp/hcl2/hcl/hclsyntax"
)
func TestVariables(t *testing.T) {
const src = `
# We have some references to things inside the "val" attribute inside each
# of our "b" blocks, which should be included in the result of WalkVariables
# but not WalkExpandVariables.
a {
dynamic "b" {
for_each = [for i, v in some_list_0: "${i}=${v},${baz}"]
labels = ["${b.value} ${something_else_0}"]
content {
val = "${b.value} ${something_else_1}"
}
}
}
dynamic "a" {
for_each = some_list_1
content {
b "foo" {
val = "${a.value} ${something_else_2}"
}
dynamic "b" {
for_each = some_list_2
iterator = dyn_b
labels = ["${a.value} ${dyn_b.value} ${b} ${something_else_3}"]
content {
val = "${a.value} ${dyn_b.value} ${something_else_4}"
}
}
}
}
dynamic "a" {
for_each = some_list_3
iterator = dyn_a
content {
b "foo" {
val = "${dyn_a.value} ${something_else_5}"
}
dynamic "b" {
for_each = some_list_4
labels = ["${dyn_a.value} ${b.value} ${a} ${something_else_6}"]
content {
val = "${dyn_a.value} ${b.value} ${something_else_7}"
}
}
}
}
`
f, diags := hclsyntax.ParseConfig([]byte(src), "", hcl.Pos{})
if len(diags) != 0 {
t.Errorf("unexpected diagnostics during parse")
for _, diag := range diags {
t.Logf("- %s", diag)
}
return
}
spec := &hcldec.BlockListSpec{
TypeName: "a",
Nested: &hcldec.BlockMapSpec{
TypeName: "b",
LabelNames: []string{"key"},
Nested: &hcldec.AttrSpec{
Name: "val",
Type: cty.String,
},
},
}
t.Run("WalkVariables", func(t *testing.T) {
traversals := VariablesHCLDec(f.Body, spec)
got := make([]string, len(traversals))
for i, traversal := range traversals {
got[i] = traversal.RootName()
}
// The block structure is traversed one level at a time, so the ordering
// here reflects first a pass of the root, then the first child
// under the root, then the first child under that, etc.
want := []string{
"some_list_1",
"some_list_3",
"some_list_0",
"baz",
"something_else_0",
"something_else_1", // Would not be included for WalkExpandVariables because it only appears in content
"some_list_2",
"b", // This is correct because it is referenced in a context where the iterator is overridden to be dyn_b
"something_else_3",
"something_else_2", // Would not be included for WalkExpandVariables because it only appears in content
"something_else_4", // Would not be included for WalkExpandVariables because it only appears in content
"some_list_4",
"a", // This is correct because it is referenced in a context where the iterator is overridden to be dyn_a
"something_else_6",
"something_else_5", // Would not be included for WalkExpandVariables because it only appears in content
"something_else_7", // Would not be included for WalkExpandVariables because it only appears in content
}
if !reflect.DeepEqual(got, want) {
t.Errorf("wrong result\ngot: %swant: %s", spew.Sdump(got), spew.Sdump(want))
}
})
t.Run("WalkExpandVariables", func(t *testing.T) {
traversals := ExpandVariablesHCLDec(f.Body, spec)
got := make([]string, len(traversals))
for i, traversal := range traversals {
got[i] = traversal.RootName()
}
// The block structure is traversed one level at a time, so the ordering
// here reflects first a pass of the root, then the first child
// under the root, then the first child under that, etc.
want := []string{
"some_list_1",
"some_list_3",
"some_list_0",
"baz",
"something_else_0",
"some_list_2",
"b", // This is correct because it is referenced in a context where the iterator is overridden to be dyn_b
"something_else_3",
"some_list_4",
"a", // This is correct because it is referenced in a context where the iterator is overridden to be dyn_a
"something_else_6",
}
if !reflect.DeepEqual(got, want) {
t.Errorf("wrong result\ngot: %swant: %s", spew.Sdump(got), spew.Sdump(want))
}
})
}

12
ext/include/doc.go Normal file
View File

@ -0,0 +1,12 @@
// Package include implements an HCL extension that allows inclusion of
// one HCL body into another using blocks of type "include", with the following
// structure:
//
// include {
// path = "./foo.hcl"
// }
//
// The processing of the given path is delegated to the calling application,
// allowing it to decide how to interpret the path and which syntaxes to
// support for referenced files.
package include

View File

@ -0,0 +1,52 @@
package include
import (
"path/filepath"
"strings"
"github.com/hashicorp/hcl2/hcl"
"github.com/hashicorp/hcl2/hclparse"
)
// FileResolver creates and returns a Resolver that interprets include paths
// as filesystem paths relative to the calling configuration file.
//
// When an include is requested, the source filename of the calling config
// file is first interpreted relative to the given basePath, and then the
// path given in configuration is interpreted relative to the resulting
// absolute caller configuration directory.
//
// This resolver assumes that all calling bodies are loaded from local files
// and that the paths to these files were correctly provided to the parser,
// either absolute or relative to the given basePath.
//
// If the path given in configuration ends with ".json" then the referenced
// file is interpreted as JSON. Otherwise, it is interpreted as HCL native
// syntax.
func FileResolver(baseDir string, parser *hclparse.Parser) Resolver {
return &fileResolver{
BaseDir: baseDir,
Parser: parser,
}
}
type fileResolver struct {
BaseDir string
Parser *hclparse.Parser
}
func (r fileResolver) ResolveBodyPath(path string, refRange hcl.Range) (hcl.Body, hcl.Diagnostics) {
callerFile := filepath.Join(r.BaseDir, refRange.Filename)
callerDir := filepath.Dir(callerFile)
targetFile := filepath.Join(callerDir, path)
var f *hcl.File
var diags hcl.Diagnostics
if strings.HasSuffix(targetFile, ".json") {
f, diags = r.Parser.ParseJSONFile(targetFile)
} else {
f, diags = r.Parser.ParseHCLFile(targetFile)
}
return f.Body, diags
}

View File

@ -0,0 +1,29 @@
package include
import (
"fmt"
"github.com/hashicorp/hcl2/hcl"
)
// MapResolver returns a Resolver that consults the given map for preloaded
// bodies (the values) associated with static include paths (the keys).
//
// An error diagnostic is returned if a path is requested that does not appear
// as a key in the given map.
func MapResolver(m map[string]hcl.Body) Resolver {
return ResolverFunc(func(path string, refRange hcl.Range) (hcl.Body, hcl.Diagnostics) {
if body, ok := m[path]; ok {
return body, nil
}
return nil, hcl.Diagnostics{
{
Severity: hcl.DiagError,
Summary: "Invalid include path",
Detail: fmt.Sprintf("The include path %q is not recognized.", path),
Subject: &refRange,
},
}
})
}

28
ext/include/resolver.go Normal file
View File

@ -0,0 +1,28 @@
package include
import (
"github.com/hashicorp/hcl2/hcl"
)
// A Resolver maps an include path (an arbitrary string, but usually something
// filepath-like) to a hcl.Body.
//
// The parameter "refRange" is the source range of the expression in the calling
// body that provided the given path, for use in generating "invalid path"-type
// diagnostics.
//
// If the returned body is nil, it will be ignored.
//
// Any returned diagnostics will be emitted when content is requested from the
// final composed body (after all includes have been dealt with).
type Resolver interface {
ResolveBodyPath(path string, refRange hcl.Range) (hcl.Body, hcl.Diagnostics)
}
// ResolverFunc is a function type that implements Resolver.
type ResolverFunc func(path string, refRange hcl.Range) (hcl.Body, hcl.Diagnostics)
// ResolveBodyPath is an implementation of Resolver.ResolveBodyPath.
func (f ResolverFunc) ResolveBodyPath(path string, refRange hcl.Range) (hcl.Body, hcl.Diagnostics) {
return f(path, refRange)
}

View File

@ -0,0 +1,92 @@
package include
import (
"github.com/hashicorp/hcl2/ext/transform"
"github.com/hashicorp/hcl2/gohcl"
"github.com/hashicorp/hcl2/hcl"
)
// Transformer builds a transformer that finds any "include" blocks in a body
// and produces a merged body that contains the original content plus the
// content of the other bodies referenced by the include blocks.
//
// blockType specifies the type of block to interpret. The conventional type name
// is "include".
//
// ctx provides an evaluation context for the path expressions in include blocks.
// If nil, path expressions may not reference variables nor functions.
//
// The given resolver is used to translate path strings (after expression
// evaluation) into bodies. FileResolver returns a reasonable implementation for
// applications that read configuration files from local disk.
//
// The returned Transformer can either be used directly to process includes
// in a shallow fashion on a single body, or it can be used with
// transform.Deep (from the sibling transform package) to allow includes
// at all levels of a nested block structure:
//
// transformer = include.Transformer("include", nil, include.FileResolver(".", parser))
// body = transform.Deep(body, transformer)
// // "body" will now have includes resolved in its own content and that
// // of any descendent blocks.
//
func Transformer(blockType string, ctx *hcl.EvalContext, resolver Resolver) transform.Transformer {
return &transformer{
Schema: &hcl.BodySchema{
Blocks: []hcl.BlockHeaderSchema{
{
Type: blockType,
},
},
},
Ctx: ctx,
Resolver: resolver,
}
}
type transformer struct {
Schema *hcl.BodySchema
Ctx *hcl.EvalContext
Resolver Resolver
}
func (t *transformer) TransformBody(in hcl.Body) hcl.Body {
content, remain, diags := in.PartialContent(t.Schema)
if content == nil || len(content.Blocks) == 0 {
// Nothing to do!
return transform.BodyWithDiagnostics(remain, diags)
}
bodies := make([]hcl.Body, 1, len(content.Blocks)+1)
bodies[0] = remain // content in "remain" takes priority over includes
for _, block := range content.Blocks {
incContent, incDiags := block.Body.Content(includeBlockSchema)
diags = append(diags, incDiags...)
if incDiags.HasErrors() {
continue
}
pathExpr := incContent.Attributes["path"].Expr
var path string
incDiags = gohcl.DecodeExpression(pathExpr, t.Ctx, &path)
diags = append(diags, incDiags...)
if incDiags.HasErrors() {
continue
}
incBody, incDiags := t.Resolver.ResolveBodyPath(path, pathExpr.Range())
bodies = append(bodies, transform.BodyWithDiagnostics(incBody, incDiags))
}
return hcl.MergeBodies(bodies)
}
var includeBlockSchema = &hcl.BodySchema{
Attributes: []hcl.AttributeSchema{
{
Name: "path",
Required: true,
},
},
}

View File

@ -0,0 +1,112 @@
package include
import (
"reflect"
"testing"
"github.com/davecgh/go-spew/spew"
"github.com/hashicorp/hcl2/gohcl"
"github.com/hashicorp/hcl2/hcl"
"github.com/hashicorp/hcl2/hcltest"
"github.com/zclconf/go-cty/cty"
)
func TestTransformer(t *testing.T) {
caller := hcltest.MockBody(&hcl.BodyContent{
Blocks: hcl.Blocks{
{
Type: "include",
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"path": hcltest.MockExprVariable("var_path"),
}),
}),
},
{
Type: "include",
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"path": hcltest.MockExprLiteral(cty.StringVal("include2")),
}),
}),
},
{
Type: "foo",
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"from": hcltest.MockExprLiteral(cty.StringVal("caller")),
}),
}),
},
},
})
resolver := MapResolver(map[string]hcl.Body{
"include1": hcltest.MockBody(&hcl.BodyContent{
Blocks: hcl.Blocks{
{
Type: "foo",
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"from": hcltest.MockExprLiteral(cty.StringVal("include1")),
}),
}),
},
},
}),
"include2": hcltest.MockBody(&hcl.BodyContent{
Blocks: hcl.Blocks{
{
Type: "foo",
Body: hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"from": hcltest.MockExprLiteral(cty.StringVal("include2")),
}),
}),
},
},
}),
})
ctx := &hcl.EvalContext{
Variables: map[string]cty.Value{
"var_path": cty.StringVal("include1"),
},
}
transformer := Transformer("include", ctx, resolver)
merged := transformer.TransformBody(caller)
type foo struct {
From string `hcl:"from,attr"`
}
type result struct {
Foos []foo `hcl:"foo,block"`
}
var got result
diags := gohcl.DecodeBody(merged, nil, &got)
if len(diags) != 0 {
t.Errorf("unexpected diags")
for _, diag := range diags {
t.Logf("- %s", diag)
}
}
want := result{
Foos: []foo{
{
From: "caller",
},
{
From: "include1",
},
{
From: "include2",
},
},
}
if !reflect.DeepEqual(got, want) {
t.Errorf("wrong result\ngot: %swant: %s", spew.Sdump(got), spew.Sdump(want))
}
}

7
ext/transform/doc.go Normal file
View File

@ -0,0 +1,7 @@
// Package transform is a helper package for writing extensions that work
// by applying transforms to bodies.
//
// It defines a type for body transformers, and then provides utilities in
// terms of that type for working with transformers, including recursively
// applying such transforms as hierarchical block structures are extracted.
package transform
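To make the pattern concrete, here is a minimal caller-side sketch: a TransformerFunc that strips a hypothetical "locals" block type, applied recursively with Deep (the block type and the body variable are assumptions for illustration only):

```go
// Sketch: remove every "locals" block from a body and all nested bodies.
stripLocals := transform.TransformerFunc(func(body hcl.Body) hcl.Body {
	_, remain, diags := body.PartialContent(&hcl.BodySchema{
		Blocks: []hcl.BlockHeaderSchema{{Type: "locals"}}, // hypothetical block type
	})
	return transform.BodyWithDiagnostics(remain, diags)
})

// body is some previously-obtained hcl.Body.
body = transform.Deep(body, stripLocals)
```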

108
ext/transform/error.go Normal file
View File

@ -0,0 +1,108 @@
package transform
import (
"github.com/hashicorp/hcl2/hcl"
)
// NewErrorBody returns a hcl.Body that returns the given diagnostics whenever
// any of its content-access methods are called.
//
// The given diagnostics must have at least one diagnostic of severity
// hcl.DiagError, or this function will panic.
//
// This can be used to prepare a return value for a Transformer that
// can't complete due to an error. While the transform itself will succeed,
// the error will be returned as soon as a caller attempts to extract content
// from the resulting body.
func NewErrorBody(diags hcl.Diagnostics) hcl.Body {
if !diags.HasErrors() {
panic("NewErrorBody called without any error diagnostics")
}
return diagBody{
Diags: diags,
}
}
// BodyWithDiagnostics returns a hcl.Body that wraps another hcl.Body
// and emits the given diagnostics for any content-extraction method.
//
// Unlike the result of NewErrorBody, a body with diagnostics still runs
// the extraction actions on the underlying body if (and only if) the given
// diagnostics do not contain errors, but prepends the given diagnostics with
// any diagnostics produced by the action.
//
// If the given diagnostics is empty, the given body is returned verbatim.
//
// This function is intended for conveniently reporting errors and/or warnings
// produced during a transform, ensuring that they will be seen when the
// caller eventually extracts content from the returned body.
func BodyWithDiagnostics(body hcl.Body, diags hcl.Diagnostics) hcl.Body {
if len(diags) == 0 {
// nothing to do!
return body
}
return diagBody{
Diags: diags,
Wrapped: body,
}
}
type diagBody struct {
Diags hcl.Diagnostics
Wrapped hcl.Body
}
func (b diagBody) Content(schema *hcl.BodySchema) (*hcl.BodyContent, hcl.Diagnostics) {
if b.Diags.HasErrors() {
return b.emptyContent(), b.Diags
}
content, wrappedDiags := b.Wrapped.Content(schema)
diags := make(hcl.Diagnostics, 0, len(b.Diags)+len(wrappedDiags))
diags = append(diags, b.Diags...)
diags = append(diags, wrappedDiags...)
return content, diags
}
func (b diagBody) PartialContent(schema *hcl.BodySchema) (*hcl.BodyContent, hcl.Body, hcl.Diagnostics) {
if b.Diags.HasErrors() {
return b.emptyContent(), b.Wrapped, b.Diags
}
content, remain, wrappedDiags := b.Wrapped.PartialContent(schema)
diags := make(hcl.Diagnostics, 0, len(b.Diags)+len(wrappedDiags))
diags = append(diags, b.Diags...)
diags = append(diags, wrappedDiags...)
return content, remain, diags
}
func (b diagBody) JustAttributes() (hcl.Attributes, hcl.Diagnostics) {
if b.Diags.HasErrors() {
return nil, b.Diags
}
attributes, wrappedDiags := b.Wrapped.JustAttributes()
diags := make(hcl.Diagnostics, 0, len(b.Diags)+len(wrappedDiags))
diags = append(diags, b.Diags...)
diags = append(diags, wrappedDiags...)
return attributes, diags
}
func (b diagBody) MissingItemRange() hcl.Range {
if b.Wrapped != nil {
return b.Wrapped.MissingItemRange()
}
// Placeholder. This should never be seen in practice because decoding
// a diagBody without a wrapped body should always produce an error.
return hcl.Range{
Filename: "<empty>",
}
}
func (b diagBody) emptyContent() *hcl.BodyContent {
return &hcl.BodyContent{
MissingItemRange: b.MissingItemRange(),
}
}
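As a hedged sketch of how NewErrorBody and BodyWithDiagnostics are intended to be used together from a transformer (the "deprecated" block type is hypothetical):

```go
rejectDeprecated := transform.TransformerFunc(func(body hcl.Body) hcl.Body {
	content, remain, diags := body.PartialContent(&hcl.BodySchema{
		Blocks: []hcl.BlockHeaderSchema{{Type: "deprecated"}}, // hypothetical block type
	})
	if len(content.Blocks) > 0 {
		diags = append(diags, &hcl.Diagnostic{
			Severity: hcl.DiagError,
			Summary:  "Deprecated block",
			Detail:   "Blocks of type \"deprecated\" are no longer supported.",
			Subject:  content.Blocks[0].DefRange.Ptr(),
		})
		// The transform itself succeeds; the error surfaces when a caller
		// later extracts content from the returned body.
		return transform.NewErrorBody(diags)
	}
	return transform.BodyWithDiagnostics(remain, diags)
})
```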

View File

@ -0,0 +1,83 @@
package transform
import (
"github.com/hashicorp/hcl2/hcl"
)
// Shallow is equivalent to calling transformer.TransformBody(body), and
// is provided only for completeness of the top-level API.
func Shallow(body hcl.Body, transformer Transformer) hcl.Body {
return transformer.TransformBody(body)
}
// Deep applies the given transform to the given body and then
// wraps the result such that any descendent blocks that are decoded will
// also have the transform applied to their bodies.
//
// This allows for language extensions that define a particular block type
// for a particular body and all nested blocks within it.
//
// Due to the wrapping behavior, the body resulting from this function
// will not be of the type returned by the transformer. Callers may call
// only the methods defined for interface hcl.Body, and may not type-assert
// to access other methods.
func Deep(body hcl.Body, transformer Transformer) hcl.Body {
return deepWrapper{
Transformed: transformer.TransformBody(body),
Transformer: transformer,
}
}
// deepWrapper is a hcl.Body implementation that ensures that a given
// transformer is applied to another given body when content is extracted,
// and that it recursively applies to any child blocks that are extracted.
type deepWrapper struct {
Transformed hcl.Body
Transformer Transformer
}
func (w deepWrapper) Content(schema *hcl.BodySchema) (*hcl.BodyContent, hcl.Diagnostics) {
content, diags := w.Transformed.Content(schema)
content = w.transformContent(content)
return content, diags
}
func (w deepWrapper) PartialContent(schema *hcl.BodySchema) (*hcl.BodyContent, hcl.Body, hcl.Diagnostics) {
content, remain, diags := w.Transformed.PartialContent(schema)
content = w.transformContent(content)
return content, remain, diags
}
func (w deepWrapper) transformContent(content *hcl.BodyContent) *hcl.BodyContent {
if len(content.Blocks) == 0 {
// Easy path: if there are no blocks then there are no child bodies to wrap
return content
}
// Since we're going to change things here, we'll be polite and clone the
// structure so that we don't risk impacting any internal state of the
// original body.
ret := &hcl.BodyContent{
Attributes: content.Attributes,
MissingItemRange: content.MissingItemRange,
Blocks: make(hcl.Blocks, len(content.Blocks)),
}
for i, givenBlock := range content.Blocks {
// Shallow-copy the block so we can mutate it
newBlock := *givenBlock
newBlock.Body = Deep(newBlock.Body, w.Transformer)
ret.Blocks[i] = &newBlock
}
return ret
}
func (w deepWrapper) JustAttributes() (hcl.Attributes, hcl.Diagnostics) {
// Attributes can't have bodies or nested blocks, so this is just a thin wrapper.
return w.Transformed.JustAttributes()
}
func (w deepWrapper) MissingItemRange() hcl.Range {
return w.Transformed.MissingItemRange()
}

View File

@ -0,0 +1,102 @@
package transform
import (
"testing"
"reflect"
"github.com/hashicorp/hcl2/hcl"
"github.com/hashicorp/hcl2/hcltest"
"github.com/zclconf/go-cty/cty"
)
// Assert that deepWrapper implements Body
var deepWrapperIsBody hcl.Body = deepWrapper{}
func TestDeep(t *testing.T) {
testTransform := TransformerFunc(func(body hcl.Body) hcl.Body {
_, remain, diags := body.PartialContent(&hcl.BodySchema{
Blocks: []hcl.BlockHeaderSchema{
{
Type: "remove",
},
},
})
return BodyWithDiagnostics(remain, diags)
})
src := hcltest.MockBody(&hcl.BodyContent{
Attributes: hcltest.MockAttrs(map[string]hcl.Expression{
"true": hcltest.MockExprLiteral(cty.True),
}),
Blocks: []*hcl.Block{
{
Type: "remove",
Body: hcl.EmptyBody(),
},
{
Type: "child",
Body: hcltest.MockBody(&hcl.BodyContent{
Blocks: []*hcl.Block{
{
Type: "remove",
},
},
}),
},
},
})
wrapped := Deep(src, testTransform)
rootContent, diags := wrapped.Content(&hcl.BodySchema{
Attributes: []hcl.AttributeSchema{
{
Name: "true",
},
},
Blocks: []hcl.BlockHeaderSchema{
{
Type: "child",
},
},
})
if len(diags) != 0 {
t.Errorf("unexpected diagnostics for root content")
for _, diag := range diags {
t.Logf("- %s", diag)
}
}
wantAttrs := hcltest.MockAttrs(map[string]hcl.Expression{
"true": hcltest.MockExprLiteral(cty.True),
})
if !reflect.DeepEqual(rootContent.Attributes, wantAttrs) {
t.Errorf("wrong root attributes\ngot: %#v\nwant: %#v", rootContent.Attributes, wantAttrs)
}
if got, want := len(rootContent.Blocks), 1; got != want {
t.Fatalf("wrong number of root blocks %d; want %d", got, want)
}
if got, want := rootContent.Blocks[0].Type, "child"; got != want {
t.Errorf("wrong block type %s; want %s", got, want)
}
childBlock := rootContent.Blocks[0]
childContent, diags := childBlock.Body.Content(&hcl.BodySchema{})
if len(diags) != 0 {
t.Errorf("unexpected diagnostics for child content")
for _, diag := range diags {
t.Logf("- %s", diag)
}
}
if len(childContent.Attributes) != 0 {
t.Errorf("unexpected attributes in child content; want empty content")
}
if len(childContent.Blocks) != 0 {
t.Errorf("unexpected blocks in child content; want empty content")
}
}

View File

@ -0,0 +1,40 @@
package transform
import (
"github.com/hashicorp/hcl2/hcl"
)
// A Transformer takes a given body, applies some (possibly no-op)
// transform to it, and returns the new body.
//
// It must _not_ mutate the given body in-place.
//
// The transform call cannot fail, but it _can_ return a body that immediately
// returns diagnostics when its methods are called. NewErrorBody is a utility
// to help with this.
type Transformer interface {
TransformBody(hcl.Body) hcl.Body
}
// TransformerFunc is a function type that implements Transformer.
type TransformerFunc func(hcl.Body) hcl.Body
// TransformBody is an implementation of Transformer.TransformBody.
func (f TransformerFunc) TransformBody(in hcl.Body) hcl.Body {
return f(in)
}
type chain []Transformer
// Chain takes a slice of transformers and returns a single new
// Transformer that applies each of the given transformers in sequence.
func Chain(c []Transformer) Transformer {
return chain(c)
}
func (c chain) TransformBody(body hcl.Body) hcl.Body {
for _, t := range c {
body = t.TransformBody(body)
}
return body
}

67
ext/typeexpr/README.md Normal file
View File

@ -0,0 +1,67 @@
# HCL Type Expressions Extension
This HCL extension defines a convention for describing HCL types using function
call and variable reference syntax, allowing configuration formats to include
type information provided by users.
The type syntax is processed statically from a hcl.Expression, so it cannot
use any of the usual language operators. This is similar to type expressions
in statically-typed programming languages.
```hcl
variable "example" {
type = list(string)
}
```
The extension is built using the `hcl.ExprAsKeyword` and `hcl.ExprCall`
functions, and so it relies on the underlying syntax to define how "keyword"
and "call" are interpreted. The above shows how they are interpreted in
the HCL native syntax, while the following shows the same information
expressed in JSON:
```json
{
"variable": {
"example": {
"type": "list(string)"
}
}
}
```
Notice that, since we have the additional contextual information that only
calls and keywords are allowed here, the JSON syntax is able to parse the
given string directly as an expression, rather than as a template as would
be the case for normal expression evaluation.
For more information, see [the godoc reference](http://godoc.org/github.com/hashicorp/hcl2/ext/typeexpr).
## Type Expression Syntax
When expressed in the native syntax, the following expressions are permitted
in a type expression:
* `string` - string
* `bool` - boolean
* `number` - number
* `any` - `cty.DynamicPseudoType` (in function `TypeConstraint` only)
* `list(<type_expr>)` - list of the type given as an argument
* `set(<type_expr>)` - set of the type given as an argument
* `map(<type_expr>)` - map of the type given as an argument
* `tuple([<type_exprs...>])` - tuple with the element types given in the single list argument
* `object({<attr_name>=<type_expr>, ...})` - object with the attributes and corresponding types given in the single map argument
For example:
* `list(string)`
* `object({name=string,age=number})`
* `map(object({name=string,age=number}))`
Note that the object constructor syntax is not fully-general for all possible
object types because it requires the attribute names to be valid identifiers.
In practice it is expected that any time an object type is being fixed for
type checking it will be one that has identifiers as its attributes; object
types with weird attributes generally show up only from arbitrary object
constructors in configuration files, which are usually treated either as maps
or as the dynamic pseudo-type.
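
## Usage Sketch

The following is a minimal sketch of how a calling application might wire this together; the file names and the example type are illustrative only:

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl2/ext/typeexpr"
	"github.com/hashicorp/hcl2/hcl"
	"github.com/hashicorp/hcl2/hcl/hclsyntax"
)

func main() {
	src := []byte(`map(object({name=string, age=number}))`)

	// Parse the user-provided type expression using the native syntax.
	expr, diags := hclsyntax.ParseExpression(src, "type.hcl", hcl.Pos{Line: 1, Column: 1})
	if diags.HasErrors() {
		panic(diags)
	}

	// Interpret it as a type constraint, so the "any" keyword is also allowed.
	ty, diags := typeexpr.TypeConstraint(expr)
	if diags.HasErrors() {
		panic(diags)
	}

	// Prints: map(object({age=number,name=string}))
	fmt.Println(typeexpr.TypeString(ty))
}
```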

11
ext/typeexpr/doc.go Normal file
View File

@ -0,0 +1,11 @@
// Package typeexpr extends HCL with a convention for describing HCL types
// within configuration files.
//
// The type syntax is processed statically from a hcl.Expression, so it cannot
// use any of the usual language operators. This is similar to type expressions
// in statically-typed programming languages.
//
// variable "example" {
// type = list(string)
// }
package typeexpr

196
ext/typeexpr/get_type.go Normal file
View File

@ -0,0 +1,196 @@
package typeexpr
import (
"fmt"
"github.com/hashicorp/hcl2/hcl"
"github.com/zclconf/go-cty/cty"
)
const invalidTypeSummary = "Invalid type specification"
// getType is the internal implementation of both Type and TypeConstraint,
// using the passed flag to distinguish. When constraint is false, the "any"
// keyword will produce an error.
func getType(expr hcl.Expression, constraint bool) (cty.Type, hcl.Diagnostics) {
// First we'll try for one of our keywords
kw := hcl.ExprAsKeyword(expr)
switch kw {
case "bool":
return cty.Bool, nil
case "string":
return cty.String, nil
case "number":
return cty.Number, nil
case "any":
if constraint {
return cty.DynamicPseudoType, nil
}
return cty.DynamicPseudoType, hcl.Diagnostics{{
Severity: hcl.DiagError,
Summary: invalidTypeSummary,
Detail: fmt.Sprintf("The keyword %q cannot be used in this type specification: an exact type is required.", kw),
Subject: expr.Range().Ptr(),
}}
case "list", "map", "set":
return cty.DynamicPseudoType, hcl.Diagnostics{{
Severity: hcl.DiagError,
Summary: invalidTypeSummary,
Detail: fmt.Sprintf("The %s type constructor requires one argument specifying the element type.", kw),
Subject: expr.Range().Ptr(),
}}
case "object":
return cty.DynamicPseudoType, hcl.Diagnostics{{
Severity: hcl.DiagError,
Summary: invalidTypeSummary,
Detail: "The object type constructor requires one argument specifying the attribute types and values as a map.",
Subject: expr.Range().Ptr(),
}}
case "tuple":
return cty.DynamicPseudoType, hcl.Diagnostics{{
Severity: hcl.DiagError,
Summary: invalidTypeSummary,
Detail: "The tuple type constructor requires one argument specifying the element types as a list.",
Subject: expr.Range().Ptr(),
}}
case "":
// okay! we'll fall through and try processing as a call, then.
default:
return cty.DynamicPseudoType, hcl.Diagnostics{{
Severity: hcl.DiagError,
Summary: invalidTypeSummary,
Detail: fmt.Sprintf("The keyword %q is not a valid type specification.", kw),
Subject: expr.Range().Ptr(),
}}
}
// If we get down here then our expression isn't just a keyword, so we'll
// try to process it as a call instead.
call, diags := hcl.ExprCall(expr)
if diags.HasErrors() {
return cty.DynamicPseudoType, hcl.Diagnostics{{
Severity: hcl.DiagError,
Summary: invalidTypeSummary,
Detail: "A type specification is either a primitive type keyword (bool, number, string) or a complex type constructor call, like list(string).",
Subject: expr.Range().Ptr(),
}}
}
switch call.Name {
case "bool", "string", "number", "any":
return cty.DynamicPseudoType, hcl.Diagnostics{{
Severity: hcl.DiagError,
Summary: invalidTypeSummary,
Detail: fmt.Sprintf("Primitive type keyword %q does not expect arguments.", call.Name),
Subject: &call.ArgsRange,
}}
}
if len(call.Arguments) != 1 {
contextRange := call.ArgsRange
subjectRange := call.ArgsRange
if len(call.Arguments) > 1 {
// If we have too many arguments (as opposed to too _few_) then
// we'll highlight the extraneous arguments as the diagnostic
// subject.
subjectRange = hcl.RangeBetween(call.Arguments[1].Range(), call.Arguments[len(call.Arguments)-1].Range())
}
switch call.Name {
case "list", "set", "map":
return cty.DynamicPseudoType, hcl.Diagnostics{{
Severity: hcl.DiagError,
Summary: invalidTypeSummary,
Detail: fmt.Sprintf("The %s type constructor requires one argument specifying the element type.", call.Name),
Subject: &subjectRange,
Context: &contextRange,
}}
case "object":
return cty.DynamicPseudoType, hcl.Diagnostics{{
Severity: hcl.DiagError,
Summary: invalidTypeSummary,
Detail: "The object type constructor requires one argument specifying the attribute types and values as a map.",
Subject: &subjectRange,
Context: &contextRange,
}}
case "tuple":
return cty.DynamicPseudoType, hcl.Diagnostics{{
Severity: hcl.DiagError,
Summary: invalidTypeSummary,
Detail: "The tuple type constructor requires one argument specifying the element types as a list.",
Subject: &subjectRange,
Context: &contextRange,
}}
}
}
switch call.Name {
case "list":
ety, diags := getType(call.Arguments[0], constraint)
return cty.List(ety), diags
case "set":
ety, diags := getType(call.Arguments[0], constraint)
return cty.Set(ety), diags
case "map":
ety, diags := getType(call.Arguments[0], constraint)
return cty.Map(ety), diags
case "object":
attrDefs, diags := hcl.ExprMap(call.Arguments[0])
if diags.HasErrors() {
return cty.DynamicPseudoType, hcl.Diagnostics{{
Severity: hcl.DiagError,
Summary: invalidTypeSummary,
Detail: "Object type constructor requires a map whose keys are attribute names and whose values are the corresponding attribute types.",
Subject: call.Arguments[0].Range().Ptr(),
Context: expr.Range().Ptr(),
}}
}
atys := make(map[string]cty.Type)
for _, attrDef := range attrDefs {
attrName := hcl.ExprAsKeyword(attrDef.Key)
if attrName == "" {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: invalidTypeSummary,
Detail: "Object constructor map keys must be attribute names.",
Subject: attrDef.Key.Range().Ptr(),
Context: expr.Range().Ptr(),
})
continue
}
aty, attrDiags := getType(attrDef.Value, constraint)
diags = append(diags, attrDiags...)
atys[attrName] = aty
}
return cty.Object(atys), diags
case "tuple":
elemDefs, diags := hcl.ExprList(call.Arguments[0])
if diags.HasErrors() {
return cty.DynamicPseudoType, hcl.Diagnostics{{
Severity: hcl.DiagError,
Summary: invalidTypeSummary,
Detail: "Tuple type constructor requires a list of element types.",
Subject: call.Arguments[0].Range().Ptr(),
Context: expr.Range().Ptr(),
}}
}
etys := make([]cty.Type, len(elemDefs))
for i, defExpr := range elemDefs {
ety, elemDiags := getType(defExpr, constraint)
diags = append(diags, elemDiags...)
etys[i] = ety
}
return cty.Tuple(etys), diags
default:
// Can't access call.Arguments in this path because we've not validated
// that it contains exactly one expression here.
return cty.DynamicPseudoType, hcl.Diagnostics{{
Severity: hcl.DiagError,
Summary: invalidTypeSummary,
Detail: fmt.Sprintf("Keyword %q is not a valid type constructor.", call.Name),
Subject: expr.Range().Ptr(),
}}
}
}

View File

@ -0,0 +1,352 @@
package typeexpr
import (
"testing"
"github.com/hashicorp/hcl2/gohcl"
"github.com/hashicorp/hcl2/hcl"
"github.com/hashicorp/hcl2/hcl/hclsyntax"
"github.com/hashicorp/hcl2/hcl/json"
"github.com/zclconf/go-cty/cty"
)
func TestGetType(t *testing.T) {
tests := []struct {
Source string
Constraint bool
Want cty.Type
WantError string
}{
// keywords
{
`bool`,
false,
cty.Bool,
"",
},
{
`number`,
false,
cty.Number,
"",
},
{
`string`,
false,
cty.String,
"",
},
{
`any`,
false,
cty.DynamicPseudoType,
`The keyword "any" cannot be used in this type specification: an exact type is required.`,
},
{
`any`,
true,
cty.DynamicPseudoType,
"",
},
{
`list`,
false,
cty.DynamicPseudoType,
"The list type constructor requires one argument specifying the element type.",
},
{
`map`,
false,
cty.DynamicPseudoType,
"The map type constructor requires one argument specifying the element type.",
},
{
`set`,
false,
cty.DynamicPseudoType,
"The set type constructor requires one argument specifying the element type.",
},
{
`object`,
false,
cty.DynamicPseudoType,
"The object type constructor requires one argument specifying the attribute types and values as a map.",
},
{
`tuple`,
false,
cty.DynamicPseudoType,
"The tuple type constructor requires one argument specifying the element types as a list.",
},
// constructors
{
`bool()`,
false,
cty.DynamicPseudoType,
`Primitive type keyword "bool" does not expect arguments.`,
},
{
`number()`,
false,
cty.DynamicPseudoType,
`Primitive type keyword "number" does not expect arguments.`,
},
{
`string()`,
false,
cty.DynamicPseudoType,
`Primitive type keyword "string" does not expect arguments.`,
},
{
`any()`,
false,
cty.DynamicPseudoType,
`Primitive type keyword "any" does not expect arguments.`,
},
{
`any()`,
true,
cty.DynamicPseudoType,
`Primitive type keyword "any" does not expect arguments.`,
},
{
`list(string)`,
false,
cty.List(cty.String),
``,
},
{
`set(string)`,
false,
cty.Set(cty.String),
``,
},
{
`map(string)`,
false,
cty.Map(cty.String),
``,
},
{
`list()`,
false,
cty.DynamicPseudoType,
`The list type constructor requires one argument specifying the element type.`,
},
{
`list(string, string)`,
false,
cty.DynamicPseudoType,
`The list type constructor requires one argument specifying the element type.`,
},
{
`list(any)`,
false,
cty.List(cty.DynamicPseudoType),
`The keyword "any" cannot be used in this type specification: an exact type is required.`,
},
{
`list(any)`,
true,
cty.List(cty.DynamicPseudoType),
``,
},
{
`object({})`,
false,
cty.EmptyObject,
``,
},
{
`object({name=string})`,
false,
cty.Object(map[string]cty.Type{"name": cty.String}),
``,
},
{
`object({"name"=string})`,
false,
cty.EmptyObject,
`Object constructor map keys must be attribute names.`,
},
{
`object({name=nope})`,
false,
cty.Object(map[string]cty.Type{"name": cty.DynamicPseudoType}),
`The keyword "nope" is not a valid type specification.`,
},
{
`object()`,
false,
cty.DynamicPseudoType,
`The object type constructor requires one argument specifying the attribute types and values as a map.`,
},
{
`object(string)`,
false,
cty.DynamicPseudoType,
`Object type constructor requires a map whose keys are attribute names and whose values are the corresponding attribute types.`,
},
{
`tuple([])`,
false,
cty.EmptyTuple,
``,
},
{
`tuple([string, bool])`,
false,
cty.Tuple([]cty.Type{cty.String, cty.Bool}),
``,
},
{
`tuple([nope])`,
false,
cty.Tuple([]cty.Type{cty.DynamicPseudoType}),
`The keyword "nope" is not a valid type specification.`,
},
{
`tuple()`,
false,
cty.DynamicPseudoType,
`The tuple type constructor requires one argument specifying the element types as a list.`,
},
{
`tuple(string)`,
false,
cty.DynamicPseudoType,
`Tuple type constructor requires a list of element types.`,
},
{
`shwoop(string)`,
false,
cty.DynamicPseudoType,
`Keyword "shwoop" is not a valid type constructor.`,
},
{
`list("string")`,
false,
cty.List(cty.DynamicPseudoType),
`A type specification is either a primitive type keyword (bool, number, string) or a complex type constructor call, like list(string).`,
},
// More interesting combinations
{
`list(object({}))`,
false,
cty.List(cty.EmptyObject),
``,
},
{
`list(map(tuple([])))`,
false,
cty.List(cty.Map(cty.EmptyTuple)),
``,
},
}
for _, test := range tests {
t.Run(test.Source, func(t *testing.T) {
expr, diags := hclsyntax.ParseExpression([]byte(test.Source), "", hcl.Pos{Line: 1, Column: 1})
if diags.HasErrors() {
t.Fatalf("failed to parse: %s", diags)
}
got, diags := getType(expr, test.Constraint)
if test.WantError == "" {
for _, diag := range diags {
t.Error(diag)
}
} else {
found := false
for _, diag := range diags {
t.Log(diag)
if diag.Severity == hcl.DiagError && diag.Detail == test.WantError {
found = true
}
}
if !found {
t.Errorf("missing expected error detail message: %s", test.WantError)
}
}
if !got.Equals(test.Want) {
t.Errorf("wrong result\ngot: %#v\nwant: %#v", got, test.Want)
}
})
}
}
func TestGetTypeJSON(t *testing.T) {
// We have fewer test cases here because we're mainly exercising the
// extra indirection in the JSON syntax package, which ultimately calls
// into the native syntax parser (which we tested extensively in
// TestGetType).
tests := []struct {
Source string
Constraint bool
Want cty.Type
WantError string
}{
{
`{"expr":"bool"}`,
false,
cty.Bool,
"",
},
{
`{"expr":"list(bool)"}`,
false,
cty.List(cty.Bool),
"",
},
{
`{"expr":"list"}`,
false,
cty.DynamicPseudoType,
"The list type constructor requires one argument specifying the element type.",
},
}
for _, test := range tests {
t.Run(test.Source, func(t *testing.T) {
file, diags := json.Parse([]byte(test.Source), "")
if diags.HasErrors() {
t.Fatalf("failed to parse: %s", diags)
}
type TestContent struct {
Expr hcl.Expression `hcl:"expr"`
}
var content TestContent
diags = gohcl.DecodeBody(file.Body, nil, &content)
if diags.HasErrors() {
t.Fatalf("failed to decode: %s", diags)
}
got, diags := getType(content.Expr, test.Constraint)
if test.WantError == "" {
for _, diag := range diags {
t.Error(diag)
}
} else {
found := false
for _, diag := range diags {
t.Log(diag)
if diag.Severity == hcl.DiagError && diag.Detail == test.WantError {
found = true
}
}
if !found {
t.Errorf("missing expected error detail message: %s", test.WantError)
}
}
if !got.Equals(test.Want) {
t.Errorf("wrong result\ngot: %#v\nwant: %#v", got, test.Want)
}
})
}
}

129
ext/typeexpr/public.go Normal file
View File

@ -0,0 +1,129 @@
package typeexpr
import (
"bytes"
"fmt"
"sort"
"github.com/hashicorp/hcl2/hcl/hclsyntax"
"github.com/hashicorp/hcl2/hcl"
"github.com/zclconf/go-cty/cty"
)
// Type attempts to process the given expression as a type expression and, if
// successful, returns the resulting type. If unsuccessful, error diagnostics
// are returned.
func Type(expr hcl.Expression) (cty.Type, hcl.Diagnostics) {
return getType(expr, false)
}
// TypeConstraint attempts to parse the given expression as a type constraint
// and, if successful, returns the resulting type. If unsuccessful, error
// diagnostics are returned.
//
// A type constraint has the same structure as a type, but it additionally
// allows the keyword "any" to represent cty.DynamicPseudoType, which is often
// used as a wildcard in type checking and type conversion operations.
func TypeConstraint(expr hcl.Expression) (cty.Type, hcl.Diagnostics) {
return getType(expr, true)
}
// TypeString returns a string rendering of the given type as it would be
// expected to appear in the HCL native syntax.
//
// This is primarily intended for showing types to the user in an application
// that uses typeexpr, where the user can be assumed to be familiar with the
// type expression syntax. In applications that do not use typeexpr, these
// results may be confusing to the user and so type.FriendlyName may be
// preferable, even though it's less precise.
//
// TypeString produces reasonable results only for types like what would be
// produced by the Type and TypeConstraint functions. In particular, it cannot
// support capsule types.
func TypeString(ty cty.Type) string {
// Easy cases first
switch ty {
case cty.String:
return "string"
case cty.Bool:
return "bool"
case cty.Number:
return "number"
case cty.DynamicPseudoType:
return "any"
}
if ty.IsCapsuleType() {
panic("TypeString does not support capsule types")
}
if ty.IsCollectionType() {
ety := ty.ElementType()
etyString := TypeString(ety)
switch {
case ty.IsListType():
return fmt.Sprintf("list(%s)", etyString)
case ty.IsSetType():
return fmt.Sprintf("set(%s)", etyString)
case ty.IsMapType():
return fmt.Sprintf("map(%s)", etyString)
default:
// Should never happen because the above is exhaustive
panic("unsupported collection type")
}
}
if ty.IsObjectType() {
var buf bytes.Buffer
buf.WriteString("object({")
atys := ty.AttributeTypes()
names := make([]string, 0, len(atys))
for name := range atys {
names = append(names, name)
}
sort.Strings(names)
first := true
for _, name := range names {
aty := atys[name]
if !first {
buf.WriteByte(',')
}
if !hclsyntax.ValidIdentifier(name) {
// Should never happen for any type produced by this package,
// but we'll do something reasonable here just so we don't
// produce garbage if someone gives us a hand-assembled object
// type that has weird attribute names.
// Using Go-style quoting here isn't perfect, since it doesn't
// exactly match HCL syntax, but it's fine for an edge-case.
buf.WriteString(fmt.Sprintf("%q", name))
} else {
buf.WriteString(name)
}
buf.WriteByte('=')
buf.WriteString(TypeString(aty))
first = false
}
buf.WriteString("})")
return buf.String()
}
if ty.IsTupleType() {
var buf bytes.Buffer
buf.WriteString("tuple([")
etys := ty.TupleElementTypes()
first := true
for _, ety := range etys {
if !first {
buf.WriteByte(',')
}
buf.WriteString(TypeString(ety))
first = false
}
buf.WriteString("])")
return buf.String()
}
// Should never happen because we covered all cases above.
panic(fmt.Errorf("unsupported type %#v", ty))
}

View File

@ -0,0 +1,100 @@
package typeexpr
import (
"testing"
"github.com/zclconf/go-cty/cty"
)
func TestTypeString(t *testing.T) {
tests := []struct {
Type cty.Type
Want string
}{
{
cty.DynamicPseudoType,
"any",
},
{
cty.String,
"string",
},
{
cty.Number,
"number",
},
{
cty.Bool,
"bool",
},
{
cty.List(cty.Number),
"list(number)",
},
{
cty.Set(cty.Bool),
"set(bool)",
},
{
cty.Map(cty.String),
"map(string)",
},
{
cty.EmptyObject,
"object({})",
},
{
cty.Object(map[string]cty.Type{"foo": cty.Bool}),
"object({foo=bool})",
},
{
cty.Object(map[string]cty.Type{"foo": cty.Bool, "bar": cty.String}),
"object({bar=string,foo=bool})",
},
{
cty.EmptyTuple,
"tuple([])",
},
{
cty.Tuple([]cty.Type{cty.Bool}),
"tuple([bool])",
},
{
cty.Tuple([]cty.Type{cty.Bool, cty.String}),
"tuple([bool,string])",
},
{
cty.List(cty.DynamicPseudoType),
"list(any)",
},
{
cty.Tuple([]cty.Type{cty.DynamicPseudoType}),
"tuple([any])",
},
{
cty.Object(map[string]cty.Type{"foo": cty.DynamicPseudoType}),
"object({foo=any})",
},
{
// We don't expect to find attributes that aren't valid identifiers
// because we only promise to support types that this package
// would've created, but we allow this situation during rendering
// just because it's convenient for applications trying to produce
// error messages about mismatched types. Note that the quoted
// attribute name is not actually accepted by our Type and
// TypeConstraint functions, so this is one situation where the
// TypeString result cannot be re-parsed by those functions.
cty.Object(map[string]cty.Type{"foo bar baz": cty.String}),
`object({"foo bar baz"=string})`,
},
}
for _, test := range tests {
t.Run(test.Type.GoString(), func(t *testing.T) {
got := TypeString(test.Type)
if got != test.Want {
t.Errorf("wrong result\ntype: %#v\ngot: %s\nwant: %s", test.Type, got, test.Want)
}
})
}
}

28
ext/userfunc/README.md Normal file
View File

@ -0,0 +1,28 @@
# HCL User Functions Extension
This HCL extension allows a calling application to support user-defined
functions.
Functions are defined via a specific block type, like this:
```hcl
function "add" {
params = [a, b]
result = a + b
}
function "list" {
params = []
variadic_param = items
result = items
}
```
The extension is implemented as a pre-processor for `hcl.Body` objects. Given
a body that may contain functions, the `DecodeUserFunctions` function searches
for blocks that define functions and returns a functions map suitable for
inclusion in a `hcl.EvalContext`. It also returns a new `hcl.Body` that
contains the remainder of the content from the given body, allowing for
further processing of remaining content.
For more information, see [the godoc reference](http://godoc.org/github.com/hashicorp/hcl2/ext/userfunc).
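
As a hedged, self-contained sketch of the intended calling pattern (file names and the example expression are illustrative only):

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl2/ext/userfunc"
	"github.com/hashicorp/hcl2/hcl"
	"github.com/hashicorp/hcl2/hcl/hclsyntax"
)

func main() {
	src := []byte(`
function "add" {
  params = [a, b]
  result = a + b
}
`)
	f, diags := hclsyntax.ParseConfig(src, "funcs.hcl", hcl.Pos{Line: 1, Column: 1})
	if diags.HasErrors() {
		panic(diags)
	}

	// Collect user-defined functions; a nil ContextFunc means the result
	// expressions can see only their own parameters.
	funcs, _, diags := userfunc.DecodeUserFunctions(f.Body, "function", nil)
	if diags.HasErrors() {
		panic(diags)
	}

	expr, diags := hclsyntax.ParseExpression([]byte(`add(1, 5)`), "expr.hcl", hcl.Pos{Line: 1, Column: 1})
	if diags.HasErrors() {
		panic(diags)
	}

	val, diags := expr.Value(&hcl.EvalContext{Functions: funcs})
	if diags.HasErrors() {
		panic(diags)
	}
	fmt.Println(val.AsBigFloat()) // 6
}
```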

156
ext/userfunc/decode.go Normal file
View File

@ -0,0 +1,156 @@
package userfunc
import (
"github.com/hashicorp/hcl2/hcl"
"github.com/zclconf/go-cty/cty"
"github.com/zclconf/go-cty/cty/function"
)
var funcBodySchema = &hcl.BodySchema{
Attributes: []hcl.AttributeSchema{
{
Name: "params",
Required: true,
},
{
Name: "variadic_param",
Required: false,
},
{
Name: "result",
Required: true,
},
},
}
func decodeUserFunctions(body hcl.Body, blockType string, contextFunc ContextFunc) (funcs map[string]function.Function, remain hcl.Body, diags hcl.Diagnostics) {
schema := &hcl.BodySchema{
Blocks: []hcl.BlockHeaderSchema{
{
Type: blockType,
LabelNames: []string{"name"},
},
},
}
content, remain, diags := body.PartialContent(schema)
if diags.HasErrors() {
return nil, remain, diags
}
// first call to getBaseCtx will populate context, and then the same
// context will be used for all subsequent calls. It's assumed that
// all functions in a given body should see an identical context.
var baseCtx *hcl.EvalContext
getBaseCtx := func() *hcl.EvalContext {
if baseCtx == nil {
if contextFunc != nil {
baseCtx = contextFunc()
}
}
// baseCtx might still be nil here, and that's okay
return baseCtx
}
funcs = make(map[string]function.Function)
Blocks:
for _, block := range content.Blocks {
name := block.Labels[0]
funcContent, funcDiags := block.Body.Content(funcBodySchema)
diags = append(diags, funcDiags...)
if funcDiags.HasErrors() {
continue
}
paramsExpr := funcContent.Attributes["params"].Expr
resultExpr := funcContent.Attributes["result"].Expr
var varParamExpr hcl.Expression
if funcContent.Attributes["variadic_param"] != nil {
varParamExpr = funcContent.Attributes["variadic_param"].Expr
}
var params []string
var varParam string
paramExprs, paramsDiags := hcl.ExprList(paramsExpr)
diags = append(diags, paramsDiags...)
if paramsDiags.HasErrors() {
continue
}
for _, paramExpr := range paramExprs {
param := hcl.ExprAsKeyword(paramExpr)
if param == "" {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Invalid param element",
Detail: "Each parameter name must be an identifier.",
Subject: paramExpr.Range().Ptr(),
})
continue Blocks
}
params = append(params, param)
}
if varParamExpr != nil {
varParam = hcl.ExprAsKeyword(varParamExpr)
if varParam == "" {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Invalid variadic_param",
Detail: "The variadic parameter name must be an identifier.",
Subject: varParamExpr.Range().Ptr(),
})
continue
}
}
spec := &function.Spec{}
for _, paramName := range params {
spec.Params = append(spec.Params, function.Parameter{
Name: paramName,
Type: cty.DynamicPseudoType,
})
}
if varParamExpr != nil {
spec.VarParam = &function.Parameter{
Name: varParam,
Type: cty.DynamicPseudoType,
}
}
impl := func(args []cty.Value) (cty.Value, error) {
ctx := getBaseCtx()
ctx = ctx.NewChild()
ctx.Variables = make(map[string]cty.Value)
// The cty function machinery guarantees that we have at least
// enough args to fill all of our params.
for i, paramName := range params {
ctx.Variables[paramName] = args[i]
}
if spec.VarParam != nil {
varArgs := args[len(params):]
ctx.Variables[varParam] = cty.TupleVal(varArgs)
}
result, diags := resultExpr.Value(ctx)
if diags.HasErrors() {
// Smuggle the diagnostics out via the error channel, since
// a diagnostics sequence implements error. Caller can
// type-assert this to recover the individual diagnostics
// if desired.
return cty.DynamicVal, diags
}
return result, nil
}
spec.Type = func(args []cty.Value) (cty.Type, error) {
val, err := impl(args)
return val.Type(), err
}
spec.Impl = func(args []cty.Value, retType cty.Type) (cty.Value, error) {
return impl(args)
}
funcs[name] = function.New(spec)
}
return funcs, remain, diags
}

174
ext/userfunc/decode_test.go Normal file
View File

@ -0,0 +1,174 @@
package userfunc
import (
"fmt"
"testing"
"github.com/hashicorp/hcl2/hcl"
"github.com/hashicorp/hcl2/hcl/hclsyntax"
"github.com/zclconf/go-cty/cty"
)
func TestDecodeUserFunctions(t *testing.T) {
tests := []struct {
src string
testExpr string
baseCtx *hcl.EvalContext
want cty.Value
diagCount int
}{
{
`
function "greet" {
params = [name]
result = "Hello, ${name}."
}
`,
`greet("Ermintrude")`,
nil,
cty.StringVal("Hello, Ermintrude."),
0,
},
{
`
function "greet" {
params = [name]
result = "Hello, ${name}."
}
`,
`greet()`,
nil,
cty.DynamicVal,
1, // missing value for "name"
},
{
`
function "greet" {
params = [name]
result = "Hello, ${name}."
}
`,
`greet("Ermintrude", "extra")`,
nil,
cty.DynamicVal,
1, // too many arguments
},
{
`
function "add" {
params = [a, b]
result = a + b
}
`,
`add(1, 5)`,
nil,
cty.NumberIntVal(6),
0,
},
{
`
function "argstuple" {
params = []
variadic_param = args
result = args
}
`,
`argstuple("a", true, 1)`,
nil,
cty.TupleVal([]cty.Value{cty.StringVal("a"), cty.True, cty.NumberIntVal(1)}),
0,
},
{
`
function "missing_var" {
params = []
result = nonexist
}
`,
`missing_var()`,
nil,
cty.DynamicVal,
1, // no variable named "nonexist"
},
{
`
function "closure" {
params = []
result = upvalue
}
`,
`closure()`,
&hcl.EvalContext{
Variables: map[string]cty.Value{
"upvalue": cty.True,
},
},
cty.True,
0,
},
{
`
function "neg" {
params = [val]
result = -val
}
function "add" {
params = [a, b]
result = a + b
}
`,
`neg(add(1, 3))`,
nil,
cty.NumberIntVal(-4),
0,
},
{
`
function "neg" {
parrams = [val]
result = -val
}
`,
`null`,
nil,
cty.NullVal(cty.DynamicPseudoType),
2, // missing attribute "params", and unknown attribute "parrams"
},
}
for i, test := range tests {
t.Run(fmt.Sprintf("%02d", i), func(t *testing.T) {
f, diags := hclsyntax.ParseConfig([]byte(test.src), "config", hcl.Pos{Line: 1, Column: 1})
if f == nil || f.Body == nil {
t.Fatalf("got nil file or body")
}
funcs, _, funcsDiags := decodeUserFunctions(f.Body, "function", func() *hcl.EvalContext {
return test.baseCtx
})
diags = append(diags, funcsDiags...)
expr, exprParseDiags := hclsyntax.ParseExpression([]byte(test.testExpr), "testexpr", hcl.Pos{Line: 1, Column: 1})
diags = append(diags, exprParseDiags...)
if expr == nil {
t.Fatalf("parsing test expr returned nil")
}
got, exprDiags := expr.Value(&hcl.EvalContext{
Functions: funcs,
})
diags = append(diags, exprDiags...)
if len(diags) != test.diagCount {
t.Errorf("wrong number of diagnostics %d; want %d", len(diags), test.diagCount)
for _, diag := range diags {
t.Logf("- %s", diag)
}
}
if !got.RawEquals(test.want) {
t.Errorf("wrong result\ngot: %#v\nwant: %#v", got, test.want)
}
})
}
}

22
ext/userfunc/doc.go Normal file
View File

@ -0,0 +1,22 @@
// Package userfunc implements a HCL extension that allows user-defined
// functions in HCL configuration.
//
// Using this extension requires some integration effort on the part of the
// calling application, to pass any declared functions into a HCL evaluation
// context after processing.
//
// The function declaration syntax looks like this:
//
// function "foo" {
// params = ["name"]
// result = "Hello, ${name}!"
// }
//
// When a user-defined function is called, the expression given for the "result"
// attribute is evaluated in an isolated evaluation context that defines variables
// named after the given parameter names.
//
// The block name "function" may be overridden by the calling application, if
// that default name conflicts with an existing block or attribute name in
// the application.
package userfunc

42
ext/userfunc/public.go Normal file
View File

@ -0,0 +1,42 @@
package userfunc
import (
"github.com/hashicorp/hcl2/hcl"
"github.com/zclconf/go-cty/cty/function"
)
// A ContextFunc is a callback used to produce the base EvalContext for
// running a particular set of functions.
//
// This is a function rather than an EvalContext directly to allow functions
// to be decoded before their context is complete. This will be true, for
// example, for applications that wish to allow functions to refer to themselves.
//
// The simplest use of a ContextFunc is to give user functions access to the
// same global variables and functions available elsewhere in an application's
// configuration language, but more complex applications may use different
// contexts to support lexical scoping depending on where in a configuration
// structure a function declaration is found, etc.
type ContextFunc func() *hcl.EvalContext
// DecodeUserFunctions looks for blocks of the given type in the given body
// and, for each one found, interprets it as a custom function definition.
//
// On success, the result is a mapping of function names to implementations,
// along with a new body that represents the remaining content of the given
// body which can be used for further processing.
//
// The result expression of each function is parsed during decoding but not
// evaluated until the function is called.
//
// If the given ContextFunc is non-nil, it will be called to obtain the
// context in which the function result expressions will be evaluated. If nil,
// or if it returns nil, the result expression will have access only to
// variables named after the declared parameters. A non-nil context turns
// the returned functions into closures, bound to the given context.
//
// If the returned diagnostics set has errors then the function map and
// remain body may be nil or incomplete.
func DecodeUserFunctions(body hcl.Body, blockType string, context ContextFunc) (funcs map[string]function.Function, remain hcl.Body, diags hcl.Diagnostics) {
return decodeUserFunctions(body, blockType, context)
}

View File

@ -0,0 +1,97 @@
{
"fileTypes": [
"hcl",
"hcldec"
],
"name": "HCL",
"patterns": [
{
"begin": "#|//",
"captures": {
"0": {
"name": "punctuation.definition.comment.hcl"
}
},
"comment": "Comments",
"end": "$\\n?",
"name": "comment.line.hcl"
},
{
"begin": "/\\*",
"captures": {
"0": {
"name": "punctuation.definition.comment.hcl"
}
},
"comment": "Block comments",
"end": "\\*/",
"name": "comment.block.hcl"
},
{
"begin": "{",
"beginCaptures": {
"0": {
"name": "punctuation.definition.block.hcl"
}
},
"comment": "Nested Blocks",
"end": "}",
"endCaptures": {
"0": {
"name": "punctuation.definition.block.hcl"
}
},
"name": "meta.block.hcl",
"patterns": [
{
"include": "$self"
}
]
},
{
"captures": {
"1": {
"name": "string.hcl punctuation.definition.string.begin.hcl"
},
"2": {
"name": "string.value.hcl"
},
"3": {
"name": "string.hcl punctuation.definition.string.end.hcl"
}
},
"comment": "Quoted Block Labels",
"match": "(\")([^\"]+)(\")"
},
{
"begin": "(\\w+)\\s*(=)\\s*",
"beginCaptures": {
"1": {
"name": "variable.other.assignment.hcl"
},
"2": {
"name": "keyword.operator.hcl"
}
},
"comment": "Attribute Definitions",
"end": "$",
"name": "meta.attr.hcl",
"patterns": [
{
"include": "source.hclexpr"
}
]
},
{
"captures": {
"0": {
"name": "keyword.other.hcl"
}
},
"comment": "Keywords",
"match": "[-\\w]+"
}
],
"scopeName": "source.hcl",
"uuid": "55e8075d-e2e3-4e44-8446-744a9860e476"
}

157
extras/grammar/HCL.tmLanguage Executable file
View File

@ -0,0 +1,157 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>fileTypes</key>
<array>
<string>hcl</string>
<string>hcldec</string>
</array>
<key>name</key>
<string>HCL</string>
<key>patterns</key>
<array>
<dict>
<key>begin</key>
<string>#|//</string>
<key>captures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>punctuation.definition.comment.hcl</string>
</dict>
</dict>
<key>comment</key>
<string>Comments</string>
<key>end</key>
<string>$\n?</string>
<key>name</key>
<string>comment.line.hcl</string>
</dict>
<dict>
<key>begin</key>
<string>/\*</string>
<key>captures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>punctuation.definition.comment.hcl</string>
</dict>
</dict>
<key>comment</key>
<string>Block comments</string>
<key>end</key>
<string>\*/</string>
<key>name</key>
<string>comment.block.hcl</string>
</dict>
<dict>
<key>begin</key>
<string>{</string>
<key>beginCaptures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>punctuation.definition.block.hcl</string>
</dict>
</dict>
<key>comment</key>
<string>Nested Blocks</string>
<key>end</key>
<string>}</string>
<key>endCaptures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>punctuation.definition.block.hcl</string>
</dict>
</dict>
<key>name</key>
<string>meta.block.hcl</string>
<key>patterns</key>
<array>
<dict>
<key>include</key>
<string>$self</string>
</dict>
</array>
</dict>
<dict>
<key>captures</key>
<dict>
<key>1</key>
<dict>
<key>name</key>
<string>string.hcl punctuation.definition.string.begin.hcl</string>
</dict>
<key>2</key>
<dict>
<key>name</key>
<string>string.value.hcl</string>
</dict>
<key>3</key>
<dict>
<key>name</key>
<string>string.hcl punctuation.definition.string.end.hcl</string>
</dict>
</dict>
<key>comment</key>
<string>Quoted Block Labels</string>
<key>match</key>
<string>(&#34;)([^&#34;]+)(&#34;)</string>
</dict>
<dict>
<key>begin</key>
<string>(\w+)\s*(=)\s*</string>
<key>beginCaptures</key>
<dict>
<key>1</key>
<dict>
<key>name</key>
<string>variable.other.assignment.hcl</string>
</dict>
<key>2</key>
<dict>
<key>name</key>
<string>keyword.operator.hcl</string>
</dict>
</dict>
<key>comment</key>
<string>Attribute Definitions</string>
<key>end</key>
<string>$</string>
<key>name</key>
<string>meta.attr.hcl</string>
<key>patterns</key>
<array>
<dict>
<key>include</key>
<string>source.hclexpr</string>
</dict>
</array>
</dict>
<dict>
<key>captures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>keyword.other.hcl</string>
</dict>
</dict>
<key>comment</key>
<string>Keywords</string>
<key>match</key>
<string>[-\w]+</string>
</dict>
</array>
<key>scopeName</key>
<string>source.hcl</string>
<key>uuid</key>
<string>55e8075d-e2e3-4e44-8446-744a9860e476</string>
</dict>
</plist>

View File

@ -0,0 +1,53 @@
name: HCL
scopeName: source.hcl
fileTypes: [hcl, hcldec]
uuid: 55e8075d-e2e3-4e44-8446-744a9860e476
patterns:
- comment: Comments
name: comment.line.hcl
begin: '#|//'
end: $\n?
captures:
'0': {name: punctuation.definition.comment.hcl}
- comment: Block comments
name: comment.block.hcl
begin: /\*
end: \*/
captures:
'0': {name: punctuation.definition.comment.hcl}
- comment: Nested Blocks
name: meta.block.hcl
begin: "{"
beginCaptures:
'0': {name: punctuation.definition.block.hcl}
end: "}"
endCaptures:
'0': {name: punctuation.definition.block.hcl}
patterns:
- include: "$self"
- comment: Quoted Block Labels
match: '(")([^"]+)(")'
captures:
'1': {name: string.hcl punctuation.definition.string.begin.hcl}
'2': {name: string.value.hcl}
'3': {name: string.hcl punctuation.definition.string.end.hcl}
- comment: Attribute Definitions
name: meta.attr.hcl
begin: '(\w+)\s*(=)\s*'
beginCaptures:
'1': {name: variable.other.assignment.hcl}
'2': {name: keyword.operator.hcl}
end: '$'
patterns:
- include: "source.hclexpr"
- comment: Keywords
match: '[-\w]+'
captures:
'0': {name: keyword.other.hcl}

View File

@ -0,0 +1,212 @@
{
"fileTypes": [],
"name": "HCL Expression",
"patterns": [
{
"begin": "#|//",
"captures": {
"0": {
"name": "punctuation.definition.comment.hcl"
}
},
"comment": "Comments",
"end": "$\\n?",
"name": "comment.line.hcl"
},
{
"begin": "/\\*",
"captures": {
"0": {
"name": "punctuation.definition.comment.hcl"
}
},
"comment": "Block comments",
"end": "\\*/",
"name": "comment.block.hcl"
},
{
"comment": "Language constants (true, false, null)",
"match": "\\b(true|false|null)\\b",
"name": "constant.language.hcl"
},
{
"comment": "Numbers",
"match": "\\b([0-9]+)(.[0-9]+)?([eE][0-9]+)?\\b",
"name": "constant.numeric.hcl"
},
{
"begin": "([-\\w]+)(\\()",
"beginCaptures": {
"1": {
"name": "keyword.other.function.inline.hcl"
},
"2": {
"name": "keyword.other.section.begin.hcl"
}
},
"comment": "Function Calls",
"end": "(\\))",
"endCaptures": {
"1": {
"name": "keyword.other.section.end.hcl"
}
},
"patterns": [
{
"include": "$self"
}
]
},
{
"captures": {
"0": {
"name": "variable.other.hcl"
}
},
"comment": "Variables and Attribute Names",
"match": "[-\\w]+"
},
{
"begin": "(?\u003e\\s*\u003c\u003c(\\w+))",
"beginCaptures": {
"0": {
"name": "punctuation.definition.string.begin.hcl"
},
"1": {
"name": "keyword.operator.heredoc.hcl"
}
},
"comment": "Heredoc Templates",
"end": "^\\s*\\1$",
"endCaptures": {
"0": {
"name": "punctuation.definition.string.end.hcl keyword.operator.heredoc.hcl"
}
},
"patterns": [
{
"include": "source.hcltemplate"
}
]
},
{
"begin": "\\\"",
"beginCaptures": {
"0": {
"name": "string.hcl punctuation.definition.string.begin.hcl"
}
},
"comment": "String Templates",
"end": "\\\"",
"endCaptures": {
"0": {
"name": "string.hcl punctuation.definition.string.end.hcl"
}
},
"patterns": [
{
"include": "source.hcltemplate"
},
{
"match": "(^\"|$\\{|%\\{)+",
"name": "string.quoted.double.hcl"
}
]
},
{
"captures": {
"0": {
"name": "keyword.operator.hcl"
}
},
"comment": "Operators",
"match": "(!=|==|\u003e=|\u003c=|\u0026\u0026|\\|\\||[-+*/%\u003c\u003e!?:])"
},
{
"begin": "\\(",
"beginCaptures": {
"0": {
"name": "meta.brace.round.hcl"
}
},
"comment": "Parentheses",
"end": "\\)",
"endCaptures": {
"0": {
"name": "meta.brace.round.hcl"
}
},
"patterns": [
{
"include": "$self"
}
]
},
{
"begin": "\\[",
"beginCaptures": {
"0": {
"name": "meta.brace.square.hcl"
}
},
"comment": "Tuple Constructor",
"end": "\\]",
"endCaptures": {
"0": {
"name": "meta.brace.square.hcl"
}
},
"patterns": [
{
"captures": {
"0": {
"name": "keyword.control.hcl"
}
},
"match": "(for|in)"
},
{
"include": "$self"
}
]
},
{
"begin": "\\{",
"beginCaptures": {
"0": {
"name": "meta.brace.curly.hcl"
}
},
"comment": "Object Constructor",
"end": "\\}",
"endCaptures": {
"0": {
"name": "meta.brace.curly.hcl"
}
},
"patterns": [
{
"captures": {
"0": {
"name": "keyword.control.hcl"
}
},
"match": "(for|in)"
},
{
"captures": {
"0": {
"name": "keyword.operator.hcl"
}
},
"match": "(=\u003e|\\.\\.\\.)"
},
{
"include": "$self"
}
]
}
],
"scopeName": "source.hclexpr",
"uuid": "6c358551-0381-4128-9ea3-277b21943b5c"
}

View File

@ -0,0 +1,336 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>fileTypes</key>
<array>
</array>
<key>name</key>
<string>HCL Expression</string>
<key>patterns</key>
<array>
<dict>
<key>begin</key>
<string>#|//</string>
<key>captures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>punctuation.definition.comment.hcl</string>
</dict>
</dict>
<key>comment</key>
<string>Comments</string>
<key>end</key>
<string>$\n?</string>
<key>name</key>
<string>comment.line.hcl</string>
</dict>
<dict>
<key>begin</key>
<string>/\*</string>
<key>captures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>punctuation.definition.comment.hcl</string>
</dict>
</dict>
<key>comment</key>
<string>Block comments</string>
<key>end</key>
<string>\*/</string>
<key>name</key>
<string>comment.block.hcl</string>
</dict>
<dict>
<key>comment</key>
<string>Language constants (true, false, null)</string>
<key>match</key>
<string>\b(true|false|null)\b</string>
<key>name</key>
<string>constant.language.hcl</string>
</dict>
<dict>
<key>comment</key>
<string>Numbers</string>
<key>match</key>
<string>\b([0-9]+)(\.[0-9]+)?([eE][0-9]+)?\b</string>
<key>name</key>
<string>constant.numeric.hcl</string>
</dict>
<dict>
<key>begin</key>
<string>([-\w]+)(\()</string>
<key>beginCaptures</key>
<dict>
<key>1</key>
<dict>
<key>name</key>
<string>keyword.other.function.inline.hcl</string>
</dict>
<key>2</key>
<dict>
<key>name</key>
<string>keyword.other.section.begin.hcl</string>
</dict>
</dict>
<key>comment</key>
<string>Function Calls</string>
<key>end</key>
<string>(\))</string>
<key>endCaptures</key>
<dict>
<key>1</key>
<dict>
<key>name</key>
<string>keyword.other.section.end.hcl</string>
</dict>
</dict>
<key>patterns</key>
<array>
<dict>
<key>include</key>
<string>$self</string>
</dict>
</array>
</dict>
<dict>
<key>captures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>variable.other.hcl</string>
</dict>
</dict>
<key>comment</key>
<string>Variables and Attribute Names</string>
<key>match</key>
<string>[-\w]+</string>
</dict>
<dict>
<key>begin</key>
<string>(?&gt;\s*&lt;&lt;(\w+))</string>
<key>beginCaptures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>punctuation.definition.string.begin.hcl</string>
</dict>
<key>1</key>
<dict>
<key>name</key>
<string>keyword.operator.heredoc.hcl</string>
</dict>
</dict>
<key>comment</key>
<string>Heredoc Templates</string>
<key>end</key>
<string>^\s*\1$</string>
<key>endCaptures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>punctuation.definition.string.end.hcl keyword.operator.heredoc.hcl</string>
</dict>
</dict>
<key>patterns</key>
<array>
<dict>
<key>include</key>
<string>source.hcltemplate</string>
</dict>
</array>
</dict>
<dict>
<key>begin</key>
<string>\&#34;</string>
<key>beginCaptures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>string.hcl punctuation.definition.string.begin.hcl</string>
</dict>
</dict>
<key>comment</key>
<string>String Templates</string>
<key>end</key>
<string>\&#34;</string>
<key>endCaptures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>string.hcl punctuation.definition.string.end.hcl</string>
</dict>
</dict>
<key>patterns</key>
<array>
<dict>
<key>include</key>
<string>source.hcltemplate</string>
</dict>
<dict>
<key>match</key>
<string>(^&#34;|$\{|%\{)+</string>
<key>name</key>
<string>string.quoted.double.hcl</string>
</dict>
</array>
</dict>
<dict>
<key>captures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>keyword.operator.hcl</string>
</dict>
</dict>
<key>comment</key>
<string>Operators</string>
<key>match</key>
<string>(!=|==|&gt;=|&lt;=|&amp;&amp;|\|\||[-+*/%&lt;&gt;!?:])</string>
</dict>
<dict>
<key>begin</key>
<string>\(</string>
<key>beginCaptures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>meta.brace.round.hcl</string>
</dict>
</dict>
<key>comment</key>
<string>Parentheses</string>
<key>end</key>
<string>\)</string>
<key>endCaptures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>meta.brace.round.hcl</string>
</dict>
</dict>
<key>patterns</key>
<array>
<dict>
<key>include</key>
<string>$self</string>
</dict>
</array>
</dict>
<dict>
<key>begin</key>
<string>\[</string>
<key>beginCaptures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>meta.brace.square.hcl</string>
</dict>
</dict>
<key>comment</key>
<string>Tuple Constructor</string>
<key>end</key>
<string>\]</string>
<key>endCaptures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>meta.brace.square.hcl</string>
</dict>
</dict>
<key>patterns</key>
<array>
<dict>
<key>captures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>keyword.control.hcl</string>
</dict>
</dict>
<key>match</key>
<string>(for|in)</string>
</dict>
<dict>
<key>include</key>
<string>$self</string>
</dict>
</array>
</dict>
<dict>
<key>begin</key>
<string>\{</string>
<key>beginCaptures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>meta.brace.curly.hcl</string>
</dict>
</dict>
<key>comment</key>
<string>Object Constructor</string>
<key>end</key>
<string>\}</string>
<key>endCaptures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>meta.brace.curly.hcl</string>
</dict>
</dict>
<key>patterns</key>
<array>
<dict>
<key>captures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>keyword.control.hcl</string>
</dict>
</dict>
<key>match</key>
<string>(for|in)</string>
</dict>
<dict>
<key>captures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>keyword.operator.hcl</string>
</dict>
</dict>
<key>match</key>
<string>(=&gt;|\.\.\.)</string>
</dict>
<dict>
<key>include</key>
<string>$self</string>
</dict>
</array>
</dict>
</array>
<key>scopeName</key>
<string>source.hclexpr</string>
<key>uuid</key>
<string>6c358551-0381-4128-9ea3-277b21943b5c</string>
</dict>
</plist>

View File

@ -0,0 +1,111 @@
name: HCL Expression
scopeName: source.hclexpr
fileTypes: []
uuid: 6c358551-0381-4128-9ea3-277b21943b5c
patterns:
- comment: Comments
name: comment.line.hcl
begin: '#|//'
end: $\n?
captures:
'0': {name: punctuation.definition.comment.hcl}
- comment: Block comments
name: comment.block.hcl
begin: /\*
end: \*/
captures:
'0': {name: punctuation.definition.comment.hcl}
- comment: Language constants (true, false, null)
name: constant.language.hcl
match: \b(true|false|null)\b
- comment: Numbers
name: constant.numeric.hcl
match: \b([0-9]+)(\.[0-9]+)?([eE][0-9]+)?\b
- comment: Function Calls
begin: ([-\w]+)(\()
beginCaptures:
'1': {name: keyword.other.function.inline.hcl}
'2': {name: keyword.other.section.begin.hcl}
end: (\))
endCaptures:
'1': {name: keyword.other.section.end.hcl}
patterns:
- include: '$self'
- comment: Variables and Attribute Names
match: '[-\w]+'
captures:
'0': {name: variable.other.hcl}
- comment: Heredoc Templates
begin: (?>\s*<<(\w+))
beginCaptures:
'0': {name: punctuation.definition.string.begin.hcl}
'1': {name: keyword.operator.heredoc.hcl}
end: ^\s*\1$
endCaptures:
'0': {name: punctuation.definition.string.end.hcl keyword.operator.heredoc.hcl}
patterns:
- include: 'source.hcltemplate'
- comment: String Templates
begin: \"
beginCaptures:
'0': {name: string.hcl punctuation.definition.string.begin.hcl}
end: \"
endCaptures:
'0': {name: string.hcl punctuation.definition.string.end.hcl}
patterns:
- include: 'source.hcltemplate'
- match: '(^"|$\{|%\{)+'
name: "string.quoted.double.hcl"
- comment: Operators
match: '(!=|==|>=|<=|&&|\|\||[-+*/%<>!?:])'
captures:
'0': {name: keyword.operator.hcl}
- comment: Parentheses
begin: '\('
beginCaptures:
'0': {name: meta.brace.round.hcl}
end: '\)'
endCaptures:
'0': {name: meta.brace.round.hcl}
patterns:
- include: '$self'
- comment: Tuple Constructor
begin: '\['
beginCaptures:
'0': {name: meta.brace.square.hcl}
end: '\]'
endCaptures:
'0': {name: meta.brace.square.hcl}
patterns:
- match: '(for|in)'
captures:
'0': {name: keyword.control.hcl}
- include: '$self'
- comment: Object Constructor
begin: '\{'
beginCaptures:
'0': {name: meta.brace.curly.hcl}
end: '\}'
endCaptures:
'0': {name: meta.brace.curly.hcl}
patterns:
- match: '(for|in)'
captures:
'0': {name: keyword.control.hcl}
- match: '(=>|\.\.\.)'
captures:
'0': {name: keyword.operator.hcl}
- include: '$self'

View File

@ -0,0 +1,107 @@
{
"fileTypes": [
"tmpl"
],
"name": "HCL Template",
"patterns": [
{
"begin": "[^\\$]?(\\$\\{~?)",
"beginCaptures": {
"1": {
"name": "entity.tag.embedded.start.hcltemplate"
}
},
"comment": "Interpolation Sequences",
"end": "~?}",
"endCaptures": {
"0": {
"name": "entity.tag.embedded.end.hcltemplate"
}
},
"name": "meta.interp.hcltemplate",
"patterns": [
{
"include": "source.hclexpr"
}
]
},
{
"begin": "[^\\%]?(\\%\\{~?)",
"beginCaptures": {
"1": {
"name": "entity.tag.embedded.start.hcltemplate"
}
},
"comment": "Control Sequences",
"end": "~?}",
"endCaptures": {
"0": {
"name": "entity.tag.embedded.end.hcltemplate"
}
},
"name": "meta.control.hcltemplate",
"patterns": [
{
"include": "#templateif"
},
{
"include": "#templatefor"
},
{
"include": "#templatesimplekw"
}
]
}
],
"repository": {
"templatefor": {
"begin": "(for)\\s*(\\w+)\\s*(,\\s*(\\w+)\\s*)?(in)",
"beginCaptures": {
"1": {
"name": "keyword.control.hcltemplate"
},
"2": {
"name": "variable.other.hcl"
},
"4": {
"name": "variable.other.hcl"
},
"5": {
"name": "keyword.control.hcltemplate"
}
},
"end": "(?=~?\\})",
"name": "meta.templatefor.hcltemplate",
"patterns": [
{
"include": "source.hclexpr"
}
]
},
"templateif": {
"begin": "(if)\\s*",
"beginCaptures": {
"1": {
"name": "keyword.control.hcltemplate"
}
},
"end": "(?=~?\\})",
"name": "meta.templateif.hcltemplate",
"patterns": [
{
"include": "source.hclexpr"
}
]
},
"templatesimplekw": {
"captures": {
"0": {
"name": "keyword.control.hcl"
}
},
"match": "(else|endif|endfor)"
}
},
"scopeName": "source.hcltemplate",
"uuid": "ac6be18e-d44f-4a73-bd8f-b973fd26df05"
}

View File

@ -0,0 +1,172 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>fileTypes</key>
<array>
<string>tmpl</string>
</array>
<key>name</key>
<string>HCL Template</string>
<key>patterns</key>
<array>
<dict>
<key>begin</key>
<string>[^\$]?(\$\{~?)</string>
<key>beginCaptures</key>
<dict>
<key>1</key>
<dict>
<key>name</key>
<string>entity.tag.embedded.start.hcltemplate</string>
</dict>
</dict>
<key>comment</key>
<string>Interpolation Sequences</string>
<key>end</key>
<string>~?}</string>
<key>endCaptures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>entity.tag.embedded.end.hcltemplate</string>
</dict>
</dict>
<key>name</key>
<string>meta.interp.hcltemplate</string>
<key>patterns</key>
<array>
<dict>
<key>include</key>
<string>source.hclexpr</string>
</dict>
</array>
</dict>
<dict>
<key>begin</key>
<string>[^\%]?(\%\{~?)</string>
<key>beginCaptures</key>
<dict>
<key>1</key>
<dict>
<key>name</key>
<string>entity.tag.embedded.start.hcltemplate</string>
</dict>
</dict>
<key>comment</key>
<string>Control Sequences</string>
<key>end</key>
<string>~?}</string>
<key>endCaptures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>entity.tag.embedded.end.hcltemplate</string>
</dict>
</dict>
<key>name</key>
<string>meta.control.hcltemplate</string>
<key>patterns</key>
<array>
<dict>
<key>include</key>
<string>#templateif</string>
</dict>
<dict>
<key>include</key>
<string>#templatefor</string>
</dict>
<dict>
<key>include</key>
<string>#templatesimplekw</string>
</dict>
</array>
</dict>
</array>
<key>repository</key>
<dict>
<key>templatefor</key>
<dict>
<key>begin</key>
<string>(for)\s*(\w+)\s*(,\s*(\w+)\s*)?(in)</string>
<key>beginCaptures</key>
<dict>
<key>1</key>
<dict>
<key>name</key>
<string>keyword.control.hcltemplate</string>
</dict>
<key>2</key>
<dict>
<key>name</key>
<string>variable.other.hcl</string>
</dict>
<key>4</key>
<dict>
<key>name</key>
<string>variable.other.hcl</string>
</dict>
<key>5</key>
<dict>
<key>name</key>
<string>keyword.control.hcltemplate</string>
</dict>
</dict>
<key>end</key>
<string>(?=~?\})</string>
<key>name</key>
<string>meta.templatefor.hcltemplate</string>
<key>patterns</key>
<array>
<dict>
<key>include</key>
<string>source.hclexpr</string>
</dict>
</array>
</dict>
<key>templateif</key>
<dict>
<key>begin</key>
<string>(if)\s*</string>
<key>beginCaptures</key>
<dict>
<key>1</key>
<dict>
<key>name</key>
<string>keyword.control.hcltemplate</string>
</dict>
</dict>
<key>end</key>
<string>(?=~?\})</string>
<key>name</key>
<string>meta.templateif.hcltemplate</string>
<key>patterns</key>
<array>
<dict>
<key>include</key>
<string>source.hclexpr</string>
</dict>
</array>
</dict>
<key>templatesimplekw</key>
<dict>
<key>captures</key>
<dict>
<key>0</key>
<dict>
<key>name</key>
<string>keyword.control.hcl</string>
</dict>
</dict>
<key>match</key>
<string>(else|endif|endfor)</string>
</dict>
</dict>
<key>scopeName</key>
<string>source.hcltemplate</string>
<key>uuid</key>
<string>ac6be18e-d44f-4a73-bd8f-b973fd26df05</string>
</dict>
</plist>

View File

@ -0,0 +1,58 @@
name: HCL Template
scopeName: source.hcltemplate
fileTypes: [tmpl]
uuid: ac6be18e-d44f-4a73-bd8f-b973fd26df05
patterns:
- comment: Interpolation Sequences
name: meta.interp.hcltemplate
begin: '[^\$]?(\$\{~?)'
beginCaptures:
'1': {name: entity.tag.embedded.start.hcltemplate}
end: '~?}'
endCaptures:
'0': {name: entity.tag.embedded.end.hcltemplate}
patterns:
- include: "source.hclexpr"
- comment: Control Sequences
name: meta.control.hcltemplate
begin: '[^\%]?(\%\{~?)'
beginCaptures:
'1': {name: entity.tag.embedded.start.hcltemplate}
end: '~?}'
endCaptures:
'0': {name: entity.tag.embedded.end.hcltemplate}
patterns:
- include: "#templateif"
- include: "#templatefor"
- include: "#templatesimplekw"
repository:
templateif:
name: meta.templateif.hcltemplate
begin: '(if)\s*'
beginCaptures:
'1': {name: keyword.control.hcltemplate}
end: '(?=~?\})'
patterns:
- include: "source.hclexpr"
templatefor:
name: meta.templatefor.hcltemplate
begin: '(for)\s*(\w+)\s*(,\s*(\w+)\s*)?(in)'
beginCaptures:
'1': {name: keyword.control.hcltemplate}
'2': {name: variable.other.hcl}
'4': {name: variable.other.hcl}
'5': {name: keyword.control.hcltemplate}
end: '(?=~?\})'
patterns:
- include: "source.hclexpr"
templatesimplekw:
match: (else|endif|endfor)
captures:
'0': {name: keyword.control.hcl}

119
extras/grammar/build.go Normal file
View File

@ -0,0 +1,119 @@
// This is a helper to transform the HCL.yaml-tmLanguage file (the source of
// record) into both HCL.json-tmLanguage and HCL.tmLanguage (in plist XML
// format).
//
// Run this after making updates to HCL.yaml-tmLanguage to generate the other
// formats.
//
// This file is intended to be run with "go run":
//
// go run ./build.go
//
// This file is also set up to run itself under "go generate":
//
// go generate .
package main
//go:generate go run ./build.go
import (
"encoding/json"
"fmt"
"io/ioutil"
"log"
"os"
yaml "gopkg.in/yaml.v2"
plist "howett.net/plist"
multierror "github.com/hashicorp/go-multierror"
)
func main() {
err := realMain()
if err != nil {
log.Fatal(err)
}
os.Exit(0)
}
func realMain() error {
var err error
buildErr := build("HCL")
if buildErr != nil {
err = multierror.Append(err, fmt.Errorf("in HCL: %s", buildErr))
}
buildErr = build("HCLTemplate")
if buildErr != nil {
err = multierror.Append(err, fmt.Errorf("in HCLTemplate: %s", buildErr))
}
buildErr = build("HCLExpression")
if buildErr != nil {
err = multierror.Append(err, fmt.Errorf("in HCLExpression: %s", buildErr))
}
return err
}
func build(basename string) error {
yamlSrc, err := ioutil.ReadFile(basename + ".yaml-tmLanguage")
if err != nil {
return err
}
var content interface{}
err = yaml.Unmarshal(yamlSrc, &content)
if err != nil {
return err
}
// Normalize the value so it's both JSON- and plist-friendly.
content = prepare(content)
jsonSrc, err := json.MarshalIndent(content, "", " ")
if err != nil {
return err
}
plistSrc, err := plist.MarshalIndent(content, plist.XMLFormat, " ")
if err != nil {
return err
}
err = ioutil.WriteFile(basename+".json-tmLanguage", jsonSrc, os.ModePerm)
if err != nil {
return err
}
err = ioutil.WriteFile(basename+".tmLanguage", plistSrc, os.ModePerm)
if err != nil {
return err
}
return nil
}
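// prepare normalizes a value decoded from YAML so that it can be encoded
// both as JSON and as an XML plist: it recursively converts the
// map[interface{}]interface{} values produced by the YAML decoder into
// map[string]interface{}.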
func prepare(v interface{}) interface{} {
switch tv := v.(type) {
case map[interface{}]interface{}:
var ret map[string]interface{}
if len(tv) == 0 {
return ret
}
ret = make(map[string]interface{}, len(tv))
for k, v := range tv {
ret[k.(string)] = prepare(v)
}
return ret
case []interface{}:
for i := range tv {
tv[i] = prepare(tv[i])
}
return tv
default:
return v
}
}

33
go.mod
View File

@ -1,3 +1,32 @@
module github.com/hashicorp/hcl
// WARNING: This module will move to a new path when it transitions from
// being "experimental" to being released.
module github.com/hashicorp/hcl2
require github.com/davecgh/go-spew v1.1.1
require (
github.com/agext/levenshtein v1.2.1
github.com/apparentlymart/go-dump v0.0.0-20180507223929-23540a00eaa3
github.com/apparentlymart/go-textseg v1.0.0
github.com/bsm/go-vlq v0.0.0-20150828105119-ec6e8d4f5f4e
github.com/davecgh/go-spew v1.1.1
github.com/go-test/deep v1.0.3
github.com/google/go-cmp v0.2.0
github.com/hashicorp/errwrap v0.0.0-20180715044906-d6c0cd880357 // indirect
github.com/hashicorp/go-multierror v0.0.0-20180717150148-3d5d8f294aa0
github.com/kr/pretty v0.1.0
github.com/kylelemons/godebug v0.0.0-20170820004349-d65d576e9348
github.com/mitchellh/go-wordwrap v0.0.0-20150314170334-ad45545899c7
github.com/onsi/ginkgo v1.7.0 // indirect
github.com/onsi/gomega v1.4.3 // indirect
github.com/pmezard/go-difflib v1.0.0 // indirect
github.com/sergi/go-diff v1.0.0
github.com/spf13/pflag v1.0.2
github.com/stretchr/testify v1.2.2 // indirect
github.com/zclconf/go-cty v1.0.0
golang.org/x/crypto v0.0.0-20190426145343-a29dc8fdc734
golang.org/x/net v0.0.0-20190502183928-7f726cade0ab // indirect
golang.org/x/sync v0.0.0-20190423024810-112230192c58 // indirect
golang.org/x/sys v0.0.0-20190502175342-a43fa875dd82 // indirect
golang.org/x/text v0.3.2 // indirect
gopkg.in/yaml.v2 v2.2.2
howett.net/plist v0.0.0-20181124034731-591f970eefbb
)

85
go.sum
View File

@ -1,2 +1,87 @@
github.com/agext/levenshtein v1.2.1 h1:QmvMAjj2aEICytGiWzmxoE0x2KZvE0fvmqMOfy2tjT8=
github.com/agext/levenshtein v1.2.1/go.mod h1:JEDfjyjHDjOF/1e4FlBE/PkbqA9OfWu2ki2W0IB5558=
github.com/apparentlymart/go-dump v0.0.0-20180507223929-23540a00eaa3 h1:ZSTrOEhiM5J5RFxEaFvMZVEAM1KvT1YzbEOwB2EAGjA=
github.com/apparentlymart/go-dump v0.0.0-20180507223929-23540a00eaa3/go.mod h1:oL81AME2rN47vu18xqj1S1jPIPuN7afo62yKTNn3XMM=
github.com/apparentlymart/go-textseg v1.0.0 h1:rRmlIsPEEhUTIKQb7T++Nz/A5Q6C9IuX2wFoYVvnCs0=
github.com/apparentlymart/go-textseg v1.0.0/go.mod h1:z96Txxhf3xSFMPmb5X/1W05FF/Nj9VFpLOpjS5yuumk=
github.com/bsm/go-vlq v0.0.0-20150828105119-ec6e8d4f5f4e h1:D64GF/Xr5zSUnM3q1Jylzo4sK7szhP/ON+nb2DB5XJA=
github.com/bsm/go-vlq v0.0.0-20150828105119-ec6e8d4f5f4e/go.mod h1:N+BjUcTjSxc2mtRGSCPsat1kze3CUtvJN3/jTXlp29k=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/fsnotify/fsnotify v1.4.7 h1:IXs+QLmnXW2CcXuY+8Mzv/fWEsPGWxqefPtCP5CnV9I=
github.com/fsnotify/fsnotify v1.4.7/go.mod h1:jwhsz4b93w/PPRr/qN1Yymfu8t87LnFCMoQvtojpjFo=
github.com/go-test/deep v1.0.3 h1:ZrJSEWsXzPOxaZnFteGEfooLba+ju3FYIbOrS+rQd68=
github.com/go-test/deep v1.0.3/go.mod h1:wGDj63lr65AM2AQyKZd/NYHGb0R+1RLqB8NKt3aSFNA=
github.com/golang/protobuf v1.1.0/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
github.com/golang/protobuf v1.2.0 h1:P3YflyNX/ehuJFLhxviNdFxQPkGK5cDcApsge1SqnvM=
github.com/golang/protobuf v1.2.0/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
github.com/google/go-cmp v0.2.0 h1:+dTQ8DZQJz0Mb/HjFlkptS1FeQ4cWSnN941F8aEG4SQ=
github.com/google/go-cmp v0.2.0/go.mod h1:oXzfMopK8JAjlY9xF4vHSVASa0yLyX7SntLO5aqRK0M=
github.com/hashicorp/errwrap v0.0.0-20180715044906-d6c0cd880357 h1:Rem2+U35z1QtPQc6r+WolF7yXiefXqDKyk+lN2pE164=
github.com/hashicorp/errwrap v0.0.0-20180715044906-d6c0cd880357/go.mod h1:YH+1FKiLXxHSkmPseP+kNlulaMuP3n2brvKWEqk/Jc4=
github.com/hashicorp/go-multierror v0.0.0-20180717150148-3d5d8f294aa0 h1:j30noezaCfvNLcdMYSvHLv81DxYRSt1grlpseG67vhU=
github.com/hashicorp/go-multierror v0.0.0-20180717150148-3d5d8f294aa0/go.mod h1:JMRHfdO9jKNzS/+BTlxCjKNQHg/jZAft8U7LloJvN7I=
github.com/hpcloud/tail v1.0.0 h1:nfCOvKYfkgYP8hkirhJocXT2+zOD8yUNjXaWfTlyFKI=
github.com/hpcloud/tail v1.0.0/go.mod h1:ab1qPbhIpdTxEkNHXyeSf5vhxWSCs/tWer42PpOxQnU=
github.com/jessevdk/go-flags v1.4.0/go.mod h1:4FA24M0QyGHXBuZZK/XkWh8h0e1EYbRYJSGM75WSRxI=
github.com/kr/pretty v0.1.0 h1:L/CwN0zerZDmRFUapSPitk6f+Q3+0za1rQkzVuMiMFI=
github.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=
github.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ=
github.com/kr/text v0.1.0 h1:45sCR5RtlFHMR4UwH9sdQ5TC8v0qDQCHnXt+kaKSTVE=
github.com/kr/text v0.1.0/go.mod h1:4Jbv+DJW3UT/LiOwJeYQe1efqtUx/iVham/4vfdArNI=
github.com/kylelemons/godebug v0.0.0-20170820004349-d65d576e9348 h1:MtvEpTB6LX3vkb4ax0b5D2DHbNAUsen0Gx5wZoq3lV4=
github.com/kylelemons/godebug v0.0.0-20170820004349-d65d576e9348/go.mod h1:B69LEHPfb2qLo0BaaOLcbitczOKLWTsrBG9LczfCD4k=
github.com/mitchellh/go-wordwrap v0.0.0-20150314170334-ad45545899c7 h1:DpOJ2HYzCv8LZP15IdmG+YdwD2luVPHITV96TkirNBM=
github.com/mitchellh/go-wordwrap v0.0.0-20150314170334-ad45545899c7/go.mod h1:ZXFpozHsX6DPmq2I0TCekCxypsnAUbP2oI0UX1GXzOo=
github.com/onsi/ginkgo v1.6.0/go.mod h1:lLunBs/Ym6LB5Z9jYTR76FiuTmxDTDusOGeTQH+WWjE=
github.com/onsi/ginkgo v1.7.0 h1:WSHQ+IS43OoUrWtD1/bbclrwK8TTH5hzp+umCiuxHgs=
github.com/onsi/ginkgo v1.7.0/go.mod h1:lLunBs/Ym6LB5Z9jYTR76FiuTmxDTDusOGeTQH+WWjE=
github.com/onsi/gomega v1.4.3 h1:RE1xgDvH7imwFD45h+u2SgIfERHlS2yNG4DObb5BSKU=
github.com/onsi/gomega v1.4.3/go.mod h1:ex+gbHU/CVuBBDIJjb2X0qEXbFg53c61hWP/1CpauHY=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/sergi/go-diff v1.0.0 h1:Kpca3qRNrduNnOQeazBd0ysaKrUJiIuISHxogkT9RPQ=
github.com/sergi/go-diff v1.0.0/go.mod h1:0CfEIISq7TuYL3j771MWULgwwjU+GofnZX9QAmXWZgo=
github.com/spf13/pflag v1.0.2 h1:Fy0orTDgHdbnzHcsOgfCN4LtHf0ec3wwtiwJqwvf3Gc=
github.com/spf13/pflag v1.0.2/go.mod h1:DYY7MBk1bdzusC3SYhjObp+wFpr4gzcvqqNjLnInEg4=
github.com/stretchr/testify v1.2.2 h1:bSDNvY7ZPG5RlJ8otE/7V6gMiyenm9RtJ7IUVIAoJ1w=
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
github.com/vmihailenco/msgpack v3.3.3+incompatible/go.mod h1:fy3FlTQTDXWkZ7Bh6AcGMlsjHatGryHQYUTf1ShIgkk=
github.com/zclconf/go-cty v1.0.0 h1:EWtv3gKe2wPLIB9hQRQJa7k/059oIfAqcEkCNnaVckk=
github.com/zclconf/go-cty v1.0.0/go.mod h1:xnAOWiHeOqg2nWS62VtQ7pbOu17FtxJNW8RLEih+O3s=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/crypto v0.0.0-20190426145343-a29dc8fdc734 h1:p/H982KKEjUnLJkM3tt/LemDnOc1GiZL5FCVlORJ5zo=
golang.org/x/crypto v0.0.0-20190426145343-a29dc8fdc734/go.mod h1:yigFU9vqHzYiE8UmvKecakEJjdnWj3jj499lnFckfCI=
golang.org/x/net v0.0.0-20180811021610-c39426892332/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20180906233101-161cd47e91fd/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20190404232315-eb5bcb51f2a3/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=
golang.org/x/net v0.0.0-20190502183928-7f726cade0ab h1:9RfW3ktsOZxgo9YNbBAjq1FWzc/igwEcUzZz8IXgSbk=
golang.org/x/net v0.0.0-20190502183928-7f726cade0ab/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=
golang.org/x/sync v0.0.0-20180314180146-1d60e4601c6f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20190423024810-112230192c58 h1:8gQV6CLnAEikrhgkHFbMAEhagSSnXWGV915qUMm9mrU=
golang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sys v0.0.0-20180909124046-d0be0721c37e h1:o3PsSEY8E4eXWkXrIP9YJALUkVZqzHJT5DOasTyn8Vs=
golang.org/x/sys v0.0.0-20180909124046-d0be0721c37e/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190412213103-97732733099d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20190502175342-a43fa875dd82 h1:vsphBvatvfbhlb4PO1BYSr9dzugGxJ/SQHoNufZJq1w=
golang.org/x/sys v0.0.0-20190502175342-a43fa875dd82/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/text v0.3.0 h1:g61tztE5qeGQ89tm6NTjjM9VPIm088od1l6aSorWRWg=
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/text v0.3.2 h1:tW2bmiBqwgJj/UpqtC8EpXEZVYOwU0yG4iWbprSVAcs=
golang.org/x/text v0.3.2/go.mod h1:bEr9sfX3Q8Zfm5fL9x+3itogRgK3+ptLWKqgva+5dAk=
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
google.golang.org/appengine v1.1.0/go.mod h1:EbEs0AVv82hx2wNQdGPgUI5lhzA/G0D9YwlJXL52JkM=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127 h1:qIbj1fsPNlZgppZ+VLlY7N33q108Sa+fhmuc+sWQYwY=
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/fsnotify.v1 v1.4.7 h1:xOHLXZwVvI9hhs+cLKq5+I5onOuwQLhQwiu63xxlHs4=
gopkg.in/fsnotify.v1 v1.4.7/go.mod h1:Tz8NjZHkW78fSQdbUxIjBTcgA1z1m8ZHf0WmKUhAMys=
gopkg.in/tomb.v1 v1.0.0-20141024135613-dd632973f1e7 h1:uRGJdciOHaEIrze2W8Q3AKkepLTh2hOroT7a+7czfdQ=
gopkg.in/tomb.v1 v1.0.0-20141024135613-dd632973f1e7/go.mod h1:dt/ZhP58zS4L8KSrWDmTeBkI65Dw0HsyUHuEVlX15mw=
gopkg.in/yaml.v2 v2.2.1 h1:mUhvW9EsL+naU5Q3cakzfE91YhliOondGd6ZrsDBHQE=
gopkg.in/yaml.v2 v2.2.1/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.2.2 h1:ZCJp+EgiOT7lHqUV2J862kp8Qj64Jo6az82+3Td9dZw=
gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
howett.net/plist v0.0.0-20181124034731-591f970eefbb h1:jhnBjNi9UFpfpl8YZhA9CrOqpnJdvzuiHsl/dnxl11M=
howett.net/plist v0.0.0-20181124034731-591f970eefbb/go.mod h1:vMygbs4qMhSZSc4lCUl2OEE+rDiIIJAIdR4m7MiMcm0=

304
gohcl/decode.go Normal file
View File

@ -0,0 +1,304 @@
package gohcl
import (
"fmt"
"reflect"
"github.com/zclconf/go-cty/cty"
"github.com/hashicorp/hcl2/hcl"
"github.com/zclconf/go-cty/cty/convert"
"github.com/zclconf/go-cty/cty/gocty"
)
// DecodeBody extracts the configuration within the given body into the given
// value. This value must be a non-nil pointer to either a struct or
// a map, where in the former case the configuration will be decoded using
// struct tags and in the latter case only attributes are allowed and their
// values are decoded into the map.
//
// The given EvalContext is used to resolve any variables or functions in
// expressions encountered while decoding. This may be nil to require only
// constant values, for simple applications that do not support variables or
// functions.
//
// The returned diagnostics should be inspected with its HasErrors method to
// determine if the populated value is valid and complete. If error diagnostics
// are returned then the given value may have been partially-populated but
// may still be accessed by a careful caller for static analysis and editor
// integration use-cases.
func DecodeBody(body hcl.Body, ctx *hcl.EvalContext, val interface{}) hcl.Diagnostics {
rv := reflect.ValueOf(val)
if rv.Kind() != reflect.Ptr {
panic(fmt.Sprintf("target value must be a pointer, not %s", rv.Type().String()))
}
return decodeBodyToValue(body, ctx, rv.Elem())
}
func decodeBodyToValue(body hcl.Body, ctx *hcl.EvalContext, val reflect.Value) hcl.Diagnostics {
et := val.Type()
switch et.Kind() {
case reflect.Struct:
return decodeBodyToStruct(body, ctx, val)
case reflect.Map:
return decodeBodyToMap(body, ctx, val)
default:
panic(fmt.Sprintf("target value must be pointer to struct or map, not %s", et.String()))
}
}
func decodeBodyToStruct(body hcl.Body, ctx *hcl.EvalContext, val reflect.Value) hcl.Diagnostics {
schema, partial := ImpliedBodySchema(val.Interface())
var content *hcl.BodyContent
var leftovers hcl.Body
var diags hcl.Diagnostics
if partial {
content, leftovers, diags = body.PartialContent(schema)
} else {
content, diags = body.Content(schema)
}
if content == nil {
return diags
}
tags := getFieldTags(val.Type())
if tags.Remain != nil {
fieldIdx := *tags.Remain
field := val.Type().Field(fieldIdx)
fieldV := val.Field(fieldIdx)
switch {
case bodyType.AssignableTo(field.Type):
fieldV.Set(reflect.ValueOf(leftovers))
case attrsType.AssignableTo(field.Type):
attrs, attrsDiags := leftovers.JustAttributes()
if len(attrsDiags) > 0 {
diags = append(diags, attrsDiags...)
}
fieldV.Set(reflect.ValueOf(attrs))
default:
diags = append(diags, decodeBodyToValue(leftovers, ctx, fieldV)...)
}
}
for name, fieldIdx := range tags.Attributes {
attr := content.Attributes[name]
field := val.Type().Field(fieldIdx)
fieldV := val.Field(fieldIdx)
if attr == nil {
if !exprType.AssignableTo(field.Type) {
continue
}
// As a special case, if the target is of type hcl.Expression then
// we'll assign an actual expression that evaluates to a cty null,
// so the caller can deal with it within the cty realm rather
// than within the Go realm.
synthExpr := hcl.StaticExpr(cty.NullVal(cty.DynamicPseudoType), body.MissingItemRange())
fieldV.Set(reflect.ValueOf(synthExpr))
continue
}
switch {
case attrType.AssignableTo(field.Type):
fieldV.Set(reflect.ValueOf(attr))
case exprType.AssignableTo(field.Type):
fieldV.Set(reflect.ValueOf(attr.Expr))
default:
diags = append(diags, DecodeExpression(
attr.Expr, ctx, fieldV.Addr().Interface(),
)...)
}
}
blocksByType := content.Blocks.ByType()
for typeName, fieldIdx := range tags.Blocks {
blocks := blocksByType[typeName]
field := val.Type().Field(fieldIdx)
ty := field.Type
isSlice := false
isPtr := false
if ty.Kind() == reflect.Slice {
isSlice = true
ty = ty.Elem()
}
if ty.Kind() == reflect.Ptr {
isPtr = true
ty = ty.Elem()
}
if len(blocks) > 1 && !isSlice {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: fmt.Sprintf("Duplicate %s block", typeName),
Detail: fmt.Sprintf(
"Only one %s block is allowed. Another was defined at %s.",
typeName, blocks[0].DefRange.String(),
),
Subject: &blocks[1].DefRange,
})
continue
}
if len(blocks) == 0 {
if isSlice || isPtr {
val.Field(fieldIdx).Set(reflect.Zero(field.Type))
} else {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: fmt.Sprintf("Missing %s block", typeName),
Detail: fmt.Sprintf("A %s block is required.", typeName),
Subject: body.MissingItemRange().Ptr(),
})
}
continue
}
switch {
case isSlice:
elemType := ty
if isPtr {
elemType = reflect.PtrTo(ty)
}
sli := reflect.MakeSlice(reflect.SliceOf(elemType), len(blocks), len(blocks))
for i, block := range blocks {
if isPtr {
v := reflect.New(ty)
diags = append(diags, decodeBlockToValue(block, ctx, v.Elem())...)
sli.Index(i).Set(v)
} else {
diags = append(diags, decodeBlockToValue(block, ctx, sli.Index(i))...)
}
}
val.Field(fieldIdx).Set(sli)
default:
block := blocks[0]
if isPtr {
v := reflect.New(ty)
diags = append(diags, decodeBlockToValue(block, ctx, v.Elem())...)
val.Field(fieldIdx).Set(v)
} else {
diags = append(diags, decodeBlockToValue(block, ctx, val.Field(fieldIdx))...)
}
}
}
return diags
}
func decodeBodyToMap(body hcl.Body, ctx *hcl.EvalContext, v reflect.Value) hcl.Diagnostics {
attrs, diags := body.JustAttributes()
if attrs == nil {
return diags
}
mv := reflect.MakeMap(v.Type())
for k, attr := range attrs {
switch {
case attrType.AssignableTo(v.Type().Elem()):
mv.SetMapIndex(reflect.ValueOf(k), reflect.ValueOf(attr))
case exprType.AssignableTo(v.Type().Elem()):
mv.SetMapIndex(reflect.ValueOf(k), reflect.ValueOf(attr.Expr))
default:
ev := reflect.New(v.Type().Elem())
diags = append(diags, DecodeExpression(attr.Expr, ctx, ev.Interface())...)
mv.SetMapIndex(reflect.ValueOf(k), ev.Elem())
}
}
v.Set(mv)
return diags
}
func decodeBlockToValue(block *hcl.Block, ctx *hcl.EvalContext, v reflect.Value) hcl.Diagnostics {
var diags hcl.Diagnostics
ty := v.Type()
switch {
case blockType.AssignableTo(ty):
v.Elem().Set(reflect.ValueOf(block))
case bodyType.AssignableTo(ty):
v.Elem().Set(reflect.ValueOf(block.Body))
case attrsType.AssignableTo(ty):
attrs, attrsDiags := block.Body.JustAttributes()
if len(attrsDiags) > 0 {
diags = append(diags, attrsDiags...)
}
v.Elem().Set(reflect.ValueOf(attrs))
default:
diags = append(diags, decodeBodyToValue(block.Body, ctx, v)...)
if len(block.Labels) > 0 {
blockTags := getFieldTags(ty)
for li, lv := range block.Labels {
lfieldIdx := blockTags.Labels[li].FieldIndex
v.Field(lfieldIdx).Set(reflect.ValueOf(lv))
}
}
}
return diags
}
// DecodeExpression extracts the value of the given expression into the given
// value. This value must be something that gocty is able to decode into,
// since the final decoding is delegated to that package.
//
// The given EvalContext is used to resolve any variables or functions in
// expressions encountered while decoding. This may be nil to require only
// constant values, for simple applications that do not support variables or
// functions.
//
// The returned diagnostics should be inspected with its HasErrors method to
// determine if the populated value is valid and complete. If error diagnostics
// are returned then the given value may have been partially-populated but
// may still be accessed by a careful caller for static analysis and editor
// integration use-cases.
func DecodeExpression(expr hcl.Expression, ctx *hcl.EvalContext, val interface{}) hcl.Diagnostics {
srcVal, diags := expr.Value(ctx)
convTy, err := gocty.ImpliedType(val)
if err != nil {
panic(fmt.Sprintf("unsuitable DecodeExpression target: %s", err))
}
srcVal, err = convert.Convert(srcVal, convTy)
if err != nil {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Unsuitable value type",
Detail: fmt.Sprintf("Unsuitable value: %s", err.Error()),
Subject: expr.StartRange().Ptr(),
Context: expr.Range().Ptr(),
})
return diags
}
err = gocty.FromCtyValue(srcVal, val)
if err != nil {
diags = append(diags, &hcl.Diagnostic{
Severity: hcl.DiagError,
Summary: "Unsuitable value type",
Detail: fmt.Sprintf("Unsuitable value: %s", err.Error()),
Subject: expr.StartRange().Ptr(),
Context: expr.Range().Ptr(),
})
}
return diags
}
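
As a quick orientation for the decoder above, here is a minimal sketch of driving DecodeBody with an EvalContext. It assumes the hclparse package from this module for parsing native syntax; the Config struct, field names, and configuration text are invented for the example.

```go
package main

import (
	"fmt"
	"log"

	"github.com/hashicorp/hcl2/gohcl"
	"github.com/hashicorp/hcl2/hcl"
	"github.com/hashicorp/hcl2/hclparse"
	"github.com/zclconf/go-cty/cty"
)

// Config is a hypothetical decode target using the gohcl struct tags.
type Config struct {
	Name     string `hcl:"name"`
	Port     int    `hcl:"port,optional"`
	Greeting string `hcl:"greeting"`
}

func main() {
	src := []byte(`
name     = "example"
greeting = "Hello, ${who}!"
`)

	parser := hclparse.NewParser()
	file, diags := parser.ParseHCL(src, "example.hcl")
	if diags.HasErrors() {
		log.Fatal(diags.Error())
	}

	// The EvalContext supplies the variables (and functions) that expressions
	// in the configuration are allowed to reference.
	ctx := &hcl.EvalContext{
		Variables: map[string]cty.Value{
			"who": cty.StringVal("world"),
		},
	}

	var cfg Config
	if diags := gohcl.DecodeBody(file.Body, ctx, &cfg); diags.HasErrors() {
		log.Fatal(diags.Error())
	}
	fmt.Println(cfg.Name, cfg.Port, cfg.Greeting) // example 0 Hello, world!
}
```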

645
gohcl/decode_test.go Normal file
View File

@ -0,0 +1,645 @@
package gohcl
import (
"encoding/json"
"fmt"
"reflect"
"testing"
"github.com/davecgh/go-spew/spew"
"github.com/hashicorp/hcl2/hcl"
hclJSON "github.com/hashicorp/hcl2/hcl/json"
"github.com/zclconf/go-cty/cty"
)
func TestDecodeBody(t *testing.T) {
deepEquals := func(other interface{}) func(v interface{}) bool {
return func(v interface{}) bool {
return reflect.DeepEqual(v, other)
}
}
type withNameExpression struct {
Name hcl.Expression `hcl:"name"`
}
tests := []struct {
Body map[string]interface{}
Target interface{}
Check func(v interface{}) bool
DiagCount int
}{
{
map[string]interface{}{},
struct{}{},
deepEquals(struct{}{}),
0,
},
{
map[string]interface{}{},
struct {
Name string `hcl:"name"`
}{},
deepEquals(struct {
Name string `hcl:"name"`
}{}),
1, // name is required
},
{
map[string]interface{}{},
struct {
Name *string `hcl:"name"`
}{},
deepEquals(struct {
Name *string `hcl:"name"`
}{}),
0,
}, // name nil
{
map[string]interface{}{},
struct {
Name string `hcl:"name,optional"`
}{},
deepEquals(struct {
Name string `hcl:"name,optional"`
}{}),
0,
}, // name optional
{
map[string]interface{}{},
withNameExpression{},
func(v interface{}) bool {
if v == nil {
return false
}
wne, valid := v.(withNameExpression)
if !valid {
return false
}
if wne.Name == nil {
return false
}
nameVal, _ := wne.Name.Value(nil)
if !nameVal.IsNull() {
return false
}
return true
},
0,
},
{
map[string]interface{}{
"name": "Ermintrude",
},
withNameExpression{},
func(v interface{}) bool {
if v == nil {
return false
}
wne, valid := v.(withNameExpression)
if !valid {
return false
}
if wne.Name == nil {
return false
}
nameVal, _ := wne.Name.Value(nil)
if !nameVal.Equals(cty.StringVal("Ermintrude")).True() {
return false
}
return true
},
0,
},
{
map[string]interface{}{
"name": "Ermintrude",
},
struct {
Name string `hcl:"name"`
}{},
deepEquals(struct {
Name string `hcl:"name"`
}{"Ermintrude"}),
0,
},
{
map[string]interface{}{
"name": "Ermintrude",
"age": 23,
},
struct {
Name string `hcl:"name"`
}{},
deepEquals(struct {
Name string `hcl:"name"`
}{"Ermintrude"}),
1, // Extraneous "age" property
},
{
map[string]interface{}{
"name": "Ermintrude",
"age": 50,
},
struct {
Name string `hcl:"name"`
Attrs hcl.Attributes `hcl:",remain"`
}{},
func(gotI interface{}) bool {
got := gotI.(struct {
Name string `hcl:"name"`
Attrs hcl.Attributes `hcl:",remain"`
})
return got.Name == "Ermintrude" && len(got.Attrs) == 1 && got.Attrs["age"] != nil
},
0,
},
{
map[string]interface{}{
"name": "Ermintrude",
"age": 50,
},
struct {
Name string `hcl:"name"`
Remain hcl.Body `hcl:",remain"`
}{},
func(gotI interface{}) bool {
got := gotI.(struct {
Name string `hcl:"name"`
Remain hcl.Body `hcl:",remain"`
})
attrs, _ := got.Remain.JustAttributes()
return got.Name == "Ermintrude" && len(attrs) == 1 && attrs["age"] != nil
},
0,
},
{
map[string]interface{}{
"name": "Ermintrude",
"living": true,
},
struct {
Name string `hcl:"name"`
Remain map[string]cty.Value `hcl:",remain"`
}{},
deepEquals(struct {
Name string `hcl:"name"`
Remain map[string]cty.Value `hcl:",remain"`
}{
Name: "Ermintrude",
Remain: map[string]cty.Value{
"living": cty.True,
},
}),
0,
},
{
map[string]interface{}{
"noodle": map[string]interface{}{},
},
struct {
Noodle struct{} `hcl:"noodle,block"`
}{},
func(gotI interface{}) bool {
// Generating no diagnostics is good enough for this one.
return true
},
0,
},
{
map[string]interface{}{
"noodle": []map[string]interface{}{{}},
},
struct {
Noodle struct{} `hcl:"noodle,block"`
}{},
func(gotI interface{}) bool {
// Generating no diagnostics is good enough for this one.
return true
},
0,
},
{
map[string]interface{}{
"noodle": []map[string]interface{}{{}, {}},
},
struct {
Noodle struct{} `hcl:"noodle,block"`
}{},
func(gotI interface{}) bool {
// Generating one diagnostic is good enough for this one.
return true
},
1,
},
{
map[string]interface{}{},
struct {
Noodle struct{} `hcl:"noodle,block"`
}{},
func(gotI interface{}) bool {
// Generating one diagnostic is good enough for this one.
return true
},
1,
},
{
map[string]interface{}{
"noodle": []map[string]interface{}{},
},
struct {
Noodle struct{} `hcl:"noodle,block"`
}{},
func(gotI interface{}) bool {
// Generating one diagnostic is good enough for this one.
return true
},
1,
},
{
map[string]interface{}{
"noodle": map[string]interface{}{},
},
struct {
Noodle *struct{} `hcl:"noodle,block"`
}{},
func(gotI interface{}) bool {
return gotI.(struct {
Noodle *struct{} `hcl:"noodle,block"`
}).Noodle != nil
},
0,
},
{
map[string]interface{}{
"noodle": []map[string]interface{}{{}},
},
struct {
Noodle *struct{} `hcl:"noodle,block"`
}{},
func(gotI interface{}) bool {
return gotI.(struct {
Noodle *struct{} `hcl:"noodle,block"`
}).Noodle != nil
},
0,
},
{
map[string]interface{}{
"noodle": []map[string]interface{}{},
},
struct {
Noodle *struct{} `hcl:"noodle,block"`
}{},
func(gotI interface{}) bool {
return gotI.(struct {
Noodle *struct{} `hcl:"noodle,block"`
}).Noodle == nil
},
0,
},
{
map[string]interface{}{
"noodle": []map[string]interface{}{{}, {}},
},
struct {
Noodle *struct{} `hcl:"noodle,block"`
}{},
func(gotI interface{}) bool {
// Generating one diagnostic is good enough for this one.
return true
},
1,
},
{
map[string]interface{}{
"noodle": []map[string]interface{}{},
},
struct {
Noodle []struct{} `hcl:"noodle,block"`
}{},
func(gotI interface{}) bool {
noodle := gotI.(struct {
Noodle []struct{} `hcl:"noodle,block"`
}).Noodle
return len(noodle) == 0
},
0,
},
{
map[string]interface{}{
"noodle": []map[string]interface{}{{}},
},
struct {
Noodle []struct{} `hcl:"noodle,block"`
}{},
func(gotI interface{}) bool {
noodle := gotI.(struct {
Noodle []struct{} `hcl:"noodle,block"`
}).Noodle
return len(noodle) == 1
},
0,
},
{
map[string]interface{}{
"noodle": []map[string]interface{}{{}, {}},
},
struct {
Noodle []struct{} `hcl:"noodle,block"`
}{},
func(gotI interface{}) bool {
noodle := gotI.(struct {
Noodle []struct{} `hcl:"noodle,block"`
}).Noodle
return len(noodle) == 2
},
0,
},
{
map[string]interface{}{
"noodle": map[string]interface{}{},
},
struct {
Noodle struct {
Name string `hcl:"name,label"`
} `hcl:"noodle,block"`
}{},
func(gotI interface{}) bool {
// Generating two diagnostics is good enough for this one.
// (one for the missing noodle block and the other for
// the JSON serialization detecting the missing level of
// hierarchy for the label.)
return true
},
2,
},
{
map[string]interface{}{
"noodle": map[string]interface{}{
"foo_foo": map[string]interface{}{},
},
},
struct {
Noodle struct {
Name string `hcl:"name,label"`
} `hcl:"noodle,block"`
}{},
func(gotI interface{}) bool {
noodle := gotI.(struct {
Noodle struct {
Name string `hcl:"name,label"`
} `hcl:"noodle,block"`
}).Noodle
return noodle.Name == "foo_foo"
},
0,
},
{
map[string]interface{}{
"noodle": map[string]interface{}{
"foo_foo": map[string]interface{}{},
"bar_baz": map[string]interface{}{},
},
},
struct {
Noodle struct {
Name string `hcl:"name,label"`
} `hcl:"noodle,block"`
}{},
func(gotI interface{}) bool {
// One diagnostic is enough for this one.
return true
},
1,
},
{
map[string]interface{}{
"noodle": map[string]interface{}{
"foo_foo": map[string]interface{}{},
"bar_baz": map[string]interface{}{},
},
},
struct {
Noodles []struct {
Name string `hcl:"name,label"`
} `hcl:"noodle,block"`
}{},
func(gotI interface{}) bool {
noodles := gotI.(struct {
Noodles []struct {
Name string `hcl:"name,label"`
} `hcl:"noodle,block"`
}).Noodles
return len(noodles) == 2 && (noodles[0].Name == "foo_foo" || noodles[0].Name == "bar_baz") && (noodles[1].Name == "foo_foo" || noodles[1].Name == "bar_baz") && noodles[0].Name != noodles[1].Name
},
0,
},
{
map[string]interface{}{
"noodle": map[string]interface{}{
"foo_foo": map[string]interface{}{
"type": "rice",
},
},
},
struct {
Noodle struct {
Name string `hcl:"name,label"`
Type string `hcl:"type"`
} `hcl:"noodle,block"`
}{},
func(gotI interface{}) bool {
noodle := gotI.(struct {
Noodle struct {
Name string `hcl:"name,label"`
Type string `hcl:"type"`
} `hcl:"noodle,block"`
}).Noodle
return noodle.Name == "foo_foo" && noodle.Type == "rice"
},
0,
},
{
map[string]interface{}{
"name": "Ermintrude",
"age": 34,
},
map[string]string(nil),
deepEquals(map[string]string{
"name": "Ermintrude",
"age": "34",
}),
0,
},
{
map[string]interface{}{
"name": "Ermintrude",
"age": 89,
},
map[string]*hcl.Attribute(nil),
func(gotI interface{}) bool {
got := gotI.(map[string]*hcl.Attribute)
return len(got) == 2 && got["name"] != nil && got["age"] != nil
},
0,
},
{
map[string]interface{}{
"name": "Ermintrude",
"age": 13,
},
map[string]hcl.Expression(nil),
func(gotI interface{}) bool {
got := gotI.(map[string]hcl.Expression)
return len(got) == 2 && got["name"] != nil && got["age"] != nil
},
0,
},
{
map[string]interface{}{
"name": "Ermintrude",
"living": true,
},
map[string]cty.Value(nil),
deepEquals(map[string]cty.Value{
"name": cty.StringVal("Ermintrude"),
"living": cty.True,
}),
0,
},
}
for i, test := range tests {
// For convenience here we're going to use the JSON parser
// to process the given body.
buf, err := json.Marshal(test.Body)
if err != nil {
t.Fatalf("error JSON-encoding body for test %d: %s", i, err)
}
t.Run(string(buf), func(t *testing.T) {
file, diags := hclJSON.Parse(buf, "test.json")
if len(diags) != 0 {
t.Fatalf("diagnostics while parsing: %s", diags.Error())
}
targetVal := reflect.New(reflect.TypeOf(test.Target))
diags = DecodeBody(file.Body, nil, targetVal.Interface())
if len(diags) != test.DiagCount {
t.Errorf("wrong number of diagnostics %d; want %d", len(diags), test.DiagCount)
for _, diag := range diags {
t.Logf(" - %s", diag.Error())
}
}
got := targetVal.Elem().Interface()
if !test.Check(got) {
t.Errorf("wrong result\ngot: %s", spew.Sdump(got))
}
})
}
}
func TestDecodeExpression(t *testing.T) {
tests := []struct {
Value cty.Value
Target interface{}
Want interface{}
DiagCount int
}{
{
cty.StringVal("hello"),
"",
"hello",
0,
},
{
cty.StringVal("hello"),
cty.NilVal,
cty.StringVal("hello"),
0,
},
{
cty.NumberIntVal(2),
"",
"2",
0,
},
{
cty.StringVal("true"),
false,
true,
0,
},
{
cty.NullVal(cty.String),
"",
"",
1, // null value is not allowed
},
{
cty.UnknownVal(cty.String),
"",
"",
1, // value must be known
},
{
cty.ListVal([]cty.Value{cty.True}),
false,
false,
1, // bool required
},
}
for i, test := range tests {
t.Run(fmt.Sprintf("%02d", i), func(t *testing.T) {
expr := &fixedExpression{test.Value}
targetVal := reflect.New(reflect.TypeOf(test.Target))
diags := DecodeExpression(expr, nil, targetVal.Interface())
if len(diags) != test.DiagCount {
t.Errorf("wrong number of diagnostics %d; want %d", len(diags), test.DiagCount)
for _, diag := range diags {
t.Logf(" - %s", diag.Error())
}
}
got := targetVal.Elem().Interface()
if !reflect.DeepEqual(got, test.Want) {
t.Errorf("wrong result\ngot: %#v\nwant: %#v", got, test.Want)
}
})
}
}
type fixedExpression struct {
val cty.Value
}
func (e *fixedExpression) Value(ctx *hcl.EvalContext) (cty.Value, hcl.Diagnostics) {
return e.val, nil
}
func (e *fixedExpression) Range() (r hcl.Range) {
return
}
func (e *fixedExpression) StartRange() (r hcl.Range) {
return
}
func (e *fixedExpression) Variables() []hcl.Traversal {
return nil
}

53
gohcl/doc.go Normal file
View File

@ -0,0 +1,53 @@
// Package gohcl allows decoding HCL configurations into Go data structures.
//
// It provides a convenient and concise way of describing the schema for
// configuration and then accessing the resulting data via native Go
// types.
//
// A struct field tag scheme is used, similar to other decoding and
// unmarshalling libraries. The tags are formatted as in the following example:
//
// ThingType string `hcl:"thing_type,attr"`
//
// Within each tag there are two comma-separated tokens. The first is the
// name of the corresponding construct in configuration, while the second
// is a keyword giving the kind of construct expected. The following
// kind keywords are supported:
//
// attr (the default) indicates that the value is to be populated from an attribute
// block indicates that the value is to populated from a block
// label indicates that the value is to populated from a block label
// remain indicates that the value is to be populated from the remaining body after populating other fields
//
// "attr" fields may either be of type *hcl.Expression, in which case the raw
// expression is assigned, or of any type accepted by gocty, in which case
// gocty will be used to assign the value to a native Go type.
//
// "block" fields may be of type *hcl.Block or hcl.Body, in which case the
// corresponding raw value is assigned, or may be a struct that recursively
// uses the same tags. Block fields may also be slices of any of these types,
// in which case multiple blocks of the corresponding type are decoded into
// the slice.
//
// "label" fields are considered only in a struct used as the type of a field
// marked as "block", and are used sequentially to capture the labels of
// the blocks being decoded. In this case, the name token is used only as
// an identifier for the label in diagnostic messages.
//
// "remain" can be placed on a single field that may be either of type
// hcl.Body or hcl.Attributes, in which case any remaining body content is
// placed into this field for delayed processing. If no "remain" field is
// present then any attributes or blocks not matched by another valid tag
// will cause an error diagnostic.
//
// Only a subset of this tagging/typing vocabulary is supported for the
// "Encode" family of functions. See the EncodeIntoBody docs for full details
// on the constraints there.
//
// Broadly-speaking this package deals with two types of error. The first is
// errors in the configuration itself, which are returned as diagnostics
// written with the configuration author as the target audience. The second
// is bugs in the calling program, such as invalid struct tags, which are
// surfaced via panics since there can be no useful runtime handling of such
// errors and they should certainly not be returned to the user as diagnostics.
package gohcl
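
To make the tag vocabulary above concrete, here is an illustrative sketch pairing struct types with the shape of configuration they would decode; the io_mode/service schema and all names are invented for the example.

```go
package config

import "github.com/hashicorp/hcl2/hcl"

// A configuration decoded by the types below might look like:
//
//	io_mode = "async"
//
//	service "http" "web_proxy" {
//	  listen_addr = "127.0.0.1:8080"
//	  # further attributes are captured by the Remain field
//	}

type ServiceConfig struct {
	Protocol   string   `hcl:"protocol,label"` // first block label
	Name       string   `hcl:"name,label"`     // second block label
	ListenAddr string   `hcl:"listen_addr"`    // required attribute
	Remain     hcl.Body `hcl:",remain"`        // leftover content for later decoding
}

type RootConfig struct {
	IOMode   string          `hcl:"io_mode,optional"` // optional attribute
	Services []ServiceConfig `hcl:"service,block"`    // zero or more "service" blocks
}
```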

191
gohcl/encode.go Normal file
View File

@ -0,0 +1,191 @@
package gohcl
import (
"fmt"
"reflect"
"sort"
"github.com/hashicorp/hcl2/hclwrite"
"github.com/zclconf/go-cty/cty/gocty"
)
// EncodeIntoBody replaces the contents of the given hclwrite Body with
// attributes and blocks derived from the given value, which must be a
// struct value or a pointer to a struct value with the struct tags defined
// in this package.
//
// This function can work only with fully-decoded data. It will ignore any
// fields tagged as "remain", any fields that decode attributes into either
// hcl.Attribute or hcl.Expression values, and any fields that decode blocks
// into hcl.Attributes values. This function does not have enough information
// to complete the decoding of these types.
//
// Any fields tagged as "label" are ignored by this function. Use EncodeAsBlock
// to produce a whole hclwrite.Block including block labels.
//
// As long as a suitable value is given to encode and the destination body
// is non-nil, this function will always complete. It will panic in case of
// any errors in the calling program, such as passing an inappropriate type
// or a nil body.
//
// The layout of the resulting HCL source is derived from the ordering of
// the struct fields, with blank lines around nested blocks of different types.
// Fields representing attributes should usually precede those representing
// blocks so that the attributes can group together in the result. For more
// control, use the hclwrite API directly.
func EncodeIntoBody(val interface{}, dst *hclwrite.Body) {
rv := reflect.ValueOf(val)
ty := rv.Type()
if ty.Kind() == reflect.Ptr {
rv = rv.Elem()
ty = rv.Type()
}
if ty.Kind() != reflect.Struct {
panic(fmt.Sprintf("value is %s, not struct", ty.Kind()))
}
tags := getFieldTags(ty)
populateBody(rv, ty, tags, dst)
}
// EncodeAsBlock creates a new hclwrite.Block populated with the data from
// the given value, which must be a struct or pointer to struct with the
// struct tags defined in this package.
//
// If the given struct type has fields tagged with "label" tags then they
// will be used in order to annotate the created block with labels.
//
// This function has the same constraints as EncodeIntoBody and will panic
// if they are violated.
func EncodeAsBlock(val interface{}, blockType string) *hclwrite.Block {
rv := reflect.ValueOf(val)
ty := rv.Type()
if ty.Kind() == reflect.Ptr {
rv = rv.Elem()
ty = rv.Type()
}
if ty.Kind() != reflect.Struct {
panic(fmt.Sprintf("value is %s, not struct", ty.Kind()))
}
tags := getFieldTags(ty)
labels := make([]string, len(tags.Labels))
for i, lf := range tags.Labels {
lv := rv.Field(lf.FieldIndex)
// We just stringify whatever we find. It should always be a string
// but if not then we'll still do something reasonable.
labels[i] = fmt.Sprintf("%s", lv.Interface())
}
block := hclwrite.NewBlock(blockType, labels)
populateBody(rv, ty, tags, block.Body())
return block
}
func populateBody(rv reflect.Value, ty reflect.Type, tags *fieldTags, dst *hclwrite.Body) {
nameIdxs := make(map[string]int, len(tags.Attributes)+len(tags.Blocks))
namesOrder := make([]string, 0, len(tags.Attributes)+len(tags.Blocks))
for n, i := range tags.Attributes {
nameIdxs[n] = i
namesOrder = append(namesOrder, n)
}
for n, i := range tags.Blocks {
nameIdxs[n] = i
namesOrder = append(namesOrder, n)
}
sort.SliceStable(namesOrder, func(i, j int) bool {
ni, nj := namesOrder[i], namesOrder[j]
return nameIdxs[ni] < nameIdxs[nj]
})
dst.Clear()
prevWasBlock := false
for _, name := range namesOrder {
fieldIdx := nameIdxs[name]
field := ty.Field(fieldIdx)
fieldTy := field.Type
fieldVal := rv.Field(fieldIdx)
if fieldTy.Kind() == reflect.Ptr {
fieldTy = fieldTy.Elem()
fieldVal = fieldVal.Elem()
}
if _, isAttr := tags.Attributes[name]; isAttr {
if exprType.AssignableTo(fieldTy) || attrType.AssignableTo(fieldTy) {
continue // ignore undecoded fields
}
if !fieldVal.IsValid() {
continue // ignore (field value is nil pointer)
}
if fieldTy.Kind() == reflect.Ptr && fieldVal.IsNil() {
continue // ignore
}
if prevWasBlock {
dst.AppendNewline()
prevWasBlock = false
}
valTy, err := gocty.ImpliedType(fieldVal.Interface())
if err != nil {
panic(fmt.Sprintf("cannot encode %T as HCL expression: %s", fieldVal.Interface(), err))
}
val, err := gocty.ToCtyValue(fieldVal.Interface(), valTy)
if err != nil {
// This should never happen, since we should always be able
// to decode into the implied type.
panic(fmt.Sprintf("failed to encode %T as %#v: %s", fieldVal.Interface(), valTy, err))
}
dst.SetAttributeValue(name, val)
} else { // must be a block, then
elemTy := fieldTy
isSeq := false
if elemTy.Kind() == reflect.Slice || elemTy.Kind() == reflect.Array {
isSeq = true
elemTy = elemTy.Elem()
}
if bodyType.AssignableTo(elemTy) || attrsType.AssignableTo(elemTy) {
continue // ignore undecoded fields
}
prevWasBlock = false
if isSeq {
l := fieldVal.Len()
for i := 0; i < l; i++ {
elemVal := fieldVal.Index(i)
if !elemVal.IsValid() {
continue // ignore (elem value is nil pointer)
}
if elemTy.Kind() == reflect.Ptr && elemVal.IsNil() {
continue // ignore
}
block := EncodeAsBlock(elemVal.Interface(), name)
if !prevWasBlock {
dst.AppendNewline()
prevWasBlock = true
}
dst.AppendBlock(block)
}
} else {
if !fieldVal.IsValid() {
continue // ignore (field value is nil pointer)
}
if elemTy.Kind() == reflect.Ptr && fieldVal.IsNil() {
continue // ignore
}
block := EncodeAsBlock(fieldVal.Interface(), name)
if !prevWasBlock {
dst.AppendNewline()
prevWasBlock = true
}
dst.AppendBlock(block)
}
}
}
}

64
gohcl/encode_test.go Normal file
View File

@ -0,0 +1,64 @@
package gohcl_test
import (
"fmt"
"github.com/hashicorp/hcl2/gohcl"
"github.com/hashicorp/hcl2/hclwrite"
)
func ExampleEncodeIntoBody() {
type Service struct {
Name string `hcl:"name,label"`
Exe []string `hcl:"executable"`
}
type Constraints struct {
OS string `hcl:"os"`
Arch string `hcl:"arch"`
}
type App struct {
Name string `hcl:"name"`
Desc string `hcl:"description"`
Constraints *Constraints `hcl:"constraints,block"`
Services []Service `hcl:"service,block"`
}
app := App{
Name: "awesome-app",
Desc: "Such an awesome application",
Constraints: &Constraints{
OS: "linux",
Arch: "amd64",
},
Services: []Service{
{
Name: "web",
Exe: []string{"./web", "--listen=:8080"},
},
{
Name: "worker",
Exe: []string{"./worker"},
},
},
}
f := hclwrite.NewEmptyFile()
gohcl.EncodeIntoBody(&app, f.Body())
fmt.Printf("%s", f.Bytes())
// Output:
// name = "awesome-app"
// description = "Such an awesome application"
//
// constraints {
// os = "linux"
// arch = "amd64"
// }
//
// service "web" {
// executable = ["./web", "--listen=:8080"]
// }
// service "worker" {
// executable = ["./worker"]
// }
}
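
EncodeAsBlock is not exercised by the example above, so here is a small companion sketch; the Service type and values are assumed for illustration.

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl2/gohcl"
	"github.com/hashicorp/hcl2/hclwrite"
)

type Service struct {
	Name string   `hcl:"name,label"`
	Exe  []string `hcl:"executable"`
}

func main() {
	svc := Service{Name: "web", Exe: []string{"./web", "--listen=:8080"}}

	// EncodeAsBlock produces a detached block, including its labels, which
	// can then be appended to any hclwrite body.
	f := hclwrite.NewEmptyFile()
	f.Body().AppendBlock(gohcl.EncodeAsBlock(&svc, "service"))

	fmt.Printf("%s", f.Bytes())
	// Prints something like:
	//   service "web" {
	//     executable = ["./web", "--listen=:8080"]
	//   }
}
```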

174
gohcl/schema.go Normal file
View File

@ -0,0 +1,174 @@
package gohcl
import (
"fmt"
"reflect"
"sort"
"strings"
"github.com/hashicorp/hcl2/hcl"
)
// ImpliedBodySchema produces a hcl.BodySchema derived from the type of the
// given value, which must be a struct value or a pointer to one. If an
// inappropriate value is passed, this function will panic.
//
// The second return argument indicates whether the given struct includes
// a "remain" field, and thus the returned schema is non-exhaustive.
//
// This uses the tags on the fields of the struct to discover how each
// field's value should be expressed within configuration. If an invalid
// mapping is attempted, this function will panic.
func ImpliedBodySchema(val interface{}) (schema *hcl.BodySchema, partial bool) {
ty := reflect.TypeOf(val)
if ty.Kind() == reflect.Ptr {
ty = ty.Elem()
}
if ty.Kind() != reflect.Struct {
panic(fmt.Sprintf("given value must be struct, not %T", val))
}
var attrSchemas []hcl.AttributeSchema
var blockSchemas []hcl.BlockHeaderSchema
tags := getFieldTags(ty)
attrNames := make([]string, 0, len(tags.Attributes))
for n := range tags.Attributes {
attrNames = append(attrNames, n)
}
sort.Strings(attrNames)
for _, n := range attrNames {
idx := tags.Attributes[n]
optional := tags.Optional[n]
field := ty.Field(idx)
var required bool
switch {
case field.Type.AssignableTo(exprType):
// If we're decoding to hcl.Expression then absence can be
// indicated via a null value, so we don't specify that
// the field is required during decoding.
required = false
case field.Type.Kind() != reflect.Ptr && !optional:
required = true
default:
required = false
}
attrSchemas = append(attrSchemas, hcl.AttributeSchema{
Name: n,
Required: required,
})
}
blockNames := make([]string, 0, len(tags.Blocks))
for n := range tags.Blocks {
blockNames = append(blockNames, n)
}
sort.Strings(blockNames)
for _, n := range blockNames {
idx := tags.Blocks[n]
field := ty.Field(idx)
fty := field.Type
if fty.Kind() == reflect.Slice {
fty = fty.Elem()
}
if fty.Kind() == reflect.Ptr {
fty = fty.Elem()
}
if fty.Kind() != reflect.Struct {
panic(fmt.Sprintf(
"hcl 'block' tag kind cannot be applied to %s field %s: struct required", field.Type.String(), field.Name,
))
}
ftags := getFieldTags(fty)
var labelNames []string
if len(ftags.Labels) > 0 {
labelNames = make([]string, len(ftags.Labels))
for i, l := range ftags.Labels {
labelNames[i] = l.Name
}
}
blockSchemas = append(blockSchemas, hcl.BlockHeaderSchema{
Type: n,
LabelNames: labelNames,
})
}
partial = tags.Remain != nil
schema = &hcl.BodySchema{
Attributes: attrSchemas,
Blocks: blockSchemas,
}
return schema, partial
}
type fieldTags struct {
Attributes map[string]int
Blocks map[string]int
Labels []labelField
Remain *int
Optional map[string]bool
}
type labelField struct {
FieldIndex int
Name string
}
func getFieldTags(ty reflect.Type) *fieldTags {
ret := &fieldTags{
Attributes: map[string]int{},
Blocks: map[string]int{},
Optional: map[string]bool{},
}
ct := ty.NumField()
for i := 0; i < ct; i++ {
field := ty.Field(i)
tag := field.Tag.Get("hcl")
if tag == "" {
continue
}
comma := strings.Index(tag, ",")
var name, kind string
if comma != -1 {
name = tag[:comma]
kind = tag[comma+1:]
} else {
name = tag
kind = "attr"
}
switch kind {
case "attr":
ret.Attributes[name] = i
case "block":
ret.Blocks[name] = i
case "label":
ret.Labels = append(ret.Labels, labelField{
FieldIndex: i,
Name: name,
})
case "remain":
if ret.Remain != nil {
panic("only one 'remain' tag is permitted")
}
idx := i // copy, because this loop will continue assigning to i
ret.Remain = &idx
case "optional":
ret.Attributes[name] = i
ret.Optional[name] = true
default:
panic(fmt.Sprintf("invalid hcl field tag kind %q on %s %q", kind, field.Type.String(), field.Name))
}
}
return ret
}

230
gohcl/schema_test.go Normal file
View File

@ -0,0 +1,230 @@
package gohcl
import (
"fmt"
"reflect"
"testing"
"github.com/davecgh/go-spew/spew"
"github.com/hashicorp/hcl2/hcl"
)
func TestImpliedBodySchema(t *testing.T) {
tests := []struct {
val interface{}
wantSchema *hcl.BodySchema
wantPartial bool
}{
{
struct{}{},
&hcl.BodySchema{},
false,
},
{
struct {
Ignored bool
}{},
&hcl.BodySchema{},
false,
},
{
struct {
Attr1 bool `hcl:"attr1"`
Attr2 bool `hcl:"attr2"`
}{},
&hcl.BodySchema{
Attributes: []hcl.AttributeSchema{
{
Name: "attr1",
Required: true,
},
{
Name: "attr2",
Required: true,
},
},
},
false,
},
{
struct {
Attr *bool `hcl:"attr,attr"`
}{},
&hcl.BodySchema{
Attributes: []hcl.AttributeSchema{
{
Name: "attr",
Required: false,
},
},
},
false,
},
{
struct {
Thing struct{} `hcl:"thing,block"`
}{},
&hcl.BodySchema{
Blocks: []hcl.BlockHeaderSchema{
{
Type: "thing",
},
},
},
false,
},
{
struct {
Thing struct {
Type string `hcl:"type,label"`
Name string `hcl:"name,label"`
} `hcl:"thing,block"`
}{},
&hcl.BodySchema{
Blocks: []hcl.BlockHeaderSchema{
{
Type: "thing",
LabelNames: []string{"type", "name"},
},
},
},
false,
},
{
struct {
Thing []struct {
Type string `hcl:"type,label"`
Name string `hcl:"name,label"`
} `hcl:"thing,block"`
}{},
&hcl.BodySchema{
Blocks: []hcl.BlockHeaderSchema{
{
Type: "thing",
LabelNames: []string{"type", "name"},
},
},
},
false,
},
{
struct {
Thing *struct {
Type string `hcl:"type,label"`
Name string `hcl:"name,label"`
} `hcl:"thing,block"`
}{},
&hcl.BodySchema{
Blocks: []hcl.BlockHeaderSchema{
{
Type: "thing",
LabelNames: []string{"type", "name"},
},
},
},
false,
},
{
struct {
Thing struct {
Name string `hcl:"name,label"`
Something string `hcl:"something"`
} `hcl:"thing,block"`
}{},
&hcl.BodySchema{
Blocks: []hcl.BlockHeaderSchema{
{
Type: "thing",
LabelNames: []string{"name"},
},
},
},
false,
},
{
struct {
Doodad string `hcl:"doodad"`
Thing struct {
Name string `hcl:"name,label"`
} `hcl:"thing,block"`
}{},
&hcl.BodySchema{
Attributes: []hcl.AttributeSchema{
{
Name: "doodad",
Required: true,
},
},
Blocks: []hcl.BlockHeaderSchema{
{
Type: "thing",
LabelNames: []string{"name"},
},
},
},
false,
},
{
struct {
Doodad string `hcl:"doodad"`
Config string `hcl:",remain"`
}{},
&hcl.BodySchema{
Attributes: []hcl.AttributeSchema{
{
Name: "doodad",
Required: true,
},
},
},
true,
},
{
struct {
Expr hcl.Expression `hcl:"expr"`
}{},
&hcl.BodySchema{
Attributes: []hcl.AttributeSchema{
{
Name: "expr",
Required: false,
},
},
},
false,
},
{
struct {
Meh string `hcl:"meh,optional"`
}{},
&hcl.BodySchema{
Attributes: []hcl.AttributeSchema{
{
Name: "meh",
Required: false,
},
},
},
false,
},
}
for _, test := range tests {
t.Run(fmt.Sprintf("%#v", test.val), func(t *testing.T) {
schema, partial := ImpliedBodySchema(test.val)
if !reflect.DeepEqual(schema, test.wantSchema) {
t.Errorf(
"wrong schema\ngot: %s\nwant: %s",
spew.Sdump(schema), spew.Sdump(test.wantSchema),
)
}
if partial != test.wantPartial {
t.Errorf(
"wrong partial flag\ngot: %#v\nwant: %#v",
partial, test.wantPartial,
)
}
})
}
}

16
gohcl/types.go Normal file
View File

@ -0,0 +1,16 @@
package gohcl
import (
"reflect"
"github.com/hashicorp/hcl2/hcl"
)
var victimExpr hcl.Expression
var victimBody hcl.Body
var exprType = reflect.TypeOf(&victimExpr).Elem()
var bodyType = reflect.TypeOf(&victimBody).Elem()
var blockType = reflect.TypeOf((*hcl.Block)(nil))
var attrType = reflect.TypeOf((*hcl.Attribute)(nil))
var attrsType = reflect.TypeOf(hcl.Attributes(nil))

2
guide/.gitignore vendored Normal file
View File

@ -0,0 +1,2 @@
env/*
_build/*

20
guide/Makefile Normal file
View File

@ -0,0 +1,20 @@
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
SPHINXPROJ = HCL
SOURCEDIR = .
BUILDDIR = _build
# Put it first so that "make" without argument is like "make help".
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.PHONY: help Makefile
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

157
guide/conf.py Normal file
View File

@ -0,0 +1,157 @@
import subprocess
import os
import os.path
# -- Project information -----------------------------------------------------
project = u'HCL'
copyright = u'2018, HashiCorp'
author = u'HashiCorp'
if 'READTHEDOCS_VERSION' in os.environ:
version_str = os.environ['READTHEDOCS_VERSION']
else:
version_str = subprocess.check_output(['git', 'describe', '--always']).strip()
# The short X.Y version
version = unicode(version_str)
# The full version, including alpha/beta/rc tags
release = unicode(version_str)
# -- General configuration ---------------------------------------------------
# If your documentation needs a minimal Sphinx version, state it here.
#
# needs_sphinx = '1.0'
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
'sphinx.ext.todo',
'sphinx.ext.githubpages',
'sphinxcontrib.golangdomain',
'sphinx.ext.autodoc',
]
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
#
# source_suffix = ['.rst', '.md']
source_suffix = '.rst'
# The master toctree document.
master_doc = 'index'
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = None
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This pattern also affects html_static_path and html_extra_path .
exclude_patterns = [u'_build', 'Thumbs.db', '.DS_Store', 'env']
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
# -- Options for HTML output -------------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
html_theme = 'alabaster'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
#
# html_theme_options = {}
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']
# Custom sidebar templates, must be a dictionary that maps document names
# to template names.
#
# The default sidebars (for documents that don't match any pattern) are
# defined by theme itself. Builtin themes are using these templates by
# default: ``['localtoc.html', 'relations.html', 'sourcelink.html',
# 'searchbox.html']``.
#
# html_sidebars = {}
# -- Options for HTMLHelp output ---------------------------------------------
# Output file base name for HTML help builder.
htmlhelp_basename = 'HCLdoc'
# -- Options for LaTeX output ------------------------------------------------
latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#
# 'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
#
# 'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
#
# 'preamble': '',
# Latex figure (float) alignment
#
# 'figure_align': 'htbp',
}
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
(master_doc, 'HCL.tex', u'HCL Documentation',
u'HashiCorp', 'manual'),
]
# -- Options for manual page output ------------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
(master_doc, 'hcl', u'HCL Documentation',
[author], 1)
]
# -- Options for Texinfo output ----------------------------------------------
# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
(master_doc, 'HCL', u'HCL Documentation',
author, 'HCL', 'One line description of project.',
'Miscellaneous'),
]
# -- Extension configuration -------------------------------------------------
# -- Options for todo extension ----------------------------------------------
# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = True

31
guide/go.rst Normal file
View File

@ -0,0 +1,31 @@
Using HCL in a Go application
=============================
HCL is itself written in Go_ and currently it is primarily intended for use as
a library within other Go programs.
This section describes a number of different ways HCL can be used to define
and process a configuration language within a Go program. For simple situations,
HCL can decode directly into Go ``struct`` values in a similar way as encoding
packages such as ``encoding/json`` and ``encoding/xml``.
The HCL Go API also offers some alternative approaches however, for processing
languages that may be more complex or that include portions whose expected
structure cannot be determined until runtime.
The following sections give an overview of different ways HCL can be used in
a Go program.
.. toctree::
:maxdepth: 1
:caption: Sub-sections:
go_parsing
go_diagnostics
go_decoding_gohcl
go_decoding_hcldec
go_expression_eval
go_decoding_lowlevel
go_patterns
.. _Go: https://golang.org/

130
guide/go_decoding_gohcl.rst Normal file
View File

@ -0,0 +1,130 @@
.. go:package:: gohcl
.. _go-decoding-gohcl:
Decoding Into Native Go Values
==============================
The most straightforward way to access the content of an HCL file is to
decode into native Go values using ``reflect``, similar to the technique used
by packages like ``encoding/json`` and ``encoding/xml``.
Package ``gohcl`` provides functions for this sort of decoding. Function
``DecodeBody`` attempts to extract values from an HCL *body* and write them
into a Go value given as a pointer:
.. code-block:: go
type ServiceConfig struct {
Type string `hcl:"type,label"`
Name string `hcl:"name,label"`
ListenAddr string `hcl:"listen_addr"`
}
type Config struct {
IOMode string `hcl:"io_mode"`
Services []ServiceConfig `hcl:"service,block"`
}
var c Config
moreDiags := gohcl.DecodeBody(f.Body, nil, &c)
diags = append(diags, moreDiags...)
The above example decodes the *root body* of a file ``f``, presumably loaded
previously using a parser, into the variable ``c``. The struct field tags
imply the schema of the expected language, which is a cut-down
version of the hypothetical language we showed in :ref:`intro`.
The struct field tags consist of up to two comma-separated values. The first is
the name of the corresponding argument or block type as it will appear in
the input file, and the second is the type of element being named. If the
second value is omitted, it defaults to ``attr``, requesting an attribute.
Nested blocks are represented by a struct or a slice of that struct, and the
special element type ``label`` within that struct declares that each instance
of that block type must be followed by one or more block labels. In the above
example, the ``service`` block type is defined to require two labels, named
``type`` and ``name``. For label fields in particular, the given name is used
only to refer to the particular label in error messages when the wrong number
of labels is used.
By default, all declared attributes and blocks are considered to be required.
An optional value is indicated by making its field have a pointer type, in
which case ``nil`` is written to indicate the absence of the argument.
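For example, a minimal sketch (a variant of the ``Config`` type above) in which
``io_mode`` becomes optional by giving its field a pointer type:
.. code-block:: go

   type Config struct {
       IOMode   *string         `hcl:"io_mode"` // nil if io_mode is omitted
       Services []ServiceConfig `hcl:"service,block"`
   }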
The sections below discuss some additional decoding use-cases. For full details
on the `gohcl` package, see
`the godoc reference <https://godoc.org/github.com/hashicorp/hcl2/gohcl>`_.
.. _go-decoding-gohcl-evalcontext:
Variables and Functions
-----------------------
By default, arguments given in the configuration may use only literal values
and the built in expression language operators, such as arithmetic.
The second argument to ``gohcl.DecodeBody``, shown as ``nil`` in the previous
example, allows the calling application to additionally offer variables and
functions for use in expressions. Its value is a pointer to an
``hcl.EvalContext``, which will be covered in more detail in the later section
:ref:`go-expression-eval`. For now, a simple example of making the id of the
current process available as a single variable called ``pid``:
.. code-block:: go
type Context struct {
    Pid int
}
ctx := gohcl.EvalContext(&Context{
    Pid: os.Getpid(),
})
var c Config
moreDiags := gohcl.DecodeBody(f.Body, ctx, &c)
diags = append(diags, moreDiags...)
``gohcl.EvalContext`` constructs an expression evaluation context from a Go
struct value, making the fields available as variables and the methods
available as functions, after transforming the field and method names such
that each word (starting with an uppercase letter) is all lowercase and
separated by underscores.
.. code-block:: hcl
name = "example-program (${pid})"
Partial Decoding
----------------
In the examples so far, we've extracted the content from the entire input file
in a single call to ``DecodeBody``. This is sufficient for many simple
situations, but sometimes different parts of the file must be evaluated
separately. For example:
* If different parts of the file must be evaluated with different variables
or functions available.
* If the result of evaluating one part of the file is used to set variables
or functions in another part of the file.
There are several ways to perform partial decoding with ``gohcl``, all of
which involve decoding into HCL's own types, such as ``hcl.Body``.
The most general approach is to declare an additional struct field of type
``hcl.Body``, with the special field tag type ``remain``:
.. code-block:: go
type ServiceConfig struct {
Type string `hcl:"type,label"`
Name string `hcl:"name,label"`
ListenAddr string `hcl:"listen_addr"`
Remain hcl.Body `hcl:",remain"`
}
When a ``remain`` field is present, any element of the input body that is
not matched is retained in a body saved into that field, which can then be
decoded in a later call, potentially with a different evaluation context.
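For example, a sketch of a two-pass decode (reusing the ``Config`` and
``ServiceConfig`` types above; the ``health_check`` argument and the ``ctx``
evaluation context are hypothetical):
.. code-block:: go

   var c Config
   diags = append(diags, gohcl.DecodeBody(f.Body, nil, &c)...)

   for _, sc := range c.Services {
       // Second pass: decode each service's retained body, this time with
       // an evaluation context built from the first-pass results.
       var extra struct {
           HealthCheck *string `hcl:"health_check"`
       }
       diags = append(diags, gohcl.DecodeBody(sc.Remain, ctx, &extra)...)
   }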
Another option is to decode an attribute into a value of type `hcl.Expression`,
which can then be evaluated separately as described in
:ref:`go-expression-eval`.

242
guide/go_decoding_hcldec.rst Normal file
View File

@ -0,0 +1,242 @@
.. go:package:: hcldec
.. _go-decoding-hcldec:
Decoding With Dynamic Schema
============================
In section :ref:`go-decoding-gohcl`, we saw the most straightforward way to
access the content from an HCL file, decoding directly into a Go value whose
type is known at application compile time.
For some applications, it is not possible to know the schema of the entire
configuration when the application is built. For example, `HashiCorp Terraform`_
uses HCL as the foundation of its configuration language, but parts of the
configuration are handled by plugins loaded dynamically at runtime, and so
the schemas for these portions cannot be encoded directly in the Terraform
source code.
HCL's ``hcldec`` package offers a different approach to decoding that allows
schemas to be created at runtime, and the result to be decoded into
dynamically-typed data structures.
The sections below are an overview of the main parts of package ``hcldec``.
For full details, see
`the package godoc <https://godoc.org/github.com/hashicorp/hcl2/hcldec>`_.
.. _`HashiCorp Terraform`: https://www.terraform.io/
Decoder Specification
---------------------
Whereas :go:pkg:`gohcl` infers the expected schema by using reflection against
the given value, ``hcldec`` obtains schema through a decoding *specification*,
which is a set of instructions for mapping HCL constructs onto a dynamic
data structure.
The ``hcldec`` package contains a number of different specifications, each
implementing :go:type:`hcldec.Spec` and having a ``Spec`` suffix on its name.
Each spec has two distinct functions:
* Adding zero or more validation constraints on the input configuration file.
* Producing a result value based on some elements from the input file.
The most common pattern is for the top-level spec to be a
:go:type:`hcldec.ObjectSpec` with nested specifications defining either blocks
or attributes, depending on whether the configuration file will be
block-structured or flat.
.. code-block:: go
spec := hcldec.ObjectSpec{
"io_mode": &hcldec.AttrSpec{
Name: "io_mode",
Type: cty.String,
},
"services": &hcldec.BlockMapSpec{
TypeName: "service",
LabelNames: []string{"type", "name"},
Nested: hcldec.ObjectSpec{
"listen_addr": &hcldec.AttrSpec{
Name: "listen_addr",
Type: cty.String,
Required: true,
},
"processes": &hcldec.BlockMapSpec{
TypeName: "process",
LabelNames: []string{"name"},
Nested: hcldec.ObjectSpec{
"command": &hcldec.AttrSpec{
Name: "command",
Type: cty.List(cty.String),
Required: true,
},
},
},
},
},
}
val, moreDiags := hcldec.Decode(f.Body, spec, nil)
diags = append(diags, moreDiags...)
The above specification expects a configuration shaped like our example in
:ref:`intro`, and calls for it to be decoded into a dynamic data structure
that would have the following shape if serialized to JSON:
.. code-block:: JSON
{
"io_mode": "async",
"services": {
"http": {
"web_proxy": {
"listen_addr": "127.0.0.1:8080",
"processes": {
"main": {
"command": ["/usr/local/bin/awesome-app", "server"]
},
"mgmt": {
"command": ["/usr/local/bin/awesome-app", "mgmt"]
}
}
}
}
}
}
.. go:package:: cty
Types and Values With ``cty``
-----------------------------
HCL's expression interpreter is implemented in terms of another library called
:go:pkg:`cty`, which provides a type system which HCL builds on and a robust
representation of dynamic values in that type system. You could think of
:go:pkg:`cty` as being a bit like Go's own :go:pkg:`reflect`, but for the
results of HCL expressions rather than Go programs.
The full details of this system can be found in
`its own repository <https://github.com/zclconf/go-cty>`_, but this section
will cover the most important highlights, because ``hcldec`` specifications
include :go:pkg:`cty` types (as seen in the above example) and its results are
:go:pkg:`cty` values.
``hcldec`` works directly with :go:pkg:`cty` — as opposed to converting values
directly into Go native types — because the functionality of the :go:pkg:`cty`
packages then allows further processing of those values without any loss of
fidelity or range. For example, :go:pkg:`cty` defines a JSON encoding of its
values that can be decoded losslessly as long as both sides agree on the value
type that is expected, which is a useful capability in systems where some sort
of RPC barrier separates the main program from its plugins.
Types are instances of :go:type:`cty.Type`, and are constructed from functions
and variables in :go:pkg:`cty` as shown in the above example, where the string
attributes are typed as ``cty.String``, which is a primitive type, and the list
attribute is typed as ``cty.List(cty.String)``, which constructs a new list
type with string elements.
Values are instances of :go:type:`cty.Value`, and can also be constructed from
functions in :go:pkg:`cty`, using the functions that include ``Val`` in their
names or using the operation methods available on :go:type:`cty.Value`.
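For example, a few values constructed directly (a small sketch; the names are
arbitrary):
.. code-block:: go

   name := cty.StringVal("example")
   port := cty.NumberIntVal(8080)
   tags := cty.ListVal([]cty.Value{cty.StringVal("a"), cty.StringVal("b")})

   // Operation methods return new values rather than mutating the receiver.
   doubled := port.Multiply(cty.NumberIntVal(2))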
In most cases you will eventually want to use the resulting data as native Go
types, to pass it to non-:go:pkg:`cty`-aware code. To do this, see the guides
on
`Converting between types <https://github.com/zclconf/go-cty/blob/master/docs/convert.md>`_
(staying within :go:pkg:`cty`) and
`Converting to and from native Go values <https://github.com/zclconf/go-cty/blob/master/docs/gocty.md>`_.
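As a brief sketch, assuming the decoded object value ``val`` from the example
above and the ``gocty`` helper package from the :go:pkg:`cty` repository:
.. code-block:: go

   // Direct access, assuming io_mode was set to a non-null string:
   ioMode := val.GetAttr("io_mode").AsString()

   // Or convert a cty value into a native Go value using gocty:
   var ioMode2 string
   err := gocty.FromCtyValue(val.GetAttr("io_mode"), &ioMode2)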
Partial Decoding
----------------
Because the ``hcldec`` result is always a value, the input is always entirely
processed in a single call, unlike with :go:pkg:`gohcl`.
However, both :go:pkg:`gohcl` and :go:pkg:`hcldec` take :go:type:`hcl.Body` as
the representation of input, and so it is possible and common to mix them both
in the same program.
A common situation is that :go:pkg:`gohcl` is used in the main program to
decode the top level of configuration, which then allows the main program to
determine which plugins need to be loaded to process the leaf portions of
configuration. In this case, the portions that will be interpreted by plugins
are retained as opaque :go:type:`hcl.Body` until the plugins have been loaded,
and then each plugin provides its :go:type:`hcldec.Spec` to allow decoding the
plugin-specific configuration into a :go:type:`cty.Value` which can then be
transmitted to the plugin for further processing.
In our example from :ref:`intro`, perhaps each of the different service types
is managed by a plugin, and so the main program would decode the block headers
to learn which plugins are needed, but process the block bodies dynamically:
.. code-block:: go
type ServiceConfig struct {
Type string `hcl:"type,label"`
Name string `hcl:"name,label"`
PluginConfig hcl.Body `hcl:",remain"`
}
type Config struct {
IOMode string `hcl:"io_mode"`
Services []ServiceConfig `hcl:"service,block"`
}
var c Config
moreDiags := gohcl.DecodeBody(f.Body, nil, &c)
diags = append(diags, moreDiags...)
if moreDiags.HasErrors() {
// (show diags in the UI)
return
}
for _, sc := range c.Services {
pluginName := sc.Type
// Totally-hypothetical plugin manager (not part of HCL)
plugin, err := pluginMgr.GetPlugin(pluginName)
if err != nil {
diags = diags.Append(&hcl.Diagnostic{ /* ... */ })
continue
}
spec := plugin.ConfigSpec() // returns hcldec.Spec
// Decode the block body using the plugin's given specification
configVal, moreDiags := hcldec.Decode(sc.PluginConfig, spec, nil)
diags = append(diags, moreDiags...)
if moreDiags.HasErrors() {
continue
}
// Again, hypothetical API within your application itself, and not
// part of HCL. Perhaps plugin system serializes configVal as JSON
// and sends it over to the plugin.
svc := plugin.NewService(configVal)
serviceMgr.AddService(sc.Name, svc)
}
Variables and Functions
-----------------------
The final argument to ``hcldec.Decode`` is an expression evaluation context,
just as with ``gohcl.DecodeBody``.
This object can be constructed using
:ref:`the gohcl helper function <go-decoding-gohcl-evalcontext>` as before if desired, but
you can also choose to work directly with :go:type:`hcl.EvalContext` as
discussed in :ref:`go-expression-eval`:
.. code-block:: go
ctx := &hcl.EvalContext{
Variables: map[string]cty.Value{
"pid": cty.NumberIntVal(int64(os.Getpid())),
},
}
val, moreDiags := hcldec.Decode(f.Body, spec, ctx)
diags = append(diags, moreDiags...)
As you can see, this lower-level API also uses :go:pkg:`cty`, so it can be
particularly convenient in situations where the result of dynamically decoding
one block must be available to expressions in another block.

199
guide/go_decoding_lowlevel.rst Normal file
View File

@ -0,0 +1,199 @@
.. _go-decoding-lowlevel:
Advanced Decoding With The Low-level API
========================================
In previous sections we've discussed :go:pkg:`gohcl` and :go:pkg:`hcldec`,
which both deal with decoding of HCL bodies and the expressions within them
using a high-level description of the expected configuration schema.
Both of these packages are implemented in terms of HCL's low-level decoding
interfaces, which we will explore in this section.
HCL decoding in the low-level API has two distinct phases:
* Structural decoding: analyzing the arguments and nested blocks present in a
particular body.
* Expression evaluation: obtaining final values for each argument expression
found during structural decoding.
The low-level API gives the calling application full control over when each
body is decoded and when each expression is evaluated, allowing for more
complex configuration formats where e.g. different variables are available in
different contexts, or perhaps expressions within one block can refer to
values defined in another block.
The low-level API also gives more detailed access to source location
information for decoded elements, and so may be desirable for applications that
do a lot of additional validation of decoded data where more specific source
locations lead to better diagnostic messages.
Since all of the decoding mechanisms work with the same :go:type:`hcl.Body`
type, it is fine and expected to mix them within an application to get access
to the more detailed information where needed while using the higher-level APIs
for the more straightforward portions of a configuration language.
The following subsections will give an overview of the low-level API. For full
details, see `the godoc reference <https://godoc.org/github.com/hashicorp/hcl2/hcl>`_.
Structural Decoding
-------------------
As seen in prior sections, :go:type:`hcl.Body` is an opaque representation of
the arguments and child blocks at a particular nesting level. An HCL file has
a root body containing the top-level elements, and then each nested block has
its own body presenting its own content.
:go:type:`hcl.Body` is a Go interface whose methods serve as the structural
decoding API:
.. go:currentpackage:: hcl
.. go:type:: Body
Represents the structural elements at a particular nesting level.
.. go:function:: func (b Body) Content(schema *BodySchema) (*BodyContent, Diagnostics)
Decode the content from the receiving body using the given schema. The
schema is considered exhaustive of all content within the body, and so
any elements not covered by the schema will generate error diagnostics.
.. go:function:: func (b Body) PartialContent(schema *BodySchema) (*BodyContent, Body, Diagnostics)
Similar to `Content`, but allows for additional arguments and block types
that are not described in the given schema. The additional body return
value is a special body that contains only the *remaining* elements, after
extraction of the ones covered by the schema. This returned body can be
used to decode the remaining content elsewhere in the calling program.
.. go:function:: func (b Body) JustAttributes() (Attributes, Diagnostics)
Decode the content from the receiving body in a special *attributes-only*
mode, allowing the calling application to enumerate the arguments given
inside the body without needing to predict them in schema.
When this method is used, a body can be treated somewhat like a map
expression, but it still has a rigid structure where the arguments must
be given directly with no expression evaluation. This is an advantage for
declarations that must themselves be resolved before expression
evaluation is possible.
If the body contains any blocks, error diagnostics are returned. JSON
syntax relies on schema to distinguish arguments from nested blocks, and
so a JSON body in attributes-only mode will treat all JSON object
properties as arguments.
.. go:function:: func (b Body) MissingItemRange() Range
Returns a source range that points to where an absent required item in
the body might be placed. This is a "best effort" sort of thing, required
only to be somewhere inside the receiving body, as a way to give source
location information for a "missing required argument" sort of error.
The main content-decoding methods each require a :go:type:`hcl.BodySchema`
object describing the expected content. The fields of this type describe the
expected arguments and nested block types respectively:
.. code-block:: go
schema := &hcl.BodySchema{
Attributes: []hcl.AttributeSchema{
{
Name: "io_mode",
Required: false,
},
},
Blocks: []hcl.BlockHeaderSchema{
{
Type: "service",
LabelNames: []string{"type", "name"},
},
},
}
content, moreDiags := body.Content(schema)
diags = append(diags, moreDiags...)
:go:type:`hcl.BodyContent` is the result of both ``Content`` and
``PartialContent``, giving the actual attributes and nested blocks that were
found. Since arguments are uniquely named within a body and unordered, they
are returned as a map. Nested blocks are ordered and may have many instances
of a given type, so they are returned all together in a single slice for
further interpretation by the caller.
Unlike the two higher-level approaches, the low-level API *always* works only
with one nesting level at a time. Decoding a nested block returns the "header"
for that block, giving its type and label values, but its body remains an
:go:type:`hcl.Body` for later decoding.
Each returned attribute corresponds to one of the arguments in the body, and
it has an :go:type:`hcl.Expression` object that can be used to obtain a value
for the argument during expression evaluation, as described in the next
section.
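For example, a sketch of walking the result of the ``Content`` call above,
using ``fmt`` only for illustration:
.. code-block:: go

   for name, attr := range content.Attributes {
       val, moreDiags := attr.Expr.Value(nil)
       diags = append(diags, moreDiags...)
       fmt.Printf("argument %q has value %#v\n", name, val)
   }

   for _, block := range content.Blocks {
       fmt.Printf("found %q block with labels %v\n", block.Type, block.Labels)
       // block.Body is a new hcl.Body for the block's own content, to be
       // decoded separately with another schema or decoding method.
   }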
Expression Evaluation
---------------------
Expression evaluation *in general* has its own section, imaginatively titled
:ref:`go-expression-eval`, so this section will focus only on how it is
achieved in the low-level API.
All expression evaluation in the low-level API starts with an
:go:type:`hcl.Expression` object. This is another interface type, with various
implementations depending on the expression type and the syntax it was parsed
from.
.. go:currentpackage:: hcl
.. go:type:: Expression
Represents a single unevaluated expression.
.. go:function:: func (e Expression) Value(ctx *EvalContext) (cty.Value, Diagnostics)
Evaluates the receiving expression in the given evaluation context. The
result is a :go:type:`cty.Value` representing the result value, along
with any diagnostics that were raised during evaluation.
If the diagnostics contains errors, the value may be incomplete or
invalid and should either be discarded altogether or used with care for
analysis.
.. go:function:: func (e Expression) Variables() []Traversal
Returns information about any nested expressions that access variables
from the *global* evaluation context. Does not include references to
temporary local variables, such as those generated by a
"``for`` expression".
.. go:function:: func (e Expression) Range() Range
Returns the source range for the entire expression. This can be useful
when generating application-specific diagnostic messages, such as
value validation errors.
.. go:function:: func (e Expression) StartRange() Range
Similar to ``Range``, but if the expression is complex, such as a tuple
or object constructor, may indicate only the opening tokens for the
construct to avoid creating an overwhelming source code snippet.
This should be used in diagnostic messages only in situations where the
error is clearly with the construct itself and not with the overall
expression. For example, a type error indicating that a tuple was not
expected might use ``StartRange`` to draw attention to the beginning
of a tuple constructor, without highlighting the entire expression.
Method ``Value`` is the primary API for expressions, and takes the same kind
of evaluation context object described in :ref:`go-expression-eval`.
.. code-block:: go
ctx := &hcl.EvalContext{
Variables: map[string]cty.Value{
"name": cty.StringVal("Ermintrude"),
"age": cty.NumberIntVal(32),
},
}
val, moreDiags := expr.Value(ctx)
diags = append(diags, moreDiags...)

97
guide/go_diagnostics.rst Normal file
View File

@ -0,0 +1,97 @@
.. _go-diagnostics:
Diagnostic Messages
===================
An important concern for any machine-readable language intended for human authoring is
to produce good error messages when the input is somehow invalid, or has
other problems.
HCL uses *diagnostics* to describe problems in an end-user-oriented manner,
such that the calling application can render helpful error or warning messages.
The word "diagnostic" is a general term that covers both errors and warnings,
where errors are problems that prevent complete processing while warnings are
possible concerns that do not block processing.
HCL deviates from usual Go API practice by returning its own ``hcl.Diagnostics``
type, instead of Go's own ``error`` type. This allows functions to return
warnings without accompanying errors while not violating the usual expectation
that the absence of errors is indicated by a nil ``error``.
In order to easily accumulate and return multiple diagnostics at once, the
usual pattern for functions returning diagnostics is to gather them in a
local variable and then return it at the end of the function, or possibly
earlier if the function cannot continue due to the problems.
.. code-block:: go
func returningDiagnosticsExample() hcl.Diagnostics {
var diags hcl.Diagnostics
// ...
// Call a function that may itself produce diagnostics.
f, moreDiags := parser.ParseHCLFile("example.conf")
// always append, in case warnings are present
diags = append(diags, moreDiags...)
if diags.HasErrors() {
// If we can't safely continue in the presence of errors here, we
// can optionally return early.
return diags
}
// ...
return diags
}
A common variant of the above pattern is calling another diagnostics-generating
function in a loop, using ``continue`` to begin the next iteration when errors
are detected, but still completing all iterations and returning the union of
all of the problems encountered along the way.
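A sketch of that loop pattern, assuming a ``parser`` and a ``filenames`` slice
provided elsewhere in the application:
.. code-block:: go

   var diags hcl.Diagnostics
   var files []*hcl.File
   for _, filename := range filenames {
       f, moreDiags := parser.ParseHCLFile(filename)
       diags = append(diags, moreDiags...)
       if moreDiags.HasErrors() {
           continue // skip this file, but keep its diagnostics and carry on
       }
       files = append(files, f)
   }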
In :ref:`go-parsing`, we saw that the parser can generate diagnostics which
are related to syntax problems within the loaded file. Further steps to decode
content from the loaded file can also generate diagnostics related to *semantic*
problems within the file, such as invalid expressions or type mismatches, and
so a program using HCL will generally need to accumulate diagnostics across
these various steps and then render them in the application UI somehow.
Rendering Diagnostics in the UI
-------------------------------
The best way to render diagnostics to an end-user will depend a lot on the
type of application: they might be printed into a terminal, written into a
log for later review, or even shown in a GUI.
HCL leaves the responsibility for rendering diagnostics to the calling
application, but since rendering to a terminal is a common case for command-line
tools, the `hcl` package contains a default implementation of this in the
form of a "diagnostic text writer":
.. code-block:: go
wr := hcl.NewDiagnosticTextWriter(
os.Stdout, // writer to send messages to
parser.Files(), // the parser's file cache, for source snippets
78, // wrapping width
true, // generate colored/highlighted output
)
wr.WriteDiagnostics(diags)
This default implementation of diagnostic rendering includes relevant lines
of source code for context, like this:
::
Error: Unsupported block type
on example.tf line 2, in resource "aws_instance" "example":
2: provisionr "local-exec" {
Blocks of type "provisionr" are not expected here. Did you mean "provisioner"?
If the "color" flag is enabled, the severity will be additionally indicated by
a text color and the relevant portion of the source code snippet will be
underlined to draw further attention.

149
guide/go_expression_eval.rst Normal file
View File

@ -0,0 +1,149 @@
.. _go-expression-eval:
Expression Evaluation
=====================
Each argument attribute in a configuration file is interpreted as an
expression. In the HCL native syntax, certain basic expression functionality
is always available, such as arithmetic and template strings, and the calling
application can extend this by making available specific variables and/or
functions via an *evaluation context*.
We saw in :ref:`go-decoding-gohcl` and :ref:`go-decoding-hcldec` some basic
examples of populating an evaluation context to make a variable available.
This section will look more closely at the ``hcl.EvalContext`` type and how
HCL expression evaluation behaves in different cases.
This section does not discuss in detail the expression syntax itself. For more
information on that, see the HCL Native Syntax specification.
.. go:currentpackage:: hcl
.. go:type:: EvalContext
``hcl.EvalContext`` is the type used to describe the variables and functions
available during expression evaluation, if any. Its usage is described in
the following sections.
Defining Variables
------------------
As we saw in :ref:`go-decoding-hcldec`, HCL represents values using an
underlying library called :go:pkg:`cty`. When defining variables, their values
must be given as :go:type:`cty.Value` values.
A full description of the types and value constructors in :go:pkg:`cty` is
in `the reference documentation <https://github.com/zclconf/go-cty/blob/master/docs/types.md>`_.
Variables in HCL are defined by assigning values into a map from string names
to :go:type:`cty.Value`:
.. code-block:: go
ctx := &hcl.EvalContext{
Variables: map[string]cty.Value{
"name": cty.StringVal("Ermintrude"),
"age": cty.NumberIntVal(32),
},
}
If this evaluation context were passed to one of the evaluation functions we
saw in previous sections, the user would be able to refer to these variable
names in any argument expression appearing in the evaluated portion of
configuration:
.. code-block:: hcl
message = "${name} is ${age} ${age == 1 ? "year" : "years"} old!"
If you place ``cty``'s *object* values in the evaluation context, then their
attributes can be referenced using the HCL attribute syntax, allowing for more
complex structures:
.. code-block:: go
ctx := &hcl.EvalContext{
Variables: map[string]cty.Value{
"path": cty.ObjectVal(map[string]cty.Value{
"root": cty.StringVal(rootDir),
"module": cty.StringVal(moduleDir),
"current": cty.StringVal(currentDir),
}),
},
}
.. code-block:: hcl
source_file = "${path.module}/foo.txt"
.. _go-expression-funcs:
Defining Functions
------------------
Custom functions can be defined by your application to allow users of its
language to transform data in application-specific ways. The underlying
function mechanism is also provided by :go:pkg:`cty`, allowing you to define
the arguments a given function expects, what value type it will return for
given argument types, etc. The full functions model is described in the
``cty`` documentation section
`Functions System <https://github.com/zclconf/go-cty/blob/master/docs/functions.md>`_.
There are `a number of "standard library" functions <https://godoc.org/github.com/apparentlymart/go-cty/cty/function/stdlib>`_
available in a ``stdlib`` package within the :go:pkg:`cty` repository, avoiding
the need for each application to re-implement basic functions for string
manipulation, list manipulation, etc. It also includes function-shaped versions
of several operations that are native operators in HCL, which should generally
*not* be exposed as functions in HCL-based configuration formats to avoid user
confusion.
You can define functions in the ``Functions`` field of :go:type:`hcl.EvalContext`:
.. code-block:: go
ctx := &hcl.EvalContext{
Variables: map[string]cty.Value{
"name": cty.StringVal("Ermintrude"),
},
Functions: map[string]function.Function{
"upper": stdlib.UpperFunc,
"lower": stdlib.LowerFunc,
"min": stdlib.MinFunc,
"max": stdlib.MaxFunc,
"strlen": stdlib.StrlenFunc,
"substr": stdlib.SubstrFunc,
},
}
If this evaluation context were passed to one of the evaluation functions we
saw in previous sections, the user would be able to call any of these functions
in any argument expression appearing in the evaluated portion of configuration:
.. code-block:: hcl
message = "HELLO, ${upper(name)}!"
Expression Evaluation Modes
---------------------------
HCL uses a different expression evaluation mode depending on the evaluation
context provided. In HCL native syntax, evaluation modes are used to provide
more relevant error messages. In JSON syntax, which embeds the native
expression syntax in strings using "template" syntax, the evaluation mode
determines whether strings are evaluated as templates at all.
If the given :go:type:`hcl.EvalContext` is ``nil``, native syntax expressions
will react to users attempting to refer to variables or functions by producing
errors indicating that these features are not available at all, rather than
by saying that the specific variable or function does not exist. JSON syntax
strings will not be evaluated as templates *at all* in this mode, making them
function as literal strings.
If the evaluation context is non-``nil`` but either ``Variables`` or
``Functions`` within it is ``nil``, native syntax will similarly produce
"not supported" error messages. JSON syntax strings *will* parse templates
in this case, but can also generate "not supported" messages if e.g. the
user accesses a variable when the variables map is ``nil``.
If neither map is ``nil``, HCL assumes that both variables and functions are
supported and will instead produce error messages stating that the specific
variable or function accessed by the user is not defined.
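For example, the same expression evaluated under two of the modes described
above (a sketch):
.. code-block:: go

   // No context at all: only literals and operators are allowed, and any
   // variable or function reference reports that such features are not
   // supported here.
   val, diags := expr.Value(nil)

   // Variables but no functions: any function call reports that functions
   // are not supported in this context.
   val, diags = expr.Value(&hcl.EvalContext{
       Variables: map[string]cty.Value{
           "name": cty.StringVal("Ermintrude"),
       },
   })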

64
guide/go_parsing.rst Normal file
View File

@ -0,0 +1,64 @@
.. _go-parsing:
Parsing HCL Input
=================
The first step in processing HCL input provided by a user is to parse it.
Parsing turns the raw bytes from an input file into a higher-level
representation of the arguments and blocks, ready to be *decoded* into an
application-specific form.
The main entry point into HCL parsing is :go:pkg:`hclparse`, which provides
:go:type:`hclparse.Parser`:
.. code-block:: go
parser := hclparse.NewParser()
f, diags := parser.ParseHCLFile("server.conf")
Variable ``f`` is then a pointer to an :go:type:`hcl.File`, which is an
opaque abstract representation of the file, ready to be decoded.
Variable ``diags`` describes any errors or warnings that were encountered
during processing; HCL conventionally uses this in place of the usual ``error``
return value in Go, to allow returning a mixture of multiple errors and
warnings together with enough information to present good error messages to the
user. We'll cover this in more detail in the next section,
:ref:`go-diagnostics`.
.. go:package:: hclparse
Package ``hclparse``
--------------------
.. go:type:: Parser
.. go:function:: func NewParser() *Parser
Constructs a new parser object. Each parser contains a cache of files
that have already been read, so repeated calls to load the same file
will return the same object.
.. go:function:: func (*Parser) ParseHCL(src []byte, filename string) (*hcl.File, hcl.Diagnostics)
Parse the given source code as HCL native syntax, saving the result into
the parser's file cache under the given filename.
.. go:function:: func (*Parser) ParseHCLFile(filename string) (*hcl.File, hcl.Diagnostics)
Parse the contents of the given file as HCL native syntax. This is a
convenience wrapper around ParseHCL that first reads the file into memory.
.. go:function:: func (*Parser) ParseJSON(src []byte, filename string) (*hcl.File, hcl.Diagnostics)
Parse the given source code as JSON syntax, saving the result into
the parser's file cache under the given filename.
.. go:function:: func (*Parser) ParseJSONFile(filename string) (*hcl.File, hcl.Diagnostics)
Parse the contents of the given file as JSON syntax. This is a
convenience wrapper around ParseJSON that first reads the file into memory.
The above list just highlights the main functions in this package.
For full documentation, see
`the hclparse godoc <https://godoc.org/github.com/hashicorp/hcl2/hclparse>`_.
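For example, a sketch of selecting a parse method based on the file extension
(``filename`` comes from elsewhere in the application):
.. code-block:: go

   parser := hclparse.NewParser()

   var f *hcl.File
   var diags hcl.Diagnostics
   if strings.HasSuffix(filename, ".json") {
       f, diags = parser.ParseJSONFile(filename)
   } else {
       f, diags = parser.ParseHCLFile(filename)
   }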

315
guide/go_patterns.rst Normal file
View File

@ -0,0 +1,315 @@
Design Patterns for Complex Systems
===================================
In previous sections we've seen an overview of some different ways an
application can decode a language it has defined in terms of the HCL grammar.
For many applications, those mechanisms are sufficient. However, there are
some more complex situations that can benefit from some additional techniques.
This section lists a few of these situations and ways to use the HCL API to
accommodate them.
.. _go-interdep-blocks:
Interdependent Blocks
---------------------
In some configuration languages, the variables available for use in one
configuration block depend on values defined in other blocks.
For example, in Terraform many of the top-level constructs are also implicitly
definitions of values that are available for use in expressions elsewhere:
.. code-block:: hcl
variable "network_numbers" {
type = list(number)
}
variable "base_network_addr" {
type = string
default = "10.0.0.0/8"
}
locals {
network_blocks = {
for x in var.network_numbers:
x => cidrsubnet(var.base_network_addr, 8, x)
}
}
resource "cloud_subnet" "example" {
for_each = local.network_blocks
cidr_block = each.value
}
output "subnet_ids" {
value = cloud_subnet.example[*].id
}
In this example, the two ``variable`` blocks make ``var.network_numbers``
and ``var.base_network_addr`` available to expressions, the
``resource "cloud_subnet" "example"`` block makes ``cloud_subnet.example``
available, etc.
Terraform achieves this by decoding the top-level structure in isolation to
start. You can do this either using the low-level API or using :go:pkg:`gohcl`
with :go:type:`hcl.Body` fields tagged as "remain".
Once you have a separate body for each top-level block, you can inspect each
of the attribute expressions inside using the ``Variables`` method on
:go:type:`hcl.Expression`, or the ``Variables`` function from package
:go:pkg:`hcldec` if you will eventually use its higher-level API to decode as
Terraform does.
The detected variable references can then be used to construct a dependency
graph between the blocks, and then perform a
`topological sort <https://en.wikipedia.org/wiki/Topological_sorting>`_ to
determine the correct order to evaluate each block's contents so that values
will always be available before they are needed.
Since :go:pkg:`cty` values are immutable, it is not convenient to directly
change values in a :go:type:`hcl.EvalContext` during this gradual evaluation,
so instead construct a specialized data structure that has a separate value
per object and construct an evaluation context from that each time a new
value becomes available.
Using :go:pkg:`hcldec` to evaluate block bodies is particularly convenient in
this scenario because it produces :go:type:`cty.Value` results which can then
just be directly incorporated into the evaluation context.
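For example, a sketch of collecting the root names referenced by one block's
arguments, which can then become edges in the dependency graph (``blockBody``
is the body of a block whose content consists only of arguments):
.. code-block:: go

   attrs, moreDiags := blockBody.JustAttributes()
   diags = append(diags, moreDiags...)

   refs := map[string]struct{}{}
   for _, attr := range attrs {
       for _, traversal := range attr.Expr.Variables() {
           refs[traversal.RootName()] = struct{}{}
       }
   }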
Distributed Systems
-------------------
Distributed systems cause a number of extra challenges, and configuration
management is rarely the worst of these. However, there are some specific
considerations for using HCL-based configuration in distributed systems.
For the sake of this section, we are concerned with distributed systems where
at least two separate components both depend on the content of HCL-based
configuration files. Real-world examples include the following:
* **HashiCorp Nomad** loads configuration (job specifications) in its servers
but also needs these results in its clients and in its various driver plugins.
* **HashiCorp Terraform** parses configuration in Terraform Core but can write
a partially-evaluated execution plan to disk and continue evaluation in a
separate process later. It must also pass configuration values into provider
plugins.
Broadly speaking, there are two approaches to allowing configuration to be
accessed in multiple subsystems, which the following subsections will discuss
separately.
Ahead-of-time Evaluation
^^^^^^^^^^^^^^^^^^^^^^^^
Ahead-of-time evaluation is the simplest path, with the configuration files
being entirely evaluated on entry to the system, and then only the resulting
*constant values* being passed between subsystems.
This approach is relatively straightforward because the resulting
:go:type:`cty.Value` results can be losslessly serialized as either JSON or
msgpack as long as all system components agree on the expected value types.
Aside from passing these values around "on the wire", parsing and decoding of
configuration proceeds as normal.
Both Nomad and Terraform use this approach for interacting with *plugins*,
because the plugins themselves are written by various different teams that do
not coordinate closely, and so doing all expression evaluation in the core
subsystems ensures consistency between plugins and simplifies plugin development.
In both applications, the plugin is expected to describe (using an
application-specific protocol) the schema it expects for each element of
configuration it is responsible for, allowing the core subsystems to perform
decoding on the plugin's behalf and pass a value that is guaranteed to conform
to the schema.
Gradual Evaluation
^^^^^^^^^^^^^^^^^^
Although ahead-of-time evaluation is relatively straightforward, it has the
significant disadvantage that all data available for access via variables or
functions must be known by whichever subsystem performs that initial
evaluation.
For example, in Terraform, the "plan" subcommand is responsible for evaluating
the configuration and presenting to the user an execution plan for approval, but
certain values in that plan cannot be determined until the plan is already
being applied, since the specific values used depend on remote API decisions
such as the allocation of opaque id strings for objects.
In Terraform's case, the creation of the plan and the eventual apply
of that plan *both* entail evaluating configuration, with the apply step
having a more complete set of input values and thus producing a more complete
result. However, this means that Terraform must somehow make the expressions
from the original input configuration available to the separate process that
applies the generated plan.
Good usability requires error and warning messages that are able to refer back
to specific sections of the input configuration as context for the reported
problem, and the best way to achieve this in a distributed system doing
gradual evaluation is to send the configuration *source code* between
subsystems. This is generally the most compact representation that retains
source location information, and will avoid any inconsistency caused by
introducing another intermediate serialization.
In Terraform, for example, the serialized plan incorporates both the data
structure describing the partial evaluation results from the plan phase and
the original configuration files that produced those results, which can then
be re-evaluated during the apply step.
In a gradual evaluation scenario, the application should verify correctness of
the input configuration as completely as possible at each stage. To help with
this, :go:pkg:`cty` has the concept of
`unknown values <https://github.com/zclconf/go-cty/blob/master/docs/concepts.md#unknown-values-and-the-dynamic-pseudo-type>`_,
which can stand in for values the application does not yet know while still
retaining correct type information. HCL expression evaluation reacts to unknown
values by performing type checking but then returning another unknown value,
causing the unknowns to propagate through expressions automatically.
.. code-block:: go
ctx := &hcl.EvalContext{
Variables: map[string]cty.Value{
"name": cty.UnknownVal(cty.String),
"age": cty.UnknownVal(cty.Number),
},
}
val, moreDiags := expr.Value(ctx)
diags = append(diags, moreDiags...)
Each time an expression is re-evaluated with additional information, fewer of
the input values will be unknown and thus more of the result will be known.
Eventually the application should evaluate the expressions with no unknown
values at all, which then guarantees that the result will also be wholly-known.
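Whether a result is complete can be checked before acting on it (a sketch):
.. code-block:: go

   if !val.IsWhollyKnown() {
       // Some inputs were unknown, so defer final validation and any
       // side-effects until a later evaluation pass has filled them in.
   }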
Static References, Calls, Lists, and Maps
-----------------------------------------
In most cases, we care more about the final result value of an expression than
how that value was obtained. A particular list argument, for example, might
be defined by the user via a tuple constructor, by a `for` expression, or by
assigning the value of a variable that has a suitable list type.
In some special cases, the structure of the expression is more important than
the result value, or an expression may not *have* a reasonable result value.
For example, in Terraform there are a few arguments that call for the user
to name another object by reference, rather than provide an object value:
.. code-block:: hcl
resource "cloud_network" "example" {
# ...
}
resource "cloud_subnet" "example" {
cidr_block = "10.1.2.0/24"
depends_on = [
cloud_network.example,
]
}
The ``depends_on`` argument in the second ``resource`` block *appears* as an
expression that would construct a single-element tuple containing an object
representation of the first resource block. However, Terraform uses this
expression to construct its dependency graph, and so it needs to see
specifically that this expression refers to ``cloud_network.example``, rather
than determine a result value for it.
HCL offers a number of "static analysis" functions to help with this sort of
situation. These all live in the :go:pkg:`hcl` package, and each one imposes
a particular requirement on the syntax tree of the expression it is given,
and returns a result derived from that if the expression conforms to that
requirement.
.. go:currentpackage:: hcl
.. go:function:: func ExprAsKeyword(expr Expression) string
This function attempts to interpret the given expression as a single keyword,
returning that keyword as a string if possible.
A "keyword" for the purposes of this function is an expression that can be
understood as a valid single identifier. For example, the simple variable
reference ``foo`` can be interpreted as a keyword, while ``foo.bar``
cannot.
As a special case, the language-level keywords ``true``, ``false``, and
``null`` are also considered to be valid keywords, allowing the calling
application to disregard their usual meaning.
If the given expression cannot be reduced to a single keyword, the result
is an empty string. Since an empty string is never a valid keyword, this
result unambiguously signals failure.
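For example, a sketch of an application accepting the keywords ``auto`` and
``manual`` for a hypothetical ``mode`` argument (``cfg``, ``ModeAuto``, and
``ModeManual`` are application-defined):
.. code-block:: go

   switch hcl.ExprAsKeyword(attr.Expr) {
   case "auto":
       cfg.Mode = ModeAuto
   case "manual":
       cfg.Mode = ModeManual
   default:
       diags = diags.Append(&hcl.Diagnostic{
           Severity: hcl.DiagError,
           Summary:  "Invalid mode",
           Detail:   `The mode argument must be either "auto" or "manual".`,
           Subject:  attr.Expr.Range().Ptr(),
       })
   }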
.. go:function:: func AbsTraversalForExpr(expr Expression) (Traversal, Diagnostics)
This is a generalization of ``ExprAsKeyword`` that will accept anything that
can be interpreted as a *traversal*, which is a variable name followed by
zero or more attribute access or index operators with constant operands.
For example, all of ``foo``, ``foo.bar`` and ``foo[0]`` are valid
traversals, but ``foo[bar]`` is not, because the ``bar`` index is not
constant.
This is the function that Terraform uses to interpret the items within the
``depends_on`` sequence in our example above.
As with ``ExprAsKeyword``, this function has a special case that the
keywords ``true``, ``false``, and ``null`` will be accepted as if they were
variable names by this function, allowing ``null.foo`` to be interpreted
as a traversal even though it would be invalid if evaluated.
If error diagnostics are returned, the traversal result is invalid and
should not be used.
.. go:function:: func RelTraversalForExpr(expr Expression) (Traversal, Diagnostics)
This is very similar to ``AbsTraversalForExpr``, but the result is a
*relative* traversal, which is one whose first name is considered to be
an attribute of some other (implied) object.
The processing rules are identical to ``AbsTraversalForExpr``, with the
only exception being that the first element of the returned traversal is
marked as being an attribute, rather than as a root variable.
.. go:function:: func ExprList(expr Expression) ([]Expression, Diagnostics)
This function requires that the given expression be a tuple constructor,
and if so returns a slice of the element expressions in that constructor.
Applications can then perform further static analysis on these, or evaluate
them as normal.
If error diagnostics are returned, the result is invalid and should not be
used.
This is the function that Terraform uses to interpret the expression
assigned to ``depends_on`` in our example above, then in turn using
``AbsTraversalForExpr`` on each enclosed expression.
.. go:function:: func ExprMap(expr Expression) ([]KeyValuePair, Diagnostics)
This function requires that the given expression be an object constructor,
and if so returns a slice of the element key/value pairs in that constructor.
Applications can then perform further static analysis on these, or evaluate
them as normal.
If error diagnostics are returned, the result is invalid and should not be
used.
.. go:function:: func ExprCall(expr Expression) (*StaticCall, Diagnostics)
This function requires that the given expression be a function call, and
if so returns an object describing the name of the called function and
expression objects representing the call arguments.
If error diagnostics are returned, the result is invalid and should not be
used.
The ``Variables`` method on :go:type:`hcl.Expression` is also considered to be
a "static analysis" helper, but is built in as a fundamental feature because
analysis of referenced variables is often important for static validation and
for implementing interdependent blocks as we saw in the section above.

35
guide/index.rst Normal file
View File

@ -0,0 +1,35 @@
HCL Config Language Toolkit
===========================
HCL is a toolkit for creating structured configuration languages that are both
human- and machine-friendly, for use with command-line tools, servers, etc.
HCL has both a native syntax, intended to be pleasant to read and write for
humans, and a JSON-based variant that is easier for machines to generate and
parse. The native syntax is inspired by libucl_, `nginx configuration`_, and
others.
It includes an expression syntax that allows basic inline computation and, with
support from the calling application, use of variables and functions for more
dynamic configuration languages.
HCL provides a set of constructs that can be used by a calling application to
construct a configuration language. The application defines which argument
names and nested block types are expected, and HCL parses the configuration
file, verifies that it conforms to the expected structure, and returns
high-level objects that the application can use for further processing.
At present, HCL is primarily intended for use in applications written in Go_,
via its library API.
.. toctree::
:maxdepth: 1
:caption: Contents:
intro
go
language_design
.. _libucl: https://github.com/vstakhov/libucl
.. _`nginx configuration`: http://nginx.org/en/docs/beginners_guide.html#conf_structure
.. _Go: https://golang.org/

108
guide/intro.rst Normal file
View File

@ -0,0 +1,108 @@
.. _intro:
Introduction to HCL
===================
HCL-based configuration is built from two main constructs: arguments and
blocks. The following is an example of a configuration language for a
hypothetical application:
.. code-block:: hcl
io_mode = "async"
service "http" "web_proxy" {
listen_addr = "127.0.0.1:8080"
process "main" {
command = ["/usr/local/bin/awesome-app", "server"]
}
process "mgmt" {
command = ["/usr/local/bin/awesome-app", "mgmt"]
}
}
In the above example, ``io_mode`` is a top-level argument, while ``service``
introduces a block. Within the body of a block, further arguments and nested
blocks are allowed. A block type may also expect a number of *labels*, which
are the quoted names following the ``service`` keyword in the above example.
The specific keywords ``io_mode``, ``service``, ``process``, etc here are
application-defined. HCL provides the general block structure syntax, and
can validate and decode configuration based on the application's provided
schema.
HCL is a structured configuration language rather than a data structure
serialization language. This means that unlike languages such as JSON, YAML,
or TOML, HCL is always decoded using an application-defined schema.
However, HCL does have a JSON-based alternative syntax, which allows the same
structure above to be generated using a standard JSON serializer when users
wish to generate configuration programmatically rather than hand-write it:
.. code-block:: json
{
"io_mode": "async",
"service": {
"http": {
"web_proxy": {
"listen_addr": "127.0.0.1:8080",
"process": {
"main": {
"command": ["/usr/local/bin/awesome-app", "server"]
},
"mgmt": {
"command": ["/usr/local/bin/awesome-app", "mgmt"]
},
}
}
}
}
}
The calling application can choose which syntaxes to support. JSON syntax may
not be important or desirable for certain applications, but it is available for
applications that need it. The schema provided by the calling application
allows JSON input to be properly decoded even though JSON syntax is ambiguous
in various ways, such as whether a JSON object represents a nested block
or an object expression.
The collection of arguments and blocks at a particular nesting level is called
a *body*. A file always has a root body containing the top-level elements,
and each block also has its own body representing the elements within it.
The term "attribute" can also be used to refer to what we've called an
"argument" so far. The term "attribute" is also used for the fields of an
object value in argument expressions, and so "argument" is used to refer
specifically to the type of attribute that appears directly within a body.
The above examples show the general "texture" of HCL-based configuration. The
full details of the syntax are covered in the language specifications.
.. todo:: Once the language specification documents have settled into a
final location, link them from above.
Argument Expressions
--------------------
The value of an argument can be a literal value, as shown above, or it may be an
expression allowing arithmetic, deriving one value from another, and so on.
.. code-block:: hcl
listen_addr = env.LISTEN_ADDR
Built-in arithmetic and comparison operators are automatically available in all
HCL-based configuration languages. A calling application may optionally
provide variables that users can reference, like ``env`` in the above example,
and custom functions to transform values in application-specific ways.
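For example, in a hypothetical language whose application exposes an ``env``
variable and an ``upper`` function, arguments might be written as follows; the
argument names here are invented purely for illustration:
.. code-block:: hcl
# Arithmetic and comparison operators are always available
timeout_seconds = 30 * 2
debug_enabled   = env.DEBUG == "1"
# "upper" is assumed to be provided by the calling application
display_name = upper(env.SERVICE_NAME)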
Full details of the expression syntax are in the HCL native syntax
specification. Since JSON does not have an expression syntax, JSON-based
configuration files use the native syntax expression language embedded inside
JSON strings.
.. todo:: Once the language specification documents have settled into a
final location, link to the native syntax specification from above.

318
guide/language_design.rst Normal file
View File

@ -0,0 +1,318 @@
Configuration Language Design
=============================
In this section we will cover some conventions for HCL-based configuration
languages that can help make them feel consistent with other HCL-based
languages, and make the best use of HCL's building blocks.
HCL's native and JSON syntaxes both define a mapping from input bytes to a
higher-level information model. In designing a configuration language based on
HCL, your building blocks are the components in that information model:
blocks, arguments, and expressions.
Each calling application of HCL, then, effectively defines its own language.
Just as Atom and RSS are higher-level languages built on XML, HashiCorp
Terraform has a higher-level language built on HCL, while HashiCorp Nomad has
its own distinct language that is *also* built on HCL.
From an end-user perspective, these are distinct languages but have a common
underlying texture. Users of both are therefore likely to bring some
expectations from one to the other, and so this section is an attempt to
codify some of these shared expectations to reduce user surprise.
These are subjective guidelines, however, and so applications may choose to
ignore them entirely or ignore them in certain specialized cases. An
application providing a configuration language for a pre-existing system, for
example, may choose to eschew the identifier naming conventions in this section
in order to exactly match the existing names in that underlying system.
Language Keywords and Identifiers
---------------------------------
Much of the work in defining an HCL-based language is in selecting good names
for arguments, block types, variables, and functions.
The standard for naming in HCL is to use all-lowercase identifiers with
underscores separating words, like ``service`` or ``io_mode``. HCL identifiers
do allow uppercase letters and dashes, but this is primarily for natural
interfacing with external systems that may have other identifier conventions,
and so these should generally be avoided for the identifiers native to your
own language.
The distinction between "keywords" and other identifiers is really just a
convention. In your own language documentation, you may use the word "keyword"
to refer to names that are presented as an intrinsic part of your language,
such as important top-level block type names.
Block type names are usually singular, since each block defines a single
object. Use a plural block type name only if the block serves purely as a
namespacing container for a number of other objects. A block with a plural
type name will generally contain only nested blocks, and no arguments of its
own.
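For example, a plural block type serving purely as a container might look like
the following; the ``listeners`` and ``listener`` block types here are
hypothetical:
.. code-block:: hcl
# "listeners" has no arguments of its own; it only groups "listener" blocks
listeners {
  listener "http" {
    port = 8080
  }
  listener "https" {
    port = 8443
  }
}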
Argument names are also singular unless they expect a collection value, in
which case they should be plural. For example, ``name = "foo"`` but
``subnet_ids = ["abc", "123"]``.
Function names will generally *not* use underscores and will instead just run
words together, as is common in the C standard library. This is a result of
the fact that several of the standard library functions offered in ``cty``
(covered in a later section) have names that follow C library function names
like ``substr``. This is not a strong rule, and applications that use longer
names may choose to use underscores for them to improve readability.
Blocks vs. Object Values
------------------------
HCL blocks and argument values of object type have quite a similar appearance
in the native syntax, and are identical in JSON syntax:
.. code-block:: hcl
block {
foo = bar
}
# argument with object constructor expression
argument = {
foo = bar
}
In spite of this superficial similarity, there are some important differences
between these two forms.
The most significant difference is that a child block can contain nested blocks
of its own, while an object constructor expression can define only attributes
of the object it is creating.
The user-facing model for blocks is that they generally form the more "rigid"
structure of the language itself, while argument values can be more free-form.
An application will generally define in its schema and documentation all of
the arguments that are valid for a particular block type, while arguments
accepting object constructors are more appropriate for situations where the
arguments themselves are freely selected by the user, such as when the
expression will be converted by the application to a map type.
As a less contrived example, consider the ``resource`` block type in Terraform
and its use with a particular resource type ``aws_instance``:
.. code-block:: hcl
resource "aws_instance" "example" {
ami = "ami-abc123"
instance_type = "t2.micro"
tags = {
Name = "example instance"
}
ebs_block_device {
device_name = "hda1"
volume_size = 8
volume_type = "standard"
}
}
The top-level block type ``resource`` is fundamental to Terraform itself and
so an obvious candidate for block syntax: it maps directly onto an object in
Terraform's own domain model.
Within this block we see a mixture of arguments and nested blocks, all defined
as part of the schema of the ``aws_instance`` resource type. The ``tags``
map here is specified as an argument because its keys are free-form, chosen
by the user and mapped directly onto a map in the underlying system.
``ebs_block_device`` is specified as a nested block, because it is a separate
domain object within the remote system and has a rigid schema of its own.
As a special case, block syntax may sometimes be used with free-form keys if
those keys each serve as a separate declaration of some first-class object
in the language. For example, Terraform has a top-level block type ``locals``
which behaves in this way:
.. code-block:: hcl
locals {
instance_type = "t2.micro"
instance_id = aws_instance.example.id
}
Although the argument names in this block are arbitrarily selected by the
user, each one defines a distinct top-level object. In other words, this
approach is used to create a more ergonomic syntax for defining these simple
single-expression objects, as a pragmatic alternative to more verbose and
redundant declarations using blocks:
.. code-block:: hcl
local "instance_type" {
value = "t2.micro"
}
local "instance_id" {
value = aws_instance.example.id
}
The distinction between domain objects, language constructs and user data will
always be subjective, so the final decision is up to you as the language
designer.
Standard Functions
------------------
HCL itself does not define a common set of functions available in all HCL-based
languages; the built-in language operators give a baseline of functionality
that is always available, but applications are free to define functions as they
see fit.
With that said, there are a number of generally useful functions that don't
belong to the domain of any one application: string manipulation, sequence
manipulation, date formatting, JSON serialization and parsing, etc.
Given the general need such functions serve, it's helpful if a similar set of
functions is available with compatible behavior across multiple HCL-based
languages, assuming the language is for an application where function calls
make sense at all.
The Go implementation of HCL is built on an underlying type and function system
:go:pkg:`cty`, whose usage was introduced in :ref:`go-expression-funcs`. That
library also has a package of "standard library" functions which we encourage
applications to offer with consistent names and compatible behavior, either by
using the standard implementations directly or offering compatible
implementations under the same name.
The "standard" functions that new configuration formats should consider
offering are:
* ``abs(number)`` - returns the absolute (positive) value of the given number.
* ``coalesce(vals...)`` - returns the value of the first argument that isn't null. Useful only in formats where null values may appear.
* ``compact(vals...)`` - returns a new tuple with the non-null values given as arguments, preserving order.
* ``concat(seqs...)`` - builds a tuple value by concatenating together all of the given sequence (list or tuple) arguments.
* ``format(fmt, args...)`` - performs simple string formatting similar to the C library function ``printf``.
* ``hasindex(coll, idx)`` - returns true if the given collection has the given index. ``coll`` may be of list, tuple, map, or object type.
* ``int(number)`` - returns the integer component of the given number, rounding towards zero.
* ``jsondecode(str)`` - interprets the given string as JSON format and returns the corresponding decoded value.
* ``jsonencode(val)`` - encodes the given value as a JSON string.
* ``length(coll)`` - returns the length of the given collection.
* ``lower(str)`` - converts the letters in the given string to lowercase, using Unicode case folding rules.
* ``max(numbers...)`` - returns the highest of the given number values.
* ``min(numbers...)`` - returns the lowest of the given number values.
* ``sethas(set, val)`` - returns true only if the given set has the given value as an element.
* ``setintersection(sets...)`` - returns the intersection of the given sets.
* ``setsubtract(set1, set2)`` - returns a set with the elements from ``set1`` that are not also in ``set2``.
* ``setsymdiff(sets...)`` - returns the symmetric difference of the given sets.
* ``setunion(sets...)`` - returns the union of the given sets.
* ``strlen(str)`` - returns the length of the given string in Unicode grapheme clusters.
* ``substr(str, offset, length)`` - returns a substring from the given string by splitting it between Unicode grapheme clusters.
* ``timeadd(time, duration)`` - takes a timestamp in RFC3339 format and a possibly-negative duration given as a string like ``"1h"`` (for "one hour") and returns a new RFC3339 timestamp after adding the duration to the given timestamp.
* ``upper(str)`` - converts the letters in the given string to uppercase, using Unicode case folding rules.
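As a rough sketch of how a few of these functions might appear in a
configuration, assuming the calling application exposes the ``cty`` standard
library implementations under these names (the argument names are invented for
the example):
.. code-block:: hcl
greeting    = format("Hello, %s!", upper("world"))     # "Hello, WORLD!"
first_value = coalesce(null, "fallback")               # "fallback"
all_subnets = concat(["10.0.1.0/24"], ["10.0.2.0/24"])
name_length = strlen("awesome-app")                    # 11
settings    = jsondecode("{\"retries\": 3}")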
Not all of these functions will make sense in all applications. For example, an
application that doesn't use set types at all would have no reason to provide
the set-manipulation functions here.
Some languages will not provide functions at all, since they are primarily for
assigning values to arguments and thus do not need nor want any custom
computations of those values.
Block Results as Expression Variables
-------------------------------------
In some applications, top-level blocks serve also as declarations of variables
(or of attributes of object variables) available during expression evaluation,
as discussed in :ref:`go-interdep-blocks`.
In this case, it's most intuitive for the variables map in the evaluation
context to contain a value named after each valid top-level block
type and for these values to be object-typed or map-typed and reflect the
structure implied by block type labels.
For example, an application may have a top-level ``service`` block type
used like this:
.. code-block:: hcl
service "http" "web_proxy" {
listen_addr = "127.0.0.1:8080"
process "main" {
command = ["/usr/local/bin/awesome-app", "server"]
}
process "mgmt" {
command = ["/usr/local/bin/awesome-app", "mgmt"]
}
}
If the result of decoding this block were available for use in expressions
elsewhere in configuration, the above convention would call for it to be
available to expressions as an object at ``service.http.web_proxy``.
If it is the contents of the block itself that are offered for evaluation -- or
a superset object *derived* from the block contents -- then the block arguments
can map directly to object attributes, but it is up to the application to
decide which value type is most appropriate for each block type, since this
depends on how multiple blocks of the same type relate to one another, or if
multiple blocks of that type are even allowed.
In the above example, an application would probably expose the ``listen_addr``
argument value as ``service.http.web_proxy.listen_addr``, and may choose to
expose the ``process`` blocks as a map of objects using the labels as keys,
which would allow an expression like
``service.http.web_proxy.process["main"].command``.
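For example, a hypothetical ``monitor`` block type elsewhere in the same
configuration could then refer to the decoded ``service`` block like this (the
``monitor`` block type and its arguments are invented for illustration):
.. code-block:: hcl
monitor "proxy_health" {
  target  = service.http.web_proxy.listen_addr
  command = service.http.web_proxy.process["main"].command
}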
If multiple blocks of a given type do not have a significant order relative to
one another, as seems to be the case with these ``process`` blocks,
representation as a map is often the most intuitive. If the ordering of the
blocks *is* significant then a list may be more appropriate, allowing the use
of HCL's "splat operators" for convenient access to child arguments. However,
there is no one-size-fits-all solution here and language designers must
instead consider the likely usage patterns of each value and select the
value representation that best accommodates those patterns.
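To sketch the list-based alternative: if an application chose to expose the
``process`` blocks as a list rather than a map (an assumption made only for
this example), a full splat expression could collect an argument across all of
the blocks:
.. code-block:: hcl
# Collects the "command" argument from every process block, in order
all_commands = service.http.web_proxy.process[*].command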
Some applications may choose to offer variables with slightly different names
than the top-level blocks in order to allow for more concise references, such
as abbreviating ``service`` to ``svc`` in the above examples. This should be
done with care since it may make the relationship between the two less obvious,
but this may be a good tradeoff for frequently-accessed names that might
otherwise hurt the readability of the expressions they are embedded in.
Familiarity permits brevity.
Many applications will not make block results available for use in other
expressions at all, in which case they are free to select whichever variable
names make sense for what is being exposed. For example, a format may make
environment variable values available for use in expressions, and may do so
either as top-level variables (if no other variables are needed) or as an
object named ``env``, which can be used as in ``env.HOME``.
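A minimal sketch of the latter approach, assuming the application exposes such
an ``env`` object; the ``data_dir`` argument name is invented for the example:
.. code-block:: hcl
data_dir = "${env.HOME}/.awesome-app"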
Text Editor and IDE Integrations
--------------------------------
Since HCL defines only low-level syntax, a text editor or IDE integration for
HCL itself can only really provide basic syntax highlighting.
For non-trivial HCL-based languages, a more specialized editor integration may
be warranted. For example, users writing configuration for HashiCorp Terraform
must recall the argument names for numerous different provider plugins, and so
auto-completion and documentation hovertips can be a great help, and
configurations are commonly spread over multiple files, making "Go to Definition"
functionality useful. None of this functionality can be implemented generically
for all HCL-based languages since it relies on knowledge of the structure of
Terraform's own language.
Writing such text editor integrations is out of the scope of this guide. The
Go implementation of HCL does have some building blocks to help with this, but
it will always be an application-specific effort.
However, in order to *enable* such integrations, it is best to establish a
conventional file extension *other than* ``.hcl`` for each non-trivial HCL-based
language, thus allowing text editors to recognize it and enable the suitable
integration. For example, Terraform requires ``.tf`` and ``.tf.json`` filenames
for its main configuration, and the ``hcldec`` utility in the HCL repository
accepts spec files that should conventionally be named with an ``.hcldec``
extension.
For simple languages that are unlikely to benefit from specific editor
integrations, using the ``.hcl`` extension is fine and may cause an editor to
enable basic syntax highlighting, absent any other deeper features. An editor
extension for a specific HCL-based language should *not* match generically the
``.hcl`` extension, since this can cause confusing results for users
attempting to write configuration files targeting other applications.

Some files were not shown because too many files have changed in this diff.