Fixes #338
Add methods for updating a block's type and labels, enabling refactoring of
HCL configurations such as renaming Terraform resources.
- `*Block.SetType(typeName string)`
- `*Block.SetLabels(labels []string)`
Some additional notes about `SetLabels`:
Since we cannot assume that the old and new labels are equal in length,
we remove the old labels and insert the new ones before `TokenOBrace`.
To implement this, I also added the following methods.
- `*nodes.Insert(pos *node, c nodeContent) *node`
- `*nodes.InsertNode(pos *node, n *node) *node`
They are similar to the existing `Append` / `AppendNode`,
but insert a node before a given position.
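For illustration, renaming a Terraform resource with the new methods might look like the following sketch (the configuration and names here are illustrative):

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/hclwrite"
)

func main() {
	src := []byte(`resource "aws_instance" "old_name" {
  ami = "ami-123456"
}
`)
	f, diags := hclwrite.ParseConfig(src, "main.tf", hcl.Pos{Line: 1, Column: 1})
	if diags.HasErrors() {
		panic(diags)
	}
	for _, block := range f.Body().Blocks() {
		// Rename the resource by replacing both labels; SetType could
		// similarly replace the block type itself.
		if block.Type() == "resource" {
			block.SetLabels([]string{"aws_instance", "new_name"})
		}
	}
	fmt.Printf("%s", f.Bytes())
}
```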
This adds `ValidateSpec`, a new decoder `Spec` that allows adding custom
validation of values at decode time.
The validation runs on the value after the wrapped spec is applied to the
expression in question. The validation function returns diagnostics, and
the author may optionally specify a range; if none is supplied, the range
of the wrapped expression is used.
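A minimal sketch of how this might be used, assuming the spec's fields are `Wrapped` and `Func` as described above; the attribute name and validation rule are illustrative:

```go
package example

import (
	"strings"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/hcldec"
	"github.com/zclconf/go-cty/cty"
)

// spec decodes a "name" attribute and rejects non-lowercase values.
var spec hcldec.Spec = &hcldec.ValidateSpec{
	Wrapped: &hcldec.AttrSpec{
		Name:     "name",
		Type:     cty.String,
		Required: true,
	},
	Func: func(value cty.Value) hcl.Diagnostics {
		if value.IsKnown() && value.AsString() != strings.ToLower(value.AsString()) {
			return hcl.Diagnostics{{
				Severity: hcl.DiagError,
				Summary:  "Invalid name",
				Detail:   "The name must be lowercase.",
				// No Subject set, so the range of the wrapped
				// expression is used automatically.
			}}
		}
		return nil
	},
}
```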
Previously, functions such as `concat()` would panic when given both a null
argument and a sequence, as in the included test. This PR adds a check for
whether the error's argument index falls outside the range of the call's
arguments and, if so, crafts an error that references the entire function
call instead of the null argument.
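One plausible reproduction of the failure mode, as a sketch (the function table wiring here is an assumption; the fix itself lives in expression evaluation):

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/hclsyntax"
	"github.com/zclconf/go-cty/cty/function"
	"github.com/zclconf/go-cty/cty/function/stdlib"
)

func main() {
	expr, diags := hclsyntax.ParseExpression(
		[]byte(`concat(null, ["a"])`), "test.hcl", hcl.Pos{Line: 1, Column: 1})
	if diags.HasErrors() {
		panic(diags)
	}
	ctx := &hcl.EvalContext{
		Functions: map[string]function.Function{
			"concat": stdlib.ConcatFunc,
		},
	}
	_, valDiags := expr.Value(ctx)
	// Previously this evaluation panicked; with the fix it returns an
	// error diagnostic referencing the whole function call.
	fmt.Println(valDiags.HasErrors()) // true
}
```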
The following expression caused a panic in hclwrite:
a = foo.*
This was due to the unusual dotted form of a full splat (where the splat
operator is at the end of the expression) being generated with an
invalid source range. In the full splat case, the end of the range was
uninitialized, which caused the token slice to be empty, and thus the
panic.
This commit fixes the bug, adds test coverage, and includes some bonus
tests for other splat expression cases.
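As a rough sketch, the previously-panicking input can be exercised like this:

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/hclwrite"
)

func main() {
	f, diags := hclwrite.ParseConfig(
		[]byte("a = foo.*\n"), "test.hcl", hcl.Pos{Line: 1, Column: 1})
	if diags.HasErrors() {
		panic(diags)
	}
	// Building the token sequence for the full splat previously panicked
	// because the end of its source range was uninitialized; the
	// expression now round-trips cleanly.
	fmt.Printf("%s", f.Bytes())
}
```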
The previous syntax for object and map values was a single line of
key-value pairs. For example:
object = { bar = 5, baz = true, foo = "foo" }
This is very compact, but for many HCL values it is less readable and less
common in practice than a multi-line format. This commit changes the
generated output from hclwrite to one line per attribute.
Examples of the new format:
// Empty object/map is a single line
a = {}

// Single-value object/map has the attribute on a separate line
b = {
  bar = 5
}

// Multi-value object/map has one line per attribute
c = {
  bar = 5
  baz = true
}
When scanning JSON, upon encountering an invalid token, we immediately
return. Previously this return happened without inserting an EOF token.
Since other functions assume that a token sequence always ends in EOF,
this could cause a panic.
This commit adds a synthetic EOF token after the invalid token before
returning. While this does not match the real end-of-file of the source
JSON, it marks the end of the scanned bytes, which seems reasonable.
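A minimal sketch of the failure mode (the malformed input here is illustrative):

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2/json"
)

func main() {
	// "!" is not a valid JSON token, so scanning stops early.
	_, diags := json.Parse([]byte(`{"a": !}`), "test.json")
	// Code that assumes an EOF-terminated token sequence previously
	// could panic here; it now reports diagnostics instead.
	fmt.Println(diags.HasErrors()) // true
}
```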
Fixes #339
HCL uses grapheme cluster segmentation to produce accurate "column"
indications in diagnostic messages and other human-oriented source
location information. Each new major version of Unicode introduces new
codepoints, some of which are defined to combine with other codepoints to
produce a single visible character (grapheme cluster).
We were previously using the rules from Unicode 9.0.0. This change
switches to using the segmentation rules from Unicode 12.0.0, which is
the latest version at the time of this commit and is also the version of
Unicode used for other purposes by the Go 1.14 runtime.
HCL does not use text segmentation results for any purpose that would
affect the meaning of decoded data extracted from HCL files, so this
change will only affect the human-oriented source positions generated for
files containing characters that were newly-introduced in Unicode 10, 11,
or 12. (Machine-oriented uses of source location information are based on
byte offsets and not affected by text segmentation.)
To allow easier adaptation of data already serialized as JSON, HCL native
syntax allows both equals signs _and_ colons in object constructors.
This was already implemented, but was not reflected in the pseudo-BNF in
the specification.
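A quick sketch showing that both separators parse in native syntax:

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/hclsyntax"
)

func main() {
	src := []byte(`a = { "foo": 1, bar = 2 }` + "\n")
	_, diags := hclsyntax.ParseConfig(src, "test.hcl", hcl.Pos{Line: 1, Column: 1})
	fmt.Println(diags.HasErrors()) // false: ":" and "=" are both accepted
}
```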
The try(...) and can(...) functions are intended to make it more
convenient to work with deep data structures of unknown shape, by allowing
a caller to concisely try a complex traversal operation against a value
without having to guard against each possible failure mode individually.
These rely on the customdecode extension to get access to their argument
expressions directly, rather than only the results of evaluating those
expressions. The expressions can then be evaluated in a controlled manner
so that any resulting errors can be recognized and suppressed as
appropriate.
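As a rough sketch of wiring these functions into an evaluation context (the variable names and data are illustrative):

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/ext/tryfunc"
	"github.com/hashicorp/hcl/v2/hclsyntax"
	"github.com/zclconf/go-cty/cty"
	"github.com/zclconf/go-cty/cty/function"
)

func main() {
	expr, diags := hclsyntax.ParseExpression(
		[]byte(`try(local.settings.retries, 3)`),
		"test.hcl", hcl.Pos{Line: 1, Column: 1},
	)
	if diags.HasErrors() {
		panic(diags)
	}
	ctx := &hcl.EvalContext{
		Variables: map[string]cty.Value{
			"local": cty.ObjectVal(map[string]cty.Value{
				"settings": cty.EmptyObjectVal, // no "retries" attribute
			}),
		},
		Functions: map[string]function.Function{
			"try": tryfunc.TryFunc,
			"can": tryfunc.CanFunc,
		},
	}
	v, _ := expr.Value(ctx)
	// The failing traversal is recognized and suppressed, so the
	// fallback value is returned.
	fmt.Println(v.AsBigFloat()) // 3
}
```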